Dr Mark Sagar
Dr Mark Sagar is currently Director of the Auckland Bioengineering Institute's Laboratory for Animate Technologies, though he is arguably best known as our resident two-time Oscar winner.
He has a passion for recreating the human face, whether on a screen or on a sketchpad.
It has taken him from a Mechanical Engineering PhD to key technical roles at Weta on films such as Avatar and King Kong. His contributions to the motion picture industry were recently recognised with an Academy Award.
Mark was one of four people awarded a Scientific and Engineering Oscar for a lighting stage and facial rendering system used to create realistic digital characters in Spider-Man 2. The system helps computers make digital characters look real on the big screen, and was later used in Superman Returns, The Curious Case of Benjamin Button and Avatar.
The technique meant the subtle qualities of the skin, such as colour, texture, shine, and translucency, could be digitally reproduced in an entirely convincing way. Based on research by Paul Debevec, a Professor at the University of Southern California, the lighting stage illuminates an actor’s face from 500 different angles, telling a computer how to light a digital version of the actor in any conditions.
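The principle behind the lighting stage can be sketched in a few lines. Under the standard graphics assumption that light transport is linear, a face photographed once per light direction can be relit for any new environment by taking a weighted sum of those basis photographs. The snippet below is only a hypothetical illustration of that idea, not the actual pipeline used on Spider-Man 2.

```python
import numpy as np

def relight(basis_images: np.ndarray, light_weights: np.ndarray) -> np.ndarray:
    """Relight a face as a weighted sum of one-light-at-a-time photographs.

    basis_images  : (N, H, W, 3) array, one photograph per light direction
                    (N would be around 500 on the stage described above)
    light_weights : (N,) array, intensity of each light in the target
                    environment, e.g. sampled from an environment map
    """
    # Light transport is linear, so the image under the new lighting is
    # simply the weighted sum of the basis photographs.
    return np.tensordot(light_weights, basis_images, axes=1)

# Toy usage with stand-in data: 500 light directions, a 64x64 crop of a face.
rng = np.random.default_rng(0)
basis = rng.random((500, 64, 64, 3))      # stand-in for captured photos
weights = rng.random(500) / 500           # stand-in for a novel environment
print(relight(basis, weights).shape)      # -> (64, 64, 3)
```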
The technical Academy Awards were held in Beverly Hills in February 2010, a few weeks before the glitzy televised Oscars ceremony. “It’s not nearly as glamorous as the main awards, but it was the most glamorous thing I’ve been to,” Mark says, grinning widely.
Mark completed a Bachelor of Science and a PhD in Engineering at the University of Auckland. His research, completed in the late 1990s, was a landmark study in developing an anatomically correct virtual eye and realistic, biomechanically simulated models of anatomy. It was one of the first examples of how believable human features could be created on a screen by combining computer graphics with mathematics and human physiology.
“Combining computer graphics with something organic, the eyeball, was a fantastic place to start. The eye is the visible part of the brain. It’s the main interface with the world and the most challenging part of the face to make believable in a digital form,” Mark says.
His supervisors, Professor Gordon Mallinson from Mechanical Engineering and Professor Peter Hunter, Director of the Auckland Bioengineering Institute (ABI), considered Mark to be a unique researcher because he had both outstanding artistic and mathematical abilities.
Mark was born in Kenya to an artist mother and a systems analyst father. His mother taught him from an early age about the fundamentals of drawing faces. Before his PhD he spent three years travelling the world sketching portraits, then returned to Auckland to study.
“He came into a group that was pioneering mathematical modelling of biological functions, and he added an extra layer by thinking about how we could do it in the most visually realistic way, so it made sense to clinicians,” according to Professor Hunter.
“You couldn’t do this research unless you were an artist,” according to Professor Mallinson. “It is almost like he had full use of both the left and right hemispheres of the brain, which is quite rare.”
Mark’s virtual eye was made for a surgery robot being developed by Peter’s brother, Professor Ian Hunter at the Massachusetts Institute of Technology. After completing his thesis, Mark relocated to the States to join Ian’s lab.
“I’d been making eyeballs but I was definitely interested in extending this to faces. It was a natural progression, influenced by my portrait work. My goal was to make a completely photorealistic digital actor that people wouldn’t suspect isn’t real. I really wanted to push it as far as we could go,” he says.
Mark ended up in an MIT spin-out company set up in Hollywood to realise the potential of their anatomically based graphical animation for the film industry. One of their first projects was a digital animation of Jim Carrey’s face pulling complicated expressions. It never made it to the big screen, but it was viewed by “everyone” in Hollywood, creating a buzz. The company went on to make short films demonstrating the scope of what was now being achieved in digital animation, including a film called Young at Heart, which portrayed a young actress as a very convincing 80-year-old woman.
And then the dot-com crash came. It wiped out the production company; however, Mark was able to move on to LA-based Sony Imageworks, where he applied the lighting stage techniques to his first blockbuster film, Spider-Man 2.
It was also where he returned to his main passion of recreating facial expression. He continued working on motion capture techniques, which essentially record an actor’s movements and expressions to create a computer-generated character.
In 2004 he returned to New Zealand, where he joined Weta and was given the opportunity to work on Peter Jackson’s King Kong. On this film he pushed the boundaries of motion capture techniques much further in an effort to give a gorilla highly believable expressions and emotions.
“It wasn’t a speaking part, so capturing the subtleties in the eyes and emotions was critical to how that character came across,” Mark says.
“What I love about film is you get crazy problems like how to convert human expressions into gorilla expressions. No-one has ever had to solve that before. But if you get it wrong, people know straight away.”
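A rough, hypothetical sketch of how a captured performance can drive a digital character, whether human or gorilla: expression intensities tracked from the actor are applied as weights on the character's own library of sculpted expression shapes, so the same performance can be retargeted onto a very different face. This illustrates only the general blendshape idea, not Weta's actual King Kong pipeline.

```python
import numpy as np

def pose_character(neutral: np.ndarray,
                   expression_deltas: np.ndarray,
                   weights: np.ndarray) -> np.ndarray:
    """Pose a character mesh from captured expression weights.

    neutral           : (V, 3) vertices of the character's resting face
    expression_deltas : (K, V, 3) offsets from the neutral mesh, one per
                        sculpted expression (smile, brow raise, ...)
    weights           : (K,) per-frame expression intensities solved from
                        the actor's tracked performance
    """
    # The posed face is the neutral mesh plus a weighted sum of expression
    # offsets; the same weights can drive a human rig or a gorilla rig.
    return neutral + np.tensordot(weights, expression_deltas, axes=1)

# Toy usage: a 1,000-vertex mesh with 30 expression shapes.
rng = np.random.default_rng(1)
neutral = rng.random((1000, 3))
deltas = rng.random((30, 1000, 3)) * 0.01
frame_weights = rng.random(30)
print(pose_character(neutral, deltas, frame_weights).shape)  # -> (1000, 3)
```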
A few years later the technology took another huge leap forward when Mark and his Weta colleagues started collaborating with James Cameron on Avatar. The possibility of working with Cameron pushed the team to make the motion capture system work in real-time.
“He wanted to capture actors’ faces during a scene using helmet cameras, and convert that information instantly into digital alien characters,” Mark says.
“The great thing about working with James Cameron is he knows when something can be achieved that’s never been done before. He understands that as long as the resources are put to one of his visions, it can be done.”
Mark says the blue-skinned alien characters in Avatar are so believable because a great deal of effort and attention went into the eyes. Every subtle contour of the eyelid and movement of the eyeball had to be just right.
“The success with Avatar is the shock people get when a very alien creature comes across in a natural way. They think ‘how did that get past that part of my brain?’ And in Avatar it’s not just the faces in the film; they’ve meticulously created an entire visual world – every blade of grass, the wind through the trees, the light bouncing off leaves – it’s incredible.”
Mark insists, though, that this is just the tip of the iceberg. His goal remains to capture in his characters a sense of consciousness that is completely indistinguishable from that of a real human.
“At some point in the future we should have a decent enough computational model to create the external manifestation of consciousness. It's so cool to explore because it taps into what it means to be alive; people just know when something is alive or not.”
“The problem with film though is it’s a passive medium, and we could take this so much further, into new forms of entertainment, human-computer interaction, and into the medical fields, helping surgeons with simulated patients; the possibilities are endless.”
To get there, Mark hopes we can create more synergies between different fields, from psychology and human behaviour to biomedical engineering and the arts, with a common goal in mind.
“There are so many technologies that haven’t yet been applied to creating virtual humans, and with places like Weta and ABI, we have the creative excellence in New Zealand to do it really well.”