According to Team Bondi, the face capture technology which gives L.A. Noire's characters their complex expressions will eventually be capable of capturing full-body acting, which will result in an unprecedented level of realism. Currently, the MotionScan system used by Team Bondi is only capable of capturing heads, but Depth Analysis, the company behind it, is working in tandem with the developer on the next phase.
"MotionScan embodies the future on a few levels," said Team Bondi boss Brendan McNamara in a recent interview with Develop. "Firstly, when this technology can capture full body performances, the level of realism will be hard to differentiate between game, film and television. That will make the gameplay experience pretty seamless from exposition to action."
McNamara's confidence stands in direct opposition to earlier comments from Heavy Rain creator David Cage, who called L.A. Noire's tech a "dead end" and claimed that it "will never be able to shoot body and face at the same time." That's not the case, according to McNamara.
"Secondly, for filmmakers it will mean they can create whole scenes from capture data on the desktop the way they currently edit films. They will be able to adjust the action, move characters, change cameras and relight the scene to their heart's content. Overall, for filmmakers that's pretty exciting. And for games creators, it means we can compete with films and TV on a pure storytelling and performance level, along with leveraging all of the other interactive strengths that will pave the way for more exciting games."
McNamara wrapped up by saying, "For MotionScan the goal is to continually make it better. As I said earlier, it’s still very early days and we are listening to feedback from the people who are testing the rig and pipeline."
"We want to be able to use shaders more cleverly, take a look at subsurface scattering and also computer generated hair too, which we see a lot of our film customers are working with. We are also looking at retargeting so that you could take an actor's performance from MotionScan and apply it to various non-human characters."
"We are already doing initial research for full body capture in costume for phase two – it’s exciting times for Depth Analysis and MotionScan for sure."
The times definitely are exciting, but not just for Team Bondi - David Cage isn't the only one challenging the developer's efforts to eradicate the "uncanny valley" with realistic facial expressions. In an interview in the next issue of EDGE, Alan Wake developer Remedy claimed that it's raising the bar even higher.
"L.A. Noire has set a bar for facial animation," said Remedy CEO Matias Myllyrinne. Then, raising one hand higher than the other, he added, "But [Rockstar's game] is here, we're aiming to be here."
According to EDGE's summary of the interview, Remedy's lead animator John Root has created a powerful system "which uses motion capture as a starting point for generating scans of actors (in this case, Alan Wake's physical model, Ilkka Villi) accurate to ½ mm, including 64 facial poses from which Root claims every human expression can be derived."
The post continues, "Armed with a highly accurate model of the actor, animators can then use sliders to adjust expressions based on the captured positions, allowing them to control and edit a realistic human face in real time rather than rely on bespoke performances for every cutscene and action, or manually animate the same expression."
"The results are strikingly nuanced, telegraphing a great deal of subtlety even at this early stage, the system promising a significantly more flexible way for animators to work. Despite this, Root has even bigger plans: one additional component yet to be implemented is colour mapping, a system which will simulate blood flow beneath the skin, adjusting its colour as brows are furrowed or lips pursed."
Regardless of the varying approaches, the apparent battle to seamlessly integrate realistic acting into games can only lead to more interesting experiences for us, so... keep up the fight, we guess!
Apr 11, 2011