... One of the steepest challenges for computer animators and game developers is crafting realistic human characters without dropping their audience off the proverbial cliff of the uncanny valley. At the annual GPU Technology Conference this week, NVIDIA demonstrated “Face Works,” a technology made possible by their Titan graphics card. It’s a technology that may end up shielding our eyes from the uncanny valley forever.
NVIDIA defines Face Works as “advanced real-time character performance.” They take 32GB of facial data (bump maps, texture maps, lighting, expressions, and so on) and compress it down to 400MB using a new method of rendering facial expressions. Considering that an 8,000-instruction-long program renders each pixel of the demo character Ira’s face, that’s quite an achievement. ...
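As a back-of-the-envelope check on those figures, the sizes below are the ones quoted above; the function name and the choice of binary (1024-based) units are my own assumptions, since NVIDIA didn't specify:

```python
def compression_ratio(original_bytes: float, compressed_bytes: float) -> float:
    """Return how many times smaller the compressed data is than the original."""
    return original_bytes / compressed_bytes

GB = 1024 ** 3  # assuming binary gigabytes
MB = 1024 ** 2  # assuming binary megabytes

original = 32 * GB      # raw facial data cited for the Face Works demo
compressed = 400 * MB   # compressed size cited in the demo

ratio = compression_ratio(original, compressed)
print(f"{ratio:.0f}x")  # prints "82x"
```

In other words, the demo claims roughly an 80-to-1 reduction, which is what makes streaming that much facial detail in real time plausible on a single card.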
Watch the last couple of minutes of the linked video. I thought I was looking at a live actor, but it was a computer creation. (Better than the computer-generated humans that have come before, yes?)
But honestly, I don't know how this applies to what we call "animated features." If it's a synthetic visual creation that looks and acts like a live-action image, won't audiences' eyes and brains register it as "live action"? The same way they now register old-style character animation as a "cartoon"?
If you build something that looks completely real, isn't that what people watching will consider it to be?