Digital doubles: Bringing actors back to life

Actor Peter Cushing (left, in “Star Wars”) was brought back to life digitally for the 2016 film, “Rogue One.”

Lucasfilm

It’s a technological miracle of sorts -- actors seemingly performing from beyond the grave. How do they do it? Call it a “double take,” as David Pogue of Yahoo Tech found out: 

The new “Star Wars” movie, “Rogue One,” was a big hit at the box office. Exciting story, thrilling effects, and gifted actors -- including one who’s been dead for over 20 years.

That’s right: British actor Peter Cushing was there, reprising his role as Grand Moff Tarkin, even though he passed away in 1994.

How did they do that?

Actor Guy Henry performed the new scenes, and then the Oscar-nominated special effects engineers replaced his face with Peter Cushing’s.

And how did the computer know exactly what Peter Cushing’s face looked like, down to the tiniest detail? That’s where Paul Debevec comes in.

Debevec invented the Light Stage, which is housed at his office at the Institute for Creative Technologies at the University of Southern California.

Inside, the subject is surrounded by more than 10,000 LEDs and photographed by about 20 high-quality DSLR cameras. The resulting series of high-resolution photos, taken from different angles, is combined to reconstruct a 3-D model of the subject’s face.
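The reconstruction rests on a simple geometric idea: the same facial point lands at slightly different pixel positions in each camera, and that shift (the disparity) reveals depth. Here is a minimal sketch of the principle for one rectified camera pair; the focal length, baseline, and disparities below are invented for illustration and are not the Light Stage’s actual parameters:

```python
# Stereo depth sketch: for a rectified camera pair, Z = f * B / d, where
#   f = focal length in pixels,
#   B = baseline (distance between the two cameras), and
#   d = disparity (horizontal pixel shift of the same point between views).
# All numbers below are illustrative, not real Light Stage parameters.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Return the distance (in meters) to a point seen by both cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Two facial landmarks seen by a camera pair 10 cm apart (assumed values):
f = 1200.0   # focal length in pixels
B = 0.10     # baseline in meters

near_point = depth_from_disparity(f, B, disparity_px=80.0)  # e.g. tip of nose
far_point = depth_from_disparity(f, B, disparity_px=75.0)   # e.g. cheek

print(round(near_point, 3))  # 1.5 -> nose is 1.5 m from the rig
print(round(far_point, 3))   # 1.6 -> cheek is slightly farther away
```

With dozens of cameras instead of two, the same triangulation is repeated across many view pairs, which is what lets the system recover the face down to fine detail.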

More than a hundred famous actors have stood on this spot to be scanned for movies, including Angelina Jolie, Tom Cruise, Brad Pitt, Sigourney Weaver and Dwayne Johnson.

Once an actor has been scanned in the Light Stage, engineers can digitally insert him or her into scenes, even if that actor is unavailable, much older or younger, or deceased. That’s how actor Paul Walker was able to appear in “Furious 7,” even though he died partway into filming.

Debevec said, “We’ll have the actor make a succession of about 50 different facial expressions. And that produces all of the different motions of their face. But we also can record a facial performance from all these different angles, and then create a digital performance of that character that does exactly what they did in the video.”
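Those roughly 50 scanned expressions are typically used as what animators call blendshapes: each frame of a new digital performance is built as a weighted mix of the captured extremes. The toy sketch below shows the idea; the three-vertex “face,” the expression shapes, and the weights are all invented for illustration (a real scan has many thousands of vertices):

```python
# Blendshape sketch: a new expression as a weighted mix of scanned ones.
# A real face scan has thousands of 3-D vertices; this toy "face" has three.
# All geometry and weights below are invented for illustration.

NEUTRAL = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.5, 1.0, 0.0)]
SMILE   = [(0.0, 0.2, 0.0), (1.0, 0.2, 0.0), (0.5, 1.0, 0.0)]  # mouth corners up
BROW_UP = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.5, 1.2, 0.1)]  # brow raised

def blend(neutral, shapes, weights):
    """Offset each neutral vertex by the weighted deltas of each shape."""
    result = []
    for i, base in enumerate(neutral):
        v = list(base)
        for shape, w in zip(shapes, weights):
            for axis in range(3):
                v[axis] += w * (shape[i][axis] - base[axis])
        result.append(tuple(round(c, 3) for c in v))
    return result

# A half smile with slightly raised brows:
frame = blend(NEUTRAL, [SMILE, BROW_UP], [0.5, 0.25])
print(frame)  # [(0.0, 0.1, 0.0), (1.0, 0.1, 0.0), (0.5, 1.05, 0.025)]
```

Driving the weights frame by frame from video of a stand-in performer is what lets the digital character “do exactly what they did in the video.”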

The Light Stage might have cost several million dollars 10 years ago. But today, you can build a person-scanning setup with parts you pick up at the hardware store. 

Just ask Ari Shapiro. He runs the Character Animation and Simulation research group at USC. He’s been developing a human scanning system that uses one hundred $20 cameras, sewn into a shower curtain purchased at Home Depot. The subject steps into the “shower” and presto!

COMPUTER:  “Please remain still … Scanning completed.”

And in just minutes Shapiro has a digital version of you, which he can animate.

But here’s the thing: It’s fine to create virtual clones of people as long as everybody knows it’s for entertainment purposes. But how long will it be before someone tries to pass it off as reality?

Pogue said, “Let’s say I decide to make a presidential candidate do something heinous and I release that as news. Is that plausible?”

“I think it’s not only plausible, I think that there are definitely people in various countries that are working on exactly that,” said Todd Richmond, the director of the mixed reality lab at USC’s Institute for Creative Technologies -- and a man who thinks a lot about the implications of digital clones. 

Should the government be involved? Should there be a new Bureau of Digital Ethics?  “I think artists should have a place at the table,” Richmond said. “Technologists and practitioners need to be at the table because they’re the ones who are knee-deep in the goo of this stuff. Politicians have to have some understanding of this, because invariably, policy will need to be made to address this.”

According to Richmond, it’s past time for us to consider the very real power of make-believe people.

“I can create a virtual version of somebody who can walk and talk and say things that they never actually did,” he said. “And that’s a power that’s never existed ever in the history of humans.”      
For more info: