The 3 Biggest AI Questions Actors Should Be Worried About

The striking writers worry about what AI might do to their livelihoods, but actors already live with that threat. Their work is built on voice and appearance, both of which AI can readily replicate and manipulate, and the law governing AI is complicated.

SAG-AFTRA is pushing for informed consent and redress in the union’s ongoing negotiations with the studios (which conclude this week, after a brief extension). The language the WGA proposed in its own battle, and what the DGA has now ratified, is all about protecting members from AI-driven job losses, but that is not enough for the actors. The DGA’s new agreement states that AI is not a person and that producers must consult a director before using AI. That matters to actors and is a good start, but SAG-AFTRA will aim for broader protections specific to its members.

IndieWire spoke with experts in AI cinema, visual effects and advertising law to outline the three biggest concerns that should be on SAG-AFTRA’s mind.


Who can be the AI Me?

Tom Hanks was one of the first actors to work with this kind of technology, through the motion-capture production of Robert Zemeckis’ 2004 “The Polar Express.” The technology and the film were seen as cutting edge at the time, even if only partially successful; it was a box office flop, and some critics complained that its characters didn’t seem quite human, mired in the “uncanny valley.” Zemeckis and Hanks are now in production on “Here,” a 2024 Sony title that will use AI to age and de-age Hanks so he can play the character across far more of his life than his own age allows.

The actor served as a producer on “The Polar Express” and is producing “Here,” but he realizes that artificial intelligence could also extend his performances long after his death, and there’s very little he could do to control the result.

In a May 13 interview on “The Adam Buxton Podcast,” Hanks described it as a “bona fide possibility right now” that he could “pitch a series of seven movies that would star me, in which I would be 32 years old from now until kingdom come. I could get hit by a bus tomorrow and that’s it, but my performances can go on and on. Outside of the understanding that it’s been done with artificial intelligence or a deepfake, there’ll be nothing to tell you that it’s not me.”

Entertainment attorney Simon Pulman, a partner at Pryor Cashman, told IndieWire that when it comes to shooting a commercial, there are advertising laws that protect an actor’s likeness. For example, you can’t create an AI Brad Pitt or Sandra Bullock and have them endorse a product without paying them. Those protections get murkier when someone wants to perform an AI resurrection of a dead star.

“This is certainly an area that varies from state to state. There are some organizations that claim to possess the likeness of dead celebrities, and they will go after that quite aggressively,” Pulman said. “You have two kinds of questions: What happens when people are increasingly using AI to put famous dead people in commercials, and what happens when you put dead people in movies?”

Those advertising laws, Pulman explains, apply to commercials. A film or television show is a work of expression, and Pulman says SAG-AFTRA is seeking “explicit and clear guidelines for replicating actors’ names, likenesses and performances” in a way that extends their existing contract and integrates current legislation. That’s important, because actors like James Earl Jones, 92, have already agreed to let their voices be used; Jones licensed his voice to create synthetic versions of Darth Vader for years to come.

James Earl Jones/David Prowse as Darth Vader and Carrie Fisher in “Star Wars”

The guild has already said it expects the use of AI and digital doubles to be negotiated through the guild, and members with commercial and low-budget contracts already have some likeness protections, but any guidelines will need to be specific to the guild’s contract with the AMPTP. Guidelines released last week by IATSE may hint at how other guilds are thinking, but Pulman says it’s hard to get the full picture, simply because the technology is moving so fast.

“There are some use cases where it’s easier to prescribe. On the one hand, the very benign uses, and on the other, the abuses that SAG will want to limit,” Pulman said. “There’s more of a gray area in between, because we can’t see all the uses.”

Do I have creative control over my AI self?

AI can make a single performance seem to contain multitudes, since it allows editors to manipulate an actor’s mouth and face to produce different words or even languages. For Lionsgate’s R-rated film “Fall,” the TrueSync AI manipulated the actors’ mouths to match dialogue alterations for a PG-13 version.

Actors may be okay with it as long as it doesn’t change their performance, but there’s no guarantee it won’t. What if the director or studio doesn’t like an actor’s creative choice? Or what if a script change means a director needs a shot of the actor smiling rather than frowning? AI gives the director the power to make the change quickly or try multiple alternatives, rather than costly reshoots that require the actor to be present.

The key lies in an actor’s “source code.” That is what Remington Scott, CEO of VFX company Hyperreal, calls the AI model built from an actor’s previous performances. Hyperreal’s mission is to enable “creators to own, copyright, and monetize” their digital identities, like the de-aged Paul McCartney or the Notorious B.I.G. who performed in the metaverse. These were created with specific datasets capturing an individual’s movement, speech, appearance, and the logic behind their actions. The more data the model acquires, the more realistic and specific it becomes.

“If I have all of Paul Newman’s movements, everything he does (walking, talking, standing, sitting) and we put that into our database, we can now have a director say, ‘OK, Paul, get up, go to the door, turn around, say your line and walk out.’ And then he does it,” Scott said. “It’s Paul Newman doing it.”

Pulman predicts that SAG-AFTRA will likely insist that AI cannot be allowed to change the fundamental character of a performance, to which the AMPTP will likely seek exceptions for voice-over dubbing, subtitles, color correction, and other fixes common in post-production.

But the guild will also want to know how those AI models are trained and who makes those decisions. Edward Saatchi, the Emmy-winning producer behind Oculus Story Studio who now runs his own AI shop, Fable Studios, uses Jack Nicholson as an example: Is the model trained on the Nicholson of “Chinatown,” of “As Good As It Gets,” or both? How does that affect the AI’s performance or image?

“Which one do you take?” Saatchi asked. “Does it just depend on the director? Can the director change his mind halfway through?” Saatchi produced a VR film that premiered at the 2021 SXSW Film Festival, “Wolves in the Walls,” adapted from the Neil Gaiman book of the same name. It featured Lucy, an AI-enabled character developed with an early OpenAI language model.

Saatchi is now producing “White Mirror,” an animated feature film about artificial intelligence, made by artists with access to AI technology from Runway, OpenAI, Midjourney, and NVIDIA. An animated film usually costs at least $10,000 a second, but with “White Mirror,” Saatchi says, the costs are less than $10,000 a minute, 60 times lower.

“If [a director is] like, ‘OK, now you’re going to be funnier. Now, I want you to be this,’ it can get pretty complicated pretty fast,” Saatchi said. “You’re giving the director an incredible, possibly wrong, amount of control.”

Who owns my AI self?

Deepfakes and voice modulators can be used to replace makeup, costumes, or awkward VFX dots on an actor’s leotard. “In real time, the actor is able to see himself, as can the director and the co-stars, precisely as this younger person, and act in the moment,” Saatchi said. “With this, you’ll be able to be someone else.”

What’s much less clear is who gets paid for that performance. “Over the next 12 months, each individual actor’s estate or actor’s production company or actor’s team will set the guardrails for how they want it to be used, so they can be compensated fairly for it,” Saatchi said.

In “The Congress,” a 2013 sci-fi film starring Robin Wright, a company acquires an actress’s AI likeness in perpetuity, creating spectacles for decades to come. Saatchi said he suspects AI development will stabilize around the system we have now, where a bespoke AI model is built film by film, and Pulman agrees that the guild will demand any AI model be destroyed after its use.

Here’s the tricky part, according to Scott: An actor should control the rights to his own image, but the IP with real value is the actor’s images in a movie or show. Those images are owned and copyrighted by the studios, which would have to be paid when generative AI uses the material to create new content. The industry hasn’t figured that out either.

Currently, generative AI tools pull whatever they can find on the internet to train their models, and no one gets paid. If the training data instead went into a generative AI model owned and authenticated by the actors themselves, there would be something in it (financially) for SAG members.

Scott says he has the right way to do it: When Hyperreal builds an actor’s digital double, it develops the “source code” derived from an individual’s movement, voice, appearance, and logic. The company can create detailed 3D models around iconic images and moments that might otherwise be copyrighted. Scott’s clients then own the underlying data of the digital model and can license it for future use.

“We are working on building the AI so that it is not contaminated by ChatGPT,” he said. “What we are building is very specific to the talent, so that way they own it. It’s not off-brand. It’s that person’s actual source code.”
