(7-minute read)
The use of artificial intelligence (AI) in our everyday lives is increasing exponentially each year.
While consumers encounter AI in everything from self-checkouts at the grocery store to voice assistants like Alexa and Siri, professional performers face more serious challenges: AI is a threat to their livelihood and, if used for nefarious purposes, their reputation.
While AI technology is not new, it is advancing rapidly, with new applications reported almost daily.
Between 2010 and 2021, global patent filings for AI technologies grew more than 50-fold (from 2,560 to 141,241).[1] According to a recent WIPO[2] study, 54,000 generative AI (GenAI) patent families were filed between 2014 and 2023, with the number of GenAI patents increasing eightfold since the 2017 introduction of the transformer, the deep neural network architecture behind the large language models that have become synonymous with GenAI.
What’s more, the global AI market is valued at over $196 billion (an increase of around $60 billion since 2022) thanks to a growing range of practical use cases, from content creation to self-driving cars, and the industry’s value is projected to increase more than 13-fold over the next six years. Within the screen-based industry, Netflix’s automated recommendation technology alone is worth an estimated $1 billion in revenue annually.[3]
These numbers are startling, and the global creative class – especially those in the screen-based sector, including performers’ unions – is paying attention to these trends. Interestingly, image and video data lead GenAI patent filings, followed by text and speech/music.
The U.S. actors’ and writers’ strikes dominated news headlines last summer, with AI protections among the key issues of concern.
Additionally, SAG-AFTRA members who work under the union’s Interactive Media Agreement have been involved in a labour dispute with major videogame publishers since late July, with the main sticking point being protections against the use of AI.
In fact, Duncan Crabtree-Ireland, national executive director of SAG-AFTRA, said, “AI has become the most challenging issue in many of the union’s negotiations.”
The U.S. performers’ union has, however, secured what has been dubbed a ‘landmark agreement’ with AI startup Narrativ. The deal gives SAG-AFTRA members the option to license a digital replica of their voice to Narrativ for use in its online audio advertising marketplace. While the deal is being promoted as an ethical framework for AI in voice replication, a recent article in Forbes noted “it could also accelerate the shift from human to artificial voices, raising critical questions about the future role of human voice in media.” The article also questions whether “the Union effectively considered the consequences of AI ‘training,’” including the value of voice data and the ethical and legal challenges involved.
In addition to these concerns, it is important to address how the rise of AI in the entertainment industry could disproportionately impact marginalized performers, including those from underrepresented racial, gender and disability groups. The ability of AI technologies to replicate voices and likenesses increases the risk that these performers – who already face limited opportunities in the screen-based media industry – could be further sidelined in favour of cheaper, AI-generated alternatives. This would only exacerbate existing inequities, as the distinct cultural, racial and gender identities these performers bring to their roles may be reduced to stereotypes or homogenized representations. Without careful regulation, AI could widen the gap in an industry already struggling with diversity and inclusion.
AI has also become a key concern for screen industry unions in Canada. While ACTRA recognizes there are advantages to AI and AI tools being used in the industry, three key conditions (the “3 Cs”) must be met: Consent by performers, Compensation for performers, and Controls to ensure AI is not used for nefarious purposes.
The use of AI is top-of-mind for many ACTRA performers across the country. In a recent survey of ACTRA members, 98 per cent of participants expressed concern about potential misuse of their name, image and likeness (NIL) rights. Additionally, 93 per cent of participants expressed concern that AI will eventually replace human actors in certain roles or performances in the Canadian entertainment industry.
And performers have a right to be concerned. We’re already seeing instances where AI has been used without consent or control against notable public figures. This past winter, U.S. musician Taylor Swift was the target of sexually explicit, non-consensual deepfake images made using AI, while this past spring, U.S. actor Scarlett Johansson was “shocked, angered and in disbelief” that the updated version of ChatGPT had a voice “eerily similar” to hers.
Canadian performers have also been victims of AI. ACTRA Toronto member Ellen Dubin is all too familiar with the dangers of AI misuse:
“I felt completely violated after a fan notified me that a major character I had voiced in an AAA video game had been compromised without my knowledge and had been downloaded onto a deepfake porn site. It was beyond anything I could have imagined.
As an actor, I was completely blindsided, but then I thought about all the others who were affected as well – the creators, writers, artists – and realized what a huge trickle-down effect the misuse of AI has on our industry.
No one should have to deal with having their reputation compromised. No one should have to compromise their integrity to be replaced by AI unknowingly! Actors should be able to go in and do a job knowing exactly what they are voicing and what their characters are going to look like and what they’re going to be doing – that is the consent part – as well as being compensated for the use of our voice, likeness or image.”
![](https://performersmagazine.com/wp-content/uploads/2023/03/Ellen-Dubin.png)
It’s getting easier and easier for users to find apps that produce non-consensual deepfake images, with the apps able to pull videos directly from a variety of online sources – whether YouTube or other, more nefarious, websites – to generate the fakes. Media outlet 404Media.com highlighted the issues facing app stores in a recent article, which only reaffirms the urgency of updated government legislation.
While the U.S. government has tabled two new bills over the last 10 months aimed at protecting against GenAI abuses, Canada is falling behind and has yet to introduce similar legislative protections. This is despite the fact that GenAI and the spread of misinformation are top of mind for many Canadian internet users, according to a recent survey by the Canadian Internet Registration Authority (CIRA).[4]
Over half of survey participants said they’re concerned about AI technology. Among those concerned, most cite its contribution to the spread of fake images or videos (69 per cent), mis/disinformation (67 per cent) and insufficient regulations/controls on its use (65 per cent) as the reason for their concern. Additionally, most Canadians (76 per cent) believe posting or sharing deepfakes should not be allowed on social media.
Clearly, there’s a lot at stake for Canadian workers and consumers across every industry when it comes to AI. Ultimately, it’s up to society to decide where it stands on supporting vs. replacing humans and to demand that the necessary laws and protections are implemented.
AI Resources
Want to learn more about AI and its impact on the screen-based industries? Check out ACTRA Toronto’s AI Resources page.
References
[1] AI & The Five Ws: Why, What, Who, When, Where?, Deutsche Bank Research, March 21, 2023.
[2] China-Based Inventors Filing Most GenAI Patents, WIPO Data Shows, World Intellectual Property Organization (WIPO), July 3, 2024.
[3] Josh Howarth, 57 NEW Artificial Intelligence Statistics (Aug 2024), explodingtopics.com, July 25, 2024.
[4] New poll finds that half of Canadians are concerned about generative artificial intelligence and the spread of misinformation, Canadian Internet Registration Authority (CIRA), July 10, 2024.
![](https://performersmagazine.com/wp-content/uploads/2024/09/IPA-BARGAINING-2024_SOCIAL_UWB_102924-1024x1024.png)
Interested in learning more about IPA bargaining? Check out ACTRA Toronto’s 2024 IPA bargaining page!