The platform said people will be able to request the removal of videos that use AI to simulate an identifiable person
YouTube will soon require users to disclose when their videos feature content made with artificial intelligence.
In a blog post Tuesday on the company’s evolving approach to AI, the platform acknowledged AI’s potential to generate content that can “mislead viewers — particularly if they’re unaware that the video has been altered or is synthetically created,” adding that in the coming months it will roll out updates that “inform viewers when the content they’re seeing is synthetic.”
YouTube will also allow people to request the removal of content “that simulates an identifiable individual, including their face or voice.”
“Not all content will be removed from YouTube, and we’ll consider a variety of factors when evaluating these requests,” the company explained in the announcement. “This could include whether the content is parody or satire, whether the person making the request can be uniquely identified, or whether it features a public official or well-known individual, in which case there may be a higher bar.”
The Google-owned platform also said that it will have a similar removal process for its music partners that will give them the option to “request the removal of AI-generated music content that mimics an artist’s unique singing or rapping voice.” The requests will be available to labels or distributors representing “artists participating in YouTube’s early AI music experiments,” while access will later be expanded to additional labels and distributors.
YouTube’s updated policies come at a time when AI has become a point of contention in the entertainment industry. Last week, SAG-AFTRA reached a tentative agreement on a new contract with the Alliance of Motion Picture and Television Producers (AMPTP), after months of hard-nosed negotiations over AI protections. Actors’ consent plays a large role in the provisions outlined in the new contract: studios are required to obtain consent before making digital replicas of actors and to disclose what the replicas will be used for, and actors are entitled to compensation for the creation and use of their replicas.