- YouTube demonetized two channels for sharing fake trailers made with AI
- Some Hollywood studios had quietly claimed the ad revenue from the deceptive trailers
- The crackdown comes amid new contracts and laws limiting unauthorized AI replicas
If you've ever been on YouTube, clicked on a trailer for the next superhero movie and thought it seemed too good to be true, well, you may have been right. Slick editing and AI fakery produced clips that attracted billions of clicks and earned plenty of cash through advertising. The shocking part is that much of that money apparently went to the very studios you'd expect to shut down any unauthorized use of their intellectual property, at least according to reporting recently uncovered by Deadline.
That side hustle may now be over, with YouTube removing two of the biggest purveyors of these fake trailers, Screen Culture and KH Studio, from its Partner Program. That means no more ad revenue for them, or for the studios reportedly getting a cut of the action.
Screen Culture has made many popular trailers packed with AI-generated shots for upcoming films such as The Fantastic Four: First Steps and Superman. KH Studio is best known for its imaginary casting, such as Leonardo DiCaprio in the next season of Squid Game or Henry Cavill as the next James Bond. You'd be forgiven for assuming the plots, characters and images on display were genuine details of the films, but they were produced far from any actual film development.
The fakes were sometimes good enough to appear in searches above real trailers, and enough clicks could prod YouTube's recommendation algorithm into ranking the fakes above the real thing. That translates into a lot of cash for a monetized video. It's likely why, according to Deadline, the studios made arrangements with YouTube to redirect the ad revenue from these fake trailers to their own accounts.
Trailer tricks
Even so, YouTube has its own rules. The monetization arrangement may have been fine in theory, but the channels broke other rules. For example, to earn ad revenue, a creator can't simply repackage someone else's content; they need to add original elements. A critic can show a brief clip of a movie to comment on it, but most of the video is the review, not the movie. Creators also can't copy others' work, deceive viewers or make content with the "sole purpose of getting views."
Screen Culture and KH Studio can appeal the demonetization, but that may be a long shot. YouTube's decision reflects a broader, ongoing debate over AI in the entertainment industry. The SAG-AFTRA strike highlighted actors' demands for limits on and control over any replicas of real people in film and TV. The final agreement reached after the long strike established new rules requiring an artist's consent before any studio can use AI to imitate their likeness.
In case that wasn't clear enough, California lawmakers passed two bills barring the use of AI to recreate a performer's voice or image without their consent, even posthumously. That makes it harder for dishonest studios or creators to conjure up digital versions of famous faces just to hype a trailer, real or otherwise.
YouTube is in a somewhat awkward spot, since fan-made trailers have long been a popular type of content. But AI can make a fake trailer look good enough to fool people, even if only by accident. And YouTube doesn't want to encourage the practice by monetizing it. For now, YouTube's message is clear: you can imagine a world where Cavill is Bond or Galactus appears in Fantastic Four, but you can't cash in on that fantasy if it's built entirely on fakery.