VideoAge International, October 2024

AI for Content Owners: Litigate, Engage, or Ignore

By Dave Davis and Max Einhorn

Artificial Intelligence, specifically Generative AI, is poised to disrupt the entertainment industry like no innovation since the advent of film itself. Just as the Lumière brothers astonished Parisian audiences in 1895 with moving pictures, today’s innovators astonish us with the ability to generate cinematic clips merely by typing prompts into a computer. For content owners, this moment presents not just a revenue opportunity but also a chance to enhance the value of their content by influencing the future of copyright law.

Generative AI video models require vast amounts of video content — millions of hours — for training. Training involves teaching AI models to understand the world, much as a child learns to associate words with objects. For instance, if you prompt an AI to generate “a cowboy riding a camel through the streets of Cannes,” the model needs prior exposure to images of cowboys, camels, and Cannes to produce a coherent output.

Unfortunately, many AI companies have been scraping video content from the Internet without authorization, often violating platforms’ terms of service. If your content is on YouTube, there is a real possibility it has been used in this way. These companies argue that publicly available content falls under the “fair use” exception in U.S. copyright law, a claim they bolster by highlighting the administrative burden of licensing disaggregated content.

So, how can video library owners both protect their content and seize this revenue opportunity? The answer lies in leveraging new infrastructures that facilitate licensing to AI companies. Emerging companies like Calliope Networks (video), Created By Humans (books), Rightsify (music), and Human Native (multimodal content) are simplifying the licensing process. Calliope Networks, for example, has assembled a global film and TV catalog specifically for AI model training. There is also a growing emphasis on “provenance,” wherein AI companies demonstrate that their training data is ethically sourced. Xinsere is one such company, offering secure content storage and delivery accompanied by auditable smart contracts to prove content provenance.

Content owners now face a critical decision: Should they litigate against unauthorized use, engage with AI companies through licensing, or ignore the issue altogether? Consider the example of Viacom in the 2000s. Faced with unauthorized use of its content on the burgeoning platform YouTube, Viacom chose litigation over engagement. It spent seven years in court and, despite reaching a settlement, lost cultural relevance among the “MTV Generation” as competitors who chose to engage flourished.

In 2024, media companies are at a similar crossroads. Engaging through licensing not only opens new revenue streams but also strengthens legal positions. By demonstrating that viable licensing options exist, content owners can weaken AI companies’ “fair use” defense in court.

What does an ideal customer for training data look like? Many AI companies are willing to pay for high-quality content that isn’t readily available online. Companies like Adobe, Flawless, and BRIA.AI are committed to “Responsible Training,” using only licensed or public domain content. Adobe’s Firefly model aims to prove that ethical sourcing doesn’t compromise quality. These companies address demand from industries wary of the legal repercussions of using unlicensed content.
A recent article in the London-based marketing publication The Drum emphasized this point: “While our teams have already been working with generative video tools from Runway [and other companies], we’ve only been able to consider them for internal and R&D purposes due to the [unlicensed] content their models have been trained on,” wrote James Young, head of creative innovation at ad agency BBDO. He added: “Firefly’s video tool has the capacity to massively expand our sandbox.” There is also speculation that courts might mandate the disposal of models built on unlicensed data, which would position responsibly trained models advantageously.

When negotiating deals, volume and content variety are crucial for AI companies. Some startups focus on specialized content — like talking-head avatars — and seek specific types of footage. Early pricing benchmarks suggest $6 to $14 per hour of high-quality video, depending on format. Content owners should strive to create real economic value while negotiating use restrictions. Establishing deal precedents with clear restrictions — such as prohibiting redistribution or reproduction of the dataset — will set industry standards and influence future agreements, even with companies that don’t yet exist.

By actively participating in shaping this new marketplace, content owners compel technology companies to meet their terms and support data-licensing infrastructures that facilitate transactions. As Nick Thompson, CEO of The Atlantic, said of the publication’s deal with OpenAI: “AI is coming. It is coming quickly. We want to be part of whatever transition happens.”

Dave Davis is CEO of the Los Angeles-based Calliope Networks, which aggregates global audiovisual content to license as training data to AI companies. Before Calliope Networks, Davis spent 20 years at Hollywood studios, including NBCUniversal, Paramount Pictures, and 20th Century Fox, driving international content distribution. Davis is a member of the California Bar and serves on the Copyright Society’s AI subcommittee.

Max Einhorn is the founder of the Los Angeles-based consulting firm Dangerous Ideas, which helps media companies leverage Generative AI in both business and creative production. He was recently named SVP, Acquisitions & Innovations at the Los Angeles-based Shout! Studios, and serves as an advisor to Calliope Networks.

RkJQdWJsaXNoZXIy MTI4OTA5