January 16, 2024 (updated 17 January 2024, 9:54am)

While New York Times litigates over AI, many media companies will liquidate – US Congress warned

The US Senate heard from news industry leaders about the threats posed by generative AI tools.

By Charlotte Tobitt

Conde Nast chief executive Roger Lynch has warned that “many” media companies could go out of business in the time it would take for litigation against generative AI companies to pass through the courts.

He instead urged US Congress to take “immediate action” by making a simple clarification: that publishers should be compensated for the use of their content in the training and output of generative AI tools, and that licensing deals must be struck for onward use.

Lynch suggested that if legislators decide the “fair use” argument made by OpenAI and others in the generative AI industry is wrong, they will not need to do anything more, as the free market will sort out the situation.

OpenAI has maintained that its use of publisher content is legal because it is covered by “fair use”, which in US copyright law protects “transformative” work that adds something new and does not substitute the original work.

Lynch was speaking at a US Senate subcommittee hearing last week about the future of journalism in a post-AI world.

The hearing came two weeks after The New York Times launched a surprise lawsuit against ChatGPT creator OpenAI for infringing its copyright and threatening its ability to produce quality journalism by impacting its advertising, subscription, affiliate and licensing revenues.

Lynch told the hearing generative AI poses a threat that is “real and of great consequence” to the media and publishing industries.

He said: “Big tech companies understand that time is on their side, that litigation is slow, and, for many publishers, prohibitively expensive.

“The time to act is now and the stakes are nothing short of the continued viability of journalism.”

Asked about the difficulty of smaller publishers with fewer resources to pursue ad-hoc negotiations and litigation, Lynch added: “Certainly that would be a major concern, is that the amount of time it would take to litigate, appeal, go back to the courts, appeal, maybe ultimately make it to the Supreme Court to settle – between now and then there would be many, many companies, media companies that would go out of business.”

Conde Nast itself cut at least 300 jobs, or 5% of its global workforce, last year. Press Gazette analysis found that at least 8,000 journalism industry jobs were cut across the UK, US and Canada over the same period.

Lynch argued that securing licensing deals would “directly result in future investment in content”, pointing to the $140m going into the Australian journalism industry as a result of legislation forcing platforms to pay for news content.

Lynch said that generative AI tools, by providing users with “sometimes verbatim, sometimes paraphrased” content, train people to go to them rather than to publishers for information, and “unlike traditional search, they are keeping customers within their experiences, depriving us of the opportunity to connect with our audiences directly, customise our content for them, and generate advertising and subscription revenue, sales leads and other valuable data.

“By misappropriating our content in this way, they are directly threatening the viability of the media ecosystem.”

Opposing the tech companies’ view on “fair use”, Lynch said it was “designed to allow criticism, parody, scholarship, research and news reporting. The law is clear that it is not fair use when there is an adverse effect on the market for the copyrighted material… Fair use is not intended simply to enrich technology companies that prefer not to pay.

“If content is the raw material of gen AI, then it should be licensed and compensated, just as engineers and computer time must be paid for and acquired lawfully.”

Asked what he thought Congress could do to protect copyright law, Lynch said: “I think quite simply if Congress could clarify that the use of our content and other publisher content for training and output of AI models is not fair use, then the free market will take care of the rest. Just like it has in the music industry where I worked [as CEO of streaming service Pandora], in film and television, sports rights. It can enable private negotiations…”

OpenAI has also repeatedly noted that since last year it has allowed publishers to opt out of having their content used “because it’s the right thing to do”.

But Lynch said this came “too late” as publishers’ content had already been used to train the tools.

“The only thing the opt outs will do is to prevent a new competitor from training new models to compete with them,” he said.

On opting out of allowing content to be used in an AI tool’s output, Lynch noted that for search companies like Google and Microsoft Bing (which uses ChatGPT), “if you opt out of the output, you have to opt out of all of their search. Search is the lifeblood of digital publishers, most digital publishers, half or more of their traffic originates from a search engine. If you cut off your search engine, you cut off your business.”

However, Google and Bing have both explained to publishers how to opt out of AI crawling without being blocked from their search results.
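As an illustrative sketch, not drawn from the hearing itself: the opt-outs the companies have described are typically expressed in a site’s robots.txt file, where OpenAI’s GPTBot crawler and Google’s Google-Extended token can be disallowed to signal that content should not be used for AI training, while the ordinary search crawlers (Googlebot, Bingbot) are left untouched. The directives below are an assumption about how a publisher might configure this and should be checked against each company’s current documentation.

    # Signal that OpenAI's crawler should not collect content for training
    User-agent: GPTBot
    Disallow: /

    # Google-Extended controls use of content in Google's AI products
    # without affecting Googlebot or ranking in Google Search
    User-agent: Google-Extended
    Disallow: /

    # Standard search crawlers remain allowed so pages stay indexed
    User-agent: Googlebot
    Allow: /

    User-agent: Bingbot
    Allow: /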

Danielle Coffey, president and chief executive of the News/Media Alliance which represents more than 2,200 publishers in the US, told the subcommittee that generative AI has proved to be an “exacerbation of an existing problem where revenue cannot be generated by, but in fact is diverted from, those who create the original work”.

She suggested several actions for Congress to take, including requiring AI developers to explain how their models produce outputs and to provide links to the materials cited, and preventing them from conditioning or modifying other services, such as advertising or search ranking, based on whether content owners allow their content to be used.

At the same committee, National Association of Broadcasters president and chief executive Curtis LeGeyt said AI, including deepfakes, “imperils” the “unique level of trust” built by local TV and radio stations.

He also warned that the use of broadcasters’ content in AI settings can increase their costs as well as threaten their own revenue streams.

“Broadcasters have already seen numerous examples where content created by broadcast journalists has been ingested and regurgitated by AI bots, with little or no attribution,” he said.

“Not only are broadcasters losing out on compensation for their own work product, but this unauthorised usage actually increases costs for local stations due to additional vetting of stories and footage and the costs associated with protecting broadcast content.

“Broadcasters’ expressive content is particularly valuable for AI ingestion precisely because it is vetted and trusted. If broadcasters are not compensated for use of their valuable, expressive works, they will be less able to invest in local news content creation.”
