According to three people familiar with the matter, Google is testing a product that uses artificial intelligence technology to produce news stories, and is pitching it to news organizations including The New York Times, The Washington Post and News Corp, the owner of The Wall Street Journal.
The tool, known internally by the working title Genesis, can take in information — details of current events, for example — and generate news copy, the people said, speaking on the condition of anonymity to discuss the product.
One of the three people familiar with the product said Google believed it could serve as a kind of personal assistant for journalists, automating some tasks to free up time for others, and that the company saw it as responsible technology that could help steer the publishing industry away from the pitfalls of generative AI.
Some executives who saw Google's pitch described it as unsettling, and asked not to be identified while discussing a confidential matter. Two of the people said the pitch seemed to take for granted the effort that goes into producing accurate and artful news stories.
A spokeswoman for Google did not immediately respond to a request for comment. The Times and The Post declined to comment.
“We have an excellent relationship with Google, and we appreciate Sundar Pichai’s longstanding commitment to journalism,” a News Corp spokesperson said in a statement, referring to Google’s CEO.
Jeff Jarvis, a journalism professor and media commentator, said Google’s new tool, as described, has potential pros and cons.
“If this technology can provide reliable factual information, journalists should use the tool,” said Mr. Jarvis, director of the Tow-Knight Center for Entrepreneurial Journalism at the Craig Newmark Graduate School of Journalism at the City University of New York.
“If, on the other hand, it is misused by journalists and news organizations on topics that require nuance and cultural understanding,” he continued, “it could damage the credibility not only of the tool, but also of the news organizations using it.”
News organizations around the world are grappling with whether to use artificial intelligence tools in their newsrooms. Many, including The Times, NPR and Insider, have told employees they plan to explore possible applications of AI to see how it can be responsibly applied to the high-stakes realm of news, where seconds count and accuracy is paramount.
But Google's new tool is also likely to stoke anxiety among journalists who have been writing their own articles for decades. Some news organizations, including The Associated Press, have long used AI to generate stories on corporate earnings reports, among other things, but such stories remain a small fraction of their output compared with articles written by journalists.
Artificial intelligence could change that by enabling users to generate articles at a much larger scale — articles that, if not carefully edited and fact-checked, could spread misinformation and affect how traditionally written stories are perceived.
While Google has moved at a breakneck pace to develop and deploy generative AI, the technology has also presented challenges for the advertising juggernaut. Google has traditionally played the role of curator, directing users to publisher websites to read more; tools such as its chatbot, Bard, instead present factual claims that are sometimes false and fail to drive traffic to more authoritative sources such as news publishers.
The tool arrives as governments around the world have called on Google to give news outlets a larger share of its advertising revenue. After the Australian government tried to force Google to negotiate payments with publishers in 2021, the company expanded its partnerships with news organizations in several countries under its News Showcase program.
Publishers and other content creators have already criticized Google and other major AI companies for using decades of their articles and posts to help train these AI systems without compensating them. News organizations including NBC News and The Times have pushed back against AI systems scraping their content without permission.