To build AI technology, startups are turning to bigger rivals for help

    The tech industry loves its garage-startup stories. From Hewlett-Packard to Google, tales of bootstrapped companies turned giants have inspired generations of entrepreneurs.

    But the massive amounts of money and computing power required to build today’s defining technology, the artificial intelligence behind chatbots like ChatGPT and Google Bard, may make those inspirational stories a thing of the past.

    In 2019, Aidan Gomez and Nick Frosst left Google to create an AI startup in Toronto called Cohere that could compete with their former employer. A few months later, they went back to Google and asked if it would sell them the massive computing power they would need to build their own AI technology. After Google CEO Sundar Pichai personally approved the deal, the tech giant gave them what they wanted.

    “It’s Game of Thrones. That’s what it is,” said David Katz, a partner at Radical Ventures, Cohere’s first investor. The big companies like Google, Microsoft and Amazon, he added, control the chips. “They control the computing power,” he said. “They select who gets it.”

    Building a breakthrough AI startup is difficult without the support of “the hyperscalers,” the companies that control the massive data centers AI systems run on. And that has put industry giants at the helm — again — of what many believe will be the most significant shift for the tech industry in decades.

    OpenAI, the startup behind ChatGPT, recently raised $10 billion from Microsoft. It will pump most of that money back to Microsoft as it pays for time on the massive clusters of computer servers run by the larger company. These machines contain thousands of specialized computer chips and are essential to improving and expanding the capabilities of ChatGPT and similar technologies.

    Competitors can’t keep up with OpenAI unless they get their hands on comparable amounts of computing power. Cohere recently raised $270 million, bringing its total funding to more than $440 million. It will use much of that money to buy computing power from the likes of Google.

    Other startups have made similar arrangements, most notably a Silicon Valley company called Anthropic, which was founded in 2021 by a group of former OpenAI researchers; Character.AI, founded by two top Google researchers; and Inflection AI, founded by a former Google executive. Inflection raised a $1.3 billion round of funding last week, bringing the total to $1.5 billion.

    At Google, Mr. Gomez was part of a small research team that designed the Transformer, the fundamental technology used to create chatbots like ChatGPT and Google Bard.

    The Transformer is a powerful example of what scientists call a neural network: a mathematical system that can learn skills by analyzing data. Neural networks have been around for years, helping with everything from talking digital assistants like Siri to instant translation services like Google Translate.

    The Transformer took the idea into new territory. By using hundreds or even thousands of computer chips, it could analyze much more data much faster.

    Using this technology, companies like Google and OpenAI began building systems that learned from vast amounts of digital text, including Wikipedia articles, digital books, and chat logs. As these systems analyzed more and more data, they learned to generate text themselves, including essays, blog posts, poetry and computer code.

    These systems, called large language models, now support chatbots such as Google Bard and ChatGPT.

    Well before the arrival of ChatGPT, Mr. Gomez left Google to start his own company with Mr. Frosst and another Toronto entrepreneur, Ivan Zhang. The goal was to build large language models similar to Google’s.

    At Google, he and his fellow researchers had access to almost unlimited amounts of computing power. After he left the company, he needed something similar. So he and his co-founders bought it from Google, which sells access to the same chips through cloud computing services.

    Over the next three years, Cohere built a large language model that rivaled almost any other. Now it sells this technology to other companies. The idea is to provide every business with the technology it needs to build and run its own AI applications, from chatbots to search engines to personal tutors.

    “The strategy is to build a platform that others can build on and experiment with,” Gomez said.

    OpenAI offers a service along the same lines called GPT-4, which many companies already use to build chatbots and other applications. The technology can analyze, generate and edit text, but soon it will also be able to handle images and sound. OpenAI is preparing a version of GPT-4 that can examine a photo, describe it and even answer questions about it.

    Microsoft CEO Satya Nadella said the company’s agreement with OpenAI is the kind of mutually beneficial relationship it has long cherished with smaller competitors. “I grew up in a company that always made these kinds of deals with other companies,” he told The New York Times earlier this year.

    As the industry races to match GPT-4, entrepreneurs, investors and pundits are debating who the ultimate winners will be. Most agree that OpenAI leads the way. But Cohere and a small group of other companies are building similar technology.

    The tech giants are in a strong position because they have the massive resources required to take these systems further than anyone else. Google also has a patent on the Transformer, the fundamental technology behind the AI systems that Cohere and many other companies build.

    But there is a wild card: Open source software.

    Meta, another giant with the computing power needed to build the next wave of AI, recently made its latest large language model open source, meaning anyone can reuse and build on it. Many in the field believe that this type of freely available software will allow anyone to compete.

    “The collective mind of all the researchers on Earth would beat any company,” said Amr Awadallah, CEO of the AI startup Vectara and a former Google executive. But even those researchers will still have to pay for access to a much larger competitor’s data centers.