The tech industry loves its garage start-up stories. From Hewlett-Packard to Google, stories of bootstrapped companies that grew into giants have inspired generations of entrepreneurs.
But the massive amounts of funding and computing power that start-ups need to keep up with today’s cutting-edge technology, the artificial intelligence behind chatbots like ChatGPT and Google Bard, can make those inspirational stories a thing of the past.
In 2019, Aidan Gomez and Nick Frosst left Google to form an AI start-up in Toronto called Cohere that could compete with their former employer. After several months, they went back to Google and asked if it would sell them the enormous computing power they needed to build their own AI technology. After the arrangement was personally approved by Google’s chief executive, Sundar Pichai, the tech giant gave them what they wanted.
“It’s ‘Game of Thrones.’ That’s it,” said David Katz, a partner at Radical Ventures, the first investor in Cohere. Big companies like Google, Microsoft and Amazon, he said, are controlling the chips. “They’re controlling computing power,” he said. “They’re choosing who gets it.”
It’s difficult to build a groundbreaking AI start-up without the backing of “hyperscalers,” the few companies that control the huge data centers capable of running modern AI systems. That dependence has put industry veterans back in the driver’s seat, leading to what many expect to be the most significant change for the tech industry in decades.
OpenAI, the start-up behind ChatGPT, recently raised $10 billion from Microsoft. It will put most of that money back into Microsoft as it pays for time on huge clusters of computer servers operated by the big company. These clusters, built from thousands of specialized computer chips, are essential for improving and expanding the skills of ChatGPT and similar technologies.
Competitors may not be able to keep pace with OpenAI until they have access to a similar amount of computing power. Cohere recently raised $270 million, bringing its total funding to over $440 million. It will use most of that money to buy computing power from companies like Google.
Other start-ups have made similar arrangements, notably a Silicon Valley company called Anthropic, founded in 2021 by a group of former OpenAI researchers; Character.AI, founded by two leading Google researchers; and Inflection AI, which was founded by a former Google executive. Inflection raised $1.3 billion in funding last week, bringing its total to $1.5 billion.
At Google, Mr. Gomez was part of a small research team that designed the Transformer, the basic technology used to build chatbots like ChatGPT and Google Bard.
Transformers are a powerful example of what scientists call a neural network – a mathematical system that can learn skills by analyzing data. Neural networks have existed for years, helping to run everything from talking digital assistants like Siri to instant translation services like Google Translate.
Transformers took this idea into new territory. Running on hundreds or even thousands of computer chips, they can analyze far more data, far more quickly.
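The mechanism at the heart of a Transformer, scaled dot-product self-attention, can be sketched in a few lines of NumPy. This is an illustrative toy with random weights, not any company’s production code; the matrix shapes and names here are chosen for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating, for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X: (seq_len, d_model) token embeddings.
    Wq, Wk, Wv: (d_model, d_head) learned projection matrices.
    Returns: (seq_len, d_head) context-aware token representations.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_head = Q.shape[-1]
    # Every token scores every other token at once; this all-pairs matrix
    # multiply is what parallelizes so well across many chips.
    scores = Q @ K.T / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V

# Toy example: 4 tokens, 8-dimensional embeddings, one 4-dimensional head.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)
```

In a real model, dozens of these attention layers are stacked, run with many heads in parallel, and trained on vast text corpora, which is where the enormous computing bills come from.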
Using this technology, companies such as Google and OpenAI began building systems that learned from massive amounts of digital text, including Wikipedia articles, digital books, and chat logs. As these systems analyzed more and more data, they learned to generate text themselves, including term papers, blog posts, poetry, and computer code.
These systems – called large language models – now underpin chatbots such as Google Bard and ChatGPT.
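The core idea behind these systems, predicting what text comes next based on what came before, can be illustrated with a toy word-counting model. This sketch has no relation to any production system; real large language models replace these simple counts with billions of learned neural-network parameters.

```python
import random
from collections import defaultdict

# Tiny "corpus": a real model would train on trillions of words.
corpus = "the cat sat on the mat and the cat slept".split()

# Record which word follows which in the corpus.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def generate(start, n_words, seed=0):
    """Generate text by repeatedly sampling a plausible next word."""
    random.seed(seed)
    words = [start]
    for _ in range(n_words):
        options = following.get(words[-1])
        if not options:
            break  # no known continuation; stop early
        words.append(random.choice(options))
    return " ".join(words)

print(generate("the", 5))
```

Scaling this predict-the-next-word loop up, with a neural network instead of a lookup table, is what lets chatbots produce term papers, poetry and code.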
Long before the advent of ChatGPT, Mr. Gomez left Google to start his own company with Mr. Frosst and another Toronto entrepreneur, Ivan Zhang. The aim was to create a large language model to rival Google’s.
At Google, he and his fellow researchers had access to an almost unlimited amount of computing power. After leaving the company, he needed something similar. So he and his co-founders bought it from Google, which sells access to similar chips through cloud computing services.
Over the next three years, Cohere built a large language model to rival almost any other. Now it is selling that technology to other businesses, providing what any company needs to build and run its own AI applications, from chatbots to search engines to personal tutors.
“The strategy is to create a platform that other people can build on and use,” Mr. Gomez said.
OpenAI offers a service along the same lines called GPT-4, which many businesses are already using to build chatbots and other applications. This new technology can analyze, create and edit text. But it will soon handle images and sounds as well. OpenAI is building a version of GPT-4 that can examine an image, instantly describe it, and even answer questions about it.
Microsoft’s chief executive, Satya Nadella, said the company’s arrangement with OpenAI is one of the types of mutually beneficial relationships it has developed with smaller competitors over a long period of time. “I grew up in a company that always did these kinds of deals with other companies,” he told The New York Times earlier this year.
As the industry races to match GPT-4, entrepreneurs, investors and pundits are debating who will be the ultimate winner. Most agree that OpenAI is the leader in this area. But Cohere and a small group of other companies are building similar technology.
The tech giants are in a strong position because they have the vast resources needed to advance these systems more than anyone else. Google also holds a patent on the Transformer, the fundamental technology behind the AI systems that Cohere and many other companies are building.
But there is one wild card: open source software.
Meta, another giant with the computing power needed to build the next wave of AI, recently open-sourced its latest large language model, meaning anyone can reuse it and build on top of it. Many in the field believe that this kind of freely available software will allow anyone to compete.
“The collective mind of every researcher on Earth will beat any company,” said Amr Awadallah, chief executive of the AI start-up Vectara and a former Google executive. But even those researchers still have to pay for access to a much larger competitor’s data centers.