OpenAI, one of the best-funded AI startups in business, is exploring making its own AI chips.
Internal discussions about AI chip strategy have been ongoing since at least last year, according to Reuters, as the shortage of chips needed to train AI models worsens. OpenAI is reportedly weighing a number of options to advance its chip ambitions, including acquiring an AI chip maker and designing chips in-house.
OpenAI CEO Sam Altman has made acquiring more AI chips a top priority for the company, Reuters reported.
Currently, OpenAI, like most of its competitors, relies on GPU-based hardware to develop models such as ChatGPT, GPT-4, and DALL-E 3. GPUs are well-suited to training today's most capable AI because they can perform many calculations in parallel.
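For a rough sense of why that parallelism matters, note that the core of model training is dense linear algebra: every output element of a layer's matrix multiply is an independent dot product, which is exactly the kind of workload a GPU's thousands of cores can absorb at once. The sketch below is purely illustrative; the layer sizes are assumptions chosen for the example, not figures from any OpenAI model.

```python
import numpy as np

# A toy "layer" of a transformer-style model: activations times weights.
# Each of the batch x d_out output entries is an independent dot product,
# so all of them can, in principle, be computed simultaneously -- which is
# why GPUs, with thousands of simple cores, outpace CPUs on this workload.
batch, d_in, d_out = 512, 4096, 4096   # illustrative sizes only
activations = np.random.randn(batch, d_in).astype(np.float32)
weights = np.random.randn(d_in, d_out).astype(np.float32)

outputs = activations @ weights        # ~2 * batch * d_in * d_out multiply-adds
print(f"{2 * batch * d_in * d_out / 1e9:.1f} GFLOPs in one layer's forward pass")
```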
But the generative AI boom, a windfall for GPU makers like Nvidia, has put enormous strain on the GPU supply chain. Microsoft warned in its summer earnings report that it is facing a shortage of the server hardware needed to run AI, one that could lead to service disruptions. And Nvidia's best-performing AI chips are reportedly sold out until 2024.
GPUs are also essential for running and serving OpenAI's models. The company relies on clusters of GPUs in the cloud to handle customer workloads, and they come at a very high cost.
According to an analysis by Bernstein analyst Stacy Rasgon, if ChatGPT queries grew to a tenth the scale of Google Search, they would initially require roughly $48.1 billion worth of GPUs and about $16 billion worth of chips per year to keep running.
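To see how an estimate of that kind comes together, here is a minimal back-of-the-envelope sketch. Every input below (search volume, seconds per response, GPUs per model replica, GPU price) is an illustrative assumption rather than a figure from Rasgon's analysis, but the same multiply-it-out logic is what drives such projections into the tens of billions of dollars.

```python
# Back-of-the-envelope GPU cost sketch. All figures are illustrative
# assumptions, not the inputs behind Bernstein's $48.1 billion estimate.
google_searches_per_day = 8.5e9      # assumed Google Search volume
chatgpt_share = 0.10                 # "a tenth the scale of Google Search"
seconds_per_query = 20               # assumed time to generate one response
gpus_per_inference_server = 8        # assumed GPUs sharded per model replica
gpu_unit_cost_usd = 30_000           # assumed price of a high-end AI GPU

queries_per_second = google_searches_per_day * chatgpt_share / 86_400
concurrent_queries = queries_per_second * seconds_per_query
gpus_needed = concurrent_queries * gpus_per_inference_server
upfront_cost_usd = gpus_needed * gpu_unit_cost_usd

print(f"~{gpus_needed:,.0f} GPUs -> roughly ${upfront_cost_usd / 1e9:.0f}B up front")
```

Under these placeholder assumptions the naive math lands in the same tens-of-billions range, which is the point: at ChatGPT-like scale, serving costs are dominated by how many GPUs must sit provisioned at once.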
OpenAI would not be the first company to pursue developing its own AI chip.
Google uses TPUs (short for "tensor processing units") to train large-scale generative AI systems like PaLM 2 and Imagen. Amazon offers its own chips to AWS customers for both training (Trainium) and inference (Inferentia). And Microsoft is reportedly working with AMD to develop an in-house AI chip called Athena, which OpenAI is said to be testing.
Indeed, OpenAI is well-positioned to invest heavily in research and development. The company has raised more than $11 billion in venture capital and is approaching $1 billion in annual revenue. It is also considering a share sale that could push its secondary-market valuation to $90 billion, according to a recent report in the Wall Street Journal.
But hardware, especially AI chips, is an unforgiving business.
AI chip maker Graphcore, which reportedly had its valuation reduced by $1 billion last year after its deal with Microsoft fell apart, said it plans to cut jobs, citing an "extremely challenging" macroeconomic environment. (The situation has become even more dire in the past few months, as Graphcore has reported declining revenues and increasing losses.) Meanwhile, Intel-owned AI chip company Habana Labs has laid off an estimated 10% of its workforce. And Meta's custom AI chip efforts have been plagued by problems, leading the company to scrap some of its experimental hardware.
Even if OpenAI were to try to bring a custom chip to market, such an effort could take years and cost hundreds of millions of dollars a year. It remains to be seen whether the startup’s investors, including Microsoft, are willing to take such a risky bet.