OpenAI CEO Sam Altman has publicly reaffirmed the company's strong relationship with Nvidia, dismissing recent reports suggesting a potential shift towards alternative chip providers. In a post on X, Altman described Nvidia as the "gold standard" in AI hardware, stating that OpenAI intends to remain a "gigantic customer" for a very long time. He expressed confusion regarding the rumors questioning the partnership's strength.
Altman's comments came in response to a Reuters report that OpenAI had been exploring alternatives to some of Nvidia's AI chips since 2025, citing dissatisfaction with their performance on certain inference-heavy workloads. This exploration was framed as part of OpenAI's broader strategy to address rapidly increasing computing demands, rather than a complete departure from Nvidia. The Wall Street Journal further fueled speculation with a report that discussions about a potential $100 billion investment in OpenAI by Nvidia had stalled.
Jensen Huang, Nvidia's CEO, addressed the reports, clarifying that the $100 billion figure was "never a commitment." However, Huang emphasized Nvidia's continued commitment to investing "a great deal of money" in OpenAI and confirmed that Nvidia will "absolutely" participate in OpenAI's current funding round, potentially making it the company's largest investment ever. He also dismissed reports of friction between the two companies as "nonsense."
OpenAI's exploration of alternative chip solutions reportedly stems from a growing focus on "inference," the stage at which trained AI models generate responses to user queries. While Nvidia remains the dominant provider of chips used to train large-scale AI models, the increasing importance of inference performance as AI tools like ChatGPT scale has led OpenAI to consider other options. Specifically, OpenAI is said to be evaluating specialized hardware that could support a portion of its inference computing needs.
OpenAI has reportedly engaged with startups such as Cerebras and Groq, which design chips optimized for inference through large amounts of on-chip memory. These architectures can reduce delays associated with external memory access, improving response times for chatbots and AI-powered coding tools. OpenAI signed a $10 billion chip supply deal with Cerebras on January 14. Even so, an OpenAI spokesperson noted that Nvidia delivers the best performance per dollar for many workloads and remains deeply integrated into the company's infrastructure.
These developments occur against the backdrop of broader corporate negotiations and market movements. Amazon is reportedly in talks to invest up to $50 billion in OpenAI. Nvidia's share price experienced short-term pressure due to reports of shifting chip strategies and investment uncertainties. However, some analysts suggest that investors are reacting more to the uncertainty surrounding the investment details than to any fundamental shift away from AI investments.
