Nvidia Pledges Up to $100B to Scale OpenAI’s AI Power

Chipmaking giant Nvidia announced plans to invest up to $100 billion in OpenAI, a landmark deal designed to massively expand the artificial intelligence company’s computing capacity and cement Nvidia’s dominance in the AI hardware race. The partnership, which will run over several years, is expected to give OpenAI the resources to deploy at least 10 gigawatts of AI data-center power using Nvidia’s chips and systems starting in 2026.

Under the terms of the agreement, Nvidia will provide hardware, systems, and support, while also receiving a non-controlling equity stake in OpenAI. Executives described the investment as critical for meeting the surging demand for AI training and deployment that has followed the success of tools like ChatGPT. Industry observers say the deal could represent one of the largest corporate commitments in the history of AI infrastructure.

The move comes as part of OpenAI’s broader “Stargate” initiative, a sweeping plan to construct a nationwide network of advanced AI data centers in collaboration with partners including Oracle and SoftBank. These centers are expected to push the boundaries of computing power, but also raise questions about energy consumption, infrastructure readiness, and the feasibility of delivering projects of such scale on time.

Reactions to the announcement have been mixed. Supporters see the investment as a bold and necessary step to ensure the U.S. maintains a competitive edge in the AI arms race. Critics, however, warn the deal has shades of "vendor financing"—essentially Nvidia funding its own customer to buy more of its chips—which could pose risks if AI growth slows or infrastructure hurdles mount. Some analysts also point to the potential for regulatory scrutiny, as the arrangement tightens Nvidia's already outsized influence over global AI development.

For OpenAI, CEO Sam Altman has positioned the deal as a cornerstone for future innovation, arguing that breakthroughs in artificial intelligence hinge directly on scaling computing resources. For Nvidia, it is both a financial bet and a strategic reinforcement of its role as the indispensable supplier powering nearly every major AI effort worldwide.