Nvidia plans new chip to speed AI processing, WSJ reports

Published Sat, Feb 28, 2026 · 01:38 PM

Nvidia plans to launch a new processor designed to help OpenAI and other customers build faster, more efficient AI systems, the Wall Street Journal reported on Friday, citing people familiar with the matter.

Nvidia is developing a new system for “inference” computing, a form of processing that allows AI models to respond to queries, the report said.

The new platform is set to be unveiled at Nvidia’s GTC developer conference in San Jose next month and will incorporate a chip designed by startup Groq, the report added, citing people familiar with the matter.

Reuters could not immediately verify the report. Nvidia and OpenAI did not immediately respond to Reuters’ requests for comment.

Reuters reported earlier this month that OpenAI is dissatisfied with the speed at which Nvidia’s hardware can generate answers for ChatGPT users on certain types of tasks, such as software development and AI systems communicating with other software.

OpenAI needs new hardware that would eventually provide about 10 per cent of its inference computing needs, one of the sources told Reuters.


OpenAI and Amazon have struck a deal in which OpenAI will utilise two gigawatts of computing capacity powered by Amazon’s in-house Trainium chips.


The ChatGPT maker has discussed working with startups including Cerebras and Groq to provide chips for faster inference, two sources said. But Nvidia struck a US$20-billion licensing deal with Groq that shut down OpenAI’s talks, one of the sources told Reuters.

In September, Nvidia said it intended to pour as much as US$100 billion into OpenAI as part of a deal that gave the chipmaker a stake in the startup and gave OpenAI the cash it needed to buy the advanced chips. REUTERS




Liam Redmond