While the market is experiencing a hard landing after the euphoria of the Covid period, Nvidia is not really in a slump. According to a Bloomberg investigation, OpenAI, the creator of ChatGPT, was able to build it thanks to the huge number of Nvidia GPUs Microsoft deployed in its AI supercomputer. That supercomputer is reportedly a monster assembled from tens of thousands of Nvidia A100 GPUs. The challenge began in 2019, when Microsoft invested $1 billion in OpenAI and agreed to build a supercomputer tailored to the startup's needs.
Even today, the figures behind this investment remain confidential. But Microsoft had to completely rethink how to build a computing center for the needs of this project. The main challenge was to create a computing network integrating thousands of GPUs while avoiding overheating and power outages. Aggregating that much power brought multiple constraints: Microsoft had to develop new software to increase efficiency and to ensure that the network equipment could handle huge volumes of data. Finally, the choice of cooling methods was also a complex issue, given the era's mounting concerns about climate disruption. Throughout, Nvidia remained the partner of reference.
A big deal for Nvidia
Microsoft intends to reap the rewards of this investment, and the release of its new AI-powered Bing engine, its Edge web browser and Microsoft Dynamics 365 is the first step. But the company is also looking to widen the gap and has just announced an upgrade of its AI architecture from Nvidia A100 GPUs to H100s: a mega order in prospect for the green team, especially since the agreement also covers other Nvidia technologies such as NVSwitch and NVLink 4.0.