Microsoft has built a powerful new supercomputer in collaboration with Artificial Intelligence (AI) startup OpenAI, making new infrastructure available in Azure to train extremely large AI models, the company announced at its Build developers conference.
Microsoft announced a multi-year supercomputer partnership with OpenAI in 2019, including a $1 billion investment by the tech giant.
The supercomputer developed for OpenAI is a single system with more than 285,000 CPU cores, 10,000 GPUs and 400 gigabits per second of network connectivity for each GPU server.
Compared with the machines on the TOP500 list of the world's supercomputers, it ranks among the top five, Microsoft said.
"Built in collaboration with and exclusively for OpenAI, the supercomputer hosted in Azure was designed specifically to train that company's AI models," the company announced at its virtual 'Build 2020' conference on Tuesday.
Hosted in Azure, the supercomputer also benefits from all the capabilities of a robust modern cloud infrastructure, including rapid deployment, sustainable datacenters and access to Azure services.
"This is about being able to do a hundred exciting things in natural language processing at once and a hundred exciting things in computer vision, and when you start to see combinations of these perceptual domains, you're going to have new applications that are hard to even imagine right now," explained Microsoft Chief Technical Officer Kevin Scott.
As part of its 'AI at Scale' initiative, Microsoft has developed its own family of large AI models, the Microsoft Turing models, which it has used to improve many different language understanding tasks across Bing, Office, Dynamics and other productivity products.