Microsoft partners with OpenAI to create Azure supercomputer
Microsoft has partnered with OpenAI to create an Azure-hosted supercomputer for testing large-scale models. The supercomputer will deliver eye-watering amounts of power from its 285,000 CPU cores and 10,000 GPUs (yes, it can probably even run Crysis).
OpenAI is a non-profit that was co-founded by Elon Musk to promote the ethical development of AI technologies. Musk, however, departed OpenAI following disagreements over the company’s direction.
Musk responded to an MIT Technology Review profile of OpenAI in February, saying that it “should be more open” and that all organizations “developing advanced AI should be regulated, including Tesla.”
Microsoft invested $1 billion in OpenAI last year, and it seems we’re just starting to see the fruits of that relationship. While most AIs today specialize in doing single tasks well, the next wave of research aims to have them perform multiple tasks at once.
“The exciting thing about these models is the breadth of things they’re going to enable,” said Microsoft Chief Technical Officer Kevin Scott.
“This is about being able to do a hundred exciting things in natural language processing at once and a hundred exciting things in computer vision, and as you start to see combinations of these perceptual domains, you’re going to have new applications that are hard to even imagine right now.”
So-called Artificial General Intelligence (AGI) is the ultimate goal of AI research: the point at which a machine can understand or learn any task just like the human brain.
“We believe it’s crucial that AGI is deployed safely and securely and that its economic benefits are broadly distributed. We are excited about how deeply Microsoft shares this vision,” said OpenAI CEO Sam Altman.
Microsoft and OpenAI claim their new supercomputer would rank within the top five but don’t give any specific performance figures. To rank within the top five, a supercomputer would currently require more than 23,000 teraflops of performance. The current leader, IBM’s Summit, reaches over 148,000 teraflops.
“As we’ve learned more and more about what we need and the different limits of all the components that make up a supercomputer, we were really able to say, ‘If we could design our dream system, what would it look like?’” said Altman. “And then Microsoft was able to build it.” Unfortunately, for now at least, the supercomputer is built exclusively for OpenAI.