Microsoft has partnered with OpenAI to introduce the Maia AI Accelerator, a custom-designed artificial intelligence chip for running AI training and generative AI workloads on Microsoft Azure hardware.
OpenAI is testing the Maia 100 AI Accelerator, which is designed specifically for the Azure hardware stack and will power internal AI workloads running on Microsoft Azure. “We’ve collaborated with Microsoft to co-design Azure’s AI infrastructure at every layer for our models,” said Sam Altman, CEO of OpenAI. “We were excited when Microsoft first shared their designs for the Maia chip, and we’ve worked together to refine and test it with our models. Azure’s end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers.”
“Our goal is to ensure that the ultimate efficiency, performance and scale is something that we can bring to you from us and our partners,” Microsoft Chief Executive Officer Satya Nadella said at the conference. Maia will power Microsoft’s own AI apps first and then be available to partners and customers, he added.
In addition to Maia, the company announced the Cobalt CPU, an Arm-based processor tailored to run general-purpose compute workloads on the Microsoft Azure cloud.
The artificial intelligence chip and cloud-computing processor will begin rolling out to Microsoft’s data centres early next year, initially powering the company’s own services such as Microsoft Copilot and Azure OpenAI Service. The company also announced new software, Copilot Studio, that lets clients design their own AI assistants.
“Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our data centres to meet the needs of our customers,” said Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group.