Microsoft has taken the wraps off a long-gestating custom silicon effort, introducing two new server chips. Unveiled at Microsoft Ignite, the Azure Maia AI Accelerator and the Azure Cobalt CPU represent the company’s foray into designing its own chips. Notably, the Azure Cobalt CPU is Arm-based, a break from the Intel x86 processors that have long underpinned the Microsoft-Intel dominance of the computing market.
Microsoft sought feedback on the Azure Maia from OpenAI, which tested the chip with its own models. OpenAI CEO Sam Altman said the collaboration helped shape Azure’s updated AI infrastructure and will also let customers train more capable models at lower cost.
The custom-designed chips are meant to let Microsoft optimize its infrastructure rather than rely on third-party silicon. The company likens the approach to building a house: it controls every design choice and can tailor each element to its cloud and AI workloads. The chips will be mounted on custom server boards and housed in purpose-built racks that fit inside existing Microsoft data centers. By co-designing hardware and software for maximum efficiency, Microsoft expects to unlock new capabilities and opportunities.
Microsoft intends to deploy the Maia 100 AI Accelerator to power some of Azure’s largest internal AI workloads, and it asserts that both the accelerator and the Azure Cobalt CPU will improve efficiency and performance. The chips are scheduled to arrive in Microsoft’s data centers early next year, supporting services such as Microsoft Copilot (formerly Bing Chat) and Azure OpenAI Service.
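For customers, that infrastructure is exposed through the Azure OpenAI Service API rather than the chips themselves. As a rough illustration not drawn from the announcement, a minimal call through the official openai Python SDK (v1.x) might look like the sketch below; the endpoint, API key, deployment name, and API version are placeholders that depend on your own Azure resource.

```python
from openai import AzureOpenAI  # official OpenAI SDK with Azure support

# Placeholder values: substitute the details of your own Azure OpenAI resource.
client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_key="YOUR-API-KEY",
    api_version="2024-02-01",  # check your resource for the supported version
)

# "model" is the name you gave your deployment in Azure, not the base model name.
response = client.chat.completions.create(
    model="YOUR-DEPLOYMENT-NAME",
    messages=[
        {"role": "user", "content": "Give a one-sentence summary of custom AI accelerators."}
    ],
)
print(response.choices[0].message.content)
```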