Microsoft Corp. has unveiled Maia 200, its latest in-house AI accelerator chip.
The chip features a redesigned memory system that pairs 216GB of HBM3E (High Bandwidth Memory 3 Extended), delivering 7 TB/s of bandwidth, with 272MB of on-chip Static Random-Access Memory (SRAM).
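For a sense of what those figures imply, the back-of-envelope sketch below (not from the article) estimates how the stated HBM3E bandwidth bounds memory-bound inference. The 400-billion-parameter model size and FP4 weight format used in the calculation are illustrative assumptions, not Maia 200 specifications.

```python
# Back-of-envelope sketch: what 216GB of HBM3E at 7 TB/s implies for
# memory-bound LLM inference. The model size and FP4 weight format below
# are illustrative assumptions, not figures from Microsoft.

HBM_CAPACITY_GB = 216      # on-package HBM3E capacity stated by Microsoft
HBM_BANDWIDTH_TBS = 7.0    # HBM bandwidth stated by Microsoft
SRAM_MB = 272              # on-chip SRAM stated by Microsoft

params_billion = 400       # hypothetical model size
bytes_per_param = 0.5      # FP4 = 4 bits = 0.5 bytes per weight
weight_gb = params_billion * 1e9 * bytes_per_param / 1e9

# A decode step that streams every weight once is bounded by the time
# needed to read the weights out of HBM.
step_ms = weight_gb / (HBM_BANDWIDTH_TBS * 1000) * 1000
tokens_per_s = 1000 / step_ms

print(f"Weights at FP4: {weight_gb:.0f} GB (fits in {HBM_CAPACITY_GB} GB of HBM)")
print(f"Bandwidth-bound decode step: ~{step_ms:.1f} ms -> ~{tokens_per_s:.0f} tokens/s per sequence")
```

Under those assumptions, streaming 200GB of FP4 weights once takes roughly 29 ms, so a single-sequence decode loop would top out near 35 tokens per second before any compute or interconnect overhead, which is why the bandwidth figure matters as much as the capacity.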
Microsoft said Maia 200 delivers three times the FP4 performance of Amazon.com, Inc.'s
Microsoft placed Maia 200 inside its heterogeneous AI infrastructure to serve multiple models, including OpenAI's latest GPT-5.2 models, and to support Microsoft Foundry and Microsoft 365 Copilot.
The Microsoft Superintelligence team will use Maia 200 for synthetic data generation and reinforcement learning to improve next-generation in-house models.
Microsoft deployed Maia 200 in its U.S. Central datacenter region near Des Moines, Iowa, with the U.S. West 3 region near Phoenix, Arizona, coming next and additional regions to follow.
Microsoft said Maia 200 integrates seamlessly with Azure and will be supported by a preview of the Maia SDK, a full developer toolkit designed to help developers build and optimize models for the chip.
The company said the SDK includes PyTorch integration, a Triton compiler, optimized kernel libraries, and access to Maia's low-level programming language, giving developers both fine-grained control and the ability to easily port models across different hardware accelerators.
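The article does not describe the Maia SDK's actual interfaces, but the combination of PyTorch integration and a Triton compiler suggests the portable-kernel workflow Triton already offers on other accelerators. The sketch below is a standard Triton vector-add kernel invoked from PyTorch; the CUDA device target and the kernel itself are generic illustrations of that workflow, not Maia-specific code.

```python
import torch
import triton
import triton.language as tl


@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one BLOCK_SIZE-wide slice of the tensors.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)


def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    # Launch one kernel instance per BLOCK_SIZE chunk of the input.
    out = torch.empty_like(x)
    n_elements = out.numel()
    grid = lambda meta: (triton.cdiv(n_elements, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n_elements, BLOCK_SIZE=1024)
    return out


if __name__ == "__main__":
    x = torch.randn(4096, device="cuda")
    y = torch.randn(4096, device="cuda")
    assert torch.allclose(add(x, y), x + y)
```

Because the kernel is written in Triton's block-level language rather than a vendor-specific instruction set, the same source can in principle be retargeted by different backend compilers, which is the portability argument Microsoft is making for the SDK.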
MSFT Price Action: Microsoft shares were up 1.67% at $473.72 at the time of publication on Monday, according to Benzinga Pro data.
