Microsoft unveils its own AI chips, prioritizing cost-efficiency

Microsoft (MSFT.O) announced two custom-designed computing chips on Wednesday, joining other tech firms in bringing key technologies in-house to address the high cost of delivering artificial intelligence services. The company stated that it does not plan to sell the chips, but instead will use them to power its own subscription software offerings and as part of its Azure cloud computing service.

At its Ignite developer conference in Seattle, the company introduced a new chip called Maia, designed to speed up AI computing tasks and provide a foundation for its $30-a-month “Copilot” service for business software users, as well as for developers wanting to make custom AI services. The Maia chip was specifically designed to run large language models, a type of AI software that underpins Microsoft’s Azure OpenAI service and is the product of Microsoft’s collaboration with ChatGPT creator OpenAI.

Microsoft executives plan to address the high cost of delivering AI services by funneling most of the company’s efforts to put AI into its products through a common set of foundational AI models. The Maia chip is optimized for that work, according to Scott Guthrie, executive vice president of Microsoft’s cloud and AI group. The company also announced that next year it will offer its Azure customers cloud services running on the newest flagship chips from Nvidia (NVDA.O) and Advanced Micro Devices (AMD.O).

In addition to the Maia chip, Microsoft announced a second chip named Cobalt, designed to be an internal cost saver and an answer to Microsoft’s chief cloud rival, Amazon Web Services (AWS). The new chip is a central processing unit (CPU) built with technology from Arm Holdings. Microsoft also aims to sell direct access to Cobalt to compete with AWS’s “Graviton” series of in-house chips.

AWS has said it will continue to innovate to deliver future generations of its own chips with better price-performance for customer workloads. Microsoft disclosed few technical details that would allow its chips’ competitiveness to be gauged against those of traditional chipmakers. Rani Borkar, corporate vice president for Azure hardware systems and infrastructure, confirmed that both chips are made with 5-nanometer manufacturing technology from Taiwan Semiconductor Manufacturing Co.

The company also stated that the Maia chip would be strung together with standard Ethernet network cabling, rather than a more expensive custom Nvidia networking technology that Microsoft used in the supercomputers it built for OpenAI.
