Microsoft is reportedly developing its own AI chips to train large language models and avoid a costly dependence on Nvidia. According to The Information, Microsoft has been working on the chips in secret since 2019, and some Microsoft and OpenAI employees already have access to them to test how they perform with the latest large language models, such as GPT-4.
Nvidia has emerged as the leading supplier of AI server chips, and estimates suggest OpenAI may need more than 30,000 of its A100 GPUs to commercialize ChatGPT. Nvidia's H100 GPUs are selling for over $40,000 on eBay, underlining the demand for high-end AI chips.
Microsoft, meanwhile, is looking to cut costs by bringing its AI hardware efforts in-house. The company is accelerating work on a project codenamed Athena to build its own AI chips. Microsoft has not yet decided whether to offer the chips to Azure cloud customers, but it plans to make them more widely available within Microsoft and OpenAI as early as next year. Microsoft also reportedly has a roadmap for the chips that includes multiple future generations.
Microsoft’s AI chips won’t be a direct replacement for Nvidia’s, but they could reduce costs as the company pushes to bring AI-powered features to Bing, Office apps, GitHub, and elsewhere.
Microsoft has been working on its own ARM-based chips for years. Bloomberg reported in late 2020 that Microsoft was considering designing its own ARM processors for servers, and perhaps even a future Surface product. We haven’t yet seen those ARM-based server processors appear, though Microsoft has designed custom chips for the Surface Laptop and Surface Pro X.
Microsoft would be the latest tech giant to develop its own AI chips. Amazon, Google, and Meta all have their own, but most companies still rely on Nvidia chips to run the latest large language models.