Microsoft is reported to have teamed up with AMD to support the chipmaker’s expansion into artificial intelligence processors. According to the report, Microsoft is providing engineering resources to help AMD develop AI processors.
Bloomberg’s sources claim that AMD, in turn, is helping Microsoft develop its own AI chips, codenamed Athena. Microsoft’s silicon division reportedly has around 200 employees working on the project, and the company has reportedly already invested $2 billion in its development. Microsoft spokesperson Frank Shaw, however, has denied AMD’s involvement with Athena.
If we receive confirmation from AMD or Microsoft, we will update the story.
Nvidia’s dominant market share in GPUs has made it the leader in AI chips
As AI services such as OpenAI’s ChatGPT become more popular, demand is growing for processors capable of handling the massive computational workloads they require. Nvidia dominates this market thanks to its leading position in graphics processing units (GPUs), the specialized chips that provide the necessary computing power, and there is currently no real alternative. This poses a problem for companies such as Microsoft, which relies on Nvidia’s expensive chips to power the AI services in its Azure cloud.
Nvidia’s CUDA libraries have underpinned much of the progress in AI over the last decade. Despite AMD’s position as a major competitor in the gaming industry, it still lacks a viable alternative to the CUDA ecosystem for large-scale machine learning deployments. AMD wants to be in a position to take advantage of the AI boom. On the chipmaker’s earnings call on Tuesday, Chief Executive Lisa Su stated, “We are excited about our AI opportunity; this is the number one strategic goal. We are at the beginning of the AI computing age, and adoption and growth rates are faster than ever before.”
Su says AMD is well positioned to provide partially customized chips to its largest customers for their AI data centers. “I believe we have a complete IP portfolio that includes CPUs and GPUs as well as FPGAs, adaptive SoCs, DPUs and a highly capable semi-custom team,” Su said. The company also sees “higher volumes beyond game consoles.”
AMD is confident that its Instinct MI300 data center chip can be adapted to generative AI workloads. Su said the MI300 is well suited for both AI and HPC workloads. “With the recent interest in generative AI, I’d say that the pipeline for MI300 has expanded significantly over the past few months. We are excited about this, and we’re investing a lot of resources.”
Microsoft will continue to work closely with Nvidia as it tries to secure more of the company’s processors, whose availability is further constrained by Nvidia’s near-monopoly on GPU hardware. Microsoft and AMD aren’t the only companies trying to create their own AI chips: Google built its own TPU chips to train its AI models, and Amazon developed Trainium chips for training machine learning models.