Highlights:

  • Microsoft has only released information about one in-house AI processor thus far, the Maia 100, which made its debut in November.
  • In addition to increasing the number of data centers it operates in Sweden, Microsoft plans to donate a portion of the USD 3.2 billion project’s funding to regional educational programs.

Microsoft Corp. will spend USD 3.2 billion (33.7 billion Swedish crowns) to expand its data center capacity in Sweden.

The corporation will spread the investment over two years as part of an initiative unveiled recently. Microsoft launched its first Azure region, or data center cluster, in Sweden three years before this announcement. The cluster comprises three facilities, one each in the cities of Staffanstorp, Sandviken, and Gävle.

Microsoft will use a portion of the project’s funding to expand the three locations. As part of this initiative, the company plans to support artificial intelligence workloads by deploying 20,000 new graphics processing units. According to reports, Microsoft President Brad Smith said the GPUs will be chips similar to the Nvidia H100.

Nvidia Corp.’s current flagship data center GPU is two generations ahead of the H100. The chip maker released the H100 in 2022 and, last November, unveiled the more capable H200, which features a larger memory pool. The Blackwell B200, a GPU Nvidia unveiled in March, is faster still and can train AI models several times quicker.

Despite this, demand for the H100 remains strong. In January, about two months before the Blackwell B200’s launch, Meta Platforms Inc. disclosed plans to purchase 350,000 units.

Not all the chips used in Microsoft’s ambitious plan to install 20,000 additional AI accelerators in its data centers in Sweden will come from Nvidia. Smith stated, “You will see us increasingly diversify the chips that we have. We’ve been public about being very bullish on Nvidia but also AMD and ultimately some of our own chips as well.”

Microsoft has only released information about one in-house AI processor thus far, the Maia 100, which made its debut in November. It is billed as one of the largest processors manufactured on a five-nanometer process node. Data moves into and out of Maia 100 processors over a custom Ethernet-based network protocol, which Microsoft claims can handle 4.8 terabits of traffic per second per accelerator.
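To put the stated per-accelerator figure in context, the arithmetic below scales it up to a whole deployment. This is an illustrative sketch only: the 4.8 Tb/s number comes from Microsoft's claim above, but the deployment size is a hypothetical example, not a figure from the announcement.

```python
# Illustrative arithmetic: aggregate fabric traffic implied by Microsoft's
# claimed 4.8 Tb/s of network traffic per Maia 100 accelerator.

PER_ACCELERATOR_TBPS = 4.8   # terabits per second, per accelerator (claimed figure)
ACCELERATORS = 1_000         # hypothetical deployment size, for illustration only

aggregate_tbps = PER_ACCELERATOR_TBPS * ACCELERATORS
aggregate_pbps = aggregate_tbps / 1_000  # convert terabits/s to petabits/s

print(f"Aggregate traffic: {aggregate_tbps:,.0f} Tb/s ({aggregate_pbps:.1f} Pb/s)")
```

Under that assumption, a thousand accelerators would collectively push 4,800 Tb/s, which is why custom network protocols and cooling become design constraints at data center scale.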

AI processors rely on liquid cooling to dissipate heat. When it unveiled the Maia 100 last year, Microsoft revealed that its data centers weren’t built to support the large liquid chillers typically required for such chip deployments. The company overcame the challenge by developing a custom heat dissipation system.

That specialized cooling technology could make it possible to install the Maia 100 in Microsoft’s Swedish data centers. The company may also deploy more custom chips at the locations in the future. At its Ignite conference last November, Microsoft unveiled the Cobalt 100, an internally developed central processing unit, and disclosed that it is working on a second-generation Maia accelerator.

In addition to increasing the number of data centers it operates in Sweden, Microsoft plans to donate a portion of the USD 3.2 billion project’s funding to regional educational programs, with the goal of providing AI training to 250,000 people by 2027. The announcement comes only weeks after competitor Google LLC unveiled a USD 1.1 billion plan to expand its data center site in Finland.