“The downside, of course, is the margin stacking that results,” Harrowell said. “AWS is not the cheapest LLM API provider, and their margin is layered on top of Anthropic’s. Microsoft will want to bring it on-platform as soon as they understand it and have the capacity. They seem to be buying capacity in every direction at the moment, with the deal with Nebius possibly reflecting delays in the Maia AI-ASICs.”
This means that delays in Microsoft’s Maia AI ASICs, the custom chips meant to underpin Azure’s AI capacity, may be forcing the company to rely on AWS to run Anthropic’s models.
However, this kind of cooperation between competitors is not unusual in the tech sector, according to Sharath Srinivasamurthy, research vice president at IDC. Apple, for instance, sources display panels from Samsung even though the two are direct competitors in the smartphone market.
This story originally appeared on Computerworld.