CUTTING COSTS: Customers are evaluating ways to optimize cloud spending. (Photo: iStock)

The era of cloud optimization is upon us

As everyone prepares to jump headlong into generative AI and large language models, cloud will continue its strong performance.


The big three cloud providers (AWS, Microsoft, Google) all reported earnings last week, and the word on every cloud executive’s lips was optimization. To wit, “Customers continue to evaluate ways to optimize their cloud spending in response to these tough economic conditions.” That was from AWS’ earnings call. From Microsoft, “Customers continued to exercise some caution as optimization … trends … continued.” And Alphabet/Google joined in the chorus of “slower growth of consumption as customers optimized GCP costs reflecting the macro backdrop.”

You can be forgiven for rolling your eyes at this “optimization” euphemism. In previous recessions we were a bit more straightforward: Customers are cutting costs in the face of uncertainty. Except, according to these same execs, customers aren’t cutting their net cloud spend. Instead, they’re cutting in some areas so that they can increase spending elsewhere. And which “elsewhere” is seeing disproportionate interest? Artificial intelligence.

Reallocating resources
We may not be in an official recession, but you’d never know it from listening to Amazon, Microsoft, and Alphabet walk through their earnings. The companies continue to grow rather than contract, even though they’re growing at surprisingly slow rates. AWS spluttered to 16% growth, but Microsoft Azure (27%) and Google Cloud (28%) didn’t fare much better, growing on much smaller revenue bases. These are all way off these companies’ metronomic growth in past quarters.

What’s to blame? Optimization.

That’s the word each company used throughout their earnings calls to describe customer behavior. These companies are fierce competitors for cloud workloads, but they seemed to be colluding on nomenclature. Amazon used the “O” word 12 times during its call; Microsoft, 15 times; and Google, 12 times.

Of the three cloud vendors, Amazon was perhaps the most emphatic about the optimization trend. According to Amazon CEO Andy Jassy, “Customers are pretty explicitly telling us that this is not a cost-cutting effort where [they spend] less money on technology or on the cloud.” Rather, he continued, “This is [companies] reprioritizing what matters most to [their] business … and trying to reallocate resources so [they] can build new customer experiences.”

This has long been the promise of cloud: enabling enterprises to move with greater agility as they build new applications to care for their customers. Because of this, as I highlighted in June 2022, CIOs will cut spending in many areas, but cloud is somewhat recession-proof. The cost of sitting out potential innovation and digital transformation far exceeds any short-term benefit from saving up dimes and pennies.

Among various uses of this optimized cloud spend, one particular area stands out. As Jassy stressed: “Few folks appreciate how much new cloud business will happen over the next several years from the pending deluge of machine learning that’s coming.”

Survival of the cloudies

I’ve speculated that the big clouds increasingly view large language models (LLMs), specifically, and AI, generally, as a battleground for new workloads. This week the cloud CEOs confirmed it. The only word used more than optimization on the calls was AI, with Alphabet mentioning AI 52 times, Microsoft 36 times, and Amazon 12 times (though one mention was quite lengthy, as Kif Leswing points out). Given the potential customer interest in AI, each of the cloud giants is spending massive piles of cash to build up its capabilities.

NEXT: Generative artificial intelligence and machine learning will be the next generation of workloads in the cloud. (Photo: iStock)

The capital expenditures necessary to fund LLMs and AI are so big that this isn’t an area where we’re likely to see startups disrupt the big vendors. The closest thing to a startup in the space is OpenAI, but it’s backed by billions of dollars from Microsoft and others. Enterprises looking to catch the AI wave are almost certainly going to need help from the big cloud vendors. “We will continue to invest in our cloud infrastructure, particularly AI-related spend, as we scale to the growing demand driven by customer transformation,” Microsoft CEO Satya Nadella noted. “And we expect the resulting revenue to grow over time.”

Don’t expect those investments to pay immediate returns, Jassy warns: “Those [AI] projects … take time to build.” There’s a gestation period, he goes on, during which companies must not only define what they’d like to build but also plan where they can shutter or deprioritize existing workloads. “Folks don’t realize the amount of nonconsumption right now that’s going to [turn into consumption] and be spent in the cloud with the advent of large language models and generative AI,” he suggests. Those workloads aren’t going to run on premises. Not most of them, anyway.

Think of our current recessionary environment as the optimization calm before the cloud spending storm. Enterprises are in experimentation mode, cutting back in some areas while positioning their AI bets for future growth. Now, as ever, is a great time to be using or selling cloud.