How AI creates a scarcity play

Portfolio manager outlines how the AI buildout creates ‘rotating bottlenecks’ across a host of sectors, and views that trend as an advantage if played right

Artificial intelligence does not benefit from economies of scale, and that makes it fundamentally different from other technology mega-trends. Traditional software is the clearest contrast: a traditional program is expensive to build upfront, largely in the human capital required to write it, but each additional user costs very little. As more people pay to use the software, its economics improve and it benefits from scale.

Generative AI has no such scaling benefits. Each additional user requires the same amount of computing power as the one before them; arguably more, since learning to use an AI tool often means running extra prompts before a user figures out how to get good results. That’s why the development of AI requires an infrastructure buildout perhaps on a greater scale than the laying of global fibre-optic networks or even the construction of transcontinental railways.

For Nicholas Mersch, portfolio manager at Purpose Investments, the physical requirements of AI mean investors should view it almost as an emerging utility rather than just a new form of information technology. The secular trend of AI can therefore be viewed through the various forms of scarcity its infrastructure will encounter, and opportunities for investors can be found in accessing what’s scarce before it becomes abundant.

“I spend probably like 70 to 80 per cent of my time just trying to figure out where the rotating bottlenecks are and what is the golden screw,” Mersch says. “By the golden screw I mean: when you’re building this overall ecosystem, you have this one component that you need in order to turn it all on. And when you think about data centers, just the amount of different suppliers, different components, all these different things that you need, there are so many micro demand and supply balances that you have to match up with each other.”

Mersch explains that the early bottlenecks in AI were in graphics processing units (GPUs), which is what sent demand for Nvidia products, and Nvidia stock, skyrocketing. There were bottlenecks in custom silicon, in TSMC’s chip-on-wafer-on-substrate (CoWoS) advanced packaging, and in application-specific integrated circuits (ASICs). All of these components have experienced serious bottlenecks as demand surges past supply. Memory has more recently seen huge demand; it is largely produced by three companies, Samsung, SK Hynix, and Micron, all of which have seen their stock prices grow considerably in the past year.

Now, Mersch says, the new bottleneck is in optical hardware, needed to help data centres communicate with one another as they increasingly come to be treated as single compute units on a massive scale.

Looking ahead, energy may prove to be the next bottleneck for AI buildouts, in Mersch’s view, especially as AI hyperscalers elect to build their own power generation capacity. The load data centres place on local power grids has drawn a high degree of political pushback, and hyperscalers are seeking to circumvent the issue by building dedicated generation of their own. That, in turn, is causing bottlenecks in key power generation inputs like turbines, copper, and natural gas.

As this scarcity dynamic continues to ripple its way across the market and back again, the textbook strategy for an investor or advisor would be to simply find the next component that will be in high demand, buy access to it or shares in the company that makes it, wait for demand to far exceed supply, and sell when that investment is worth considerably more. Rinse and repeat.

All of that is far easier said than done, and while some investors and advisors may want to dissect the AI industry’s whole supply chain, finding these individual opportunities and price mismatches, Mersch notes that many elect to buy exposure to the AI hyperscalers in the hope of riding this trend.

Some may argue that the sheer scale of this buildout will end the way the dot-com bubble did, with the countless kilometres of “dark fibre” left in that crash’s wake as the cautionary example. Mersch, however, notes that the continued progress of AI models has made the buildout more akin to urban infrastructure. Just as adding a lane to a highway eventually results in more traffic, because the extra road space induces more demand, building new AI capacity will induce demand from users to fill that capacity, necessitating further construction. Even a focus on greater efficiency, Mersch says, will simply serve to induce further demand for AI.

One of the tests for that thesis will begin Wednesday with earnings reports from Meta, Amazon, Microsoft, and Alphabet, which should show how this demand for AI and computing power is being monetized and how much capacity these companies have for further spending on their AI compute buildouts. As advisors work to make sense of those earnings reports in the context of this wider scarcity trend, Mersch’s message is to focus on the inputs that matter most.

“Look at the physical economy. Focus on gigawatts just as much as gigabytes in terms of what’s being built out here,” Mersch says. “Throughout these large capex, massive investment cycles, you get these periods of dislocation of demand and supply, and picking out where you think the scarcity is will mean pricing power, which means earnings power.”
