Hi folks, welcome to another episode of Who Said What? I’m your host, Krishna. For those of you who are new here, let me quickly set the context for what this show is about.
The idea is that we will pick the juiciest, most interesting comments from business leaders, fund managers, and the like, and contextualize things around them. Now, some of these names might not be familiar, but trust me, they're influential people, and what they say matters a lot because of their experience and background.
So I’ll make sure to bring a mix—some names you’ll know, some you’ll discover—and hopefully, it’ll give you a wide and useful perspective.
For all the sources mentioned in this video, don’t forget to check out our newsletter; the link is in the description.
With that out of the way, let me get started.
Infosys held its annual Investor Day recently. And a quote from the company’s management looked particularly interesting to us:
“I am going to talk about an interesting segment called energy, utilities, resources and services, and I call this interesting because AI has created a circular economy in energy, utilities and resources sector. While these sectors are heavy users and consumers of AI, they are also critical enablers of AI. If you look at utilities today, particularly electric utilities, they power and decide where the next data center should be and how fast the AI data centers can grow. In fact, there are views that electricity is the only limiting factor in growth of AI.”
There’s a lot to unpack here.
We’ve heard a lot about the immense power requirements of AI data centers. And, in our recent Daily Brief story on data center financing, we briefly highlighted the importance of getting power grid permits for an AI data center. But we didn’t really get into what a power grid connection for an AI data center actually entails.
A data center operator usually scouts sites not just based on where there’s plenty of land, but also on where power substations are close by. Once they decide on a site, they submit an interconnection request to the appropriate utility, asking for a certain amount of power capacity over time, along with a ramp-up schedule. And when the data center is built, the operator signs a long-term power purchase agreement, either with a utility, or even directly with a power-generating company.
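To make that concrete, here’s a toy sketch of what an interconnection request might capture. Every name and number here is hypothetical, purely for illustration — the point is just that the operator asks for capacity on a ramp schedule, and the utility checks each year’s ask against its spare grid headroom.

```python
from dataclasses import dataclass

@dataclass
class InterconnectionRequest:
    """Hypothetical sketch of what a data center operator asks a utility for."""
    site: str
    requested_capacity_mw: float     # peak draw at full build-out
    ramp_schedule: dict[int, float]  # year -> capacity (MW) needed that year

# Illustrative request: 300 MW at full build-out, ramping over four years
request = InterconnectionRequest(
    site="Hypothetical Site A",
    requested_capacity_mw=300.0,
    ramp_schedule={2026: 50.0, 2027: 120.0, 2028: 220.0, 2029: 300.0},
)

# The utility checks each year's ask against spare substation/transmission headroom
available_headroom_mw = {2026: 80.0, 2027: 100.0, 2028: 250.0, 2029: 350.0}
for year, needed in request.ramp_schedule.items():
    ok = needed <= available_headroom_mw[year]
    status = "OK" if ok else "constrained"
    print(f"{year}: need {needed:.0f} MW, headroom {available_headroom_mw[year]:.0f} MW -> {status}")
```

In this made-up example, 2027 comes up short — which is exactly the kind of mismatch that turns grid connections into a bottleneck.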
The bottlenecks lie in power management. For one, there’s only so much non-renewable energy to go around. And without adequate substations and transmission lines in place, even the power that exists can’t be delivered where it’s needed. In the US, data centers draw electricity from a grid that is still heavily powered by fossil fuels, especially natural gas. And policymakers worry that this demand might already be pushing up household power prices in some regions.
Naturally, renewables are expected to play a huge role here. In fact, they already power nearly half of India’s new data centers. However, renewables aren’t without their own problems. Without ample energy storage in place, it’ll be very hard to run a data center on solar panels at night.
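To see why storage is the sticking point, here’s some back-of-the-envelope arithmetic. All the figures are illustrative assumptions, not real project numbers — but the shape of the problem holds: a constant load through the night needs battery capacity equal to load times night hours, before even accounting for losses.

```python
# Back-of-the-envelope: battery needed to carry a data center through the night.
# All figures are illustrative assumptions, not real project numbers.
load_mw = 100.0              # constant IT + cooling load
night_hours = 12.0           # hours with no solar generation
round_trip_efficiency = 0.9  # rough figure for lithium-ion batteries

energy_needed_mwh = load_mw * night_hours  # energy delivered overnight
storage_required_mwh = energy_needed_mwh / round_trip_efficiency

print(f"Energy needed overnight: {energy_needed_mwh:,.0f} MWh")
print(f"Battery capacity required: {storage_required_mwh:,.0f} MWh")
# ~1,333 MWh of storage for a single 100 MW facility -- which is why
# running on solar alone, without massive storage, is so hard.
```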
However, beyond energy being a bottleneck for AI, what we actually want to address is the fusion of two business models that, at first glance, bear little resemblance to each other.
Tech has historically been faster-moving, more speculative, and more tolerant of risk, with profits driven by scale. Utilities, by contrast, are slow-moving, politically regulated, and heavily dependent on capex in long-lived physical infrastructure. Their revenues are closer to annuity-like cash flows, and they typically recover investments gradually through tariffs over decades, earning only a regulated return on that asset base.
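For a feel of how the utility model works, here’s a simplified “revenue requirement” calculation. It’s a textbook-style sketch with invented numbers, not any regulator’s actual tariff formula: the utility earns a regulated return on its asset base, plus depreciation and operating costs, all recovered through tariffs.

```python
# Simplified "revenue requirement" arithmetic for a regulated utility.
# Textbook-style sketch with invented numbers, not any actual tariff order.
rate_base = 1_000.0     # million $: capital invested in grid assets
allowed_return = 0.09   # regulated rate of return on that asset base
depreciation = 40.0     # million $/year, recovered over the asset's life
operating_costs = 150.0 # million $/year

revenue_requirement = rate_base * allowed_return + depreciation + operating_costs
print(f"Annual revenue requirement: ${revenue_requirement:,.0f}M")
# 90 + 40 + 150 = $280M/year, collected through tariffs over decades --
# the annuity-like cash flow described above.
```

Notice what this implies: the utility’s returns only materialize if the assets stay useful for decades. That’s precisely why tech’s fast-changing forecasts create tension.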
Tech companies present their power forecasts to utilities based on how fast they want to move, and utilities have to plan their investments accordingly. But if those forecasts turn out to be wrong, who bears the risk?
This fusion has created an interesting result. Now that AI has forced tech companies to spend far more on data center capex than they ever have before, AI data centers have come to share many features with power utilities.
Think about it. Running an AI data center forces a tech company to start thinking like an infrastructure operator. It has to forecast demand for computing power, plan capacity additions, secure reliable electricity supply, and build redundancy through backup power systems.
More importantly, the end product of an AI data center is computing power. While with electricity you pay per kilowatt-hour, with AI, the conversation shifts to paying per token, per API call, or even per GPU-hour. Much like power grids, data centers have to balance the supply of compute (from GPUs and servers) against the demand for it (which comes from AI labs).
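To show how per-kWh economics translate into per-token economics, here’s a rough conversion. Every number below is an assumption picked for illustration — the GPU’s power draw, the electricity price, the inference throughput — but it shows how a kilowatt-hour price resurfaces as a per-token price.

```python
# Rough conversion from electricity pricing (per kWh) to compute pricing
# (per token). Every number below is an illustrative assumption.
gpu_power_kw = 1.0           # power draw of one GPU plus its share of cooling
electricity_price_kwh = 0.10 # $/kWh
tokens_per_second = 2_000.0  # inference throughput of that GPU

tokens_per_hour = tokens_per_second * 3_600
energy_cost_per_hour = gpu_power_kw * electricity_price_kwh  # $/hour

cost_per_million_tokens = energy_cost_per_hour / tokens_per_hour * 1_000_000
print(f"Electricity cost: ~${cost_per_million_tokens:.3f} per million tokens")
# ~$0.014 per million tokens of pure electricity -- a kWh price showing up,
# several conversions later, as a per-token price.
```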
And when you take many data centers together, you have essentially built a grid for computing power. Perhaps it’s because of this realization that China is attempting to build a national, state-led network of cloud computing centers. After all, there will always be fluctuations in the demand and supply of computing power across regions. A national network would essentially ensure that there is a functioning market for compute, letting it flow from surplus regions to deficit ones. All of this sounds quite a bit like electricity, if not entirely.
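As a toy illustration of that “grid for compute” idea — the regions and numbers here are hypothetical — a national network would match surplus regions with deficit ones, much the way an electricity grid routes power:

```python
# Toy sketch of balancing compute across regions, the way an electricity
# grid moves power from surplus to deficit areas. Regions and numbers
# are hypothetical.
supply = {"North": 500, "East": 200, "West": 350}  # available GPU-hours
demand = {"North": 300, "East": 450, "West": 300}  # requested GPU-hours

surplus = {r: supply[r] - demand[r] for r in supply if supply[r] > demand[r]}
deficit = {r: demand[r] - supply[r] for r in supply if demand[r] > supply[r]}

# Greedy matching: route spare capacity to wherever compute is short.
for d_region, short in deficit.items():
    for s_region in list(surplus):
        if short == 0:
            break
        transfer = min(surplus[s_region], short)
        if transfer == 0:
            continue
        surplus[s_region] -= transfer
        short -= transfer
        print(f"Route {transfer} GPU-hours: {s_region} -> {d_region}")
```

Swap “GPU-hours” for “megawatt-hours” and this is, in miniature, how grid dispatch works.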
So far, we’ve spoken about how utilities fit into the AI data center story. However, we want to bring you back to the original quote that got us exploring these concepts. Take the circular economy side of things that Infosys talks about. In essence, it’s a thesis they’re betting on.
In Infosys’ view, AI growth is driving massive investment into energy, utilities, and resources companies. These companies themselves need AI to optimize grids, predict loads, manage complex supply chains, do predictive maintenance on assets, etc. They also have enormous ERP-heavy, legacy-heavy IT estates that need modernization.
Infosys is positioning themselves as the AI services partner for these companies, and they seem to have made considerable progress on this, saying:
“And the proof of the pudding is we are the AI partners for 15 of our top 25 clients in this segment. We do work across the AI framework. We have created digital twins for a very large oil and gas major to take asset telemetry and make the assets more intelligent, more automated, reduce their downtime. We have worked in AI-grade data engineering for a very large electricity provider to predict the load on the grid and ensure they invest on the grid where there is congestion to provide electricity to the data centers.”
Now, this is certainly an interesting thesis. However, is this the only angle that Indian IT is thinking about when it comes to AI data centers? After all, Infosys’ most important competitor seems to have a slightly different view.
A few months ago, TCS announced that they were building their own AI data center with 1 GW capacity. And just recently, TCS said that they aren’t stopping at one data center, either. Here’s what their CEO, K Krithivasan, said:
“We are having discussions with multiple other hyperscalers. We are very, very bullish. There is going to be lot of latent demand or unmet demand by 2030 so there is going to be a lot of investment required.”
As we covered before, the understanding is that this is a defensive move from TCS. With LLM usage increasing by the day, AI labs will require ever more computing power. However, it would be far-fetched to expect TCS to build their own LLM, considering that they aren’t a product company. Instead, they’ve decided to capture a share of the computing power supply itself.
TCS does not publicly detail a separate power procurement model for its data centers. But its disclosures show that the company draws on a broader operational energy mix that is renewable-heavy, and includes 10.2 MW of rooftop solar across its campuses.
The question, then, is whether Infosys will go down the same route. They already have non-AI data centers, 45% of which run on solar energy. It could also be argued that owning their own AI data centers would essentially double down on this thesis. However, unlike TCS, they’ve yet to make any announcement on AI data centers.
That’s it for this edition. Thank you for reading. Do let us know your feedback in the comments.



