AI requires enormous amounts of power to run, and as more and more AI workloads come online, demand will only grow. According to a report from Schneider Electric, AI consumed 8% of the total power demanded by data centers in 2023, a figure expected to grow to 15-20% by 2028.

In the most recent episode of our podcast, Get With IT, we spoke with Jason Carolan, chief innovation officer at Flexential, about the possibility of using nuclear energy to meet these growing demands.

Here is an abridged version of that conversation:

David Rubinstein, editor-in-chief of ITOps Times: What’s causing all this need for extra power? And what are some of the options to get that generated?

Jason Carolan: It’s a very interesting time in the data center industry, and I think in the technology industry altogether. What has happened over the course of the last couple of years, with the birth of generative AI, or conversational AI, really is a game changer. The adoption of AI has been a slow rollout since the ’50s, or even earlier than that with Turing, but what has really happened in the last couple of years is what I call the browser moment for AI. The Internet was around for probably 20 years, and we had these .edu email addresses that we didn’t know what to do with, and we’d use things like Gopher and FTP. Then the browser came along, and all of a sudden it became very useful. And I think that’s what’s happened with GenAI.

DR: Of course that puts a lot of pressure on data centers to keep things moving. One of the things that you were talking about would be the use of nuclear projects, and a big number of them are starting up now. Why do you think that that’s a good option? I’m sure a lot of people still have Three Mile Island and Chernobyl in the back of their heads. What’s different now that makes this a viable option?

JC: There’s been a lot of money and innovation put into small scale nuclear reactors over the course of the last four or five years, and there are several projects underway. In fact, I just saw one last week, Bill Gates was working with TerraPower up in Wyoming to build a facility up there. There continues to be almost open-source-level innovation in the space because people are starting to share data points and share operational models. 

To make them easier and safer to build, these small modular reactors (SMRs) are generally in the 100 to 300 megawatt range. Compare that to a large-scale nuclear plant like Vogtle in Georgia, home to the only two active large reactor projects being built in the US right now (one unit came online a few months ago and the other is being completed); those are in the 1,000 megawatt range.

But the real driver, the real issue here, is that energy consumption in the United States has been pretty flat over the course of the last two decades, and part of that was perhaps COVID slowing things down. But now we’re at this point, whether it’s AI or just electrification in general, where we’re really running out of capacity. In fact, there are states where large-scale electrification builds, as well as data center builds, have basically stopped because there isn’t power capacity available.

So I think really the only way to meet the growing demand is nuclear power, which could add another million gigawatt hours to the grid in the next four or five years, about 20% more capacity than we’ve ever had before. With demand moving so rapidly, nuclear really is the only answer. The challenge with nuclear, though, is that it takes five to seven years to build.

The SMR world is still pretty new, and a lot of the designs have not been approved by the Nuclear Regulatory Commission (NRC). So I think we’re going to be in a little bit of a power gap over the next two to three years as we continue to scale up nuclear, which is considered clean energy in some way, shape or form. It’s about as clean as solar; it really doesn’t emit much, and it’s becoming safer. It’s certainly way safer than coal, and way safer than biomass.

So yeah, I think the time has come to consider how we start being more open in our thinking here and really look to deploy these in a very safe way and more of a micro grid type of topology.

DR: Here on Long Island where I am, we have this notion of NIMBYism, which means “not in my backyard.” In fact, we’ve had a couple of projects where the utilities wanted to site lithium battery storage facilities, and the towns have been slow to approve them because residents are up in arms: there have been a couple of fires, they’re very hard to extinguish, and toxins get released, among other things. So it looks like it’s going to be a PR battle for companies that want to build these things, to convince the public that they’re safer than Chernobyl.

JC: Yeah, 100%. I think that’s been one of the challenges, and exactly why development here has been slow, and why there really haven’t been a lot of active large-scale projects in the US.

What’s interesting with the current rollout of AI capacity is that it doesn’t seem to matter as much where it’s located, which is a little bit of a nuance in the data center world, because typically data centers were somewhat close to network access points (NAPs) like Virginia, LA, or Dallas, and Portland has become kind of an alternate NAP in the northwest.

But what we’re seeing with AI is that available power capacity is the denominator that needs to be solved first, and then the network will sort of come to it. And I think that’s very different from what we’ve seen over the course of the last 10 years or so.

I also think we’re very focused right now, in the world of GenAI, on training, and training requires tons of GPUs. In fact, the first ChatGPT model used about 10,000 GPUs for its training. But then we get through this large language model battlefield, where you have Llama trying to compete against OpenAI’s ChatGPT and against Gemini, and everybody’s kind of battling to build the best model, which is great, because I think it drives innovation and drives feature sets.

Eventually, most enterprises are going to essentially license a model and use it for inference and retrieval-augmented generation. So we’re kind of in this training paradigm right now, where large-scale capacity is super important and location doesn’t matter. But when you think about inference, actually using the AI to solve problems, it could happen much, much closer to wherever the manufacturer is. So I think we’ll see this dispersion of AI technology across the planet that gets much, much closer to the user: your laptop, for example, or what Apple announced. So it’s an interesting evolution of the very large-scale training requirements we’re seeing right now, and then the evolution of inference as we go along.

DR: Until some of these other alternative sources come online, what’s going to happen during that period of where there’s that gap, where people are using more power than actually is available? What’s going to happen with AI?

JC: I think over the course of the next couple of years there are still enough constraints in the system. The electrification game is interesting, because we’re all kind of competing for the same raw materials, right? Whether you’re building a data center that needs steel or one that needs generators, it’s the same raw materials that go into building transmission infrastructure for power plants. So we’re all chasing raw material that is really tough to get, and we’re just spending a lot more time planning.

First of all, the industry is getting a lot farther out in front with the power companies to understand their load, their capacity, and what their plans are. In the past, we just sort of assumed the power was going to be there, and you can’t do that anymore. So the planning horizon is now really two to three years out to build a scalable facility.

And I think what will happen in a lot of these regions, for a period of time, is they’ll run out of power for two or three years while they bring additional capacity online. Solar is still probably the largest buildout actively happening in the US. It will take some time to build; then they’ll come back online with more capacity, and we’ll eat that up quickly.

So it’ll just be this continuous cycle of being smart in your planning, talking more, and getting out in front of it.

I just saw this in the southeast: the government is coming in and opening up these regions to more competition, which I think will ultimately help drive more capacity in those regions as well. Most of them have been operated as quasi-governmental utilities, so maybe reducing regulation and allowing more open competition will help.

But you’re 100% right on NIMBYism, and I think it’s really going to take the industry working together to prove that this is operationally safe. The exciting thing with AI is that you can use it to get better at predictive maintenance and predicting failures. It’s almost like using AI to design AI, which I think is a huge benefit we can use here as well.

