Dr. Arman Shehabi: Around 2017, 2018, we see it start to increase, really going from a little less than 2% of U.S. electricity use, which is what it had been for a long time, to more than 4% last year. That is really the largest increase that we’ve seen in a long time. Then looking forward, that range here shows that it could increase from under 7% to about 12% of U.S. electricity use by 2028.
Jeff Johnston: That was Dr. Arman Shehabi from the Lawrence Berkeley Lab regarding the growth in data center electricity demand as a percent of the country’s entire electricity demand.
Hi, everyone, and welcome to the All Day Digital Podcast, where we talk to industry executives and thought leaders to get their perspective on a wide range of factors shaping the communications industry. This podcast is brought to you by CoBank’s Knowledge Exchange group, and I am your host, Jeff Johnston.
We’re going to do something a little bit different for this episode. Instead of me interviewing a guest like I normally do, I’m going to refer to a webinar that Teri Viswanath, CoBank’s energy economist, and I did earlier this year with two renowned energy scientists. Teri and I sat down with Dr. Shehabi and Andy Satchwell from the DOE’s Lawrence Berkeley Lab to discuss their latest data center report, which Congress had commissioned them to work on. For those of you who are not aware of the Lawrence Berkeley Lab, it is a world-renowned research lab that does outstanding work on data centers and their impact on the country’s energy complex.
Given all the media attention and the geopolitical and economic implications of data centers and AI, and of course, what all this means for our energy complex, I was thrilled to have this opportunity to sit down with these two gentlemen to talk about how things are playing out over the next few years, or at least how they see things playing out over the next three years. Let’s get into it. Let’s hear from Dr. Shehabi and get his take on where the Lawrence Berkeley Lab sees data center energy demand as a percent of total energy demand going over the next three years.
Shehabi: Around 2005, there was a big concern. There was a doubling of electricity use, and a lot of people said, “If it doubled in five years, it’s going to double again and keep doubling. After the next 20 years, we’re going to be using more than half of our electricity on data centers.” Congress at the time issued a request for the Department of Energy to submit a report using rigorous scientific analysis to understand where electricity use in data centers was going.
At that time, we started building what we call a bottom-up model that’s based on all the equipment in data centers. We were able to develop that and estimate where electricity use was going in those years. Then we did this again in 2016. What we saw was that things increased in the aughts and then started flattening out in the teens around 2010. Then the Energy Act of 2020 asked us to update our 2016 report to see where things have gone since then. That timing worked out well because even though the legislation went out at the end of 2020, it takes a little time for things to get moving.
We received funding and started working on this project last year, which was just in time for the growth of AI. I can show some of the initial results here. This is total electricity use of data centers in the United States. What you can see, starting from 2014, is that it’s pretty flat for the first few years, a continuation of it being flat in the 2010s. Around 2017, 2018, we see it start to increase, really going from a little less than 2% of U.S. electricity use, which is what it had been for a long time, to more than 4% last year.
That is really the largest increase that we’ve seen in a long time. Then looking forward, when we try to estimate electricity use going forward, there’s so much uncertainty that we create different scenarios. When we look at these different scenarios of what could happen, there are different combinations, and that allows us to give a range of where we see electricity use growing in the near future. That range here shows that it could increase from under 7% to about 12% of U.S. electricity use by 2028.
Johnston: One of the concerns rural energy and digital infrastructure companies have is around this concept of efficiencies, technological efficiencies, and the impact these efficiencies will have on future infrastructure demand. We know from history that as technologies mature, they become better, they become more efficient, and we can effectively do more with less.
This is a risk area for companies building out infrastructure because they are making investment bets, if you will, over a 10-, 20-, or 30-year time horizon. They’re trying to determine where energy and digital infrastructure demand is going to be over that time frame. Of course, when you overlay efficiencies on top of that, it complicates things a little bit. Let’s hear what Dr. Shehabi has to say about this.
Shehabi: On a high level, how I would think of it is that what we’re seeing is this large surge of electricity. You can look at this figure, especially at the high end going from 2024 to 2028. If things are trending up that way over five years, where is it going to be in another five years? And another five after that? You could see this going exponentially large. I don’t expect that to happen. I do expect the growth that I’m showing in the figure to happen.
Things will smooth out, even though I think demand will keep increasing. The reason is that different efficiency measures will come into play, because there is a strong market driver for that type of efficiency. It’s expensive to run these chips. There’s going to be a desire in the future for things to get more efficient. Historically, that is what we have seen. When there was growth in the aughts, different efficiency measures came into play that slowed that growth down.
Johnston: Taking this conversation one step further and getting a little more specific here, what we are starting to see are new types of semiconductors being introduced into the AI ecosystem that are exponentially more efficient than what we’ve been dealing with so far. Most of the semiconductors that have been running AI models in data centers so far are what we call graphics processing units, or GPUs.
Now we’re starting to move into the next phase of AI, where we’re seeing application-specific integrated circuits, or ASICs for short. These ASIC chipsets are starting to play a much bigger role in generative AI and AI applications. Now, depending on who you talk to and what applications these chipsets are running, they’re significantly more efficient than GPUs have been in the past. The forecasts we’ve seen from the likes of Broadcom, which, by the way, is a major ASIC semiconductor company, clearly suggest that over the next several years, the growth in AI semiconductors is going to come from much more efficient chipsets. I wanted to get Dr. Shehabi’s take on how we should think about this in terms of overall energy demand.
Shehabi: We think about ASICs and whether they are something that could come into play to reduce this electricity demand compared to GPUs. That’s one avenue. I think that’s possible. Right now, ASICs are a pretty small part of the market because they’re very expensive, and as the name says, they’re very specific to a certain application. They’re not as universally applicable as GPUs are. In some ways, the evolution of Bitcoin gives an example of where things could potentially go.
Bitcoin was something that, initially, college kids were running in their dorms, on their desktop computers, on their CPUs. Then as demand and the computational requirements increased, it moved to GPUs. Because the faster you can run things, the better chance you have of making money in Bitcoin, GPUs got faster and faster, and then it moved to ASICs, and it’s been moving to quicker and quicker ASICs since then. It’s possible that could happen more broadly in the AI space. GPUs are also improving, and there’s a lot of potential for those GPUs to be used in different ways that are more efficient as well.
Johnston: I couldn’t finish my conversation with Dr. Shehabi without mentioning DeepSeek. Now, for those of you who aren’t familiar with DeepSeek, it’s a Chinese AI initiative that built a model comparable to ChatGPT, but the firm that built it claims it was able to do so with significantly fewer resources and in a significantly shorter amount of time. They didn’t need all the chipsets that we’re using.
They were able to do it really quickly and, so they claim, a heck of a lot cheaper than what it cost OpenAI to build ChatGPT. Again, this continues the efficiency conversation, which I know is an area of concern for infrastructure companies over time. I wanted to get Dr. Shehabi’s take on DeepSeek and how he sees it impacting energy demand and capital demand over the next several years as it relates to AI.
Shehabi: I do expect there to be more efficient ways for these GPUs to be used, and DeepSeek looks to be one that did that. DeepSeek has a limited amount of computing power, and under those constraints, they found a way to be innovative. That’s what it looks like right now. It does look like it’s specific to training and doesn’t have that efficiency in inference, from what we understand so far, but it’s still really early stages. What I would say is I don’t think it’s going to completely change where things are going, and it doesn’t change these projections going forward.
I think it’s a sign of where things will go in the future, an example of something that will help slow this growth beyond 2028. If a data center is going to get built out, with all the capital that’s going into it, and they’re able to, let’s say, come up with new code that allows them to run training at one-third the power, or do it in one-third the amount of time, they’re not going to just turn the equipment off for the other two-thirds. They’re going to do three times as much training. It wouldn’t affect the overall power demand.
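The rebound logic Dr. Shehabi describes can be sketched with back-of-the-envelope arithmetic. The numbers below are purely hypothetical, chosen only to illustrate the point that a 3x software efficiency gain on a fixed, already-built-out facility changes output rather than power draw:

```python
# Hypothetical figures for illustration; not from the LBNL report.
facility_power_mw = 100.0        # fixed capacity the operator has built out
energy_per_run_before = 30.0     # MWh consumed per training run, pre-optimization

# A software optimization cuts the energy per training run to one-third.
energy_per_run_after = energy_per_run_before / 3

# At full utilization, the facility sustains a fixed energy budget per hour,
# so cheaper runs mean more runs, not less power.
runs_before = facility_power_mw / energy_per_run_before
runs_after = facility_power_mw / energy_per_run_after

print(f"Training runs per hour before: {runs_before:.1f}")   # ~3.3
print(f"Training runs per hour after:  {runs_after:.1f}")    # ~10.0
print(f"Power demand in both cases:    {facility_power_mw} MW")
```

Because the capital is already sunk, the operator keeps the facility at capacity either way: output triples while the grid sees the same load.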
Johnston: My take after spending time with Dr. Shehabi and reading his report is that we are dealing with unprecedented levels of demand, and because of that, a lot of uncertainty. It’s this uncertainty that limited Dr. Shehabi and his team to forecasting the AI infrastructure market out only three years, because once you get past that point, things get really murky, and it gets tough to predict anything with a high degree of accuracy.
Look, despite that uncertainty, it sure feels like things are pretty well baked for the next few years, which means that demand should continue to outstrip supply during that timeframe.
Hey, thanks for joining me today. A special thanks goes out to Tyler Herron and Christina Pope who make this podcast a reality. Watch out for the next episode of the All Day Digital podcast.