The Data Center Surge Rides on Rural Power

Episode ID S3E02
February 15, 2024

Thanks to the emergence of artificial intelligence, demand for data centers is red hot. Data centers run on power-hungry chips, so they are being built in rural America – where the electricity is. In this episode of All Day Digital, Alan Bezoza, a managing director at DigitalBridge Investment Management, explains how the main requirement for cloud computing has switched from adequate building space to power.

Transcript

Alan Bezoza: Then with the emergence of AI, we've gone, as I said in last quarter's letter to our investors, from 10 to 11. It's gotten to the point where the amount of demand being required is ludicrous – again, using third-party data centers because of the time-to-market advantage that third-party data centers bring.

Jeff Johnston: That was Alan Bezoza, managing director and portfolio manager for DigitalBridge, talking about the AI-induced data center demand they are seeing in their investment portfolio.

Hi, I’m Jeff Johnston and welcome to the All Day Digital podcast where we talk to industry executives and thought leaders to get their perspective on a wide range of factors shaping the communications industry. This podcast is brought to you by CoBank’s Knowledge Exchange group.

Migrating applications and processes to the cloud has been an ongoing theme for several years as companies look for ways to reduce capex and gain more efficiencies in their operations. But the recent emergence of AI has put this trend into overdrive. And the associated impact on power, pricing and the digital infrastructure ecosystem is nothing short of profound.

Alan is a seasoned communications investor with a deep understanding of current market conditions and of how the entire industry is linked together. Be it applications, semiconductors, infrastructure equipment or the power complex, he's got his arms around it.

So, without any further ado, pitter patter, let’s hear what Alan has to say.

Alan Bezoza, great to see you again. Welcome to the podcast. It's a pleasure to have you here with us today.

Bezoza: Hey, Jeff. How are you? Thanks for having me too. Thanks for thinking about me.

Johnston: Yes. No doubt. Well, for anything telecom-related, from an investment or a technology perspective, you're at the top of my list of well-versed people to have conversations with, so it's a pleasure to have you here today. Hey, before we get into the specifics around data centers and what's happening in the market, I think it might help listeners to get a better perspective on where you're coming from. Just spend a couple of seconds talking about yourself, and also about DigitalBridge and what you guys are doing there.

Bezoza: Again, thanks for having me. Just to put a frame of reference around who we are: I run a public equity fund that is focused up and down the big food chain of communications and cloud. It includes the telecom operators and the cloud operators, all the way down through the technology vendors – data networking, comm equipment – through their suppliers, through to semiconductors like Qualcomm, and Broadcom, and NVIDIA, and everything in the middle.

I've been covering the same universe of companies for almost 20 years now, which is pretty scary to think about. My firm as a whole, DigitalBridge, we're kind of a big deal in a small circle: a $180 billion-plus private equity firm.

We focus on a very specific, niche-y, but very important area: digital infrastructure. That includes things like mobile towers, things like data centers, which we'll talk about, and also telecom companies as well. If you think about the universe where the private equity guys sit and invest, we're one of the largest data center companies in the world, with portfolio companies such as Vantage, which is mostly hyperscaler-related; DataBank, which is enterprise and co-lo; and then we also took Switch private recently – maybe last year, I think it was.

What I do, from a public equity point of view, is invest up and down the food chain, with DigitalBridge's investment universe at the center of my universe – towers, data centers, and fiber companies. We're investing in their customers, their competitors, and the suppliers and the suppliers' suppliers down the food chain. We get a lot of insight and a lot of information from the private portfolio companies, and then use that to inform our judgment in the public equity context.

Johnston: That's great; you're really sitting in the catbird seat, looking into this deeply integrated ecosystem of data centers and telecom and semiconductors and applications. For that reason and others, I'm so happy that you agreed to join us here today.

Let's start off with a very high-level overview of the data center market, and maybe take it from the perspective of its evolution over the last five, 10 years, because I think a lot has changed – whether we think about the move from on-prem and co-lo to cloud adoption, or the role that hyperscalers are playing in this. Again, kind of high-level, Alan, just give us your perspective on how this market has evolved.

Bezoza: It's changed quite a bit. I'll frame it as pre-COVID and post-COVID, because we've been going down this path from on-prem computing for a while. Your listeners are not novices, so I won't go into the details of the move to the cloud – they're sophisticated. You have this move to the cloud that's been happening for many, many years at a steady pace. You have companies like Microsoft, Google, Amazon, even Meta, and others that have become bigger and bigger in terms of spending in the data center landscape.

Historically, I would say that roughly 70% of all data centers built by the hyperscalers were done internally. They would develop or buy their own land, develop their own space, build their shells, secure power, and then bring data centers online. That was the pace for many years, and then maybe about five years ago there was a change: they all wanted to go faster. The move to the cloud was accelerating – Amazon's AWS was growing at a very rapid clip, Microsoft had Office 365, Google had its internally based applications, whether it's YouTube, et cetera.

Now the need for data center capacity accelerated – the hyperscalers' requirements accelerated. They started going the other way and using more third-party data centers like Digital Realty and Vantage, and other companies as well. You've seen this big change, where the 70% done internally has become 70% done externally. I'm exaggerating a little bit, but it's directionally correct. Why that's important is because now there's more and more reliance on third-party data centers. That's a big change.

The reason why is that they wanted to go so fast, they couldn't do it internally. They didn't have the land banks. They didn't have the power secured. They may not have had the project managers to handle building the structures and integrating cooling and power systems. Then COVID started, and things accelerated even more. The work-from-home applications created more demand on these hyperscalers – many of the applications built for work from home ran on hyperscale networks, whether Office 365 or others.

Now all of a sudden you had an acceleration, and the pace went from fast to really, really fast. Again, using third-party data centers was the only way to do this, because they had the land, the power secured, and also the project managers and the labor skill sets to do this fast.

Then with the emergence of AI, we've gone, as I said in last quarter's letter to our investors, from 10 to 11. It's gotten to the point where the amount of demand being required is ludicrous – again, using third-party data centers because of the time-to-market advantage that third-party data centers bring.

Johnston: Okay, that's super helpful. Let me ask you, just from thinking about these third-party developers and the role they're playing right now. As we think about AI and data centers being built over the next several years, how do you think about this from a location perspective?

Bezoza: Well, it comes down to power; power is the crux, as they say. The other thing in this evolution: about 10 years ago, it was about space. Space meant constructing the physical shell – walls, cooling, and power systems. We build data centers to a certain amount of wattage per square foot; roughly 200 to 250 watts per square foot has been the metric we've been building data centers to for the last few years. However, the move from CPUs to GPUs has been a big change.

Now space is no longer the real limiting factor; it's power. Data center capacity is being put anywhere there's excess power capacity. If you think about it, it's much cheaper to deploy bandwidth – fiber optics and data transmission – than it is to move around power. It's cheaper to move around bits of data than it is to move around power. Part of it is that what we're doing in AI is much less about being local.

These large language models and these very large data centers really don't care so much about latency. They really just care about processing power, so the location doesn't matter as much.

Johnston: That's fascinating. I've heard you, Alan, before talk about the time it takes to build a data center. From the time you decide you want to build a data center till the day it's actually completed, it's about three years. Is that a fair assessment?

Bezoza: Yes. I would say that's elongated, with the insatiable demand for labor happening right now, and the supply chain is still fragile, but I would say it's roughly around three years to build a data center.

Johnston: Help us tease this out a little bit. I would think, in the environment we're in right now, with all of these variables swinging wildly – whether it's the growth in AI and the impact it's going to have on compute and storage, labor issues, material costs, and, most importantly, power – these are some pretty massive variables that developers and hyperscalers need to think about when they make their data center plans, because a lot can happen in three years. As we know, heck, a lot can happen in three months.

How do they think about all this? How do you manage all these variables and navigate these market dynamics? We're in uncharted waters, aren't we?

Bezoza: That is a very good point. It's funny to think about, because the end of last year – end of '22, I should say – is when we really saw the beginning of AI, and all of a sudden the growth in demand from a GPU perspective changed dramatically how data centers are being built. However, the data centers coming online today are ones that were contemplated three years ago. They're not built for AI workloads. They don't have the same power density. As I said earlier, data centers with CPUs were built at about 200 watts per square foot.

Now we're moving to GPUs that are six times more power-hungry. We don't need the space; we need the power. If you take a data center that's coming online today, built for a CPU-based architecture, and use it for a GPU-based architecture drawing six times the power, you're basically taking that same structure, the same shell, and using only 13% of the space. It's six times the power usage. It's very inefficient to cool. Think about this big building – hundreds of thousands of square feet – that was dedicated to CPUs. Now all of a sudden you're using only 13% of the space, but you have to cool the whole building. You've got to manage the whole building.
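The same-shell arithmetic can be sketched with the round numbers Bezoza uses (a 200 W/sq ft CPU-era design, GPUs drawing roughly six times the power). Under a fixed power envelope, six-times-hungrier chips can occupy only about a sixth of the floor – roughly 17%, the same ballpark as the "only 13%" he cites. The building size below is a hypothetical, chosen only to make the example concrete:

```python
# Back-of-the-envelope check of the shell-space math in the interview.
cpu_density = 200          # watts per square foot, CPU-era design density
gpu_multiplier = 6         # GPUs roughly six times more power-hungry
building_sqft = 100_000    # hypothetical shell size (illustrative only)

power_budget_w = cpu_density * building_sqft   # what the shell can feed and cool
gpu_density = cpu_density * gpu_multiplier     # 1,200 W/sq ft
usable_sqft = power_budget_w / gpu_density     # floor space the power budget allows
fraction = usable_sqft / building_sqft

print(f"GPU racks fill only {fraction:.0%} of the shell")  # -> 17%
```

The shell size cancels out of the ratio, which is why the point holds for any building: the power envelope, not the floor plate, sets the limit.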

It's just inefficient, but again, things coming online today were contemplated three years ago. As for things contemplated today, it's interesting: companies like Microsoft and Meta and Google and Amazon are trying to build their infrastructure for the next 10 years, and it's really hard to think about the next 10 years when we really just flubbed the last three. You're seeing those guys take a hard look at the dedicated space and dedicated power they're going to be on the hook for when they really don't know their business models.

That's the thing that worries me in this model. In this whole craze that's happening right now, what is the business model that's going to make these companies successful? And not just make them successful, but generate a return on invested capital for the amount of capital they're deploying, or need to deploy, over the next 5, 10 years for AI workloads.

Johnston: Yes, that would certainly be concerning. When you model out or think about the next several years in this market, do you see a scenario where access to power, or access to the right cost structure, becomes a chokepoint? Could it prevent data center growth from reaching its potential – could access to power literally hold back some of the growth in AI and all the wonderful things we hear about in those markets and applications?

Bezoza: Well, if you think about the math, it's a little fuzzy, because "it depends" is the answer to a lot of things. But do the math – and this is NVIDIA by itself, using sell-side estimates.

If NVIDIA is going to do $45 billion this year – meaning 2023, I should say – in their data center segment, the estimates are something like $60 billion and then $90 billion for data center chips in the next three years. Call it 40, 60, 80 to make the math easy; so it's an extra $20 billion a year, incrementally. That's a big, big number. If you think about the amount of chips and GPU clusters that creates, it's roughly 1.8 million H100 GPUs, and that's just in '24. Not adding in '25 – let's just say '24: 1.8 million GPUs, in servers that each use about 10 kilowatts of power.

Now, that's max power, assuming they're running at 100% utilization – which, given the cost structure of these servers, is what they want: a high utilization rate. If you think about them running as hot and hard as possible, it implies these servers will consume a little over two gigawatts of incremental power – and that's not assuming any other chipsets from AMD, Intel, or the internally based designs from Broadcom and Marvell. Now, the entire data center market is expected to grow about two gigawatts a year. Just put that in perspective.
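The "little over two gigawatts" figure works out if the 1.8 million count refers to H100 GPUs rather than whole servers – a plausible reading, since H100s commonly ship about eight to a ~10 kW server. A quick sketch under that assumption:

```python
# Reconciling the interview's power math, assuming the 1.8M figure
# counts H100 GPUs packed eight to a ~10 kW server.
h100_gpus = 1_800_000    # GPUs implied by roughly $40B of '24 data center chips
gpus_per_server = 8      # typical 8-GPU H100 server (an assumption here)
server_power_kw = 10     # approximate max draw per server, per the interview

servers = h100_gpus / gpus_per_server              # 225,000 servers
incremental_gw = servers * server_power_kw / 1e6   # kilowatts -> gigawatts

print(f"{incremental_gw:.2f} GW of incremental demand")  # -> 2.25 GW
```

Against an entire data center market expected to grow about two gigawatts a year, NVIDIA's chips alone would consume more than a full year of market growth.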

Johnston: Fascinating stuff. EVs get all the attention, but I don't think people really appreciate the impact that data centers are having, and will continue to have, on the grid. It's pretty wild. I think I know the answer to this question based on everything you just said, but let's talk about current market conditions in terms of vacancies and pricing in the data center market. My guess is vacancies are low and pricing is high, considering everything we've just discussed, but what are your general thoughts?

Bezoza: It's like a game of Tetris sometimes with data centers: someone might come to a data center and say they want a megawatt here, a gigawatt there, this here and that there, and the data center operator is trying to play this game of Tetris and fit it all together. What we're seeing, to your point, is an amount of demand that's just so high. It's no secret – public companies like Equinix and DLR have spoken publicly about it – that they've been raising prices, and that has not slowed down demand whatsoever.

What you're seeing is: I'll take whatever you have, wherever you have it – and I won't pay just anything, but I will pay more than people were paying in the past.

Again, it goes back to: it doesn't matter where, it doesn't matter how much, I'll take it. They're raising prices, which is a great environment for the data center operators. On the flip side, however, they're having to spend a ton of capex to support that growth. That's the problem DLR – Digital Realty, the biggest operator – has had; again, this is all very public. They have a hard time growing their cash flow per share, which is a key metric for the types of investors that buy real estate securities, because the capex is so high and it dilutes their ability to generate free cash flow.

Johnston: Is it mostly about just the sheer capex budget, or how much do the current interest rates play a role in that cash flow per share equation?

Bezoza: It's both, to be honest, but the capex requirement is really the hard part of the envelope. DLR specifically has a capital structure problem to some extent: they had a higher leverage ratio, and rates went up, to your point. At the same time, they need more capital. Really, what they need to do is bite the bullet and do an equity infusion to lower their leverage ratio and give themselves the ability to execute on their business model. It's well known by the market that they've been walking away from some business because they just don't have the capital to support that kind of growth.

Johnston: Wow. How does this all impact edge computing? This was something we've been hearing about for a while. It feels like this growth in edge computing has been on the come for many years now. What are your thoughts on edge computing in the context of the growth we're seeing in the data center market, the power challenges, et cetera?

Bezoza: It's like self-flying cars – it's always been a couple of years away. Generally speaking, look, in every technology there's always a push to do processing at the edge, like our cell phones, or processing in the network. Even in the age of television, we moved from centralized video distribution to DVRs, where video was basically stored on hard drives in our homes; then, with the move to streaming, it's back in a network location. There's always this push and pull, but I think, generally speaking, the most efficient way to do it is from the central location in any given market.

If you can stream video, you'll do it from a central location. Same thing with processing power in a data center: it's much cheaper to do it in a central location and send out bits from there. Now, some applications will require low latency. The problem is, we got out over our skis a little bit in terms of calling things edge data centers for a long time, because a lot of applications don't require that type of latency.

I think over time it'll even out. To my point earlier, the large language models being built for AI today don't need low latency. They don't need to be in an edge location; they could be in the middle of nowhere. Ultimately, another way to think about it is that one person's edge is another person's central location. The word "edge" is probably thrown around too much because everyone has a different definition of what it could or couldn't be.

Johnston: Got you. Hey, I wanted to ask you a question about thinking longer-term, from a risk management standpoint: is there a risk of overbuilding, of creating too much capacity in the data center market? It sounds like the answer is no. There could be more risk on the semiconductor side – not meeting their numbers because of power constraints or whatever, and the data centers not being stood up as fast – but are you at all concerned that one day these developers will have simply built too much capacity, and pricing starts to come in a little bit?

Bezoza: It's a great question. I'd say there are probably two things that worry me. One is, as I said earlier: can these companies – these massive hyperscale operators spending capital at record levels – earn a return on that investment? Think about Microsoft. Microsoft is going to spend almost $20 billion of incremental capex in '24 versus '23. That $20 billion is the annual capex budget of AT&T – and that's just the increment. They're going to spend something like mid-$40 billions next year, up from mid-$20 billions. Those are big, big numbers. They've accelerated significantly. Same with Google, Meta, and Amazon – all increasing capex.

The question to me is, can they monetize it? It's very clear to me that Microsoft has the ability, with Copilot, to take subscribers from Office 365 and convert them to pay another $360 a year for, I would say, more efficient use of their applications. What office worker wouldn't have their company spend $360 a year to make them more efficient? I think that's an easy one.
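The $360 figure is Copilot for Microsoft 365's $30-per-user-per-month list price, annualized. A toy break-even sketch – the $45-per-hour fully loaded labor cost is an assumption for illustration, not a figure from the interview:

```python
# Break-even sketch for the Copilot monetization point above.
copilot_per_month = 30                # Microsoft 365 Copilot list price, $/user/month
annual_cost = copilot_per_month * 12  # the $360/year figure cited above

loaded_hourly_cost = 45               # assumed fully loaded cost of an office worker, $/hr
breakeven_hours = annual_cost / loaded_hourly_cost

print(f"Pays for itself if it saves {breakeven_hours:.0f} hours a year")  # -> 8
```

Under that assumption, the tool only has to save an employee about one working day per year to justify the subscription, which is the intuition behind "that's an easy one."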

Then you go to Google, and Google has been using AI for a while now in their search algorithms, and YouTube as well. They've been very clear about this, but the question to me is, can they get better search algorithms? Can they get better outcomes and higher CPMs, which is essentially how much they charge for advertising, with AI embedded into their search algorithms?

Amazon is a little cloudier – no pun intended. Over the last five years, AWS mopped it up in terms of getting third-party workloads to run on their cloud infrastructure. Will they get AI workloads running on AWS as well? They're clearly doing something, but it's a little opaque to me what they're actually doing.

The same goes for Meta. They're spending incremental dollars as well – can they find a better use case for their own internal applications? Can they make Facebook and Instagram better and get better revenue from that incremental spending, or more engagement, which then leads to more revenue?

The second question is about the data center companies: they're signing 5-, 7-, 10-year leases. To me, the issue is that as the hyperscalers become bigger and bigger, and a bigger part of the total, customer concentration goes up across the industry. This isn't about any one specific company, but generally speaking, they're spending at such high levels that customer concentration for the whole industry is going up – and if you don't have exposure to Microsoft, Google, Amazon, Meta, et cetera, it's a problem.

Now, think about 10 years from now. If all of a sudden things change – if efficiency gains in chips mean they're no longer six times more power-hungry, but a sixth as power-hungry – I can't tell you whether that's going to happen or not. But if breakthroughs happen in the semiconductor space, then in 10 years' time, when the leases being signed today come up, maybe the rents roll down: instead of paying X, they're paying 50% of X. That's what I worry about in the space – from a data center point of view, you're underwriting these deals, this construction and activity, based on some level of return on the investment from the data center operator's point of view.

Johnston: Yes, those are great points, Alan. You wonder, has Moore's Law run its course, or do we still have more there? Some people suggest you're not going to have that compounding impact going forward like we've seen in the past, but who knows?

Bezoza: Yes, and it's the tip of the iceberg, because if you're the CEO of a public company today, you're being asked by the board, "What is your AI strategy?" I don't think most people know what it means, but they have to have some answer. They're looking at different applications. Bloomberg, the financial services tool that we use, with lots of information on it, is spending a lot of money on AI right now. So are JPMorgan, and Goldman Sachs, and every investment bank – but they're very cash flow-rich, so they can spend that kind of money.

Think about all the other companies out there that don't have those capabilities, or the cash flow to spend on those servers – what are they doing? They're trying to figure out what Microsoft, Amazon, and Google can provide for them in AI – the workloads – and run it there. There's a lot of skunkworks happening, a lot of testing, really trying to figure out "what does AI mean to me?" Every CEO on the planet is being asked by their board what their AI strategy is – and to be honest with you, a lot of things that are called AI aren't really AI, but that's a whole other topic.

Johnston: Yes. Fascinating stuff, Alan. This has been great. Before we wrap it up, I just want to give you an opportunity, we've obviously covered a lot here, but if there's anything that I didn't ask or you think we should wrap it up with, the stage is yours. Any closing thoughts would be great.

Bezoza: One thing: this is not just a winner-take-all market. It's not just going to be NVIDIA chipsets. This is going to drive a lot of networking gear – companies like Arista – and a lot of connectivity and communications technology. One thing I find interesting is companies like Ciena, which has been around for a long time and which I'm sure you and your listeners are familiar with. They're predominantly thought of as a seller of optical networking gear to telecom companies. Well, 35% of their sales right now are going to hyperscalers – Facebook, Microsoft, Google. That's a change, and it's the same type of technology.

These are networks being built, and as data centers get put in the middle of nowhere, like I said earlier, where low-cost power is, there are going to be telecom networks built to connect them. You're seeing not just this pull-through of chips; you're pulling through cooling systems, you're pulling through power generation, which is going to be a very interesting space over the next couple of years – not just because of data centers, but because of EVs too, as you mentioned earlier, which isn't a small thing. Telecom networking is going to be a thing, and it's going to continue to grow, maybe not with AT&T and Verizon and Vodafone and Deutsche Telekom, but, again, with the hyperscalers.

Again, think about the dollars Microsoft is spending: their incremental spending alone is the size of AT&T's annual budget. It's just fascinating. It's not just winner-take-all with semiconductors; I think it's going to pull through a lot of stuff, a lot of things people aren't thinking about.

Johnston: Yes, those are great thoughts. It makes me think that the definition of what it means to be a "telecom operator" is evolving quite a bit. Companies that we wouldn't think of as telecom operators are starting to look like telecom operators. It's a pretty exciting time. A lot of convergence.

Hey, Alan, listen, absolutely fantastic. Great insight. Definitely didn't disappoint. I wasn't worried about that, by the way. It was great having you here, man. Great catching up with you. Thanks so much for being on today.

Bezoza: Well, thanks, Jeff. Thanks for having me. I'm always around if you have any questions.

Johnston: A special thanks goes out to Alan for being on the podcast today. Lost in all the AI hype is the impact it's having on the industry's ability to build new data centers and keep up with surging demand. The chips that run all these new AI applications are power hungry, and this is causing a supply-demand imbalance in the energy markets. We simply need more electrons! It used to be that data centers needed to be located near major communication hubs, like in northern Virginia; being close to the fiber lines that connect to other data centers around the world was a gating factor for new data centers. That is no longer true. It's now about power, which opens up new opportunities for rural communication companies.

Hey, thanks for joining me today and watch out for the next episode of the All Day Digital podcast.

Disclaimer: The information provided in this podcast is not intended to be investment, tax, or legal advice and should not be relied upon by listeners for such purposes. The information contained in this podcast has been compiled from what CoBank regards as reliable sources. However, CoBank does not make any representation or warranty regarding the content, and disclaims any responsibility for the information, materials, third-party opinions, and data included in this podcast. In no event will CoBank be liable for any decision made or actions taken by any person or persons relying on the information contained in this podcast.
