Watt's Up with AI? The Role of Artificial Intelligence in the Future Grid

Episode ID S4E02
February 28, 2024

Artificial intelligence is poised to revolutionize the power utility space, enhancing efficiency and reliability while reducing environmental impact. In this episode, power utility AI experts from NVIDIA, EPRI and Utilidata explore emerging opportunities, from predictive maintenance and optimized energy generation to boosting grid management and developing smart grids.

Listen to this episode on Spotify.

Transcript

Ken Hester: What does a world look like when we're actually doing our best to be more efficient? A world where we can actually use technology to improve reliability, resiliency, and power equity? All these things are on the table with technology.

How do we shift power left and right to better support other communities? These are the hard problems, usually more ethical than scientific, because I think we can solve these problems. I'm hoping that if we're able to talk about the technology and make people feel comfortable that it's there to help them in their growth, we can begin to talk about these other fringe topics that are just as important to our society.

Teri Viswanath: That’s Ken Hester, solutions architect director for NVIDIA, a company that is sometimes referred to as “the engine of AI.” And as you’ll hear on this program, new AI resources for utilities are fast emerging and will shape and inform our future grid.

Hello, I’m Teri Viswanath, the energy economist at CoBank and your co-host of Power Plays. As always, I’m joined by my colleague, a managing director here at the bank, Tamra Reynolds. Hey Tamra.

Tamra Reynolds: Hey Teri. For today’s discussion we will be featuring a line-up of the industry’s sharpest thought leaders on generative AI.

As Teri mentioned, we invited Ken Hester to speak with us, along with Dr. Jeremy Renshaw, the senior program manager for AI at EPRI, and Dr. Marissa Hummon, the CTO of Utilidata, an NVIDIA partner developing applied solutions for our industry. For what it’s worth, NVIDIA hit it out of the park this week with its fourth-quarter earnings release related to the booming AI business.

Viswanath: Our conversation begins with Ken Hester. NVIDIA is going to play a really important role in our AI future, so where exactly does Ken see the technology being applied?

Hester: We see AI and now genAI really taking hold in all sectors — whether it's automating our call centers, sort of the genAI strategy, all the way through to how we process this growing industry of home automation, home power, home power creation.

So, we're able to take a great look back at the data to project it forward. As we move through the industry and get all the way out to those endpoints, we're looking at being able to do some AI at the far edge — maybe a lot of AI.

How do we take advantage of power creation flowing north-south, and how do we shift power east and west — between the residential homes beside us, or the communities to our left and our right? We have a lot of AI at play at the moment.

With real-time data processing, or near real-time data processing, we'll be able to determine how to shift that power around the system so we can better support different parts of the community that may need the power. If one area doesn't need it because all of its solar panels came online, we can send it to those that do need it.

Going broader, beyond power and utilities — if you start drawing in water, natural gas, these other forms of energy movement that we have available to us — being able to build a bigger picture of how we're utilizing those resources does, again, help account for where our losses are.

Reynolds: At the moment, there is heightened interest in how artificial intelligence might help the world get the most from the planet’s scarce energy resources.

Last fall, the U.S. House Energy and Commerce Subcommittee convened a special meeting, inviting EPRI’s Jeremy Renshaw to speak on this topic. So, we asked Jeremy to help us separate “fact from fiction” and here’s what he had to say.

Jeremy Renshaw: AI is a very powerful, but also limited technology. It does some things really well, but can't do everything. It needs large amounts of training data and often struggles with new or different conditions that it hasn't seen before or been trained on. While it's getting better at addressing these types of weaknesses that it has, we still have a ways to go as an industry to be able to get there.

So, one of the things to keep in mind is that you can compare AI to a high-potential but limited junior employee. It's very eager to please, but it may not always know the answer. So, sometimes the large language models that we have today will hallucinate, or make up an answer. You can think of them as a very fancy autocorrect. While they're often very accurate, in some cases they are comically wrong, and it's sometimes hard for a non-expert in that area to spot the incorrect response.

Reynolds: One AI application that listeners are probably already familiar with is ChatGPT. And I’ve started using this more frequently for business communications. So, we asked Jeremy his thoughts about this application.

Renshaw: Everyone has been talking about generative technologies, large language models, ChatGPT — really for the last year or so, ever since ChatGPT was released in November of 2022. So, these GPTs, or generative pre-trained transformer models, are trained on vast amounts of data — essentially the entire public data set of text available on the internet. They're trained to be able to understand various prompts or inputs, and then respond to them. Essentially, they turn words into vectors, or mathematical representations.

You can imagine something like Obama would be similar in a vector to president or Joe Biden, or medicine and Tylenol would be close in this vector representation. As they are able to build these large vector databases or datasets, they can look at the mathematical similarity between the prompts that you're looking for and potential responses. As I mentioned before, sometimes they really nail the response, and other times, they confidently state the wrong answer, or hallucinate. It's something where you often want to check the validity of any kind of response that you get and realize that it may or may not be completely accurate.
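
For illustration, here is a minimal Python sketch of the "mathematical similarity" Jeremy describes. The three-dimensional vectors below are invented stand-ins (real embeddings have hundreds of dimensions), and cosine similarity stands in for the comparison he mentions:

```python
# Toy illustration: words become vectors, and "related" words sit close
# together. These 3-D embeddings are made up for illustration only.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: values near 1.0 mean 'similar'."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings (not from any real model).
vectors = {
    "obama":     np.array([0.90, 0.80, 0.10]),
    "president": np.array([0.85, 0.75, 0.15]),
    "medicine":  np.array([0.15, 0.25, 0.85]),
    "tylenol":   np.array([0.10, 0.20, 0.90]),
}

print(cosine_similarity(vectors["obama"], vectors["president"]))  # high (~1.0)
print(cosine_similarity(vectors["obama"], vectors["tylenol"]))    # much lower
```

A prompt and its candidate responses are compared the same way: the model favors continuations whose vectors sit close to the prompt's.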

Reynolds: Jeremy’s perspective is helpful. AI tool usage is surprisingly concentrated around a few stand-out winners, with ChatGPT leading the pack by a mile. The application has an estimated 180 million global users and generated a massive 1.6 billion site visits in December 2023 alone.

Oftentimes, these large language models, or LLMs, are working in a flawed universe. But what if we train LLMs on a defined or cleaner universe, would we get a better result? Here’s what Jeremy had to say.

Renshaw: One of the things that we're looking at right now at EPRI — and many other organizations are as well — is expert systems, where you train a large language model on a very specific body of knowledge so it can get very good at doing that one thing. We're looking at how we can, across our different research programs, identify specific expert systems for different R&D programs to help and augment someone at any level — whether it's a junior employee or someone who's very seasoned and experienced — to identify the material they need faster and more efficiently.

Viswanath: Let's just talk about the applications and where AI is best suited to meet utility challenges.

Renshaw: There is a wide range of opportunities to use artificial intelligence and data science to improve what we do in the utility space. Some that we're working on right now at EPRI are related to things like predictive maintenance. We want to make sure that we're doing maintenance for a reason, and AI helps us to be more efficient there.

You can also look at grid management. Being able to optimize power flow within a power grid is a very complex challenge that AI can help with, as well as simpler things like inspecting the components on an aging power grid to identify which are still good and which are not.

You can also look at things like wildfire risk evaluation or vegetation management. So, you can use a combination of satellite data and imagery with the knowledge of where the power lines are to look at and identify dead and dying trees using multispectral satellite imagery.
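
One way to picture that vegetation-management workflow is the NDVI, a standard multispectral index computed as (NIR − Red) / (NIR + Red). The sketch below is illustrative only: the reflectance arrays are synthetic stand-ins for real satellite bands, and the 0.3 threshold is an assumed cut-off, not an EPRI value.

```python
# Minimal sketch: flag stressed/dying vegetation pixels near a line corridor
# using NDVI on synthetic near-infrared (NIR) and red reflectance rasters.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index; healthy canopy is roughly 0.6-0.9."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Synthetic 4x4 "image": left half healthy trees, upper-right stressed ones.
nir = np.array([[0.80, 0.80, 0.30, 0.20],
                [0.70, 0.80, 0.30, 0.20],
                [0.70, 0.70, 0.60, 0.60],
                [0.80, 0.70, 0.70, 0.60]])
red = np.array([[0.10, 0.10, 0.25, 0.20],
                [0.10, 0.10, 0.25, 0.20],
                [0.10, 0.10, 0.10, 0.10],
                [0.10, 0.10, 0.10, 0.10]])

stress_mask = ndvi(nir, red) < 0.3   # assumed threshold for "dead or dying"
print(np.argwhere(stress_mask))      # pixel coordinates worth a field inspection
```

In practice the mask would be intersected with power-line right-of-way geometry, exactly as Jeremy describes, so crews are dispatched only where weak vegetation actually threatens a line.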

Similarly, there are many other use cases that we can go into: things like cybersecurity, where we can use it for offensive and defensive capabilities; load and weather forecasting to be able to match grid loads with grid supply; things as simple as looking at the electrical signatures that you get from HVAC units to detect early-stage failures. And of course, there is potential for efficiency gains and reduced emissions at the power plants that operate today.

Reynolds: Teri, we spoke with NRECA’s Jim Matheson a few months ago for another podcast, and we discussed a future scenario where our power supply becomes increasingly strained. That situation played out in California during August 2020, when hundreds of thousands of Californians briefly lost power during a heat wave — and this was the first time outages were ordered in the state due to insufficient energy supplies in nearly two decades.

A repeat was narrowly avoided in September 2022 when consumers voluntarily shed load, but the process of keeping the lights on was messy. As it relates to today’s conversation, how can AI help the distribution system respond more efficiently? Marissa Hummon addresses this issue.

Marissa Hummon: The tools that we have thought about rolling out in the past as a distribution utility are not the tools that we need for the future. Central ADMS (advanced distribution management systems) or central DERMS (distributed energy resource management systems) are tools that we thought would be right for a more traditional grid — directed from central operations out. The world has really shifted to see that the edge of the grid is going to be the driver of how we transition both the infrastructure and the operations.

We see this as a natural evolution of the technologies for the grid. Bringing that compute infrastructure to the edge of the grid — where it can understand what's going on, make decisions, and then issue actions right from the edge — is a much more scalable enterprise than trying to bring all of the information back, which is incredibly costly.

You can really treat the edge of the grid as individual pockets, at least on a very short time scale, and only use that central position when you're looking at larger-scale or longer-horizon problems.

Viswanath: All of a sudden, the need to be more dynamic in the field — really, at the meter — has become mission critical for electric utilities.

I asked Marissa whether she felt hopeful about the fact that new applications for AI were bubbling up at a moment when the grid is challenged by resource constraints. Coincidence? Here’s what she had to say.

Hummon: If you took the total amount of energy and you spread it out evenly over all the hours of the year, you're really only utilizing 30% of the capacity. That is simply because we haven't put in place the visibility and the signaling necessary to smooth that out.
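
Marissa's 30% figure is essentially a load factor: average demand divided by peak demand. A quick worked sketch, with hypothetical numbers chosen to land on 30%:

```python
# Load factor = average demand / peak demand.
# The figures below are illustrative, not actual grid data.
annual_energy_mwh = 2_628_000    # hypothetical annual energy delivered
peak_demand_mw = 1_000           # hypothetical system peak

average_demand_mw = annual_energy_mwh / 8_760   # 8,760 hours in a year
load_factor = average_demand_mw / peak_demand_mw

print(f"Average demand: {average_demand_mw:.0f} MW")   # 300 MW
print(f"Load factor:    {load_factor:.0%}")            # 30%
```

The wires and transformers must be sized for the 1,000 MW peak, yet on average they carry only 300 MW — which is the headroom Marissa wants visibility and signaling to unlock.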

In a world where you do have supply chain constraints, where the cost of the energy transition could be enormous and incredibly impactful on individual customers, the only path that really drives a low-cost transition is one where we are using technology to provide that visibility and that signaling to smooth out the energy consumption as well as the energy production, so it goes both ways.

I'll use an example from my neighborhood. We're in 1960s houses. There's a transformer — it's actually in my backyard, on a pole — and I can see that it's got lines dropped to my house and two other houses. We have an EV with a Level 2 charger, and I monitor my electricity use. I can see that, as just a single house on that transformer, we probably top out at 16 kVA, and that whole transformer is probably only a 25-kVA unit.

If either of my other neighbors got a Level II charger and plugged in at the same time as us, we're going to hit right up against the total capacity of that transformer. It's entirely possible to not replace that transformer if we just coordinated some of those big electricity consumption events, mainly things like EV charging.
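
A back-of-the-envelope version of that transformer math shows why coordination matters. The base-load and charger figures below are assumptions for illustration (a typical Level 2 charger draws about 9.6 kVA at 40 A / 240 V), not Marissa's actual numbers:

```python
# Three houses share a hypothetical 25 kVA pole-top transformer.
# Staggering EV charging keeps coincident load under the rating.
TRANSFORMER_KVA = 25
BASE_LOAD_PER_HOUSE_KVA = 4.0   # assumed non-EV household load
EV_CHARGER_KVA = 9.6            # assumed Level 2 charger: 40 A at 240 V

def coincident_load(houses: int, evs_charging: int) -> float:
    """Total simultaneous load on the shared transformer, in kVA."""
    return houses * BASE_LOAD_PER_HOUSE_KVA + evs_charging * EV_CHARGER_KVA

print(coincident_load(3, 1), "kVA")  # 21.6 kVA -- fits under 25 kVA
print(coincident_load(3, 2), "kVA")  # 31.2 kVA -- exceeds the rating
```

Shifting the second charger to start an hour later keeps both cars full by morning while the 25-kVA transformer stays in service — the "coordinate instead of replace" point Marissa is making.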

Reynolds: There’s this fundamental idea that Marissa is surfacing, that our distribution utilities “can’t manage what they can’t measure.” And to be clear, we’ve heard this very thing from a lot of customers.

Hummon: So, you've got to have the measurements being made, and they're already being made in the meter. Then you've got to couple that with the ability to make sense of those measurements — that's why we're bringing the compute to the meter. Then you have to be able to communicate that information: the right piece of information, at the right time, to the right entity. That might be your neighbor, but it also might be a central system, in order to make a decision that allows us to better use all of that currently built capacity.

We estimate that if a system is currently at about 30% capacity utilization and layered on communication and computation technologies like ours, it could probably get to 50% or 60%. You could probably double the amount that the distribution grid can handle without building out new lines. At some point, you're going to need to build new lines and new transformers, new substations, but there's a lot we can do ahead of that to extend the value that the system brings to the customer.

A non-wires alternative is using the existing assets and infrastructure better. You're taking both the customers' assets and the utility's assets, and creating ways for those to work together to avoid infrastructure upgrade costs.

I think there's an old adage that the cheapest kilowatt of energy is the kilowatt you saved, and it's the same thing on the built infrastructure side. The cheapest infrastructure investment is the one you don't have to make because you've managed to use your infrastructure more effectively.

Reynolds: That’s helpful guidance.

Teri, Jeremy Renshaw also mentioned that grid-edge, or behind-the-meter, AI applications would be an important use case for the technology. He extended the predictive maintenance example he mentioned downstream to the home, with a discussion of HVAC systems and rooftop solar applications. Here’s that conversation.

Renshaw: One of the things EPRI has been looking at is, can we start to identify these early precursors of degradation in an HVAC unit using electric-only data? Not listening to the noise and vibration, which are the traditional ways of doing it, but literally just taking the AMI data that utilities already have access to and asking: is the unit starting to change the amount of energy it's using, is it turning on more frequently than you would expect for the temperature, and are the signals looking different?

It does appear possible to be able to use electric-only data to proactively identify degradation in an HVAC unit to where your utility could literally send you a notification saying, “Hey, we believe that your HVAC unit is about to fail. We think you should go and do some maintenance on it.” And, the great thing about that is you can do maintenance on it and replace maybe one or two small components versus replacing the entire compressor or some other high-cost component.

One of the drawbacks is it looks like we're not going to be able to use 15-minute AMI data, where it takes one data point every 15 minutes. It's just not a robust enough dataset. It looks like as we get into the next generation of meters that we'll take data, say, every one minute, or some meters could potentially take data 30 or 50 times per second. When we get into those higher frequency, more robust data sets, that's where we think that we can see a lot of value in proactively identifying these HVAC faults and being able to save customers a lot of money.
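
A minimal sketch of the screening Jeremy outlines, on synthetic data. A deployed system would use trained models on real high-frequency meter streams; here a simple z-score flags a unit whose average draw has drifted well above its own baseline. The 3,000 W baseline, 20% drift, and 3-sigma threshold are all assumed values for illustration:

```python
# Flag an HVAC unit whose recent power draw has drifted away from its
# historical baseline -- a crude stand-in for the trained models EPRI
# would actually use on high-frequency meter data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-second power readings (watts): a healthy baseline period...
baseline = rng.normal(loc=3000, scale=150, size=10_000)
# ...and a recent window where the compressor draws ~20% more power.
recent = rng.normal(loc=3600, scale=150, size=600)

# Compare the recent mean draw against the baseline spread.
z = (recent.mean() - baseline.mean()) / baseline.std()
if z > 3:
    print(f"Possible degradation: recent draw is {z:.1f} sigma above baseline; "
          "flag this HVAC unit for inspection.")
```

This also illustrates Jeremy's point about sampling rates: with one reading every 15 minutes there are too few points to separate drift from noise, while 1-second or faster data gives the statistics something to work with.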

Viswanath: As we consider the future of “electrify everything” and what it might look like, the biggest investment in that future might actually come from the consumer. But in lockstep, an investment will have to be made in upstream supply, starting with distribution.

EPRI and Utilidata are doing remarkable work here. I asked Marissa about Utilidata’s work in the distribution space and what benefits we, as consumers, might soon be able to see.

Hummon: The idea that we can really speed the time-to-market for a meter company to be able to embrace a distributed AI technology is really exciting.

Our history of doing real-time grid operations has put us in a position to know how to handle that data, how to extract the most value out of that data, and make it accessible to applications. The applications that we think are going to be fundamental to operating the grid in the future are anomaly detection that can see everything from a tree branch rubbing against a line to a transformer that's getting overheated.

We can do on-site forecasting at a level of accuracy that you just can't get if you try to pull that data back to a central system. We can detect right when an EV starts charging, within seconds. We can make a forecast for the solar production at a site just by looking at the historic data and also the on-chip physics modeling of solar power production. Those kinds of applications are going to be essential to allow the utility then to build the kind of system operations they need.

But having those building blocks in place will speed up the time it takes them to make an impact — taking that transition down from something like five years to, hopefully, a year.

Reynolds: EPRI’s Jeremy Renshaw stressed a few key observations, or big-picture takeaways, on AI that are worth sharing.

Renshaw: There are a few things to keep in mind. I would say, again, going back to — there's a lot of hype around AI. Of course, it is not a do-everything, solve-all-your-problems kind of technology. I like to use the analogy everyone knows: if the only tool you have is a hammer, everything looks like a nail. You could definitely try to pound a screw in with a hammer, but it probably wouldn't work very well.

Understanding the types of problems where AI can be a useful tool and others where maybe it's not the best tool is very important. A lot of times I hear, "Oh, we'll just throw AI at it." I would always say, "Let's take a step back and look at, do we have the right amount of data? Do we have the right quality of data? What other options are available?" You may say it's weird for the AI guy to say, "Let's not use AI for this," but often it's not the right answer.

The second thing is we've often seen that training our existing workforce to understand how and when to use AI is very valuable. It's often easier or faster to take a current subject matter expert and train them on data science and AI for their particular area, so that as they're using these tools, they'll understand what the answer should be, or what the range of possibilities should be. If they're getting a bad answer, they'll be able to identify that more easily.

Whereas if we're just taking someone who is only trained in AI and throwing them into a new area, they may not have that full depth of understanding of, is the tool providing something that is a useful and valuable response, or is it just coming up with something?

Viswanath: I want to slightly pivot our conversation.

My KED colleague Jeff Johnston hosted a terrific podcast in February. I am going to oversimplify a key take-away from his episode, but here goes.

Data centers and edge networks enable the magic that emerges from AI applications. They do so by using high-performance computing clusters, which are made up of multiple servers connected through high-speed networks that allow for parallel processing and fast training times.

These hard-working machines require considerable power, as Jeff cautions, and, as a result, generate a lot of heat. So, how can AI be applied to solve that problem? Here’s what Ken had to say.

Hester: At NVIDIA, we leverage our best-in-class GPUs, CPUs, networking and reference architectures to really drive better performance per watt. How do you get the most out of that accelerated computing? Because if we do it faster, we don't have to spend as much power trying to get to an answer.

We want to make sure that we're first-party contributors, tweaking those frameworks to get the best performance per watt. Every bit of hardware that goes in there — GPUs, CPUs, networking, the fabric that connects it all together — all of it is important to make sure that we're optimizing for the best performance per watt. We actually go out of our way to release software that takes best practices into account, to make sure that you are processing as fast as possible.

We also invest a lot in DLC vendors — direct liquid cooling vendors — and this gets us about 50% or sometimes more back in power in the data centers. This investment across the community is really helping us to get more power back, allowing for that data center to grow.

I know people think of us as hardware. We're probably still 20%, 30% hardware; at this point with generative AI, we're closer to 60%, 70% software innovation. As we tackle the world's hardest problems, which run on those big data centers, we need them to be performant as well. So, we're investing a lot in those spaces.

Reynolds: So, how does NVIDIA see itself being applied in the utility space? Ken does a nice job summarizing today’s program.

Hester: We've become accustomed to things like direct integration into our lifestyle through our cell phones. Our entire life is customized around these co-pilots that exist right in our direct ecosystem.

I think utility CEOs should begin to expect that the people consuming what they're offering — their product, which is electrons — are actually expecting the same level of integration into their lifestyle. When they think of generative AI, it should be a way for them to offer additional integration downstream to their customer. Now they're actually working closer to their end customer. They're getting direct feedback from them daily through these platforms. They're able to answer their questions with a higher level of accuracy. Perhaps that even helps them reduce internal costs — if they can get the AI conversational models to support most of the questions that come through, that's great.

But they can use those same models to reach in and query the endpoints they've instrumented. Maybe it's through an AI meter like Utilidata provides, or instrumentation on the power lines, the transformers, the substations — being able to ask those devices, in real time, through generative AI queries:

How much power do you need today? How much do you need for the next hour? Do you have excess power that I need to be able to shed? Or better yet, what if the line just tells you it's about to drop, and you can de-energize it so it can be maintained and no one gets hurt? If we instrument this equipment — and it's really not as expensive as redeploying $5 trillion worth of infrastructure; it's significantly less than that — we can go a long way with some best practices.

We're already tackling those impossible problems and trying to democratize them, so that these utility executives can now ask their people, "Please, go turn this on. It gives us better integration with the evolving technology in the EV space and the evolving expectations of our customers. Our companies only need to grow in our ability to supply our customers with what they're asking for, which is power."

All of these things are best practices for the world that we live in today.

Viswanath: I do hope all of you have enjoyed this episode and will join us next month as we tackle microgrid applications and grant financing. Goodbye for now.

Disclaimer: The information provided in this podcast is not intended to be investment, tax, or legal advice and should not be relied upon by listeners for such purposes. The information contained in this podcast has been compiled from what CoBank regards as reliable sources. However, CoBank does not make any representation or warranty regarding the content, and disclaims any responsibility for the information, materials, third-party opinions, and data included in this podcast. In no event will CoBank be liable for any decision made or actions taken by any person or persons relying on the information contained in this podcast.
