Powering AI Solutions – Dr. Jonathan Koomey, President of Koomey Analytics

Speaker 1:

Welcome to the Upstack podcast, an ever evolving conversation on all things digital infrastructure, giving tech leaders food for thought as they push to stay ahead of the technology curve. I'm Alex Cole, and with my cohost and colleague, Greg Moss, we invite you to join us as we talk candidly about the latest technology infrastructure topics. Stay with us. Well, here we go. Another episode of the Upstack podcast.

Speaker 1:

Can't wait for the day when we record multiple episodes a day, Greg. You know?

Speaker 2:

That's gonna happen, Alex. We're gonna get... Yeah.

Speaker 1:

We're gonna get there, and for the listeners out there, hopefully my audio is coming through loud and clear, because a tremendous amount of effort went into trying to suppress some air conditioning sounds. But Greg, I've had a few topics buzzing around in my brain over these past few weeks. In no particular order: one, have you fully recovered from that hike we went on? Boy, I know.

Speaker 2:

You call it a hike, I call it a walk. But yes, I was a little disappointed I had to bring my boots along for such a light walk.

Speaker 1:

I mean, I guess I need to get some more cardio in, because I felt that was pretty strenuous. I've only just recovered. It was three months ago. And so that's been very much top of mind, because I can still feel it in my legs and my lungs. But the other topic that's been buzzing around in my mind, and I imagine the minds of many people the world over, especially since, what was it?

Speaker 1:

The fall, I think it was November, with the dawn of something that we now know as ChatGPT, is artificial intelligence. We know it's been around for decades, but it seems like it's at the forefront of culture and commerce and not going anywhere. So it feels like the perfect topic for the next episode of the Upstack podcast. Does that work for you, or do you wanna talk about the hike, or lack thereof?

Speaker 2:

Sounds super relevant. I'm really excited to kinda peel back the onion and find out how much of it is hype and how much of it is things we really need to kinda take into consideration as we move AI forward.

Speaker 1:

Move AI forward. That's gonna be a big onion. It's a big onion. I feel like you peel back a couple thousand layers and there's a couple thousand more to go.

Speaker 1:

Well, as I do, I was doing a bit of research. Did you know, our friends over at Equinix, the data center company? I do. Yeah, I know. You know them well. They recently released the results of a study they performed.

Speaker 1:

And there's some really interesting stats. They polled global IT leaders. 85% of IT leaders, the respondents to this study, were already using or planning to take advantage of AI.

Speaker 2:

That's a big step. That's a

Speaker 1:

pretty high number. It's a pretty high number. Eight point five out of 10. But roughly four out of 10 doubt that their existing infrastructure can handle the demands of AI. So fewer than half.

Speaker 1:

And a similar amount, about 40%, are not comfortable with their IT team's ability to accommodate the growing use of AI. So you have this topic of technology at the forefront of seemingly every aspect of our life, and there's a lot of interesting perspectives and takes on the active application of AI and potential application. But we know there's there's infrastructure that underpins everything. Now on top of all of that, we know that most Americans, as I mentioned, they're now familiar with with ChatGPT from OpenAI. One in three have actually tried it or other AI powered tools, be it Google's Bard or others.

Speaker 1:

On the flip side, AI operations actually take a ton of power, and we're gonna get into that in today's episode. And companies that are looking to utilize AI are actually rushing to figure out how they meet those demands for power. They wanna utilize the tech. It requires power, as most technology does. And it turns out there is a scramble for power, which kind of takes me back to one of our earlier episodes on the power grab.

Speaker 1:

But today, let's talk about how companies are scaling for AI right now, and really the power demands behind that technology. And if we can, why don't we break out the crystal ball and figure out what the future holds for AI and how this power grab might play out. How does that sound?

Speaker 2:

I think it sounds amazing. Let's get it. And I can tell you, I'm happy that the Wheel of Fortune got Ryan Seacrest and not you. I know it was

Speaker 1:

Oh, I tried.

Speaker 2:

You were a runner-up there, but I will tell you, we're happy that you were able to stay and we were able to keep you on the Upstack podcast.

Speaker 1:

I tried out, but my reel wasn't long enough. But if Seacrest falls on his face, look out, world. Look, I've always been a fan. Vanna White's been incredible, a true icon. It'd be a pleasure to work with her. But you're a decent second. But, Greg, you and I could banter on forever, and it won't be that entertaining.

Speaker 1:

I think this is where we actually need to bring in a subject matter expert. And I think we have, as always, the perfect subject matter expert. We have a doctor in the house, Greg. So to help guide us through this discussion of AI, you know, as it relates to power and many other related topics, we're joined today by Dr. Jonathan Koomey. Jonathan, welcome to the Upstack podcast.

Speaker 1:

We have a doctor in our midst. Buckle up, my friend. It's gonna be a fun ride. Lots of laughs. You are a researcher, an author, a lecturer, and an entrepreneur, and your work has spanned climate solutions and the energy and environmental effects of information technology.

Speaker 1:

We could not find a more perfect guest, and we're grateful. Thank you for joining us. We appreciate it.

Speaker 3:

Happy to be here.

Speaker 2:

You know, I'm happy you're here, and I'm happier because I have some things that have been keeping me up at night. One of the biggest things is around, you know, everybody's interest in AI, right? And what this has led me down a path of is really trying to understand: are all these applications, in this kind of rush to figure out how AI is influencing these companies, real? Is it not real? I mean, obviously, there's some fluff in it, some fad.

Speaker 2:

But I'd be curious to hear from you: how much is real demand? How much of this demand and exploration has staying power?

Speaker 3:

Well, first, one of my goals for this podcast is to make sure you get a good night's sleep going into it. So you see many references to these new AI tools in the media, and you've alluded to that in your introduction. And what's really clear is that there are applications for which the software is incredibly useful. And we're hopeful that these applications will be widespread, but at this point we don't really know just how widespread.

Speaker 3:

So there are some good examples. Actually these tools can help programmers be a lot more efficient. In some cases up to twice as efficient in creating code. There's a case study of call center workers using generative AI improving their productivity by 15 to 30%. So those are real effects and it's something that this is why there's so much interest is that there are these examples where these tools have had a dramatic effect.

Speaker 3:

The question is whether you can apply these tools across the board, and I think there is a lot of hype here. There's a lot of people kind of jumping on board the bandwagon, trying to apply these tools to different things, and you know, that's probably good in the beginning, because you want people to experiment and find where the new tech can be useful. But there's going to be places where it's just not going to help, or it's even going to be a problem. One of the things we've seen is that these tools just fabricate references. So for a researcher like me, you know, I want to know who is the person or people working on this topic.

Speaker 3:

So I want to know what articles are out there, and we've seen examples where somebody asked it a question and it just creates articles that don't exist, you know, and coauthors who never worked together, this kind of thing. And so they call this hallucination, which I think may be giving it a little too much credit. It's not actually thinking.

Speaker 3:

Right? What it's actually doing is saying: what is the most likely next word? It's using all this information about all the words that it's ingested, and it's saying, what would be the most likely thing to make this sound like a plausible story or a plausible article? And so if you want factual information, you have to be doubly careful here, because this thing will create stuff out of whole cloth.
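The "most likely next word" idea Koomey describes can be sketched in a few lines of Python. This is a toy bigram model over a made-up corpus, purely an illustration: real generative models use neural networks over subword tokens at a vastly larger scale, but the core loop, pick a plausible continuation, append it, repeat, is the same.

```python
import random

# Toy corpus; every name here is invented for illustration.
corpus = "the model picks the most likely next word and the model repeats".split()

# Count which words follow which word.
follows = {}
for prev, nxt in zip(corpus, corpus[1:]):
    follows.setdefault(prev, []).append(nxt)

def generate(start, length=5, seed=0):
    """Repeatedly append a plausible next word, with no notion of truth."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break  # no known continuation
        out.append(random.choice(options))  # sample a plausible next word
    return " ".join(out)

print(generate("the"))
```

Nothing in the loop checks facts, which is exactly why fluent-sounding but fabricated references can come out the other end.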

Speaker 3:

So on the one hand, we've got real benefits in certain cases. On the other hand, we have some downsides and people need to balance those and figure out where the tools are actually gonna make a real difference.

Speaker 2:

It makes a ton of sense to me. Right? But as I start to think through all the possible verticals out there, right, healthcare, finance, legal, not-for-profit, you know, everybody seems to think: if I don't somehow adopt AI, or at least look at it to see how it functions within my organization, I'm gonna somehow lose. And what that's created is this kind of mass rush to figure it out. Right?

Speaker 2:

Have you seen a situation like this in the past? Have you seen this type of hype cycle?

Speaker 3:

Yeah. Yeah. This is something, you know, I have enough gray hairs that I've seen a few of these hype cycles. In the late nineties we had the dot-com mania, and out of that came some great companies, like Amazon and a bunch of others, but it also led to huge amounts of money being thrown at ideas that just didn't have a business model and didn't make a whole heck of a lot of sense. And so it's the same sort of thing, and back then people still were worrying about power as well. You had reports of data centers using 300 watts per square foot.

Speaker 3:

You had people saying that the Internet was going to use half of all electricity in ten years. So this is not a new thing. Right? People are fascinated by new technology, and their tendency is to assume that because a technology is important economically or business-wise, it also has to use a lot of electricity. It turns out that's not actually true. It can be true sometimes, but oftentimes you can have the use of this technology and it'll either, you know, be improving in efficiency very rapidly, or it will displace other uses that are less efficient.

Speaker 3:

And I can give some examples of that, but the main thing to remember is that people love to exaggerate this. They love the steep part of the learning curve. They love the amazing factoids that they can share with their friends. But a lot of times the tendency is to overestimate the electricity used by these technologies, and that's something that people need to really watch out for.

Speaker 2:

Wow. Well, I mean, at least it's showing a brighter outcome. You know, I just feel like, when there's such gross amounts of power required to run these types of applications, it just gets scary, right, when a fad like this kinda kicks in. So I just wanted to validate and hear from you why you believe we're not on a crash course to exodus, right? And mass power outages and blackouts, and, you know, we're not even

Speaker 3:

Yes. So we're just at the early stage of tapping this technology. And for five or six years the technology companies have been focused on making these devices for machine learning more efficient. And this is a specific case. So this kind of AI is a special case of machine learning, and machine learning is what you would call a special purpose application for computing.

Speaker 3:

And it turns out that for these special purpose applications you can design technology that is super good just for that problem. And that's really where a lot of these efficiency gains will come from: optimizing the hardware and the software to solve the particular problem, which allows you to create these big generative AI models and to do it really quickly and really efficiently. And the important thing to remember is that we're just at the early stage of doing this. We're applying our smarts to this, and every time the technology industry applies its smarts to a real problem, they get it solved pretty quick. And so, you know, this happened in the early two thousands: we had a doubling of electricity use for data centers.

Speaker 3:

So it was a real thing. We were building out data centers in the US and around the world, and so electricity use doubled. That was a real thing that happened from 2000 to 2005. Around about 2006 and 2007 the industry really got focused, because they saw that this was a real problem, and they improved the efficiency of their hardware, improved the efficiency of their software, figured out how to make the cooling and the power distribution in data centers a lot more efficient, and they started to move towards what we now call hyperscale data centers, which are these super efficient data centers that are pretty homogeneous and that are really good at removing heat using not a lot of electricity. And you know, these are the data centers run by the big players whose names you know: Amazon, Microsoft, Google, Facebook.

Speaker 3:

And these big data centers turned out to be a lot more efficient than the corporate in-house data centers that they're mostly replacing. And that allowed us to vastly increase the delivery of computing services from 2010 to 2018, but the total electricity use for all data centers in the world only went up 6%. So you had, you know, compute instances going up 550%, data transfer going up 11-fold, and the amount of data stored going up 26-fold during that period, but total electricity use only up 6% total. So that's an example of how, when the industry focuses on a problem, they go after it and figure out how to solve it. And I'm sure we're kind of at the early stage of applying those smarts to the generative AI models. And they do that not just because it's efficient and because it saves money, but because it also generally improves performance. And so, you know, if you've got a model that takes a long time to train, the faster you can do that, the better off you're going to be, because then you can deploy it faster.
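As a back-of-envelope check on those 2010 to 2018 figures: reading "up 550%" as a 6.5x increase and "up 6%" as a 1.06x increase (the numbers come from the episode; the arithmetic and variable names are just illustrative), the implied per-instance efficiency gain is easy to compute.

```python
# 2010-2018 figures as cited in the episode.
compute_growth = 6.5       # compute instances up 550% = 6.5x
electricity_growth = 1.06  # total data-center electricity up 6% = 1.06x

# Energy per compute instance at the end of the period,
# relative to its 2010 level.
energy_per_instance = electricity_growth / compute_growth

print(f"Energy per compute instance fell to {energy_per_instance:.0%} of its 2010 level")
```

That works out to roughly a sixfold improvement in energy per compute instance, which is the "industry focuses and solves it" point in concrete terms.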

Speaker 1:

So efficiency is coming for AI. Greg can sleep better at night now. Although, you mentioned hype cycle, and I think that's actually an apt term, because there does seem to be this rush and this potential over-application of AI. But, John, maybe we don't need to worry about, you know, as the hyperscalers, there's only so much bandwidth they can offer, only so much space they can give the companies. We don't need to worry about smaller companies, you know, firing up a coal power plant or anything to power their AI initiatives. That's something that doesn't need to come into

Speaker 3:

Yeah. I suspect that on the margin there might be a few examples like that in the next few years, but I don't think it's, you know... the economics of coal turn out to be not very good nowadays, even from the direct cost perspective. If you look at it from society's perspective, coal hasn't been economic for a long time. So people are phasing out coal plants all over the world. And so, you know, there might be some natural gas plants that operate a little longer on the margin, you know, in a few cases.

Speaker 3:

But I wouldn't worry too much about it. I think the cost advantages that the big providers, the hyperscale providers, have turn out to be really, really big. And the reason is that they have many more users, and many more kinds of users. So they can spread their computations over the whole day, and that means more computations per piece of equipment. So if you're looking at it from a cost per computation or energy per computation perspective, the more computations you can get out of a given piece of equipment, the lower the cost is.

Speaker 3:

And it turns out a lot of these smaller companies are running data centers that are really not used very much. They're used at single-digit percentages, and there's a lot of zombie servers that are sitting around doing nothing. You know, 20 to 30% of them. So in that case the shift to hyperscale offers so much more efficiency at so much lower cost that the most likely thing is that the big hyperscalers are the ones carrying the load. And, you know, there might be some incremental energy use and infrastructure build-out in the smaller companies, but most of them are gonna be subscribing to this like software as a service. Right?

Speaker 3:

And because the economics are much better for the big players.
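The utilization economics behind that point can be sketched quickly. The dollar figures and rates below are hypothetical placeholders, not from the episode; only the utilization gap (single-digit in-house use versus pooled hyperscale demand) reflects what Koomey describes.

```python
def cost_per_computation(annual_cost, peak_rate, utilization):
    """Cost per computation for a server with a fixed annual cost.

    annual_cost: $ per server per year (hypothetical)
    peak_rate:   computations per year at 100% utilization (hypothetical)
    utilization: fraction of capacity actually used
    """
    return annual_cost / (peak_rate * utilization)

annual_cost = 10_000  # hypothetical $ per server per year
peak_rate = 1e9       # hypothetical computations/year at full use

in_house = cost_per_computation(annual_cost, peak_rate, 0.05)   # single-digit use
hyperscale = cost_per_computation(annual_cost, peak_rate, 0.50) # pooled demand

print(f"in-house:   ${in_house:.2e} per computation")
print(f"hyperscale: ${hyperscale:.2e} per computation")
print(f"advantage:  {in_house / hyperscale:.0f}x cheaper per computation")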

Speaker 2:

100%. So I agree the economics definitely will work in favor of the customers as they start to understand how the applications work for their organizations. You bring up a really good point. Right? The hyperscalers.

Speaker 2:

Right? You got the hyperscalers, you got Microsoft, Google. So I feel like, you know, if it was just AI, you know, what you're saying has some validity. I feel like if you take that and you marry that to the power demands the hyperscalers are asking for and now the unknown, right? The big unknown kind of sitting right dead smack in the middle is crypto.

Speaker 2:

I mean, whether you like it or not, it's still here. And I think that if you look at the three business models, right? You have hyperscale, which has to operate within a very defined cost basis for electricity, right? Crypto, which is probably even more stringent on the cost basis, right? And that fluctuates obviously with, you know, the value of the crypto.

Speaker 2:

And then you have AI. So if you have three, what we call mega markets kinda chasing power, you know, is there something to prevent, I don't know, despair? I mean

Speaker 3:

Well, okay. So what happens in these situations where there's a new rush towards something is that a lot of companies ask for power in multiple locations, because they wanna figure out, okay, where can they put a new data center? And sometimes they'll ask for it in, you know, three or five locations, and they're only gonna put it in two. So part of the thing that happens is that the local power people, the utility people, and the local authorities who worry about this, they see all these new applications and they say, oh my god, this is going to lead to, you know, a massive increase in our service area. It might. And then people kind of roll that up and say, my god, we're seeing this happening in dozens of places. But not all of that is real demand.

Speaker 3:

Some of it is companies covering their bets and investigating a bunch of different options in a bunch of different places, not all of which are gonna get a new data center. So that's part of it. Another part of it is the kind of conventional hyperscale data centers. Some of their activities may be displaced by the AI applications. So for example, if you see displacement of search by AI, you know, the electricity used on the search side is going to be reduced.

Speaker 3:

So I think of it like that parable of the blind men and the elephant. There's, you know, half a dozen blind men around the elephant, and one of them is feeling the leg. And so, oh, it's a big, thick thing. And one of them is feeling the trunk, and one of them is feeling the tail. And each of them has a piece of the picture.

Speaker 3:

But what's happening is that we need someone who can actually look at the whole thing and see what's happening. And that's something that we're not able to do from these piecemeal summaries that we're getting from individuals and institutions in local areas. So I would say it's early days. People need to start counting and adding this stuff up so that we have a clearer sense of what's actually happening. But I think it's likely that this initial enthusiasm will moderate, and that people will get better data on how many data centers devoted to AI are actually going to be built.

Speaker 3:

And then we'll be able to have a better judgment. But I think always in the early days of the hype cycle, it's it's very hard to know what's happening. And I think that's just the nature of the beast.

Speaker 2:

Okay. So, moving power consumption aside for a minute, let's just talk about locking power up. Right? Because I do see a ton of people in the market going out and just taking 250 megawatts or a gigawatt of power and kinda just getting it into contract, getting their PPA locked down, you know, for a rainy day. You know, that is

Speaker 3:

So you mean that they're investing in a new wind plant or a new solar plant? Or are they, like, what are they, are they getting something built, or are they just buying power that's already there?

Speaker 2:

So they're buying power. Maybe it's entitled property, maybe it's greenfield, and maybe it's a substation, maybe it's multiple substations. But what they're doing is they're going to the local power authority and saying, hey, you have, you know, a gigawatt of power here. I want that. Let me sign a ten, fifteen, twenty year PPA.

Speaker 2:

And guess what? I may or may not build. I don't know. And I feel like, you know, as you lock up power, whether it's used or not, if there's several companies that kind of own that, what kind of implications does it have? I'm shifting gears from B2B to B2C for a minute, for just power costs in general.

Speaker 3:

Well, if these companies are actually gonna contract for new renewable power that wouldn't have been built otherwise, then I think that's gonna lead to power costs going down for everyone, because on the margin renewables, wind and solar, are the lowest cost options, particularly from a society perspective, but even from a direct cost perspective they're increasingly the cheapest game in town. And, you know, that's including some battery storage to allow you to shift some of the generation to later in the day, this kind of thing. So I think if they're building new renewables, then that's probably a good thing. And if they sign a contract that gives them an option for that, I don't know that there's any harm done, particularly if they actually end up building some significant fraction of the ones that they're contracting for. So I guess I'm not too worried about that.

Speaker 3:

There's plenty of wind and plenty of solar in lots of places. So it's only a limited resource in the sense that there's a limit to how fast we can build. But increasingly even that is becoming less of an issue, because the solar and wind industries are doubling every two or three years. And we're really at the scale now where, you know, recently it's been 90% wind and solar on the margin, in terms of the capacity that's been added. And the industry is so big and scaling so fast that we're not going to be as constrained as we used to be. So I guess I'm not too worried about that part.

Speaker 3:

The tech industry, I think, has been particularly forward-thinking in pushing renewable projects that wouldn't otherwise have been built. Because they've got high margins, they've got the need for stable power, they have customers who actually care about the emissions impact of the data centers. So they're really incented to do something about that problem. And for them, they wanna just put it behind them. They really see that as a problem.

Speaker 3:

They wanna put it behind them. And by signing these long term contracts and actually building these facilities, they're lowering power costs for everyone and they're, you know, contributing to the energy transition which I think is a pretty good thing.

Speaker 2:

I love the optimism.

Speaker 1:

A lot of optimism. What I'm hearing is power supply is keeping up with, if not outpacing, demand.

Speaker 3:

Well, so far. I mean, you know, it is always possible that in specific places they're gonna have constraints. I tend to think more at the global scale, you know, the US national scale, and from that perspective, data center electricity use is about 1% of our electricity use. Crypto might add another half a percent. And AI right now has been relatively small, and we'll see where that goes. But we're not talking about 10 or 15 or 50% of electricity. We're talking about one or 2%.

Speaker 2:

So who's using the power? Who's using all of our power?

Speaker 3:

Oh, you mean the rest of it? The other 98%?

Speaker 1:

That was going to be my question.

Speaker 3:

Houses and industry and commercial buildings. I mean, almost two thirds of the electricity goes to buildings, you know, residential buildings and commercial buildings of all types, so that's a huge part. And then, you know, as we build out for electric vehicles, that might increase what we're using now by another 20 or 25%. So it's not outrageous. There will probably be a shift to electricity, and we're gonna have to build a lot more, but that's gonna displace fossil fuels.

Speaker 3:

And that's probably a good thing for a lot of reasons.

Speaker 1:

Definitely a good thing. When you read about AI, so often it's new policies, procedures, guidelines, guardrails being put in place to control the application of the technology. I'm wondering, do you foresee similar moderation guidelines being applied to the infrastructure of AI? We're talking about power, but also what supports creating the thing. Do we need to put constraints in place? Not constraints, but guardrails, I guess, I'll stick with.

Speaker 3:

Well, you could think of it as standards. You don't have to think of it as constraints, really. But standards turn out to be really important, especially when an industry is growing rapidly, because ideally you want to be able to take devices from manufacturers A, B, and C and put them into your data center and have it all work. Right? And so that's why companies do agree on standards. And I think that we're just at the early stage of figuring that out for the kinds of computing installations that would be used for AI.

Speaker 3:

And one of the things is that these data centers, or the racks in which you're doing these AI calculations, are so dense that you really do need some sort of non-air cooling. Like, you need to take water to a heat exchanger that clamps onto the processor and pulls the heat directly off. So that's a big change, right? And it's not just water. There are other sorts of phase change materials you can use: a liquid goes on the chip, it heats up, it evaporates, and it comes back and condenses.

Speaker 3:

So there's different ways to do that, but I think having standards is really important for allowing the industry to grow in a sustainable way. And there's this thing called the Open Compute initiative, and that's something I think Facebook started. And they have been really important at getting standardization on: here's a rack, this is how big it is, and airflow needs to be a certain direction, because otherwise it's mayhem in the data center. Like, a lot of these devices, you know, some of them have airflow left to right, some have back to front, some have front to back. And it's really hard to plan cooling in the data center when the equipment is so variable.

Speaker 3:

And so the Open Compute initiative has been really powerful at standardizing the industry in a way that's helpful. It still allows competition, but it gets rid of the kind of wasteful variation that doesn't contribute to the end result, which is more compute, less money, less energy, less emissions.

Speaker 1:

We're early on this, back to the hype cycle, because of this tremendous interest in the use of AI, and you figure on the path to ubiquity it eventually kind of settles somewhere between the floor and wherever the ceiling may be.

Speaker 3:

Yeah. And we'll get more clever. I mean, I think that's part of it: when it's early days, people are just basically throwing in equipment as fast as they can. And once things get a little slower, they can kind of step back and say, okay, what are we trying to do here, and how can we do it in the most effective way? But in the beginning, everyone's just throwing as much equipment as they can into the data center.

Speaker 1:

This is a thing. How does it apply to our business? I mean, we hear directly from customers, and just in and around the industry, you've got senior-level leaders, even nontechnical leaders, board members, asking their teams to figure out: how does AI apply to our business? What are the positive or even negative influences it could have on our business? And because of that lack of expertise, there's an increased reliance on third parties to help answer those important questions, a role we're happy to play.

Speaker 1:

But I'm curious, John, what should they be looking for, frankly? How deep should they dig as they look at how AI can apply to their business and what the implications may be?

Speaker 3:

So I would say the most important thing is experimentation, because it's early days yet, and people need to take these tools and apply them to the places where others have already applied them, as well as new places within their own businesses, and see where things can help. You know, machine learning has a really long history of doing amazing things. One of the things that Google did was they applied machine learning, not the generative AI but a precursor, to their data centers. And they were able to save millions and millions of dollars in their operation costs by running what is now a pretty simple model compared to some of these other ones. So there are a lot of applications where it can work really well.

Speaker 3:

And each company has its own unique culture, its own unique set of problems and opportunities. And so I think experimentation is the name of the game and people should be trying to apply this in ways that are broad and in different parts of their businesses. It's not magic. So I think it's important for people not to fall for the idea that it's going to automatically do amazing things for you. In some applications maybe it won't.

Speaker 3:

But there are enough applications that exist that actually generate value and that have a business case that most companies should be at least thinking about doing this kind of experimentation.

Speaker 1:

Test and learn.

Speaker 3:

Test and learn.

Speaker 1:

We like test and learn.

Speaker 3:

Fail fast baby, fail fast.

Speaker 1:

Fail fast, learn hard. Do you see any companies that have good use cases in solid business models using AI?

Speaker 3:

I actually generally don't comment on individual companies like that. I mean, if they're doing some things on physical infrastructure, I'll talk about it. But I think the main thing is to find examples where it's been successful. There are studies, and it sounds like you'll put some of those in the show notes, that give those examples. And so if those examples sound like your business, or something like your business, they can give you clues as to where you might be able to get the best value from the use of these tools.

Speaker 1:

And that list of examples is growing every day. Yeah. I can only imagine.

Speaker 2:

Yeah. Well, I believe it warrants regulation. Right? At the end of the day, you know, I think that you're right. It doesn't seem like there's gonna be this major concern around running out of power in the United States.

Speaker 2:

Seems like it's a manageable piece. I do believe it coincides with our green initiatives as a country. Right? And I believe that should require some sort of regulatory laws, at least with the big guys. Right?

Speaker 2:

The people that are coming in and buying up the big tranches. If you don't hit them early and say, hey, listen, we need at least a fixed percentage of this power to be green, then I think that kind of breeds, you know, an opposing force with what we're trying to accomplish as a country.

Speaker 3:

Most of the big players are already pretty far down the path of having, you know, almost 100% of their power coming from low- or zero-emitting sources. So they're pretty good. That needs to continue, and the way this normally works is it's kind of a collaborative thing, where the government people will go to them and say, you know, we really want you to focus on this, and then they say, oh yeah, we can do that, and then two years later they're at 100% or whatever. Historically that's how it's been. So in a way it's kind of a collaborative thing in a lot of cases.

Speaker 3:

Nobody forced Amazon and Google to procure all this renewable power. They just saw it was in their interest to do it.

Speaker 2:

That's great.

Speaker 3:

And so I think that's a hopeful sign there. Not every industry does it that way, but these folks generally have been pretty good about that. In terms of regulation, though, people are also talking about whether there are guardrails on how you use these tools, and whether they could be applied in ways that hurt society, and that's a discussion that's ongoing. That discussion will affect the underlying demand for these tools. Right?

Speaker 3:

Because if there are certain applications that could disrupt elections, for example, or cause some sort of civil unrest, which you could imagine easily, then there will be guardrails that will slow the growth of the technology. And that feeds back to the energy side, because there are two components of the energy demands: there's the demand for AI services, which can grow really fast or a little slower, and then there's the rate of efficiency improvement, and those are the two things that can vary a lot. People often forget the efficiency side, but the demand also is highly uncertain. Like right now everyone's into it and they think this is going to be amazing, it's going to take over the world, but reality will eventually intrude, and there will be applications for which this is more useful and applications for which it is less useful. And then, you know, things will settle down.

Speaker 2:

Alex, I think that's another podcast. Legal's behind, I mean, we're still figuring out legal for the web, right? I mean, that's still a thing. I'm sure there'll be a lot of interesting topics.

Speaker 3:

Well, there's copyright, for example. That's a proximate one. Right? Because they're ingesting all this material, and a lot of it is copyrighted. And so right now, first there were some lawsuits, but now you're seeing the big companies realizing, oh okay, we have to pay people for this.

Speaker 3:

But there are all these legal issues around, well, what is fair use? What are you using, and what is the result that you're creating? Do you have legal liability for that? Let's say you ingest a bunch of material that gives people bad advice and they crash their cars because of it. I don't know, I'm just making it up.

Speaker 3:

But there's all sorts of possibilities with anything this new that need to be explored. So another podcast, you guys should do it.

Speaker 1:

Love it. It's one of the aspects of AI that makes it so fascinating: how much of our life it currently touches and could touch as people test and learn and fail fast. And, Greg, I think we can agree the onion actually might be bigger than we originally imagined. I mean, John, you touched on power and cooling. Cooling servers and new ways to cool servers, and we've looked into and worked with some clients on really interesting submersion options.

Speaker 1:

You know, submerse your servers in goo. Really?

Speaker 3:

Yeah. Or it's oil. It's gooey oil. Gooey oil.

Speaker 1:

It gets the job done.

Speaker 3:

Gets the job done, but boy, it scares the heck out of a lot of people in the day-to-day.

Speaker 1:

Yeah. Mission critical infrastructure floating in goo sideways.

Speaker 3:

Well, okay. So that said, it is true that a lot of times when these servers break, they don't fix them. They just take them out and recycle them. Right? Because basically they've set up the software stack in a way that allows you to move loads around.

Speaker 3:

It's all virtualized.

Speaker 2:

But you don't wanna have it break at half-life because of some of

Speaker 3:

Of course. Yeah. Yeah. Yeah. And fixing it is a big pain.

Speaker 3:

But that's why

Speaker 1:

that's waste.

Speaker 3:

Well, but that's why this direct-to-chip technology is something that people are exploring more seriously. Yep. And density generally means better economics. It depends on how expensive it is to pull the heat off, but if you can pack more in, you've got some fixed costs, you can spread things over more servers, and that's generally a better way from a total cost perspective.

Speaker 1:

Agreed. It feels like the natural evolution: as adoption increases, efficiency increases with it, for the most part, we hope. We hope that applies to AI as Greg promotes his favorite seltzer brand.

Speaker 2:

Listen. I'd love to have Otis back on the on the podcast.

Speaker 1:

We we need Otis's take on this.

Speaker 3:

Okay. You know, Otis

Speaker 1:

will AI and come in dogs.

Speaker 3:

Otis will come in in a little bit, I'm sure, because he gets tired of sleeping and he wants me to do something useful for him.

Speaker 1:

Well, John, as we frantically write down the other topics we're gonna need to cover on future podcasts, I think we've got the next four years of podcast episodes covered. But then again, you have a day job, and Otis seems pretty keen to take a walk. So

Speaker 3:

Yeah. Yeah. Yeah.

Speaker 1:

We'll wind things down for today, but only if you promise to come back as we tackle, I mean, the infinite number of threads and related topics we're gonna need to sort out as it relates to AI.

Speaker 3:

Okay. Well, I'm happy to do it, and maybe we'll have more actual data when we talk again. That would be good. Right now, you know, we really need more data.

Speaker 1:

Numbers are helpful. That's for sure. What is also helpful is related information, and, as you mentioned, John, we'll make sure that's linked in the show notes for people who wanna explore further some of the topics and resources we touched on in today's episode. But, for now, we'll bid you adieu, and listeners, we'll bid you adieu as well, and we'll see you on the next episode of the Upstack podcast. Thanks for listening.

Speaker 1:

Thank you for listening to the Upstack podcast. Don't forget to like or subscribe to the show wherever you get your podcasts. We'll see you next time.

Creators and Guests

Alex Cole, Host: SVP of Marketing at UPSTACK
Greg Moss, Host: Partner and Managing Director at UPSTACK