Uncanny Valley: The AI Infrastructure Boom and Its Implications
Introduction
This episode of Uncanny Valley examines the enormous sums tech giants are pouring into AI data centers, the concerns those deals raise, and the forces driving this growing phenomenon. Tech giants have injected hundreds of billions of dollars into AI data centers this year alone, and as the deals accumulate, so do questions about their viability and sustainability.
Michael Calore, Lauren Goode, and Molly Taft discuss how these energy-intensive facilities operate, the diverse industry interests at stake, and the potential for this expansion to face setbacks.
Mentioned Articles
- “The AI Industry’s Scaling Obsession Is Headed for a Cliff” by Will Knight
- “OpenAI’s Blockbuster AMD Deal Is a Bet on Near-Limitless Demand for AI” by Will Knight
- “A Political Battle Is Brewing Over Data Centers” by Molly Taft
- “How Much Energy Does AI Use? The People Who Know Aren’t Saying” by Molly Taft
Hosts’ Contact Information
- Michael Calore can be followed on Bluesky at @snackfight.
- Lauren Goode can be followed on Bluesky at @laurengoode.
- Molly Taft can be followed on Bluesky at @mollytaft.
- Write to the show at uncannyvalley@wired.com.
How to Listen
- Listen to this week’s podcast via the audio player on the page.
- To subscribe for free and get every episode:
- On an iPhone or iPad, open the Podcasts app or tap the provided link. Alternatively, download apps like Overcast or Pocket Casts and search for “uncanny valley.” The show is also available on Spotify.
Transcript
Note: This is an automated transcript and may contain errors.
Michael Calore: Hey, Lauren. How are you?
Lauren Goode: Hey, Mike. I’m great. It’s wonderful to be back in the studio with you. Our schedules didn’t align in the past few weeks.
Michael Calore: Indeed. But now, everything has fallen into place, and here we are again.
Lauren Goode: Here we are. I’m sure our listeners have been wondering when we’d be back together. I attended another AI-related dinner last night. Everyone’s discussing AI, especially whether we’re in an AI bubble. This is likely fueled by the news of numerous AI infrastructure projects.
Michael Calore: Yes, data centers.
Lauren Goode: Data centers.
Michael Calore: Large warehouses filled with computers.
Lauren Goode: Filled with server racks.
Michael Calore: We’ll cover all aspects of this on today’s show. But first, let’s welcome our guest, WIRED’s senior writer and climate and energy expert, Molly Taft. Hello, Molly.
Molly Taft: Hello.
Lauren Goode: Hi, Molly. Is this your first Uncanny Valley podcast?
Molly Taft: Yes, and I’m a fan. This is a significant moment for me. I’m excited to discuss my favorite topic, which is what people ask me about the most at parties these days.
Lauren Goode: And what would that be?
Molly Taft: Data centers. It’s all about data centers.
Lauren Goode: I thought you’d say electricity bills.
Molly Taft: Well, they’re related, aren’t they?
Michael Calore: Yes.
Molly Taft: People are quite passionate about it.
Michael Calore: This is WIRED’s Uncanny Valley, a show focusing on the people, power, and influence of Silicon Valley. Today, we’re exploring the AI infrastructure boom. In recent years, tech giants such as OpenAI, Amazon, Meta, and Microsoft have invested hundreds of billions of dollars in data centers. These are large facilities filled with servers that provide the substantial computing power required to run AI models. Recent announcements, like OpenAI’s new Stargate data center in Texas or the AI startup’s deal with chipmaker AMD, have further increased the hype and capital surrounding the expansion of these data centers. However, as the deals continue to accumulate, the issues associated with this rapid expansion are becoming increasingly concerning. We’ll explore how these data centers operate, their impact on communities, and what they reveal about the current state of the AI industry and the economy. I’m Michael Calore, Director of Consumer Tech and Culture.
Lauren Goode: I’m Lauren Goode, a senior correspondent.
Molly Taft: And I’m Molly Taft, a senior writer covering energy and the environment.
How AI Queries Interact with Data Centers
Michael Calore: I believe most of us, including our listeners, are familiar with data centers from the news and have a general understanding of what they are. But let’s discuss why AI companies rely on them. How does a ChatGPT query from my computer end up in a data center?
Lauren Goode: This is an excellent question. Let’s get into the details. Say you type into ChatGPT something like asking for dinner recipes tonight or “What should I get Mike for his birthday next year?” You send your request, which first goes through several checkpoints on OpenAI servers: authentication to verify you’re a valid user, moderation to ensure the prompt adheres to their guidelines, and load-balancing to determine which data center should handle the request. The text you write is broken down into small chunks called tokens, which are like puzzle pieces that AI models can process. At this stage, the request reaches specialized hardware, usually GPUs.
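The tokenization step Lauren describes can be illustrated with a toy greedy splitter. This is only a sketch: production models use learned byte-pair-encoding vocabularies with tens of thousands of entries, and the `toy_tokenize` function and tiny vocabulary here are invented for illustration.

```python
# Toy greedy tokenizer: a stand-in for real subword tokenization.
# Production systems use learned byte-pair-encoding vocabularies;
# this function and its tiny vocabulary are invented for illustration.
def toy_tokenize(text, vocab):
    """Greedily split text into the longest chunks found in vocab,
    falling back to single characters for unknown input."""
    max_len = max(map(len, vocab))
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest vocabulary entry that matches at position i.
        for size in range(min(len(text) - i, max_len), 0, -1):
            chunk = text[i:i + size]
            if chunk in vocab or size == 1:
                tokens.append(chunk)
                i += size
                break
    return tokens

vocab = {"What", " should", " I", " get"}
print(toy_tokenize("What should I get", vocab))
# → ['What', ' should', ' I', ' get']
```

Each chunk the model sees is one of these pieces, not a whole word or sentence, which is why token counts, not word counts, drive compute cost.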
Molly Taft: Correct. The next part is crucial. GPUs (graphics processing units) are essential for AI systems. These processors excel at parallel processing, meaning they can perform many calculations simultaneously. Data centers are filled with rows of servers containing these GPUs. Once you send your query, it’s processed by these GPUs. You may have heard of some well-known ones in tech headlines, like Nvidia’s H100s.
Lauren Goode: Ah, the H100. It’s quite popular.
Michael Calore: An old acquaintance.
Lauren Goode: To bring it back to our query, once it arrives at the data center and its servers, the AI model starts working. This stage is called inference. The model predicts which tokens should come next, building up a complete answer piece by piece. Finally, the response is sent back through the same network path to your browser or app. This is a simplified explanation, but in general, that’s why your ChatGPT question needs to go through these data centers.
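The next-token loop Lauren describes can be sketched as below. This is a hedged toy: in a real model, a neural-network forward pass scores the entire vocabulary at every step, and the `next_token_scores` function here is a hypothetical stand-in for that pass.

```python
import random

# Sketch of autoregressive decoding. In a real system a neural-network
# forward pass scores every token in the vocabulary at each step;
# next_token_scores() is a hypothetical stand-in for that pass.
def next_token_scores(context):
    candidates = ["a", "book", "about", "data", "centers", "<end>"]
    random.seed(len(context))            # repeatable toy scores
    return {tok: random.random() for tok in candidates}

def generate(prompt_tokens, max_new=5):
    tokens = list(prompt_tokens)
    for _ in range(max_new):
        scores = next_token_scores(tokens)
        best = max(scores, key=scores.get)   # greedy decoding
        if best == "<end>":                  # model signals it's done
            break
        tokens.append(best)
    return tokens

print(generate(["What", "should", "I", "get"]))
```

Each pass through the loop is one more trip through the GPUs, which is why longer answers cost more compute than short ones.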
Michael Calore: And all this happens in just a few seconds.
Lauren Goode: It’s quite remarkable.
Michael Calore: Indeed.
Lauren Goode: Yes.
Michael Calore: I appreciate the breakdown. Also, for the record, I probably wouldn’t ask ChatGPT for recipe recommendations. Lauren, I hope you get a good answer for my birthday gift.
Lauren Goode: You might remember I did this last year, and I still haven’t gotten you anything.
Energy Consumption of Data Centers
Michael Calore: The next question regarding data centers is their energy intensity. Even a simple query involves accessing multiple servers, breaking the query into tokens, and distributing them across the data center. What impact do these computing requirements have on the environment? Molly, I’d like to ask you this as you’ve reported on this extensively. What does the energy consumption of a data center look like? Are we only talking about energy, or does water usage come into play as well?
Molly Taft: Large data centers have complex operations. They require cooling systems, lighting, and network equipment, all of which consume a significant amount of energy. Their energy consumption isn’t constant; it cycles based on query volume, often reducing at night when queries are fewer. The environmental footprint of a data center depends on the type of energy it uses. If connected to a dirtier grid, such as one powered by fossil fuels, it will generate more emissions. A data center running on solar or wind power will have a lower footprint. Calculating the exact footprint of a specific data center is challenging as much of the environmental impact information is proprietary. Most of the data we have is volunteered by companies. For example, Meta is building a data center called Hyperion in Louisiana as part of a large-scale expansion. It’s projected to be about five gigawatts, which is massive, nearly half of New York City’s peak power load. There are many such projects globally. Some regions are already concerned about data center energy consumption. In Ireland, data centers currently use over 20% of the country’s electricity. Virginia is also facing a significant increase in projected data center energy use in the coming years. Overall, there’s a clear upward trend in data center energy consumption.
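The scale figures above can be checked with back-of-envelope arithmetic. The numbers below are round assumptions (NYC’s peak load is commonly cited at roughly 10 to 11 gigawatts), not official measurements:

```python
# Back-of-envelope check of the scale figures above, using round
# assumed numbers (NYC peak load is commonly cited at ~10-11 GW).
hyperion_gw = 5.0                 # projected Hyperion capacity
nyc_peak_gw = 11.0                # assumed NYC peak load

print(f"Hyperion vs. NYC peak: {hyperion_gw / nyc_peak_gw:.0%}")

# If a 5 GW campus ran at full load all year:
hours_per_year = 365 * 24         # 8,760 hours
annual_twh = hyperion_gw * hours_per_year / 1000
print(f"Annual draw at full load: {annual_twh:.1f} TWh")
```

At full load that works out to roughly 44 terawatt-hours a year for a single campus, which is on the order of a small country’s total electricity use.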
Lauren Goode: You mentioned that each tech or data center company reports its energy usage. I’m curious about the depth of their reporting. For instance, if they use a certain number of GPUs, and the components of these GPUs are shipped or manufactured globally, that also incurs an emissions cost. Are any of them including this in their reports?
Molly Taft: The issue with climate and emissions reporting is that it can be a never-ending rabbit hole. What you’re referring to is part of emissions reporting, which aims to determine the total environmental footprint of a company’s activities. Climate experts and those who analyze emissions are constantly trying to calculate this for specific companies. However, many companies are reluctant to delve too deeply into this chain. When we discuss power use, we usually focus on on-site power use, i.e., what’s needed to operate the data center. The actual total footprint is likely much larger than we currently understand.
Lauren Goode: When you mentioned people analyzing emissions, I think of Sasha Luccioni, the climate lead at Hugging Face. She’s quite vocal in questioning the numbers presented by tech leaders.
Molly Taft: Yes, Sasha is excellent at highlighting issues, especially when model energy usage is less transparent. There are often numbers thrown around, like Sam Altman’s blog post this summer stating that the average ChatGPT query uses about as much energy as an oven in just over one second or a high-efficiency light bulb in a couple of minutes, around 0.34 watt-hours. Sasha correctly points out that such figures are insufficient. What does “average query” mean? How many queries occur? What kind of grid are the data centers connected to? Are they renewable-based or coal-based? There are numerous factors influencing product energy use. Sasha put it bluntly, saying Altman “pulled that figure out of his ass.” She also said, “It’s astonishing that we can know a car’s miles per gallon, yet we use AI tools daily with no efficiency metrics or emissions factors.” A single number doesn’t provide a comprehensive view and may just allow companies to delay releasing more data.
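Altman’s comparisons can at least be sanity-checked with unit conversion, assuming typical appliance wattages (his post doesn’t state which ones he used):

```python
# Unit-conversion check of the ~0.34 Wh-per-query figure, using
# assumed appliance wattages (the blog post doesn't state them).
query_wh = 0.34
query_joules = query_wh * 3600        # 1 Wh = 3,600 J

oven_watts = 1000                     # assumed modest oven element
bulb_watts = 10                       # assumed high-efficiency LED

oven_seconds = query_joules / oven_watts
bulb_minutes = query_joules / bulb_watts / 60

print(f"Oven-equivalent: {oven_seconds:.1f} s")     # just over one second
print(f"Bulb-equivalent: {bulb_minutes:.1f} min")   # a couple of minutes
```

Both equivalences check out arithmetically, but as Luccioni notes, the per-query figure itself is the part no one outside the company can verify.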
Michael Calore: We lack clear indicators of how much energy each query and data center consumes, other than knowing it’s substantial and a problem. Despite this, many big tech companies investing in AI infrastructure continue to build. Who are the other major players in this space?
Lauren Goode: The major ones are likely those we’ve already mentioned, such as OpenAI, AMD, Nvidia, and the Stargate Project, which is an umbrella term for multiple partnerships. The Stargate Project is a $500 billion, 10-gigawatt commitment between OpenAI, SoftBank, Oracle, and MGX, representing the interconnected relationships between hyperscalers and chipmakers. Interestingly, the way these investments are described has shifted, with an emphasis on gigawatt investments. These are often staggered investments based on the assumption that AI demand will continue to grow, and more compute power will be needed. It’s not straightforward unit-based reporting like “for this investment, a company gets X million GPUs.”
Hyperscaling and Political Influences
Michael Calore: You mentioned hyperscaling. It seems like we’ve been discussing it, but are there companies aiming to grow even more?
Lauren Goode: Well, they all want to grow. Hyperscalers refer to a class of major tech companies or cloud service providers, including Meta, Amazon, Microsoft, and Google.
Molly Taft: These companies have vast financial resources and can raise capital easily. They’re capable of undertaking large-scale, rapid construction projects. They’re getting creative as their goal is to quickly build and operationalize these data centers to compete with each other.
Lauren Goode: I agree, Molly. There’s a lot of “frenemy” building going on. I’d love to see their group chats during announcements.
Michael Calore: Speaking of “frenemies,” these companies also operate in the political sphere. To build a large data center, political support is required, involving local residents, governments, and the state or country. What’s happening politically regarding data center construction and opposition, as well as regulation?
Molly Taft: This is a great question. The national and local conversations differ. At the national level, the administration in Washington is generally supportive of building an American AI empire. In terms of energy, the Trump administration’s approach has been to support fossil fuels for data center power, including oil, gas, some nuclear, and coal. This benefits these industries as the massive expansion of power demand makes them key energy providers. Locally, there’s growing opposition to data centers for various reasons, such as water use, concerns about rising electricity rates, and noise. Some high-profile struggles have brought this issue to national attention. For example, when Elon Musk wanted to set up xAI in Memphis, he installed unpermitted gas turbines in a predominantly Black community already facing air pollution and asthma issues. Earlier this year, there was an attempt in DC to impose a moratorium on state AI regulation in a large bill, which ultimately failed. Marjorie Taylor Greene opposed it, even mentioning data centers and comparing AI to Skynet from the Terminator franchise. This shows the contrast between the administration’s push, the interests of powerful energy companies, and grassroots local movements concerned about community impacts.
Is AI Expansion a Good Idea?
Michael Calore: We’ll take a short break. When we return, we’ll discuss why AI companies’ aggressive scaling might backfire. Stay tuned.
Welcome back to Uncanny Valley. We’ve discussed why data center investments are increasing and their known impacts. Now, is this aggressive expansion a good idea?
Lauren Goode: It depends on who you ask, Mike. AI founders typically claim there’s no need to worry and that it’s necessary to meet current and future AI demand. However, the main issue is that despite all this investment, consumer spending on AI isn’t there yet. Much of the frontier model companies’ significant revenue comes from enterprise customers, developers building for their companies. I believe that unless consumers become developers, using AI to build and write code, it’s unclear how revenues will increase. The Economist reported that AI hyperscalers, the largest AI spenders, are using accounting tricks to reduce reported infrastructure spending, thus inflating profits. This is why many are concerned about an AI bubble. It’s about supply and demand, and currently, we’re investing heavily in supply, hoping demand will continue to grow.
Michael Calore: Fingers crossed.
Molly Taft: When discussing demand, there’s more than just accurately predicting the future. In the late ’90s and early 2000s, there were claims about the internet’s massive energy consumption, suggesting it would use half of the US’s electricity by the 2010s and require more coal-fired power plants. But researcher Jonathan Koomey found that these claims were pushed by industries that stood to gain from such a build-out. In reality, efficiency gains reduced internet energy use. We’re in a situation where US energy use has been stagnant, and we’re becoming more efficient. Now, there’s a potential new large-scale customer in various industries, not just energy. It’s difficult to determine if we’re in a bubble when so many have an interest in people believing in continued growth.
Lauren Goode: When investing in infrastructure, it’s a fixed investment based on the assumption that computing methods will remain stable for a while. However, some computationally intensive AI models may soon offer diminishing returns compared to smaller models. For example, OpenAI’s frontier models are currently more advanced than those from academic labs, but this may not always be the case. There’s a lot of research on new alternatives to deep learning, novel chip designs, and quantum computing. The success of DeepSeek’s low-cost model in China in January served as a reality check for the AI industry.
What Citizens Can Do
Michael Calore: As citizens, we can decide whether to use AI tools and get involved in data center-related decisions. What can we do?
Molly Taft: I often think about this. I assume most listeners are interested in learning. One thing I tell people at parties is to learn more about their local electric utility. If a data center affects your electric bill, it’s related to your utility. US utilities are structured peculiarly; many are investor-owned, aiming to make a profit while controlling your monthly bill. Although the problem may seem overwhelming, delving into the details can help you find people organizing for better utility practices, such as using more renewable energy and monitoring rate hikes. They’ll also be informed about data center impacts on your bill. So, I encourage people to read more about US electric utilities.
Lauren Goode: That’s great advice. Molly, you’ll be glad to know that a few weeks ago, I visited Intel’s new chip fabrication plant in Chandler, Arizona. Among the mainstream tech journalists, there was a woman from an Arizona paper closely monitoring power, utilities, and water usage in the area.
Molly Taft: She’s the real deal. Read her work.
Lauren Goode: My advice isn’t directly related to energy or AI use. I’d say double down on the humanities. Even if we’re in an AI bubble or it bursts, AI isn’t going away. Its form may change, but I believe our ability to think, build human relationships, and appreciate human-generated art will set us apart. So, I spend time reading books, watching films, and reconnecting with family and friends when I’m not online. It’s my small act of resistance.
Michael Calore: That’s excellent.
Molly Taft: Agreed.
Michael Calore: You need to understand these technologies to form an opinion. Engage with them, but don’t overuse AI. Avoid thanking the machine, since those extra messages consume more resources. Understand the technology well enough to know why having an opinion is important. Also, resist AI features being added where they’re not needed. If you don’t see the utility in AI features on your phone or car, turn them off.
