Ghost data centers: What they are (or aren't) and why they're hindering the real promise of AI




In the age of AI, public utilities are facing a new and unexpected problem: phantom data centers. On the surface, it may seem absurd: Why (and how) would anyone fake something as complex as a data center? But as demand for AI rises along with the need for more computing power, speculation around data center development is creating chaos, especially in areas like Northern Virginia, the data center capital of the world. In this evolving landscape, utilities are being hit with power requests from real estate developers who may, or may not, actually build the infrastructure they claim.

Phantom data centers represent a critical barrier to scaling data infrastructure to keep up with computing demand. This emerging phenomenon prevents capital from flowing where it needs to go. Any initiative that helps solve this problem – perhaps using AI to solve a problem created by AI – will have a huge advantage.

The gigawatt mirage

Dominion Energy, the largest utility in Northern Virginia, has received requests totaling 50 gigawatts of power from data center projects. That's more power than Iceland consumes in a year.
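The scale of that comparison is easy to sanity-check. The sketch below converts the requested power (gigawatts, a rate) into annual energy (terawatt-hours) assuming continuous operation; the ~19 TWh/year figure for Iceland's electricity consumption is an outside approximation, not from this article.

```python
# Back-of-the-envelope check: 50 GW of requested capacity vs. Iceland's
# annual electricity use. Assumes the full capacity runs year-round,
# which overstates real load but matches the spirit of the comparison.
HOURS_PER_YEAR = 8760

requested_gw = 50          # total power requested from Dominion Energy
iceland_twh_per_year = 19  # rough annual electricity consumption of Iceland (assumption)

# Energy = power x time; divide by 1000 to convert GWh to TWh.
requested_twh = requested_gw * HOURS_PER_YEAR / 1000

print(f"50 GW running year-round ~= {requested_twh:.0f} TWh/year")
print(f"That is ~{requested_twh / iceland_twh_per_year:.0f}x Iceland's annual use")
```

Even if only a fraction of the requests are real, the energy involved dwarfs a small country's grid.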

But many of these requests are either speculative or completely false. Developers are scouting potential sites and staking claims on power capacity long before they have the capital, or any strategy, to break ground. In fact, estimates suggest that up to 90% of these claims are not real.

In the early days of the data center boom, utilities never had to worry about false demand. Companies like Amazon, Google and Microsoft – called “hyperscalers” because they operate data centers with hundreds of thousands of servers – made power requests directly, and utilities simply delivered. But now, the race to secure power has led to an influx of applications from lesser-known developers or speculators with dubious track records. Utilities, which traditionally deal with just a handful of power-hungry customers, are suddenly swamped with requests for power capacity that would overwhelm their entire grid.

Utilities struggle to sort fact from fiction

The challenge for utilities isn't just technical – it's existential. It falls on them to decide what is real and what is not, and they are not equipped to handle it. Historically, utilities have been slow-moving, risk-averse institutions. Now they are being asked to vet speculators, many of whom are just playing a real estate game, hoping to flip their power allocations once the market heats up.

Utilities have teams responsible for economic development, but those teams are not used to fielding dozens of speculative requests at the same time. It's like a land rush, where only a fraction of those who apply actually intend to build something substantial. The result? Paralysis. Utilities hesitate to allocate power when they don't know which projects will come to fruition, slowing down the entire development cycle.

A wall of capital

There is no shortage of capital flowing into the data center space, but that abundance is part of the problem. When access to capital is easy, it invites speculation. In a way, this is like the better mousetrap problem: too many players crowding into a market with too much supply. This influx of speculators creates uncertainty not only for utilities but also for local communities, which must decide whether to grant permits for land use and infrastructure development.

Adding to the complexity, data centers aren't just for AI. Yes, AI is driving the surge in demand, but there is also an ongoing need for cloud computing. Developers are building data centers to accommodate both, and it's increasingly hard to distinguish between the two, especially when projects blend AI hype with traditional cloud infrastructure.

What is true?

The legitimate players – the aforementioned Amazons, Googles and Microsofts – are building real data centers, and many are adopting strategies such as “behind the meter” deals with renewable energy providers or building microgrids to avoid grid interconnection bottlenecks. But as real projects proliferate, so do the fake ones. Developers with little experience in the space are trying to cash in, creating an increasingly chaotic environment for utilities.

The problem is not just financial risk – although the capital required to build a single gigawatt campus can easily exceed several billion dollars – but the sheer complexity of developing infrastructure at this scale. A 6-gigawatt campus sounds impressive, but the financial and engineering realities make it nearly impossible to build in a reasonable time frame. Speculators throw these big numbers around anyway, hoping to lock up power allocations they can flip later.

Why the grid can't keep up with data center demands

As utilities struggle to sort fact from fiction, the grid itself becomes a bottleneck. McKinsey recently estimated that global data center demand could reach up to 152 gigawatts by 2030, adding 250 terawatt-hours of new electricity demand. In the US, data centers could represent 8% of total energy demand by 2030, a striking figure considering how little demand has grown in the last two decades.

However, the grid is not ready for this influx. Interconnection and transmission issues are rampant, with estimates suggesting the US could run out of power capacity between 2027 and 2029 unless alternative solutions are found. Developers are increasingly turning to on-site generation such as gas turbines or microgrids to sidestep the interconnection bottleneck, but these stopgaps only highlight the grid's limitations.

Conclusion: Utilities as gatekeepers

The real bottleneck isn't a lack of capital (trust me, there's plenty of capital here) or even technology – it's the ability of utilities to act as gatekeepers, determining who's real and who's just playing the speculation game. Without a robust process to vet developers, the grid risks being clogged with projects that never materialize. The data center age is here, and until utilities adapt, the entire industry may struggle to keep up with real demand.

In this chaotic environment, it is not just about allocating power; it's about utilities learning to navigate a new, speculative frontier so that enterprises (and AI) can thrive.

Sophie Bakalar is a partner at Collaborative Fund.

