Today tech giants are navigating a striking paradox: they’re insatiably hungry for computational power, yet fiercely committed to a sustainable future. The staggering power consumption of today’s data centers, especially the hyperscale facilities used for AI, is outpacing sustainability advancements and poses significant energy challenges, demanding a new approach.
A single hyperscale data center — like those operated by AWS, Azure, and Google Cloud — can require up to 150 megawatts of power, equivalent to the demand of a mid-sized city. Even before the LLM boom, U.S. data center power consumption was projected to double from 17 gigawatts in 2022 to 35 gigawatts by 2030. New data centers are pushing electricity demand to unprecedented levels, and the stock markets are taking notice: the utilities sector of the S&P 500 Index has climbed 18% over the past few months.
Existing major data center hubs like Virginia, which handle nearly 70% of global digital traffic, are putting massive strain on energy grids that lack the generation and transmission capacity to keep up with growing demand. Dominion Energy, Virginia’s largest electricity provider, foresees a doubling of CO2 emissions and plans for new fossil fuel infrastructure.
Balancing energy use and sustainability is a constant tradeoff. As the world heats up, corporations have a responsibility to conserve energy and reduce climate-heating emissions. And as AI and advanced manufacturing become increasingly important drivers of our global economy, the demand for sustainable power solutions for data centers will only intensify.
Renewable energy commitments
To reach carbon-free energy goals, top tech companies have locked in power purchase agreements with renewable energy suppliers. Meanwhile, hyperscalers are starting to fund the building of renewable-energy plants in the face of soaring prices caused by supply shortages. Collectively, the hyperscalers — Amazon, Apple, Google, Meta and Microsoft — have developed renewable energy projects with a total capacity of over 45 gigawatts, enough to power about 33.75 million homes. Roughly 57% of the global corporate wind and solar capacity is tied to just these five companies. But there are still significant challenges and opportunities for innovation, particularly in the realm of energy storage.
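For a sense of scale, here is a minimal back-of-envelope sketch (in Python, purely illustrative) of the arithmetic implied by the homes-powered figure above; the per-home draw it derives is inferred from the two numbers in this paragraph rather than a sourced statistic, and it ignores the difference between nameplate capacity and actual generation.

```python
# Rough sanity check of the "45 GW of capacity ~ 33.75 million homes" figure.
# Assumption: the figure implies a flat average draw per home; capacity
# factors and generation/consumption timing are ignored.

TOTAL_CAPACITY_GW = 45       # combined hyperscaler renewable capacity (from the text)
HOMES_MILLIONS = 33.75       # homes-powered figure (from the text)

implied_kw_per_home = (TOTAL_CAPACITY_GW * 1_000_000) / (HOMES_MILLIONS * 1_000_000)
implied_kwh_per_year = implied_kw_per_home * 8760  # hours in a year

print(f"Implied average draw: {implied_kw_per_home:.2f} kW per home")
print(f"Implied annual use:   {implied_kwh_per_year:,.0f} kWh per home")
# -> about 1.33 kW, or roughly 11,700 kWh per year, in the ballpark of an
#    average U.S. household's annual electricity consumption.
```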
The edge computing challenge
Relying solely on centralized data centers, even those powered by renewable energy, isn’t practical for many real-world applications. By processing data closer to where it is generated, edge computing will remain foundational infrastructure: it reduces latency by minimizing the distance data must travel, cuts the cost of transferring large data volumes, and helps satisfy data privacy and residency regulations. Key applications where low latency is critical include:
- AI Inference Applications: Real-time AI applications and decision-making need extremely low-latency processing and transmission.
- Streaming Analytics Applications: Real-time auctions, online betting, and multiplayer games require low-latency networks to deliver accurate real-time information.
- Real-Time Data Management: Enterprise applications merge and optimize data from various sources in real time, needing low-latency networks to avoid performance issues.
- API Integration: Systems communicating via APIs need low latency to avoid performance issues, such as a flight-booking website needing real-time seat availability.
- Video-Enabled Remote Operations: Remote control of machines, like drones for search-and-rescue, requires low-latency networks to avoid life-threatening consequences.
In these cases, proximity to the data source will remain crucial, and simply placing renewable or net-zero energy sources where they are most abundant won’t solve the problem. Different regions have varying capacities for renewable energy, and data centers need to be located close to users for low latency, complicating energy sourcing. New solutions for storing renewable energy efficiently are needed to ensure a steady power supply.
Challenges in locating renewable energy sources
Connecting large renewable power sources to the grid can be a lengthy and complex process, often taking years. For data centers with significant power needs, local power sources might offer a more practical solution. But renewable resources are limited and highly sought after, making competition fierce. Energy transmission over long distances (>1,000 miles) can result in roughly a 12% loss as heat and costs up to $3.90 per mile, a government study found. Solar power is also only available during the day, and wind power depends on weather conditions, often requiring fossil fuels to fill the gaps. Our recent incubation of and investment in Peak Energy advances energy storage with sodium-ion batteries, which are more cost-effective, safer, and better optimized for stationary storage than standard lithium-ion batteries. This is one example of improving renewable energy storage so that regions can use local sources more effectively.
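To make the transmission penalty concrete, here is a minimal sketch, assuming the ~12% long-haul loss above applies as a single flat percentage to the 150-megawatt hyperscale load cited earlier; real line losses vary with voltage, distance, and load, so treat this as an illustration rather than a model.

```python
# Illustrative arithmetic: generation needed upstream of a long transmission line.
# Assumptions: a 150 MW load (hyperscale figure from earlier in the post) and a
# flat 12% loss over >1,000 miles (figure from the paragraph above).

LOAD_MW = 150
LINE_LOSS_FRACTION = 0.12

generation_needed_mw = LOAD_MW / (1 - LINE_LOSS_FRACTION)
lost_mw = generation_needed_mw - LOAD_MW

print(f"Generation required: {generation_needed_mw:.1f} MW")
print(f"Dissipated en route: {lost_mw:.1f} MW")
# -> roughly 170 MW must be generated to deliver 150 MW, with ~20 MW lost as heat.
```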
Call for founders
The next wave of technology for data centers will include new ways to generate power, connect to the grid, and manage energy use to reduce carbon emissions while improving performance. These innovations are key to making data centers more sustainable and effective. Each of these areas needs specialized focus and innovative approaches.
At Eclipse, we're committed to pushing progress to help industries transition to clean energy, ensuring a sustainable future for all.
If you're building the next generation of power sources, interconnect solutions, or net-zero approaches to data center power and management, we want to hear from you. Eclipse partners with entrepreneurs starting at the pre-seed idea stage through our Venture Equity Platform: aidan@eclipse.vc.
Follow Eclipse on LinkedIn or sign up for our newsletter for the latest on the Industrial Evolution.