Decarbonization · Digitization

Powering Sustainable AI in Data Centers

As data centers power the AI revolution, their soaring energy demands strain the grid and challenge sustainability efforts—but innovative startups and technologies are stepping up to shape a more resilient, efficient future.

The digital economy—spanning cloud computing, artificial intelligence, and software workflows—is expanding at an unprecedented rate. At the core of this growth lies a critical enabler: data centers. These facilities power the internet, applications, and services we rely on daily. However, with exponential growth in digitization comes significant infrastructure and physical challenges. 

Data centers are now grappling with surging energy consumption, sustainability concerns, and strains on grid infrastructure. Startups are stepping up with groundbreaking innovations to address these energy challenges and drive the sector toward a sustainable future. In this article, we explore the escalating power demand of data centers, how sophisticated incumbents are responding, opportunities for flexibility and optimization, and startups that are reshaping this landscape.

The Growing Power Demand of Data Centers 

Data centers currently account for 1-1.5% of global electricity consumption, with the U.S. leading in both data center capacity and energy use. Hyperscalers like Microsoft, Alphabet, Meta, and Amazon are rapidly expanding their facilities to meet the growing demand for cloud computing, AI workloads, and real-time applications. This surge prioritizes speed-to-power and reliability, and it is outpacing procurement plans for the clean energy buildout.

Source: Powering Intelligence: Analyzing Artificial Intelligence and Data Center Energy Consumption, by EPRI

In the U.S., data centers are forecast to represent between 5% and 9% of the nation’s total electricity consumption by 2030 (up from 2% today), according to a recent white paper by the Electric Power Research Institute (EPRI). The wide variation in growth scenarios underscores the challenge of predicting future demand for data center capacity, which depends on several interrelated factors: the adoption rate of advanced AI use cases, the types of chips deployed and their power requirements, and the distribution of AI workloads between cloud and edge computing. Innovations that increase efficiency and thereby reduce the energy intensity of AI and computing—such as DeepSeek’s recently released model—will also play a critical role in shaping the energy sector’s future.

The majority of forecasted demand for data centers is concentrated in larger-scale deployments (e.g., more than 500 MW) to meet the expanding needs of AI. Several recent announcements have set capacity and investment records, including the Reliance Group’s announcement last week that it plans to build what may become the world’s biggest data center, with a total capacity of three gigawatts, and the Stargate Project, in which OpenAI, SoftBank, and Oracle have pledged to invest $100-$500 billion in AI infrastructure in the U.S.

Challenges Posed by Rapid Expansion

This rapid growth introduces significant challenges. According to an analysis last fall by Morgan Stanley, the rise of generative AI is expected to increase global emissions by 600 million MTCO2e per year between now and 2030 across scopes 1, 2, and 3, based on power demand plus the embodied carbon in building materials and IT equipment. The U.S. is expected to account for approximately half of those emissions; for context, 300 million MTCO2e is about 5.7% of the U.S. economy’s 2024 annual emissions.
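As a sanity check, the percentage cited above can be reproduced with a few lines of arithmetic. The U.S. 2024 total used here is an approximation implied by the 5.7% figure, not a number taken from the Morgan Stanley analysis:

```python
# Back-of-the-envelope check of the emissions figures cited above.
global_genai_emissions_mt = 600.0   # MtCO2e per year, global (Morgan Stanley)
us_share = 0.5                      # U.S. expected to account for ~half
us_genai_emissions_mt = global_genai_emissions_mt * us_share  # 300 MtCO2e

us_total_2024_mt = 5_300.0          # MtCO2e, assumed U.S. 2024 total (approximate)
pct = us_genai_emissions_mt / us_total_2024_mt * 100
print(f"{pct:.1f}%")                # prints: 5.7%
```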

Source: Latitude Media and Morgan Stanley analysis 

Additionally, rising energy loads strain grid infrastructure, driving up energy costs for neighboring regions and risking instability during peak demand. Power shortages are expected to restrict 40% of AI data centers by 2027, while nearby households may experience power-quality disturbances.

Nuclear Power Is Promising, But Expensive and Difficult to Deploy

To meet these escalating demands, much of the clean energy for training and running massive generative AI models is expected to come from traditional nuclear power as well as small modular reactors (SMRs), with big tech companies signing energy deals worth billions of dollars. However, these solutions—which offer the essential baseload energy—will not be widely available until the early 2030s. In the interim, data centers must rely on a mix of innovative and traditional energy solutions.

Innovative Power Solutions Addressing Short- to Mid-term Energy Needs

To address short- to mid-term energy needs, data centers are leveraging diverse power sources. These range from existing technologies within renewables, geothermal energy, and natural gas, to novel technologies from startups creating localized, scalable solutions to decarbonize natural gas. 

Valo’s portfolio company XGS Energy enables brownfield and greenfield development of geothermal energy with its proprietary, thermally conductive heat-harvesting technology. Its product is a firm renewable energy resource with an attractive time-to-power (12-18 months) compared to alternative clean firm technologies (5-15 years). With its highly attractive unit economics, simplicity, and scalability, XGS Energy is seeing strong demand signals from tech hyperscalers and large energy companies for its unique geothermal value proposition. Another Valo portfolio company, Modern Hydrogen, produces distributed, on-demand clean power from natural gas directly at customer locations, circumventing costly distribution and the bottlenecks associated with upgrading grid infrastructure.

In the U.S., where approximately one-third of the world’s roughly 8,000 data centers are located, these innovative power solutions play a critical role in expanding data center buildout beyond primary markets and into secondary and tertiary regions. By alleviating stress on the grid and enabling off-the-grid solutions, these technologies ensure a more resilient and flexible energy ecosystem for the data center industry.

Source: AI data center growth: Meeting the demand | McKinsey

Opportunities for Startups 

1. Flexibility and Optimization

With rapidly increasing energy loads, one of the most promising solutions for mitigating the environmental impact of data centers lies in energy flexibility. For example, Google has demonstrated the ability to temporarily reduce power demand by shifting computational loads to different times of the day or from one data center to another, helping to alleviate stress on the grid. This temporal and spatial flexibility is critical as power demand continues to surge.
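The shifting idea can be sketched as a simple selection problem: given hourly carbon-intensity forecasts per region, place a movable job in the region and hour with the lowest intensity. The regions and numbers below are made up for illustration; this is not Google's actual implementation:

```python
# Illustrative spatial + temporal load shifting: pick the region-hour with
# the lowest forecast carbon intensity (gCO2/kWh). Forecasts are made up.
forecasts = {
    "us-east":  [450, 430, 390, 410],   # next 4 hours
    "us-west":  [320, 300, 280, 350],
    "eu-north": [120, 110, 150, 140],
}

def best_slot(forecasts):
    """Return the (region, hour) pair with the lowest forecast intensity."""
    return min(
        ((region, hour)
         for region, series in forecasts.items()
         for hour in range(len(series))),
        key=lambda rh: forecasts[rh[0]][rh[1]],
    )

region, hour = best_slot(forecasts)
print(region, hour)  # prints: eu-north 1
```

A real scheduler would also weigh data-residency rules, latency, and migration costs, but the core signal is the same.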

Another Valo portfolio company, Equilibrium Energy, is at the forefront of this effort, leveraging AI and real-time grid analytics to enable dynamic energy load management. Their solutions align data center operations with grid capacity, reduce peak demand stress, and integrate higher shares of renewable energy. Equilibrium Energy enhances resilience and sustainability across data center facilities by incorporating energy storage systems and responding to grid signals in real-time.

New backup power alternatives such as battery storage systems and fuel cells further enhance flexibility by reducing grid dependence and enabling energy storage. These technologies allow data centers to store excess energy and even sell it back to the grid during high-demand periods, promoting renewable energy integration while maintaining uptime.
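A minimal sketch of the dispatch logic such a battery system might follow, with made-up thresholds and capacities; production systems would instead co-optimize against price forecasts and reliability constraints:

```python
# Toy battery dispatch: charge when grid power is cheap, sell back when it is
# expensive, and always keep a reserve for backup. All numbers are assumed.
CAPACITY_KWH = 1000.0
RESERVE_KWH = 300.0          # always held back for backup power
CHARGE_BELOW = 40.0          # $/MWh: charge when price drops below this
DISCHARGE_ABOVE = 120.0      # $/MWh: sell back when price rises above this

def dispatch(price_mwh, soc_kwh, step_kwh=100.0):
    """Return (action, new state of charge) for one interval."""
    if price_mwh < CHARGE_BELOW and soc_kwh < CAPACITY_KWH:
        return "charge", min(CAPACITY_KWH, soc_kwh + step_kwh)
    if price_mwh > DISCHARGE_ABOVE and soc_kwh > RESERVE_KWH:
        return "discharge", max(RESERVE_KWH, soc_kwh - step_kwh)
    return "hold", soc_kwh
```

For example, at a $30/MWh price and 500 kWh stored, the battery charges; at $150/MWh it discharges back toward the reserve floor.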

AI-driven optimization also plays a crucial role in improving efficiency and reliability. By using machine learning and real-time analytics, AI enhances server utilization, predicts equipment failures, and fine-tunes cooling systems to minimize energy consumption. AI also enables workload scheduling for delay-tolerant tasks, such as batch processing for AI training, allowing for greater load flexibility, energy efficiency and grid reliability.

Source: Wells Fargo report in Forbes

Startups like CoolGradient are innovating in this space, developing solutions that not only address the energy inefficiencies in server utilization but also improve cost-effectiveness and scalability. With the rapid growth of data-driven applications and AI workloads, these solutions are poised to meet a critical need in the market.

Source: Goldman Sachs

AI algorithm development has historically prioritized accuracy and performance, but as algorithms have advanced and computational demands have grown dramatically, the focus is gradually shifting toward the efficiency of model development alongside performance.

2. Data Center Cooling, Heat Transfer and Heat Reuse

Cooling remains one of the largest energy expenditures for data centers, often accounting for up to 40% of their total energy use. Addressing this challenge, a wave of startups is focused on revolutionizing cooling technologies to improve efficiency and reduce environmental impact.
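The cooling share translates directly into Power Usage Effectiveness (PUE), the ratio of total facility energy to the energy delivered to IT equipment. A back-of-the-envelope sketch, assuming a further 5% of non-cooling overhead (power delivery, lighting):

```python
# PUE = total facility energy / IT equipment energy.
# If cooling takes 40% of total energy and other overhead takes an assumed
# 5%, IT equipment receives the remaining 55%.
cooling_share = 0.40
other_overhead_share = 0.05   # assumption for illustration
it_share = 1.0 - cooling_share - other_overhead_share

pue = 1.0 / it_share
print(f"PUE ≈ {pue:.2f}")     # prints: PUE ≈ 1.82
```

Cutting the cooling share to 10% under the same assumptions would bring PUE near 1.18, which is why cooling innovation has such leverage on total energy use.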

Active cooling systems, such as liquid cooling and immersion cooling, have gained traction for their ability to handle high-performance workloads more efficiently. Companies like Submer and CoolIT Systems are leading the charge with solutions that submerge servers in dielectric fluids or circulate coolant directly to hot spots, dramatically reducing the need for traditional air conditioning and thereby the need for water.

Maximizing the transfer of heat away from the chip is also an important consideration for advanced cooling systems. Valo portfolio company Boston Materials, for instance, has developed advanced thermal interface materials that enhance heat transfer by 10X. By incorporating these materials into liquid-cooled and immersion-cooled server designs, data centers can improve chip performance, reduce the strain on cooling infrastructure, and lower overall energy consumption.

Data centers can also reuse the heat generated by computation. One such example comes from Finland, where the world’s largest waste heat recovery project from data centers is being architected by Microsoft and Fortum, a Valo cornerstone LP. The concept is unique in that the location for the data center region was chosen specifically with waste heat recycling in mind. The project makes use of district heating infrastructure for waste heat capture and distribution; the existing network includes about 560 miles of underground pipes that deliver heat to approximately 250,000 users in the local municipalities. With increasing computing density, we can expect to see more examples of waste heat reuse, from the rack level all the way to district heating.
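A rough sizing sketch illustrates why district heating operators are interested; every input below is an illustrative assumption, not a figure from the Microsoft-Fortum project:

```python
# Rough sizing: how many homes could a data center's waste heat serve?
# Nearly all electrical input to IT equipment is rejected as heat.
it_load_mw = 100.0          # assumed average IT load
capture_fraction = 0.8      # assumed share recoverable into district heating
hours_per_year = 8760

heat_mwh_per_year = it_load_mw * capture_fraction * hours_per_year
household_heat_mwh = 10.0   # assumed annual heating demand per household
homes_served = heat_mwh_per_year / household_heat_mwh
print(f"{homes_served:,.0f} homes")   # prints: 70,080 homes
```

Even under these conservative assumptions, a single large facility could heat tens of thousands of homes, which is the scale district heating networks are built to absorb.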

Investing in Future Data Center Solutions

Data centers are at the heart of our digital world, but they also represent a significant challenge for sustainability. We believe that a sustainable and resilient data center ecosystem will help advance and integrate clean energy solutions, load flexibility technologies, and optimization resources to meet the growing demands of the digital economy without compromising the environment. Collaboration between startups, investors, and asset operators is essential to drive meaningful progress. 

Image Source: The banner image in this article was generated in collaboration with ChatGPT 
