Concerns over artificial intelligence’s energy appetite are pushing some ideas to extremes. Some engineers, including those at Google, are planning or testing solar-powered computing infrastructure in orbit. The idea draws attention, but it sidesteps more immediate constraints.
For Bobby Hollis, vice-president of energy at Microsoft, the challenge is far more terrestrial. “We’re still Earth-focused,” he says. In his view, the issue is whether power systems can adapt to a new class of demand rather than finding new places to put data centres.
More work per megawatt
Goldman Sachs Research estimates global data centre power demand will rise 165% by 2030 from 2023 levels, driven by AI workloads. However, Hollis believes the timeline is less compressed than the forecast suggests because computing efficiency continues to shift the curve.
He points to cloud computing, which was once expected to consume as much as 10% of global electricity by 2020. “[In reality,] it moved from 1% to 1.5%,” he says, citing International Energy Agency data, as efficiency gains repeatedly outpaced expectations.
According to Hollis, the gap between forecast and reality persists because many observers misunderstand how data centres operate. Efficiency improvements rarely show up as declining electricity use. Instead, they appear as more computing work delivered using the same amount of power. “Even when we use the same 100 megawatts data centre, the amount of work that can be done with that capacity changes every single time we put new servers into that location,” he explains.
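To make that arithmetic concrete, here is a back-of-the-envelope sketch in Python. The facility size mirrors Hollis’s 100-megawatt example, but the efficiency figures are purely hypothetical, chosen only to show why flat electricity use can still mean more computing output.

```python
# Illustrative sketch: a fixed power envelope delivers more compute as server
# efficiency improves. The efficiency numbers are hypothetical, not Microsoft's.

FACILITY_POWER_MW = 100      # fixed grid connection
HOURS_PER_YEAR = 8_760

def annual_compute(ops_per_joule: float) -> float:
    """Total operations per year for the facility at a given hardware efficiency."""
    watts = FACILITY_POWER_MW * 1_000_000
    joules_per_year = watts * HOURS_PER_YEAR * 3_600
    return ops_per_joule * joules_per_year

old_generation = annual_compute(ops_per_joule=50e9)    # hypothetical: 50 GFLOPS per watt
new_generation = annual_compute(ops_per_joule=100e9)   # hardware refresh doubles efficiency

print(f"Same {FACILITY_POWER_MW} MW, {new_generation / old_generation:.1f}x the work")
# Electricity demand is unchanged; the efficiency gain shows up as extra output.
```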
That gain comes from incremental improvements rather than dramatic breakthroughs. Higher voltage connections reduce line losses. Hardware refresh cycles get faster. Cooling systems are chosen based on trade-offs between power efficiency and water use. The objective is to lift performance across the entire facility rather than optimise a single variable.
This is why Microsoft tracks water-usage effectiveness, or WUE, alongside power metrics as it expands its data centre footprint. In practice, electricity use and water consumption pull in opposite directions. Data centres can lower power demand by using water-intensive evaporative cooling or reduce water use by relying on more electricity-hungry mechanical systems. Improving one constraint often increases pressure on the other.
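The trade-off can be expressed through the standard definitions of the two metrics: WUE is litres of water consumed per kilowatt-hour of IT energy, while power usage effectiveness (PUE) is total facility energy divided by IT energy. The sketch below compares two cooling approaches using purely illustrative figures; they are assumptions, not measurements from any Microsoft site.

```python
# Hypothetical cooling comparison for the same IT load.
# WUE = litres of water per kWh of IT energy; PUE = total facility energy / IT energy.
# Values are illustrative only.

IT_LOAD_KWH = 1_000_000  # IT energy consumed over the period

cooling_options = {
    #                       PUE,  litres of water per IT kWh
    "evaporative cooling": (1.2, 1.8),   # less electricity, more water
    "mechanical chillers": (1.4, 0.1),   # more electricity, little water
}

for name, (pue, wue) in cooling_options.items():
    total_energy = IT_LOAD_KWH * pue     # facility-level electricity
    water_litres = IT_LOAD_KWH * wue     # cooling water consumed
    print(f"{name}: {total_energy:,.0f} kWh, {water_litres:,.0f} litres")

# Lowering one metric pushes the other up; siting decisions weigh which
# constraint, power or water, binds in a given market.
```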
Those choices increasingly influence where and how new capacity is built, particularly in water-stressed markets. In Singapore, Microsoft’s data centres use utility-provided, high-grade reclaimed water and mechanical cooling to keep servers at a suitable operating temperature, reducing freshwater demand while accepting higher electricity use as a trade-off.
The need for grid modernisation
According to PwC, Asia Pacific accounts for roughly 30% of global data centre capacity and is expected to grow at a compound annual growth rate (CAGR) of 21% from 2024 to 2028. Growth is driven by surging demand for cloud services across industries, the rollout of 5G networks and the accelerating adoption of AI.
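As a rough check of what that rate implies, assuming the 21% growth compounds over the four years from 2024 to 2028:

```python
# Rough compounding check of PwC's 21% CAGR figure (2024-2028).
cagr = 0.21
years = 4
growth_multiple = (1 + cagr) ** years
print(f"{growth_multiple:.2f}x")  # ~2.14x, i.e. regional capacity roughly doubles
```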
As those data centre projects move forward, attention turns from feasibility to sourcing power that remains reliable and sustainable. Hollis explains that data centres typically draw electricity from shared grids rather than from power generators built solely for their use. Even when companies fund new renewable projects, their facilities continue to rely on the same systems that serve households and factories. Moreover, building a fully self-contained power supply at scale would demand extraordinary amounts of land, storage and infrastructure.
That dependence makes grid capacity and resilience the binding constraint. Addressing that requires grid modernisation, which Microsoft supports through its Climate Innovation Fund by backing companies such as LineVision. Its AI-enabled transmission monitoring has helped expand National Grid’s capacity in Britain to support around 500,000 homes while saving about GBP1.4 million ($2.43 million) a year.
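One common technique behind such transmission monitoring is dynamic line rating: static ratings assume conservative weather, whereas measured conditions often show a line can safely carry more current. The sketch below is a toy heat-balance cartoon with invented coefficients; it is not LineVision’s actual model.

```python
# Toy illustration of dynamic line rating. Static ratings assume worst-case
# weather; monitoring real conditions often reveals headroom on the same line.
# The formula and coefficients here are made up for illustration only.

def allowable_current(ambient_c: float, wind_ms: float,
                      max_conductor_c: float = 75.0) -> float:
    headroom_c = max_conductor_c - ambient_c        # thermal headroom to conductor limit
    cooling = 1.0 + 0.5 * wind_ms                   # crude convective-cooling factor
    return 600.0 * (headroom_c / 35.0 * cooling) ** 0.5   # amps, illustrative scale

static_rating = allowable_current(ambient_c=40.0, wind_ms=0.6)   # conservative assumptions
observed = allowable_current(ambient_c=15.0, wind_ms=2.0)        # a cool, breezy day

print(f"{(observed / static_rating - 1) * 100:.0f}% more capacity on the same line")
```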
AI itself is becoming part of the solution, enabling real-time balancing of intermittent renewable supply and variable demand from data centres and consumers.
Cleaning the grid
Electricity supply addresses only part of the AI energy challenge; how that power is produced matters just as much. This is why Microsoft has matched 100% of its global electricity consumption with renewable energy on an annual basis.
Hollis is careful to distinguish credible decarbonisation from accounting exercises. The key lies in what he terms the “principle of additionality”, meaning new carbon-free capacity enters the grid instead of existing renewable supply being reassigned. “If there was already a carbon-free resource serving the grid, and we took it and claimed it belongs to us, it didn’t really do anything from a decarbonisation standpoint,” he says.
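A simple accounting sketch, with entirely hypothetical figures, shows why the distinction matters: reassigning an existing renewable’s output changes who claims it, while only new carbon-free capacity changes what the grid actually emits.

```python
# Illustrative grid-mix accounting for the additionality principle.
# All figures are hypothetical and chosen only to show the mechanism.

GRID_DEMAND_GWH = 1_000
EXISTING_CLEAN_GWH = 300            # renewables already serving the grid
FOSSIL_EMISSIONS_T_PER_GWH = 700    # tonnes of CO2 per GWh of fossil generation

def grid_emissions(new_clean_gwh: float) -> float:
    """Emissions when the remainder of demand is met by fossil generation."""
    fossil_gwh = GRID_DEMAND_GWH - EXISTING_CLEAN_GWH - new_clean_gwh
    return fossil_gwh * FOSSIL_EMISSIONS_T_PER_GWH

# Case 1: a buyer claims 100 GWh of an *existing* wind farm's output.
# No new capacity is built, so grid emissions are unchanged.
reassigned = grid_emissions(new_clean_gwh=0)

# Case 2: the buyer contracts 100 GWh of *new* solar under a PPA.
additional = grid_emissions(new_clean_gwh=100)

print(f"Reassigned claim: {reassigned:,.0f} t CO2")
print(f"Additional capacity: {additional:,.0f} t CO2 "
      f"({reassigned - additional:,.0f} t avoided)")
```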
In line with that, Microsoft has signed a 20-year power purchase agreement (PPA) with EDP Renewables for 200 megawatts of solar capacity under the SolarNova 8 programme, spread across public housing and government rooftops. Led by the Housing & Development Board (HDB) and the Economic Development Board (EDB), the programme is expected to generate an estimated 420 gigawatt-hours of solar energy annually, equivalent to 5% of Singapore’s total energy.
Globally, Microsoft has announced PPAs exceeding 34 gigawatts of renewable energy across 135 projects in 24 countries. This strategy reflects a pragmatic reality in markets where fossil fuels still dominate electricity generation. For example, coal still accounts for about 49% of Southeast Asia’s power supply, according to the Southeast Asia Green Economy 2025 report by Bain & Company, GenZero, Google, Standard Chartered and Temasek. “Adding renewable or carbon-free resources [alongside conventional generation will still] have a net positive impact to make sure decarbonisation is continuing in the right trajectory,” claims Hollis.
On nuclear
Microsoft is also turning to nuclear power as a source of clean, reliable electricity. Its agreement with Constellation Energy to restart the Crane Clean Energy Facility (formerly the Three Mile Island Unit 1 nuclear plant) in the US supports carbon-free baseload generation that had lost economic viability as lower-cost alternatives gained ground.
Yet the tech giant is taking a cautious approach to small modular reactors (SMRs), which remain a nascent technology. “We won’t necessarily plug SMR in the same way we would with a wind or solar project. But we don’t want to wait five or 10 years for the new technology to show up, so we’re investing in SMR companies through our Climate Innovation Fund to help accelerate development,” says Hollis.
He anticipates pilot projects in the 2030s to provide firmer deployment dates, at which point SMRs might merit the same planning treatment afforded to wind or solar projects. Until then, Microsoft will continue funding innovation while keeping core operations focused on energy sources that are ready to scale.
While no one knows how much energy AI workloads will require in the future, Hollis expects efficiency to keep pace with demand, driven by commercial necessity and climate goals.
That logic shapes Microsoft’s planning. Rather than building for speculative peaks, the company focuses on what Hollis calls actual requirements with “line of sight”, or concrete business demand instead of extrapolated projections. Whether that discipline holds as AI scales remains an open question. For now, Microsoft is betting that the incentive to optimise will prove stronger than the temptation to overbuild.
