• • •
Yes. We Can Have More Data With Less Carbon
1. AI ❤ renewables. Amazon, Apple, Google, Meta and Microsoft account for over 45 gigawatts of wind and solar energy purchases worldwide—over half the global corporate renewables market, according to S&P Global. Data centers are also supporting the development of a range of new low-carbon energy sources, including advanced geothermal, small modular fission reactors and even nuclear fusion start-ups.
2. Power couples. Siting a data center properly can improve the affordability of electricity for domestic customers while reducing overall grid emissions, researchers at the Rocky Mountain Institute found. The trick is to locate a data center near a renewable energy source (no surprise there) but also close to an existing generator with an approved grid connection, such as a natural gas power plant. This counterintuitive pairing of fossil fuels with renewables means the new data center avoids stressing the grid, thus protecting other customers from paying for infrastructure upgrades. RMI has identified dozens of suitable “power couple” locations across the US.
3. A digital drop in the ocean. AI looms large in our cultural life, but the carbon footprint of making and moving data pales in comparison with making and moving physical stuff. In 2023, the International Energy Agency calculated that data centers and data transmission networks each account for at most 1.5% of global electricity use. And only about 10% of that is attributable to AI, according to an in-depth survey of AI’s carbon footprint from the Sierra Club.
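Putting those two figures together gives a rough upper bound on AI’s slice of global electricity use. A minimal back-of-envelope sketch, assuming the ~10% AI share applies to the combined 1.5% data-center and network total:

```python
# Back-of-envelope estimate of AI's share of global electricity use,
# combining the two figures cited above.
data_center_share = 0.015  # at most 1.5% of global electricity (IEA, 2023)
ai_fraction = 0.10         # ~10% of that attributable to AI (Sierra Club)

ai_share = data_center_share * ai_fraction
print(f"AI's share of global electricity: at most {ai_share:.2%}")
# → AI's share of global electricity: at most 0.15%
```

By this rough math, AI accounts for at most around 0.15% of global electricity use, a fraction of a percent.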
• • •
What To Keep An Eye On
1. Hot AI is cool. Chip makers should focus on building servers that run hotter, not faster, if they’re serious about saving power and water, according to research from Hong Kong. If computer chips could function at 41 °C, rather than the roughly 25 °C they run at today, every data center around the world could swap thirsty AC systems for cheap, low-power fans, slashing nearly 60% off their power bills.
2. More slop for less juice? The tech world was rocked earlier this year by the emergence of a Chinese AI model called DeepSeek that takes 10 to 40 times less power to train than some American AIs. Tech and power stocks slumped on the news, but it now appears that DeepSeek is less efficient when it comes to answering queries, suggesting that AI’s power consumption might even out over the long term.
3. AI in your hand. Moving AI systems from the cloud to your phone can mean a 100 to 1,000-fold reduction in energy consumption per task, according to Kim Lokwon, founder of AI startup DeepX, writing for the World Economic Forum. He also suggests an energy credit trading scheme (developed by his company) could drive lower-carbon AI models.
Top image: © Anthropocene