The rapid growth of artificial intelligence (AI) has substantially increased the energy needs of large tech companies. Popular AI tools like ChatGPT require about 10 times more power than traditional Google searches. New AI features like audio and video generation are expected to increase this demand even further.
Energy companies are reevaluating their options
Due to AI's growing energy needs, energy companies are weighing options that were unthinkable until recently. One example: restarting a reactor at Three Mile Island, the site of the 1979 nuclear accident. The reactor slated for restart is not the one that melted down; it operated until 2019, when it was shut down for economic reasons.
Data centers are growing rapidly
Data centers, the facilities that store and process the world's data, have been growing steadily for years. However, the rapid expansion driven by AI demands far more computing power and storage than most existing data centers were designed to handle.
Putting pressure on the electric grid
The energy demands of AI are putting additional pressure on the electric grid, which is already close to full capacity in many areas. It takes 1-2 years to build a new data center, but adding new power to the grid can take more than 4 years, causing significant delays.
Data centers are concentrated in a few states
In the US, just 15 states host 80% of the country's data centers. In some states, such as Virginia, data centers consume more than 25% of the state's electricity. Similar concentrations appear around the world; Ireland, for example, also hosts a disproportionate number of data centers.
Balancing growth and sustainability
Countries aim to reduce carbon emissions by using more renewable energy sources such as wind and solar. However, these sources are inconsistent (the wind doesn’t always blow and the sun doesn’t always shine), making it challenging to match supply and demand.
Water cooling is depleting freshwater sources
Data centers are increasingly using water for cooling, depleting limited freshwater resources. This has led some communities to oppose new data center projects.
Improving efficiency with new technology
The industry is working on making data centers more efficient. Advancements include more energy-efficient hardware and cooling methods such as water cooling and air-assisted liquid cooling. However, simply making things more efficient is not enough to solve the overall sustainability issue.
Exploring new cooling methods
Researchers are developing new cooling technologies for data centers, such as immersion cooling, where servers are cooled by being immersed in a liquid. This method is still in development but holds promise for improved efficiency.
Flexible computing for energy savings
A new approach called flexible computing aims to use energy more efficiently by shifting computing work toward times when electricity is cheap and green, and away from times when it is expensive or carbon-intensive. Data centers can also adjust their computing load in response to the grid's needs.
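The idea of shifting deferrable work into greener hours can be sketched in a few lines. This is a minimal illustration, not any data center operator's actual scheduler; the hourly carbon-intensity numbers below are hypothetical, standing in for a feed such as a grid operator's published intensity data.

```python
from dataclasses import dataclass

@dataclass
class Slot:
    hour: int
    grams_co2_per_kwh: float  # hypothetical grid carbon intensity for this hour

def schedule_deferrable_jobs(slots, hours_needed):
    """Pick the lowest-carbon hours for work whose deadlines allow delay."""
    greenest = sorted(slots, key=lambda s: s.grams_co2_per_kwh)
    return sorted(s.hour for s in greenest[:hours_needed])

# Hypothetical day: intensity dips midday when solar output peaks.
day = [Slot(h, 200.0 if 10 <= h <= 15 else 500.0) for h in range(24)]
print(schedule_deferrable_jobs(day, 4))  # [10, 11, 12, 13]
```

Real systems layer deadlines, electricity prices, and grid-operator signals on top of this, but the core trade-off is the same: move work the user will not notice into the hours when the grid is cleanest.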
Better forecasting for the future
Accurate forecasting and planning are crucial for managing energy use in data centers and on the grid. The Electric Power Research Institute is working on better forecasting methods to support grid management.
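To make the forecasting idea concrete, here is the simplest possible baseline: predicting the next hour's load as the average of the most recent hours. The load figures are invented for illustration, and real grid forecasters use far richer models (weather, day-of-week effects, AI workload schedules); this sketch only shows what "forecasting demand" means at its core.

```python
def moving_average_forecast(history, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Hypothetical hourly data-center load in megawatts.
load_mw = [42.0, 44.0, 47.0, 45.0, 46.0]
print(moving_average_forecast(load_mw))  # 46.0
```

Even a crude baseline like this is useful: grid planners compare sophisticated models against it to check that added complexity actually improves accuracy.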
Building more edge data centers
Smaller, local data centers, known as edge data centers, can help meet AI’s energy demands more sustainably. These centers can add computing power to local areas and help ease pressure on the main grid. They currently make up 10% of U.S. data centers, with growth expected in the coming years.
Conclusion
To sustainably manage AI’s energy demands, the industry needs to continue innovating in data center design, cooling technologies, and energy efficiency. Exploring new methods like flexible computing and edge data centers could be the key to a greener future.