How AI Energy Issues Can Affect Your Life
Increases in AI energy consumption have triggered a frenzy of data-center construction projects that require far more electricity than is currently available.
Every electricity consumer in the US can be directly affected by events on the national electrical grid at a distant point, even hundreds or thousands of miles from your home or office. The potential of AI is immense, but AI energy consumption is creating problems for electricity providers, and it is questionable whether new sources will come online in time to avert a local or national catastrophe.
We are in this mess together, and we will either solve this AI energy problem collectively or die individually. That is not an overstatement or hype: how long could you live in a city whose water and food depend on the national electrical grid for production and transport? City water plants (and the neighborhood gasoline station) cannot operate without electricity, and the food supply depends on factories that require it. Stores cannot sell you anything if their cash registers are down. Without electricity, it all stops.
What is the Problem?
The problem can be summarized easily: The demand for electricity, especially AI electricity requirements, is projected to be greater than what is available. Because the electrical grid is national infrastructure, local problems quickly become regional or national problems. Electricity is bought and sold across the grid irrespective of the locations of the buyer and seller.
The US electricity markets are complex and somewhat regulated. The contiguous 48 states are divided into three major areas that are independent but can transfer power (electricity) between them. There are regional authorities that manage the operation of their portion of the grid to balance supply and demand. Turning on a light in Los Angeles could require starting a turbine hundreds of miles away.
Although the examples in this post are from Central Ohio, the same issues may be relevant where you live and work. Data center projects that are unwilling to be on AEP Ohio’s waitlist are searching for other locations where they can immediately build.
This is a nationwide problem! For example, the political and legal turmoil in New York over business valuation caused one data center developer to declare he was through with New York and would instead explore developing new data centers in Oklahoma.
This April 2024 report from the Department of Energy (DOE) summarizes numerous steps the DOE is taking or has taken to address the issues described in this post. It also links to other resources with relevant information.
Assumptions About Electrical Demand
Discussions about the ability of the US power grid to handle future demand are filled with assumptions, many of which are probably invalid. For example, it is a common assumption that the typical car ("light-duty passenger vehicle") is available to be charged about 23 hours per day. That assumption ignores when people are actually able to plug in their cars, which is a much narrower window.
Consumer and business demand for high-speed charging stations at work and at home or for fleets of cars and other vehicles will be cyclical and correlated. At peak cycle times, the demands will be enormous—consider that “even a four-plug, high-speed charging station delivering 150 kilowatts per charger can pull enough power to supply hundreds of homes.”
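The quoted claim is easy to check with rough numbers. The sketch below assumes an average US household draw of about 1.2 kilowatts; that figure is an assumption for illustration, not a number from this post:

```python
# Rough check of the charging-station claim, using one assumed value.
PLUGS = 4
KW_PER_PLUG = 150      # high-speed charger rating quoted above
AVG_HOME_KW = 1.2      # assumed average household draw (illustrative)

station_kw = PLUGS * KW_PER_PLUG            # 600 kW at full utilization
homes_equivalent = station_kw / AVG_HOME_KW

print(f"Station draw: {station_kw} kW")
print(f"Equivalent households: {homes_equivalent:.0f}")  # about 500 homes
```

Even with a different assumed household draw, a single fully loaded station lands in the "hundreds of homes" range the forum participants described.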
This information is from a 2023 meeting that gathered “electric utilities officials for a forum discussion around grid resilience” for the use of electric vehicles. Their assumptions and projections do not acknowledge AI power and that “the voracious energy consumption of artificial intelligence is driving an expansion of fossil fuel use—including delaying the retirement of some coal-fired plants.”
This phenomenal increase in demand for electricity is happening now, and the projections for the future are almost inconceivable. How much energy does AI use? A local cluster of AI-driven data centers is a "computing campus," and "some computing campuses require as much energy as a modest-sized city." Big Tech companies are in a competitive AI race and are therefore dependent on enormous amounts of electricity.
There is a frenzy of data center proposals and construction projects. For example, there is “a queue of more than 50 customers—mostly data centers—that are seeking power from AEP,” which is the electric utility based in Columbus, Ohio. It is one of the largest producers of electricity in the US, and its 40,000+ miles of transmission lines are the largest transmission system for electricity in the US. Its 225,000+ miles of distribution lines provide electricity to 5.6+ million customers in 11 states.
Electrical Supply and Demand
Supply is electricity flowing into the national electrical grid; demand is electricity flowing out of the grid to where it will be consumed. The supply of electricity is maintained by either buying or producing electricity. If a utility such as American Electric Power (AEP) cannot produce all of the electricity it needs, it will try to buy the electricity it lacks.
Electricity from a generation plant that normally supplies your electricity may be sent to AEP instead. Each generation plant has a maximum number of megawatts it can generate at any point in time in a specific set of circumstances. When the local demand load exceeds the maximum supply available, generation sources elsewhere must be accessed to meet the immediate demand or local “brown-outs” will be necessary.
For example, AEP can generate about 29,000 megawatts, with about 6,000 megawatts of that from renewable sources. For a sense of scale, electricity is usually measured in watts, and 29,000 megawatts is 29,000,000,000 watts. At any point in time that electricity could light up 290,000,000 100-watt light bulbs, or it could provide electricity to 58 facilities like Intel’s new semiconductor factory being built in Central Ohio. Intel’s proposed $28 billion development has a predicted load demand of up to 500 megawatts at any point in time.
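The scale comparisons above reduce to simple unit conversions, which can be checked directly:

```python
# Unit conversions behind the comparisons above.
AEP_CAPACITY_MW = 29_000
WATTS_PER_MW = 1_000_000
INTEL_LOAD_MW = 500      # predicted peak demand of the Intel factory

capacity_watts = AEP_CAPACITY_MW * WATTS_PER_MW   # 29 billion watts
bulbs_100w = capacity_watts // 100                # 290 million bulbs
intel_factories = AEP_CAPACITY_MW // INTEL_LOAD_MW

print(f"{capacity_watts:,} watts")
print(f"{bulbs_100w:,} hundred-watt bulbs")
print(f"{intel_factories} Intel-sized factories")
```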
There are two kinds of data centers using AI energy: (1) smaller data centers used by local companies for their own purposes, and (2) the hyperscale data centers built by Amazon, Google, and Meta/Facebook. Without including land costs for future data center projects, these companies have already invested $9.3 billion in Central Ohio and have pledged $12.3 billion more for data center projects. For example, Google is investing another $2.3 billion into its three Central Ohio campuses.
Numerous other companies, including Microsoft, are planning data center projects. A map identified as “Data Centers in Central Ohio” has 33 identified locations. Per floor space, these power-hungry locations consume 10 to 50 times the energy consumed by a typical office building. The predicted regular electricity demand load ranges from 100 megawatts to 1,000 megawatts for each location at any point in time.
The math is easy: spread 29,000 megawatts across those 33 large-demand locations plus Intel, and whatever is left over (if any) is available to power every other building and machine in Central Ohio. Artificial intelligence and energy is not a long-term issue only! Signed agreements predict adding 5,000 megawatts by 2028 to the region's current load of 4,600 megawatts.
Consequently, AEP Ohio has created a wait list of 50+ customers, mostly data centers. The projected demand from that list would add another 30,000 megawatts, more than doubling the required load. For scale, that addition is roughly three times the load New York City pulls at its peak summer demand.
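Putting the regional figures from the last two paragraphs together shows how quickly the numbers stack up:

```python
# Central Ohio load arithmetic from the figures quoted above.
current_load_mw = 4_600       # current regional load
signed_additions_mw = 5_000   # signed agreements through 2028
waitlist_mw = 30_000          # projected demand from the wait list
aep_capacity_mw = 29_000      # AEP's approximate generation capacity

load_2028 = current_load_mw + signed_additions_mw   # 9,600 MW
with_waitlist = load_2028 + waitlist_mw             # 39,600 MW

print(f"Projected 2028 load: {load_2028:,} MW")
print(f"If the wait list is also served: {with_waitlist:,} MW")
# Note that the wait list alone exceeds AEP's entire ~29,000 MW
# generation capacity.
```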
Also, these projections do not appear to consider the additional demand generated by the new jobs not directly at the data center. For example, the Intel manufacturing plant is expected to create 3,000 direct jobs and indirectly create more than 20,000 total jobs. I once heard a data center developer say that each data center job creates 3-4 indirect jobs for support of various kinds, which is slightly less than the prediction for Intel’s manufacturing jobs. Every one of those new jobs will require electricity, putting more strain on the artificial intelligence power grid.
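Intel's projection actually implies a higher multiplier than the 3-4x rule of thumb quoted by that developer:

```python
# Implied jobs multiplier from Intel's published projections.
direct_jobs = 3_000
total_jobs = 20_000   # "more than 20,000 total jobs"

indirect_jobs = total_jobs - direct_jobs            # 17,000
indirect_per_direct = indirect_jobs / direct_jobs   # about 5.7

print(f"Indirect jobs per direct job: {indirect_per_direct:.1f}")
```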
Who is Going to Pay?
The push against fossil fuels and other factors are decreasing how much electricity the US can generate, even as the industry explores AI in renewable energy. For example, AEP closed ("retired") one of its largest coal-fired power plants in 2022 and another in 2023. Renewable energy cannot be expected to make up for the loss in production, so there is a regulatory effort to increase the use of natural gas for electricity production.
All of this requires money to pay for new transmission and distribution lines infrastructure as well as the transition from coal-fired power plants to gas. If the new data centers and factories do not cover the costs of expanding the infrastructure, there is concern that costs might be passed on to other AEP (AEP Ohio) customers through rate increases.
Payment may be in ways other than money. There are four primary costs that concern those who oppose the construction of data centers: electricity, water, emissions, and loss of other uses of the land. For example, Atlanta is attempting to force data centers farther than one-half mile away from transit stations.
Some cloud services and larger technology organizations are working to decrease the demand load on utilities by powering systems entirely with renewable energy. Also, a data center may arrange to supply most of its server-generated heat to community projects or other buildings that would otherwise have needed electricity for heating. Reusing that heat provides heating with zero added carbon emissions.
AI Energy Solutions and Workarounds
There are several types of solutions and workarounds being considered and implemented. For example, Amazon Web Services (AWS) is working to reduce its internal energy consumption and to introduce software that enhances the existing grid. It also has several AI in renewable energy partnerships in Ohio to generate additional electricity.
There are two major impacts of an AI model on energy demand: (1) the electricity used while training the model, and (2) the electricity required for each use of the model. For example, training GPT-3 consumed an estimated 1,287 megawatt-hours, and training larger models demands more electricity. However, the choice of neural network model, processor, and data-center infrastructure and location can significantly decrease energy consumption and the resulting carbon emissions.
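One way to compare the two impacts is to amortize the one-time training energy over the queries a model serves. In the sketch below, the per-query inference energy and the lifetime query count are assumptions chosen purely for illustration; only the GPT-3 training figure comes from the text above:

```python
# Illustrative only: WH_PER_QUERY and LIFETIME_QUERIES are assumed
# values, not measurements.
TRAINING_MWH = 1_287              # reported GPT-3 training energy
WH_PER_QUERY = 3.0                # assumed inference energy per prompt
LIFETIME_QUERIES = 1_000_000_000  # assumed queries over the model's life

training_wh = TRAINING_MWH * 1_000_000
amortized_wh = training_wh / LIFETIME_QUERIES
total_wh_per_query = amortized_wh + WH_PER_QUERY

print(f"Training energy amortized per query: {amortized_wh:.2f} Wh")
print(f"Total per query (amortized + inference): {total_wh_per_query:.2f} Wh")
```

Under these assumptions, per-query inference energy dominates the amortized training cost, which is why relatively flat per-inference demand (discussed below) matters so much.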
For example, Tripp et al. (2024) identify seven actions that can decrease energy costs for training or for use. Their research "sheds light on the nuanced energy consumption patterns and trade-offs of deep neural networks and underscores the importance of considering the interactions between network and hardware topologies when developing AI systems."
The best report found while researching this post was Daniel Castro's "Rethinking Concerns About AI's Energy Use." It debunks some of the myths and presents a balanced view of AI energy consumption. While the electricity demands of data centers have grown, the demand per inference (prompt) has remained relatively constant, probably because of improvements in AI models and hardware.
Pieces Copilot+ provides the best example of energy conservation through its ability to run AI models entirely offline, with no Internet connection. This limits the energy demand to the electrical load of the local machine; nothing runs in the cloud.
Being able to run powerful AI models on your air-gapped device also eliminates the risk of data breaches associated with network transmissions or an AI in the cloud storing your sensitive data. Click the following link to learn about other advantages provided by Pieces Copilot+, such as its air-gapped live context based on your workstream.