A senior Democratic lawmaker expressed concerns today about the United States’ ability to meet the growing energy demands accompanying the surge of artificial intelligence and other emerging technologies, particularly when it comes to beating China in the tech race.
During a Senate Committee on Energy and Natural Resources hearing, Chairman Joe Manchin, D-W.Va., said the U.S. is headed for a “calamity” if it’s not able to “energize these data centers to compete” with China.
“America will need more energy to meet the growing demand from data centers and a manufacturing resurgence that has resulted from the Bipartisan Infrastructure Law, CHIPS and Science Act, and Inflation Reduction Act,” Sen. Manchin said during his opening statement at today’s hearing. “For decades, power demand has been decreasing, but now we’re expecting a rapid turnaround this decade.”
“If America can’t build the energy infrastructure needed to support high tech industries, companies will choose to take their business elsewhere,” the chairman said. “We simply must get common sense policy like our bipartisan energy permitting bill enacted, or we’ll have squandered this opportunity and really put ourselves at risk.”
Shaun Gleason, the director of science-security initiative integration at the Department of Energy’s (DoE) Oak Ridge National Lab, testified that energy efficiency is a “grand challenge” in AI.
“Many are predicting that energy use by AI-driven data centers will approach 10 percent of U.S. energy demand by 2030,” Gleason said during his opening statement.
Helena Fu, the director of the DoE’s Office of Critical and Emerging Technologies, emphasized that the department is “laser focused” on the ability to generate energy to power data centers for AI.
“We understand the implications of having enough power to power both manufacturing that’s coming back to the United States, electrification of the grid, as well as the data centers and the AI that is going to be needed to train those models in the United States,” Fu said.
“We just recently issued a new website, a new hub for folks who want to work with us on these issues,” she continued. “There are new technologies and new tools that we have available – grants, tax credits, loans, technical assistance – that we’re bringing to bear on this particular issue. Our Lawrence Berkeley National Lab is also working on a study that’s looking at energy efficiency and data centers.”
Fu said that the DoE “understands the urgency of the issue” and will begin convening stakeholders around the country in “areas of high load growth” to start thinking through the challenge of increased energy demand due to AI.
“This is going to be an effort over the next several months,” Fu said. “Part of the challenge with AI load growth is that it is both a very large load and the expectations are very fast. This large load is going to come online now through the next few years, and so we have many different kinds of technologies that we’re looking at that are going out to 2030.”
Fu said the DoE is looking at steps they can begin taking now.
“Permitting, of course, is one piece of that. We have actually an AI and permitting pilot that we have already launched to see how we can use AI to expedite and streamline the permitting process,” she said. “We know that there is near term needs, midterm needs, and longer-term needs, and DoE is focused on all of those.”