Jonathan Swift once observed, “We have just enough religion to make us hate one another, but not enough to make us love one another.” In a modern-day analog, we have just enough artificial intelligence to devour massive amounts of energy and socialize the cost, but not enough to augment human intelligence meaningfully, efficiently or equitably.
Thanks to AI, the combined electricity consumption of hyperscale data centers could double within the next three years. Planned projects call for a gigawatt of electricity each, enough to power 750,000 homes. Cheyenne hosts 14 data centers, with two more being constructed by Related Digital and Meta. Crusoe and Tallgrass announced plans for an additional 1.8 gigawatts in Cheyenne that may expand to 10 gigawatts. Nationally, data center demand is expected to grow from today’s 30 gigawatts to 134 gigawatts by 2030 – 17% of the entire country’s peak demand.
Market forces motivate utilities to offer data centers secret contracts that can shift costs to the public. These deals help utilities compete in unregulated markets at the expense of their captive ratepayers. Utilities tell public utility commissions what they want to hear: that special contracts isolate data center energy costs from other ratepayers’ bills. But verifying this claim is all but impossible.
Why? First, public utility commission rate-setting processes rely on the validity of data provided by utilities protecting their own interests. Second, utilities’ confidential agreements with data centers prevent full public participation in commission proceedings. Third, a complex system of state and federal cost allocation for additions to transmission capacity obscures the drivers of those costs and provides cover for regulatory approval. Fourth, surging data center demand may necessitate reliance on backup generators for “peak shaving”: some data centers run backup generation during peak system load to cut their utility bills, leaving other consumers to absorb the discount. Finally, data centers may contract with merchant power plants, pulling them off the open market, straining grid capacity and raising regulated rates.
AI energy intensity amplifies environmental costs. Nationwide, 60% of the electricity used to power data centers comes from fossil fuels, releasing tens of thousands of tons per year of harmful air pollutants. The total annual public health burden of U.S. data centers could exceed $20 billion by 2028. Their carbon footprint is also escalating, with 105 million tons of greenhouse gases emitted in 2023, and potentially 250 million tons or more by 2035.

The Trump administration fosters this emissions upsurge by promoting fossil fuels and suppressing renewable energy expansion. EPA has embraced Trump’s executive order, “Removing Barriers to American Leadership in Artificial Intelligence.” Rather than earnestly enforcing Clean Air Act requirements, EPA now offers to assist data centers seeking to avoid permitting delays and disguise potential emissions. On its just-released website, EPA refers to its own permitting regulations as “overly burdensome red tape.”
The repurposed EPA recommends permitting loopholes that allow a data center that would normally qualify as a major source of air pollutants to avoid the attendant hurdle known as “prevention of significant deterioration” (PSD). Such permitting takes longer and requires more extensive modeling and monitoring. To stay under the minor-source emissions cap, EPA encourages generating plants serving AI data centers either to split into pieces that individually qualify as minor sources or to nominally constrain their output (a “synthetic minor source”).
Project Jupiter, a massive $165 billion AI data center in New Mexico proposed by OpenAI and Oracle, exploited both loopholes. The companies submitted minor source air permit applications for a combined 2.2 gigawatts of natural gas turbine capacity split between two microgrids. Each half of the project would further restrict its output to emit just under the minor source limit for nitrogen oxides.
AI developers may anticipate the political impracticality of enforcing minor source limits at a time of soaring power demand. Microsoft has pursued minor source permits for a borderline major source in Cheyenne. The Cheyenne Power Hub resisted this shortcut, submitting a PSD application to power the 1.8-gigawatt Crusoe/Tallgrass data center with natural gas turbines. Nonetheless, the cumulative air quality impacts on Cheyenne from current and future data centers have been understated.
AI enthusiasts predict it will solve intractable problems like climate change. Ironically, its energy intensity and its rushed reliance on fossil fuels have reversed the decline in U.S. greenhouse gas emissions. AI lacks either the reasoning power or the conviction to connect its own excesses to the indisputable evidence that we are heating the planet. It seems unlikely that quadrupling its computing power will solve the paradox. Rather, we must depend on real intelligence to restrain our appetites, uphold quality over quantity and distinguish truth from nonsense.
