Datacenters are slurping ever more energy to meet the growing demands of AI, but some estimates of future demand imply an increase in hardware that would be beyond the capacity of global chipmakers to supply, according to an environmental nonprofit.

Warnings about the amount of energy that AI datacenters will consume have been getting more strident. A recent report by Deloitte Insights estimated that the total power required by bit barns in the US will increase by a factor of five by 2035, and consultants Bain & Company advised utility companies to revamp their operations to support a rapid scale-up of energy resources.

But what happens if those estimates are overblown? If power companies invest heavily in additional generation and transmission infrastructure, but datacenter growth falls well short of the forecasts, the cost of that expansion would have to be borne by other customers.

Which … is already happening. $50-a-month rate hike?! As recently as 2019, I was paying $25 per month for my first MWh, all inclusive.

Meanwhile, some US power companies are already set to impose price hikes on consumers because of those pesky bit barns, according to various reports.

The Financial Times said that National Grid, which has customers in New York and Massachusetts, is to raise rates by $50 a month, while Northern Indiana Public Service Company is upping monthly rates by $23 a customer.

Reuters reports that electricity bills in the territory of PJM Interconnection, the grid operator serving a number of East Coast states, are set to rise by more than 20 percent this summer. Its coverage area includes Virginia, home to the largest concentration of datacenter capacity in the world.