The facilities, giant warehouses of computer servers, have long been big power users. They support all that Americans do on the internet — from online shopping to streaming Netflix to watching influencers on TikTok.
But the specialized chips required for generative AI use far more electricity — and water — than those that support the typical internet search because they are designed to read through vast amounts of data.
And because those new chips generate so much heat, more power and water are required to keep them cool.
“I’m just surprised that the state isn’t tracking this, with so much attention on power and water use here in California,” said Shaolei Ren, associate professor of electrical and computer engineering at UC Riverside.
Ren and his colleagues calculated that the global use of AI could require as much fresh water in 2027 as is now used by four to six countries the size of Denmark.
In a Chicago suburb, a developer recently bought 55 homes so they could be razed to make way for a sprawling data center campus.
Energy officials in northern Virginia, which has more data centers than any other region in the world, have proposed a transmission line to shore up the grid, a plan that would depend on coal plants that had been expected to be shuttered.
In Oregon, Google and the city of The Dalles fought for 13 months to prevent The Oregonian from getting records of how much water the company’s data centers were consuming. The newspaper won the court case, learning the facilities drank up 29% of the city’s water.
By 2030, data centers could account for as much as 11% of U.S. power demand — up from 3% now, according to analysts at Goldman Sachs.
But the lower rates that have helped lure data centers come with a higher cost to the climate. Santa Clara’s municipal utility, Silicon Valley Power, emits more greenhouse gas than the average California electric utility because 23% of its power for commercial customers comes from gas-fired plants. Another 35% is purchased on the open market, where the electricity’s origin can’t be traced.
Loretta Lynch, former chair of the state’s public utilities commission, noted that big commercial customers such as data centers pay lower rates for electricity across the state. That means when transmission lines and other infrastructure must be built to handle the increasing power needs, residential customers pick up more of the bill.
“Why aren’t data centers paying their fair share for infrastructure? That’s my question,” she said.
In June, PG&E revealed it had received 26 applications for new data centers, including three that need at least 500 megawatts of power, 24 hours a day. In all, the proposed data centers would use 3.5 gigawatts. That amount of power could support nearly 5 million homes, based on the average usage of a California household of 6,174 kilowatt-hours a year.
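That household figure makes the conversion easy to check. The sketch below is a back-of-the-envelope calculation only: it assumes the proposed 3.5 gigawatts would be drawn around the clock, as the article describes, and it ignores transmission losses and the difference between peak and average household demand.

```python
# Rough check of the "nearly 5 million homes" figure cited above.
# Assumes the proposed 3.5 GW of data center load runs continuously all year.

PROPOSED_LOAD_GW = 3.5            # combined load of the proposed data centers
HOURS_PER_YEAR = 24 * 365         # 8,760 hours
HOME_KWH_PER_YEAR = 6_174         # average annual usage of a California household

annual_energy_kwh = PROPOSED_LOAD_GW * 1_000_000 * HOURS_PER_YEAR  # GW -> kW -> kWh
homes_supported = annual_energy_kwh / HOME_KWH_PER_YEAR

print(f"Homes supported: {homes_supported:,.0f}")  # about 4.97 million
```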
One of the group’s concerns about the geothermal plant is that its water use will leave less to replenish the Salton Sea. The lake has been decreasing in size, creating a larger dry shoreline that is laden with bacteria and chemicals left from decades of agricultural runoff. Scientists have tied the high rate of childhood asthma in the area to dust from the shrinking lake’s shores.
James Blair, associate professor of geography and anthropology at Cal Poly Pomona, questioned whether the area was the right place for a mammoth data center.