If you haven’t noticed, data centers have been sprouting up all over. And they’re mighty thirsty.
Inside these secured, windowless compounds, where servers hum and data never sleeps, AI systems guzzle millions of gallons of water (a single large data center can consume as much water as a town of 10,000 people, or even more) just to stay cool.
In this new reality, no sector has escaped the lure of advanced tech: Faster processing! Smarter analytics! More data than ever! But every leap forward comes at a cost. For AI, it shows up on the utility bill: electricity to run the servers, and, in more and more cases, vast amounts of water to keep them from overheating.
In a region like Sacramento, where the grid strains under demand and droughts loom, some critics, including Assemblymember Diane Papan, a Democrat from San Mateo, see data centers as hulking, water-hogging threats to local communities.
“With the rise of artificial intelligence and data-driven technologies, we must manage our resources responsibly,” said Papan, author of AB 93, in a press release.
The bill would’ve required data centers to disclose how much water they use as part of local business licensing. Gov. Gavin Newsom vetoed it in October, saying he didn’t want to impose reporting requirements on tech businesses without knowing how they might affect the companies and their customers. But Papan and other lawmakers have vowed to keep fighting for transparency, saying: “Every drop matters in California.”
So on one side, we have a push for new AI policy, and on the other, pushback to protect privacy. Through the turbulence of this ethical debate, innovation is charting a new course as companies develop water-efficient cooling systems to keep data flowing in the digital world without draining resources in the real one.
As data centers keep rising, the urgency is real, and the challenge is clear: Whoever can handle AI’s big demands with the smallest water footprint will shape the industry’s next phase.
“In a good way, this is forcing innovation in new ways of power,” says Roger Corell, senior director of AI leadership and marketing at Solidigm. “When you hear about certain communities hesitant on an AI data center because of implications on the power grid, that forces us to develop more efficient alternative forms of energy.”
A state of flux
Solidigm develops enterprise solid-state drives, or SSDs, for data centers. Spun out of Intel, this Rancho Cordova-based subsidiary of SK Hynix recently announced the world’s first hot-swappable liquid-cooled enterprise SSD, designed to help improve thermal efficiency of AI-powered servers.
Why is this important? Picture a data center, a giant warehouse filled with metal racks, each about 2 feet wide, 3.5 feet deep and 6.5 feet tall. Inside these racks, you’ll find four pieces of critical IT equipment that make the digital world go round: compute, memory, storage and networking.
General-purpose server racks typically draw 10 to 15 kilowatts of power. NVIDIA’s latest GB200 and GB300 GPU racks can draw 120 to 130 kilowatts at peak load, roughly the combined draw of 65 U.S. homes running at once.
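Where does that 65-home comparison come from? A back-of-the-envelope sketch, assuming (for illustration only, not a figure from the article) that a typical U.S. home draws about 2 kilowatts at any given moment:

```python
# Back-of-the-envelope check of the rack-to-homes comparison.
# Assumed for illustration (not from the article): a typical U.S.
# home draws about 2 kilowatts at any given moment.
RACK_PEAK_KW = 130   # peak draw of a GB200/GB300-class GPU rack
HOME_DRAW_KW = 2.0   # assumed average per-home draw

print(f"One rack at peak ~ {RACK_PEAK_KW / HOME_DRAW_KW:.0f} homes")  # ~65
```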
Traditional air cooling systems (fans, chillers and conditioned airflow) worked well enough when racks ran cooler. But at these new power densities, air can’t remove heat fast enough. This is why liquid cooling, Corell says, is the crucial breakthrough of the moment.
There are two types of liquid cooling in a data center: direct-to-chip, where cold liquid flows through a plate sitting on top of critical IT devices such as GPUs or SSDs; and immersion cooling, where the entire server is plunged into a tank of non-conductive liquid.
“The beauty of both approaches, whether direct-to-chip or immersion, is they’re so much more effective than air cooling because they’re closed-loop systems,” Corell says. “I just continuously circulate that. I don’t need to refill it. They are very sustainable once the initial setup is done.”
Solidigm recently worked with NVIDIA to develop the first direct-to-chip SSDs to support cold-plate cooling for GPU servers. Storage plays a vital role because, more than general compute workloads, AI moves massive volumes of data in and out of storage at high speeds, which generates heat in the process.
As data growth continues exponentially, Corell expects direct-to-chip cooling to become mainstream in 2026. Looking ahead, he sees “neoclouds,” a new generation of AI-customized cloud infrastructure, on the horizon.
“At a macro level, the more distributed you can make AI compute, the better position the industry is in to have a more sustainable strategy,” he says. “You don’t want 5-gigawatt data centers everywhere. But you can support 50- to 100-megawatt data centers with a more distributed approach to balance supply and demand between the power grid and what the AI compute demands are.”
No downtime
With large-scale AI data centers, electricity consumption can match that of the entire city of New York, says Vinod Narayanan, professor of mechanical and aerospace engineering and faculty director of UC Davis’ Western Cooling Efficiency Center.
“You’re talking 10 large, 1-gigawatt nuclear power plants,” he says, referring to the electricity required for colossal AI buildouts, such as OpenAI’s Stargate project. “It’s mind-boggling.”
Narayanan studies modular, sustainable data centers as part of ARPA-E’s COOLERCHIPS program. The electricity issue is its own beast. His focus is on cutting a data center’s second biggest power drain: the energy used for cooling. In traditional air-cooled data centers, fans and chillers account for about 40 percent of total electricity, he says. So he’s been asking some pressing questions: How do you design a cooling system that doesn’t rely on air conditioners or chillers? If one exists, what are the challenges? And why isn’t it widely used today?
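That 40 percent figure implies a steep overhead. A rough sketch of the arithmetic (my simplification, assuming a facility’s power splits only between IT equipment and cooling):

```python
# What a 40% cooling share implies, under the simplifying assumption
# that a facility's power goes only to IT equipment and cooling.
COOLING_SHARE = 0.40                 # fans and chillers, per Narayanan
it_share = 1.0 - COOLING_SHARE       # what's left for the servers
overhead = COOLING_SHARE / it_share  # cooling kW per kW of compute

print(f"~{overhead:.2f} kW of cooling for every kW of compute")  # ~0.67
```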
While liquid cooling has been used in supercomputers for years, he says, commercial server manufacturers were hesitant at first.
“If there’s a leak, for example, the entire system is ruined,” Narayanan says. “You shut down the entire data center. And data centers are all about reliability. They don’t want any downtime.”
Today, he notes, manufacturers have come around. Reliability remains critical in a data-driven environment, and liquid cooling adoption has gained momentum. Even so, evaporative cooling remains common in large data centers. This method, unlike traditional air conditioning, uses water to absorb heat from the air before it flows into the server room, like how sweat cools the human body.
Narayanan says evaporative-cooled data centers typically need 1.8 to 2 liters of water per kilowatt-hour of heat removed. Depending on the facility’s size, that can add up to more water in a year than several thousand homes use.
“I did some calculations recently,” Narayanan says. “If you had an enterprise data center, 5-megawatt compute, you would be using about 20.8 million gallons of water annually. If it’s a hyperscaler, with 50 megawatts, then you’re talking about 200-plus million gallons per year. So that’s water you can’t have for other purposes like drinking and irrigation. It’ll stress the water resources in the community for sure.”
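His figures are easy to reproduce. A minimal sketch, assuming round-the-clock operation at full load and the low end of his 1.8-2 liters-per-kilowatt-hour range:

```python
# Reproducing Narayanan's estimates for evaporative-cooled data centers.
LITERS_PER_KWH = 1.8       # low end of the 1.8-2 L/kWh range he cites
LITERS_PER_GALLON = 3.785
HOURS_PER_YEAR = 8760

def annual_water_gallons(compute_mw: float) -> float:
    """Gallons of cooling water per year at continuous full load."""
    kwh_per_year = compute_mw * 1_000 * HOURS_PER_YEAR
    return kwh_per_year * LITERS_PER_KWH / LITERS_PER_GALLON

print(f"5 MW enterprise:  {annual_water_gallons(5) / 1e6:.1f}M gal/year")   # ~20.8M
print(f"50 MW hyperscale: {annual_water_gallons(50) / 1e6:.0f}M gal/year")  # ~208M
```

The low end of the range lands almost exactly on his 20.8 million-gallon figure; the high end pushes both numbers about 10 percent higher.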
Dripping with irony
In the Capital Region, water scarcity isn’t a foreign concept.
The climate crisis is already shifting weather patterns, intensifying droughts and putting pressure on rivers that feed the state’s ecosystems. The Sacramento-San Joaquin Delta, the source of water for millions, is currently in the middle of a tug-of-war. As CalMatters reports, Gov. Newsom, state agencies and water suppliers want to send Delta water south to farms and cities. But conservationists, fishing communities and Native tribes say hands off.
The fight isn’t ending anytime soon. And now come AI data centers, tapping into the same limited resource even as it grows less predictable. Sacramento isn’t likely to run out of water, but as the weather gets hotter and drier and Sierra Nevada snowpack declines, each new industrial player raises the stakes.
CoreSite operates a data center in downtown Los Angeles. (Shutterstock photo)
Meanwhile, tech companies refuse to talk about what goes on behind the data center walls. NTT Global Data Centers operates a sprawling campus in Sacramento with three facilities. Prime Data Centers runs one at McClellan Park with another under construction. Both advertise multiple cooling systems, but neither responded to multiple requests for comment. Neither did Conscious Data Centers, which operates a facility in Rancho Cordova and promotes a “patented cooling technology” that improves energy efficiency.
Before the AI boom, Datacate highlighted its cooling setup, designed by Australian company Climate Wizard. The system uses an indirect evaporative heat exchanger to draw heat and moisture from the air, lowering server room temperature before venting hot air outside. That was in 2017. When asked about updates, the Rancho Cordova-based company said it had no PR contact and declined to share details.
“Sorry, we don’t do that here,” said an operator over the phone, referring to media inquiries.
There’s a certain irony to it all: Companies storing the world’s data don’t want to share their own.
On North Market Boulevard, QTS runs a 92,000-square-foot facility. The company emphasized its collaboration with utility companies and sustainable practices that save “millions of gallons of water annually per megawatt.”
“Our data centers are built with a water-free cooling system that, once operational, does not consume water for cooling,” the company said in a statement.
Water divisions
In The Dalles, Oregon, Google’s data center accounts for more than a quarter of the city’s total water use. In California, water for data centers is statutorily defined as a type of “process water,” which falls outside the statewide “Conservation as a Way of Life” regulatory framework. This means, unlike the roughly 400 urban retail water suppliers, data centers aren’t obligated to meet performance measures for efficiency. (Although they may be subject to local efficiency rules.)
The fact is, California has no universal water system. There are thousands of public water systems across the state, so tracking, billing and oversight vary. The California Public Utilities Commission says most data centers are treated like any other commercial customer, except in regions with limited supply or where additional capital is needed to serve them.
A data center, like all other water users, can get water in three ways: buying it from a water district, diverting it from rivers or canals, or pumping it from underground aquifers. If it’s connected to a water district, the operator pays by the unit, and the district tracks usage. Surface-water users need a state-issued water right and report on how they use it. Groundwater users fall under the Sustainable Groundwater Management Act, with requirements depending on the basin.
How a data center gets its water also determines what happens during shortages. With surface water, for example, cutbacks follow a “seniority” system: the oldest rights are served first, while newer ones are shut off first. During the last two droughts, only health and safety uses were allowed to keep drawing water.
In the Initial Statement of Reasons for the Conservation as a Way of Life regulation, the State Water Resources Control Board highlights studies showing that when water agencies have to increase supply to meet the demands of customers who use the most, rates go up for everyone, hitting low-income households the hardest.
According to the board, “If data center water use is as significant for a water agency as the peak water use of its highest water-using residential customers, the consequences may be similar.”