It’s been nine months since Hurricane Harvey’s record rainfall wreaked havoc on the state of Texas, and now much of the state is in drought. At the same time, more than 15 percent of the Western United States is in extreme drought, up from practically zero percent a year ago. Around the world, many regions are experiencing drought, groundwater depletion, and municipal water shortages.
Dealing with these challenges will require investment in technology and infrastructure to conserve water and augment conventional water supplies. One obvious target for water conservation is the electric power sector, which consumes trillions of gallons of water per year. The question is whether we should focus on reducing the amount of water power plants consume or use the electricity they generate to produce freshwater through desalination instead. Let’s dig in to find out how these two strategies really compare.
How Do Power Plants Consume Water?
According to one estimate, electric power generation consumes more than three trillion gallons of water globally per year. Why are power plants so thirsty? Most power plants use a steam turbine to generate electricity. The steam coming out of the turbine has to be cooled, condensed back into water, and recycled through the system as shown in the illustration below. This cooling process is where most of the water is consumed at power plants.
According to the U.S. Energy Information Administration (EIA), the majority of power plants in the United States use “closed-cycle” or “recirculating” cooling systems. An illustration of a recirculating cooling system is shown below. In recirculating cooling systems, a separate stream of water is used to cool and condense the steam coming out of the turbine. This process heats up the cooling water, which is then sprayed into a cooling tower. Some of these hot water droplets evaporate and float out of the cooling tower, which is how heat exits the cooling system. The water lost to evaporation is the water “consumed” by the power plant. “Consumption” doesn’t mean the water is gone forever, as it eventually re-enters the water system through rainfall, but the water is no longer available locally after it evaporates.
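To get a feel for why evaporative cooling consumes so much water, here's a rough back-of-envelope estimate. The thermal efficiency, latent heat, and the assumption that all rejected heat leaves via evaporation are illustrative inputs for this sketch, not figures from the article:

```python
# Rough estimate of evaporative water consumption in a recirculating
# cooling system. Assumed values (illustrative, not from the article):
# a steam plant at ~33% thermal efficiency, water's latent heat of
# vaporization (~2.26 MJ/kg), and all rejected heat carried away by
# evaporation, which slightly overstates consumption.

EFFICIENCY = 0.33             # fraction of fuel heat converted to electricity
LATENT_HEAT_MJ_PER_KG = 2.26  # energy absorbed per kg of water evaporated
KG_PER_GALLON = 3.785         # 1 US gallon of water is about 3.785 kg
MJ_PER_KWH = 3.6

# Heat rejected to the cooling system per kWh of electricity generated
heat_rejected_mj = MJ_PER_KWH * (1 - EFFICIENCY) / EFFICIENCY

# Water that must evaporate to carry that heat out of the cooling tower
water_kg = heat_rejected_mj / LATENT_HEAT_MJ_PER_KG
water_gal = water_kg / KG_PER_GALLON

print(f"~{water_gal:.2f} gallons evaporated per kWh generated")
```

This comes out to a bit under a gallon per kilowatt-hour, in the same ballpark as published consumption factors for wet-cooled steam plants.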
Can We Cool Power Plants Without Water?
With so much concern about power plant water consumption, you might ask: why not take water out of the equation entirely and use a big fan to cool the steam coming out of the turbine? Such “dry cooling” systems do exist and have been deployed in water-scarce regions around the world, including parts of South Africa, China, and the United States. Unfortunately, dry cooling tends to reduce power plant efficiency. A steam turbine produces more power when the steam is condensed at a lower temperature, so on a hot day a dry-cooled power plant produces considerably less output than a comparable wet-cooled plant. The viability of dry cooling as a water conservation strategy therefore depends on how much extra fuel energy power plants must consume to make up for the lost efficiency, and how that energy cost compares with other strategies for managing water supply, like desalinating salt water. In simple terms: are we better off using a bunch of extra energy to switch power plants over to dry cooling, or redirecting that energy to desalination? Let’s dive in and answer that question.
The energy cost of saving water with dry cooling can be estimated from two factors: 1) the water saved by switching from wet to dry cooling and 2) the impact dry cooling has on a power plant’s efficiency. Both factors depend on the type of power plant: coal plants consume more water than combined cycle natural gas plants (the most efficient fossil fuel power plants available), and coal plants also lose more efficiency than combined cycle plants when switched to dry cooling. A summary of illustrative water savings and efficiency impacts of dry cooling is shown in the table below.
The bottom line is that it takes something like 55-130 kWh of electricity to save a thousand gallons of water by switching from wet to dry cooling systems for power plants. For comparison, an average American household uses about 30 kWh of electricity and 300 gallons of water per day.
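A quick calculation shows how a figure in this range falls out of the two factors above. The water-savings and efficiency-penalty inputs below are illustrative assumptions for the sketch, not the article's table values:

```python
# Sanity check on the "55-130 kWh per thousand gallons" range.
# Illustrative inputs (assumptions, not the article's table values):
# water consumption avoided per kWh generated, and the fractional
# output/efficiency penalty from switching to dry cooling.

plants = {
    "coal":               {"water_saved_gal_per_kwh": 0.7, "penalty": 0.04},
    "combined cycle gas": {"water_saved_gal_per_kwh": 0.2, "penalty": 0.02},
}

for name, p in plants.items():
    # Electricity sacrificed per thousand gallons of water saved:
    # (kWh lost per kWh generated) / (gal saved per kWh generated) x 1000 gal
    kwh_per_kgal = p["penalty"] / p["water_saved_gal_per_kwh"] * 1000
    print(f"{name}: ~{kwh_per_kgal:.0f} kWh per thousand gallons saved")
```

With these inputs the cost lands at roughly 57 kWh per thousand gallons for coal and 100 kWh for combined cycle gas, consistent with the 55-130 kWh range above.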
What’s The Best Way to Save Water?
How do these numbers compare to the energy intensity of desalinating salt water? The most common desalination technology, reverse osmosis, uses high-pressure pumps to force salt water through a membrane that separates freshwater from concentrated brine. Desalinating salty groundwater with this technology uses only 4-6 kWh per thousand gallons, and desalinating saltier seawater uses 10-15 kWh per thousand gallons. Desalination is several times more energy intensive than conventional water treatment, which uses less than 2 kWh per thousand gallons. Even so, the energy intensity of desalination is much lower than the energy intensity of saving water with dry cooling.
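Putting the figures from this section side by side makes the gap concrete. The ranges below are the ones quoted above:

```python
# Energy intensity (kWh per thousand gallons) of the water supply
# options discussed above, using the ranges quoted in the text.

options = {
    "conventional treatment":       (0, 2),
    "groundwater desalination":     (4, 6),
    "seawater desalination":        (10, 15),
    "dry cooling (per gal saved)":  (55, 130),
}

for name, (lo, hi) in options.items():
    print(f"{name}: {lo}-{hi} kWh per thousand gallons")

# Even the cheapest dry-cooling case costs several times more energy
# than the most expensive seawater desalination case.
dry_lo = options["dry cooling (per gal saved)"][0]
seawater_hi = options["seawater desalination"][1]
print(f"ratio (dry cooling low end / seawater high end): {dry_lo / seawater_hi:.1f}x")
```

Even comparing the most favorable dry-cooling figure against the least favorable seawater desalination figure, dry cooling uses well over three times as much energy per gallon.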
This analysis only considers the difference in energy consumption between saving water with dry cooling systems and treating water with desalination. There are, as ever, other factors to consider. For example, desalination plants are expensive and require proximity to salty groundwater or seawater. All else being equal, though, less energy intensive strategies for managing water supply are preferable to more energy intensive alternatives. Climate change has the potential to exacerbate the causes of water scarcity, and increased demand for energy to conserve or treat water puts additional pressure on plans to reduce carbon emissions. Thus, while reducing water consumption from electric power generation is a worthy goal, the environmental costs of technologies like dry cooling systems may outweigh the benefits compared to the alternatives.