by Brian Murray, Research Professor and Director of the Duke University Energy Initiative

I came of age in the 1970s. Our family car was a 1973 Chevy Nova, which got 9 miles per gallon of leaded gasoline. So when the Arab oil embargo hit, I spent a long time in gas lines on the days our license plate number allowed it. In March 1979, my sister called home from York College in Pennsylvania to say that she had to evacuate campus due to a partial meltdown at the Three Mile Island nuclear plant. I wasn’t sure what that meant, but it didn’t sound good. (It also inspired me to buy the “No Nukes” triple-record album recorded by the folk rock artists of the day.)

I didn’t realize it then, but my career would give me a front-row seat to even more energy sector drama.

Fifty years ago, annual primary energy consumption—what we use to fuel factories, light buildings, cook food, and transport ourselves—was rising at a tremendous clip alongside modernization: up 41% from 1960 to 1970 and up another 23% by 1980.

By contrast, annual consumption increased just 2% between 2000 and 2018. Why the difference? Economic growth has slowed, but energy intensity has dropped, too, as we’ve moved from a manufacturing- to a service-based economy. (Of course, the countries that now manufacture our goods continue to use a lot of energy.) And energy efficiency has improved, driven by market incentives (energy costs money!) and policies (e.g., fuel efficiency standards, appliance standards, and building codes).

Energy production has also changed substantially. Consider electric power generation.

The 1970s saw a massive build-out of nuclear plants, with the expectation that nuclear would soon be the country’s dominant source of power. But the Three Mile Island accident effectively halted new nuclear construction in the 1980s.

In 1980, coal provided 50% of our electricity generation, but three things reversed coal’s dominance: air pollution regulations (especially those addressing acid rain), the hydraulic fracturing revolution of the mid-2000s, and recent climate-driven decarbonization efforts. Coal now accounts for a quarter of electricity generation in the US, with natural gas producing 35%, renewables producing almost 20%, and nuclear covering the rest. As a result, greenhouse gas emissions from electricity generation have dropped 27% since 2005.

Natural gas fuels a greater share of our electric power because it’s gotten a lot cheaper. But renewables’ ascendancy is policy-driven: 29 states and Washington, D.C., have renewables mandates, and 11 states have a price on carbon emissions from electricity generation. Now renewables, once appreciably more expensive than fossil sources, are close in cost, even without subsidies.

What’s next? Many states and companies have set net-zero carbon targets for electricity by mid-century. Meeting those goals will require continued reductions in the cost of renewables, along with advances in energy storage and dispatchability. Meanwhile, other uses of energy—like transportation, manufacturing, and buildings—must decarbonize as well. Electrification will help with that but will involve additional technological challenges.

The bottom line: Public and private investment in R&D across economic sectors will be critical to accelerating both the adoption of renewables and the reduction of carbon emissions.