Methane Is A Powerful Greenhouse Gas, But Where Does It Come From?

By Robert Talbot, Director of the Institute for Climate and Atmospheric Science (ICAS) and Professor of Atmospheric Chemistry, University of Houston

Carbon dioxide, or CO2, gets all the attention when people talk about global warming, but it’s far from the only greenhouse gas we should be thinking about. Methane (CH4) – like carbon dioxide, a gas emitted by both natural and man-made sources – is starting to draw more attention, too.

Methane has a global warming potential of 28 over a 100-year time frame, a measure developed to reflect how much heat it traps in the atmosphere, meaning a ton of methane will absorb 28 times as much thermal energy as a ton of carbon dioxide. That makes it a very important greenhouse gas, much more powerful than carbon dioxide.  Methane comes from natural sources, such as wetlands and animal digestion, along with thermogenic sources, including oil and gas production. Natural gas is approximately 90% methane.
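To make that conversion concrete, here is a minimal sketch of the CO2-equivalent arithmetic; the 1,000-ton leak and the function name are illustrative placeholders, not figures from this article.

```python
# Minimal sketch: converting a methane emission into CO2-equivalent terms
# using the 100-year global warming potential (GWP-100) of 28 cited above.
# The 1,000-ton leak is a hypothetical figure chosen only to show the arithmetic.

GWP_100_CH4 = 28  # one ton of CH4 traps about as much heat as 28 tons of CO2 over 100 years

def methane_to_co2e(tons_ch4: float) -> float:
    """Return the CO2-equivalent, in tons, of a methane emission."""
    return tons_ch4 * GWP_100_CH4

leak_tons = 1_000  # hypothetical methane release
print(f"{leak_tons:,} tons of CH4 ~ {methane_to_co2e(leak_tons):,.0f} tons of CO2e over 100 years")
```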

Recent analysis indicates that additional sources of atmospheric methane should be considered, as well.

While methane is just starting to gain public attention, scientists have been studying it for decades. The National Oceanic and Atmospheric Administration started measuring methane in the Earth’s atmosphere at its global monitoring sites, such as atop Mauna Loa in Hawaii, in the early 1980s. Throughout the ’80s, methane levels showed a steady increase of 1% to 2% per year, dropping to around 1% per year in the ’90s.
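As a rough illustration of what those growth rates mean when compounded, the short sketch below works out the total increase implied over a decade; it uses no measured concentrations, only the percentage rates quoted above.

```python
# Quick arithmetic: what the reported 1%-2% annual growth rates compound to over a decade.
# Illustration only; no measured methane concentrations are used.

def total_growth(annual_rate: float, years: int) -> float:
    """Total fractional increase after compounding an annual rate for `years`."""
    return (1.0 + annual_rate) ** years - 1.0

for rate in (0.01, 0.02):
    print(f"{rate:.0%} per year for 10 years -> about {total_growth(rate, 10):.1%} total increase")
```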

It held steady from 2000 until 2007, when concentrations abruptly began to rise again, an increase that continues today (Figure 1).

These changes have been challenging for scientists to explain quantitatively and to attribute explicitly to varying sources.

Figure 1: Global Monthly Mean of Methane

Recently there has been a flurry of activity to quantify fugitive methane emissions from oil and gas production sites. Indeed, I was a participant in the Barnett Shale Coordinated Campaign in 2013. Using our mobile laboratory, we visited 152 facilities and found that the largest emissions came not from well sites but from compressor stations and chemical processing plants. Other studies have investigated distribution systems and additional components of the delivery chain. All were found to be leaking methane to some degree. Could the recent 10-year increase in global methane be related to oil and gas production?

The answer appears to be probably not.

A paper published in Science magazine last year showed that the carbon-13 (13C) isotopic signature of atmospheric methane has been shifting on a global basis. Carbon-13 is useful because it can distinguish different sources of methane from one another. For example, the isotopic analysis suggests a shift away from oil and gas sources in the 21st century and indicates that global agriculture may be responsible for the recent increase in atmospheric methane.
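For readers unfamiliar with the notation, the sketch below shows the standard delta-13C calculation behind this kind of source fingerprinting; the reference ratio, sample value and classification cutoff are approximate, illustrative numbers, not values taken from the Science paper.

```python
# Illustrative sketch of the delta-13C notation used to fingerprint methane sources.
# The reference ratio, the sample ratio and the -55 per-mil cutoff below are
# approximate, textbook-style placeholders, not values from the Science paper.

R_VPDB = 0.011180  # approximate 13C/12C ratio of the VPDB reference standard

def delta13c(r_sample: float) -> float:
    """delta-13C in per mil (parts per thousand) relative to the VPDB standard."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

def likely_source(d13c: float) -> str:
    """Very rough classification: biogenic methane (wetlands, agriculture) is typically
    more depleted in 13C, i.e. has a more negative delta, than thermogenic methane."""
    if d13c < -55.0:
        return "likely biogenic (e.g., wetlands, rice, livestock)"
    return "likely thermogenic (e.g., oil and gas)"

sample_ratio = 0.01051  # hypothetical measured 13C/12C ratio
d = delta13c(sample_ratio)
print(f"delta-13C = {d:.1f} per mil -> {likely_source(d)}")
```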

This directly contradicts emission inventories and points out the growing problem of controlling methane emissions while still feeding an increasing human population – truly a delicate balance to manage responsibly.

A second scenario that has been suggested to account for increasing global methane is increasing production of biogenic (bacterial) methane in tropical areas. Under global warming, these areas are receiving more rainfall, which increases the size of flooded areas. This may, in turn, enhance the biogenic production of methane.

However, the more likely scenario appears to be increasing agriculture driven by a growing human population, which is consistent with the isotopic analysis.

The situation should become clearer in the future as more data is collected. Stay tuned.


Dr. Bob Talbot is Professor of Atmospheric Chemistry and Director of the Institute for Climate and Atmospheric Science (ICAS) at the University of Houston. He is also an adjunct Professor of Atmospheric Chemistry in the School of Atmospheric Science at Nanjing University, Nanjing, China, where he serves as Vice Chair of the Institute for Climate and Global Change Research. Dr. Talbot has been part of the NASA Global Tropospheric Chemistry program since 1983, serving on the science team for 20 major airborne expeditions supported by the program, and is currently Editor-in-Chief of the international journal Atmosphere.


Rethinking Chemical Storage: A Wake-Up Call From Harvey

By Ramanan Krishnamoorti, Chief Energy Officer at the University of Houston

Hurricane Harvey, and especially the flooding along the Gulf Coast that accompanied the storm, offered a litmus test for the safety of the nation’s petrochemical and refining industry. With a few notable exceptions, the plants passed.

Investments in plant and equipment safety appear to be paying off. Storage, transportation and other supply chain issues need similar attention. The substantial economic and environmental impact Harvey imposed on the industry is a stark illustration of that.

The Federal Reserve Bank has noted that the hurricane and flooding affected about 30% of U.S. refining and petrochemical production. That followed similar disruptions to petrochemical production from earlier hurricanes and weather events, including hurricanes Katrina (2005) and Ike (2008).

During Harvey, production facilities, including refineries and chemical plants, and the raw material supply chain – from tankers at ports to offshore and onshore production wells – were systematically shut down and process safety barriers implemented. No significant production mishaps were reported.

Impressively, no significant safety-related issues were reported when many of these systems came back online, either.

The soft underbelly of the chemical and petrochemical industry along the Gulf Coast turned out to be the storage of raw materials, intermediates and refined products, not the process of refining or chemical manufacturing or their startup or shutdown processes.

Petrochemical storage facilities continue to be vulnerable during natural disasters, risking releases that can damage the environment and endanger public safety.

The most recent example happened when Harvey-related flooding swamped the Arkema Inc. facility in Crosby, Texas, about 30 miles from downtown Houston. That triggered the ignition of highly energetic organic peroxides when the plant’s emergency power system failed to maintain the refrigeration required to keep the chemicals stable.

Consumer Protection Concerns As Big Tech Enters The Energy Markets

By Gina S. Warren, Associate Professor, University of Houston Law Center

Imagine a situation where technology companies have the motive, means and opportunity to supply energy with one click. It may not be too far-fetched, and already some of the ramifications for utilities and consumers are becoming clear.

Private tech companies like Apple and Google have emerged onto the energy landscape, a shift that could have a significant impact on the existing energy delivery system. In June 2016, Apple Energy received federal approval to sell wholesale electricity into the national grid. Prior to that, Google Energy received approval to do the same. Globally we are seeing more private businesses, especially Fortune 500 companies, generating their own electricity, investing in renewable energy facilities and voluntarily purchasing renewable energy credits to cover their carbon footprints.

While multiple reasons have factored into this shift, one reason may be that utilities are unable to supply the amount of renewable energy now in demand by large businesses, and those businesses are working to meet the market demands of millennials who are seeking sustainable products.

According to a 2015 market study conducted by Morgan Stanley, millennials – and especially female millennials – care significantly more about sustainability than previous generations. With a whopping 84% of millennial investors identifying sustainability as an important factor when making living and investment decisions, private businesses are taking note.

Apple, Google and 70 more of the world’s most influential companies have joined RE100, a collaborative of businesses committed to using only electricity generated from renewable sources and to increasing the demand for, and access to, renewable energy around the globe. While their goals vary, these companies have all committed to becoming 100% renewable by a certain date, with nearly half using some form of on-site power generation. According to a 2016 RE100 report, most companies do not want to become energy providers, but “the lack of responsiveness from utilities in some regions has forced them to do exactly this.” Unless the power sector can find “more proactive and creative solutions,” corporate self-supply may be the wave of the future.

The energy delivery landscape will continue to change as more and more businesses self-generate. This change can be positive in that we are adding much needed renewable energy. As the negative impacts of climate change accelerate around the globe, the goal of decreasing reliance on fossil fuels is certainly an important one. It is a time of opportunity for collaboration between utilities and businesses that would allow companies to access the renewable energy they demand and utilities to avoid lost profits and stranded investments.

One concern, however, is the private disruption of what has historically been a highly regulated public service industry, potentially resulting in a slippery slope of market power and a loosening of consumer protection.

Safeguarding consumer protections will be key. As large multinational corporations seek to sell electricity, the Federal Energy Regulatory Commission (FERC), the agency in charge of regulating wholesale energy sales, will need to implement more protective measures to ensure consumers are charged reasonable and nondiscriminatory rates for electricity and energy products. Under FERC’s current rule, these large corporations are allowed to use market-based rates and given a lot of leeway in setting customer rates for electricity or energy products, so long as they do not own or control more than a certain amount of electricity within any given region. This is called the horizontal market power rule. The rule was intended to promote competition and entry into the market by small utilities and independent power producers. Large utilities holding horizontal market power do not qualify but instead are subject to more stringent regulation by FERC so as to ensure their rates are fair, reasonable and non-discriminatory.
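As a simplified illustration of how such a screen operates, the sketch below checks a hypothetical seller's share of regional capacity against a threshold; the 20% cutoff and all of the capacity figures are assumed placeholders, not FERC's actual screen parameters, which involve additional tests and market definitions.

```python
# Simplified, hypothetical sketch of a horizontal market power screen.
# The 20% threshold and the capacity figures are placeholders for illustration;
# FERC's actual indicative screens involve additional tests and seasonal data.

SHARE_THRESHOLD = 0.20  # assumed screen threshold, not an official figure

def passes_market_share_screen(seller_mw: float, regional_mw: float,
                               threshold: float = SHARE_THRESHOLD) -> bool:
    """Return True if the seller's share of regional capacity is below the threshold,
    i.e., the seller would remain eligible for market-based rates under this toy screen."""
    return (seller_mw / regional_mw) < threshold

# Hypothetical example: sellers of different sizes in a 25,000 MW region
print(passes_market_share_screen(3_000, 25_000))   # True  (12% share)
print(passes_market_share_screen(6_000, 25_000))   # False (24% share)
```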

How American Fracking Ran OPEC’s Oil Recovery Off The Rails

By Bill Gilmer, Director of the Institute for Regional Forecasting, C.T. Bauer College of Business

Last fall, it seemed the end of the global oil glut was at hand: optimism soared after OPEC committed to speed the rebalancing by limiting production. Oil prices were expected to move quickly to $55 to $60 per barrel, then continue climbing in 2017. The rig count rose, and jobs began to return throughout the oil patch.

But it has since become another false start for oil markets.  Oil prices remain mired between $45 and $50 per barrel, and price expectations – measured by the futures market for West Texas Intermediate – have fallen back to levels well below those that prevailed before the OPEC accord. The domestic rig count has peaked for now, and the big investment houses forecast a decline in domestic drilling through the second half of this year.

What happened? American fracking ran the recovery off the rails. A competitive industry that, in principle, should move oil output and prices to stable long-run levels, fracking is once more living too high on large subsidies to its capital base and operating costs. This leaves oil markets locked in a destructive cycle that has again reached the stage of over-production and depressed prices. It has brought us to the brink of yet another pull-back in U.S. drilling activity, and another round of financial stress for many producers.

What happened to $60 Oil?

The revolution wrought by American fracking is a technical marvel, but it also leaves the industry largely responsible for the 2014 oil bust. Figure 1 shows how horizontal drilling and hydraulic fracturing reversed 40 years of declining U.S. oil production, adding just over four million barrels per day (b/d) of new production between 2011 and 2014. This was the only source of new non-OPEC oil during the period, and it flooded into a market that had averaged annual growth of only 1.3 million b/d over the previous 20 years.

Figure 1: U.S. Shale Reversed 40 Years of Declining Oil Production
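To see why that volume was so disruptive, here is the back-of-the-envelope arithmetic implied by those figures; the annualization is a rough calculation for illustration, not a number from the article.

```python
# Back-of-the-envelope comparison using the figures cited in the text above.
shale_added_mbd = 4.0        # new U.S. shale output added, million b/d, 2011-2014
years = 3                    # 2011 through 2014
historical_growth_mbd = 1.3  # average annual market growth over the prior 20 years

annual_shale_mbd = shale_added_mbd / years
print(f"Shale added ~{annual_shale_mbd:.2f} million b/d per year on average,")
print(f"versus ~{historical_growth_mbd} million b/d of typical annual market growth.")
```

In other words, shale additions alone roughly matched the market's entire typical annual growth, leaving little room for any other new supply.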

The economics of the shale revolution set it well apart from conventional oil exploration and production. In contrast to the oligopolistic markets of Shell, Exxon and the giant national oil companies, fracking looks and behaves more like a competitive industry: numerous small firms, low barriers to entry, and production that can be quickly ramped up or down as prices change. Unlike conventional oil, fracking carries no significant exploration risk, making output relatively certain and the process more like an assembly line. Given these properties, if the long-run equilibrium oil price is $60 per barrel – something both the petroleum and financial engineers tell us – then producer behavior should move supply and demand into balance near that level.
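The toy sketch below illustrates that competitive logic under the assumed $60-per-barrel long-run breakeven; the decision rule and the price points are simplifications for illustration, not a model from this article.

```python
# Toy sketch of competitive producer behavior, assuming a $60/bbl long-run breakeven.
# The decision rule and price points are simplified placeholders for illustration.

ASSUMED_BREAKEVEN = 60.0  # $/bbl long-run equilibrium price cited in the text

def drilling_decision(expected_price: float, breakeven: float = ASSUMED_BREAKEVEN) -> str:
    """In a competitive, low-entry-barrier industry, producers add rigs when the
    expected price exceeds breakeven and idle them when it falls below."""
    if expected_price > breakeven:
        return "add rigs (supply rises, pulling price back down toward breakeven)"
    if expected_price < breakeven:
        return "idle rigs (supply falls, pushing price back up toward breakeven)"
    return "hold steady (market near its long-run equilibrium)"

for price in (45.0, 60.0, 75.0):
    print(f"${price:.0f}/bbl -> {drilling_decision(price)}")
```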

The trigger for the 2014 collapse in oil prices was OPEC’s declaration that it would no longer act as swing producer, i.e., no longer withdraw oil from world markets to support prices. OPEC handed that job off to American fracking, but the industry has proved messy and undisciplined in the role. U.S. production fell slowly, by nearly 900,000 b/d, in response to low oil prices, but began to rise again in late 2016. Based on drilling already performed, it should return to near record-high levels by year-end. The new production was triggered partly by OPEC’s renewed efforts to rebalance oil markets last November and partly by perverse incentives enjoyed by the fracking industry.