Count On Sea Levels To Keep Rising For A Long Time

By Bob Talbot, Professor of Atmospheric Chemistry and Director, Institute for Climate and Atmospheric Science, University of Houston

The world’s oceans have been warming for decades. Increasing water temperatures – driven by higher emissions from a variety of greenhouse gases – have caused the oceans to thermally expand. Glaciers and other previously frozen areas are melting, aggravating and accelerating the rise of the ocean surface.

Fossil fuels are a key contributor to the warming, but they are not the only one.

Scientists now track ice across the Arctic and Antarctica, and what they are finding isn’t encouraging. Last year was the warmest year ever recorded for the global oceans, a phenomenon linked to a number of potential problems, including damage to important habitats such as coral reefs and risks for certain animal populations.

In addition, the Arctic Ocean is expected to be ice-free during the summer within the next 20 years.

Rising sea levels are among the most visible signs of climate change, as well as one that will have a dramatic impact on humans.

And it’s happening faster along the Gulf Coast – home not only to the nation’s fourth-largest city, Houston, but also to much of the nation’s critical energy infrastructure – than anywhere else in the United States: between 5 and 10 millimeters per year.
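For a rough sense of scale, those rates can be extrapolated in a few lines of Python. This is a deliberate simplification (observed rates are accelerating, so a linear projection likely understates the eventual rise), and the 50-year horizon is an arbitrary illustration:

```python
# Rough linear extrapolation of Gulf Coast sea-level rise.
# The 5-10 mm/yr rates are from the article; assuming a constant
# rate is a simplification, since observed rates are accelerating.
def projected_rise_mm(rate_mm_per_yr: float, years: int) -> float:
    """Cumulative rise in millimeters after `years` at a constant rate."""
    return rate_mm_per_yr * years

for rate in (5, 10):
    rise_cm = projected_rise_mm(rate, 50) / 10  # convert mm to cm
    print(f"{rate} mm/yr over 50 years -> {rise_cm:.0f} cm")
```

Even at the low end, that is a quarter of a meter within a single lifetime, before any acceleration is accounted for.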

Eventually, cities such as Galveston will be underwater, and the rising waters also will impact the Port of Houston’s operations in coming decades. This is the largest U.S. port in terms of tonnage handled each year, and the amount is increasing due to enhanced Panama Canal ship traffic.

The same thing is happening along the Florida Keys, where some areas are already flooded today. Hurricane Irma accelerated the erosion of beaches and other low-lying areas. In many locations, residents now drive to the local grocery store through several inches of seawater on the roadways, and although efforts to raise the roads are underway, they won’t be cheap.

All of this is just the tip of the iceberg.  The worst is yet to come. And the economic impact on the United States could be dramatic.

The causes are complex. That means the solutions – and the timeline for any possible recovery – are complex, too.

Fossil fuels are a major contributor to the problem. Carbon dioxide and other greenhouse gases are being added to Earth’s atmosphere at alarming rates as the world continues to burn crude oil, coal and natural gas. Indeed, the annual increase in carbon dioxide is at its highest rate ever. That has pushed the Earth out of radiative equilibrium – the state in which the heat arriving from the sun equals the amount of heat that returns to space. Because carbon dioxide and other greenhouse gases trap some of the heat trying to escape our atmosphere, that balance has been lost.
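The balance described above can be illustrated with a textbook zero-dimensional energy-balance model. The numbers below (solar constant, albedo, effective emissivity) are standard approximations, not figures from this article:

```python
# A textbook zero-dimensional energy-balance sketch of "radiative
# equilibrium": absorbed sunlight must equal emitted infrared.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.30      # fraction of sunlight reflected back to space

def equilibrium_temp_K(emissivity: float) -> float:
    """Surface temperature at which outgoing IR balances absorbed sunlight.

    A lower effective emissivity mimics greenhouse gases trapping
    outgoing heat, forcing a warmer surface to restore the balance.
    """
    absorbed = S0 * (1 - ALBEDO) / 4          # averaged over the sphere
    return (absorbed / (emissivity * SIGMA)) ** 0.25

print(round(equilibrium_temp_K(1.00)))  # ~255 K: airless "bare rock" Earth
print(round(equilibrium_temp_K(0.61)))  # ~288 K: with a greenhouse effect
```

The roughly 33-kelvin difference between the two cases is the natural greenhouse effect; adding greenhouse gases pushes the effective emissivity lower still, which is the warming the article describes.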

But the causes go beyond fossil fuels. Global agriculture is a growing and problematic source of methane and nitrous oxide, two powerful greenhouse gases. Earth’s population keeps expanding, and all of those people need to be fed.

And not all greenhouse gases are equal. Methane, for example, is a more potent greenhouse gas than carbon dioxide, but it also degrades in the atmosphere within a decade or so. Cutting methane emissions would, therefore, show results relatively quickly.
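As a rough illustration, treating methane removal as simple exponential decay with an atmospheric lifetime of about 12 years (a commonly cited figure; the article says only "within a decade or so") shows how quickly a pulse of emissions fades:

```python
import math

# Sketch: approximating methane removal as first-order decay with
# a ~12-year atmospheric lifetime (an assumed, commonly cited value).
LIFETIME_YEARS = 12.0

def fraction_remaining(years: float) -> float:
    """Fraction of an emitted methane pulse still airborne after `years`."""
    return math.exp(-years / LIFETIME_YEARS)

# Most of a methane pulse is gone within a few decades, which is why
# cutting methane emissions pays off relatively quickly.
for t in (10, 25, 50):
    print(f"after {t:>2} yr: {fraction_remaining(t):.0%} remains")
```

Carbon dioxide, by contrast, has no comparable single lifetime; a large fraction of a CO2 pulse persists for centuries, which is the contrast the next paragraph draws.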

Carbon dioxide and nitrous oxide are different, and their warming effects will remain intact for future generations. This is because they are essentially chemically unreactive in the troposphere, or the lowest part of Earth’s atmosphere, where we live. Moreover, carbon dioxide is most soluble in cold oceanic waters, which are diminishing. Warmer ocean waters mean the oceans can absorb less carbon dioxide.

Estimates of this are highly uncertain, but the full warming effect of an emission may not be felt for several decades, if not centuries.

What does this all mean?

Sea levels will likely continue to rise for many centuries into the future.  Don’t get wet.


Defunding the Chemical Safety Board is a Bad Idea and is Likely to Increase Chemical Disasters

By Jacinta Conrad, Associate Professor of Chemical and Biomolecular Engineering, University of Houston

The two explosions in Crosby, Texas, on August 29, 2017 weren’t loud or massive – just gentle pops of sound. Even such small pops, however, were sufficient to disperse chemicals involved in the manufacture of organic peroxides into the air. First responders at the scene reported respiratory irritation and fell ill after breathing the smoke seen at the perimeter of the plant site.

The explosion at the Arkema plant in Crosby was a result of flooding caused by Hurricane Harvey, one of the costliest hurricanes to hit the mainland United States. The plant lost electricity early in the storm, leading to the shutdown of refrigeration systems. After backup power generators also failed, volatile peroxides – used in the creation of plastics for a wide range of consumer products – heated up and became combustible. Result: explosions. Over several days, 500,000 pounds of organic peroxides in nine trailers burned at the plant.

Hurricane Harvey hit Houston hard last year, and the Arkema explosion was only one incident. While much of the world’s attention was focused on the breathtaking rescues carried out by first responders and volunteers, chemical engineers in and near Texas also thought – with great concern — of the many chemical plants located around Houston. Were plants and facilities designed to handle challenges posed by severe flooding? Were necessary safety processes in place to ensure that operations could be safely halted?

Other recent high-profile incidents in Texas –  most prominently, the explosion at the West Fertilizer Company in 2013, which caused 15 deaths and over 260 injuries – have reinforced the idea that safety must be a central focus of the chemical industry. Competition, however, makes it difficult to share best practices across companies. In addition, changes to improve safety are often reactive – made in response to catastrophic incidents such as those at the Arkema or West Fertilizer plants and focused on minimizing consequences after damage.

The U.S. Chemical Safety Board (USCSB) has a critical role to play in surmounting these challenges. Inspired by its vision of “a nation safe from chemical disasters,” the USCSB conducts investigations of industrial chemical accidents that are focused on identifying root causes. Its board members, who have significant experience and expertise in one or more of the fields of chemistry, engineering and hazard management, use the information collected from the investigations to make safety recommendations designed to reduce the risk or consequences of accidents. Importantly, the nonpartisan USCSB does not regulate or fund chemical safety. Instead, the Chemical Safety Board acts as an independent, objective party in assessing chemical accidents and recommending better practices.

Thus it functions analogously to the National Transportation Safety Board (NTSB), which investigates transportation accidents. The NTSB does not regulate or fund transportation. Nonetheless, its recommendations have greatly improved transportation safety over its 51 years – ranging from anti-collision technologies in aviation and rail to airbag and brake-light improvements in automobiles. These advances have saved lives by identifying ways to make industry better.

The history of the USCSB is shorter – it was started in 1998 – but it has still played an important role in improving safety in the chemical industry. As one example, its 19 recommendations after the West Fertilizer explosion and fire have already led to improvements in hazardous materials training for firefighters across multiple delivery platforms. Likewise, its 26 recommendations after the explosion at BP America’s Texas City refinery in 2005 led to changes in practices sanctioned by key professional organizations and spurred the development of two new performance indicator standards for process safety by the American National Standards Institute. The Chemical Safety Board’s investigation into the Arkema incident is ongoing.

Unfortunately, the 2019 budget proposed by the Trump administration zeros out funding for the USCSB. Its requested fiscal-year funding, $12 million, is modest for a government agency. Likewise, the 2018 budget also proposed to defund the USCSB. This sustained effort reflects an ongoing de-emphasis on chemical safety; as a second example, Environmental Protection Agency Administrator Scott Pruitt has indefinitely delayed bans on the use of three hazardous chemicals shown to be toxic to human health.

Chemical production is an essential component of modern society. This does not mean that there is not room to improve practices in manufacturing, storing, and shipping chemicals, and in ensuring the safety of those who work in or live near chemical plants. The vantage of an independent group is crucial for identifying those aspects that can and should be improved.

Defunding the USCSB, which provides this indispensable independent perspective, is likely to hinder efforts to identify the causes of chemical accidents – especially in low-regulation locales. Moreover, it is also likely to worsen our ability to respond in previously unforeseen events, such as the heavy flooding of Harvey, that may be exacerbated by climate change. Finally, it is likely to cost lives in future incidents.

Jacinta Conrad is an Associate Professor of Chemical and Biomolecular Engineering at the University of Houston, where she holds an Ernest J. and Barbara M. Henley chaired professorship. Her research explores the fundamental science underlying the transport of micro- and nanoscale particles, viruses, and bacteria, with energy-related applications in sustainable materials processing and in bioremediation. She is the co-PI of the NSF-sponsored Research Experiences for Undergraduates Site: Materials for Sustainability in Energy and Manufacturing, involving engineering faculty working in sustainability across four departments. At UH, Jaci teaches classes on engineering mathematics, fluid mechanics, and heat and mass transport. She received an S. B. in Mathematics from the University of Chicago and an M. A. and Ph.D. in Physics from Harvard University.

Big Sports Events Have Big Environmental Footprints. Could Social Licenses To Operate Help?

By Gina S. Warren, Associate Professor, University of Houston Law Center

Minneapolis will host the 2018 National Football League (NFL) Super Bowl in February. Pyeongchang, South Korea will host the 2018 Winter Olympics that month, followed next summer by the FIFA World Cup in Russia. A growing number of mega sporting events promise fame and fortune to the host cities, with the lure of funding for new infrastructure and community projects and a boost in tourism for the event and beyond.

Just as the athletes compete in their sport’s biggest showcase, cities dream of urban revitalization, an improved economy and a better quality of life for residents. Past experience has shown, however, that host cities do not always reap social and economic benefits from these events. Instead, these major sporting events generate significant unforeseen – or at least unaccounted for – environmental consequences.

The environmental consequences involve everything from building new stadiums, hotels, parking lots and other infrastructure to handling the sanitation from all those new toilets. The use of “social licenses” – a practice adapted from mining and energy industries working in developing nations – could help.

Carbon emissions that contribute to climate change are a significant factor. While some organizers tout policies for offsetting carbon emissions generated by an event, this is little comfort in a time when the world needs to reduce carbon emissions, not just offset extra carbon generated by an event. Further, those offsets do not account for the heaps of trash and food waste, energy consumption to power the stadium or water consumption for toilets and to irrigate the fields and nearby areas. It is separate from the consumption, pollution and waste of constructing new buildings, parking lots, apartments and other structures. One research study conducted by professors at Cardiff University in the United Kingdom looked at different models to assess the ecological footprints of a major event – the Football Association Challenge Cup Final (English domestic football). The impact elements included travel, food and water, infrastructure and waste.

The study found that the average attendee generates a footprint seven times greater than someone going about normal, everyday activity. Increased travel by event visitors accounted for the biggest part of this significant increase. The consumption of food and drink, and the energy and resources required to produce that food and drink, makes up the next largest part of the footprint.

The study apportioned a very small footprint to the stadium itself (here the Millennium Stadium in Cardiff, Wales), in part because the footprint was amortized over a 100-year life span. This is a very optimistic view. Instead, it is more likely that the stadium will become obsolete within a few decades, as new technologies are introduced, new urban development occurs and cities offer lavish facilities to lure teams looking for a new home. NFL stadiums in the United States, for example, have a median age of 31 years before they are replaced.  In any event, it is difficult to assess the global environmental and economic impact of these events, let alone to try to create a strategy to address them.
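The sensitivity to that amortization assumption is easy to see with hypothetical numbers (the footprint total and event count below are illustrative, not taken from the Cardiff study):

```python
# Illustrative sketch: a stadium's share of one event's footprint
# scales inversely with the lifespan over which construction
# impacts are amortized. All numbers here are hypothetical.
def per_event_share(construction_footprint: float,
                    lifespan_years: float,
                    events_per_year: float) -> float:
    """Footprint attributed to one event under straight-line amortization."""
    return construction_footprint / (lifespan_years * events_per_year)

TOTAL = 100_000.0   # hypothetical construction footprint (arbitrary units)
EVENTS = 25         # hypothetical events hosted per year

optimistic = per_event_share(TOTAL, 100, EVENTS)  # the study's 100-year span
realistic  = per_event_share(TOTAL, 31, EVENTS)   # median NFL stadium age
print(realistic / optimistic)  # roughly 3.2x larger per-event footprint
```

In other words, swapping the 100-year assumption for a 31-year one more than triples the stadium's per-event share, whatever the absolute numbers happen to be.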

Lastly, the ambition of hosting a mega sporting event tends to encourage cities to relax their rules for urban development and restructuring. This may be because of the short timeframe for hosting the event, or it may be that cities receive significant internal and external pressure to satisfy their obligations for the event.

In the run up to the 2014 World Cup and the 2016 Olympics in Brazil, for example, politicians in Rio de Janeiro executed “flash-votes” that allowed the Legislative Assembly to push through emergency bills to (1) lift the ban on alcohol at stadiums; and (2) annul the laws that protect historical architecture and patrimony of certain existing stadiums. These emergency bills were approved without the usual mandatory public debate, resulting in the demolition of two historical structures – the Sambodromo and Maracana Stadiums – and their replacement with a new stadium. This not only reflects a disregard for community involvement, it is also disconcerting because much of the cost for these events is borne by public funding. In the United States, for example, sports stadiums have historically been funded through publicly subsidized financial mechanisms including general sales taxes. In Australia, much of the $30 million annual cost of holding the Formula 1 Grand Prix comes from public funds. Further adding insult to injury is the fact that most local residents cannot afford to attend these mega events, which are targeted toward the elite foreign traveler.

Little legal framework exists to regulate these transient pop-up cities created by mega sporting events. While there are a handful of United Nations treaties on sports, mostly recognizing the general right to participate in and have access to sporting and recreational events, no international treaty addresses the social, economic and environmental externalities. The closest is Agenda 21, adopted by United Nations (UN) member nations in 1992. At the 1992 Rio Earth Summit, many UN member states committed to environmental sustainability in economic development generally and adopted Agenda 21 as the framework for fulfilling this obligation. Agenda 21 is non-binding and voluntary but encourages all organizations – governmental and non-governmental, international, regional and local – to prepare their own version based on the framework provided. While it does not specifically address sporting events, the International Olympic Committee (IOC), working with United Nations Environment Programme, adopted its own Agenda 21 in 1999, following the general framework of the Rio Agenda 21 and providing a plan to improve socioeconomic conditions, conserve and manage resources and strengthen the role of major groups in each Olympic host country.

Agenda 21 provides a potential framework for sustainable development generally, but it does little to address the unique temporary nature of mega sporting events, and if the 2016 Olympics were any indication of its effectiveness, it falls well short of ensuring sustainable practices. Further, other than the IOC, it does not appear to have been adopted by any other major sporting organization.

With more sporting events on the horizon than ever before, it is time to more holistically address the pollution, waste, greenhouse gases and other negative consequences. Agreements between host city and event organizer often ignore key issues, and host cities are sometimes concerned that organizers will simply go on to the next city if they push too hard on specific terms.

So what might work? One possibility is the use of social licenses, a concept that originated with mining and energy industries operating in developing nations. After unbridled environmental damage – and the ensuing reputational hits – during the 1990s, the World Bank encouraged the industry to use social licenses. These social licenses, which are essentially ongoing agreements with local governments and other stakeholders to indicate local acceptance of a project, helped identify and address concerns about the environmental and human cost of the transitory mining and drilling activities.

Over the last few decades, societies around the globe have begun to shift to a more informed and involved form of decision-making, with an eye toward sustainable practices. Social licenses are part of that, legitimizing stakeholder decisions and providing a framework for managing expectations. The use of social licenses for mega sporting events could benefit all parties and allow for a fair allocation of the benefits and costs associated with the event. Some of the key elements of a social license that could apply include full disclosure and transparency of process; making environmental, social, and economic information available in the local language; early and meaningful community involvement in decision-making; a commitment to sustainable energy and environmental sensitivity, and longevity of community investments.

Although there is no silver bullet to prevent the negative side effects of these mega sporting events, implementing a social-license-to-operate mechanism could at the very least allow communities to identify and meaningfully analyze the costs and benefits associated with hosting the event early in the process.

Gina S. Warren is an associate professor at the University of Houston Law Center where she teaches classes in property law, oil & gas law, and domestic and international energy law. Her research explores the role of policy and regulation in the area of sustainable energy, with a focus on renewable energy, climate change, and distributed generation. Prior to entering academia, Warren worked for the international law firm of Perkins Coie, based in Seattle, Washington, where she litigated and advised on matters of energy and utility law. Warren holds a Bachelor of Science in Psychology from the University of Arizona and a Juris Doctorate from Rutgers School of Law.

Radioactive Waste And The Hidden Costs Of The Cold War

By David Rainbow, Assistant Professor, Honors College, University of Houston

Hanford, a dusty decommissioned plutonium production site in eastern Washington state, is one of the most polluted places in the country. The disaster is part of the inheritance of the Cold War.

A few months ago, a 110-meter-long tunnel collapsed at the site, exposing an old rail line and eight rail cars filled with contaminated radioactive equipment. This open wound in the landscape, which was quickly covered over again, is a tiny part of an environmental and human health catastrophe that steadily unfolded there over four decades of plutonium production. Big Cold War fears justified big risks. Big, secretive, nuclear-sized risks.

Hanford and other toxic reminders of the Cold War should serve as a cautionary tale to those who have a say in mitigating geopolitical tensions today, as well as to those who promote nuclear energy as an environmentally sustainable source of electricity. The energy debate must balance the downside – not just the risk of a nuclear meltdown but also the lack of a permanent repository for the still-dangerous spent fuel rods – with the environmental benefits of a source of electricity that produces no greenhouse gases. People on both sides of the issue have a vested interest in how the current geopolitical tussling over nuclear weapons plays out.

These days, fear of other countries is big again. North Korea’s nuclear detonations and intercontinental ballistic missile launches – the most recent just days ago – are explicit threats to the U.S. For his part, President Trump has responded with threats (and mockery) of his own, promising to rain down “fire and fury” on North Korea if Kim Jong Un follows through on his threats.

On the campaign trail last year, Trump called for the U.S. to “greatly strengthen and expand its nuclear capability.” Recent reports (which Trump denies) that the President has called for increasing our nuclear arsenal by 10 times are in line with this campaign pledge. According to the reports, Trump wants to return to the peak nuclear production of the 1960s, the height of the Cold War. While Trump’s statements on nuclear weapons have been inconsistent, the overall picture has been clear and in line with his general chest-thumping approach to foreign policy: We will do and say what we want. None of this rhetoric is conducive to making the world safer from nuclear weapons.

The saga of Russia’s connection to Trump’s presidential campaign continues, too. Again this past week we learned more about conversations between Trump’s people and the Russians during the election. Here it’s been the left that has most often drawn upon rhetoric to characterize Russia’s meddling – or “The Plot Against America” – that harks back to the conflicts of the last century. Secret plots, missile tests, Russian spies, insinuations of treason, radioactive materials. Put these together with the deep disagreements between the U.S. and Russia over the ongoing conflicts around the globe (Syria, Ukraine, and the significant military exercises conducted along NATO’s eastern border), and we are back, it seems, to the bad old days of the Cold War.

Even if, as we all hope, the “new Cold War” never gets hot, escalating tensions can have seriously harmful effects at home. The radioactive cave-in at the Hanford site earlier this year should serve as a reminder of that.

Nuclear refinement at Hanford began as a part of the Manhattan Project during World War II, the highly secretive plan to develop a nuclear bomb.

Initially, the drive to mobilize for war justified substantial costs, among them significant damage to human and environmental health in the U.S. resulting from the nuclear program. Hanford was integral to the program: its plutonium fell on Nagasaki. But after the end of the war, the scale of production at the site increased to a fever pitch thanks to the ensuing competition for global influence between the U.S. and the Soviet Union that became the Cold War.

Our gargantuan stockpiles of nuclear arms demanded gargantuan quantities of plutonium. Forty-five years of work at Hanford – from 1943 to 1987 – yielded 20 million uranium metal plugs used to generate 110,000 tons of fuel. The process also generated 53 million gallons of radioactive waste, now stored in 177 underground tanks at the facility, and created 450 billion gallons of irradiated waste water that was discharged onto “soil disposal sites,” meaning it went into the ground. Some of the irradiated discharge simply ran back to where it had originally been taken from, the nearby Columbia River. The Office of Environmental Management at the Department of Energy is currently overseeing a cleanup project involving 11,000 people. It is expected to take several decades and cost around $100 billion.

Kate Brown’s award-winning book, “Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters,” is a history of the Hanford plant and its Soviet doppelgänger, a plant in the Ural Mountains called Maiak. Brown points out that over the course of a few decades, the two nuclear sites spewed twice the radiation emitted by the Chernobyl explosion. Yet few Americans at the time, even those involved in plutonium production, realized this was going on or how dangerous it was.

Naturally, the hidden nature of the project meant that information was hard to come by. As Brown shows, even the experts, managers and scientists involved directly in overseeing the production process knew little about the seriousness of the risk. Doctors studying the effects of radiation on people didn’t have access to the research related to environmental pollution. Scientists studying fish die-offs had no way of connecting their findings to the deteriorating immune systems of humans in the same areas. Most poignantly, researchers measuring the effectiveness of nuclear bombs on the enemy did not communicate with researchers measuring the threat of nuclear bombs on the workers making them. Consequences for the workers were grave. Hanford and Maiak’s hidden mega-pollution was collateral damage in the fight to win the Cold War. Russia, like the U.S., is still living with the damage, and trying to bury it, too.

Within two days of the tunnel collapse at the Hanford site this past May, workers filled the breach with 53 truckloads of dirt and narrowly avoided a radiological event. However, these eight railcars are hardly the only waste left behind in the U.S. from our cold conflict with the Soviet Union, in which our willingness to risk human and environmental health was proportionate to our fears. It’s going to be a while before it’s all cleaned up. In the meantime, hopefully our leaders will work to keep the new Cold War from getting any worse.

David Rainbow is an Assistant Professor in the Honors College at the University of Houston. He teaches and writes about modern Russian and Eurasian history. Prior to coming to Houston in 2015, he held postdoctoral fellowships at Columbia University’s Harriman Institute for Russian, Eurasian and East European Studies and at New York University, and was a writer in residence at the Jordan Center for the Advanced Study of Russia at NYU. He holds an M.A. in European intellectual history from Drew University, and a Ph.D. in Russian history from New York University (2013). Before becoming a historian, he worked as an engineer aboard a merchant ship on the Pacific and as a rancher in western North Dakota, and has lived in Russia and Siberia several times.

What Harvey Taught Us: Lessons From The Energy Industry

By Dr. Latha Ramchand, Dean, Bauer College of Business, University of Houston and Dr. Ramanan Krishnamoorti, Chief Energy Officer, University of Houston

The last week of August 2017 will remain etched in Houston’s memory for a long time to come. The week started with a total solar eclipse that captured the nation’s imagination. Then, Harvey made landfall on Aug. 25.

Dumping more than 51 inches of rain in some areas, Harvey gave new meaning to flooding. The storm damaged more than 148,000 single-family homes, 163,000 apartments and more than 500,000 vehicles, and was responsible for 88 fatalities.

The storm’s impact on the energy supply chain was significant, too. Airports, roads and freight were affected, including about 10% of the nation’s trucking business. Harvey shut down 22% of the nation’s refining capacity, 25% of oil production in the Gulf of Mexico, and half of both organic chemical and plastics resin production and natural gas production in the Eagle Ford. Fuel shortages (perceived or real) hit Houston, Austin and Dallas.

So how did the industry deal with the disaster? We interviewed key decision-makers from a dozen companies to find out what they had learned from the past and what should be changed before future storms. And we asked for their thoughts on keeping and growing their organizations along the Gulf Coast, a geographic region prone to severe weather.

This wasn’t the industry’s first test, although past emergency management plans mostly addressed hurricane-force winds and storm surge. Massive rain and inland flooding on the scale witnessed during Harvey were unprecedented. In addition to facilities and operations, approximately 10% of industry personnel were affected, as was access to offices and industrial sites. In short, mobility was curtailed for seven days for more than six million people. Harvey was unique.

After Superstorm Sandy, the Department of Energy (DOE) asked the National Petroleum Council (NPC) to study emergency preparedness, which led to a series of recommendations. These revolved around coordinating industry efforts with those of federal, state and local agencies to make sure emergency management plans reflect energy system interdependencies in responding to regional and national disruptions.

The American Petroleum Institute has protocols for members to use during emergencies while maintaining compliance with antitrust laws that limit information-sharing across companies. During emergencies, the electric power utilities operate under rules set by the Federal Energy Regulatory Commission and in the state of Texas by ERCOT, which operates most of the state’s electric grid.  In addition, in Texas the Fuel Team, a state level coordinating council, brings together industry and the public sector to help coordinate relief efforts, including the ports, Federal Emergency Management Agency, the Department of Public Safety, Department of Transportation, health care and local emergency management officials.

While the framework for disaster planning was in place, Harvey tested its effectiveness.

Is This A Normal Business Cycle, Or Are We Seeing Structural Changes To The Energy Business?

By William “Bill” Maloney, Energy Advisory Board, University of Houston

There are many aspects of what we are experiencing today in energy markets that can lead you to believe we are simply in another commodity cycle. In years past we have seen the low-cost producers maintain production to capture market share. We have also seen production cuts aimed at balancing supply and demand. Today we are approaching a delicate supply-and-demand balance. We see oil prices firming as a result.

However, I do not believe this is the entire story. My view is that there are four factors impacting the energy business that will lead to long-term structural change. They are:

  1. Changing of the guard: We are witnessing a change in the type of individual running some of the largest energy companies. ExxonMobil, Chevron, Shell, Total and Statoil are all currently run, or about to be run, by people who have significant downstream experience. Why is that important? The downstream sector of the energy business (refining, chemicals and marketing) has had to live with thin margins forever. So the focus on cost cutting and a relentless drive for improvement has always been part of downstream’s DNA. Now the same drive to control costs and improve profitability will be happening across all sectors within these companies – upstream, downstream and new energy.
  2. Costs: We have experienced a large reduction in the cost of doing business, especially in the upstream sector. Service companies are hurting and struggle to make a profit at current commodity prices. As supply and demand come into balance and prices firm, we will likely see some increase in costs. However, an argument can also be made that a significant percentage – perhaps up to 50% – of the cost reductions we have witnessed are both structural and sustainable. Many publicly traded companies are disclosing that they are now profitable at $50 a barrel. They have made changes to their businesses in the form of greater efficiency, fewer staff and the application of technology. In my view, there is no going back. Having worked hard to make these changes, companies are not likely to abandon all the good work they have done.
  3. Climate: Many oil and gas companies are working toward producing cleaner and greener energy. Many U.S. states and countries outside the U.S. are demanding a stoppage of, or significant reduction in, flaring. Companies are spending more money on various forms of clean and renewable energy. Looking toward the future, we can already see that power generation, heating in buildings and passenger cars are all changing in ways that will result in lower carbon usage in the decades to come. We are clearly on a long journey toward a lower-carbon society.
  4. Financial markets: We have just finished third-quarter earnings reporting. The financial markets are pushing companies for even more capital discipline and further improvements in returns. Some companies are almost bragging about their ability to lower costs and remain robust at current commodity prices. Right now only the best projects, especially in offshore oil and gas, are being funded. Non-core or non-competitive assets in company portfolios are being sold to others that see better profitability in them. The headlines are no longer about growth in reserves; the conversation is all about growth in profits.

These four factors will have a large impact on the energy business going forward and will lead to some structural change. Recently I was talking about this on a radio program and the interviewer asked, “What about the state-run companies? Will they be doing what you describe as well?” My answer was that, first, we are all aware that Saudi Aramco is preparing for an initial public offering (IPO). When that happens, Saudi Aramco will be subject to the same pressure from financial markets that I have mentioned. Beyond that, if any company – public, private or state-run – can increase profitability, bring down costs and produce cleaner energy, it is a win for all concerned. So my view is that state-controlled companies have as much to gain as public companies from running their businesses as efficiently as possible.

I would like to mention one more thing. The structural changes outlined above will not circumvent commodity cycles. Companies have adjusted to a low-price environment by cutting costs, lowering capital expenditures, deferring projects and laying off staff; some have even cut their dividends. There will come a time when this underinvestment manifests as a supply shortage. As a world, we use over 30 billion barrels of oil a year, and we are currently falling well short of replacing the reserves we produce. Additionally, oil fields naturally decline at about 5% each year, although I continue to marvel at how advances in technology enable the industry to slow that decline.
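The compounding effect of that decline rate is easy to underestimate. A minimal back-of-envelope sketch, using only the figures above (roughly 30 billion barrels of annual consumption, a 5% natural decline rate) and assuming flat demand with no new projects, shows how quickly a gap opens:

```python
# Illustrative sketch (not the author's model): how a 5% annual field
# decline compounds against flat demand when reserves are not replaced.
# The demand and supply figures are assumptions taken from the text.

def remaining_output(initial_bbl_per_year: float, decline_rate: float, years: int) -> float:
    """Annual output from existing fields after `years` of compound decline."""
    return initial_bbl_per_year * (1 - decline_rate) ** years

demand = 30e9          # barrels per year, held constant (assumption)
supply_now = 30e9      # existing fields currently meet demand (assumption)

for year in (5, 10):
    supply = remaining_output(supply_now, 0.05, year)
    shortfall = demand - supply
    print(f"Year {year}: existing fields supply {supply / 1e9:.1f} Bbbl/yr, "
          f"shortfall {shortfall / 1e9:.1f} Bbbl/yr without new projects")
```

Under these assumptions, existing fields would cover only about 18 of the 30 billion barrels needed after a decade, which is the scale of gap new investment has to fill.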

In any case, at some point in the next decade we could very well see a supply shortage due to the massive underinvestment we are witnessing at the moment. Related to this, some believe that shale in the U.S. can come to the rescue. I would not count on that. Today the onshore U.S. produces approximately 8% of total world oil production. It is hard to visualize a world where shale can take the place of a large portion of today’s conventional oil production.

In closing, many people ask me what the future will look like, especially for jobs in the energy industry. Bringing reliable energy to the world’s population will always be a priority for any energy company. The fundamentals of science and engineering will never go away. They are the foundation of the energy business.

Technology will improve, and as it does, it will only enhance our ability to safely and efficiently bring energy to the world. So my view is that while things may look tough for energy employment today, the future is bright. The world needs energy, and it needs smart, dedicated men and women to deliver it.

William “Bill” Maloney has been a member of the University of Houston’s Energy Advisory Board since 2014. He is currently on the Board of Directors of Trident Energy and serves as an energy advisor to Warburg Pincus. Bill retired from Statoil in 2015, where he was Executive Vice President, leading the business area Development and Production North America. Bill attended Syracuse University where he received an MS in geology.

Robin Hood Rides Again: Lifting The Electric Vehicle Tax Credit

By Chris Ross, Executive Professor, C.T. Bauer College of Business, University of Houston

The recently issued House GOP tax overhaul bill proposes to eliminate the $7,500 federal tax credit for battery electric vehicle (BEV) purchases. This subsidy was introduced in 2012 and applies only to the first 200,000 BEVs sold by each manufacturer.

A smaller tax rebate has been available for plug-in hybrid electric vehicles, or PHEVs, since 2016. In California, BEV manufacturers can also benefit from sales of clean air credits through the sale of zero emission vehicles, funded by manufacturers who sell the internal combustion engine vehicles that most people choose to drive.

These tax credits and other benefits are generally intended to reduce greenhouse gases and on-road emissions of toxic pollutants in urban areas. There is a widespread belief that BEVs represent the future and will steadily displace internal-combustion-powered vehicles in the global vehicle fleet.

BEV advocates worry that cutting the subsidies will slow the growth of electric vehicles. But the reality is more complex.

As important as tax rebates in promoting BEVs are the aggressive Corporate Average Fuel Economy (CAFE) standards imposed by the Obama administration in 2012 as a measure to “reduce our dependence on foreign oil,” as well as reduce emissions. These will require manufacturers’ sales of cars and light trucks to average 54.5 mpg in model year 2025, up from a mandated 35.5 mpg for model year 2017. The standards and their penalties are under review by the Trump administration, and in July 2017 the National Highway Traffic Safety Administration (NHTSA) of the Department of Transportation published the following in the Federal Register:

“NHTSA seeks comment on whether and how to amend the civil penalty rate for violations of Corporate Average Fuel Economy (CAFE) standards. NHTSA initially raised the civil penalty rate for CAFE standard violations for inflation in 2016, but upon further consideration, NHTSA believes that obtaining additional public input on how to proceed with CAFE civil penalties in the future will be helpful.”

There is a lot to like about BEVs. Neighbors of mine both recently retired after long careers as engineers for major oil companies and immediately acquired a Tesla Model S. They are enraptured with its design, extraordinary torque and technological sophistication. Doubtless, the tax credit helped them decide; there is gratification beyond economics in receiving money from, rather than sending it to, the IRS. But the Tesla Model S probably would have competed well with conventional vehicles in the luxury car niche even without the tax credit.

The tax credit is more important outside the luxury niche, but the CAFE standards may be more important still. A Bloomberg report estimated that GM was selling its Bolt BEV at a loss of $8,000 or $9,000 per vehicle, presumably hoping to recover the costs through lower penalties for failing to meet increasingly stringent CAFE standards. In this case the costs are being borne ultimately by GM’s shareholders. If the CAFE standards are relaxed and penalties reduced, GM may have to answer questions from shareholders on whether this was a wise use of resources.

The answer will probably be yes, on the grounds that battery costs are declining such that the BEV niche may expand beyond the luxury sector. Nevertheless, there remain barriers to BEV penetration rates:

  • Range anxiety: The Chevy Bolt takes about 10 hours to recharge from empty to its full range of 238 miles at a home 240-volt/32-amp charging unit in your garage, and there are a limited number of publicly available DC fast-charging stations to top up. This suggests that the Bolt is best suited for commuting or short trips, which limits its functionality.
  • Full cycle cost: The Bolt received very positive reviews but remains expensive for a small hatchback when fully equipped, relative to its internal combustion engine competitors. It would be economically more attractive if gasoline prices increase while the price of natural gas – which in many areas is the marginal source of the electricity that powers these vehicles — stays low. Thus, BEVs will be most competitive where gasoline is highly taxed and power is relatively inexpensive.
  • Social costs: Cobalt, which is required to stabilize lithium ion batteries, is largely found in parts of the Congo renowned for human rights violations and abusive workforce practices.
  • Battery recycling: As BEVs penetrate the vehicle fleet and batteries wear out, a new industry will be required to recycle the spent batteries and separate the component materials.
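The roughly 10-hour figure in the range-anxiety point above follows from simple power arithmetic. A hedged sketch: the Bolt's battery capacity (~60 kWh) and the charging efficiency (~90%) used here are illustrative assumptions, not figures from the text.

```python
# Back-of-envelope check of home charging time for a BEV on a
# 240-volt/32-amp Level 2 circuit. Battery capacity (~60 kWh) and
# charging efficiency (~90%) are illustrative assumptions.

def charge_time_hours(battery_kwh: float, volts: float, amps: float,
                      efficiency: float = 0.9) -> float:
    """Hours to charge from empty at a given AC supply, net of losses."""
    power_kw = volts * amps / 1000      # 240 V * 32 A = 7.68 kW
    return battery_kwh / (power_kw * efficiency)

print(f"{charge_time_hours(60, 240, 32):.1f} hours")  # ≈ 8.7 hours
```

That lands in the high single digits; charging slows as the battery nears full, which is consistent with the "about 10 hours" cited above.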

These barriers will put brakes on the penetration rate of BEVs.

There will doubtless be an angry response to the GOP proposal, but its effect will be minimal. Tesla will likely reach the 200,000 battery electric vehicle mark in early 2018, followed quickly by GM and Nissan, so killing the rebate this year will only slightly advance the schedule for eliminating the tax credit.

There is also an issue of equity. The $7,500 tax rebate is most valuable to high income people, but it is paid for by the rest of us in a reverse Robin Hood move of robbing the poor to give to the rich. Eliminating it will rob the rich of this perk, and the money saved can be put to more fruitful and equitable uses.

Hopefully the administration will seek out other situations that are regressive, where high-cost energy solutions favored by the rich are paid for by spreading the costs over rich and poor alike.

Chris Ross is an Executive Professor of Finance at the C.T. Bauer College of Business at the University of Houston, where he teaches classes on strategies in the oil and gas industry. He also leads research classes investigating how different energy industry segments are creating value for shareholders. He served on the Program Committee of the Offshore Technology Conference as Chair of the Marine Technology Society OTC Sub-Committee from 2008-13, when he was also Co-Chair of the Energy Policy Sub-Committee of the Greater Houston Partnership’s Energy Collaborative. From 2012-15, he served as Board Chairman of the River Oaks Chamber Orchestra and remains on the Board and the Executive Committee.