Amnesty and New Violence in the Niger Delta

by Rebecca Golden-Timsar, Associate Director, Graduate Certificate in Global Energy, Development & Sustainability (GEDS), University of Houston

Hoping to quell a violent insurgency aimed at the Nigerian government and the oil industry in the Niger Delta, the Nigerian presidency implemented an unconditional amnesty in 2009, offering a clean slate to militants whose demands for resource control, environmental justice and sustainable socioeconomic development had resulted in massive regional disruption.

I have been conducting research in the Niger Delta for the past 20 years, and my latest trip there in early 2018 found ample evidence that the amnesty hasn’t worked as planned. The negotiated amnesty and resulting fragile peace are primed for collapse, while crime and oil theft remain serious problems.

Then-President Umaru Musa Yar’Adua introduced the Presidential Amnesty Program (PAP), also known as the Niger Delta Amnesty Program (NDAP), as a disarmament, demobilization and reintegration program in response to the violence that had escalated over the prior decade and intensified after Ogoni environmental rights activist Ken Saro-Wiwa was executed by a military tribunal in 1995.

The amnesty was originally designed to last only five years, but it remains in effect.

In the 18 months leading up to the 2009 amnesty deal, world crude oil prices topped $145 per barrel while the insurgency cut Nigeria’s production capacity by 900,000 barrels per day (about 30% of 2007 output), dramatically impacting the national treasury. Although the amnesty precipitated a cessation of hostilities against the federal government and the oil industry, the results are fraught with the makings of new violence.

Approximately 30,000 people in the Niger Delta enrolled in the PAP as ex-militants. However, only 2,700 weapons were surrendered. Some militants, fearing the program and its potential repercussions, abstained from participating. I found three potentially explosive problems with the amnesty as it relates directly to the reintegration of ex-combatants: reinforcement of militant hierarchies and commodification of violence; substitution of criminality for militancy and ongoing communal tensions; and professionalization of illegal oil lifting from Nigeria’s current production.

Reinforcement of militant structures and organizations

Under the agreement, former combatants were promised monthly stipends and job training. But the payment system is hampered by challenges. The extended duration of the payments – almost 10 years – and the methods by which they are distributed reinforce militant hierarchies rather than dismantling them and helping to reintegrate the former militants into society. At the outset, the federal government of Nigeria reportedly made lump sum payments to ex-commanders, who were charged with distributing the cash to their ex-combatants. This system was challenged in 2015 by mid-level commanders claiming corruption in the payment system and in the granting of large pipeline security contracts to top commanders, with little trickle-down effect.

A new system was devised to directly deposit the payments to the former combatants’ bank accounts. But this was also challenged by the ex-militants, who accused commanders and the banks charged with the distribution of collusion and shortchanging payments. The lump sum cash payment system was resumed in 2017.

This is problematic on several levels. First, paying ex-commanders directly maintains fighting organizations and power structures. Second, the continued amnesty payments reinforce patronage networks and create vehicles for political power and political violence ahead of the 2019 presidential elections.

Finally, the stipends have morphed into a cash-for-peace system that is not sustainable, turning violence into a commodity.

Exchanging militancy for criminal behavior and community tensions

The top-down cash distribution creates and re-creates potential rivalries through discretionary and often opaque cash disbursements. By bolstering ex-commanders’ control, the former fighting organizations are re-created and able to leverage their power over the government.

This has resulted in fresh threats and, eventually, attacks on the military and oil installations, as well as hostage taking, with direct consequences for oil production at a time when lower oil prices have already strained Nigerian coffers.

When stipends were not paid for several months in 2016, ex-combatants quickly slipped back into old patterns of resistance under the banner of ‘new’ groups. The Niger Delta Avengers, Red Scorpions and the Niger Delta Greenland Justice Movement all rose in 2016, attacking the Forcados pipeline installations in the western Niger Delta and causing national production to plummet to a 30-year low of 1.1 million barrels per day. After the arrears were paid, these groups fell somewhat silent again.

Further, the relatively large monthly stipend discourages ex-combatants from getting a job: work, even for professionals, generally pays less than the amnesty stipend of 65,000 Naira per month, equivalent to about $180 in U.S. dollars. An average schoolteacher in Nigeria earns 18,000 Naira, or about $50.

The sizeable stipends – coupled with the limited availability of and access to skills training under the amnesty agreement, the lack of fundamental improvement in regional socioeconomic development and the growing circulation of small arms – only sustain the fighting structures and the capacity to strike. Consequently, the marginalized warrior identity, fundamental to the protracted violence, is also sustained.

Because there haven’t been sufficient sustained reintegration efforts in the way of training and job creation, there is an increasing perception of criminality in the Niger Delta, particularly in the oil capital, Port Harcourt. Reports of the kidnapping of prominent locals and their family members abound, as do reports of increased armed robbery. Additionally, former combatants continue to turn to gangs (known locally as cults or campus cult organizations), creating altered, if not new, layers of communal rivalry as these gangs battle for turf.

Further, the amnesty program’s lack of full participation from some commanders and their militants, along with the limited surrendering of weapons, generates additional communal rivalries and violent clashes, both between and within militant and gang hierarchies.

Illegal oil lifting

Finally, illegal oil lifting (known as bunkering) has been increasingly professionalized and militarized: there are organized underground labor unions for both crude and refined products; there are well-defined levels of buy-in investment for the lifting and marine transport activities, from the pipeline tappers, pumpers and speedboat drivers to offshore tankers, captains and document forgers; there are set payoff calculations for the players, including the Nigerian military’s joint task force; and there are security details for each phase of the operation.

Current bunkering estimates range from 10% to 15% of output, or a minimum of 200,000 barrels per day (roughly the total production of Trinidad), out of the official production rate of just over 2 million barrels per day in early 2018.
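The arithmetic behind that floor is straightforward (an illustrative calculation based on the figures above, not an additional estimate):

\[
0.10 \times 2{,}000{,}000 \ \text{bpd} \approx 200{,}000 \ \text{bpd}, \qquad 0.15 \times 2{,}000{,}000 \ \text{bpd} \approx 300{,}000 \ \text{bpd}.
\]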

Despite the decreased hostilities ushered in by the amnesty, Niger Deltans report that since the inception of the amnesty, the federal government’s military presence has broadened rather than diminished. They blame the military and the politicians that control it for the majority of the bunkering activities and for generating the conditions for the current reciprocal racketeering.

The military presence, the persistence of militant hierarchies and poverty combine to maintain social disorder and a security economy – potent ingredients for renewed petro-violence.


Eyes in the Sky Offer a Dramatic Picture of Energy Use

by Ryan Kennedy, Associate Professor of Political Science, University of Houston

Some readers will remember the dramatic change that took place with computer access in the 1980s and 1990s. Computers were once large machines that took up entire rooms and were available only to major corporations, government organizations and universities. This changed dramatically in the 1980s with the introduction of the personal computer: a much smaller machine still capable of advanced computations at what was, for the time, amazing speed.

Today we may be experiencing a similar revolution, but this time with satellites, and this revolution will have important implications for the energy industry. Two interrelated trends are driving it. First, governments and corporations are opening up the data collected from their satellites for public use. One of the most popular examples of this is the Night Lights dataset, provided by the National Oceanic and Atmospheric Administration (NOAA). Originally collected to detect cloud cover for military purposes, the data is now released by NOAA as a global map of the world as it is lit at night – producing dramatic illustrations of global energy usage, like the map of North and South Korea below.


The second trend has the potential to be even more disruptive. Much as the microprocessor opened up computing to the masses, the development of picosatellites – small, low-cost satellites that can be used for a variety of purposes – has the potential to do the same for satellites. Planet, for example, is a private company that operates a constellation of satellites constantly orbiting the earth to collect high-resolution pictures of the planet at all times. From this information, it designs computer algorithms to monitor supply chains, natural disasters and a variety of other metrics that may interest other companies. Everyone from NASA to SpaceX is now trying to encourage the development and deployment of smaller and smaller satellites that can do everything from monitoring pollution to creating an artificial meteor shower.

The explosion of satellite data has large potential impacts on research and policy in the energy arena. Eugenie Dugoua at Columbia University, Johannes Urpelainen at Johns Hopkins University’s School of Advanced International Studies and I approached a specific application of this satellite data in our forthcoming article in the International Journal of Remote Sensing. We used the data from the Night Lights dataset to explore the extent to which it could be used to track electrification patterns among villages in rural India. This was the largest attempt to validate the data on a sub-national level, and our results suggested satellite data could be used with reasonable success to track the progress of rural electrification throughout India.

This suggests policymakers can use such data to gain nearer-real-time monitoring of the progress of their policies, without having to wait for the next census.
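For readers curious how such village-level monitoring can be set up in practice, here is a minimal sketch of the general zonal-statistics approach – not the exact pipeline from our paper – assuming hypothetical file names, an illustrative radiance threshold and the open-source geopandas and rasterstats libraries:

```python
# A minimal sketch of village-level night-lights analysis, NOT the authors'
# actual pipeline. File names, the radiance threshold and the choice of
# libraries are illustrative assumptions.
import geopandas as gpd
from rasterstats import zonal_stats

# Village boundary polygons (hypothetical file) and a night-lights raster,
# e.g. a local copy of an annual composite (hypothetical file name).
villages = gpd.read_file("villages.geojson")
stats = zonal_stats("villages.geojson", "nightlights_annual.tif", stats=["mean"])

# Attach the mean radiance inside each village polygon to the village table;
# polygons that miss the raster yield None, which we record as NaN.
villages["mean_radiance"] = [
    s["mean"] if s["mean"] is not None else float("nan") for s in stats
]

# Flag villages as likely electrified if mean radiance exceeds a threshold.
# The value below is purely illustrative; in practice it must be calibrated
# against ground-truth electrification surveys, and results depend heavily
# on how accurate the village boundary geometries are (see the caveats below).
RADIANCE_THRESHOLD = 0.5
villages["likely_electrified"] = villages["mean_radiance"] > RADIANCE_THRESHOLD

print(villages[["mean_radiance", "likely_electrified"]].head())
```

The caveats discussed below, especially the dependence on accurate village boundaries and on a steady electricity supply, apply directly to the thresholding step in a sketch like this.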

There were, however, some caveats. First, we noticed that the capability of the satellite data to capture the development of rural electrification was conditioned on the methods used for analysis. In particular, the performance depended greatly on how good the available geographic information was for the actual shape of the village. Second, we noted that the capability of the satellite data to detect electrification was conditioned on the steadiness of the regional electricity supply. This suggests satellite data works better in areas that are more developed and have access to high quality connections. Finally, even though some scholars have used Night Lights to detect the level of economic development for regions, we find that it is not a very strong indicator in rural India, where the government has made a strong push to electrify poorer villages.

All of these findings point to issues that policymakers and corporations need to be aware of as the satellite revolution unfolds. While satellite data can do a lot for us, the ability to develop good proxies for events on the ground still depends on our ability to directly capture the relevant comparison information. Satellite data may not replace traditional monitoring, but it will likely provide a way to get data more quickly and cheaply between traditional data gathering periods.

It also provides a warning about the limitations of satellite data collection. Careful validation is crucial for understanding what the satellites are actually capturing with their images and making sure the data means what we think it does.

Much like the hype about “Big Data,” managers should beware of latching onto this data before its utility has been established.

We must also be aware of the context around the data collected. As we found, policies intended to electrify poorer villages undermined our ability to use the satellite data for measuring economic wellbeing, since some villages gained electricity access exactly because they were underprivileged. As with any data source, we need a clear understanding of the process that generates the data we observe.

We are moving into a potentially revolutionary era when it comes to the accessibility of data from satellites. With careful study and evaluation, this data can greatly assist corporations and governments as we attempt to pursue policy goals and monitor how our world works.

Plastics Recycling: Could The Future Be In India?

by Ramanan Krishnamoorti, Chief Energy Officer, University of Houston

On a recent visit to India I made two striking observations: First, in the smaller cities and on national highways, plastic bags were everywhere. Plastic pollution was rampant. Second, even as the Indian government’s pro-growth policy calls for the increased use of plastics – plastics are, in effect, a proxy for economic growth – the country’s plastics recycling industry is booming, spread across an informal amalgam of street pickers, small start-ups and non-governmental entities focused on the secondary-use economy.

India isn’t alone in its efforts to deal with plastic waste. About 75% of plastic waste in the U.S. ends up in landfills, and less than 10% is successfully recycled. (Most of the rest is combusted for energy.)

Plastics are lightweight, versatile and durable but in spite of their ubiquitous presence and critical role in many of our technological advancements – from automobiles and computers to replacement heart valves – they are now seen as a challenge to animals, marine life and future generations of humans.

Recent reports of plastics and microplastics pollution in every remote corner of the oceans have raised public awareness of the challenges posed by our increased use of synthetic plastics. In some cases this has prompted calls for more biodegradable plastics to replace synthetic plastics. However, a UN report in 2016 indicated that biodegradable plastics are not a panacea for the challenge of plastic litter in the ocean.

Even so, biodegradable plastics and those that are easier to recycle or repurpose will be important for reducing other waste streams, and science has responded.

A number of researchers are working on the problem. On the policy side, a growing number of cities in the U.S. and Europe have banned single-use plastic bags. India, too, is struggling to deal with these ubiquitous carry-alls.

Some cities and regions of India have banned these ultra-thin bags – which are made of polyethylene, a non-biodegradable petrochemical product – and metropolitan areas, some state governments and the national government are focused on the difficult task of enforcing the bans.

India’s informal plastics recycling economy has instead focused on water and shampoo bottles, which are easier to gather and process and far more lucrative than the lightweight bags. But the country also has spawned some of the most creative thinking about how to deal with this thorny issue.

And all of those efforts come amidst a government push to actually increase the amount of plastics in Indian society.

The average Indian uses approximately 25 pounds of plastics each year, about a tenth of what an average American uses. The Indian government has set the goal of doubling the per capita plastics consumption by 2022, presumably a surrogate measure for economic advancement and increased advanced manufacturing.

More plastic represents more wealth.


Figure 1: Per capita plastic products consumption (kg/person)

Recent estimates predict a 10% compound annual growth rate (CAGR) in plastics consumption over the next five years, mirroring similar growth in the preceding five years. On the other hand, local governments are responding to public outrage, including by banning plastic bags – among them ultra-thin polyethylene bags – and Styrofoam-based products. The national government is also considering banning polyvinyl chloride, or PVC, a plastic used in infrastructure building that, when improperly disposed of, releases toxic compounds into the environment.
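For a sense of what that rate implies (an illustrative calculation, not a figure from the cited estimates), a compound annual growth rate r sustained for n years multiplies consumption by (1 + r)^n:

\[
(1 + 0.10)^{5} \approx 1.61,
\]

that is, roughly 60% more plastic consumed per year at the end of the five-year period than at the start.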

That’s just one example of why India has long been called the land of contradictions. The country’s love-hate relationship with all things plastics is no different.


The street picker-based recycling economy, along with the various bans, has kept India’s battle against plastic pollution going. At the other end of the spectrum, the country is home to some of the most innovative thinking about plastics recycling. Clearly the economic and developmental goals of India, if not the world, require a fresh approach to changing the story of plastics.

That approach might be found here. Banyan Nation, a plastics recycling start-up from the Indian city of Hyderabad, stunned the world by winning the Dell People’s Choice Award for Circular Economy Entrepreneur as part of the Circulars Awards at the World Economic Forum in Davos.

The five-year-old company is known for its work with Tata Motors in recycling automotive bumpers and for working with the French cosmetics company L’Oréal to recycle shampoo bottles. But its true innovation lies in how it tackles three key challenges of plastics recycling in countries like India: solving the “last mile” of waste collection through a digital network; cleaning and sorting plastic waste economically enough to produce secondary-use pellets comparable to primary plastic; and partnering with large state-wide entities and multinational corporations on waste-to-product recycling for e-waste, automobile parts and consumer products packaging.

Such a systems level approach is perhaps the only way we are going to address the challenge of plastics pollution and ensure their continued use to fuel life-changing innovation across the world.

Count On Sea Levels To Keep Rising For A Long Time

by Bob Talbot, Professor of Atmospheric Chemistry and Director, Institute for Climate and Atmospheric Science, University of Houston

The world’s oceans have been warming for decades. Increasing water temperatures – driven by rising emissions of a variety of greenhouse gases – have caused the oceans to thermally expand. Glaciers and other previously frozen areas are melting, adding to and accelerating the rise of the ocean surface.

Fossil fuels are a key contributor to the warming, but they are not the only one.

Scientists now track ice across the Arctic and Antarctica, and what they are finding isn’t encouraging. Last year was the warmest year ever recorded for the global oceans, a phenomenon linked to a number of potential problems, including damage to important habitats such as coral reefs and risks for certain animal populations.

In addition, the Arctic Ocean is expected to be ice-free during the summer within the next 20 years.

Rising sea levels are among the most visible signs of climate change, as well as one that will have a dramatic impact on humans.

And it’s happening faster along the Gulf Coast – home not only to the nation’s fourth-largest city, Houston, but also home to much of the nation’s critical energy infrastructure – than anywhere else in the United States, between 5 millimeters and 10 millimeters per year.

Eventually, cities such as Galveston will be underwater, and the rising waters also will impact the Port of Houston’s operations in coming decades. This is the largest U.S. port in terms of tonnage handled each year, and the amount is increasing due to ship traffic through the expanded Panama Canal.

The same thing is happening along the Florida Keys, where some areas are already flooded today. Hurricane Irma accelerated the erosion of beaches and other low-lying areas. In many locations, residents now drive to the local grocery store through several inches of seawater on the roadways, and although efforts to raise the roads are underway, it won’t be cheap.

All of this is just the tip of the iceberg. The worst is yet to come. And the economic impact on the United States could be dramatic.

The causes are complex. That means the solutions – and the timeline for any possible recovery – are complex, too.

Fossil fuels are a major contributor to the problem. Carbon dioxide and other greenhouse gases are being added to Earth’s atmosphere at alarming rates as the world continues to burn crude oil, coal and natural gas. Indeed, the annual increase in carbon dioxide is at its highest rate ever. That has pushed the Earth out of radiative equilibrium – the state in which the heat arriving from the sun equals the heat that returns to space. Because carbon dioxide and other greenhouse gases trap some of the heat trying to escape our atmosphere, that balance has been broken.
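As a back-of-the-envelope illustration of what radiative equilibrium means (a standard textbook energy balance, not a calculation from this article), the absorbed solar flux must equal the thermal flux Earth emits back to space:

\[
\frac{S_0 (1 - \alpha)}{4} = \sigma T_e^{4},
\]

where S_0 ≈ 1361 W/m² is the solar constant, α ≈ 0.3 is Earth’s albedo, σ is the Stefan-Boltzmann constant and T_e ≈ 255 K is the resulting effective emission temperature. Greenhouse gases absorb part of the outgoing flux and re-emit some of it downward, which is why the surface is warmer than T_e; adding more of them tips the balance toward further warming until temperatures rise enough to restore it.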

But the causes go beyond fossil fuels. Global agriculture is a growing and problematic source of methane and nitrous oxide, two powerful greenhouse gases. Earth’s population keeps expanding, and those people need to be fed.

And not all greenhouse gases are equal. Methane, for example, is a more potent greenhouse gas than carbon dioxide, but it also degrades in the atmosphere within a decade or so. Cutting methane emissions would, therefore, show results relatively quickly.

Carbon dioxide and nitrous oxide are different, and their warming effects will remain intact for future generations. This is because they are essentially chemically unreactive in the troposphere, or the lowest part of Earth’s atmosphere, where we live. Moreover, carbon dioxide is most soluble in cold oceanic waters, which are diminishing. Warmer ocean waters mean the oceans can absorb less carbon dioxide.

Estimates of this are highly uncertain, but the full warming effect of an emission may not be felt for several decades, if not centuries.

What does this all mean?

Sea levels will likely continue to rise for many centuries into the future.  Don’t get wet.

Defunding the Chemical Safety Board is a Bad Idea and is Likely to Increase Chemical Disasters

By Jacinta Conrad, Associate Professor of Chemical and Biomolecular Engineering, University of Houston

The two explosions in Crosby, Texas, on August 29, 2017 weren’t loud or massive – just gentle pops of sound. Even such small pops, however, were sufficient to disperse chemicals involved in the manufacture of organic peroxides into the air. First responders at the scene reported respiratory irritation and fell ill after breathing the smoke seen at the perimeter of the plant site.

The explosion at the Arkema plant in Crosby was a result of flooding caused by Hurricane Harvey, one of the costliest hurricanes to hit the mainland United States. The plant lost electricity early in the storm, leading to the shutdown of refrigeration systems. After backup power generators also failed, volatile peroxides – used in the creation of plastics for a wide range of consumer products – heated up and became combustible. Result: explosions. Over several days, 500,000 pounds of organic peroxides in nine trailers burned at the plant.

Hurricane Harvey hit Houston hard last year, and the Arkema explosion was only one incident. While much of the world’s attention was focused on the breathtaking rescues carried out by first responders and volunteers, chemical engineers in and near Texas also thought – with great concern – of the many chemical plants located around Houston. Were plants and facilities designed to handle the challenges posed by severe flooding? Were the necessary safety processes in place to ensure that operations could be safely halted?

Other recent high-profile incidents in Texas – most prominently, the explosion at the West Fertilizer Company in 2013, which caused 15 deaths and over 260 injuries – have reinforced the idea that safety must be a central focus of the chemical industry. Competition, however, makes it difficult to share best practices across companies. In addition, changes to improve safety are often reactive – made in response to catastrophic incidents such as those at the Arkema or West Fertilizer plants and focused on minimizing consequences after the damage is done.

The U.S. Chemical Safety Board (USCSB) has a critical role to play in surmounting these challenges. Inspired by its vision of “a nation safe from chemical disasters,” the USCSB conducts investigations of industrial chemical accidents that are focused on identifying root causes. Its board members, who have significant experience and expertise in chemistry, engineering or hazard management, use the information collected from the investigations to make safety recommendations designed to reduce the risk or consequences of accidents. Importantly, the nonpartisan USCSB does not regulate or fund chemical safety. Instead, the Chemical Safety Board acts as an independent, objective party in assessing chemical accidents and recommending better practices.

Thus it functions analogously to the National Transportation Safety Board (NTSB), which investigates transportation accidents. The NTSB does not regulate or fund transportation. Nonetheless, its recommendations have greatly improved transportation safety over its 51 years – from anti-collision technologies in aviation and rail to airbag and brake-light improvements in automobiles. These advances have saved lives by identifying ways to make the industry better.

The history of the USCSB is shorter – it was started in 1998 – but it has still played an important role in improving safety in the chemical industry. As one example, its 19 recommendations after the West Fertilizer explosion and fire have already led to improvements in hazardous materials training for firefighters across multiple delivery platforms. Likewise, its 26 recommendations after the explosion at BP America’s Texas City refinery in 2005 led to changes in practices sanctioned by key professional organizations and spurred the development of two new performance indicator standards for process safety by the American National Standards Institute. The Chemical Safety Board’s investigation into the Arkema incident is ongoing.

Unfortunately, the 2019 budget proposed by the Trump administration zeros out funding for the USCSB. Its requested fiscal-year funding, $12 million, is modest for a government agency. Likewise, the 2018 budget also proposed to defund the USCSB. This sustained effort reflects an ongoing de-emphasis on chemical safety; as another example, Environmental Protection Agency Administrator Scott Pruitt has indefinitely delayed bans on the use of three hazardous chemicals shown to be toxic to human health.

Chemical production is an essential component of modern society. This does not mean that there is not room to improve practices in manufacturing, storing, and shipping chemicals, and in ensuring the safety of those who work in or live near chemical plants. The vantage of an independent group is crucial for identifying those aspects that can and should be improved.

Defunding the USCSB, which provides this indispensable independent perspective, is likely to hinder efforts to identify the causes of chemical accidents – especially in low-regulation locales. It is also likely to worsen our ability to respond to previously unforeseen events, such as the heavy flooding of Harvey, that may be exacerbated by climate change. Finally, it is likely to cost lives in future incidents.

Jacinta Conrad is an Associate Professor of Chemical and Biomolecular Engineering at the University of Houston, where she holds an Ernest J. and Barbara M. Henley chaired professorship. Her research explores the fundamental science underlying the transport of micro- and nanoscale particles, viruses, and bacteria, with energy-related applications in sustainable materials processing and in bioremediation. She is the co-PI of the NSF-sponsored Research Experiences for Undergraduates Site: Materials for Sustainability in Energy and Manufacturing, involving engineering faculty working in sustainability across four departments. At UH, Jaci teaches classes on engineering mathematics, fluid mechanics, and heat and mass transport. She received an S. B. in Mathematics from the University of Chicago and an M. A. and Ph.D. in Physics from Harvard University.

Big Sports Events Have Big Environmental Footprints. Could Social Licenses To Operate Help?

By Gina S. Warren, Associate Professor, University of Houston Law Center

Minneapolis will host the 2018 National Football League (NFL) Super Bowl in February. Pyeongchang, South Korea, will host the 2018 Winter Olympics that same month, followed that summer by the FIFA World Cup in Russia, with matches in host cities including Ekaterinburg. A growing number of mega sporting events promise fame and fortune to the host cities, with the lure of funding for new infrastructure and community projects and a boost in tourism during the event and beyond.

Just as the athletes compete in their sport’s biggest showcase, cities dream of urban revitalization, an improved economy and a better quality of life for residents. Past experience has shown, however, that host cities do not always reap social and economic benefits from these events. Instead, these major sporting events generate significant unforeseen – or at least unaccounted for – environmental consequences.

The environmental consequences involve everything from building new stadiums, hotels, parking lots and other infrastructure to handling the sanitation from all those new toilets. The use of “social licenses” – a practice adapted from mining and energy industries working in developing nations – could help.

Carbon emissions that contribute to climate change are a significant factor. While some organizers tout policies for offsetting the carbon emissions generated by an event, this is little comfort at a time when the world needs to reduce emissions, not just offset the extra carbon an event generates. Further, those offsets do not account for the heaps of trash and food waste, the energy consumed to power the stadium or the water consumed in toilets and to irrigate the fields and nearby areas. Nor do they account for the consumption, pollution and waste of constructing new buildings, parking lots, apartments and other structures. One research study, conducted by professors at Cardiff University in the United Kingdom, looked at different models to assess the ecological footprint of a major event – the Football Association Challenge Cup Final (English domestic football). The impact elements included travel, food and water, infrastructure and waste.

The study found that the average attendee generates a footprint seven times greater than someone going about normal, everyday activity. Increased travel by event visitors accounted for the biggest part of this significant increase. The consumption of food and drink, and the energy and resources required to produce that food and drink, makes up the next largest part of the footprint.

The study apportioned a very small footprint to the stadium itself (here the Millennium Stadium in Cardiff, Wales), in part because the footprint was amortized over a 100-year life span. This is a very optimistic view. It is more likely that the stadium will become obsolete within a few decades, as new technologies are introduced, new urban development occurs and cities offer lavish facilities to lure teams looking for a new home. NFL stadiums in the United States, for example, have a median age of 31 years before they are replaced. Regardless, it is difficult to assess the global environmental and economic impact of these events, let alone to craft a strategy to address it.
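To see how sensitive that allocation is to the assumed lifespan (an illustrative calculation, not part of the Cardiff study), a stadium’s embodied footprint F spread over L years of life with E major events per year charges each event roughly

\[
\text{per-event share} \approx \frac{F}{L \times E},
\]

so shortening the assumed life from 100 years to the 31-year median roughly triples the stadium’s share of every event’s footprint (100/31 ≈ 3.2).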

Lastly, the ambition of hosting a mega sporting event tends to encourage cities to relax their rules for urban development and restructuring. This may be because of the short timeframe for hosting the event, or it may be that cities receive significant internal and external pressure to satisfy their obligations for the event.

In the run-up to the 2014 World Cup and the 2016 Olympics in Brazil, for example, politicians in Rio de Janeiro executed “flash-votes” that allowed the Legislative Assembly to push through emergency bills to (1) lift the ban on alcohol at stadiums; and (2) annul the laws that protect historical architecture and patrimony of certain existing stadiums. These emergency bills were approved without the usual mandatory public debate, resulting in the demolition of two historical structures – the Sambodromo and Maracana Stadiums – and their replacement with a new stadium. This not only reflects a disregard for community involvement, it is also disconcerting because much of the cost for these events is borne by public funding. In the United States, for example, sports stadiums have historically been funded through publicly subsidized financial mechanisms, including general sales taxes. In Australia, much of the $30 million annual cost of holding the Formula 1 Grand Prix comes from public funds. Adding insult to injury, most local residents cannot afford to attend these mega events, which are targeted toward the elite foreign traveler.

Little legal framework exists to regulate these transient pop-up cities created by mega sporting events. While there are a handful of United Nations treaties on sports, mostly recognizing the general right to participate in and have access to sporting and recreational events, no international treaty addresses the social, economic and environmental externalities. The closest is Agenda 21, adopted by United Nations (UN) member nations in 1992. At the 1992 Rio Earth Summit, many UN member states committed to environmental sustainability in economic development generally and adopted Agenda 21 as the framework for fulfilling this obligation. Agenda 21 is non-binding and voluntary but encourages all organizations – governmental and non-governmental, international, regional and local – to prepare their own version based on the framework provided. While it does not specifically address sporting events, the International Olympic Committee (IOC), working with the United Nations Environment Programme, adopted its own Agenda 21 in 1999, following the general framework of the Rio Agenda 21 and providing a plan to improve socioeconomic conditions, conserve and manage resources and strengthen the role of major groups in each Olympic host country.

Agenda 21 provides a potential framework for sustainable development generally, but it does little to address the unique temporary nature of mega sporting events, and if the 2016 Olympics were any indication of its effectiveness, it falls well short of ensuring sustainable practices. Further, other than the IOC, it does not appear to have been adopted by any other major sporting organization.

With more sporting events on the horizon than ever before, it is time to more holistically address the pollution, waste, greenhouse gases and other negative consequences. Agreements between host city and event organizer often ignore key issues, and host cities are sometimes concerned that organizers will simply go on to the next city if they push too hard on specific terms.

So what might work? One possibility is the use of social licenses, a concept that originated with mining and energy industries operating in developing nations. After unbridled environmental damage – and the ensuing reputational hits – during the 1990s, the World Bank encouraged the industry to use social licenses. These social licenses, which are essentially ongoing agreements with local governments and other stakeholders to indicate local acceptance of a project, helped identify and address concerns about the environmental and human cost of the transitory mining and drilling activities.

Over the last few decades, societies around the globe have begun to shift to a more informed and involved form of decision-making, with an eye toward sustainable practices. Social licenses are part of that, legitimizing stakeholder decisions and providing a framework for managing expectations. The use of social licenses for mega sporting events could benefit all parties and allow for a fair allocation of the benefits and costs associated with the event. Some of the key elements of a social license that could apply include full disclosure and transparency of process; making environmental, social and economic information available in the local language; early and meaningful community involvement in decision-making; a commitment to sustainable energy and environmental sensitivity; and longevity of community investments.

Although there is no silver bullet to prevent the negative side effects of these mega sporting events, implementing a social-license-to-operate mechanism could at the very least allow communities to identify and meaningfully analyze the costs and benefits associated with hosting the event early in the process.

Gina S. Warren is an associate professor at the University of Houston Law Center where she teaches classes in property law, oil & gas law, and domestic and international energy law. Her research explores the role of policy and regulation in the area of sustainable energy, with a focus on renewable energy, climate change, and distributed generation. Prior to entering academia, Warren worked for the international law firm of Perkins Coie, based in Seattle, Washington, where she litigated and advised on matters of energy and utility law. Warren holds a Bachelor of Science in Psychology from the University of Arizona and a Juris Doctorate from Rutgers School of Law.

Radioactive Waste And The Hidden Costs Of The Cold War

By David Rainbow, Assistant Professor, Honors College, University of Houston

Hanford, a dusty decommissioned plutonium production site in eastern Washington state, is one of the most polluted places in the country. The disaster is part of the inheritance of the Cold War.

A few months ago, a 110-meter-long tunnel collapsed at the site, exposing an old rail line and eight rail cars filled with contaminated radioactive equipment. This open wound in the landscape, which was quickly covered over again, is a tiny part of an environmental and human health catastrophe that steadily unfolded there over four decades of plutonium production. Big Cold War fears justified big risks. Big, secretive, nuclear-sized risks.

Hanford and other toxic reminders of the Cold War should serve as a cautionary tale to those who have a say in mitigating geopolitical tensions today, as well as to those who promote nuclear energy as an environmentally sustainable source of electricity. The energy debate must balance the downside – not just the risk of a nuclear meltdown but also the lack of a permanent repository for the still-dangerous spent fuel rods – with the environmental benefits of a source of electricity that produces no greenhouse gases. People on both sides of the issue have a vested interest in how the current geopolitical tussling over nuclear weapons plays out.

These days, fear of other countries is big again. North Korea’s nuclear detonations and intercontinental ballistic missile launches – the most recent just days ago – are explicit threats to the U.S. For his part, President Trump has responded with threats (and mockery) of his own, promising to rain down “fire and fury” on North Korea if Kim Jong Un follows through on his threats.

On the campaign trail last year, Trump called for the U.S. to “greatly strengthen and expand its nuclear capability.” Recent reports (which Trump denies) that the President has called for increasing our nuclear arsenal by 10 times are in line with this campaign pledge. According to the reports, Trump wants to return to the peak nuclear production of the 1960s, the height of the Cold War. While Trump’s statements on nuclear weapons have been inconsistent, the overall picture has been clear and in line with his general chest-thumping approach to foreign policy: We will do and say what we want. None of this rhetoric is conducive to making the world safer from nuclear weapons.

The saga of Russia’s connection to Trump’s presidential campaign continues, too. Again this past week we learned more about conversations between Trump’s people and the Russians during the election. Here it’s been the left that has most often reached for rhetoric harking back to the conflicts of the last century to characterize Russia’s meddling – “The Plot Against America.” Secret plots, missile tests, Russian spies, insinuations of treason, radioactive materials. Put these together with the deep disagreements between the U.S. and Russia over ongoing conflicts and tensions around the globe (Syria, Ukraine and the significant military exercises conducted along NATO’s eastern border), and we are back, it seems, to the bad old days of the Cold War.

Even if, as we all hope, the “new Cold War” never gets hot, escalating tensions can have seriously harmful effects at home. The radioactive cave-in at the Hanford site earlier this year should serve as a reminder of that.

Plutonium production at Hanford began as part of the Manhattan Project during World War II, the highly secretive plan to develop a nuclear bomb.

Initially, the drive to mobilize for war justified substantial costs, among them significant damage to human and environmental health in the U.S. resulting from the nuclear program. Hanford was integral to the program: its plutonium fell on Nagasaki. But after the end of the war, the scale of production at the site rose to a fever pitch thanks to the ensuing competition for global influence between the U.S. and the Soviet Union that became the Cold War.

Our gargantuan stockpiles of nuclear arms demanded gargantuan quantities of plutonium. Forty-five years of work at Hanford – from 1943 to 1987 – yielded 20 million uranium metal plugs used to generate 110,000 tons of fuel. The process also generated 53 million gallons of radioactive waste, now stored in 177 underground tanks at the facility, and created 450 billion gallons of irradiated waste water that was discharged onto “soil disposal sites,” meaning it went into the ground. Some of the irradiated discharge simply ran back to where it had originally been taken from, the nearby Columbia River. The Office of Environmental Management at the Department of Energy is currently overseeing a cleanup project involving 11,000 people. It is expected to take several decades and cost around $100 billion.

Kate Brown’s award-winning book, “Plutopia: Nuclear Families, Atomic Cities, and the Great Soviet and American Plutonium Disasters,” is a history of the Hanford plant and its Soviet doppelgänger, a plant in the Ural Mountains called Maiak. Brown points out that over the course of a few decades, the two nuclear sites spewed two times the radiation emitted in the Chernobyl explosion. Yet few Americans at the time, even those involved in plutonium production, realized this was going on or how dangerous it was.

Naturally, the hidden nature of the project meant that information was hard to come by. As Brown shows, even the experts, managers and scientists involved directly in overseeing the production process knew little about the seriousness of the risk. Doctors studying the effects of radiation on people didn’t have access to the research related to environmental pollution. Scientists studying fish die-offs had no way of connecting their findings to the deteriorating immune systems of humans in the same areas. Most poignantly, researchers measuring the effectiveness of nuclear bombs on the enemy did not communicate with researchers measuring the threat of nuclear bombs on the workers making them. Consequences for the workers were grave. Hanford and Maiak’s hidden mega-pollution was collateral damage in the fight to win the Cold War. Russia, like the U.S., is still living with the damage, and trying to bury it, too.

Within two days of the tunnel collapse at the Hanford site this past May, workers filled the breach with 53 truckloads of dirt and narrowly avoided a radiological event. However, these eight railcars are hardly the only waste left behind in the U.S. from our cold conflict with the Soviet Union, in which our willingness to risk human and environmental health was proportionate to our fears. It’s going to be a while before it’s all cleaned up. In the meantime, hopefully our leaders will work to keep the new Cold War from getting any worse.

David Rainbow is an Assistant Professor in the Honors College at the University of Houston. He teaches and writes about modern Russian and Eurasian history. Prior to coming to Houston in 2015, he held postdoctoral fellowships at Columbia University’s Harriman Institute for Russian, Eurasian and East European Studies and at New York University, and was a writer in residence at the Jordan Center for the Advanced Study of Russia at NYU. He holds an M.A. in European intellectual history from Drew University and a Ph.D. in Russian history from New York University (2013). Before becoming a historian, he worked as an engineer aboard a merchant ship on the Pacific and as a rancher in western North Dakota, and he has lived in Russia and Siberia several times.