A (very) short history of sustainability

A mud map of how sustainable development has grown up

By David Salt

For many, sustainability is a buzz word; a descriptor used and abused by governments (and corporations) all around the world to give the impression their policies of economic growth and development are simultaneously meeting the needs of society and the environment. But it’s more than just a hollow catch cry. Sustainability is a concept with substantive meaning and pedigree.

The growing body of evidence, unfortunately, suggests that our world is not on a trajectory of sustainability. If anything, we are accelerating away from it. However, there was a time, not so long ago, when there appeared to be a growing international consensus that sustainability was a real and achievable goal. When was that? Here is my (very) short potted mud map of sustainability (with a fistful of caveats at the end for me to hide behind).

The Twentieth Century

The Twentieth Century was the century of human domination in which our species ‘conquered’ the final bits of the planet’s surface. We encircled the world with our communication cables (1902), reached its South Pole (1911), ascended to its highest point (Mt Everest, 1953) and then reached even higher with artificial satellites (Sputnik, 1957). We also made a real effort to annihilate many dimensions of our own culture in two world wars.

If the first half of this century was marked by massive global-scale disruptions (two world wars and a Depression) and empire failures (Britain and Japan especially), then the second half was characterised by population and economic growth of unprecedented scale. Population more than doubled, while the global economy increased by more than 15-fold. And it was in this second half that notions of sustainability were developed.

The 1940s: Reboot

My mud map begins in the aftermath of the Second World War; a time of mass destruction, renewal and new beginnings. The aim of governments was growth, stability and the kindling of hope for a prosperous future.

The tremendous economic growth that followed was in large part enabled by the ‘rebooting’ effect of the wars. These broke down old imperial and feudal institutions, opened up space for new institutions based on liberal-democratic and later neo-liberal economic principles, and empowered us with a new suite of powerful science and technology.

Survival, rather than sustainability, was the prime consideration, but towards the end of the 1940s there was an international push to set aside bits of landscape for wildlife and nature, with the establishment in 1948 of the International Union for the Protection of Nature (which was to become the International Union for the Conservation of Nature, IUCN, in 1956). Economic growth was the main focus and the environment was seen as a space separate from human activity.

The 1950s: Lift off

Today’s economy and environment have direct roots in the explosion in economic growth that took place in the 1950s, the beginning of the so-called Great Acceleration. Population, GDP, energy generation, fertiliser consumption, water use and international tourism all underwent dramatic (often exponential) increases as the economy powered up.

The ‘sustainability’ of the environment was not really a question back then. The USA, a major driver of growth, was concerned about the ongoing supply of natural resources, but only as it related to feeding the economy rather than sustaining the environment. It set up a commission, the Paley Commission, which led to the establishment of the NGO called ‘Resources for the Future’. Its brief was to look at resource scarcity issues on an ongoing basis. The great environmental economist David Pearce identifies this as the founding of environmental economics.

The 1960s: Cracks in the model

The economy was growing strongly, living standards for many were improving, the rich were getting richer but the poor were getting less poor. Indeed, during these first decades after the war the gap between the richest and the poorest was decreasing (proof that a rising tide can indeed lift all the boats).

But underneath the growth and the technological mastery, cracks were appearing in the form of environmental decline. These concerns were embodied in the book Silent Spring by Rachel Carson (1962). It drew attention to the accumulating impacts of pesticides on natural ecosystems, and questioned the costs of industrial-scale agriculture.

Technology also gave us new frames for considering humanity’s role and place, with the race for the Moon providing new perspectives, metaphorical and literal, on our planet. Kenneth Boulding coined the term ‘Spaceship Earth’ in a famous essay in 1966 (and in 1968 we saw our fragile home in perspective for the first time in the famous ‘Earthrise’ photo taken by Apollo 8 astronauts as they orbited the Moon).

Concern was growing as case study (eg, acid rain) after case study (eg, contaminated waterways) caused people to question the costs and benefits of economic development. Laws for environmental protection started taking shape and the idea of Environmental Impact Assessment took off (enshrined in US environmental law, NEPA, in 1969); yet the approach that evolved was a ‘bottom up’ one of minimising impacts on a case-by-case basis rather than the holistic, bigger-picture approach that Boulding had advocated and that NEPA, read as prose rather than law, clearly embodied.

The 1970s: Hopes are high

1972 saw the publication of the landmark report Limits to Growth, one of the first formal efforts to understand what the consequences of unbounded economic development might be. Its conclusion was that our species was likely heading for some form of collapse in the mid to latter part of the 21st Century. (While widely dismissed by economists, a 2014 review of the Limits-to-Growth analysis found its forecasts were still on track.)

The 70s saw many efforts by governments and community groups around the world to address the swelling list of environmental problems falling out of our rapacious growth. Key among these was the UN Conference on the Human Environment, also known as the Stockholm Conference, in 1972. It catalysed many activities that were to prove pivotal to the manner in which we dealt with the environment, including many nations setting up their own environment ministries. It also saw the creation of UNEP (the United Nations Environment Programme), and it put a greater focus on the connection between society and the environment. The Stockholm Conference was one of the first events where there was a strong acknowledgement of the need for poverty alleviation and its connection with access to environmental resources.

And it was during this decade that the term sustainable development began to see common usage. Indeed, the term was first used officially in the World Conservation Strategy launched in 1980, though at this stage the focus was on the environment alone.

The 1980s: Negotiations are had

‘Sustainable development’ took real form with the release of the report titled Our Common Future by the World Commission on Environment and Development (led by the indefatigable Gro Harlem Brundtland, Norway’s first female Prime Minister) in 1987. The report defined a sustainable society as one that “meets the needs of the present without compromising the ability of future generations to meet their own needs”. It made sustainability an idea that involved acknowledging the linkages between the economy, the environment and society.

The mid 80s also saw the emergence of a massive ozone hole over the South Pole (the result of humans pumping ozone-depleting substances into our atmosphere). This went some way to puncturing our complacency about environmental decline. Countries met and negotiated what they would do about the ozone problem, treaties were signed, and these days ozone-depleting emissions are on the decline.

Not so easily addressed, unfortunately, was the greenhouse gas problem, in which a by-product of economic activity (energy, transport and agriculture in particular) was carbon-based emissions that distorted the Earth’s climate systems. Though the science of greenhouse warming was well understood and discussed in scientific circles in the 70s, it only gained public and political visibility in the late 80s. (In 1988 Jim Hansen, a leading atmospheric scientist at NASA, declared: “The greenhouse effect has been detected, and it is changing our climate now.”)

The 1990s: Plans are drawn

In 1992 the world came together in Rio for the great Earth Summit in which nations would pledge how they were going to meet the great challenge of sustainability. A plan for sustainable development in the rapidly approaching 21st Century was adopted (Agenda 21) and an international agreement on biodiversity conservation was opened for signing.

Through the 90s the Intergovernmental Panel on Climate Change (formed by UNEP and the WMO in the 80s) began compiling an enormous brief of evidence that greenhouse gas levels were growing remorselessly and creating a raft of problems, from shifting climate to sea level rise and extreme weather. But as pressure grew to do something about carbon emissions, vested interests increased their efforts to discredit the science and obfuscate the emerging picture.

And governments everywhere were discovering that policy positions developed to meet sustainability pledges came with real short-term electoral pain, and that the prospect of deep change, transformational change, was simply too much to push through. Sustainable development is a moral imperative but the reality is that sustainability bites. Or, as President Bush said in 1992 at the Rio Earth Summit: “The American way of life is not up for negotiation.”

The 2000s (the Naughties): Sustainability bites

As you’d expect, the beginning of a new millennium saw a lot of reflection, discussion and planning for a better world (a bit like my New Year’s resolutions to be a better person). There was the Millennium Summit in 2000 (and ensuing Ecosystem Assessment in 2005), a Rio+10 Earth Summit (held in Johannesburg in 2002) and a World Summit held in 2005. Millennium Development Goals were drawn up and agreed to, and almost all nations (with the US a notable exception) committed to reversing declines in biodiversity (the Convention on Biological Diversity, CBD).

And the manner in which many governments sought to deliver on their sustainability commitments increasingly invoked utilitarian values, a move supported by an emerging line of conservation science demonstrating that nature provides benefits to humans that save us money (like native vegetation providing water purification). So, why don’t we start paying for the things that nature gives us, ecosystem services, and let the market optimise the delivery of these services? Some saw this as a dangerous move away from acknowledging nature’s intrinsic value.

But, just like my New Year’s resolutions, it didn’t take long for most governments to begin making excuses for why aspirations (for sustainable development) needed to take second place to the realities of day-to-day life: “as soon as we’ve secured a strong economy we can begin worrying about fixing up the environment.”

Targets adopted under the CBD meant that 2010 was supposed to be the line in the sand for biodiversity conservation, but all countries failed to deliver on their commitments, with extinction rates climbing and the drivers of extinction only accelerating.

The 2010s: Cracks in the ice cap

Sustainability, however you want to define it (and heaven knows it comes in many flavours), was proving a stubbornly elusive goal. But the negotiations continued.

The world’s nations continued to get together (Rio+20 in 2012, this time back in Rio) but failed to agree on any major outcomes other than replacing a failed international body, the Commission on Sustainable Development, with a new one, the UN Environment Assembly. The failed Biodiversity Convention targets were replaced with a more nuanced set of goals (the Aichi Targets); the Millennium Development Goals (which some believed were quite effective while others said were unmeasurable) were replaced with a more nuanced set of sustainability targets (the Sustainable Development Goals); and the stalled climate change discussions actually reached half a consensus with the Paris Agreement (in 2015; though the US under President Trump has since withdrawn from it).

In many ways, it’s the same old, same old; endless meetings, discussions, agreements and targets; one step forward, two steps back, another step forward; but, at the end of the day, Bill Clinton’s 1992 election mantra ‘it’s the economy, stupid’ sums up the approach of virtually every country. Which sometimes has me wondering whether Rachel Carson, Kenneth Boulding and the doomsayers behind ‘Limits to Growth’ were simply wrong. The environment is undoubtedly in decline but we’re still standing, talking and aspiring to better things (most of us are wealthier, but at the expense of future generations). Clearly governments are almost unanimous in believing that the economy is what counts and that if things get scarce then markets and technology will always find a solution; they have so far.

But those people calling for reflection and change were not wrong; and the 2010s and the emerging science are emphatically backing their calls for a new way of stewarding Spaceship Earth. We’re losing species and ecosystems that we depend upon. We are seeing changes to our climate and Earth system that are already stressing many parts of our planet (including our food and water systems); and the science tells us these changes are just beginning, promising an increasingly uncertain future. We are losing the challenge of sustainability and it’s not a challenge we can afford to lose.

Caveats and endnotes

This ridiculously short history only touched on a few of the elements that have contributed to the evolution of sustainable development (and only mentioned a couple of the thousands of identities – people and institutions – who have made important contributions to its story). And, clearly, dividing this history into decadal phases doesn’t reflect the real inflection points of its evolution; it is merely my effort to subjugate a complex, non-linear, multi-faceted topic into something that looks like a timeline with a simple narrative.

However, even the limited set of events described here tells us that the history of sustainable development has gone through life stages with different dynamics. It began as our faith in the economic growth model started to erode, and its early days kept a tight focus on the environment; as it developed there grew a better appreciation of the connections between society, economy and environment; and as it reached maturity and asked for real commitment from its sponsoring actors, the reality of shifting the status quo has proven that much of its rhetoric is impotent.

In its youth sustainable development was driven by natural science. In its young adulthood, it began to take its legitimacy from ideas founded in social values, rights and laws. And as it matured it cloaked itself in the robes of economics and markets.

Is it any wonder then that sustainable development is no longer a force for change (if it ever was)? Rather than challenge the paradigm of unbounded economic growth, it has been forced to work within the normative structures that put economic growth before all other goals.

So, if you were a doctor asked to prescribe a change to an ageing man whose lifestyle is clearly leading to a miserable old age, what might you suggest? Because maybe this is the lens we need to look through when considering where to from here for sustainable development. And, maybe, just like our ageing patient, we need to be confronted with some hard truths about what the future holds (unless we sign up for some demanding therapies)?

Image: Earthrise, 25 December 1968. Taken aboard Apollo 8 by Bill Anders. Earth is peeking out from beyond the lunar surface as the first crewed spacecraft circumnavigated the Moon. (NASA)

Environmental policy came from the side of the angels

Lynton Caldwell, NEPA and the birth of Environmental Impact Assessment

By Peter Burnett

When did the age of modern environmental policy begin? Some claim it kicked off with the publication of Our Common Future (also known as the Brundtland Report) in 1987. This landmark document defined the notion of ‘sustainable development’ and stressed the need for integrating economic, social and environmental approaches. Others suggest 1972 is more appropriate as it was the year of the Stockholm Conference and the establishment of Environment departments in many countries around the world.

But I’m going to suggest to you that environmental policy really began in 1969, with the drafting of the US National Environmental Policy Act (NEPA), and that it owes much to a visionary political scientist named Professor Lynton Caldwell. And it’s not just that Caldwell was astute enough to understand what effective environmental policy needed, he was also canny enough to know when to make his pitch.

NEPA is famous for introducing the world to the concept of environmental impact assessment (EIA), a mechanism now used in almost every country. But NEPA stands for so much more.

Interdisciplinarity

Its antecedents lie in Caldwell’s earlier work. In 1963 he published an article entitled ‘Environment: A New Focus for Public Policy?’ 1963! That’s only a year after Rachel Carson published Silent Spring, the book often regarded as having launched the modern environment movement.

In his article, Caldwell argues for, and thus invents, ‘environmental policy’. He calls for, among other things, an interdisciplinary approach to this new creature. Caldwell was a Professor of Government at Indiana University and he practised what he preached. He embarked on a course of interdisciplinary training and started hanging around with ecologists. (In 1963, ecology was still a relatively small discipline.) These days, interdisciplinarity is a much-lauded goal (if little practised) but back then it was a very brave undertaking.

In 1964 Caldwell began to operationalise his ideas by presenting them to a workshop for economic planners. Brave again. Not surprisingly, most of them were, as Caldwell later reminisced, ‘baffled’ by his argument and rejected it as irrelevant. That’s except for one now world-famous psychologist, Abraham Maslow (of ‘Maslow’s hierarchy of needs’ fame). Maslow understood what Caldwell was advocating. He later offered Caldwell constructive suggestions, declaring Caldwell to be ‘on the side of the angels’.

The time was ripe

By the late 1960s a wave of environmental concern was sweeping the Western world, particularly in America. Some major environmental disasters contributed to this. In 1969, in America alone, the Santa Barbara oil spill despoiled the California coastline while on the other side of the country the Cuyahoga River was so polluted it actually caught fire.

Various members of Congress responded by proposing environmental laws. Public opinion was galvanised.

Against this backdrop, one of the leading proponents of reform in Congress, Senator Henry ‘Scoop’ Jackson, hired Caldwell to help with Jackson’s environmental Bill. Initially, Caldwell wrote a report for a Congressional committee on what a national environmental policy might be.

He later wrote that he anticipated the need for ‘action forcing provisions such as impact statements’ to support a national policy statement. But Caldwell held back as he suspected Senator Jackson ‘did not appear ready to endorse so novel and intrusive a proposition’.

Later, however, in appearing before the committee, Caldwell was able to make his arguments for his action-forcing provisions and they were then included in the compromise bill. That bill became NEPA. Caldwell had bided his time and ‘threw his pebble’ (to borrow a term from one of our earlier blogs) when he perceived it would have maximum impact.

A remarkable piece of legislation

NEPA itself is a remarkable piece of legislation. Its statement of environmental policy goals is visionary. It talks about the need for a global approach three years before the world first met to talk about a global approach, at the UN Conference on the Human Environment held at Stockholm in 1972.

The legislation talks about each generation being trustees of the environment for future generations and sharing life’s amenities – this was 18 years before the Brundtland Report proposed the concepts of sustainable development and intergenerational equity.

It refers to maintaining the diversity of life just 12 months after Dasmann first wrote of biological diversity and 20 years before Lovejoy coined the term ‘biodiversity’.

And NEPA required the preparation of state of the environment reports (as ‘environmental quality reports’), 10 years before the OECD produced one and called on its members to do likewise. NEPA sought to drive policy integration 10 years before the OECD began to promote the same concept.

Ahead of its time?

Unfortunately most of the enormous potential of NEPA was not realised. True, it brought environmental impact assessment, EIA, to the world. But Caldwell, Jackson and the others behind NEPA had a much bigger vision than EIA.

If NEPA had been applied as an ordinary reading of its words would suggest, all US government agencies would have brought their decisions in line with a long-term policy vision directed to avoiding environmental degradation, and these decisions would have been supported by comprehensive information and research facilitated by a new institution, the Council on Environmental Quality. These things did not happen because government agencies were antagonistic and the US Supreme Court read the law down to a set of procedural requirements.

Caldwell’s vision and achievements, which would have been much greater if others had not been working against them, are not widely known. And to top it off, he was a registered Republican voter, working for a Democrat: if only the environment were the bipartisan issue today that it was then.

Image: Lynton Caldwell enjoying the great outdoors. Indiana University Archives