The perils of command and control and the pathology of Natural Resource Management


By David Salt

As a younger man, I honestly believed that sustainability was a tractable problem; a difficult challenge, no doubt, but one that was solvable with hard work coupled with science and technology. And, as a confident young thing, I thought I could contribute to this outcome by serving in the area of science communication and education: getting more talented young people into science, and increasing community acceptance of emerging technological solutions so they could be effectively implemented.

How might science and technology save us? By providing us with insights on the many problems being faced by humanity and the environment, and by helping humanity lighten its footprint on Planet Earth. Well, science has definitely provided ample insights on the plight of our planet, and technology has given us so many ways to be more efficient in how we do things.

For all that, however, we are moving away from being sustainable; indeed, we seem to be accelerating away from it. In the last half century, humanity has pushed the Earth system over several planetary boundaries, unleashed a sixth extinction event, and seems unable as a global community to do anything about greenhouse gas emissions, which are remorselessly on the increase (a by-product of our addiction to economic growth).

Science and technology have underpinned so much of our wealth creation and economic activity, and many techno-boosters are fervent in their belief that science and technology is the solution to the many problems facing our environment (indeed, I heard Australia’s Chief Scientist say this exact thing on the radio this morning, as I write).

As I grew older and watched the natural world decline around me (on a number of scales; think of weed infestation in your local bush reserve, glacial retreat or the bleaching of the Great Barrier Reef), my enthusiasm for (and faith in) science and technology also declined. I could see the potential of all these new discoveries (think renewable energy, nanotech and biotech as examples) but could never see those outcomes creating a more sustainable future. For example, for every 10% improvement in the efficiency of process X, we seemed to see a 100% increase in the number of people using that process, resulting in more waste, more consumption and more damage (albeit less impact per capita; see the Rebound Effect for a discussion of this).
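If it helps to see that arithmetic laid out, here’s a toy calculation in Python. The 10% and 100% figures are my own illustrative numbers from the example above, not measured data:

```python
# Rebound effect: a toy calculation with invented, illustrative numbers.
efficiency_gain = 0.10                        # process X becomes 10% more efficient
impact_per_use = 1.0 * (1 - efficiency_gain)  # impact per use falls to 0.9

uses_before = 100
uses_after = uses_before * 2                  # uptake doubles (a 100% increase)

total_before = uses_before * 1.0              # 100.0 units of impact
total_after = uses_after * impact_per_use     # 200 * 0.9 = 180.0 units of impact

# Impact per use fell by 10%, yet total impact rose by 80%.
print(total_before, total_after)
```

The efficiency gain is real, but it’s swamped by the growth in uptake that the gain itself encourages.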

The dangers of partial solutions

It’s not that I’m anti science and technology, and I do believe increasing efficiency is important. By themselves, however, they are not enough.

Then I was asked to write a couple of books on resilience science (Resilience Thinking and Resilience Practice), and my doubts about the belief that ‘science and technology is the solution’ crystallised into a new way of looking at the world. The experience of writing about resilience opened my eyes to ideas of complexity, and to the capacity of a complex system to absorb disturbance and retain its identity (the definition of resilience). The consequences of these ideas are deep and far-reaching. In a range of different ways, I’ve been attempting to articulate them in my stories for Sustainability Bites.

One major consequence of acknowledging the complexity around us is becoming aware of the cost of partial solutions sold to us as complete answers. Not only are science and technology (and endlessly increasing efficiency) not enough to move us towards sustainability; an exclusive reliance on them (and belief in them; think ‘technology not taxes’) will actually reduce resilience in the systems we depend upon and make us more vulnerable to disturbance.

There are many lines of evidence supporting this contention (see Resilience Thinking and Resilience Practice) but in the space I have here I’d like to discuss how natural resource management agencies decline over time. Improving science and technology (and efficiency) is often touted as the solution but only fuels this decline. This discussion is based on a landmark paper by CS Holling (one of the founding fathers of resilience thinking) and Gary Meffe, written a quarter of a century ago: Command and Control and the Pathology of Natural Resource Management.

The command-and-control pathology

Holling and Meffe point out that when command and control is applied in natural resource management, the initial phase is nearly always quite successful. Insect pests are reduced by pesticide use; fishing and hunting are enhanced by stocking or predator removal; forest fires are suppressed for years; floods are minimized by levees and dams.

But what follows these initial successes is rarely acknowledged. The agencies responsible for management shift their attention from the original social or economic purpose towards increasing efficiency and reducing costs. (Of course, all agencies and companies do this over time, not just NRM agencies. It’s a pattern well described in the idea of ‘adaptive cycles’, first proposed by Holling.)

NRM agencies search for better and more efficient ways to kill insects, eliminate wolves, rear hatchery fish, detect and extinguish fires, or control flows. Priorities thus shift from research and monitoring (why ‘waste’ money studying and monitoring apparent success?) to internal agency goals of cost efficiency and institutional survival.

Holling and Meffe contend that as this happens, there is a growing isolation of agency personnel from the systems being managed and an increasing insensitivity to public signals of concern. They describe this as institutional myopia and increased rigidity (again, something well described by the theory of adaptive cycles).

At the same time, the economic activities exploiting the resource benefit from this success (more fish, more water, or whatever) and expand in the short term. We see greater capital investment in activities such as agricultural production, pulp mills, suburban development, and fishing and hunting. There’s nothing wrong with this, they say, within limits.

But the result is increasing dependency on continued success in controlling nature while, unknown to most, nature itself is losing resilience, increasing the likelihood of unexpected events and eventual system failure. When natural systems are ‘controlled’ they invariably lose their natural diversity and processes, which leads to a declining ability to absorb disturbance (while maintaining their identity).

With dependency come denial, demands by economic interests to keep and expand subsidies, and pressure for further command and control.

So, the initial successes of command and control come with costs that are usually never acknowledged. Command and control reduces natural variation and erodes resilience; environmental managers aim for efficiency rather than connection with the system they are managing; and the economic interests that benefit from the original command and control distort the system to maintain it. The composite result is increasingly less resilient and more vulnerable ecosystems, more myopic and rigid institutions, and more dependent and selfish economic interests, all attempting to maintain short-term success.

Holling and Meffe point out that solutions to this pathology cannot come from further command and control (for example, stronger regulations) but must come from innovative approaches involving incentives leading to more resilient ecosystems, more flexible agencies, more self-reliant industries, and a more knowledgeable citizenry.

Back in the ‘real world’, you’ll largely hear our political leaders deny the complexity of this and simply say science and technology will save us. Unfortunately, in a complex world, simple solutions have a habit of only making the situation worse.

Don’t get me wrong, I still love science and technology. However, by themselves, they are not the solution. To contribute to a sustainable world, they need to work with complexity, not subjugate it.

Banner image: Dams are an important piece of human infrastructure, offering many valuable short-term benefits by controlling our rivers. In the longer term, they come with a range of often unacknowledged costs. They reduce the natural variability of the river; they encourage human settlement in areas subject to flooding; and they allow food production in areas that normally wouldn’t support agriculture. Over time, the agencies managing the dam become myopic and rigid, the economic sectors depending on the dam become increasingly reliant and selfish, and the river system becomes increasingly vulnerable to disturbances. (Image by David Salt)

Solving sustainability – It’s complicated AND complex. Do you know the difference?


What is it about the challenge of climate change that makes it so difficult to solve?

Clearly, it’s a complicated problem involving many interacting components. These interacting parts include the Earth system (and its billions of components); people (you and me); states and countries; organisations and institutions; unknowns; tradeoffs; winners and losers. We’ve spent decades of effort addressing this issue – including billions of dollars on research – and yet the problem of mounting levels of carbon emissions and accelerating environmental decline only seems to get worse. (Have you seen what’s happening in the northern hemisphere at the moment? And it’s only spring!)

Clearly, climate change is a big and complicated problem. But it seems to me, having watched us deal with this challenge (and fail) over many years, that what we’re not acknowledging is that it’s also a complex problem, and that we’re not dealing with this complexity very well.

‘Complicated’ and ‘complex’ are words often used interchangeably but they are fundamentally different ideas. Do you know the difference? I’ll confess that for most of my life I didn’t.

So, what is complexity?

Complex systems scientists have been attempting to pin down what complexity is for decades. To me, most of their definitions are highly technical and only understandable by other complex systems scientists.

Here’s one commonly used definition, set out by the famous evolutionary biologist Simon Levin in 1998, that encapsulates many of the ideas floating around complexity. It’s relatively short and sets out three criteria for defining a complex adaptive system. Complex adaptive systems have:

- components that are independent and interacting;

- some selection process at work on those components (and on the results of local interactions); and

- variation and novelty constantly being added to the system (through components changing over time or new ones coming in).

Sounds straightforward, but what does it mean and why is it important? Here’s how I attempted to explain it in the book Resilience Thinking*.

Cogworld vs Bugworld

Consider these two situations: Cogworld and Bugworld.

Everything in Cogworld is made of interconnected cogs; big cogs are driven by smaller cogs that are in turn driven by tiny cogs. The size and behavior of the cogs doesn’t change over time, and if you were to change the speed of a cog of any size, there would be a proportionate change in the speed of the other connected cogs.

Because this system consists of many connected parts, some would describe it as being complicated. Indeed it is, but because the components never change and the manner in which the system responds to the external environment is linear and predictable, it is not complex. Really, it is just a more complicated version of a simple system, like a bicycle with multiple gears.

Bugworld is quite different. It’s populated by lots of bugs. The bugs interact with each other, and the overall performance of Bugworld depends on these interactions. But some sub-groups of bugs are only loosely connected to other sub-groups of bugs. Bugs can make and break connections with other bugs, and unlike the cogs in Cogworld, the bugs reproduce, and each generation of bugs comes with subtle variations in size or differences in behavior. Because there is lots of variation, different bugs or subgroups of bugs respond in different ways as conditions change. As the world changes, some of the subgroups of bugs ‘perform’ better than other subgroups, and the whole system is modified over time. This system is self-organizing.

Unlike Cogworld, Bugworld is not a simple system but a complex adaptive system, in which it’s impossible to predict the emergent behavior of the whole by separately understanding its component subgroups. It meets the three criteria outlined by Levin: it has components that are independent and interacting; there is some selection process at work on those components; and variation and novelty are constantly being added to the system.
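For readers who prefer to see the contrast run rather than just read about it, here’s a minimal sketch in Python. It’s a toy model, not anything from Resilience Thinking: the gear ratios, trait values, selection threshold and mutation size are all invented, and bug-to-bug interaction is crudely stood in for by a population cap (competition for limited space).

```python
import random

# Cogworld: a linear gear train. Change the drive speed and every cog
# responds proportionately; nothing else can ever happen.
def cogworld(drive_speed, ratios=(1.0, 0.5, 0.25)):
    return [drive_speed * r for r in ratios]

# Bugworld: bugs (each reduced to one trait value) face selection and
# reproduce with variation, so the population can self-organise.
def bugworld_step(bugs, environment, cap=200):
    survivors = [b for b in bugs if abs(b - environment) < 1.0]  # selection
    offspring = [b + random.gauss(0, 0.1) for b in survivors]    # variation and novelty
    pop = survivors + offspring
    return random.sample(pop, min(cap, len(pop)))                # crude competition

bugs = [random.uniform(0.0, 4.0) for _ in range(50)]
for t in range(20):
    environment = 2.0 + 0.05 * t            # the world slowly changes
    bugs = bugworld_step(bugs, environment)

# cogworld(2.0) returns the same answer every time; the make-up of `bugs`
# has shifted, without direction from anyone, to track the changing world.
```

Run it a few times: Cogworld’s output never varies, while Bugworld’s surviving population ends up somewhere slightly different on every run – self-organisation, not control.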

Complicated vs Complex

In Cogworld there is a direct effect of a change in one cog, but it doesn’t lead to secondary feedbacks. The cogs that make up Cogworld interact but they are not independent, and the system can’t adapt to a changing world. Cogworld might function very ‘efficiently’ at one or even a range of ‘settings’, but it can only respond to change in one way – that is, with all the cogs working together. If the external conditions change so that Cogworld no longer works very well – the relative speeds of the big and little cogs don’t suit its new environment – there’s nothing Cogworld can do.

In Bugworld, the system adapts as the world changes. There are secondary feedbacks – secondary effects of an initial direct change. The bugs of Bugworld are independent of each other, though they do interact (strongly, although not all bugs interact with all other bugs).

In our Bugworld, if we attempted to manage a few of the subgroups – eg, hold them in some constant state to ‘optimise’ their performance – we would need to be mindful that this will cause the surrounding subgroups to adapt around the intervention, possibly changing the performance of the whole system.
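A hedged extension of the toy model above makes the point. Suppose we ‘optimise’ one subgroup by holding it at a fixed trait value while the world keeps changing (again, every number here is invented for illustration):

```python
import random

def bugworld_step(bugs, environment, cap=200):  # as in the earlier sketch
    survivors = [b for b in bugs if abs(b - environment) < 1.0]
    offspring = [b + random.gauss(0, 0.1) for b in survivors]
    pop = survivors + offspring
    return random.sample(pop, min(cap, len(pop)))

pinned = [2.0] * 20    # managed subgroup: held at a fixed 'optimal' trait
free = [random.uniform(0.0, 4.0) for _ in range(50)]

for t in range(40):
    environment = 2.0 + 0.05 * t                # conditions keep drifting
    free = bugworld_step(free, environment)     # adapts around the intervention
    pinned = [b for b in pinned if abs(b - environment) < 1.0]  # selection still applies, but no variation

# The held-constant subgroup dies out once conditions drift past its fixed
# trait; the unmanaged population persists by adapting.
print(len(pinned), len(free))
```

Holding part of a complex system constant doesn’t remove it from the system; it just removes its capacity to adapt, while everything around it keeps self-organising.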

Ecosystems, economies, organisms and even our brains are all complex adaptive systems. We often manage parts of them as if they were simple systems (as if they were component cogs from Cogworld) when in fact the greater system will change in response to our management, often producing a raft of secondary feedback effects that sometimes bring with them unwelcome surprises.

The real world is a complex adaptive system. It is more like Bugworld than Cogworld and yet it seems most of our management, policy and leadership is based on a Cogworld metaphor.

The consequences of complexity

Complex adaptive systems are self-organizing systems with emergent properties. No-one is in control, and there is no optimal sustainable state in which they can be held. These are just two of the consequences that fall out when you begin to appreciate what complexity is all about, and they are pretty important consequences if you reflect on them.

Our political leaders will tell you they are in control, and that they have a plan, a simple solution that solves the problem of climate change without anyone having to change the way they do things. This is the message that Australians have been hearing for the past decade from our (recently defeated) conservative government. But we grew skeptical of these claims as we saw our coral reefs bleach and our forest biomes burn.

Why is climate change so difficult to solve? Yes, it’s complicated with many interacting components. However, more importantly, it’s complex and complexity is something humans don’t deal with well (let alone understand).

As one piece of evidence on this, consider how we think about thinking. What’s the image that immediately comes to your mind? For most people it’s a set of mechanistic cogs encased in a head (like in our banner image this week). If you thought my ‘Cogworld’ was fanciful, how many times have you seen this representation of human thinking as mechanistic clockwork without questioning it? Because what you’re seeing is a representation of a complex system (you thinking) as a non-complex simple system (a set of cogs). The ‘cogmind’ is a fundamentally disabling metaphor.

And if you scale this up to the systems around us, how many times have you accepted that someone is in control, and that the answer is in just making the world a bit more efficient, a bit more optimal? How is that going for us at the moment?

Different priorities

If, however, we are living in a complex world, then maybe we should stop looking for the illusory optimal solution and start dealing with the complexity in which we are all embedded. How is that done?

One set of ideas I have found helpful lies in resilience thinking. Rather than prioritising efficiency, command-and-control, reductionism and optimisation, resilience thinking encourages reflection, humility and co-operation, aspects on which I’ll expand in my next blog on complexity.

*Two decades ago I was asked by a group called the Resilience Alliance to write a book on resilience science. That book, co-authored with Brian Walker, one of the world’s leading authorities on resilience science, became the text Resilience Thinking. As I learnt about resilience science I discovered that it was all about dealing with complexity, an insight that transformed the way I understood the world.

Banner image: If you thought my ‘Cogworld’ was fanciful, ask yourself how many times you’ve seen this representation of human thinking as mechanistic clockwork without questioning it. (Image by Pete Linforth from Pixabay)