
Published: 6 February 2012

Using logic to stretch the conservation dollar further

Eve McDonald-Madden

If you were to ask a room full of managers, policy makers or even scientists whether they should be monitoring the outcomes of their conservation actions, most would answer with a resounding ‘yes’. They would argue that if we don’t understand the benefits of our investment, how can we possibly know whether we are doing the right thing and whether our investment is worth it?

Researchers have developed a step-by-step decision tree tool enabling managers to determine whether investing in monitoring will improve the outcomes of their investment in biodiversity conservation programs – such as preventing the spread of facial tumour disease among Tasmanian devil populations.
Credit: KeresH

But in a resource-constrained world, it’s worth taking a moment to consider what you’re hoping to achieve through such monitoring.

When we did this and attempted to formalise the logic behind when to monitor, it quickly became clear that in many situations, monitoring is inappropriate.1

To begin with, governments do not have enough money to manage the threatened biodiversity that people care about. That’s why governments need to make good decisions about where to spend their limited resources. The same is true for monitoring. Further, biodiversity loss occurs independently of monitoring timelines; we may not have enough time to apply monitoring findings to improve our decision making before we lose what we are trying to protect.

That’s not to say those who answer ‘yes’ to monitoring are not thinking about money. One of the most common questions I get asked is, ‘How much of our program budget should we spend on monitoring – is there a set percentage you can tell us to put aside to monitor our conservation actions?’

This question led The Nature Conservancy (TNC) – the world’s largest conservation nongovernmental organisation – to initiate the workshop that led to our research paper. Unfortunately, the answer to their question turned out to be, ‘No, there is no generic benchmark for monitoring: it’s situation-dependent.’

Determining how much to spend on monitoring depends heavily on the problem at hand. Even for a well-defined problem, finding the answer is not easy: not very satisfying advice for those organisations, such as TNC, that need to make daily decisions about their investment in monitoring! But while we may not have been able to give TNC a one-size-fits-all percentage, we decided we could provide a framework to improve decision making concerning when to invest in monitoring and what type of monitoring to undertake.

Working with TNC’s key monitoring scientists, we constructed a simple decision tree that walks managers stepwise through a series of basic questions, leading them to an explicit and transparent decision about their investment in monitoring to improve management.

The decision tree includes five key elements that must be considered when making a decision about monitoring in conservation.

  1. What are you hoping to achieve in your conservation endeavours? What is the goal of the program you are planning to implement? Essentially, this question asks managers to specify the objectives of their conservation program. Without an objective, they do not have a benchmark by which to evaluate their actions.

  2. Once the objective is defined, it’s crucial to ask about the threats to the system. By understanding the threats, managers can formulate a list of potential actions that may reduce their impact.

  3. Once a set of plausible management alternatives has been formulated, managers can begin to consider whether monitoring is necessary. At this point, they must consider the reasons for implementing monitoring. Is it to improve their knowledge of how the system works to modify future management decisions (adaptive management)? Or is it to guide implementation in terms of the state of the environment we are managing (state-dependent management)?

  4. Managers should ask the question, ‘do we know which management actions are the best to implement?’ If this is unclear, they will need to assess their ability to implement adaptive management. Do they have the money and time to be adaptive? Do they have suitable monitoring strategies to detect changes in the system?

  5. Apart from having an objective to improve outcomes, there are, of course, other reasons to monitor conservation actions. It might be a legal or audit requirement, or it might be for publicity. Managers should clearly identify their needs before implementing such monitoring.
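The logic of the five elements above can be sketched as a simple program. This is an illustrative sketch only – the question wording, return messages and field names are my own, not the published tree’s exact formulation:

```python
# Hypothetical sketch of the monitoring decision tree described above.
# All keys and messages are illustrative, not the published wording.

def monitoring_decision(program):
    # 1. Without an explicit objective there is no benchmark to evaluate against.
    if not program.get("objective"):
        return "Define the program objective before considering monitoring."

    # 2. Understanding threats yields the list of candidate actions.
    if not program.get("threats_understood"):
        return "Identify threats and formulate candidate management actions."

    # 3. Why monitor? To learn how the system works (adaptive management)
    #    or to track the state of the system (state-dependent management)?
    purpose = program.get("purpose")

    if purpose == "adaptive":
        # 4. If the best action is already known, learning-focused
        #    monitoring adds little.
        if program.get("best_action_known"):
            return "Best action known: invest in management, not learning."
        if program.get("budget_and_time") and program.get("can_detect_change"):
            return "Invest in adaptive-management monitoring."
        return "Adaptive management infeasible: act on current knowledge."

    if purpose == "state-dependent":
        return "Monitor system state to guide implementation of actions."

    # 5. Other drivers (legal, audit, publicity) need their own justification.
    return "Clarify the purpose of monitoring (e.g. legal or audit requirement)."
```

The point of the sketch is that ‘monitor’ is only one of several possible endpoints: several branches end with no monitoring at all.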

There has been a surge in research looking at the design of monitoring programs in recent years, as well as a growing number of calls for the establishment of long-term biodiversity monitoring.

At its heart, however, effective monitoring rests fundamentally on a clear justification for acquiring information in the first place. What we strive to know should be driven by what we need to know.

And, if we take a structured approach to decisions about monitoring, we find that the answer to whether we should monitor is not always yes. Sometimes, monitoring can be a waste of time and money.

Dr Eve McDonald-Madden is a Key Researcher with the National Environmental Research Program (NERP) Environmental Decisions Hub and the Centre for Excellence in Environmental Decisions (CEED), both funded by the Australian government. Dr McDonald-Madden is based with the Environmental Decisions Group at the University of Queensland, a NERP/CEED research node, and has also been a Visiting Scientist at CSIRO.





1 McDonald-Madden E, Baxter PWJ, Fuller RA, Martin TG, Game ET, Montambault J and Possingham HP (2010) Monitoring does not always count. Trends in Ecology and Evolution 25, 547–550.





Published: 27 February 2012

Fire channelling: predicting the unpredictable in bushfires

Jason Sharples

Despite nearly a century of scientific enquiry, there is still much we don’t understand about large fires in the landscape. This is highlighted by a recent study that identifies a previously undocumented form of bushfire propagation – one that can have a potentially catastrophic effect on fires in rugged terrain.

This ‘pyro-cumulonimbus’ (fire thunderstorm) cloud over a fire west of Canberra in January 2003 formed around 30 minutes after a major fire channelling event along the Goodradigbee River corridor.
Credit: Stephen Wilkes

The January 2003 Alpine fires in south-eastern Australia rivalled Australia’s notorious 1939 bushfires in severity and extent. On the afternoon of 18 January 2003, fires that had been burning in rugged terrain to the west of Canberra spread into the national capital with tragic consequences, including the loss of four lives. Then, on 26 January, the onset of unfavourable fire weather conditions escalated the fires in the Snowy Mountains – with ongoing ecological and cultural impacts.

The fires grew over 60 days, and eventually became one continuous burnt area from central Victoria to the north of Canberra. When the fires were finally declared contained and controlled on 7 March, a total area of 1.73 million hectares across two Australian states and a territory had been burnt. At least 551 houses had been destroyed and significant amounts of other property and livestock had been lost.

The Alpine fires of 2003 also stand out as some of the best-documented bushfires in Australia. As such, they provide a unique opportunity to better understand the physical processes driving the development of catastrophic bushfires.

The fires were documented by aerial and terrestrial photographs and data from video, satellite and airborne remote sensing platforms. A piloted aircraft fitted with remote-sensing recording equipment flew several missions over fire-affected regions. Multispectral line-scanning equipment mounted on the plane collected measurements of surface radiation through a sensor that swept back and forth as the aircraft flew over the fireground and its surrounds.

Once the data were processed, researchers could use the resulting images to distinguish regions of active flame from regions where flames were dying down, or regions that were smouldering. Some examples are shown in the first image below.

The images arising from the 2003 Alpine fires were examined by a team of scientists working on a collaborative research project conducted as part of the Bushfire Cooperative Research Centre’s HighFire Risk project; the findings were published in the International Journal of Wildland Fire. The analyses revealed instances of highly unusual bushfire propagation, which the team termed ‘fire channelling’ events.

Fires are normally expected to spread fastest upslope and in the direction of the wind. However, fire channelling involves intense and rapid spread across lee-facing slopes (i.e. slopes facing away from the wind, where fire behaviour would normally be expected to be milder), nearly perpendicular to the wind direction. It was impossible to explain the 2003 fire-spreading patterns in terms of known forms of bushfire propagation.

Twenty-three instances of fire channelling were identified, characterised by (see the first image below):

  1. rapid lateral propagation of the fire flank, including instances of lateral spot-fire development

  2. downwind extension of the active flaming zone for 2–5 kilometres

  3. a distinct ‘kink’ in the upwind edge of the fire perimeter

  4. distinctive darker smoke on the advancing flank of the fire (see the second image below).

Multispectral imagery showing fire channelling events west of Canberra on 18 January 2003. The left panel depicts the McIntyre’s Hut fire, while the right panel depicts the Broken Cart fire. Synoptic wind direction is indicated by a white arrow. The red arrows indicate the direction of lateral spread associated with fire channelling – note the angular kinks in the upwind parts of the fire perimeter. Regions of active flame are bright yellow.
Credit: Air Target Services & NSW Rural Fire Service

Further analysis of the images, in combination with wind and topographic data, revealed that fire channelling was most likely caused by a complex interaction between wind, terrain and an active fire. Three key preconditions were necessary:

  1. a steep lee-facing slope (incline greater than about 25°) with a topographic aspect within about 40° of the direction the wind is heading

  2. strong winds (more than about 25 kilometres per hour)

  3. a fire on the lee-facing slope.
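The three preconditions can be expressed as a simple predicate. The function name, parameters and exact threshold treatment below are illustrative assumptions; the thresholds themselves are the approximate values quoted above:

```python
# Hypothetical check of the three fire-channelling preconditions.
# Thresholds (~25 degrees slope, ~40 degrees aspect offset, ~25 km/h wind)
# are the approximate values reported in the study.

def channelling_preconditions_met(slope_deg, aspect_offset_deg,
                                  wind_kmh, fire_on_lee_slope):
    """aspect_offset_deg is the angle between the slope's aspect and the
    direction the wind is heading, in degrees."""
    steep_lee_slope = slope_deg > 25 and abs(aspect_offset_deg) < 40
    strong_wind = wind_kmh > 25
    return steep_lee_slope and strong_wind and fire_on_lee_slope
```

Because all three conditions must hold at once, a gentler slope, a misaligned aspect or weaker winds each rule the mechanism out on their own.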

Under these conditions, lee slope eddies form: strong winds that separate from the surface in the lee of a ridge and curl back on themselves to form rotor-like structures. In parts of the terrain prone to lee slope eddies, the winds mostly blow upslope, against the main wind direction. The rapid lateral spread associated with fire channelling is driven by the interaction between a fire burning on the lee slope and the separated flow that forms above the eddies.

The fire behaviour associated with the rapid lateral spread is intense and highly turbulent, increasing the production of embers. As the fire moves across the slope, the embers are incorporated into the main airflow, borne aloft and then deposited downwind. This leads to greater fire-spotting in the short-to-medium range.

Overall, fire channelling therefore produces a ‘swarm’ of embers that simultaneously widen and deepen the region of active flaming.

Fire channelling in the McIntyre’s Hut fire on 18 January 2003 (left panel) and the Tinderry Ranges (south-east of Canberra) on 10 January 2010. White arrows indicate the main wind direction and red arrows indicate the direction of lateral fire spread. Note the darker smoke on the advancing flank of the fire in each panel.
Credit: Stephen Wilkes & Steve Forbes

Fire channelling behaviour has since been confirmed through further research involving collaboration between the University of New South Wales, the ACT Emergency Services Agency, the University of Coimbra (Portugal) and the University of Canterbury (New Zealand).

Continuing research is providing new insights into other notable bushfires, such as the 2009 Black Saturday bushfires. Fire channelling has already been cited as playing a role in the development of the Bunyip and Kilmore East fires.

The research findings also raise questions about the effectiveness of established mitigation practices, including fuel reduction burning and vegetation removal around houses. A consequence of fire channelling is the mass production of embers, as occurred on Black Saturday and in the ember storm that descended on Canberra suburbs in January 2003. Fuel reduction burning and the preparedness of houses and other properties must be re-evaluated to minimise the impacts of ember attack.

The fire channelling phenomenon has clear implications for fire management. Fire channelling is a very efficient mechanism for spreading a fire across a landscape, and the intense and expansive fire behaviour associated with it has been directly linked to the formation of ‘fire thunderstorms’. These are the most catastrophic stage of fire development (see the first image above).

Fire channelling also holds implications for firefighter and community safety, and the public inquiry and education processes that inevitably follow catastrophic fires. The rapid escalation of a small fire due to fire channelling can compromise safety to an extent that is unpredictable using current operational bushfire behaviour models.

During the 2003 Canberra fires and the 2009 Victorian fires, nobody knew that a fire could behave in such a way. However, improvements to systems for providing advice to the community during extreme bushfire events may need to wait while the science progresses. Without a proper understanding of the physical processes driving large bushfires, there is a risk of disseminating information that will make the community less safe, not more.

Another outcome of the research has been demonstrating the importance of careful observation of bushfires, including the use of sophisticated remote sensing technology. Much of the knowledge arising from recent bushfires has depended on the forethought and planning of fire agencies and others, who have put these monitoring systems in place in the lead-up to the fire season.

Dr Jason Sharples is a lecturer with the School of Physical, Environmental and Mathematical Sciences, University of New South Wales (UNSW Canberra) and formerly a researcher with the Bushfire Cooperative Research Centre (CRC). The project discussed here was a Bushfire CRC-funded collaboration between researchers at UNSW Canberra, the ACT Emergency Services Agency and ACT Territory and Municipal Services.




