
Published: 10 April 2012

Arguing the case for carbon capture and storage

Peter J Cook

While many nations agree on the need for deep cuts in carbon dioxide emissions to the atmosphere, there is no agreement on how, or by when, this will be achieved. A range of targets, timetables and strategies has been proposed, but in the absence of a binding international agreement on greenhouse gas emissions, progress is slow and global emissions continue to rise.

A bulk carrier entering Port Hunter, Newcastle, New South Wales: Newcastle is one of the world's largest coal export ports, with coal representing 90 per cent of cargo tonnage. The International Energy Agency predicts fossil fuels will be used for years to come, despite the high carbon emissions.
Credit: Nick Pitsas

Governments do, however, agree that there is no single answer to the problem and that a portfolio of responses will be necessary. These include increased energy efficiency, more sequestration of carbon in vegetation and soils, greater use of renewable energy, fuel switching from coal to gas, nuclear power and carbon capture and storage (CCS).

Each of these measures will play its part in cutting emissions. They will be driven by a range of initiatives and policies, including pricing carbon, mandatory renewable energy targets (MRETs), subsidies and improved technologies. Perhaps the most tangible impact so far has come through MRETs, which have driven a major increase in wind power and a more modest, but nonetheless significant, growth in solar photovoltaics in Australia and globally.

However, this growth in renewable energy has been dwarfed by the increase in fossil fuel use (especially coal) to meet increased electricity demand. Further, according to a 2011 report by the Productivity Commission, the cost of mitigation through subsidies to renewables has been staggeringly high for quite modest savings in CO2 emissions. 1

The International Energy Agency (IEA) predicts that fossil fuel use is likely to continue for many years to come. On this basis, it concludes that 19 per cent of the global emissions mitigation effort will need to be met through CCS. This is because coal and natural gas will continue to be widely used – not only for electricity production, but also for processes such as iron and steel production, cement or fertiliser manufacture, which have little scope for replacement by renewables. As long as we continue to use fossil fuels, we will need to deploy CCS to mitigate their impact.

For most people, CCS is an unfamiliar – and therefore questionable – technology. But the economy and living standards of Australia and many other major countries depend on fossil fuels or nuclear power to provide plentiful and affordable electricity. For the moment, Australia has no plans for nuclear power; the nation generates almost 80 per cent of its power from coal and receives tens of billions of dollars a year from coal and liquefied natural gas exports.

We are highly dependent on fossil fuels. It seems unrealistic to assume that this picture will change in the short, or even medium, term. This is why Australia and other countries, such as the United States, Canada, China and the United Kingdom, will need to deploy CCS. Public support for decreasing CO2 emissions through CCS and other ‘clean’ technologies will rely on public understanding of the various technology options. 2 Of all these technologies, CCS is perhaps the least understood.

CCS begins with the separation and capture of CO2 from a major stationary source, such as a power station or a gas processing plant. This CO2 is then compressed to a dense liquid and transported by pipeline to a site with geology suitable for long-term storage. The dense liquid CO2 is injected to a depth of 1 kilometre or more below the surface, where it will remain permanently trapped in porous and permeable rocks.
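To get a sense of the volumes involved, here is a rough back-of-envelope calculation in Python. The plant size, CO2 density, porosity and storage-efficiency figures below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope estimate of the pore space needed to store one year of
# CO2 from a large coal-fired power station. All numbers are illustrative
# assumptions for this sketch, not figures from the article.

annual_co2_tonnes = 4_000_000    # assumed: ~4 Mt CO2/yr for a large plant
co2_density_kg_m3 = 650          # assumed: dense CO2 at >1 km depth
porosity = 0.20                  # assumed: 20% of the rock volume is pore space
storage_efficiency = 0.05        # assumed: only ~5% of pore space is usable

co2_volume_m3 = annual_co2_tonnes * 1000 / co2_density_kg_m3
rock_volume_km3 = co2_volume_m3 / (porosity * storage_efficiency) / 1e9

print(f"CO2 volume at depth: {co2_volume_m3 / 1e6:.1f} million m^3 per year")
print(f"Bulk rock volume needed: {rock_volume_km3:.2f} km^3 per year")
```

On these assumptions, a year of emissions from a large plant occupies only a few million cubic metres once compressed – well under a cubic kilometre of reservoir rock – which helps explain why basin-scale storage assessments find such large capacities.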

Post-combustion carbon capture technology: in a traditional coal-fired power station less than 40 per cent of the energy in coal is converted to electricity. Flue gases emitted by the power station normally contain around 10–20 per cent CO2. Post-combustion capture technology enables the capture of up to 95 per cent of CO2 created during energy production.
Credit: CSIRO

Critics argue that CO2 storage is unproven. The reality is that each year, tens of millions of tonnes of CO2 are transported and injected into deep rocks as part of petroleum operations. So, we know a great deal about the behaviour of CO2 in pipelines and in rocks. We also know how to monitor stored CO2 to ensure it does not leak or pose a hazard. For example, the CO2CRC’s highly successful Otway Project has demonstrated the safe and secure geological storage of tens of thousands of tonnes of CO2 under Australian conditions. 3

What has not yet been done is to deploy CCS on a large scale at a coal or gas-fired power station. This gap in technology testing needs to be addressed.

Is there enough storage space for captured CO2? The Intergovernmental Panel on Climate Change concluded that, globally, there is enough storage space to meet all likely needs for the next century. This conclusion has been supported by subsequent national-scale studies in the United States, the European Union and Australia.

CCS is no more inherently ‘risky’ than many industrial activities that we take for granted, such as making industrial chemicals, or transporting natural gas in a pipeline. Yes, risks may arise if a storage site is inadequately studied, or a well or capture plant is incorrectly operated, but these risks can be managed.

Perhaps the biggest risk associated with CCS is the risk of not deploying it in time to achieve the necessary deep cuts in emissions. A large percentage of today’s coal-fired power stations will still be emitting CO2 to the atmosphere in 20, 30, and even 40 years’ time. Therefore, retrofitting CCS systems to existing power stations must be part of the global emissions reduction strategy.

What about the cost of CCS? Won’t it result in more expensive electricity, and isn’t it more expensive than solar or wind power? Cleaner electricity provided through CCS (or solar or wind) will be more expensive than electricity from conventional coal-fired power stations, but current and projected costs suggest that CCS will be more cost-effective than the alternatives.

Putting a price on carbon in the range proposed by Australia, the European Union or any other country is not sufficient to accelerate the deployment of CCS or any other clean energy technology.

For the moment, the uptake of any technology – whether CCS, wind or solar – is far more likely to result from a mandatory target than from a low carbon price. In the long term, a necessarily high price on carbon will encourage technology deployment. In the meantime, other measures that encourage a range of clean energy technologies will be necessary.

The mix will vary from country to country. Some countries may offer great scope for hydroelectric power or biomass. For Australia, CCS, solar thermal and geothermal power perhaps have the greatest potential for providing reliable base load power. In the future, the need for nuclear power may also become evident.

Let us continue to develop a picture of the world as we would like it to be, where all our energy requirements are met from sustainable renewable energy; but, let us also plan for the world as it is likely to be for some time to come – a world in which fossil fuels will continue to be used. It is important to Australia, environmentally and economically, to effectively address that reality through a scalable technology such as CCS.

Peter Cook, Professorial Fellow at the University of Melbourne and until recently Chief Executive of the CRC for Greenhouse Gas Technologies, is an internationally recognised research leader in the areas of energy, resource, environmental and greenhouse science. In 1998, he initiated Australia’s first research program into carbon capture and storage as a CO2 mitigation option. He is the author of many articles in reviewed journals and of several books, including a new title from CSIRO Publishing, Clean Energy, Climate and Carbon.


1 See http://www.pc.gov.au/projects/study/carbon-prices
2 ‘Clean’ is used here in the sense that these technologies produce fewer emissions than conventional energy sources.
3 See http://www.co2crc.com.au/otway/





Published: 2 April 2012

Computer power stacks up for flood mitigation

Carrie Bengston

The best tools for mitigating the effects of floods like those recently splashed across our TV screens may not be levees or sandbags, but computers.

CSIRO’s computational fluid modelling expertise has enabled Chinese authorities to visualise what would happen if one of their largest dams – Geheyan – were to fail, sending 3.12 billion cubic metres of water crashing onto the town below. The colours denote different floodwater velocities.
Credit: CSIRO

Wee Waa, Moree and Wagga Wagga – towns that to many people have previously been just dots on maps – recently made headlines, for all the wrong reasons. TV news footage showed these towns deluged with murky water from rivers swollen by record downpours. Residents, emergency services and local mayors could only assess the damage and do their best as they waited for the flood waters to recede.

While floods like this will always occur, agencies and communities can prepare and respond more effectively. Computer power is the key: it can model fluids such as flood waters with remarkable accuracy. Data about specific landscapes and regions can be combined with mathematical equations describing how fluids behave and move, helping emergency managers, town planners and even insurance companies prepare for future floods.
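The article doesn’t spell out the mathematics, but a classic starting point for flood modelling is the shallow-water equations, which track water depth and momentum over a grid. The minimal one-dimensional ‘dam break’ solver below is a hedged sketch using a simple Lax-Friedrichs scheme; it is not CSIRO’s code, and the article does not name CSIRO’s actual methods:

```python
import numpy as np

# Minimal 1D shallow-water "dam break" using a Lax-Friedrichs finite-volume
# scheme. Illustrative only: the article does not say what scheme CSIRO uses.

g = 9.81                       # gravity (m/s^2)
nx, dx = 400, 1.0              # number of cells, cell width (m)
h = np.where(np.arange(nx) < nx // 2, 2.0, 0.5)  # depth: 2 m behind the dam, 0.5 m ahead
hu = np.zeros(nx)              # momentum per unit width (depth * velocity)

def flux(h, hu):
    """Mass and momentum fluxes of the shallow-water equations."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

t, t_end = 0.0, 20.0
while t < t_end:
    dt = 0.4 * dx / np.max(np.abs(hu / h) + np.sqrt(g * h))  # CFL-limited step
    U, F = np.array([h, hu]), flux(h, hu)
    U[:, 1:-1] = 0.5 * (U[:, 2:] + U[:, :-2]) - 0.5 * (dt / dx) * (F[:, 2:] - F[:, :-2])
    h, hu = U[0], U[1]
    h[0], h[-1], hu[0], hu[-1] = h[1], h[-2], hu[1], hu[-2]  # open boundaries
    t += dt

front = np.argmax(h < 0.6)     # rough position of the advancing flood front
print(f"after {t_end:.0f} s the front is near x = {front * dx:.0f} m (dam at x = {nx // 2 * dx:.0f} m)")
```

Real flood models extend this idea to two dimensions over digital terrain, adding bed friction, wetting and drying of cells, and far finer grids – which is where supercomputing power becomes essential.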

The data deluge in sciences such as environmental modelling is every bit as awesome as the real-life deluges experienced recently in NSW. Resource managers and planners are beginning to take notice of the power of computational fluid modelling for understanding and analysing vast amounts of environmental data, and for predicting changes due to floods. Computer modelling power is based on both the power of computers themselves and the power of the algorithms (computer processing steps) that run on computers.

Twice each year, the world’s fastest supercomputers are ranked in the ‘Top500 list’. A standard test called the Linpack benchmark compares computers' speeds and energy consumption. Computer owners such as universities and government data centres, technology companies such as Intel, and supercomputer geeks all eagerly await the latest list.

In November 2011, for the first time, the number one computer on the list – Japan’s ‘K computer’ – clocked in at more than 10 petaflops, doing more than 10 quadrillion calculations per second. 1

Less than three years ago, such speeds were unimaginable: supercomputer performance has increased roughly 1000-fold every ten years. (This acceleration in processing power eventually makes its way to our desktops, mobile phones and other devices.)
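Some simple arithmetic shows what such speeds buy in practice. The workload below – grid size, operations per cell, number of time steps – is invented purely for illustration:

```python
# Rough arithmetic on what 10 petaflop/s means in practice. The workload
# figures below are hypothetical, chosen only to illustrate the scale.

k_computer = 10e15        # ~10 petaflop/s (Top500, November 2011)
desktop = 10e9            # ~10 gigaflop/s, a typical desktop of the era

# hypothetical flood run: 1 billion cells x 100 ops/cell x 100,000 steps
operations = 1e9 * 100 * 1e5

print(f"K computer: {operations / k_computer:.0f} second(s)")
print(f"Desktop:    {operations / desktop / 86400:.0f} days")
```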

The head of CSIRO’s computational and simulation sciences team, Dr John Taylor, leads teams of researchers with expertise in statistics, mathematics, information and communication technologies and other areas of science. The teams analyse large datasets from huge sensor networks such as radio telescopes, large experiments such as those using the synchrotron, and high-throughput DNA analysis systems.
Credit: CSIRO

CSIRO’s greenest supercomputer – a relatively new type of supercomputer called a graphics processing unit (GPU) cluster – has made the Top500 several times since its launch in November 2009, ranking 212th in the November 2011 list.

Located in Canberra, it’s one of the world’s fastest and least energy-hungry supercomputers. Intriguingly, the GPUs at its heart started out as graphics rendering hardware for computer games. So, it’s no surprise that the cluster – now a workhorse for many scientists in CSIRO – can produce informative and stunning animations as it rapidly crunches enormous numbers of numbers.

‘In recent years, the huge increase in computer power and speed, along with advances in algorithm development, have allowed mathematical modellers like us to make big strides in our research,’ says Mahesh Prakash of CSIRO's computational modelling team, led by Dr Paul Cleary.

‘Now, we can model millions, even billions of fluid particles,’ says Dr Prakash. ‘That means we can predict quite accurately the effects of natural and man-made fluid flows like tsunamis, dam breaks, floods, mudslides, coastal inundation and storm surges.’
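The article doesn’t name the numerical technique, but one widely used family of particle-based fluid methods is smoothed particle hydrodynamics (SPH). The sketch below shows only SPH’s characteristic density-estimation step – each particle’s density is a kernel-weighted sum over its neighbours – and is illustrative rather than CSIRO’s code:

```python
import numpy as np

# Schematic of the density-estimation step at the heart of particle methods
# such as smoothed particle hydrodynamics (SPH). Illustrative only; a real
# simulation adds pressure forces, time stepping and neighbour search.

def cubic_spline_kernel(r, h):
    """Standard 2D cubic-spline smoothing kernel (support radius 2h)."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)          # 2D normalisation constant
    return sigma * np.where(q < 1.0, 1 - 1.5 * q**2 + 0.75 * q**3,
                   np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

rng = np.random.default_rng(0)
n = 1000
pos = rng.uniform(0.0, 1.0, size=(n, 2))         # particles in a unit box
mass, h = 1.0 / n, 0.05                          # total mass 1, smoothing length

# density at each particle: all-pairs here for clarity; real codes use
# spatial neighbour lists so the cost stays roughly proportional to n
r = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
rho = (mass * cubic_spline_kernel(r, h)).sum(axis=1)

print(f"mean density: {rho.mean():.2f} (near 1.0 in the box interior)")
```

A full simulation derives pressure and viscous forces from these density estimates and moves the particles each time step; with spatial neighbour lists in place of the all-pairs distance matrix used here, the approach scales to the millions or billions of particles Dr Prakash describes.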

Dr Mahesh Prakash is one of a team of computational modellers at CSIRO who’ve clocked up several decades of work on fluid computer models and algorithms, including rendering to create ‘real life’ 3D wave and flood effects.
Credit: CSIRO

A dam break, for example, is essentially a human-made flood and, like a flood caused by excessive rainfall, it can be modelled on a computer.

The models create colourful and detailed animations that show how rapidly the water moves and where it goes: where it ‘overtops’ hills and how quickly it reaches towns or infrastructure such as power stations. This information can help town planners plan structures such as levees and help emergency services respond more efficiently.

CSIRO’s dam break models have been validated using historical data from the St Francis Dam break, which occurred in California in 1928 and killed more than 400 people. Dr Prakash and his team have used the validated modelling techniques for a range of ‘what-if’ scenarios for other dams.

Working with the Chinese Academy of Surveying and Mapping, the CSIRO team simulated the hypothetical collapse of the massive Geheyan Dam: one of the world's biggest. CSIRO combined their unique modelling techniques with digital terrain models (3-D maps of the landscape) to obtain a realistic picture of how a real-life disaster might unfold.

Realistic animations help flood mitigation and emergency response groups to better manage disasters.
Credit: CSIRO

These evidence-based fluid-modelling tools can also help decision makers manage dam operations during excessive rainfall, for example, allowing them to determine when to undertake controlled water releases and how much water to release.

Computer modelling of floods and other natural disasters can only improve as computers and algorithms become more powerful. CSIRO’s own supercomputing arsenal will be given a boost when its GPU cluster is upgraded this year; the tender was won by Xenon Systems of Melbourne and the upgrade is under way.

The leader of CSIRO’s computational and simulation sciences team, Dr John Taylor, says the upgrade will open up even more possibilities.

‘We're anticipating a significant boost in computational performance and greater compatibility with the next generation of accelerator cards, all achieved using less energy per calculation,’ says Dr Taylor.

Flood modellers, regional planners and emergency managers – watch this space!



1 In supercomputing, flops – or more accurately, flop/s, for floating-point operations per second – is a measure of a computer’s performance, especially in fields of scientific computing that rely on floating-point calculations. The prefix ‘peta’ denotes 10¹⁵, so one petaflop/s is 1 000 000 000 000 000 floating-point operations per second.



