
Published: 23 April 2012

Fish as friends not food in sharks’ social networks

Carrie Bengston and Grace Chiu

‘Fish are friends, not food,’ says Bruce the Shark in the movie Finding Nemo, during a group-therapy-style meeting for marine predators. But could sharks and fish really be friends? In a statistical way, they could.

Researchers are using social network analysis to quantify the level of dependency of sharks on different fish and vice versa.
Credit: Terry Goss/Marine Photobank

CSIRO researcher Dr Grace Chiu and Dr Anton Westveld from the University of Arizona in the US have described how techniques commonly used to study who-tweets-who can be used to understand who-eats-who in the marine food chain. They have published their findings in the journal Proceedings of the National Academy of Sciences (PNAS).1

‘Most high-school biology students know about food chains,’ says environmental statistician Dr Chiu.

‘They’re a way of illustrating feeding or “trophic” relationships between organisms in an ecosystem. An example might be algae is eaten by fish, fish is eaten by shark. The names are linked by arrows to show the direction and dependencies of the individual species in the relationship.

‘Food webs are a more complex, but more realistic, way of representing these relationships. They may show that a particular species of shark eats, say, five different species of fish and seals, and sometimes the odd shark too.

‘They may also show the seal eats many of the same fish species that its predator, the shark, also eats. Thus, the food web illustrates an intricate network of connections, in the way that people’s social connections could be represented as a network with clusters or nodes around the most connected individuals.’
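The arrow-and-node picture Dr Chiu describes maps naturally onto a directed graph. A minimal sketch, with species and links invented for illustration rather than taken from the paper:

```python
# A food web as a directed graph: an arrow (edge) from prey to predator
# means "is eaten by". Species and links here are illustrative only.
food_web = {
    ("algae", "fish"),
    ("fish", "shark"),
    ("fish", "seal"),
    ("seal", "shark"),
    ("shark", "shark"),   # the "odd shark" eating other sharks
}

def prey_of(predator):
    """Everything the given predator eats, read off the incoming arrows."""
    return {prey for prey, pred in food_web if pred == predator}

print(prey_of("shark"))   # fish, seal and other sharks
```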

Dr Chiu recognised that food webs are similar to social networks, as both are represented by nodes and dependencies. She wondered whether the statistical techniques used to study the one could be used for the other. Fellow statistician, Dr Westveld, who specialises in social network modelling, provided the link.

A social network analysis looks at how individual people are connected in social relationships. Market researchers, election campaign planners and other analysts study them to find out who the opinion leaders and key influencers are; how many camps or tribes there are around a particular issue; who leads and who follows; and even who-trades-with-who.

Social network analysis has been around longer than social media tools like Facebook and Twitter. But the advent of these tools, in parallel with advances in statistical network models, has made it easier for researchers to gain quantitative insight into the interactions within such networks.

In this collaboration, Dr Westveld knew how to analyse social networks using statistical modelling. Dr Chiu, on the other hand, saw an opportunity to use the same techniques to analyse food webs. Her idea was to statistically analyse food web structures to learn about feeding activity and preferences.

By working together, the two researchers developed new data-analysis tools to achieve Dr Chiu’s objectives. They found they could quantify predators’ preferences for different prey, and estimate how likely roles were to switch – the predator becoming prey.

For example, in one of the ecosystems they studied, the statistical model identified that the diving bell spider liked to ‘go to lunch’ with the great diving beetle – the two predators showed significantly similar preferences when it came to food.

A pair of diving bell spiders: in the analysis, they like to ‘go to lunch’ with great diving beetles.
Credit: Baupi/Wikimedia Commons
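The paper itself fits a statistical social-network model, but the idea of two predators ‘going to lunch’ together – having overlapping dietary preferences – can be illustrated with a far simpler measure, the Jaccard similarity of their prey sets. The species lists below are made up for the example:

```python
# Simplified illustration (not the paper's model): comparing two predators'
# diets with Jaccard similarity. Prey sets are hypothetical.

def jaccard(a, b):
    """Overlap between two prey sets: |A intersect B| / |A union B|."""
    return len(a & b) / len(a | b)

diet = {
    "diving bell spider": {"water flea", "mosquito larva", "mayfly nymph"},
    "great diving beetle": {"water flea", "mosquito larva", "tadpole"},
}

similarity = jaccard(diet["diving bell spider"], diet["great diving beetle"])
print(f"Diet overlap: {similarity:.2f}")   # → Diet overlap: 0.50
```

Two shared prey out of four distinct species gives an overlap of 0.5; the published analysis uses a richer model that also accounts for uncertainty, but the underlying question – how similar are two predators’ menus? – is the same.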

At the other end of the spectrum, cannibalism, a regular event in nature, is not part of the current model but could be incorporated if desired, according to Dr Chiu.

The researchers were surprised that their technique worked so well for studying ecosystems – the first time this kind of social network modelling has been applied to ecological data. As Dr Chiu points out, even in classical social network analysis, much of the research is qualitative rather than quantitative.

While the ecosystem case studies in the PNAS paper were well-known ones from the Caribbean, South Africa, UK and US, Dr Chiu says the technique could be applied to Australian and other ecosystems.

Which means in theory that a flounder could ‘friend’ a cod; a lorikeet could ‘link in’ a magpie; and a dingo could ‘follow’ a wallaby!



1 Chiu, GS & AH Westveld (2011). A unifying approach for food webs, phylogeny, social networks, and statistics. PNAS 108(38), 15881–15886.





Published: 2 April 2012

Computer power stacks up for flood mitigation

Carrie Bengston

The best tools for mitigating the effects of floods like those recently splashed across our TV screens may not be levees or sandbags, but computers.

CSIRO’s computational fluid modelling expertise has enabled Chinese authorities to visualise what would happen if one of their largest dams – Geheyan – were to fail, sending 3.12 billion cubic metres of water crashing onto the town below. The colours denote different floodwater velocities.
Credit: CSIRO

Wee Waa, Moree and Wagga Wagga – towns that to many people have previously been just dots on maps – recently made headlines, for all the wrong reasons. TV news footage showed these towns deluged with murky water from rivers swollen by record downpours. Residents, emergency services and local mayors could only assess the damage and do the best they could as they waited for damaging flood waters to recede.

While floods like this will always occur, it is possible for agencies and communities to prepare and respond more effectively. Computer power is the key: it can model fluids such as flood waters incredibly accurately. Data about specific landscapes and regions can be combined with mathematical equations of how fluids behave and move, helping emergency managers, town planners and even insurance companies be prepared for future floods.

The data deluge in sciences such as environmental modelling is every bit as awesome as the real-life deluges experienced recently in NSW. Resource managers and planners are beginning to take notice of the power of computational fluid modelling for understanding and analysing vast amounts of environmental data, and for predicting changes due to floods. Computer modelling power is based on both the power of computers themselves and the power of the algorithms (computer processing steps) that run on computers.

Twice each year, the world’s fastest supercomputers are ranked in the ‘Top500 list’. A standard test called the Linpack benchmark compares computers' speeds and energy consumption. Computer owners such as universities and government data centres, technology companies such as Intel, and supercomputer geeks all eagerly await the latest list.

In November 2011, for the first time, the number one computer on the list – Japan’s ‘K computer’ – clocked in at more than 10 petaflops, doing more than 10 quadrillion calculations per second.1

Less than three years ago, these speeds were unimaginable. Every ten years, supercomputers speed up about 1000 times. (This acceleration in processing power eventually makes its way to our desktops, mobile phones and other devices.)
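The prefix arithmetic behind that figure is easy to check: ‘peta’ denotes 10¹⁵, so the K computer’s 10 petaflop/s is 10 × 10¹⁵ operations per second – and a quadrillion is also 10¹⁵:

```python
# 'peta' = 10**15; a (short-scale) quadrillion is also 10**15
PETA = 10**15
k_computer_speed = 10 * PETA            # flop/s, from the Top500 result
quadrillion = 10**15
print(k_computer_speed // quadrillion)  # → 10
```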

The head of CSIRO’s computational and simulation sciences team, Dr John Taylor, leads teams of researchers with expertise in statistics, mathematics, information and communication technologies and other areas of science. The teams analyse large datasets from huge sensor networks such as radio telescopes, large experiments such as those using the synchrotron, and high-throughput DNA analysis systems.
Credit: CSIRO

CSIRO’s greenest supercomputer – a relatively new type of supercomputer called a graphics processing unit (GPU) cluster – has made the Top500 several times since its launch in November 2009. It ranked 212 in the November 2011 list.

Located in Canberra, it’s one of the world’s fastest and least energy-hungry supercomputers. Intriguingly, the GPUs at its heart started out as graphics rendering hardware for computer games. So, it’s no surprise that the cluster – now a workhorse for many scientists in CSIRO – can produce informative and stunning animations as it rapidly crunches enormous numbers of numbers.

‘In recent years, the huge increase in computer power and speed, along with advances in algorithm development, have allowed mathematical modellers like us to make big strides in our research,’ says Mahesh Prakash of CSIRO's computational modelling team, led by Dr Paul Cleary.

‘Now, we can model millions, even billions of fluid particles,’ says Dr Prakash. ‘That means we can predict quite accurately the effects of natural and man-made fluid flows like tsunamis, dam breaks, floods, mudslides, coastal inundation and storm surges.’

Dr Mahesh Prakash is one of a team of computational modellers at CSIRO who’ve clocked up several decades of work on fluid computer models and algorithms, including rendering to create ‘real life’ 3D wave and flood effects.
Credit: CSIRO

A dam break, for example, is essentially a human-made flood. Like a flood caused by excessive rainfall, a dam break can be modelled on a computer.

The models create colourful and detailed animations that show how rapidly the water moves and where it goes: where it ‘overtops’ hills and how quickly it reaches towns or infrastructure such as power stations. This information can help town planners plan structures such as levees and help emergency services respond more efficiently.
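The article doesn’t name CSIRO’s method (the team is known for particle-based fluid techniques), but the flavour of dam-break modelling can be sketched with the classic one-dimensional shallow-water dam-break problem, advanced here with a simple Lax–Friedrichs finite-volume scheme. All numbers are illustrative:

```python
# A minimal sketch of dam-break modelling (not CSIRO's production code):
# 1-D shallow-water equations stepped forward with a Lax-Friedrichs scheme.
import numpy as np

g = 9.81                       # gravity (m/s^2)
nx, L = 400, 100.0             # grid cells, domain length (m)
dx = L / nx
x = (np.arange(nx) + 0.5) * dx

# Initial condition: deep water behind a dam at x = 50 m, shallow ahead of it
h = np.where(x < 50.0, 4.0, 1.0)   # water depth (m)
hu = np.zeros(nx)                  # momentum (depth * velocity)

t, t_end = 0.0, 2.0
while t < t_end:
    u = hu / h
    c = np.abs(u) + np.sqrt(g * h)           # local wave speeds
    dt = min(0.4 * dx / c.max(), t_end - t)  # CFL-limited time step

    # Fluxes for mass and momentum conservation
    f1 = hu
    f2 = hu * u + 0.5 * g * h**2

    # Lax-Friedrichs update on interior cells (waves never reach the edges here)
    h_new, hu_new = h.copy(), hu.copy()
    h_new[1:-1] = 0.5 * (h[2:] + h[:-2]) - 0.5 * dt / dx * (f1[2:] - f1[:-2])
    hu_new[1:-1] = 0.5 * (hu[2:] + hu[:-2]) - 0.5 * dt / dx * (f2[2:] - f2[:-2])
    h, hu = h_new, hu_new
    t += dt

# The initial step has smeared into a rarefaction moving upstream and a bore
# moving downstream; the depth between them lies between the two initial levels.
print(f"depth range after {t_end} s: {h.min():.2f} .. {h.max():.2f} m")
```

Real flood models add the second horizontal dimension, real terrain data and far more sophisticated numerics, but the structure – conservation laws stepped forward in time under a stability limit – is the same.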

CSIRO’s dam break models have been validated using historical data from the St Francis Dam break, which occurred in California in 1928 and killed more than 400 people. Dr Prakash and his team have used the validated modelling techniques for a range of ‘what-if’ scenarios for other dams.

Working with the Chinese Academy of Surveying and Mapping, the CSIRO team simulated the hypothetical collapse of the massive Geheyan Dam: one of the world's biggest. CSIRO combined their unique modelling techniques with digital terrain models (3-D maps of the landscape) to obtain a realistic picture of how a real-life disaster might unfold.

Realistic animations help flood mitigation and emergency response groups to better manage disasters.
Credit: CSIRO

These evidence-based fluid-modelling tools can also help decision makers manage dam operations during excessive rainfall, for example, allowing them to determine when to undertake controlled water releases and how much water to release.

The future of computer modelling of floods and other natural disasters can only improve as computers and algorithms become more powerful. CSIRO's own supercomputer arsenal will be given a boost when its GPU cluster is upgraded this year. The tender was won by Xenon Systems of Melbourne and the upgrade is currently taking place.

The leader of CSIRO’s computational and simulation sciences team, Dr John Taylor, says the upgrade will open up even more possibilities.

‘We're anticipating a significant boost in computational performance and greater compatibility with the next generation of accelerator cards, all achieved using less energy per calculation,’ says Dr Taylor.

Flood modellers, regional planners and emergency managers – watch this space!

View a clip on computational fluid modelling for disaster management here.


1 In supercomputing, flops – or more accurately, flop/s, for floating-point operations per second – is a measure of a computer’s performance, especially in fields of scientific calculation that rely on floating-point arithmetic. The prefix ‘peta’ denotes 10¹⁵, so one petaflop/s is 1 000 000 000 000 000 floating-point operations per second.



