A team from the Integrated Marine Observing System is using a series of floating wireless sensor networks to monitor atmospheric and aquatic conditions at the Great Barrier Reef to determine the cause of coral bleaching.

Bleaching occurs when coral, stressed by changes in temperature, light or nutrient availability, expels the algae living in its tissue, which in turn causes the coral to turn completely white.

The Great Barrier Reef (GBR) recently experienced a coral bleaching event the likes of which had never been recorded, and Scott Bainbridge, project manager of the GBR ocean observing system and architect of the wireless sensor networks currently deployed, is spearheading efforts to collect data about the conditions that caused the event.

“We set out to wire up seven reefs along the GBR, and the idea was to intensively study these reefs so we understand the processes at the smaller scale,” he told IoT Hub.

Bainbridge said that while oceanographic buoys provide data for larger scale processes, little is known about the effects of oceans on reefs, and vice versa.

“The connection between the oceans and the reefs is poorly understood, so we’re asking questions in terms of how the global ocean patterns impact individual reefs,” he said.

Each sensor network deployed in the reefs measures both underwater conditions and the weather above the surface. Weather transmitters mounted on poles measure barometric pressure, humidity, precipitation, temperature, and wind speed and direction.

On the floating sensor buoys, the connected water temperature sensor holds the most importance to the researchers, as “temperature is a good proxy for just about everything else.”

“We also measure salinity, turbidity, chlorophyll levels, light and oxygen levels in the water,” Bainbridge added.
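Taken together, each network's above-water and in-water measurements amount to a single observation record. A hypothetical schema for one such record might look like the following; the field names and units are illustrative only, not the project's actual data model.

```python
# Hypothetical record combining the surface and subsurface measurements
# described above. Field names and units are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class ReefObservation:
    timestamp: datetime
    # Weather (above the surface)
    pressure_hpa: float
    humidity_pct: float
    precipitation_mm: float
    air_temp_c: float
    wind_speed_ms: float
    wind_dir_deg: float
    # Water (below the surface)
    water_temp_c: float      # "a good proxy for just about everything else"
    salinity_psu: float
    turbidity_ntu: float
    chlorophyll_ugl: float
    light_par: float
    oxygen_mgl: float

# One synthetic observation:
obs = ReefObservation(datetime(2016, 3, 1, 12, 0), 1013.2, 75.0, 0.0,
                      29.5, 6.1, 120.0, 29.2, 35.1, 1.2, 0.4, 800.0, 6.8)
print(obs.water_temp_c)  # 29.2
```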

Combining old and new technology

Bainbridge said that technology was not advanced enough at the start of the monitoring project in 2007, so the systems they had available to them at the time had to be re-engineered to suit their purposes.

“We used a set of technologies that we’ve been using for our remote weather stations and adapted those to a sensor network design for the seven reefs [we monitor],” he said.

“When we started this project, the big thing for us was reliability, because these sensor platforms are 70 kilometres out to sea, and cost us thousands of dollars to fix things.

“We looked around at what platforms were available, and we were really keen to try and get as much intelligence as possible on the platform, but there was very little that was available and ready for our level of deployment.”

This forced the team to reprogram their dataloggers themselves to provide the intelligence they needed, according to Bainbridge.

“One of the things we initially struggled with was implementing the sensor web enablement series of protocols, because when they became available, the research team was really keen that it was the way we were going to transmit data,” he said.

“The oceanographic dataloggers we were using only possessed serial comms, so we provided the loggers with a library of SensorML (Sensor Model Language) stubs.

“When the logger detected that a new sensor was attached, it interrogated the sensor, determined its type and pushed the SensorML stub back to the datastore, which created a metadata record for that new sensor, essentially creating a plug-and-play architecture that we bolted on to the existing systems.”
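The plug-and-play flow Bainbridge describes could be sketched as follows. Every name here (the stub table, the `Datastore` class, the interrogation callback) is a hypothetical stand-in; the real loggers speak serial comms, not Python.

```python
# Minimal sketch of the described flow: a new sensor is detected, the logger
# interrogates it for its type, and the matching SensorML stub is pushed to
# the datastore, which creates a metadata record. All names are illustrative.

SENSORML_STUBS = {
    "temperature": "<sml:PhysicalComponent><!-- temperature stub --></sml:PhysicalComponent>",
    "salinity": "<sml:PhysicalComponent><!-- salinity stub --></sml:PhysicalComponent>",
}

class Datastore:
    """Stand-in for the remote datastore that holds sensor metadata records."""
    def __init__(self):
        self.metadata = {}

    def register(self, sensor_id, stub_xml):
        # Creating the metadata record makes the new sensor known to the system.
        self.metadata[sensor_id] = stub_xml

def on_sensor_attached(sensor_id, interrogate, datastore):
    """Called when the logger detects a newly attached sensor."""
    sensor_type = interrogate(sensor_id)       # e.g. read an ID over serial
    stub = SENSORML_STUBS.get(sensor_type)
    if stub is None:
        raise ValueError(f"no SensorML stub for sensor type {sensor_type!r}")
    datastore.register(sensor_id, stub)

# Usage with a faked serial interrogation:
ds = Datastore()
on_sensor_attached("buoy3-t1", lambda sid: "temperature", ds)
print(sorted(ds.metadata))  # ['buoy3-t1']
```

The point of the stub table is that adding a new sensor type to the fleet only requires shipping a new stub, not rewriting the logger firmware.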

Bainbridge also said that the 900MHz radio band was used early on to transmit the data, and that 2.4GHz and 5GHz wifi links have only been deployed for connectivity in recent years.

“We still use the 900MHz frequency as a backdoor radio system, so if the wifi systems go down, we can restart them,” he added.
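The failover role of the 900MHz link could be illustrated like this. The `Link` class and the restart command are hypothetical stand-ins under the assumption that the backdoor radio only needs enough bandwidth to issue a restart, not to carry the data itself.

```python
# Illustrative failover logic: prefer the wifi link, and when it is down,
# use the 900MHz "backdoor" radio to restart it before retrying.
# The Link class is a hypothetical stand-in, not the project's real code.

class Link:
    def __init__(self, name, up=True):
        self.name, self.up = name, up

    def send(self, msg):
        if not self.up:
            raise ConnectionError(f"{self.name} link is down")
        return f"{self.name} sent: {msg}"

def transmit_with_fallback(wifi, radio_900, payload):
    """Prefer wifi; on failure, restart it via the 900MHz backdoor and retry."""
    try:
        return wifi.send(payload)
    except ConnectionError:
        radio_900.send("RESTART_WIFI")  # low-bandwidth restart command
        wifi.up = True                  # assume the restart brought wifi back
        return wifi.send(payload)

# Wifi starts down; the payload still gets through after the radio restart:
wifi, radio = Link("wifi", up=False), Link("900MHz")
print(transmit_with_fallback(wifi, radio, "temp=29.2C"))  # wifi sent: temp=29.2C
```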

Bainbridge said that the research team is hoping to integrate some emerging technologies to provide a richer data picture for scientists.

“We’re really interested in sensors that can do things like genomics, which enable us to pick up chemicals in the water and gene types,” he said.

“We’re also working on ideas to directly measure coral health and to perform water sampling to measure contaminants.”

Collected data painted a bleak picture

The general consensus among the scientific community is that climate events such as El Niño, combined with global climate change, have driven the worldwide increase in coral bleaching by warming the world’s oceans, and the data collected by Bainbridge’s team supported that assertion.

“For most of the period we’ve been recording data, water temperatures have actually been cooler than normal, but what we have been able to do with ten years of data is build a climatology, which gives us the average temperature for each time of year,” Bainbridge explained.

“We know what the average weather conditions are and we can look at things which are specifically different. We can now say what a normal weather pattern is for a given time of year and what an abnormal one is, and this year we’ve seen some very high temperatures.

“Our current sensor deployment means we now have real-time data of temperatures on the reefs which we use in conjunction with the historical data to contextualise it, and we can quantify how unusually warm it has been and give our scientists valuable insight.”
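The climatology approach Bainbridge outlines can be sketched in a few lines: average many years of readings by calendar day to establish a "normal", then express a new real-time reading as an anomaly against that baseline. The data below is synthetic and purely for illustration.

```python
# Sketch of the climatology idea: long-term per-day averages give a baseline,
# and a new reading is judged by its departure from that baseline.

from collections import defaultdict
from datetime import date

def build_climatology(records):
    """records: iterable of (date, temperature_C) pairs spanning many years."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for d, temp in records:
        key = (d.month, d.day)  # calendar day, robust across leap years
        sums[key] += temp
        counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}

def anomaly(climatology, d, temp):
    """How far a reading sits above or below the long-term mean for that day."""
    return temp - climatology[(d.month, d.day)]

# Two years of readings for 1 March, then a warm event in a third year:
history = [(date(2014, 3, 1), 27.0), (date(2015, 3, 1), 27.4)]
clim = build_climatology(history)
print(anomaly(clim, date(2016, 3, 1), 29.2))  # ~2.0 degrees above normal
```

This is how "unusually warm" becomes a quantifiable statement: the same 29.2°C reading is unremarkable in one season and an anomaly in another.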

Bainbridge said that the data indicated that the northern areas of the GBR were severely impacted, with the southern areas escaping the effects.

“What we observed was that weather patterns came through which cooled the southern parts of the reef but not the northern areas, where the majority of the bleaching has occurred,” he said.

Natural and technological challenges

Bainbridge said that the difficulty in determining the causes of declining reef health stems from the myriad factors that are impacting them.

“At the moment, the natural issues are probably larger than those caused by man, such as the crown-of-thorns starfish population and the coral bleaching, and the general degradation of the reef due to layered stresses, more than one particular issue,” he explained.

The volume of data collected is also presenting challenges, with over 150 million observations collected by the study, according to Bainbridge.

“Most of the data we’ve collected doesn’t have any interesting components to it, as it’s just background data. The challenge is being able to use data analytics to be able to pull out patterns of interest,” he said.

“We’re struggling with how to transition to a Google Analytics-type platform that allows us to really do the analytics, in terms of identifying patterns that we don’t see or know about at the moment.

“We’re really struggling to make sense of the data – we’ve turned the tap on, but that’s just caused other problems, and collecting data we never get to look at is pretty pointless.”
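One simple illustration of pulling "patterns of interest" out of mostly-background data is to flag readings that sit far from the long-run mean. This z-score-style filter is only a sketch of the filtering idea, not the team's actual analytics pipeline.

```python
# Flag readings more than k standard deviations from the mean: a crude but
# concrete example of separating interesting events from background data.

from statistics import mean, stdev

def flag_outliers(values, k=3.0):
    """Return indices of values more than k standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) > k * sigma]

readings = [27.1, 27.0, 27.2, 27.1, 31.5, 27.0, 27.1]  # one warm spike
print(flag_outliers(readings, k=2.0))  # [4]
```

With 150 million observations, even filters this simple reduce the haystack; the harder problem Bainbridge describes is finding patterns no one has thought to look for.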

Bainbridge said that the current drive is for contextualised data, and while some progress is being made in certain areas, he has yet to see an analytics platform that perfectly suits his purposes.

“We’re still focused on collection of data when we need to be focused on information and the knowledge we can impart,” he added.

Bainbridge hopes to reach a point where the real-time data can be used to create dynamic models, and where the sensor data can be combined with other data sources to provide information to other industries, such as shipping or search and rescue.