

ENVIRONMENT – Where We Live

Overview


The environment in which we live is critical to our wellbeing. It comprises both the natural environment and the built environment.

The natural environment sustains us with fundamental necessities such as drinkable water and breathable air, but it also provides beauty and wonder that nourish the human spirit. The natural environment is vast, and today we can afford to inspect only a tiny fraction of it, and only infrequently, relying on extrapolation to estimate its overall condition; local variations in time and space can be overlooked. Robotic vision technology has the potential to reduce the cost of inspection, allowing more of the world to be inspected, more frequent inspections, or both. Ultimately, intelligent vision-capable robots could carry out environmental remediation as well as inspection.

The built environment, our cities, roads, dams, tunnels, pipelines and electrical distribution networks, represents a massive investment made over generations that must be maintained, enhanced and passed on to future generations. Keeping these assets in good condition requires continuous inspection, and traditional methods are labour intensive and therefore expensive. Intelligent vision-capable robots have the potential to reduce the cost of inspection, perhaps even allowing more frequent inspection than could be imagined today. Ultimately, robotic vision technology could carry out repair as well as inspection.

Projects


CRCSI project with Ergon Energy


2016

Jason Ford, Peter Corke, Feras Dayoub, Steven Martin

Maintaining a network of hundreds of thousands of miles of power lines, and the poles that support them, is a daunting task. That is the challenge facing Queensland energy giant Ergon, and Centre researchers are now part of the effort to help. They are involved in a new CRCSI and QUT collaboration with Ergon to develop a vision and sensing platform for UAVs that fly over power poles and capture images, looking for problems such as rot in a pole’s cross-arms and other structural issues.

At present the work is done by UAV operators, who must keep the airborne vehicles within their line of sight. It is hoped this project will make their job much easier by using on-board sensors to create a virtual ‘force field’ that keeps the UAVs from getting too close to the poles or wires.

Centre Associate Investigator Jason Ford is among those involved with the project, which is an example of the type of research being done to tackle infrastructure problems. “This project was made possible by the Centre’s existence and provides an example of the type of work the Centre can do,” Jason said. Other Centre members involved with the project include Centre Director Peter Corke, Research Fellow Feras Dayoub and PhD student Steven Martin. Other researchers involved include Andrew Keir, from QUT’s Institute for Future Environments, and QUT Research Fellow Aaron McFadyen. In 2016, Ergon merged with Energex to form a new company, Energy Queensland.
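As a rough illustration of the ‘virtual force field’ idea described above, the Python sketch below checks whether a commanded motion would take a UAV inside a keep-out radius around sensed pole or wire points. It is a minimal, hypothetical sketch: the sensor interface, safety radius and look-ahead window are assumptions for illustration, not details of the CRCSI/QUT/Ergon system.

# Minimal sketch of a keep-out check around power infrastructure.
# Assumptions (illustrative only): on-board sensing yields 3D points on nearby
# poles/wires in the UAV's body frame (UAV at the origin), and any commanded
# velocity that would bring the UAV inside a safety radius is vetoed.
import numpy as np

SAFETY_RADIUS_M = 5.0   # assumed keep-out distance
LOOKAHEAD_S = 1.0       # how far ahead to project the commanded motion

def violates_keep_out(obstacle_points, commanded_velocity):
    """Return True if flying commanded_velocity for LOOKAHEAD_S seconds would
    bring the UAV within SAFETY_RADIUS_M of any sensed obstacle point."""
    points = np.asarray(obstacle_points, dtype=float)                    # shape (N, 3)
    predicted_offset = np.asarray(commanded_velocity, dtype=float) * LOOKAHEAD_S
    distances = np.linalg.norm(points - predicted_offset, axis=1)
    return bool(np.any(distances < SAFETY_RADIUS_M))

# Example: a pole point 6 m ahead; flying forward at 2 m/s would close to
# about 4 m within the look-ahead window, so the command is vetoed.
print(violates_keep_out([[6.0, 0.0, 0.0]], [2.0, 0.0, 0.0]))             # True

In a real system a check of this kind would sit between the operator’s (or planner’s) velocity command and the flight controller, replacing a vetoed command with a safe alternative such as hovering.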

j2.ford@qut.edu.au

Helping maintain and service Australia’s Solar Industry


2016

Fatih Porikli

Our colleagues from the Australian National University (ANU) were successful in the 2016 Australian Renewable Energy Agency (ARENA) grant outcomes. The project, titled “A Robotic Vision System for Automatic Inspection and Evaluation of Solar Plant Infrastructure”, received direct funding of $876,000 from ARENA, with a total project value of $3 million including in-kind contributions from Fotowatio Renewable Ventures (FRV), Vast Solar, Data61, and 4D Surveying. The project was initiated by Centre Associate Investigator Fatih Porikli (ANU) and Chief Investigator Robert Mahony (ANU) in collaboration with Dr Evan Franklin (ANU) and Dr Joe Coventry (ANU), who bring renewables-specific subject expertise to the project.

“This project is the first in the world where advanced robotic vision technology will be deployed to provide deep and comprehensive diagnostic information to solar plant owners,” Rob Mahony said. Rob says the rapidly expanding solar industry in Australia will lead to millions of square metres of photovoltaic panels and solar concentrating mirrors that must be maintained and serviced. “Robotic vision has a key role to play in autonomous inspection to identify and monitor faults and performance issues such as soiling levels, allowing renewable infrastructure to be cheaply and efficiently maintained at peak efficiency,” Rob said.

Rob also says the project draws on three key research themes:
• Robust vision – to deal with high-exposure outdoor environments.
• Visual learning – to visually identify faults and soiling levels (see the sketch below).
• Vision and action – to control the aerial drone.

Others involved in the project include Dr Naveed Akhtar (ANU), the lead postdoctoral fellow, along with Dr Salman Khan (Data61) and Mr Ehab Salahat (ANU), a Centre-affiliated PhD student working on the project.
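The ‘soiling levels’ mentioned above can be illustrated with a very simple image heuristic. The Python sketch below scores soiling as the relative drop in mean brightness of a segmented panel image against a clean reference panel; it is a hypothetical stand-in for illustration only, not the ANU project’s method, and the function names and inputs are assumptions.

# Crude soiling score for a photovoltaic panel image (illustrative only).
# Assumes the panel has already been segmented out of the aerial frame and
# converted to grayscale, and that a clean reference panel image is available.
import numpy as np

def soiling_score(panel_gray, clean_reference_gray):
    """Return a rough soiling score in [0, 1]: 0 ~ clean, 1 ~ heavily soiled."""
    panel = np.asarray(panel_gray, dtype=float)
    reference = np.asarray(clean_reference_gray, dtype=float)
    drop = 1.0 - panel.mean() / max(reference.mean(), 1e-6)   # relative brightness loss
    return float(min(1.0, max(0.0, drop)))

# Example with synthetic images: a dimmed (dusty) panel scores higher
# than the clean reference.
clean = np.full((64, 64), 200.0)
dirty = clean * 0.6
print(soiling_score(clean, clean), soiling_score(dirty, clean))   # 0.0 0.4

In practice the project’s visual-learning theme points to learned classifiers rather than a fixed brightness heuristic, but the shape of the problem is the same: panel image in, soiling estimate out.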

robert.mahony@anu.edu.au

RangerBot, the robo reef protector


2017 -

Matthew Dunbabin, Feras Dayoub

Centre researchers Chief Investigator Matt Dunbabin and Research Fellow Feras Dayoub won the People’s Choice Award in the Google Impact Challenge Australia in 2016. The award is worth $750,000. Their project with the Great Barrier Reef Foundation will create a low-cost ‘robo reef protector’. The foundation says the team will build on the researchers’ successful COTSbot platform, which was designed to tackle one of the greatest threats to the reef, the crown-of-thorns starfish (COTS). The COTSbot identifies a crown-of-thorns starfish and injects it with a solution to kill it.

“To be recognised in this way is pretty awesome,” said Dr Dunbabin. “We learnt a lot from COTSbot, what works, what doesn’t work. What we learnt will be brought into the new design.”

The team now wants to create the RangerBot, a low-cost, more versatile version of the COTSbot. It will do that by shrinking the COTSbot, adding a suite of vision-based sensors and developing a range of attachments to tackle various monitoring and management activities along the Great Barrier Reef. “This is a fantastic opportunity that opens the door to building more robots that will help protect the Reef,” said Dr Dayoub. “This motivates us to repay the support of the people who voted for us by working very hard towards building a great robot, the RangerBot.”

The Google Impact Challenge Australia was created to help not-for-profit organisations develop technologies that can help tackle the world’s biggest social challenges, and the health of the Great Barrier Reef is certainly an important challenge facing Australia. “We wouldn’t be here without the support of the Great Barrier Reef Foundation, an organisation truly dedicated to reef conservation,” said Dr Dunbabin. The funding will also allow the team to drive down the cost of building the robot, making it affordable for communities.

timothy.macuga@qut.edu.au

The Crown-of-Thorns Starfish (COTS) robot (COTSbot)


2015 -

Matthew Dunbabin, Feras Dayoub

It's one of the biggest threats to an Australian national treasure, the Great Barrier Reef. But now a revolutionary advance in robotic environmental monitoring and management could help in the fight against the Crown-of-Thorns Starfish, or COTS. The Crown-of-Thorns Starfish robot, the COTSbot, is the world's first robotic submersible that will seek out COTS and inject them with a toxin. The goal is to control the numbers of the starfish, which are responsible for an estimated 40 per cent of the reef's total decline in coral cover.

Looking like a mini yellow submarine, the COTSbot can move up and down through the Great Barrier Reef with the help of five thrusters, GPS and pitch-and-roll sensors. The unique aspect of the COTSbot is that it can think for itself. It uses a visual recognition system, made possible by the Centre's robotic vision algorithms, to identify crown-of-thorns starfish in the visually challenging environment of the Great Barrier Reef. The robot is designed to cruise about a metre above the coral surface, looking for COTS. When it sees one, a robotic arm extends to inject the starfish with vinegar, proven by James Cook University to be effective in controlling COTS numbers within 24 hours of application.

The COTSbot was designed by Dr Matt Dunbabin, a QUT Research Fellow and Associate Investigator with our Centre. He said the robot was intended as a first-responder system to bolster the existing program that uses divers to hunt for COTS. "Human divers are doing an incredible job of eradicating this starfish from targeted sites, but there just aren't enough divers to cover all the COTS hotspots across the Great Barrier Reef," said Matt. "Imagine how much ground the programs could cover with a fleet of 10 or 100 COTSbots at their disposal, robots that can work day and night in any weather condition."

QUT roboticists spent months developing and training the robots to recognise COTS among coral, using still images of the reef and videos taken by COTS-eradicating divers. Dr Feras Dayoub, a QUT Research Fellow specialising in robotic vision, trained the COTSbot to pick out the pest from other sea life. "The system has seen thousands of images of COTS and not COTS, and now it's able to detect and decide which one is COTS and which one is not," Feras told the ABC.

The story about the COTSbot first aired on ABC Television on 31 August, with BBC World News and NBC picking up the story. Check out the ABC story here: http://tinyurl.com/zt5hnhx
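The "COTS versus not COTS" decision Feras describes is, at heart, binary image classification. The Python sketch below illustrates that train-then-decide loop with a plain scikit-learn logistic regression over flattened grayscale patches; it is a hypothetical stand-in, not the Centre's actual detector, and the patch size, training data and function names are assumptions.

# Illustrative "COTS vs not-COTS" binary classifier (not the COTSbot detector).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training set: 200 labelled 32x32 grayscale patches,
# label 1 = contains COTS, label 0 = does not.
patches = rng.random((200, 32 * 32))
labels = rng.integers(0, 2, size=200)

detector = LogisticRegression(max_iter=1000).fit(patches, labels)

def is_cots(patch_gray):
    """Decide whether a single 32x32 grayscale patch contains a COTS."""
    flat = np.asarray(patch_gray, dtype=float).reshape(1, -1)
    return bool(detector.predict(flat)[0])

# On the robot, a positive detection would trigger the injection arm.
print(is_cots(rng.random((32, 32))))

The real system reportedly learned from thousands of reef images and diver video, where a deep visual recognition model is the natural choice; the sketch only shows where the detection decision sits in the robot's sense-decide-act loop.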

matt.dunbabin@roboticvision.org
