Driving in adverse conditions such as heavy rain, fog, or poor light can put even the most experienced driver to the test. As autonomous vehicles become a reality, we need to be sure that on-board navigation systems are up to these challenges, which can arise at any time and with little warning.
Water hazard detection, in particular, is crucial for autonomous navigation of outdoor robots. Traditional methods for detecting water hazards include stereo matching, texture-colour classification and polarisation difference. While these work well with clean images, they have a limited effective range. In Centre project RV2 (Novel visual sensing and hybrid hardware for robotic operation in adverse conditions and for difficult objects), Centre Research Fellow Chuong Nguyen and his team extended these methods to work in challenging wet and rainy conditions where existing methods fail.
The team proposed a new stereo matching method for wet, muddy ground, along with extended u- and v-disparities to better detect the ground plane. They then combined classification with temporal detection to improve the accuracy of detecting water puddles in both on- and off-road conditions.
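To give a feel for the u-/v-disparity idea mentioned above, here is a minimal sketch: from a stereo disparity map, a v-disparity image histograms disparities per row (a flat ground plane appears as a slanted line), while a u-disparity image histograms them per column (obstacles appear as horizontal segments). The function names and the simple weighted least-squares line fit below are illustrative only, not the team's actual implementation, which the article does not detail.

```python
import numpy as np

def uv_disparity(disparity, max_d=64):
    """Build u- and v-disparity histograms from an integer disparity map.

    v-disparity: one disparity histogram per image row; the ground
    plane projects to a slanted line in this image.
    u-disparity: one histogram per image column; vertical obstacles
    show up as horizontal segments.
    """
    h, w = disparity.shape
    d = np.clip(disparity.astype(int), 0, max_d - 1)
    v_disp = np.zeros((h, max_d), dtype=np.int32)
    u_disp = np.zeros((max_d, w), dtype=np.int32)
    for row in range(h):
        v_disp[row] = np.bincount(d[row], minlength=max_d)
    for col in range(w):
        u_disp[:, col] = np.bincount(d[:, col], minlength=max_d)
    return u_disp, v_disp

def fit_ground_line(v_disp):
    """Fit the dominant line d = a*row + b in the v-disparity image,
    weighting each (row, disparity) cell by its histogram count.
    A stand-in for the Hough/RANSAC fits commonly used in practice."""
    rows, ds = np.nonzero(v_disp)
    sw = np.sqrt(v_disp[rows, ds].astype(float))  # sqrt-weights for lstsq
    A = np.stack([rows, np.ones_like(rows)], axis=1).astype(float)
    coeffs, *_ = np.linalg.lstsq(A * sw[:, None], ds * sw, rcond=None)
    return coeffs  # slope a, intercept b
```

Pixels whose disparity falls well below the fitted ground line are candidate reflective (water-like) regions, which is where a classifier and temporal filtering would then take over.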
The team validated their approach against other detection methods, and released their video and image data sets for independent studies. The resulting system can detect water hazards at distances of up to 100 m.