Robots need to know the reason why they are doing a job if they are to work effectively and safely alongside people in the near future. In simple terms, this means machines need to understand motive the way humans do, rather than just performing tasks blindly, without context.
According to a new article from the National Centre for Nuclear Robotics, based at the University of Birmingham, this could herald a profound change for the world of robotics, but a necessary one.
Lead author Dr Valerio Ortenzi, from the University of Birmingham, argues the shift in thinking will be necessary as economies embrace automation, connectivity and digitisation (‘Industry 4.0’) and levels of human–robot interaction, whether in factories or homes, increase dramatically.
The paper, published in Nature Machine Intelligence, as the cover article of the August edition, explores the issue of robots using objects. ‘Grasping’ is an action perfected long ago in nature but one which represents the cutting-edge of robotics research.
Most factory-based machines are ‘dumb’, blindly picking up familiar objects that appear in pre-determined places at just the right moment. Getting a machine to pick up unfamiliar objects, randomly presented, requires the seamless interaction of multiple, complex technologies. These include vision systems and advanced AI, so the machine can see the target and determine its properties (for example, is it rigid or flexible?), and potentially sensors in the gripper, so the robot does not inadvertently crush an object it has been told to pick up.
Even when all this is accomplished, researchers in the National Centre for Nuclear Robotics highlighted a fundamental issue: what has traditionally counted as a ‘successful’ grasp for a robot might actually be a real-world failure, because the machine does not take into account what the goal is and why it is picking an object up.
The Nature Machine Intelligence paper cites the example of a robot in a factory picking up an object for delivery to a customer. It successfully executes the task, holding the package securely without causing damage. Unfortunately, the robot’s gripper obscures a crucial barcode, which means the object can’t be tracked and the firm has no idea if the item has been picked up or not; the whole delivery system breaks down because the robot does not know the consequences of holding a box the wrong way.
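The barcode example boils down to a difference in how grasp ‘success’ is scored. The following is a minimal, purely illustrative sketch (not code from the paper, and all grasp names and scores are hypothetical): a robot that optimises stability alone picks the grasp that hides the barcode, while one that also respects the task constraint accepts a slightly less stable grasp that keeps the barcode visible.

```python
# Illustrative sketch: stability-only vs. task-aware grasp selection.
# Candidate grasps, scores, and the "occludes_barcode" flag are all
# hypothetical values invented for this example.

def best_grasp(candidates, task_aware=True):
    """Return the highest-stability grasp; when task_aware is True,
    first discard grasps that violate the task constraint (here:
    covering the package's barcode)."""
    feasible = [
        g for g in candidates
        if not (task_aware and g["occludes_barcode"])
    ] or candidates  # fall back if every candidate violates the constraint
    return max(feasible, key=lambda g: g["stability"])

grasps = [
    {"name": "top",  "stability": 0.95, "occludes_barcode": True},
    {"name": "side", "stability": 0.80, "occludes_barcode": False},
]

print(best_grasp(grasps, task_aware=False)["name"])  # top  (barcode hidden)
print(best_grasp(grasps, task_aware=True)["name"])   # side (barcode visible)
```

The point is not the code itself but the metric: a grasp that maximises stability can still be a real-world failure if the scoring function knows nothing about why the object is being picked up.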
Dr Ortenzi and his co-authors give other examples, involving robots working alongside people.
“Imagine asking a robot to pass you a screwdriver in a workshop. Based on current conventions the best way for a robot to pick up the tool is by the handle. Unfortunately, that could mean that a hugely powerful machine then thrusts a potentially lethal blade towards you, at speed. Instead, the robot needs to know what the end goal is, i.e., to pass the screwdriver safely to its human colleague, in order to rethink its actions.
“Another scenario envisages a robot passing a glass of water to a resident in a care home. It must ensure that it doesn’t drop the glass but also that water doesn’t spill over the recipient during the act of passing, or that the glass is presented in such a way that the person can take hold of it.
“What is obvious to humans has to be programmed into a machine, and this requires a profoundly different approach. The traditional metrics used by researchers over the past 20 years to assess robotic manipulation are not sufficient. In the most practical sense, robots need a new philosophy to get a grip.”
Australian Centre for Robotic Vision Director Peter Corke said the ability of robots to interact physically with people, handing them things they want in a way that is comfortable and efficient, is a really important step forward.
“Future robots will be expected to work with us in a natural and human-like way,” Distinguished Professor Corke said.
Professor Rustam Stolkin, NCNR Director, said, “National Centre for Nuclear Robotics is unique in working on practical problems with industry, while simultaneously generating the highest calibre of cutting-edge academic research – exemplified by this landmark paper.”
Centre Director Peter Corke, who founded the QUT Robot Academy, will present a talk at QUT’s Robotronica on August 18 on the potential directions of robotic research and design and what that might mean for humans and society.
Staged at QUT’s Gardens Point campus, Robotronica is a free, all-day, all-ages robotics and technology festival. View the program at qut.edu.au/robotronica
Shelley Thomas, Communications Specialist
Australian Centre for Robotic Vision
P: +61 7 3138 4265 | M: +61 416 377 444 | E: firstname.lastname@example.org
About The Australian Centre for Robotic Vision
The Australian Centre for Robotic Vision is an ARC Centre of Excellence, funded for $25.6 million over seven years to form the largest collaborative group of its kind generating internationally impactful science and new technologies that will transform important Australian industries and provide solutions to some of the hard challenges facing Australia and the globe. Formed in 2014, the Australian Centre for Robotic Vision is the world’s first research centre specialising in robotic vision. They are a group of researchers on a mission to develop new robotic vision technologies to expand the capabilities of robots. Their work will give robots the ability to see and understand for the sustainable well-being of people and the environments we live in. The Australian Centre for Robotic Vision has assembled an interdisciplinary research team from four leading Australian research universities: QUT, The University of Adelaide (UoA), The Australian National University (ANU), and Monash University as well as CSIRO’s Data61 and overseas universities and research organisations including the French national research institute for digital sciences (INRIA), Georgia Institute of Technology, Imperial College London, the Swiss Federal Institute of Technology Zurich (ETH Zurich), University of Toronto, and the University of Oxford.
Australian Centre for Robotic Vision
2 George Street Brisbane, 4001
+61 7 3138 7549