Robots will need to understand why they’re doing work


Researchers say that robots need to know the reason why they are doing a job if they are to effectively and safely work alongside people in the near future.

This thought-provoking article highlights some of the issues that future STEM technology is starting to raise. It could be an excellent discussion-starter for students in Years 4 and above, getting them thinking about the challenges of future technology and the ethical implications involved.

Word Count: 620

According to research, robots need to understand the motive behind an action, rather than just carrying it out blindly.

Researchers from England and Australia have argued that, in the future, robots will need to understand motive the way humans do, and not just perform tasks blindly.

Lead author Valerio Ortenzi, from the National Centre for Nuclear Robotics at the University of Birmingham, argues the shift in thinking will be necessary as economies embrace automation, connectivity and digitisation (‘Industry 4.0’) and levels of human-robot interaction, whether in factories or homes, increase dramatically.

Most factory-based machines are ‘dumb’

The paper, published in Nature Machine Intelligence, explores the issue of robots using objects. Ortenzi collaborated with researchers from Italy, Germany and the Queensland University of Technology on the research.

‘Grasping’ is an action perfected long ago in nature but one that represents the cutting edge of robotics research.

Most factory-based machines are ‘dumb’, blindly picking up familiar objects that appear in pre-determined places at just the right moment.

Getting a machine to pick up unfamiliar objects, randomly presented, requires the seamless interaction of multiple, complex technologies. These include vision systems and advanced AI, so the machine can see the target and determine its properties (for example, is it rigid or flexible?), and potentially sensors in the gripper, so the robot does not inadvertently crush an object it has been told to pick up.
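For readers who want to make that pipeline concrete, the sketch below is a minimal, illustrative Python version of such ‘blind’ grasp logic. The ObjectEstimate class, the force values and the gripper width are made-up stand-ins, not details from the paper; a real system would get these estimates from cameras, trained vision models and gripper sensors.

```python
from dataclasses import dataclass

# Hypothetical, simplified stand-ins for the perception and gripper systems
# described above. A real robot would estimate these from cameras, trained
# vision models and force sensors; here they are hand-written.

@dataclass
class ObjectEstimate:
    name: str
    rigid: bool       # estimated by the vision system: rigid or flexible?
    width_mm: float   # estimated width of the surface to be gripped

def choose_grip_force(obj: ObjectEstimate) -> float:
    """Pick a gripper force (newtons) from the estimated properties,
    so a flexible object is not inadvertently crushed."""
    return 40.0 if obj.rigid else 5.0

def plan_grasp(obj: ObjectEstimate, gripper_max_mm: float = 80.0) -> dict:
    """A 'blind' grasp plan: it only asks whether the object can be held,
    never why it is being picked up."""
    if obj.width_mm > gripper_max_mm:
        raise ValueError(f"{obj.name} is too wide for this gripper")
    return {"object": obj.name, "force_n": choose_grip_force(obj)}

if __name__ == "__main__":
    for item in (ObjectEstimate("mug", rigid=True, width_mm=70.0),
                 ObjectEstimate("sponge", rigid=False, width_mm=60.0)):
        print(plan_grasp(item))
```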

Even when all this is accomplished, researchers highlight a fundamental issue: what has traditionally counted as a ‘successful’ grasp for a robot might actually be a real-world failure, because the machine does not take into account what the goal is and why it is picking an object up.

Robots don’t know the consequences of some actions

The paper cites the example of a robot in a factory picking up an object for delivery to a customer. It successfully executes the task, holding the package securely without causing damage. Unfortunately, the robot’s gripper obscures a crucial barcode, which means the object can’t be tracked and the firm has no idea if the item has been picked up or not; the whole delivery system breaks down because the robot does not know the consequences of holding a box the wrong way.
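A tiny, hypothetical example helps show what ‘knowing the consequences’ could mean in code: below, candidate grasps on a parcel are filtered so that none of them covers the barcode. The grasp names, the coordinates and the idea of describing each grasp by the strip of surface it covers are invented for illustration; they are not the paper’s method.

```python
# Invented example: each candidate grasp is described by the strip of the
# parcel's surface (x range, in centimetres) that the gripper would cover.
BARCODE_REGION = (10.0, 16.0)   # the barcode sits at x = 10..16 cm

candidate_grasps = [
    {"id": "top_centre", "covered": (8.0, 18.0)},   # secure, but hides the barcode
    {"id": "left_edge",  "covered": (0.0, 6.0)},
    {"id": "right_edge", "covered": (20.0, 26.0)},
]

def overlaps(a: tuple, b: tuple) -> bool:
    """True if two 1-D intervals overlap."""
    return a[0] < b[1] and b[0] < a[1]

def task_aware(grasps: list, keep_clear: tuple) -> list:
    """Keep only the grasps that leave the task-critical region visible."""
    return [g for g in grasps if not overlaps(g["covered"], keep_clear)]

print(task_aware(candidate_grasps, BARCODE_REGION))
# 'top_centre' is rejected even though, mechanically, it is the best hold.
```

In this toy version, the delivery failure described above simply never happens, because the goal (keep the barcode scannable) is part of the grasp decision rather than an afterthought.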

Many factory-based machines are seen as ‘dumb’ due to their lack of flexibility with commands.

Ortenzi and his co-authors give other examples, involving robots working alongside people.

“Imagine asking a robot to pass you a screwdriver in a workshop. Based on current conventions, the best way for a robot to pick up the tool is by the handle. Unfortunately, that could mean that a hugely powerful machine then thrusts a potentially lethal blade towards you, at speed. Instead, the robot needs to know what the end goal is, i.e. to pass the screwdriver safely to its human colleague, in order to rethink its actions.

“Another scenario envisages a robot passing a glass of water to a resident in a care home. It must ensure that it doesn’t drop the glass but also that water doesn’t spill over the recipient during the act of passing, or that the glass is presented in such a way that the person can take hold of it.

Traditional programming methods aren’t good enough

“What is obvious to humans has to be programmed into a machine, and this requires a profoundly different approach. The traditional metrics used by researchers over the past 20 years to assess robotic manipulation are not sufficient. In the most practical sense, robots need a new philosophy to get a grip.”
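One way to read that last point, purely as an illustration, is as a scoring problem: rank candidate grasps on task suitability as well as mechanical stability, so the ‘secure but useless’ grasp no longer wins. The sketch below, using the screwdriver hand-over as the example, is an assumption about how such a score might look; the weights and numbers are made up and are not taken from the paper.

```python
# Illustrative scoring only: weight task suitability alongside raw stability,
# instead of ranking grasps on stability alone. Weights and values are made up.

def grasp_score(stability: float, task_suitability: float,
                w_task: float = 0.6) -> float:
    """Both inputs lie in [0, 1]; a grasp that defeats the task scores low
    even if it is mechanically very secure."""
    return (1.0 - w_task) * stability + w_task * task_suitability

candidates = {
    "grip the handle (blade points at the person)": {"stability": 0.95, "task": 0.10},
    "grip the blade (handle offered to the person)": {"stability": 0.70, "task": 0.95},
}

for name, c in sorted(candidates.items(),
                      key=lambda kv: grasp_score(kv[1]["stability"], kv[1]["task"]),
                      reverse=True):
    print(f"{grasp_score(c['stability'], c['task']):.2f}  {name}")
```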

QUT robotics researcher Peter Corke, director of the Australian Centre for Robotic Vision, which is headquartered at the university, says the ability of robots to physically interact with people, handing them things they want in a way that is comfortable and efficient, is a really important step forward.

“Future robots will be expected to work with us in a natural and human-like way,” Corke says.


Years: 4, 5, 6, 7, 8, 9, 10

Topics:

Biological Sciences – The Body, Living Things

Chemical Sciences – Chemical Reactions, Atoms

Physical Sciences – Forces, Energy

Additional: Careers, Technology, Engineering.

Concepts (South Australia):

Biological Sciences – Form and Function

Chemical Sciences – Properties of Matter

Physical Sciences – Forces and Motion, Energy
