Breakthroughs in machine learning, natural language processing and robotic manipulation have helped develop a new collaborative robot that offers human maintenance workers ‘a second pair of hands’.
After five years of development, the EU Horizon 2020 SecondHands consortium has successfully developed a potentially revolutionary robotic platform, the ARMAR-6.
Launched in 2015, the consortium’s mission was to develop a collaborative robot (cobot) assistant to make industrial environments safer by proactively offering useful support and assistance to humans in maintenance tasks.
In essence, the goal was to allow people to concentrate on the ‘skilled’ part of a task while a robot takes responsibility for the heavy lifting and support, such as holding, lifting, reaching or passing objects.
H2020 consortium partners:
- Ecole Polytechnique Fédérale de Lausanne (EPFL) – Switzerland
- Karlsruhe Institute of Technology (KIT) – Germany
- Ocado Technology – UK
- Sapienza, University of Rome – Italy
- University College London (UCL) – UK
The underlying robot platform, the ARMAR-6, was developed at KIT specifically for the project requirements with respect to human-robot interaction.
The ARMAR-6 was tested in the real-world environments of Ocado’s highly automated warehouse.
By providing the engineering expertise to deal with large, complex distributed software systems, Ocado Technology enabled the academic partners to focus on the scientific and technical challenges rather than on the process of “gluing” their solutions together.
UCL’s Professor Lourdes Agapito commented: “It was really valuable to be able to test in the Ocado warehouse as a real-world environment. The dynamic, constantly changing conditions tested our vision, robotics and language processing algorithms to go beyond the current state-of-the-art.”
Industry-academia collaboration
The project brought together industry and academia, engaging world-leading experts from a range of robotics and scientific disciplines.
The consortium partners have developed best-of-breed technologies across robotic perception, robotic communication, and human-robot interaction, and after five years of collaboration, the resulting innovations far exceed what each partner could have accomplished alone.
Professor Lourdes Agapito noted: “Scientists in different areas of AI often work in isolation, so designing vision algorithms for ARMAR-6 has given us the chance for close collaboration with top roboticists, and has pushed us out of our ‘comfort zone’ towards increasing the robustness and performance of our algorithms.”
Grasping and manipulation
The humanoid ARMAR-6 robot combines advanced sensorimotor skills with learning and reasoning abilities, enabling it to infer when a human needs help and proactively offer the most appropriate assistance.
The robot can reportedly recognise human activities and intentions, reason about situations, and interact with humans in a natural way. It can also grasp and manipulate objects bimanually to accurately and safely use tools such as power drills and hammers.
Key breakthroughs made by the consortium members in this space include:
- a functional control architecture for integration of sensorimotor skills, learning and reasoning abilities
- novel methods for grasping tools combining visual and haptic sensing with model-based and data-driven machine learning approaches
- learning techniques ranging from explorative learning to teaching or coaching by humans
Proactive collaboration
ARMAR-6’s AI capabilities allow it to act in situations that aren’t foreseen at programming time, making it capable of autonomously performing maintenance tasks in industrial facilities.
It can also recognise its collaboration partners’ need for help and offer assistance. The consortium expects the ability to teach machines to see in 3D, without requiring fully supervised 3D training data, to revolutionise the field of machine learning.
Key breakthroughs made by the consortium members in this space include:
- new learning algorithms for gathering knowledge, from activity and action sequence recognition to tool segmentation and context classification
- reactive motion planning which enables the robot to react to human/environment interaction forces
- new algorithms for dynamic 3D scene understanding that can build geometric and semantic 3D maps of the environment even when objects move independently from the camera.
Natural language
With cobots, trust and adoption are key; what you build needs to fit into the natural ways people work, which includes being able to respond on human timescales and in ways that are meaningful to humans.
This means that developments in natural language are crucial to usability.
Key breakthroughs made by the consortium members in this space include the creation of a human-robot speech interface based solely on all-neural models: automatic speech recognition, dialogue modelling and speech synthesis.
Dr Graham Deacon, robotics research fellow at Ocado Technology explained: “Humanoid robots are key for improving flexibility and safety in industrial contexts in a way that is genuinely useful.
“The same technologies which enable the ARMAR-6 to communicate and interact with humans, like natural language comprehension, soft manipulation and 3D spatial awareness, also mean the robot could be developed further to help in other situations, like in helping to reduce contamination, or in assisted living.”