Integrating different levels of automation: Lessons from winning the Amazon Robotics Challenge 2016

Carlos Hernandez Corbato, Mukunda Bharatheesha, Jeff van Egmond, J. Ju, Martijn Wisse

Research output: Contribution to journal › Article › Scientific › peer-review

31 Citations (Scopus)
213 Downloads (Pure)

Abstract

This article describes Team Delft's robot, winner of the Amazon Robotics Challenge 2016. The competition involves automating pick-and-place operations in semi-structured environments, specifically the shelves in an Amazon warehouse.
Team Delft's entry demonstrated that current robot technology can already address most of the challenges in product handling (object recognition, grasping, motion, and task planning) under broad yet bounded conditions. The system combines an industrial robot arm, 3D cameras, and a custom gripper. The robot's software is based on the Robot Operating System, which is used to implement solutions based on deep learning and other state-of-the-art artificial intelligence techniques and to integrate them with off-the-shelf components.
From the experience of developing the robotic system, the following conclusions were drawn: 1) the specific task conditions should guide the selection of the solution for each required capability, 2) understanding the characteristics of the individual solutions and the assumptions they embed is critical to integrating them into a performing system, and 3) this characterization can be based on 'levels of robot automation'. This paper proposes automation levels based on whether information is used at design time or at runtime to drive the robot's behaviour, and uses them to discuss Team Delft's design solution and the lessons learned from this robot development experience.
Original language: English
Pages (from-to): 4916-4926
Journal: IEEE Transactions on Industrial Informatics
Volume: 14
Issue number: 11
DOIs
Publication status: Published - 2018
