Mixture of Attractors: A novel movement primitive representation for learning motor skills from demonstrations

Simon Manschitz*, Michael Gienger, Jens Kober, Jan Peters

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

4 Citations (Scopus)
2 Downloads (Pure)

Abstract

In this letter, we introduce Mixture of Attractors, a novel movement primitive representation that allows for learning complex object-relative movements. The movement primitive representation inherently supports multiple coordinate frames, enabling the system to generalize a skill to unseen object positions and orientations. In contrast to most other approaches, a skill is learned by solving a convex optimization problem. Therefore, the quality of the skill does not depend on a good initial estimate of parameters. The resulting movements are automatically smooth and can be of arbitrary shape. The approach is evaluated and compared to other movement primitive representations on data from the Omniglot handwriting dataset and on real demonstrations of a handwriting task. The evaluations show that the presented approach outperforms other state-of-the-art concepts in terms of generalization capabilities and accuracy.
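The abstract describes movements generated by a weighted mixture of attractors, with the mixture weights learned by solving a convex optimization problem. The following is a minimal illustrative sketch of that general idea, not the authors' formulation: a demonstrated 2-D trajectory is modeled as a point mass pulled toward a few fixed attractor points by spring-damper dynamics, and the time-varying weights are fit per time step with regularized least squares (a convex problem). The gains, attractor placement, and regularizer are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Demonstration: a 2-D trajectory with finite-difference velocities/accelerations.
T, dt = 100, 0.01
t = np.linspace(0, 1, T)
y = np.stack([np.sin(2 * np.pi * t), t], axis=1)        # (T, 2) positions
ydot = np.gradient(y, dt, axis=0)
yddot = np.gradient(ydot, dt, axis=0)

# Candidate attractor points (hypothetical choice: sampled from the demo itself).
attractors = y[np.linspace(0, T - 1, 5, dtype=int)]     # (K, 2)
K = len(attractors)
kp, kd = 100.0, 20.0                                    # assumed spring-damper gains

# Model: yddot_t = sum_j w_tj * (kp * (a_j - y_t) - kd * ydot_t).
# This is linear in the weights w_t, so fitting them is convex.
W = np.zeros((T, K))
for i in range(T):
    # Column j: acceleration contributed by attractor j at step i.
    Phi = (kp * (attractors - y[i]) - kd * ydot[i]).T   # (2, K)
    # Ridge-regularized least squares for the mixture weights.
    A = Phi.T @ Phi + 1e-6 * np.eye(K)
    W[i] = np.linalg.solve(A, Phi.T @ yddot[i])

# Reconstruct the demonstrated accelerations from the learned weights.
recon = np.array(
    [W[i] @ (kp * (attractors - y[i]) - kd * ydot[i]) for i in range(T)]
)
err = np.linalg.norm(recon - yddot) / np.linalg.norm(yddot)
```

Because the objective is convex, the fit needs no initial parameter guess, which mirrors the property the abstract highlights; the real method additionally handles multiple object-relative coordinate frames, which this toy example omits.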

Original language: English
Pages (from-to): 926-933
Journal: IEEE Robotics and Automation Letters
Volume: 3
Issue number: 2
DOIs
Publication status: Published - 2018

Bibliographical note

Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project https://www.openaccess.nl/en/you-share-we-take-care

Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make the work publicly available.

Keywords

  • learning and adaptive systems
  • learning from demonstration
  • motion control
