Title | Automatic Selection of Task Spaces for Imitation Learning |
Publication Type | Conference Paper |
Year of Publication | 2009 |
Authors | Muehlig M, Gienger M, Steil JJ, Goerick C |
Conference Name | IEEE/RSJ International Conference on Intelligent Robots and Systems |
Pagination | 4996–5002 |
Publisher | IEEE |
Abstract | Previous work [1] shows that the representation of movements in task spaces offers many advantages for learning object-related and goal-directed movement tasks through imitation. It allows reducing the dimensionality of the learned data and simplifies the correspondence problem that arises from the different kinematic structures of teacher and robot. Further, the task space representation provides a first level of generalization, for example with respect to differing absolute positions, if bimanual movements are represented relative to each other. Although task spaces are widely used, even when not mentioned explicitly, they are mostly defined a priori. This work is a step towards an automatic selection of task spaces. Observed movements are projected into a pool of possibly even conflicting task spaces, and we present methods that analyze this task space pool in order to acquire the task space descriptors that best match the observation. As statistical measures cannot explain importance for all kinds of movements, the presented selection scheme incorporates additional criteria such as an attention-based measure. Further, we introduce methods that make a significant step from purely statistically driven task space selection towards model-based movement analysis using a simulation of a complex human model. The effort and discomfort of the human teacher are analyzed and used as hints for important task elements. All methods are validated with real-world data, gathered using color tracking with a stereo vision system and a VICON motion capture system. |
DOI | 10.1109/IROS.2009.5353894 |