Assisted teleoperation in changing environments with a mixture of virtual guides
Type of publication: | Journal paper |
Citation: | Ewerton_ADVANCEDROBOTICS_2020 |
Publication status: | Published |
Journal: | Advanced Robotics |
Volume: | 34 |
Number: | 18 |
Year: | 2020 |
Month: | July |
Pages: | 1157-1170 |
URL: | https://www.tandfonline.com/do... |
DOI: | 10.1080/01691864.2020.1785326 |
Abstract: | Haptic guidance is a powerful technique to combine the strengths of humans and autonomous systems for teleoperation. The autonomous system can provide haptic cues to enable the operator to perform precise movements; the operator can interfere with the plan of the autonomous system, leveraging his/her superior cognitive capabilities. However, providing haptic cues such that the individual strengths are not impaired is challenging because low forces provide little guidance, whereas strong forces can hinder the operator in realizing his/her plan. Based on variational inference, we learn a Gaussian mixture model (GMM) over trajectories to accomplish a given task. The learned GMM is used to construct a potential field that determines the haptic cues. The potential field changes smoothly during teleoperation based on our updated belief over the plans and their respective phases. Furthermore, new plans are learned online when the operator does not follow any of the proposed plans or after changes in the environment. User studies confirm that our framework helps users perform teleoperation tasks more accurately than without haptic cues and, in some cases, faster. Moreover, we demonstrate the use of our framework to help a subject teleoperate a 7-DoF manipulator in a pick-and-place task. |
Keywords: | Gaussian Mixture Models, movement primitives, policy search, teleoperation, variational inference |
Projects: | Idiap |
Authors: | |
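The abstract describes building a potential field from a learned GMM over trajectories and using it to generate haptic cues. As a purely illustrative complement, the sketch below shows one way such a cue could be computed: the force is taken as the negative gradient of U(x) = -log p(x), where p(x) is a Gaussian mixture over positions at the current phase. This is not the authors' implementation; the function names (gaussian_logpdf, guidance_force), the gain parameter, and the example numbers are assumptions made for the sketch.

# Hedged sketch (not the paper's code): haptic cue from a GMM-based potential
# field U(x) = -log p(x), with p(x) a Gaussian mixture over positions at the
# current phase. The negative gradient reduces to a responsibility-weighted
# pull toward each component mean.
import numpy as np


def gaussian_logpdf(x, mean, cov):
    """Log density of a multivariate Gaussian at x."""
    d = x.shape[0]
    diff = x - mean
    cov_inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d * np.log(2.0 * np.pi) + logdet + diff @ cov_inv @ diff)


def guidance_force(x, weights, means, covs, gain=1.0):
    """Cue F(x) = -grad U(x) for U(x) = -log sum_k w_k N(x; mu_k, Sigma_k).

    Analytically, F(x) = gain * sum_k r_k(x) * Sigma_k^{-1} (mu_k - x),
    where r_k(x) are the component responsibilities at x.
    """
    log_terms = np.array([
        np.log(w) + gaussian_logpdf(x, m, c)
        for w, m, c in zip(weights, means, covs)
    ])
    # Responsibilities via a numerically stable softmax over the log terms.
    log_terms -= log_terms.max()
    resp = np.exp(log_terms)
    resp /= resp.sum()
    force = np.zeros_like(x)
    for r, m, c in zip(resp, means, covs):
        force += r * np.linalg.solve(c, m - x)
    return gain * force


if __name__ == "__main__":
    # Two hypothetical guides (plans) at the current phase of the movement.
    weights = np.array([0.6, 0.4])
    means = [np.array([0.3, 0.0]), np.array([0.0, 0.4])]
    covs = [0.01 * np.eye(2), 0.02 * np.eye(2)]
    x = np.array([0.25, 0.05])  # current end-effector position
    print(guidance_force(x, weights, means, covs, gain=0.5))

With this form, the cue is a responsibility-weighted pull toward the component means, so guides that are unlikely given the current position contribute little force, which is consistent with the abstract's aim of guiding the operator without overriding his/her plan.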