Real-time Perception meets Reactive Motion Generation

We address the challenging problem of robotic grasping and manipulation under uncertainty. This uncertainty stems from noisy sensing, inaccurate models, and hard-to-predict environment dynamics. Our approach emphasizes continuous, real-time perception and its tight integration with reactive motion generation. We present a fully integrated system in which real-time object and robot tracking, as well as ambient world modeling, provide the necessary input to feedback controllers and continuous motion optimizers. Specifically, these perception modules supply attractive and repulsive potentials from which the controllers and motion optimizer compute movement policies online, at different time scales. We extensively evaluate the proposed system on a real robotic platform in four scenarios that exhibit either challenging workspace geometry or a dynamic environment, and compare it with a still widely used, more traditional sense-plan-act approach. In 333 experiments, we show the robustness and accuracy of the proposed system.
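The attractive and repulsive potentials mentioned above can be illustrated with a minimal potential-field sketch. This is not the paper's implementation; the function name, gains (`k_att`, `k_rep`), and influence radius `rho0` are illustrative assumptions:

```python
import numpy as np

def potential_field_velocity(q, goal, obstacles, k_att=1.0, k_rep=0.5, rho0=0.4):
    """One reactive control step from a combined potential field (illustrative sketch).

    q, goal   : current and target positions (array-like).
    obstacles : list of obstacle positions, approximated as points.
    Returns a velocity command along the negative gradient of the
    attractive-plus-repulsive potential.
    """
    q, goal = np.asarray(q, float), np.asarray(goal, float)
    # Attractive potential U_att = 0.5 * k_att * ||q - goal||^2
    # -> negative gradient pulls toward the goal.
    v = -k_att * (q - goal)
    for obs in obstacles:
        d_vec = q - np.asarray(obs, float)
        rho = np.linalg.norm(d_vec)
        if 0.0 < rho < rho0:
            # Repulsive gradient pushes away from obstacles
            # inside the influence radius rho0, blowing up near contact.
            v += k_rep * (1.0 / rho - 1.0 / rho0) / rho**3 * d_vec
    return v
```

Recomputing such a command at every control cycle, with the potentials updated from the real-time trackers, is what makes the policy reactive rather than a one-shot plan.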

Author(s): Daniel Kappler and Franziska Meier and Jan Issac and Jim Mainprice and Cristina Garcia Cifuentes and Manuel Wüthrich and Vincent Berenz and Stefan Schaal and Nathan Ratliff and Jeannette Bohg
Book Title: ArXiv
Year: 2017

Department(s): Autonomous Motion
Research Project(s): Real-Time Perception meets Reactive Motion Generation
Bibtex Type: Article (article)

State: Published

Links: arXiv


  title = {Real-time Perception meets Reactive Motion Generation},
  author = {Kappler, Daniel and Meier, Franziska and Issac, Jan and Mainprice, Jim and Garcia Cifuentes, Cristina and W{\"u}thrich, Manuel and Berenz, Vincent and Schaal, Stefan and Ratliff, Nathan and Bohg, Jeannette},
  journal = {ArXiv},
  year = {2017}