Probabilistic Articulated Real-Time Tracking for Robot Manipulation
This video demonstrates the performance of a probabilistic filtering method that fuses joint measurements with depth images to yield a precise, real-time estimate of the robot end-effector pose in the camera frame. Precision is achieved by modeling and correcting biases in the joint measurements as well as inaccuracies in the robot model, such as poor extrinsic camera calibration. We make our method computationally efficient through a principled combination of Kalman filtering of the joint measurements and asynchronous depth-image updates based on the Coordinate Particle Filter.
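To make the fusion idea concrete, here is a minimal sketch of how such a bias-correcting filter could be structured: high-rate Kalman-style prediction on the joint side, with asynchronous corrections whenever a depth-based estimate arrives. This is not the authors' implementation; the scalar random-walk bias model, the class and parameter names, and the noise values are all illustrative assumptions.

```python
class JointBiasFilter:
    """Minimal per-joint bias filter (illustrative, not the paper's code).

    State: the bias b between the joint-encoder reading and the true
    angle, modeled as a slowly drifting random walk. The encoder runs
    at high rate; depth-based corrections arrive asynchronously.
    """

    def __init__(self, drift_noise=1e-5):
        self.bias = 0.0           # current bias estimate (rad)
        self.var = 1.0            # variance of the bias estimate
        self.q = drift_noise      # random-walk process noise per step

    def predict(self):
        # High-rate prediction: the bias may have drifted, so
        # uncertainty grows with every encoder tick.
        self.var += self.q

    def depth_update(self, depth_bias, depth_var):
        # Asynchronous correction: a depth-image-based bias estimate
        # (e.g. from a particle filter over the end-effector pose)
        # arrives at camera rate and pulls the state toward it.
        k = self.var / (self.var + depth_var)    # Kalman gain
        self.bias += k * (depth_bias - self.bias)
        self.var *= 1.0 - k

    def corrected_angle(self, encoder_angle):
        # Fused output: raw encoder reading minus the estimated bias.
        return encoder_angle - self.bias


# Example: two encoder ticks, then one depth-image correction.
f = JointBiasFilter()
for raw in (0.52, 0.53):          # rad, raw encoder readings
    f.predict()
    print(f.corrected_angle(raw))
f.depth_update(depth_bias=0.03, depth_var=1e-2)
print(f.corrected_angle(0.53))    # bias now accounted for
```

The design choice this illustrates is the split in the paper's abstract: cheap per-joint filtering keeps the estimate available at control rate, while the expensive depth-image likelihood only needs to run at camera rate.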
We quantitatively evaluate our approach on a dataset recorded from a real robotic platform, annotated with ground truth from a motion capture system. We show that our approach is robust and accurate even under challenging conditions such as fast motion, significant and long-term occlusions, and time-varying biases. We release the dataset along with open-source code of our approach to allow for quantitative comparison with alternative approaches.
The open-source code and dataset are available at https://github.com/bayesian-object-tracking.
The method is published in the following paper:
Cristina Garcia Cifuentes, Jan Issac, Manuel Wüthrich, Stefan Schaal, and Jeannette Bohg. Probabilistic Articulated Real-Time Tracking for Robot Manipulation. IEEE Robotics and Automation Letters (RA-L), 2017.
https://am.is.tuebingen.mpg.de/publications/garciacifuentes-ral