VILENS – Visual-Inertial Odometry for Legged Robots

VILENS (Visual Inertial Legged Navigation System) is a factor-graph-based odometry algorithm for legged robots that fuses leg odometry, vision, and IMU data. The algorithm was designed by David Wisth, Marco Camurri, and Maurice Fallon at the Oxford Robotics Institute (ORI). The papers describing this work are listed below.

VILENS is entirely ROS-based, uses GTSAM as its back-end optimiser, and achieves results equivalent to VINS-Mono and OKVIS on the EuRoC datasets, with specific modifications to incorporate legged robot odometry. Our vision front end uses a consumer-grade RealSense D435i. If you are interested in using VILENS on your walking robot, please contact the authors.
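For readers unfamiliar with the factor-graph formulation, the sketch below illustrates the general fusion pattern using GTSAM's Python bindings: an IMU preintegration factor combined with a relative-pose factor standing in for a visual or leg odometry constraint, over two robot states. This is a minimal illustration under placeholder noise values and synthetic measurements, not the VILENS implementation itself.

    import numpy as np
    import gtsam
    from gtsam.symbol_shorthand import B, V, X  # bias, velocity, pose keys

    # IMU preintegration parameters (placeholder noise values).
    params = gtsam.PreintegrationParams.MakeSharedU(9.81)
    params.setAccelerometerCovariance(np.eye(3) * 1e-3)
    params.setGyroscopeCovariance(np.eye(3) * 1e-4)
    params.setIntegrationCovariance(np.eye(3) * 1e-6)

    pim = gtsam.PreintegratedImuMeasurements(params, gtsam.imuBias.ConstantBias())
    for _ in range(100):  # 1 s of synthetic stationary IMU data at 100 Hz
        pim.integrateMeasurement(np.array([0.0, 0.0, 9.81]), np.zeros(3), 0.01)

    graph = gtsam.NonlinearFactorGraph()

    # Priors anchor the first pose, velocity, and IMU bias.
    graph.add(gtsam.PriorFactorPose3(
        X(0), gtsam.Pose3(),
        gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.1))))
    graph.add(gtsam.PriorFactorVector(
        V(0), np.zeros(3), gtsam.noiseModel.Isotropic.Sigma(3, 0.1)))
    graph.add(gtsam.PriorFactorConstantBias(
        B(0), gtsam.imuBias.ConstantBias(),
        gtsam.noiseModel.Isotropic.Sigma(6, 0.01)))

    # IMU factor linking consecutive pose/velocity states and the bias.
    graph.add(gtsam.ImuFactor(X(0), V(0), X(1), V(1), B(0), pim))

    # Relative-pose factor standing in for a visual/leg odometry constraint
    # (identity here, i.e. a synthetic "no motion" measurement).
    graph.add(gtsam.BetweenFactorPose3(
        X(0), X(1), gtsam.Pose3(),
        gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 0.05))))

    # Initial guesses, then batch optimisation (a real system would
    # optimise incrementally as new factors arrive).
    values = gtsam.Values()
    for i in range(2):
        values.insert(X(i), gtsam.Pose3())
        values.insert(V(i), np.zeros(3))
    values.insert(B(0), gtsam.imuBias.ConstantBias())

    result = gtsam.LevenbergMarquardtOptimizer(graph, values).optimize()
    print(result.atPose3(X(1)))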

RAL/IROS 2019: Robust Legged Robot State Estimation Using Factor Graph Optimization [1]

Abstract: Legged robots, specifically quadrupeds, are becoming increasingly attractive for industrial applications such as inspection. However, to leave the laboratory and to become useful to an end user requires reliability in harsh conditions. From the perspective of state estimation, it is essential to be able to accurately estimate the robot’s state despite challenges such as uneven or slippery terrain, textureless and reflective scenes, as well as dynamic camera occlusions. We are motivated to reduce the dependency on foot contact classifications, which fail when slipping, and to reduce position drift during dynamic motions such as trotting. To this end, we present a factor graph optimization method for state estimation which tightly fuses and smooths inertial navigation, leg odometry and visual odometry. The effectiveness of the approach is demonstrated using the ANYmal quadruped robot navigating in a realistic outdoor industrial environment. This experiment included trotting, walking, crossing obstacles and ascending a staircase. The proposed approach decreased the relative position error by up to 55% and absolute position error by 76% compared to kinematic-inertial odometry.

arXiv: Preintegrated Velocity Bias Estimation to Overcome Contact Nonlinearities in Legged Robot Odometry [2]

Abstract: In this paper, we present a novel factor graph formulation to estimate the pose and velocity of a quadruped robot on slippery and deformable terrains. The factor graph includes a new type of preintegrated velocity factor that incorporates velocity inputs from leg odometry. To accommodate leg odometry drift, we extend the robot’s state vector with a bias term for this preintegrated velocity factor. This term incorporates all the effects of unmodeled uncertainties at the contact point, such as slippery or deformable grounds and leg flexibility. The bias term can be accurately estimated thanks to the tight fusion of the preintegrated velocity factor with stereo vision and IMU factors, without which it would be unobservable. The system has been validated on several scenarios that involve dynamic motions of the ANYmal robot on loose rocks, slopes and muddy ground. We demonstrate a 26% improvement of relative pose error compared to our previous work and 52% compared to a state-of-the-art proprioceptive state estimator.
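In schematic form (our own notation, a sketch rather than the paper's exact formulation; see [2] for the full derivation), the preintegrated velocity factor accumulates the leg-odometry velocity measurements into a relative displacement between times i and j, with the bias absorbing contact effects such as slip:

    \Delta\tilde{p}_{ij} = \sum_{k=i}^{j-1} \Delta R_{ik}\,\bigl(\tilde{v}_k - b^{v}_k\bigr)\,\Delta t

Here \tilde{v}_k is the base velocity measured by leg odometry at step k, b^{v}_k is the slowly varying velocity bias, \Delta R_{ik} is the preintegrated rotation from the IMU, and \Delta t is the sample period. Because the stereo vision and IMU factors constrain the same relative motion, the bias becomes observable and can be estimated jointly with the robot state.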

References:

  • [1] D. Wisth, M. Camurri, and M. Fallon, “Robust Legged Robot State Estimation Using Factor Graph Optimization,” IEEE Robotics and Automation Letters, vol. 4, no. 4, pp. 4507–4514, 2019. [arXiv]
  • [2] D. Wisth, M. Camurri, and M. Fallon, “Preintegrated Velocity Bias Estimation to Overcome Contact Nonlinearities in Legged Robot Odometry,” arXiv preprint arXiv:1910.09875, 2019. [arXiv]

Dataset

We make available a dataset of our ANYmal robot operating in a realistic industrial environment: the Fire Service College, Moreton-in-Marsh, UK.

Details:

  • A 350-second, 2 GB log file: a rosbag with standard-format topics (a minimal sketch for reading it follows this list).
  • IMU and kinematic sensing of the robot's core joints at 400 Hz. The IMU is an Xsens; joint sensing comprises joint position, velocity, and torque. The robot has no contact sensing.
  • Stereo camera sensing from a forward-looking RealSense D435i, including its IMU data. This consists of two grayscale IR cameras with the IR projector disabled. The data was captured using an improved driver from ANYbotics that produces sensing well synchronised with the rest of the robot.
  • The dataset also includes ground truth from a Leica TS16 tracking sensor at 10 Hz.
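As a minimal sketch of how one might read the log with the standard ROS Python API (the bag filename and topic names below are hypothetical; run rosbag info on the downloaded file to find the actual names):

    import rosbag

    # Hypothetical names for illustration only; check "rosbag info"
    # for the actual topic names in the downloaded log.
    BAG_FILE = "vilens_fsc_dataset.bag"
    IMU_TOPIC = "/sensors/imu"
    JOINT_TOPIC = "/state_estimator/joint_states"

    with rosbag.Bag(BAG_FILE) as bag:
        for topic, msg, t in bag.read_messages(topics=[IMU_TOPIC, JOINT_TOPIC]):
            if topic == IMU_TOPIC:
                # sensor_msgs/Imu: angular velocity and linear acceleration
                print(t.to_sec(), msg.angular_velocity, msg.linear_acceleration)
            else:
                # sensor_msgs/JointState: position, velocity, effort per joint
                print(t.to_sec(), msg.name[0], msg.position[0])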

Download the log file here

We have many other, longer datasets. If you are interested in more logs, please get in touch.