Dynamic Robot Systems Group
The Dynamic Robot Systems Group researches control, motion planning, mapping and navigation for dynamically moving robots. Our work has applications in legged robots, autonomous vehicles and handheld mapping systems.
Brief News ...
8 papers from DRS will appear at ICRA/RAL this year. This includes collaborations with the A2I group here in Oxford, Poznan University of Technology (Poland), KTH and Scania (Sweden). Papers and videos are listed below:
- Where Should I Look? Optimised Gaze Control for Whole-Body Collision Avoidance in Dynamic Environments, M. N. Finean, W. Merkt and I. Havoutis, IEEE Robotics and Automation Letters (RAL). pdf / video
- Semantically Grounded Object Matching for Robust Robotic Scene Rearrangement, Walter Goodwin, Sagar Vaze, Ioannis Havoutis, Ingmar Posner, IEEE International Conference on Robotics and Automation (ICRA). pdf / video
- Next Steps: Learning a Disentangled Gait Representation for Versatile Quadruped Locomotion, Alexander Luis Mitchell, Wolfgang Xaver Merkt, Mathieu Geisert, Siddhant Gangapurwala, Martin Engelcke, Oiwi Parker Jones, Ioannis Havoutis, Ingmar Posner, IEEE International Conference on Robotics and Automation (ICRA). pdf / video
- Reaching Through Latent Space: From Joint Statistics to Path Planning in Manipulation, Chia-Man Hung, Shaohong Zhong, Walter Goodwin, Oiwi Parker Jones, Martin Engelcke, Ioannis Havoutis, Ingmar Posner, IEEE Robotics and Automation Letters (RAL). pdf
- Unsupervised Learning of Terrain Representations for Haptic Monte Carlo Localization, Mikołaj Łysakowski, Michał R. Nowicki, Russell Buchanan, Marco Camurri, Maurice Fallon, Krzysztof Walas, IEEE International Conference on Robotics and Automation (ICRA). pdf / video
- Balancing the Budget: Feature Selection and Tracking for Multi-Camera Visual-Inertial Odometry, Lintong Zhang, David Wisth, Marco Camurri, Maurice Fallon, IEEE Robotics and Automation Letters (RAL). pdf / video
- An Efficient Locally Reactive Controller for Safe Navigation in Visual Teach and Repeat Missions, Matias Mattamala, Nived Chebrolu, Maurice Fallon, IEEE Robotics and Automation Letters (RAL). pdf / video
- Extrinsic Calibration and Verification of Multiple Non-overlapping Field of View Lidar Sensors, Sandipan Das, Navid Mahabadi, Addi Djikic, Cesar Nassir, Saikat Chatterjee, Maurice Fallon, IEEE International Conference on Robotics and Automation (ICRA). pdf / video
We have been involved in a collaboration with University of Bristol on nuclear radiation mapping of the Chernobyl Nuclear Reactor in Ukraine. The research team of Prof. Tom Scott visited the facility and used our Frontier mapping device to build 3D maps of reactor buildings, the control room and waste storage facilities. We are delighted to be able to contribute to this important work.
Our multi-sensor SLAM system, VILENS, has won a prestigious mapping challenge (academic track) organised by HILTI, a construction technology company. More details of our approach and results are on the challenge website.
Team Cerberus wins the DARPA SubT Challenge and the grand prize! After three difficult and challenging years and many trips to mines, power plants and other test locations in places such as Pittsburgh, Washington State, Switzerland, Norway and Wiltshire (UK), we saw our team win the nail-biting conclusion to the challenge in Louisville Mega Cavern in Kentucky.
Unfortunately, our ANYmal robot, Coyote, did not make it to Kentucky because Covid restrictions prevented us from obtaining US entry visas, but we could still cheer on our collaborators from NTNU (Norway) and ETH Zurich. A video clip of the ANYmal robots exploring the finals course is here:
David Wisth, Marco Camurri and Maurice Fallon's paper at the International Conference on Robotics and Automation (ICRA) was nominated for the best student paper award. The paper was entitled "Unified Multi-Modal Landmark Tracking for Tightly Coupled Lidar-Visual-Inertial Odometry". Congratulations to you all. You can read about their system, called VILENS, here.
PhD student Siddhant Gangapurwala had a successful research visit to Korea Advanced Institute of Science and Technology (KAIST) this spring. He visited the RAI Lab of Prof. Jemin Hwangbo and worked on a novel controller for quadrupeds which works with different robot sizes/masses without retraining. Experiments were carried out on KAIST's copy of Mini Cheetah.
We have had 10 papers accepted for ICRA/RAL (including collaborations)! Full details and final versions are to follow. Several of the papers are listed on our publications page. Topics span trajectory optimisation, learning for locomotion, 3D reconstruction, visual teach-and-repeat and multi-sensor odometry.
Final versions, video and a short summary are presented here.
We’ve added a new webpage describing our research and involvement in the Cerberus team competing in the DARPA SubT Challenge, which runs from 2018 to 2021. We are developing multi-robot SLAM and exploration-based planning, and testing our approach using our ANYmal C robot and a tracked robot made by our collaborator Ross Robotics.
Our project to precisely map New College has been featured on the college’s own website.
Three papers from the group will appear at IROS in October:
- Alex Mitchell presents his first research paper on using latent space representations to learn quadrupedal locomotion. The paper's video shows intuitive cyclic behaviour as the robot's gait moves through the representation. This work was in collaboration with ORI's A2I Group.
- Russell Buchanan’s work on contact localisation is part of our research into using haptics for innovative locomotion with the ANYmal. The method works much as a person uses touch to find their way in the dark. Read the paper. Watch the video.
- Milad Ramezani led a major effort in our group to create a dataset with 3D LIDAR (Ouster 64-beam), IMU and stereo image sensing (Realsense) scanning New College here in Oxford. It’s really useful for SLAM research because it includes precise ground truth. Project Website including paper.
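The touch-based localisation idea in Russell's paper can be caricatured as a small particle filter. The sketch below is a minimal 1-D illustration under my own assumptions (scalar pose, a made-up known terrain profile), not the paper's implementation: pose particles are weighted by how well a sensed foot-contact height matches the terrain map.

```python
import numpy as np

rng = np.random.default_rng(0)

def terrain_height(x):
    # Hypothetical terrain profile: elevation (m) as a function of position x.
    return 0.2 * np.sin(x) + 0.05 * np.sin(3.0 * x)

def mcl_step(particles, motion, contact_z, process_noise=0.02, meas_std=0.03):
    # Motion update: shift every particle and add process noise.
    particles = particles + motion + rng.normal(0.0, process_noise, particles.shape)
    # Measurement update: Gaussian likelihood of the sensed contact height.
    w = np.exp(-0.5 * ((terrain_height(particles) - contact_z) / meas_std) ** 2)
    w = (w + 1e-12) / (w + 1e-12).sum()   # small floor avoids degenerate weights
    # Resample particles in proportion to their weights.
    return particles[rng.choice(len(particles), size=len(particles), p=w)]

# Simulate a robot stepping forward in 0.1 m increments while "touching"
# the ground, starting from a very uncertain pose estimate.
true_x, particles = 0.0, rng.uniform(-2.0, 2.0, 500)
for _ in range(30):
    true_x += 0.1
    particles = mcl_step(particles, 0.1, terrain_height(true_x))
print(particles.mean(), particles.std())   # posterior pose estimate
```

Repeated contact measurements along the walk progressively rule out terrain locations whose height sequence does not match, which is exactly the "feeling your way in the dark" intuition.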
Recent experiments in a nearby underground mine in preparation for the DARPA SubT Challenge.
We carried out a trial on the High Speed 2 (London to Birmingham) construction site in London as part of the MEMMO EU Project.
We’ve had 6 papers accepted for ICRA/RA-L in Paris in May 2020. Topics span efficient learning of locomotion, active mapping, SLAM, path planning, state estimation and trajectory optimisation. All of the papers include real-world experiments with our ANYmal robot. Pre-prints of the papers are on our publications page. Here is a video sampler of all the papers:
Additionally, two new members of DRS have papers from their previous research groups:
Brief News 2019
We have created a project page for VILENS, our legged robot state estimator. It has been heavily tested on kilometres of logs from our ANYmal robot. More details here
Three papers about our quadruped research have recently been accepted at TAROS and IROS:
- Work led by Mathieu Geisert with LAAS to adapt the HPP footstep planner to work with ANYmal, appearing at TAROS in July. The method can plan several steps ahead on complex terrains very quickly, as well as selecting the step sequence. Read the paper. Watch the video.
- A paper on tightly-coupled visual-inertial navigation (VINS) which also incorporates leg odometry, by David Wisth and Marco Camurri. We call it VILENS. The work was heavily tested in real outdoor environments and appears at IROS in November. Read the paper. Watch the video.
- Quadrupeds have a variety of gaits (such as walking and trotting), some of which assume the world to be flat and others plan over rough terrain using elevation maps. Martim’s work (also at IROS) automatically chooses between gaits when given an input map. Read the paper.
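Martim's map-conditioned gait selection can be caricatured in a few lines. This toy sketch (my own simplification, not the paper's method, with a made-up threshold) picks a gait by thresholding a roughness statistic of the local elevation map:

```python
import numpy as np

def choose_gait(elevation_patch, rough_thresh=0.03):
    """Return a gait name for a local elevation patch (metres).

    Flat ground -> a fast flat-terrain trot; rough ground -> a slower,
    map-aware static walk. The threshold is a made-up tuning value.
    """
    roughness = float(np.std(elevation_patch))
    return "trot" if roughness < rough_thresh else "static_walk"

flat = np.zeros((10, 10))                             # flat floor
steps = np.tile(np.repeat([0.0, 0.1], 5), (10, 1))    # a 10 cm step edge
print(choose_gait(flat), choose_gait(steps))          # trot static_walk
```

A real system would of course use a richer terrain classifier and switch gaits smoothly, but the core decision, flat-terrain gait versus map-aware gait, has this shape.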
We co-organized the workshop “Towards Real-World Deployment of Legged Robots” at ICRA. It was attended by over 200 people and featured a demo of 5 rugged and highly mobile quadrupeds, ANYmal among them. Quadruped robots are fast becoming a commercial reality, with ANYbotics and Boston Dynamics holding industrial demos and trials.
Two papers from DRS will appear at ICRA this year:
- Georgi Tinchev’s paper is about using deep learning for LIDAR localisation. An article on the work can be read here.
- Christian’s paper is about passively estimating the state of a robot arm during manipulation.
- You can download the papers from here: https://ori.ox.ac.uk/labs/drs/drs-publications/
We hosted the AIMS PhD Programme’s 1st year class for a week long introduction to walking robots. David gave a demo of ANYmal’s capabilities and the winners took home some apt trophies 🙂
Brief News 2018
We had a successful field trial for the ORCA project. Over 50 researchers from the Universities of Oxford, Edinburgh, Heriot-Watt and Liverpool and from Imperial College carried out trials. Our ANYmal was the focus of the day, with a 45-minute demo of walking robot inspection. Read our overview article.
Georgi Tinchev’s paper “Seeing the Wood for the Trees: Reliable Localization in Urban and Natural Environments” will appear at IROS next month. The paper involves localising in natural environments (forests) using LIDAR. The work adapts SegMatch by Renaud Dubé and Cesar Cadena.
Here’s a project page about the NSM algorithm.
Phew! It’s been a busy couple of months in DRS. Both EU projects have kicked off. Each now has a website and twitter feed:
As part of the THING and MEMMO projects we have 3 post-doc positions available across the research areas of dynamic locomotion:
- Motion Planning and Control: planning dynamic motions in challenging environments. The MEMMO project is about pre-computing high-dimensional trajectories to improve the reactivity of (model predictive) control.
- Tactile Estimation and Active Control: The THING project is about improving everything tactile and haptic during quadruped walking & contact – materials, sensing, control, planning and estimation.
- Reconstruction and Mapping: building efficient reliable navigation systems to demonstrate quadrupeds doing inspection – fusing inertial sensing and vision.
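The memory-of-motion idea behind MEMMO (pre-compute trajectories offline, then use them to warm-start an online optimiser) can be sketched in a toy 1-D example. This is my own illustration under simplified assumptions, not project code: the "optimiser" just smooths a path between fixed endpoints.

```python
import numpy as np

rng = np.random.default_rng(1)

def optimise(init_traj, iters=50, lr=0.2):
    """Toy trajectory optimiser: gradient descent on a smoothness cost,
    keeping the two endpoints fixed."""
    traj = init_traj.copy()
    for _ in range(iters):
        # Gradient of the sum of squared segment lengths (interior points only).
        grad = 2.0 * traj[1:-1] - traj[:-2] - traj[2:]
        traj[1:-1] -= lr * grad
    return traj

# Offline "memory": solve and store trajectories for sampled start/goal pairs.
library = {}
for _ in range(20):
    s, g = rng.uniform(-1.0, 1.0, 2)
    init = np.linspace(s, g, 10) + rng.normal(0.0, 0.1, 10)
    init[0], init[-1] = s, g
    library[(s, g)] = optimise(init)

def warm_start(start, goal):
    # Retrieve the stored solution with the nearest endpoints and pin it
    # to the new ones; this seeds the online optimiser close to a solution.
    key = min(library, key=lambda k: abs(k[0] - start) + abs(k[1] - goal))
    traj = library[key].copy()
    traj[0], traj[-1] = start, goal
    return traj

# Online: only a few iterations are needed when starting from memory.
traj = optimise(warm_start(0.2, 0.9), iters=5)
```

The real project deals with high-dimensional whole-body trajectories and far more capable solvers, but the offline-library-plus-warm-start structure is the point being illustrated.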
Feel free to contact Maurice or Ioannis for more information. We are making a major investment in walking robots at ORI, including purchasing an ANYmal – an awesome quadruped robot from the RSL lab at ETH Zurich. We will have lab space in a new building in late 2018. Oxford is a wonderful place to live and work, come join us!
DRS has had some great success with our ICRA submissions this year. The following 4 papers will be presented in Brisbane in May:
- M. J. A. Zeestraten, I. Havoutis, and S. Calinon, ‘Programming by Demonstration for Shared Control with an Application in Teleoperation’. (also appearing in the RA-L Journal). pdf
- S. Nobili, G. Tinchev, and M. Fallon, ‘Predicting Alignment Risk to Prevent Localization Failure’. pdf. video
- C. Rauch, T. Hospedales, J. Shotton, and M. Fallon, ‘Visual Articulated Tracking in the Presence of Occlusions’. pdf. video
- R. Scona, M. Jaimez, Y. R. Petillot, M. Fallon, and D. Cremers, ‘StaticFusion: Background Reconstruction for Dense RGB-D SLAM in Dynamic Environments’. pdf. video
These works involve collaborations with IIT and IDIAP, the Cremers lab at TUM, and Jamie Shotton at Microsoft HoloLens.
We are excited to say that two EU Projects in the area of walking robots have been funded and will begin in 2018:
- MEMMO will be led by Nicolas Mansard from LAAS (Toulouse). It will involve motion planning for dynamic legged robots. Partners include MPI (L. Righetti), Edinburgh (S. Vijayakumar) and IDIAP (S. Calinon), as well as PAL Robotics and Wandercraft.
- THING will be led by Michael Mistry from University of Edinburgh. THING will develop technology and algorithms in all areas of physical contact – such as contact sensing, navigation with contact sensors and better footstep placement. Developments will be centred around the state-of-the-art quadruped ANYmal developed by the Robot Systems Lab in ETH Zurich.
DRS have the following papers appearing in IROS 2017:
- M Zeestraten, I Havoutis, S Calinon and D Caldwell, “Learning Task-Space Synergies Using Riemannian Geometry”.
- R Scona, S Nobili, Y Petillot, M Fallon, “Direct Visual SLAM Fusing Proprioception for a Humanoid Robot”.
Pre-prints can be downloaded from the publications page.
The HILTI challenge used a 7Srobotics handheld device with multiple cameras, an IMU and a Hesai lidar. Real-world challenges included sensor occlusion, darkness and narrow corridors. Prize winners were announced at ICRA; the challenge was organised with Michael Helmberger (Hilti) and Davide Scaramuzza (UZH).