The Dynamic Robot Systems Group researches control, motion planning, mapping and navigation for dynamically moving robots. Our work applies to walking, flying and mobile robots. Here’s a playlist of our recent research:
We’ve added a new webpage describing our research and involvement in the Cerberus team competing in the DARPA SubT Challenge, which runs from 2018 to 2021. We are developing multi-robot SLAM and exploration-based planning, and testing our approach using our ANYmal C robot and a tracked robot made by our collaborator Ross Robotics.
Two papers from the group will appear at IROS in October:
Russell Buchanan’s work on contact localisation is part of our research into using haptics for innovative locomotion with ANYmal. The method works much as a person uses touch to find their way in the dark. Read the paper. Watch the video.
Milad Ramezani led a major effort in our group to create a dataset with 3D LIDAR (Ouster, 64 beams), IMU and stereo image sensing (RealSense), scanning New College here in Oxford. It is particularly useful for SLAM research because it includes precise ground truth. Project website, including the paper.
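To illustrate how such ground truth is typically used, here is a minimal sketch of computing absolute trajectory error (ATE) between an estimated trajectory and a reference. This is a simplified 2D version with made-up numbers and only translational alignment of the starting point (no full SE(3)/Umeyama alignment); it is not code from the dataset’s own tooling.

```python
import numpy as np

def ate_rmse(est, gt):
    """RMSE of position error after aligning the two start points.

    est, gt: (N, 2) arrays of 2D positions in metres (illustrative only).
    """
    est = est - est[0]  # shift both trajectories to start at the origin
    gt = gt - gt[0]
    return float(np.sqrt(np.mean(np.sum((est - gt) ** 2, axis=1))))

# Invented example: a straight-line ground truth and a drifting estimate.
gt = np.array([[0, 0], [1, 0], [2, 0], [3, 0]], dtype=float)
est = gt + np.array([[0, 0], [0.1, 0], [0.1, 0.1], [0.2, 0.1]])
print(ate_rmse(est, gt))  # small, since the drift is ~10-20 cm
```

In practice one would align the full trajectories in SE(3) before computing the error, but the weighting idea is the same.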
We carried out a trial on the High Speed 2 (London to Birmingham) construction site in London as part of the MEMMO EU Project.
We’ve had 6 papers accepted for ICRA/RA-L in Paris in May 2020. Topics include efficient learning of locomotion, active mapping, SLAM, path planning, state estimation and trajectory optimisation. All of the papers include real-world experiments with our ANYmal robot.
Three papers about our quadruped research have recently been accepted at TAROS and IROS:
Work led by Mathieu Geisert with LAAS to adapt the HPP footstep planner to work with ANYmal, appearing at TAROS in July. The method can very quickly plan several steps ahead on complex terrain, as well as selecting the step sequence. Read the paper. Watch the video.
A paper on tightly-coupled visual-inertial navigation which also incorporates leg odometry, by David Wisth and Marco Camurri. We call it VILENS. The work was extensively tested in real outdoor environments and appears at IROS in November. Read the paper. Watch the video.
Quadrupeds have a variety of gaits (such as walking and trotting); some assume flat ground, while others plan over rough terrain using elevation maps. Martim’s work (also at IROS) automatically chooses between gaits given an input map. Read the paper.
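As a toy illustration of map-based gait selection (a crude heuristic, not Martim’s method), one could threshold the terrain roughness computed from the elevation map; the cell size, threshold, and gait names below are all invented:

```python
import numpy as np

def select_gait(elevation_map, cell_size=0.04, rough_threshold=0.05):
    """Pick a gait from terrain roughness (illustrative heuristic only).

    elevation_map: 2D array of terrain heights in metres.
    Returns "trot" for near-flat ground, "static_walk" otherwise.
    """
    # Local slope between neighbouring cells, in metres per metre.
    gx, gy = np.gradient(elevation_map, cell_size)
    roughness = np.sqrt(gx ** 2 + gy ** 2).max()
    return "trot" if roughness < rough_threshold else "static_walk"

flat = np.zeros((10, 10))
steps = np.zeros((10, 10))
steps[5:, :] = 0.15  # a 15 cm ledge across the map
print(select_gait(flat))   # trot
print(select_gait(steps))  # static_walk
```

A learned or optimisation-based selector would replace the single threshold, but the input (an elevation map) and output (a gait choice) are the same.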
We co-organized the workshop “Towards Real-World Deployment of Legged Robots” at ICRA. It was attended by over 200 people and included a demo of 5 rugged and highly mobile quadrupeds – including ANYmal. This demonstrates that quadruped robots are fast becoming a commercial reality with ANYbotics and Boston Dynamics holding industrial demos and trials.
We have an open Post Doc position in Visual Navigation and Teach and Repeat. More details are here:
We hosted the AIMS PhD Programme’s first-year class for a week-long introduction to walking robots. David gave a demo of ANYmal’s capabilities and the winners took home some apt trophies 🙂
We had a successful field trial for the ORCA project. Over 50 researchers from the Universities of Oxford, Edinburgh, Heriot-Watt and Liverpool, and Imperial College carried out trials. Our ANYmal was the focus of the day, with a 45-minute demo of walking-robot inspection. Read our overview article.
We tested autonomous navigation at recent field trials at the Fire Service College (a training centre for firefighters) for the EPSRC ORCA Robotics Hub.
In late 2019, ORI acquired a second legged robot to join ANYmal, the primary platform for DRS over the last few years. Vision 60 is a rugged quadruped robot developed by Ghost Robotics, a spin-off of the University of Pennsylvania (UPenn) based in Philadelphia.
On Tuesday the 26th of November, Oxford Robotics Institute visited RACE to test out their ANYmal at the mobility benchmarking lanes. This is part of our RAIN collaboration to test the reliability of different [...]
VILENS - Visual-Inertial Odometry for Legged Robots
VILENS (Visual Inertial Legged Navigation System) is a factor-graph based odometry algorithm for legged robots that fuses leg odometry, vision, and IMU data. This algorithm was designed by David [...]
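The fusion idea behind a factor graph can be shown with a toy 1D example: estimate a chain of poses by combining two relative-odometry sources, each weighted by confidence, plus a prior on the first pose. This is a linear least-squares stand-in for the nonlinear SE(3) problem VILENS solves; all measurements and weights here are invented.

```python
import numpy as np

n = 4                          # number of 1D "poses" x0..x3
leg_meas = [1.1, 0.9, 1.2]     # leg odometry: noisy steps of ~1 m
vis_meas = [1.0, 1.0, 1.0]     # visual odometry: more accurate here
w_prior, w_leg, w_vis = 100.0, 1.0, 4.0  # factor weights (1 / sigma)

rows, rhs = [], []
# Prior factor anchoring the first pose: x0 = 0.
r = np.zeros(n); r[0] = w_prior
rows.append(r); rhs.append(0.0)
# Relative factors from each sensor: x[i+1] - x[i] = measurement.
for i in range(n - 1):
    for meas, w in ((leg_meas[i], w_leg), (vis_meas[i], w_vis)):
        r = np.zeros(n)
        r[i], r[i + 1] = -w, w
        rows.append(r); rhs.append(w * meas)

# Solving the weighted least-squares problem fuses the two sensors,
# pulling each step estimate toward the more confident measurement.
A, b = np.array(rows), np.array(rhs)
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x, 3))
```

In the real system the factors are nonlinear (SE(3) poses, IMU preintegration, visual landmarks) and are optimised incrementally, but the graph-of-weighted-constraints structure is the same.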
In a cold, wet week at the beginning of October, a team from Oxford Robotics Institute headed north to Blyth, near Newcastle, for the ORCA Hub demonstrations. A large warehouse, normally used for testing [...]
Last week a group from ORI attended the Towards Autonomous Robotic Systems (TAROS) 2019 Conference. The conference this year was held at the newly built Mile End campus of Queen Mary University of London. [...]