Biography
Maurice Fallon (IEEE Senior Member) is an Associate Professor in Engineering Science and a Royal Society University Research Fellow. He leads the perception research of the Dynamic Robot Systems Group. More information about his research is available on the DRS website.
22 November 2024 - PhD/DPhil Admissions Information:
I have two funded studentships available for students starting in October 2025; the details are here.
His research focuses on probabilistic methods for localization and mapping. He has also made research contributions to state estimation for legged robots and is interested in dynamic motion planning and control. A particular concern is developing methods that remain robust in the most challenging situations by leveraging sensor fusion.
Dr. Fallon studied Electronic Engineering at University College Dublin. His PhD research in the field of acoustic source tracking was carried out in the Engineering Department of the University of Cambridge.
Immediately after his PhD he moved to MIT as a postdoc, and later research scientist, in the Marine Robotics Group (2008-2012). From 2012 to 2015 he was the perception lead of MIT's team in the DARPA Robotics Challenge, a multi-year competition developing technologies for semi-autonomous humanoid exploration and manipulation in disaster situations.
After a period as a Lecturer at the University of Edinburgh, he moved to Oxford and took up the Royal Society University Research Fellowship in October 2017.
He has been PI/co-I on several large UK and EU collaborative projects, including ORCA, RAIN, THING, and MEMMO, as well as CERBERUS, the winning team of the DARPA SubT Challenge. Current projects include the EU Horizon Europe project DigiForest as well as collaborations with UKAEA (RACE).
Most Recent Publications
Observability-aware online multi-lidar extrinsic calibration
Towards real-time forest inventory using handheld LiDAR
Strategies for large scale elastic and semantic LiDAR reconstruction
VILENS: Visual, Inertial, Lidar, and Leg Odometry for All-Terrain Legged Robots
Balancing the Budget: Feature Selection and Tracking for Multi-Camera Visual-Inertial Odometry