The Mobile Robotics Group, the original group in what is now the ORI, builds robots and systems that answer "where am I and what surrounds me?" on real vehicles in genuinely hard places.
It has a proud systems heritage, taking on ambitious field deployments that reveal what does not work and must be fixed through the smart application of machine learning, AI, and robotics expertise.
Keep off the Grass: Permissible Driving Routes from Radar with Weak Audio Supervision
Abstract – Reliable outdoor deployment of mobile robots requires the robust identification of permissible driving routes in a given environment. The performance of LiDAR and vision-based [...]
Sense-Assess-eXplain (SAX): Building Trust in Autonomous Vehicles in Challenging Real-World Driving Scenarios
Abstract – This paper discusses ongoing work in demonstrating research in mobile autonomy in challenging driving scenarios. In our approach, we address fundamental technical issues to overcome [...]
RSS-Net: Weakly-Supervised Multi-Class Semantic Segmentation with FMCW Radar
Abstract – This paper presents an efficient annotation procedure and an application thereof to end-to-end, rich semantic segmentation of the sensed environment using FMCW scanning radar. We advocate radar over the traditional [...]
LiDAR Lateral Localisation Despite Challenging Occlusion from Traffic
Abstract – This paper presents a system for improving the robustness of LiDAR lateral localisation systems. This is made possible by including detections of road boundaries which are invisible to the sensor (due [...]
Look Around You: Sequence-based Radar Place Recognition with Learned Rotational Invariance
Abstract – This paper details an application which yields significant improvements to the adeptness of place recognition with Frequency-Modulated Continuous-Wave radar – a commercially promising sensor poised for [...]