Understanding the decisions taken by an autonomous machine is key to building public trust in robotics and autonomous systems (RAS). This project will design, develop, and demonstrate fundamental AI technologies in real-world applications to address this issue of explainability.
The aim of the project is to build robots and autonomous vehicles that can:
- sense and fully understand their environment,
- assess their own capabilities, and
- provide causal explanations for their own decisions.
The project is motivated by the question: “How can we develop autonomous vehicles that can explain the decisions they take?”
In on-road and off-road driving scenarios, ORI are studying the requirements for explanations held by key stakeholders (users, system developers, and regulators). These requirements will inform the development of the algorithms that generate the causal explanations.
The work will focus on scenarios in which the performance of traditional sensors (e.g. cameras) significantly degrades or fails completely (e.g. in harsh weather conditions). The project will develop methods that assess the performance of perception systems and adapt to environmental changes by switching to another sensor model or a different sensor modality. For the latter, alternative sensing devices will be investigated (including radar and acoustic sensors) that can provide robust perception in situations where traditional sensors fail.
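The assess-and-switch idea above can be sketched in a few lines. The following is a minimal illustration, not the project's actual method: all class names, modality labels, and thresholds are assumptions chosen for the example. It monitors a self-assessed confidence score per sensor and falls back from the preferred modality (e.g. camera) to an alternative (e.g. radar) when confidence degrades.

```python
from dataclasses import dataclass

@dataclass
class SensorReport:
    """Hypothetical per-sensor self-assessment (names are illustrative)."""
    modality: str      # e.g. "camera", "radar", "acoustic"
    confidence: float  # self-assessed perception quality in [0, 1]

def select_modality(reports, preference=("camera", "radar", "acoustic"),
                    threshold=0.6):
    """Return the most-preferred modality whose confidence clears the
    threshold; otherwise degrade gracefully to the highest-confidence sensor."""
    by_modality = {r.modality: r for r in reports}
    for modality in preference:
        report = by_modality.get(modality)
        if report is not None and report.confidence >= threshold:
            return modality
    # No sensor is confident enough: pick the best available.
    return max(reports, key=lambda r: r.confidence).modality

# Example: camera degraded by fog, radar still reliable.
reports = [SensorReport("camera", 0.2), SensorReport("radar", 0.9)]
print(select_modality(reports))  # -> "radar"
```

In practice the confidence score would itself come from a learned introspection model rather than being reported directly by the sensor, but the switching logic follows the same pattern.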
Investigators: Paul Newman and Lars Kunze
Co-Researcher Investigators: Matt Gadd, Daniele De Martini, and Letizia Marchegiani
Project Website: https://www.york.ac.uk/assuring-autonomy/projects/sax/
Publications:
- S. Saftescu, M. Gadd, D. De Martini, D. Barnes, and P. Newman, “Kidnapped Radar: Topological Radar Localisation using Rotationally-Invariant Metric Learning,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Paris, 2020.
- M. Gadd, D. De Martini, and P. Newman, “Look Around You: Sequence-based Radar Place Recognition with Learned Rotational Invariance,” in IEEE/ION Position, Location and Navigation Symposium (PLANS), Portland, OR, USA, 2020.
- P. Kaul, D. De Martini, M. Gadd, and P. Newman, “RSS-Net: Weakly-Supervised Multi-Class Semantic Segmentation with FMCW Radar,” in Proceedings of the IEEE Intelligent Vehicles Symposium (IV), Las Vegas, NV, USA, 2020.
- M. Gadd, D. De Martini, L. Marchegiani, L. Kunze, and P. Newman, “Sense-Assess-eXplain (SAX): Building Trust in Autonomous Vehicles in Challenging Real-World Driving Scenarios,” in Proceedings of the IEEE Intelligent Vehicles Symposium (IV), Workshop on Ensuring and Validating Safety for Automated Vehicles (EVSAV), 2020.
- D. Williams, D. De Martini, M. Gadd, L. Marchegiani, and P. Newman, “Keep off the Grass: Permissible Driving Routes from Radar with Weak Audio Supervision,” in IEEE Intelligent Transportation Systems Conference (ITSC), Rhodes, Greece, 2020.