Our new all-weather platform pictured outside Blenheim Palace. For more information please take a look at the paper as well as the presentation.

This blog post provides an overview of our paper, “The Hulk: Design and Development of a Weather-proof Vehicle for Long-term Autonomy in Outdoor Environments”, recently presented by Stephen Kyberd at the 12th Conference on Field and Service Robotics (FSR) in Tokyo, Japan, as well as the ongoing work by our engineers and researchers in deploying this exciting new platform in challenging conditions and places.

  • [PDF] S. Kyberd, J. Attias, P. Get, P. Murcutt, C. Prahacs, M. Towlson, S. Venn, A. Vasconcelos, M. Gadd, D. De Martini, and P. Newman, “The Hulk: Design and Development of a Weather-proof Vehicle for Long-term Autonomy in Outdoor Environments,” in International Conference on Field and Service Robotics (FSR), Tokyo, Japan, 2019.
    [Bibtex]
    @InProceedings{2019FSR_kyberd,
    author = {Kyberd, Stephen and Attias, Jonathan and Get, Peter and Murcutt, Paul and Prahacs, Chris and Towlson, Matthew and Venn, Simon and Vasconcelos, Andreia and Gadd, Matthew and De Martini, Daniele and Newman, Paul},
    title = {{The Hulk: Design and Development of a Weather-proof Vehicle for Long-term Autonomy in Outdoor Environments}},
    booktitle = {{International Conference on Field and Service Robotics (FSR)}},
    year = {2019},
    address = {Tokyo, Japan},
    month = {August},
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2019FSR_kyberd.pdf},
    }

Why have we built this new platform?

At the ORI, we commonly approach robotics by:

  1. Isolating the key questions that arise when fielding complex systems and
  2. Augmenting existing techniques, or inventing new ones, to answer them.

In the past we have had great success with driverless cars: self-driving vehicles equipped with ORI-developed autonomy software were tested successfully in public for the first time in the UK. In this project, however, we wanted to push autonomy forward and unlock its potential beyond driverless cars, working towards full Level 5 autonomy. We wanted a rugged platform that is robust to all weather conditions, can drive in places we have not typically seen robots drive before, and can operate for many hours with minimal expert supervision.

What were our design principles?

Long-term autonomous operation: A key requirement for our new platform was that the vehicle should be able to operate for long periods (weeks or months) reliably and unattended, except for being put on charge. We want it to perform tasks such as inspection and transport in diverse, unstructured environments and in all weather conditions.

Terrain capabilities: To enable the widest possible scope for deployment we wanted the vehicle to have little or no impact on the surfaces it drives on. This is essential in getting permission from landowners for the new platform to drive on manicured lawns (see the picture shown above).

Simplicity of use and human safety: The safety of the humans and objects around the platform was one of our primary concerns. The goal is that non-technical users should be able to operate and work alongside the new platform with minimal prior knowledge.

What’s the sensor payload?

We recognise, of course, that many of the challenges in mobile autonomy have been met with great success by cameras and LiDARs. To this end, our sensor payload includes stereo and monocular vision to enable research into dealing with challenging environmental and weather conditions. We also use LiDARs within our software safety curtain (SSC) to gently slow the platform down when it approaches obstacles.
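The safety-curtain behaviour of gently slowing rather than hard-stopping can be pictured as a simple ramp on the commanded speed. The sketch below is purely illustrative, assuming a linear ramp and made-up thresholds; it is not the ORI implementation:

```python
# Illustrative sketch (not the ORI code) of a software safety curtain:
# scale the commanded speed down smoothly as the nearest LiDAR return
# gets closer, stopping entirely inside a hard-stop radius.

def safety_scaled_speed(cmd_speed: float, nearest_obstacle_m: float,
                        stop_m: float = 1.0, slow_m: float = 5.0) -> float:
    """Linearly ramp speed from full (beyond slow_m) to zero (at stop_m)."""
    if nearest_obstacle_m <= stop_m:
        return 0.0  # inside the hard-stop radius: halt
    if nearest_obstacle_m >= slow_m:
        return cmd_speed  # far away: no intervention
    # in between: linear interpolation between stop and full speed
    scale = (nearest_obstacle_m - stop_m) / (slow_m - stop_m)
    return cmd_speed * scale

# e.g. an obstacle at 3 m halves a 2 m/s command to 1 m/s
```

A smooth ramp like this is what lets the platform coexist comfortably with pedestrians instead of braking abruptly whenever someone walks nearby.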

Human-robot interaction with the new platform is easy! Our remote monitoring system, shown here, includes sensor visualisations (camera, laser) as well as the autonomy state and the plans the robot is executing.

However, to increase the new platform’s resilience and reliability in all conditions, the main sensor for perception is a CTS350-X Navtech Frequency-Modulated Continuous-Wave (FMCW) scanning radar.
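For readers unfamiliar with FMCW sensing: such a radar transmits a linear frequency sweep (chirp) and measures the "beat" frequency between the transmitted and received signals, which is proportional to range. The sketch below illustrates this principle with made-up parameters; it does not use the CTS350-X's actual bandwidth or sweep time:

```python
# Illustrative FMCW ranging sketch (parameters are hypothetical,
# not Navtech CTS350-X specifications).

C = 299_792_458.0  # speed of light, m/s

def range_from_beat(beat_hz: float, bandwidth_hz: float, sweep_s: float) -> float:
    """Range in metres from the measured beat frequency.

    The chirp slope is S = bandwidth / sweep time, and the round-trip
    delay produces a beat frequency f_b = 2 * R * S / c, so
    R = c * f_b / (2 * S).
    """
    slope = bandwidth_hz / sweep_s
    return C * beat_hz / (2.0 * slope)

# Example: a 1 GHz sweep over 1 ms gives a slope of 1e12 Hz/s,
# so a 667 kHz beat corresponds to a target roughly 100 m away.
r = range_from_beat(beat_hz=667_000, bandwidth_hz=1e9, sweep_s=1e-3)
```

Because range is read out in the frequency domain rather than from the time-of-flight of a short pulse, the sensor integrates energy over the whole sweep, which is part of why it remains usable in rain, fog, and snow.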

Why radar?

Compared to cameras and LiDARs, radar suffers far fewer ill effects from adverse weather conditions such as rain, fog, or snow. As an institute, we have produced a substantial body of recent research demonstrating that radar can be used for obstacle detection, odometry, and localisation:

  • [PDF] S. H. Cen and P. Newman, “Precise Ego-Motion Estimation with Millimeter-Wave Radar under Diverse and Challenging Conditions,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 2018.
    [Bibtex]
    @InProceedings{2018ICRA_cen,
    author = {Cen, Sarah H. and Newman, Paul},
    title = {Precise Ego-Motion Estimation with Millimeter-Wave Radar under Diverse and Challenging Conditions},
    booktitle = {Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
    year = {2018},
    address = {Brisbane, Australia},
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2018ICRA_cen.pdf},
    }
  • [PDF] S. Cen and P. Newman, “Radar-only ego-motion estimation in difficult settings via graph matching,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019.
    [Bibtex]
    @InProceedings{2019ICRA_cen,
    author = {Cen, Sarah and Newman, Paul},
    title = {Radar-only ego-motion estimation in difficult settings via graph matching},
    booktitle = {Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
    address = {Montreal, Canada},
    year = {2019},
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2019ICRA_cen.pdf},
    }
  • [PDF] R. Aldera, D. De Martini, M. Gadd, and P. Newman, “Fast Radar Motion Estimation with a Learnt Focus of Attention using Weak Supervision,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019.
    [Bibtex]
    @InProceedings{2019ICRA_aldera,
    author = {Aldera, Roberto and De Martini, Daniele and Gadd, Matthew and Newman, Paul},
    title = {Fast Radar Motion Estimation with a Learnt Focus of Attention using Weak Supervision},
    booktitle = {Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
    address = {Montreal, Canada},
    year = {2019},
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2019ICRA_aldera.pdf},
    }
  • [PDF] R. Aldera, D. De Martini, M. Gadd, and P. Newman, “What Could Go Wrong? Introspective Radar Odometry in Challenging Environments,” in IEEE Intelligent Transportation Systems (ITSC) Conference, Auckland, New Zealand, 2019.
    [Bibtex]
    @InProceedings{2019ITSC_aldera,
    author = {Aldera, Roberto and De Martini, Daniele and Gadd, Matthew and Newman, Paul},
    title = {{What Could Go Wrong? Introspective Radar Odometry in Challenging Environments}},
    booktitle = {{IEEE Intelligent Transportation Systems (ITSC) Conference}},
    year = {2019},
    address = {Auckland, New Zealand},
    month = {October},
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2019ITSC_aldera.pdf},
    }
  • [PDF] R. Weston, S. Cen, P. Newman, and I. Posner, “Probably Unknown: Deep Inverse Sensor Modelling Radar,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019.
    [Bibtex]
    @InProceedings{ICRA19_weston,
    author = {Weston, Rob and Cen, Sarah and Newman, Paul and Posner, Ingmar},
    title = {Probably Unknown: Deep Inverse Sensor Modelling Radar},
    booktitle = {Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
    year = {2019},
    address = {Montreal, Canada},
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/ICRA19_weston.pdf},
    }
  • [PDF] D. Barnes, R. Weston, and I. Posner, “Masking by Moving: Learning Distraction-Free Radar Odometry from Pose Information,” arXiv preprint arXiv:1909.03752, 2019.
    [Bibtex]
    @Article{MaskingByMovingArXiv,
    author = {Barnes, Dan and Weston, Rob and Posner, Ingmar},
    title = {Masking by Moving: Learning Distraction-Free Radar Odometry from Pose Information},
    journal = {arXiv preprint arXiv:1909.03752},
    url = {https://arxiv.org/pdf/1909.03752},
    pdf = {https://arxiv.org/pdf/1909.03752.pdf},
    year = {2019},
    }

We are also inspired by the recently released Oxford Radar RobotCar Dataset, which advocates increased exploitation of this class of radar for autonomous vehicle research,

  • [PDF] D. Barnes, M. Gadd, P. Murcutt, P. Newman, and I. Posner, “The Oxford Radar RobotCar Dataset: A Radar Extension to the Oxford RobotCar Dataset,” arXiv preprint arXiv:1909.01300, 2019 (link).
    [Bibtex]
    @Article{RadarRobotCarDatasetArXiv,
    author = {Barnes, Dan and Gadd, Matthew and Murcutt, Paul and Newman, Paul and Posner, Ingmar},
    title = {The Oxford Radar RobotCar Dataset: A Radar Extension to the Oxford RobotCar Dataset},
    journal = {arXiv preprint arXiv:1909.01300},
    year = {2019},
    pdf = {https://arxiv.org/pdf/1909.01300.pdf},
    url = {https://arxiv.org/pdf/1909.01300},
    }

and hope to release interesting radar datasets collected with our new platform.

What are we doing with the platform?

This new platform will take an important role in the ORI’s ongoing long-term autonomy efforts, navigating completely unattended for days at a time. In the near future we aim to begin a long-term deployment of the platform in wide-open spaces frequently occupied by people; the video below shows the platform in action during our initial scout of the Blenheim Palace site. Concretely, in addition to ongoing work on human-robot interaction, we are pushing for full closed-loop autonomy using radar inputs alone, in order to make the future operation of this platform immune to poor weather conditions.