
Hulk

The Hulk started life as an SK-48 Stalker, built by Mean Green Mowers, hence the moniker.

Developed with long-term autonomy in mind, it is equipped with a range of camera, laser, and radar sensors.

Our new all-weather platform pictured outside Blenheim Palace. For more information, please take a look at the paper as well as the presentation.

This page provides an overview of our paper, “The Hulk: Design and Development of a Weather-proof Vehicle for Long-term Autonomy in Outdoor Environments”, recently presented by Stephen Kyberd at the 12th Conference on Field and Service Robotics (FSR) in Tokyo, Japan, as well as of the ongoing work by our engineers and researchers in deploying this exciting new platform in challenging conditions and places.

  • S. Kyberd, J. Attias, P. Get, P. Murcutt, C. Prahacs, M. Towlson, S. Venn, A. Vasconcelos, M. Gadd, D. De Martini, and P. Newman, “The Hulk: Design and Development of a Weather-proof Vehicle for Long-term Autonomy in Outdoor Environments,” in Proceedings of the International Conference on Field and Service Robotics (FSR), Tokyo, Japan, 2019.

Why have we built this new platform?

At the ORI, we commonly approach robotics by:

  • Isolating the key questions that arise when fielding complex systems, and
  • Augmenting or inventing new techniques to solve them.

In the past we have had great success with driverless cars: self-driving vehicles equipped with ORI-developed autonomy software were successfully tested in public for the first time in the UK. In this project, however, we wanted to push autonomy further, testing full Level 5 autonomy beyond driverless cars alone. We wanted a rugged platform that is robust to all weather conditions, can drive in places we have not typically seen robots drive before, and can operate for many hours with minimal expert supervision.

What were our design principles?

Long-term autonomous operation: A key requirement for our new platform was that the vehicle should operate for long periods – weeks or months – reliably and unattended, apart from being put on charge. We want it to perform tasks such as inspection and transport in diverse, unstructured environments and weather conditions.

Terrain capabilities: To enable the widest possible scope for deployment, we wanted the vehicle to have little or no impact on the surfaces it drives on. This is essential for getting landowners' permission for the new platform to drive on manicured lawns (see the picture above).

Simplicity of use and human safety: The safety of humans and objects around the platform was one of our primary concerns. The goal was that non-technical users should be able to operate and work alongside the new platform with minimal prior knowledge.

What’s the sensor payload?

We recognise, of course, that many of the challenges in mobile autonomy have been met with great success by cameras and LiDARs. To this end, our sensor payload includes stereo and monocular vision to enable research into dealing with challenging environmental and weather conditions. We also use LiDARs within our software safety curtain (SSC) to gently slow the platform down when it approaches obstacles.
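The control law behind the safety curtain is not spelled out on this page, but a minimal sketch of the idea in Python might look like the following: map the closest return in a LiDAR scan to a speed cap. The function name, thresholds, and speeds here are illustrative assumptions, not the platform's actual parameters.

```python
# Minimal sketch of a LiDAR-based "safety curtain" speed governor.
# All names and thresholds are illustrative assumptions.

def safe_speed_limit(ranges_m, slow_zone_m=5.0, stop_zone_m=1.0,
                     max_speed_mps=1.5):
    """Map the closest LiDAR return in a scan to a speed cap (m/s):
    full speed outside the slow zone, linearly reduced inside it,
    and zero inside the stop zone."""
    nearest = min(ranges_m)
    if nearest <= stop_zone_m:
        return 0.0                    # obstacle too close: stop
    if nearest >= slow_zone_m:
        return max_speed_mps          # clear ahead: full speed
    # Linearly interpolate between stopped and full speed.
    fraction = (nearest - stop_zone_m) / (slow_zone_m - stop_zone_m)
    return fraction * max_speed_mps

# Example: an obstacle 3 m away caps speed at half of maximum.
print(safe_speed_limit([12.0, 3.0, 8.5]))  # -> 0.75
```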


Human-robot interaction with the new platform is easy! Our remote monitoring system, shown here, includes sensor visualisations (camera, laser) as well as the autonomy state and the plans the robot is executing.

However, to increase the new platform’s resilience and reliability in all conditions, the main sensor for perception is a Navtech CTS350-X Frequency-Modulated Continuous-Wave (FMCW) scanning radar.
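For readers unfamiliar with this class of sensor: an FMCW radar transmits a frequency "chirp" and mixes the returning echo with the outgoing signal, and the resulting beat frequency is proportional to target range. A toy calculation of that relationship, using generic chirp parameters rather than the CTS350-X's actual specification:

```python
# Range from beat frequency for a linear FMCW chirp.
# Chirp parameters below are generic illustrations only.

C = 299_792_458.0  # speed of light, m/s

def range_from_beat(beat_hz, bandwidth_hz, sweep_s):
    """A linear chirp of bandwidth B swept over time T gives a target
    at range R a beat frequency f_b = 2*R*B / (c*T); invert for R."""
    return beat_hz * C * sweep_s / (2.0 * bandwidth_hz)

# Example: a 1 GHz sweep over 1 ms; a 100 kHz beat tone corresponds
# to a target roughly 15 m away.
print(range_from_beat(100e3, 1e9, 1e-3))  # ~14.99
```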

Why radar?

Compared to cameras and LiDARs, radar suffers far fewer ill effects from adverse weather conditions such as rain, fog, or snow. As an institute, we have produced a substantial body of recent research demonstrating that radar can be used for obstacle detection, odometry, and localisation; a toy illustration of the idea follows the list below. Here is some of that research:

  • S. H. Cen and P. Newman, “Precise Ego-Motion Estimation with Millimeter-Wave Radar under Diverse and Challenging Conditions,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 2018.
  • S. Cen and P. Newman, “Radar-only Ego-Motion Estimation in Difficult Settings via Graph Matching,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019.
  • R. Aldera, D. De Martini, M. Gadd, and P. Newman, “Fast Radar Motion Estimation with a Learnt Focus of Attention using Weak Supervision,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019.
  • R. Aldera, D. De Martini, M. Gadd, and P. Newman, “What Could Go Wrong? Introspective Radar Odometry in Challenging Environments,” in Proceedings of the IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 2019.
  • R. Weston, S. Cen, P. Newman, and I. Posner, “Probably Unknown: Deep Inverse Sensor Modelling Radar,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada, 2019.
  • D. Barnes, R. Weston, and I. Posner, “Masking by Moving: Learning Distraction-Free Radar Odometry from Pose Information,” in Proceedings of the Conference on Robot Learning (CoRL), 2019.
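As the toy illustration promised above: a scanning radar returns a polar grid of power over azimuth and range, and the rotation between two scans can be recovered, crudely, by circular cross-correlation of their azimuth power profiles. This sketch is for intuition only; the papers above use far more sophisticated machinery, and all shapes and values here are invented.

```python
# Toy radar "odometry": recover the rotation between two polar scans
# by circular cross-correlation over azimuth. Intuition only.
import numpy as np

def estimate_rotation(scan_a, scan_b):
    """scan_a, scan_b: arrays of shape (azimuths, range_bins).
    Returns the azimuth shift (in bins) by which scan_b appears
    rotated relative to scan_a."""
    profile_a = scan_a.mean(axis=1)   # per-azimuth mean power
    profile_b = scan_b.mean(axis=1)
    scores = [np.dot(np.roll(profile_a, s), profile_b)
              for s in range(len(profile_a))]
    return int(np.argmax(scores))

# Example: rotate a random scan by 37 azimuth bins, recover the shift.
rng = np.random.default_rng(0)
scan = rng.random((400, 128))
print(estimate_rotation(scan, np.roll(scan, 37, axis=0)))  # -> 37
```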

We are also inspired by the recently released Oxford Radar RobotCar Dataset, which advocates increased exploitation of this class of radar for autonomous vehicle research, and we hope to release interesting radar datasets collected with our new platform.

What are we doing with the platform?

This new platform will play an important role in ORI’s ongoing long-term autonomy efforts, navigating completely unmanned for days at a time. In the near future we aim to begin a long-term deployment of the platform in wide-open spaces frequently occupied by people; the video below shows the platform in action during our initial scout of the Blenheim Palace site. Concretely, in addition to ongoing work on human-robot interaction, we are pushing for full closed-loop autonomy using radar inputs alone, in order to make the future operation of this platform immune to poor weather conditions.
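To make "closed-loop autonomy using radar inputs alone" concrete, here is a schematic, heavily simplified sense-plan-act loop. Everything in it (the one-dimensional world, the fake radar, the proportional planner, the numbers) is a toy stand-in, not the platform's actual stack.

```python
# Schematic radar-only sense-plan-act loop in a toy 1-D world.
# All components and numbers are invented stand-ins.

def fake_radar(robot_x, obstacle_x=8.0):
    """Stand-in sensor: clearance to the one obstacle in the toy world."""
    return abs(obstacle_x - robot_x)

def plan_speed(dist_to_goal, speed_cap):
    """Proportional controller toward the goal, capped for safety."""
    return min(0.5 * dist_to_goal, speed_cap)

x, goal, dt = 0.0, 6.0, 0.1
while goal - x > 0.05:
    clearance = fake_radar(x)                          # sense
    cap = min(1.5, max(0.0, 0.5 * (clearance - 1.0)))  # safety curtain
    x += plan_speed(goal - x, cap) * dt                # plan and act
print(f"reached x = {x:.2f}")
```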