Classic sensor systems for mobile robotic platforms depend primarily on vision and lidar sensors, incorporating GPS and IMU measurements for additional robustness. Although radar has been well studied for target detection over the past century, its role in mobile robotics has been limited. Radars have mostly served as warning sensors, alerting autonomous systems when an object is very near, or as velocity sensors that utilise the Doppler effect. These applications of radar, while effective, do not take full advantage of the rich information encoded in its output.
One of the directions ORI intends to pursue is the incorporation of radar more thoroughly into mobile autonomy. Often overlooked due to its noisy returns and its lower spatial and temporal resolution compared to lidar, radar is nonetheless a promising alternative sensor that complements the current sensor suite. It is an on-board, long-range, 2.5D sensor that is robust to adverse conditions. These characteristics mean that, among the other benefits it offers, radar is capable of detecting stable environmental features under variable weather and lighting, making it ideal for tasks like ego-motion estimation, localisation, path planning, and obstacle avoidance.
To substantiate these efforts, we have demonstrated precise ego-motion estimation under diverse conditions (e.g., in the dark and rain, indoors and outdoors) using only a frequency-modulated continuous-wave (FMCW) scanning radar. The radar odometry (RO) estimates closely match those of state-of-the-art GPS and visual odometry (VO) systems under ideal conditions for the latter two sensors. Furthermore, radar operates well even when the other two fail, as shown in the figure below.
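As a brief aside on the sensing principle, an FMCW radar infers range from the beat frequency between the transmitted chirp and its echo. The sketch below illustrates this for an idealised linear sawtooth chirp; it is not the method from the paper, and the parameter values are hypothetical.

```python
# Illustrative sketch (not from the paper): converting an FMCW radar's
# measured beat frequency into a range estimate, assuming an idealised
# linear sawtooth chirp. All parameter values below are hypothetical.

C = 299_792_458.0  # speed of light, m/s


def fmcw_range(beat_freq_hz: float, bandwidth_hz: float, sweep_time_s: float) -> float:
    """Range r = c * f_b * T / (2 * B) for a linear chirp of bandwidth B
    swept over duration T, given measured beat frequency f_b."""
    return C * beat_freq_hz * sweep_time_s / (2.0 * bandwidth_hz)


# Example: a 1 GHz bandwidth swept in 1 ms; a 100 kHz beat frequency
# then corresponds to a target at roughly 15 m.
r = fmcw_range(100e3, 1e9, 1e-3)
print(f"estimated range: {r:.2f} m")
```

The larger the swept bandwidth, the finer the range resolution, which is one reason FMCW scanning radars can resolve stable environmental features useful for odometry.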
These results and the accompanying methodology are presented in the paper by Sarah Huiyi Cen and Paul Newman, titled “Precise Ego-Motion Estimation with Millimeter-Wave Radar under Diverse and Challenging Conditions”, which appeared at ICRA 2018. A summary video is provided below.