Localisation answers the all-important question: “Where am I?” This is a fundamental requirement for mobile robots: to be useful, they must have a sense of place. Exactly how this question is framed, and indeed how it is answered, is an important facet of our work.
We place a great deal of emphasis on Infrastructure-Free Navigation – that is, figuring out where our vehicles (robots or cameras) are without having to modify the environment or depend on any bespoke hardware (like GPS) in the workspace.
Why is this so important? Well, if we can make our mobile machines independently capable, using only onboard sensors and computing, then they are inherently more flexible, more useful and, vitally, cheaper to use. This matters to us. Of course we have no problem using systems like GPS when they are available (so only in outdoor settings), but we will never depend on them entirely. That is too limiting.
In this research topic we examine large scale, long duration localisation, both indoors and outdoors, day and night, in rain, snow and even some sunshine. In doing this we come across profound questions regarding spatial representations, dealing with stark scene changes (from midnight to midday) and handling slowly changing structure.
We typically use cameras or 2D lasers as sensors, as these are cheap and ubiquitous, but we have been known to use millimetre-wave radar as well.
This work is about metric localisation across extreme lighting and weather conditions. The typical approach in robot vision is to use a point-feature-based system for localisation tasks. However, these systems typically fail when appearance changes are too drastic. This research takes a ... Read More
This work is about extending the reach and endurance of outdoor localisation using stereo vision. At the heart of the localisation is the fundamental task of discovering feature correspondences between recorded and live images. One aspect of this problem involves ... Read More
This work addresses the challenging problem of vision-based pose estimation in busy and distracting urban environments. By leveraging laser-generated 3D scene priors, we demonstrate how distracting objects of arbitrary types can be identified and masked in order to improve egomotion ... Read More
This work addresses the difficult problem of navigation in changing, dynamic environments. Assuming the world is static in appearance results in brittle mapping and localisation systems. Change comes from many sources (dynamic objects, time of day, weather, seasons) and over different time ... Read More
In outdoor environments shadows are common. These typically strong visual features cause considerable change in the appearance of a place, and therefore confound vision-based localisation approaches. In this work we describe how to convert a colour image of the ... Read More
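A common recipe for this kind of shadow suppression (a Finlayson-style log-chromaticity mapping, widely used in related work; the exact formulation in the paper above may differ) collapses each RGB pixel to a single channel that is approximately invariant to blackbody illumination. A minimal sketch, where the `alpha` value is illustrative and camera-dependent, not taken from the paper:

```python
import numpy as np

def illumination_invariant(rgb, alpha=0.48):
    """Map an RGB image (floats in (0, 1]) to a one-channel image that is
    approximately invariant to blackbody illumination, so that shadowed and
    sunlit patches of the same surface map to similar values.

    alpha depends on the camera's spectral responses; 0.48 is purely an
    illustrative value, not a universal constant.
    """
    rgb = np.clip(np.asarray(rgb, dtype=np.float64), 1e-6, None)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Log-chromaticity combination: a uniform intensity scaling of the pixel
    # adds log(c) * (1 - alpha - (1 - alpha)) = 0, so it cancels out.
    return 0.5 + np.log(g) - alpha * np.log(b) - (1.0 - alpha) * np.log(r)
```

Note the built-in invariance: halving all three channels of a pixel (as a sharp shadow roughly does to overall brightness) leaves the output unchanged.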
This work performs visual localisation using synthesised local orthographic imagery. We exploit state of the art stereo visual odometry (VO) on our survey vehicle to generate high precision synthetic orthographic images of the road surface as would be seen from overhead ... Read More
Abstract—This paper is about estimating a smooth, continuous-time trajectory of a vehicle relative to a prior 3D laser map. We pose the estimation problem as that of finding a sequence of Catmull-Rom splines which optimise the Kernelised Rényi Distance (KRD) ... Read More
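The full system above optimises a sequence of splines against the KRD objective over a laser map; the interpolation primitive itself is simple. As a minimal sketch of one uniform Catmull-Rom segment (positions only, not the paper's full continuous-time SE(3)/KRD machinery):

```python
import numpy as np

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a uniform Catmull-Rom spline segment at t in [0, 1].

    The curve interpolates p1 (at t=0) and p2 (at t=1); p0 and p3 shape the
    tangents, giving a C1-continuous trajectory through a sequence of
    waypoints. Points are numpy arrays (e.g. 2D/3D vehicle positions).
    """
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    return 0.5 * ((2.0 * p1)
                  + (-p0 + p2) * t
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t ** 2
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t ** 3)
```

Because the segment passes exactly through its two middle control points, chaining segments over successive waypoint quadruples yields a smooth, continuous-time trajectory that can be queried at any timestamp.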
Abstract— This paper describes a probabilistic framework for appearance based navigation and mapping using spatial and visual appearance data. Like much recent work on appearance based navigation we adopt a bag-of-words approach in which positive or negative observations of visual ... Read More
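The framework above is probabilistic, reasoning over positive and negative word observations; as a simpler illustrative baseline (not the paper's model), bag-of-words place recognition can be sketched as TF-IDF-weighted cosine scoring of visual-word histograms:

```python
import numpy as np

def tfidf_similarity(query_hist, map_hists):
    """Score a live image against stored places using TF-IDF-weighted
    cosine similarity of bag-of-visual-words histograms.

    query_hist: (V,) counts of visual words in the live image.
    map_hists:  (N, V) counts for the N previously mapped places.
    Returns an (N,) array of similarity scores in [0, 1].
    """
    map_hists = np.asarray(map_hists, dtype=np.float64)
    query = np.asarray(query_hist, dtype=np.float64)
    n_places = map_hists.shape[0]
    # Inverse document frequency: words seen in few places are more
    # discriminative than words seen everywhere.
    df = np.count_nonzero(map_hists, axis=0)
    idf = np.log((n_places + 1) / (df + 1)) + 1.0
    q = query * idf
    m = map_hists * idf
    qn = q / (np.linalg.norm(q) + 1e-12)
    mn = m / (np.linalg.norm(m, axis=1, keepdims=True) + 1e-12)
    return mn @ qn
```

The highest-scoring place is the loop-closure candidate; a probabilistic treatment like the paper's additionally accounts for word co-occurrence and the possibility that the query matches no mapped place at all.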
Abstract— This paper is about pose estimation using monocular cameras with a 3D laser pointcloud as a workspace prior. We have in mind autonomous transport systems in which low cost vehicles equipped with monocular cameras are furnished with preprocessed 3D ... Read More
In this paper we consider long-term navigation using fixed 2D LIDARs. We consider how localization algorithms based on scan-matching - commonly used in indoor environments - are prone to failure when exposed to a challenging real-world outdoor environment. The driving motivation behind this work is to ... Read More
In this paper we describe and demonstrate a method for precisely localizing a road vehicle using a single push-broom 2D laser scanner while leveraging a prior 3D survey. In contrast to conventional scan matching, our laser is oriented downwards, thus causing continual ground strike. Our ... Read More
Abstract - In this paper we describe a relative approach to simultaneous localisation and mapping, based on the insight that a continuous relative representation can make the problem tractable at large scales. First, it is well known that bundle adjustment is ... Read More
This paper is about online, constant-time pose estimation for road vehicles. We exploit both the state of the art in vision based SLAM and the wide availability of overhead imagery of road networks. We show that by formulating the ... Read More
This paper is concerned with large-scale localisation at city scales with monocular cameras. Our primary motivation lies with the development of autonomous road vehicles — an application domain in which low-cost sensing is particularly important. Here we present a method ... Read More
Successful approaches to autonomous vehicle localisation and navigation typically involve 3D LIDAR scanners and a static, curated 3D map, both of which are expensive to acquire and maintain. We propose an experience-based approach to matching a local 3D swathe built ... Read More
This paper is about life-long vast-scale localisation in spite of changes in weather, lighting and scene structure. Building upon our previous work in Experience-based Navigation, we continually grow and curate a visual map of the world that explicitly supports multiple ... Read More
Abstract—This paper is about localising at night in urban environments using vision. Despite it being dark exactly half of the time, surprisingly little attention has been given to this problem. A defining aspect of night-time urban scenes is the presence and effect of ... Read More
Learning to See the Wood for the Trees: Deep Laser Localization in Urban and Natural Environments on a CPU. Georgi Tinchev, Adrian Penate-Sanchez, Maurice Fallon. IEEE Robotics and Automation Letters / IEEE International Conference on Robotics and Automation (RA-L/ICRA), 2019. ... Read More