ICRA


Where did my robot go? Being robust and cautious during Laser-based Localization

Let’s imagine a scenario in which a robot, walking on two or four legs or moving on wheels, is required to explore a cluttered environment containing corridors, constrictions, uneven terrain or staircases. The robot would process proprioceptive inertial and leg/wheel odometry measurements, as well as exteroceptive observations from a 3D laser scanner. The odometry estimate [...]
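
To make the kind of fusion the excerpt describes concrete, here is a minimal Python sketch that combines an odometry-predicted pose with a laser-derived pose estimate in information form. The 2D [x, y, yaw] parameterisation, the function name fuse_pose and the covariance inputs are illustrative assumptions, not the method proposed in the paper.

    import numpy as np

    def fuse_pose(odom_pose, laser_pose, odom_cov, laser_cov):
        """Fuse an odometry-predicted pose with a laser-derived pose estimate.

        Poses are [x, y, yaw] arrays; the two estimates are assumed close
        enough that yaw needs no wrap-around handling. Information-form
        weighting of two Gaussian estimates, purely for illustration.
        """
        info_odom = np.linalg.inv(odom_cov)
        info_laser = np.linalg.inv(laser_cov)
        fused_cov = np.linalg.inv(info_odom + info_laser)
        fused_pose = fused_cov @ (info_odom @ odom_pose + info_laser @ laser_pose)
        return fused_pose, fused_cov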


From Dusk till Dawn: Localisation at Night using Artificial Light Sources

Abstract—This paper is about localising at night in urban environments using vision. Despite it being dark exactly half of the time, surprisingly little attention has been given to this problem. A defining aspect of night-time urban scenes is the presence and effect of artificial lighting -- be that street lighting or interior lighting seen through windows. By [...]


Work Smart, Not Hard: Recalling Relevant Experiences for Vast-Scale but Time-Constrained Localisation

This paper is about life-long vast-scale localisation in spite of changes in weather, lighting and scene structure. Building upon our previous work in Experience-based Navigation, we continually grow and curate a visual map of the world that explicitly supports multiple representations of the same place. We refer to these representations as experiences, where a single [...]
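
The excerpt describes recalling only a subset of stored experiences when localisation time is limited. The Python sketch below shows one way such a recall policy could look: rank experiences by a predicted relevance score and greedily fill a time budget. The names recall_experiences, matching_cost and predicted_relevance, as well as the greedy selection, are assumptions for illustration, not the ranking function proposed in the paper.

    def recall_experiences(matching_cost, predicted_relevance, time_budget):
        """Greedily choose which stored experiences to attempt to localise
        against, given a per-frame time budget.

        matching_cost maps experience id -> expected matching time (seconds);
        predicted_relevance maps the same ids -> a relevance score.
        """
        ranked = sorted(matching_cost, key=lambda e: predicted_relevance[e], reverse=True)
        selected, spent = [], 0.0
        for experience in ranked:
            if spent + matching_cost[experience] > time_budget:
                continue  # skip experiences that would exceed the budget
            selected.append(experience)
            spent += matching_cost[experience]
        return selected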


Leveraging Experience for Long-Term LIDAR Localisation In Changing Cities

 Successful approaches to autonomous vehicle localisation and navigation typically involve 3D LIDAR scanners and a static, curated 3D map, both of which are expensive to acquire and maintain. We propose an experience-based approach to matching a local 3D swathe built using a push-broom 2D LIDAR to a number of prior 3D maps, each of which [...]
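
As a rough illustration of how a local 3D swathe can be assembled from a push-broom 2D LIDAR, the sketch below transforms each 2D scan into a common local frame using interpolated sensor poses and stacks the points. The interfaces (scans, poses, build_swathe) are assumptions for illustration, not the paper's exact pipeline.

    import numpy as np

    def build_swathe(scans, poses):
        """Accumulate 2D push-broom LIDAR scans into a local 3D swathe.

        scans is a list of (N_i, 3) point arrays in the sensor frame (the
        scan plane embedded in 3D), and poses the corresponding 4x4 sensor
        poses obtained by interpolating the vehicle trajectory.
        """
        points = []
        for scan, pose in zip(scans, poses):
            homogeneous = np.hstack([scan, np.ones((scan.shape[0], 1))])
            points.append((pose @ homogeneous.T).T[:, :3])
        return np.vstack(points)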


Discovering and Mapping Complete Surfaces With Stereo

This paper is about the automated discovery and mapping of surfaces using a stereo pair. We begin with the observation that for any workspace which is topologically connected (i.e. one that does not contain free-flying islands) there exists a single surface that covers the entirety of the workspace. We call this surface the covering surface. We [...]


FAB-MAP 3D: Topological Mapping with Spatial and Visual Appearance

Abstract— This paper describes a probabilistic framework for appearance-based navigation and mapping using spatial and visual appearance data. Like much recent work on appearance-based navigation, we adopt a bag-of-words approach in which positive or negative observations of visual words in a scene are used to discriminate between already visited and new places. [...]
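
A minimal sketch of bag-of-words place comparison in the spirit of the excerpt: each visited place stores per-word observation probabilities, and a new observation is scored against every place. Treating words as independent is a deliberate simplification (FAB-MAP models word correlations with a Chow-Liu tree), FAB-MAP also reserves probability mass for the "new place" hypothesis, and all names here are illustrative assumptions.

    import math

    def best_matching_place(observation, place_models, prior):
        """Score an observation (the set of visual-word indices seen in the
        current scene) against previously visited places, bag-of-words style.

        place_models maps place -> {word: probability of observing that word
        there}; prior maps place -> prior probability of being there.
        """
        best_place, best_log_post = None, float("-inf")
        for place, word_probs in place_models.items():
            log_lik = 0.0
            for word, p in word_probs.items():
                p = min(max(p, 1e-6), 1.0 - 1e-6)  # avoid log(0)
                log_lik += math.log(p) if word in observation else math.log(1.0 - p)
            log_post = log_lik + math.log(prior[place])
            if log_post > best_log_post:
                best_place, best_log_post = place, log_post
        return best_place, best_log_post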


TICSync: Knowing When Things Happened

  Modern robotic systems are composed of many distributed processes sharing a common communications infrastructure. High-bandwidth sensor data is often collected on one computer and served to many consumers. It is vital that every device on the network agrees on how time is measured. If not, sensor data may be at best inconsistent and at [...]
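
To illustrate why agreement on time matters and how an offset can be bounded from message timestamps alone, here is a small Python sketch: because network latency is never negative, timestamped messages in each direction give one-sided bounds on the clock offset. This ignores clock skew, which TICSync additionally estimates, and the function and argument names are assumptions for illustration.

    def bound_clock_offset(forward, backward):
        """Bound the offset of clock B relative to clock A from one-way
        message timestamps.

        forward holds (sent_on_A, received_on_B) pairs, backward holds
        (sent_on_B, received_on_A) pairs.
        """
        # A -> B: received_on_B - sent_on_A = delay + offset >= offset
        upper = min(rb - sa for sa, rb in forward)
        # B -> A: received_on_A - sent_on_B = delay - offset >= -offset
        lower = max(sb - ra for sb, ra in backward)
        return lower, upper, 0.5 * (lower + upper)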


Cross-Calibration of Push-Broom 2D LIDARs and Cameras In Natural Scenes

Abstract— This paper addresses the problem of automatically estimating the relative pose between a push-broom LIDAR and a camera without the need for artificial calibration targets or other human intervention. Further, we do not require the sensors to have an overlapping field of view; it is enough that they observe the same scene, but at [...]
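
As a hedged sketch of target-free cross-calibration framed as an optimisation, the snippet below searches over a 6-DoF extrinsic parameter vector with a generic derivative-free optimiser, given some user-supplied alignment cost (for example, disagreement between projected LIDAR reflectance and image appearance over a traversed scene). The parameterisation, the cost function and the optimiser choice are all assumptions, not the paper's method.

    import numpy as np
    from scipy.optimize import minimize

    def calibrate_extrinsics(alignment_cost, initial_guess):
        """Find the LIDAR-to-camera transform, parameterised as
        [x, y, z, roll, pitch, yaw], that minimises alignment_cost.
        """
        result = minimize(alignment_cost,
                          np.asarray(initial_guess, dtype=float),
                          method="Nelder-Mead")
        return result.x, result.fun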


Road vehicle localization with 2D push-broom lidar and 3D priors

In this paper we describe and demonstrate a method for precisely localizing a road vehicle using a single push-broom 2D laser scanner while leveraging a prior 3D survey. In contrast to conventional scan matching, our laser is oriented downwards, thus causing continual ground strike. Our method exploits this to produce a small 3D swathe of laser data which can be matched [...]
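
Complementing the swathe-construction sketch above, the Python snippet below illustrates one simple way a locally built swathe could be scored against a prior 3D survey: exhaustively try (x, y, yaw) corrections and measure the mean nearest-neighbour residual against the prior point cloud. The brute-force search and the function names are illustrative assumptions, not the paper's matching step.

    import numpy as np
    from scipy.spatial import cKDTree

    def match_swathe_to_prior(swathe, prior_points, yaw_candidates, xy_candidates):
        """Score candidate pose corrections by how well the local swathe
        fits the prior 3D survey (lower cost is better)."""
        tree = cKDTree(prior_points)
        best_offset, best_cost = None, np.inf
        for yaw in yaw_candidates:
            c, s = np.cos(yaw), np.sin(yaw)
            rotation = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
            rotated = swathe @ rotation.T
            for dx, dy in xy_candidates:
                shifted = rotated + np.array([dx, dy, 0.0])
                distances, _ = tree.query(shifted)
                cost = float(np.mean(distances))
                if cost < best_cost:
                    best_offset, best_cost = (dx, dy, yaw), cost
        return best_offset, best_cost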


Planes, Trains and Automobiles. Autonomy for the Modern Robot

Abstract—We are concerned with enabling truly large-scale autonomous navigation in typical human environments. To this end we describe the acquisition and modeling of large urban spaces from data that reflects human sensory input. Over 181GB of image and inertial data are captured using head-mounted stereo cameras. This data is processed into a relative [...]
