We have published a number of datasets. Under each heading you will find the associated paper and, where relevant, a link to the dataset's website. We also make some data from other publications available as-is.
Newer College Dataset
The Newer College Dataset revisits New College, Oxford with modern sensors and ground truth. The dataset was collected at human walking speed using a compact handheld device equipped with a RealSense D435i stereo camera, an Ouster OS-1 LIDAR and embedded IMU sensors. A millimetre-accurate point cloud of the college, acquired with a professional laser scanner, is used to generate ground-truth poses at 10 Hz in post-processing, allowing visual-inertial algorithms to be evaluated outdoors at unprecedentedly small scales.
You can find it here: Newer College Dataset
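The ground-truth poses are intended for trajectory evaluation. As an illustration only (not the dataset's official tooling), a common metric is absolute trajectory error (ATE): rigidly align the estimated trajectory to the ground truth, then take the RMSE of the position residuals. A minimal NumPy sketch, assuming both trajectories are already time-synchronised arrays of 3D positions:

```python
import numpy as np

def absolute_trajectory_error(gt_xyz, est_xyz):
    """RMSE between ground-truth and estimated positions after a rigid
    (rotation + translation) alignment, computed via the Kabsch method.

    gt_xyz, est_xyz: (N, 3) arrays of corresponding positions.
    """
    # Centre both trajectories on their means
    mu_gt, mu_est = gt_xyz.mean(axis=0), est_xyz.mean(axis=0)
    gt_c, est_c = gt_xyz - mu_gt, est_xyz - mu_est
    # Optimal rotation from the SVD of the cross-covariance matrix
    U, _, Vt = np.linalg.svd(est_c.T @ gt_c)
    S = np.eye(3)
    S[2, 2] = np.sign(np.linalg.det(U @ Vt))  # guard against reflections
    R = Vt.T @ S @ U.T                        # maps estimate into GT frame
    t = mu_gt - R @ mu_est
    aligned = est_xyz @ R.T + t
    return np.sqrt(np.mean(np.sum((aligned - gt_xyz) ** 2, axis=1)))
```

With a perfect estimate that differs from ground truth only by a rigid transform, the returned error is (numerically) zero; any residual after alignment reflects genuine drift.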
Radar RobotCar Dataset
The Oxford Radar RobotCar Dataset is a new dataset for researching scene understanding using Millimetre-Wave FMCW scanning radar data. The target application is autonomous vehicles, where this modality remains unencumbered by environmental conditions such as fog, rain, snow, or lens flare, which typically challenge other sensor modalities such as vision and LIDAR.
You can find it here: Radar RobotCar Dataset
D. Barnes, M. Gadd, P. Murcutt, P. Newman, and I. Posner, “The Oxford Radar RobotCar Dataset: A Radar Extension to the Oxford RobotCar Dataset,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Paris, 2020.
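A scanning radar returns a polar scan (azimuth x range bins), which is often resampled onto a Cartesian grid before feeding it to vision-style pipelines. A minimal nearest-neighbour sketch of that resampling, with assumed parameter values (range resolution, grid size) that should be taken from the dataset documentation rather than from here:

```python
import numpy as np

def polar_to_cartesian(scan, range_res=0.0432, out_size=512, cart_res=0.25):
    """Resample a polar radar scan onto a Cartesian grid (nearest neighbour).

    scan:      (num_azimuths, num_range_bins) array of power returns;
               azimuth 0 is assumed along +x, increasing anticlockwise.
    range_res: metres per range bin (assumed value).
    cart_res:  metres per output pixel; the grid is centred on the sensor.
    """
    n_az, n_bins = scan.shape
    half = out_size * cart_res / 2.0
    # Metric coordinates of every output pixel centre
    coords = (np.arange(out_size) + 0.5) * cart_res - half
    xx, yy = np.meshgrid(coords, coords)
    # Convert each pixel to (range, azimuth) and look up the nearest bin
    rng_m = np.hypot(xx, yy)
    az = np.mod(np.arctan2(yy, xx), 2 * np.pi)
    bin_idx = np.round(rng_m / range_res).astype(int)
    az_idx = np.round(az / (2 * np.pi) * n_az).astype(int) % n_az
    valid = bin_idx < n_bins  # pixels beyond the maximum range stay zero
    cart = np.zeros((out_size, out_size), dtype=scan.dtype)
    cart[valid] = scan[az_idx[valid], bin_idx[valid]]
    return cart
```

Bilinear interpolation over the polar grid gives smoother output; nearest-neighbour is shown here only to keep the geometry explicit.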
RobotCar Dataset
The Oxford RobotCar Dataset contains over 100 repetitions of a consistent route through Oxford, UK, captured over the course of a year. The dataset covers many different combinations of weather, traffic and pedestrians, along with longer-term changes such as construction and roadworks.
You can find it here: http://robotcar-dataset.robots.ox.ac.uk/
Data used in “Navigating, Recognising and Describing Urban Spaces With Vision and Laser”
Note: alog files can be parsed by the software available on The New College Vision and Laser Dataset page (see below). If you are prompted for a username and password, use the following:
P. Newman, G. Sibley, M. Smith, M. Cummins, A. Harrison, C. Mei, I. Posner, R. Shade, D. Schroeter, L. Murphy, W. Churchill, D. Cole, and I. Reid, “Navigating, Recognising and Describing Urban Spaces With Vision and Laser,” The International Journal of Robotics Research, vol. 28, 2009.
FABMAP Multimedia Extension Dataset
FABMAP 10k and 100k word vocabularies can be requested here.
New College Vision and Laser Dataset
M. Smith, I. Baldwin, W. Churchill, R. Paul, and P. Newman, “The New College Vision and Laser Data Set,” The International Journal of Robotics Research, vol. 28, iss. 5, pp. 595-599, 2009.
A simulation-based dataset of 20,000 stack configurations composed of a variety of elementary geometric primitives, richly annotated with semantic and structural-stability labels. (A collaboration between ORI and Andrea Vedaldi of the Visual Geometry Group, Department of Engineering Science.)