Datasets
We have published a number of datasets. Under each heading you will find the associated paper, as well as a link to the dataset's website where relevant.
Oxford Indoor Human Motion Dataset [2024]
We present the Oxford Indoor Human Motion (Oxford-IHM) Dataset, a human trajectory prediction dataset in which humans walk between regions of interest in an indoor environment. We use static and robot-mounted RGB-D cameras to observe humans as they move between goals while being tracked by a motion-capture system.
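A common baseline for this kind of trajectory prediction task is constant-velocity extrapolation of the observed track. The sketch below illustrates the idea only; the 2D array layout and fixed frame rate are assumptions, not the Oxford-IHM format.

```python
import numpy as np

def constant_velocity_forecast(track, n_future):
    """Extrapolate a 2D track (T, 2) by repeating its most recent velocity.

    A standard baseline in human trajectory prediction benchmarks; the
    array layout here is an assumption, not the Oxford-IHM data format.
    """
    velocity = track[-1] - track[-2]           # displacement per frame
    steps = np.arange(1, n_future + 1)[:, None]
    return track[-1] + steps * velocity        # (n_future, 2) future positions

# Example: a person walking along +x at 0.1 m per frame.
observed = np.array([[0.0, 0.0], [0.1, 0.0], [0.2, 0.0]])
print(constant_velocity_forecast(observed, 3))  # [[0.3 0.] [0.4 0.] [0.5 0.]]
```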
Oxford Spires Dataset [2024]
The Oxford Spires Dataset was collected at multiple Oxford landmarks using a LiDAR and three colour cameras with an inertial sensor, and is accompanied by millimetre-accurate maps from a terrestrial LiDAR scanner (TLS). The dataset consists of 24 sequences across 6 sites (~1 hectare each). It can be used for LiDAR and/or visual localisation, reconstruction, and novel-view synthesis using NeRF or Gaussian Splatting.
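As an illustration of how the TLS maps can serve as reconstruction ground truth, the sketch below computes the usual accuracy and completeness metrics between two pre-aligned point clouds. It is a minimal sketch on synthetic arrays; the threshold and cloud layout are assumptions, not the benchmark's actual protocol.

```python
import numpy as np
from scipy.spatial import cKDTree

def accuracy_completeness(reconstruction, ground_truth, threshold=0.1):
    """Accuracy: mean distance (m) from reconstructed points to the TLS cloud.
    Completeness: fraction of TLS points within `threshold` metres of the
    reconstruction. Both clouds are (N, 3) arrays in a common frame."""
    accuracy = cKDTree(ground_truth).query(reconstruction)[0].mean()
    completeness = (cKDTree(reconstruction).query(ground_truth)[0] < threshold).mean()
    return float(accuracy), float(completeness)

# Toy example: a noisy, partial copy of a stand-in ground-truth cloud.
rng = np.random.default_rng(0)
gt = rng.uniform(0, 10, size=(5000, 3))                # stand-in TLS cloud
rec = gt[:4000] + rng.normal(0, 0.02, size=(4000, 3))  # noisy, 80% coverage
print(accuracy_completeness(rec, gt))  # small accuracy error, ~0.8 completeness
```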
You can find it here: The Oxford Spires Dataset
Y. Tao, M. Á. Muñoz-Bañón, L. Zhang, J. Wang, L. Fu, M. Fallon, "The Oxford Spires Dataset: Benchmarking Large-Scale LiDAR-Visual Localisation, Reconstruction and Radiance Field Methods", arXiv preprint, 2024.
RobotCycle Dataset [2024]
We are excited to start recording the RobotCycle dataset. The project aims to assess how cycling infrastructure can affect the behaviour and safety of cyclists.
You can find more information on the project's webpage and in the published paper. You can also read about the data collection policy on our dedicated ethics hub page.
E. Panagiotaki, T. Reinmund, S. Mouton, L. Pitt, A. Shaji Shanthini, W. Tubby, M. Towlson, S. Sze, B. Liu, C. Prahacs, D. De Martini, L. Kunze, "RobotCycle: Assessing Cycling Safety in Urban Environments", IEEE Intelligent Vehicles Symposium (IV), 2024.
Oxford Offroad Radar Dataset [2024]
We release data collected in the rugged Scottish Highlands in extreme weather. The radar data we offer to the community are accompanied by a GPS/INS reference, to further stimulate research in radar place recognition.
We include data from four areas of the Ardverikie Estate. These feature distinct landscapes (and therefore distinct typical radar returns) as well as distinct driving conditions. In total we drove a diverse set of four routes over 11 forays, totalling approximately 154 km of rugged driving.
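To give a flavour of the radar place recognition task the dataset targets, here is a minimal sketch that compares two polar radar scans via a rotation-invariant descriptor. The scan dimensions are hypothetical, not the OORD format.

```python
import numpy as np

def ring_descriptor(scan):
    """Collapse a polar radar scan (azimuths x range bins) into a
    rotation-invariant descriptor by averaging power over azimuth."""
    d = scan.mean(axis=0)
    return d / (np.linalg.norm(d) + 1e-12)

def place_similarity(scan_a, scan_b):
    """Cosine similarity between ring descriptors; higher means the two
    scans are more likely the same place, regardless of heading."""
    return float(ring_descriptor(scan_a) @ ring_descriptor(scan_b))

# Hypothetical scans: 400 azimuths x 3768 range bins of power returns.
query = np.random.rand(400, 3768)
candidate = np.roll(query, 57, axis=0)     # same place, rotated heading
print(place_similarity(query, candidate))  # ~1.0: rotation-invariant
```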
You can find it here: The Oxford Offroad Radar Dataset [OORD]
M. Gadd, D. De Martini, O. Bartlett, P. Murcutt, M. Towlson, M. Widojo, V. Muşat, L. Robinson, E. Panagiotaki, G. Pramatarov, M. A. Kühn, L. Marchegiani, P. Newman, L. Kunze, "OORD: The Oxford Offroad Radar Dataset", arXiv preprint arXiv:2403.02845, 2024.
Radar Doppler Dataset [2023]
The Radar Doppler Dataset extends the work of the Radar RobotCar Dataset with a new radar configuration containing Doppler information. The dataset was collected in collaboration with Oxa and Navtech Radar in and near the Culham Science Centre, Abingdon, and contains, alongside the radar data, LiDAR, camera, and RTK GPS data to allow the evaluation of odometry algorithms.
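As a sketch of what Doppler information enables: with a static scene, every radar return constrains the sensor's 2D velocity through its radial speed, so a single scan yields ego-velocity by least squares. The toy example below illustrates that relationship and is not the dataset's actual processing pipeline.

```python
import numpy as np

def ego_velocity_from_doppler(azimuths, radial_speeds):
    """Recover 2D sensor velocity (vx, vy) from per-return Doppler data.

    For a static world, a return at bearing theta measures
    v_r = vx*cos(theta) + vy*sin(theta); stacking the returns gives an
    overdetermined linear system solved here by least squares.
    """
    A = np.stack([np.cos(azimuths), np.sin(azimuths)], axis=1)
    v, *_ = np.linalg.lstsq(A, radial_speeds, rcond=None)
    return v

# Toy scan: sensor moving at (5.0, 1.0) m/s past static scatterers.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
v_r = 5.0 * np.cos(theta) + 1.0 * np.sin(theta)
print(ego_velocity_from_doppler(theta, v_r))  # ~[5.0, 1.0]
```

In practice, returns from moving objects violate the static-scene assumption and are usually rejected with a robust estimator such as RANSAC.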
You can find it here: Radar Doppler Dataset
F. Rennie, D. Williams, P. Newman, D. De Martini, "Doppler-aware Odometry from FMCW Scanning Radar", arXiv preprint, 2024.
Newer College Dataset [2020]
The Newer College Dataset revisits New College, Oxford with modern sensors and ground truth. The dataset was collected at human walking speed with a compact handheld device equipped with a RealSense D435i stereo camera, an Ouster OS-1 LiDAR, and embedded IMU sensors. A millimetre-accurate point cloud of the college, acquired with a professional laser scanner, is used to generate ground-truth poses at 10 Hz in post-processing, allowing the evaluation of visual-inertial algorithms outdoors at unprecedentedly small scales.
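To show how 10 Hz ground-truth poses are typically used, the sketch below computes absolute trajectory error (ATE) by matching estimated and ground-truth positions on nearest timestamps. The array layouts are assumptions, not the dataset's file format, and the trajectories are assumed already expressed in a common frame.

```python
import numpy as np

def absolute_trajectory_error(est_t, est_xyz, gt_t, gt_xyz):
    """RMSE of translational error between an estimated trajectory and
    ground truth, matched by nearest timestamp.

    est_t/gt_t: (N,)/(M,) sorted timestamps in seconds;
    est_xyz/gt_xyz: (N, 3)/(M, 3) positions in a common frame."""
    idx = np.clip(np.searchsorted(gt_t, est_t), 1, len(gt_t) - 1)
    # Pick the closer of the two neighbouring ground-truth samples.
    left_closer = (est_t - gt_t[idx - 1]) < (gt_t[idx] - est_t)
    idx = np.where(left_closer, idx - 1, idx)
    err = est_xyz - gt_xyz[idx]
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))

# Ground truth at 10 Hz, estimates at 2 Hz, with a constant 2 cm x-offset.
gt_t = np.arange(0.0, 10.0, 0.1)
gt_xyz = np.stack([gt_t, np.zeros_like(gt_t), np.zeros_like(gt_t)], axis=1)
est_t = np.arange(0.0, 10.0, 0.5)
est_xyz = np.stack([est_t + 0.02, np.zeros_like(est_t), np.zeros_like(est_t)], axis=1)
print(absolute_trajectory_error(est_t, est_xyz, gt_t, gt_xyz))  # ~0.02
```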
You can find it here: Newer College Dataset
M. Ramezani, Y. Wang, M. Camurri, D. Wisth, M. Mattamala, M. Fallon, "The Newer College Dataset: Handheld LiDAR, Inertial and Vision with Ground Truth", IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2020.
Radar RobotCar Dataset [2020]
The Oxford Radar RobotCar Dataset is a new dataset for researching scene understanding using millimetre-wave FMCW scanning radar data. The target application is autonomous vehicles, where this modality remains unencumbered by environmental conditions such as fog, rain, snow, or lens flare, which typically challenge other sensor modalities such as vision and LiDAR.
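A scanning FMCW radar returns a polar grid of power readings (azimuth by range bin), which is commonly resampled into a Cartesian bird's-eye-view image before further processing. Below is a minimal sketch of that resampling; the scan dimensions and resolutions are illustrative assumptions, not the dataset's exact parameters.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def polar_to_cartesian(scan, bin_size=0.05, out_size=512, resolution=0.25):
    """Resample a polar radar scan (azimuths x range bins) into a
    top-down Cartesian image with the sensor at the centre.

    bin_size: metres per range bin; resolution: metres per output pixel.
    Both values are illustrative, not the dataset's exact parameters."""
    n_az, _ = scan.shape
    # Pad one row so azimuth interpolation wraps cleanly at the 0/2*pi seam.
    scan = np.concatenate([scan, scan[:1]], axis=0)
    coords = (np.arange(out_size) - out_size / 2 + 0.5) * resolution
    x, y = np.meshgrid(coords, coords)
    rng = np.hypot(x, y) / bin_size                         # range-bin index
    az = np.mod(np.arctan2(y, x), 2 * np.pi) / (2 * np.pi) * n_az
    # Bilinear sampling; pixels beyond the maximum range become zero.
    return map_coordinates(scan, [az, rng], order=1, mode="constant", cval=0.0)

bev = polar_to_cartesian(np.random.rand(400, 3768))  # hypothetical scan
```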
You can find it here: Radar RobotCar Dataset
D. Barnes, M. Gadd, P. Murcutt, P. Newman, and I. Posner, “The Oxford Radar RobotCar Dataset: A Radar Extension to the Oxford RobotCar Dataset”, IEEE International Conference on Robotics and Automation (ICRA), 2020.
ShapeStacks Dataset [2018]
A simulation-based dataset featuring 20,000 stack configurations composed of a variety of elementary geometric primitives, richly annotated with semantics and structural stability. (A collaboration between ORI and Andrea Vedaldi of the Visual Geometry Group, Department of Engineering Science.)
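The stability annotation can be understood with a simple rigid-body argument: a stack is stable only if, at every interface, the centre of mass of everything above lies over the supporting block. A toy 2D version of that check follows; it sketches the idea only and is not the dataset's annotation pipeline.

```python
def stack_is_stable(blocks):
    """blocks: list of (x_centre, width, mass) from bottom to top (2D).

    At each interface, the combined centre of mass of all blocks above
    must lie within the horizontal extent of the supporting block.
    """
    for i in range(len(blocks) - 1):
        above = blocks[i + 1:]
        total_mass = sum(m for _, _, m in above)
        com = sum(x * m for x, _, m in above) / total_mass
        x_support, width, _ = blocks[i]
        if abs(com - x_support) > width / 2:
            return False
    return True

print(stack_is_stable([(0.0, 2.0, 1.0), (0.5, 1.0, 1.0)]))  # True
print(stack_is_stable([(0.0, 2.0, 1.0), (1.5, 1.0, 1.0)]))  # False: overhang
```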
O. Groth, F. Fuchs, I. Posner, A. Vedaldi, "ShapeStacks: Learning Vision-Based Physical Intuition for Generalised Object Stacking", arXiv preprint, 2018.
RobotCar Dataset [2017]
The Oxford RobotCar Dataset contains over 100 repetitions of a consistent route through Oxford, UK, captured over more than a year. The dataset captures many different combinations of weather, traffic, and pedestrians, along with longer-term changes such as construction and roadworks.
You can find it here: http://robotcar-dataset.robots.ox.ac.uk/
W. Maddern, G. Pascoe, C. Linegar, and P. Newman, “1 Year, 1000 km: The Oxford RobotCar Dataset,” International Journal of Robotics Research, vol. 36, iss. 1, pp. 3–15, 2017.
Navigating, Recognising and Describing Urban Spaces With Vision and Laser [2009]
This dataset comprises the data recorded for the paper “Navigating, Recognising and Describing Urban Spaces With Vision and Laser”.
(ftp) Alog File For November 7th 2008 (180M)
Note: Alogs can be parsed by the software available on The New College Vision and Laser Dataset page (see below). If you are prompted for a username and password, use the following:
User: Anonymous
Pass:
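If you prefer to script the download, the fragment below uses Python's standard ftplib with the anonymous credentials above. The host and remote path are placeholders; take the real location from the (ftp) link above.

```python
from ftplib import FTP

# Placeholders: substitute the real host/path from the (ftp) link above.
HOST, REMOTE_PATH = "ftp.example.ac.uk", "/path/to/alog-2008-11-07.alog"

with FTP(HOST) as ftp:
    ftp.login(user="Anonymous", passwd="")  # anonymous login, blank password
    with open("alog-2008-11-07.alog", "wb") as f:
        ftp.retrbinary(f"RETR {REMOTE_PATH}", f.write)
```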
P. Newman, G. Sibley, M. Smith, M. Cummins, A. Harrison, C. Mei, I. Posner, R. Shade, D. Schroeter, L. Murphy, W. Churchill, D. Cole, and I. Reid, “Navigating, Recognising and Describing Urban Spaces With Vision and Laser,” International Journal of Robotics Research, vol. 28, 2009.
New College Vision and Laser Dataset [2009]
M. Smith, I. Baldwin, W. Churchill, R. Paul, and P. Newman, “The New College Vision and Laser Data Set,” International Journal of Robotics Research, vol. 28, iss. 5, pp. 595–599, 2009.
FABMAP Multimedia Extension Dataset [2008]
FABMAP 10k and 100k word vocabularies can be requested here.
M. Cummins and P. Newman, “FAB-MAP: Probabilistic Localization and Mapping in the Space of Appearance,” International Journal of Robotics Research, vol. 27, iss. 6, pp. 647–665, 2008.
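For reference, FAB-MAP poses appearance-only localisation as recursive Bayesian estimation over a set of discrete places: the belief about being at place L_i is updated with each new bag-of-words observation Z_k. In the paper's notation:

```latex
% Recursive Bayes update over places: posterior over location L_i
% given all appearance observations Z^k = {Z_1, ..., Z_k}.
p(L_i \mid \mathcal{Z}^k)
  = \frac{p(Z_k \mid L_i, \mathcal{Z}^{k-1})\, p(L_i \mid \mathcal{Z}^{k-1})}
         {p(Z_k \mid \mathcal{Z}^{k-1})}
```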