Topics

Imagine That! Leveraging Emergent Affordances for Tool Synthesis in Reaching Tasks

In this paper we investigate an artificial agent’s ability to perform task-focused tool synthesis via imagination. Our motivation is to explore the richness of information captured by the latent space of an object-centric generative model, and how to exploit it. In particular, our [...]
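
As a rough illustration of the "imagination" idea, the sketch below optimises a latent code through a frozen decoder so that the decoded output scores well under a task objective. It is only a toy Python sketch: decoder, task_score, the latent size and the optimisation settings are placeholders assumed for illustration, not the model from the paper.

```python
# Toy sketch (not the paper's implementation): optimise the latent code of a
# pre-trained generative decoder so that the decoded "imagined" object scores
# well under a task objective. `decoder` and `task_score` are placeholders.
import torch

latent_dim = 16
decoder = torch.nn.Sequential(            # stand-in for a trained generative decoder
    torch.nn.Linear(latent_dim, 64), torch.nn.ReLU(), torch.nn.Linear(64, 64 * 64))

def task_score(image_flat):
    # Placeholder for a differentiable proxy of task success (e.g. reachability).
    return -image_flat.pow(2).mean()

z = torch.randn(1, latent_dim, requires_grad=True)   # initial "imagined" object
opt = torch.optim.Adam([z], lr=1e-2)
for step in range(200):
    opt.zero_grad()
    loss = -task_score(decoder(z))        # gradient ascent on the task score
    loss.backward()
    opt.step()
```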

Permutation Invariance and Relational Reasoning in Multi-Object Tracking

Relational reasoning is the ability to model interactions and relations between objects. A machine learning model performing relational reasoning often has access to a list of object representations. The ordering of the list could carry information for the task at hand, but that is not necessarily the [...]
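
The sketch below shows what permutation invariance means in practice for a relational module: a shared MLP scores every ordered pair of object representations and the results are sum-pooled, so shuffling the input list leaves the output unchanged. The RelationalPool class and its sizes are illustrative assumptions, not the tracking model studied in the paper.

```python
# Toy permutation-invariant relational module over a set of object embeddings.
import torch
import torch.nn as nn

class RelationalPool(nn.Module):
    def __init__(self, obj_dim, hidden=64):
        super().__init__()
        self.pair_mlp = nn.Sequential(nn.Linear(2 * obj_dim, hidden), nn.ReLU(),
                                      nn.Linear(hidden, hidden))

    def forward(self, objs):                        # objs: (N, obj_dim)
        n = objs.shape[0]
        a = objs.unsqueeze(0).expand(n, n, -1)      # all ordered pairs (i, j)
        b = objs.unsqueeze(1).expand(n, n, -1)
        pair = torch.cat([a, b], dim=-1)
        return self.pair_mlp(pair).sum(dim=(0, 1))  # sum-pool -> order does not matter

objs = torch.randn(5, 8)
model = RelationalPool(8)
out1 = model(objs)
out2 = model(objs[torch.randperm(5)])               # same set, shuffled order
assert torch.allclose(out1, out2, atol=1e-4)
```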

VILENS

VILENS (Visual Inertial Legged Navigation System) is a factor-graph-based odometry algorithm for legged robots that fuses leg odometry, vision, and IMU data. This algorithm was designed by David Wisth, Marco Camurri, and Maurice Fallon at the Oxford Robotics Institute (ORI). The papers describing this work are listed below. [...]
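
As a deliberately simplified, one-dimensional illustration of the factor-graph idea behind VILENS, the sketch below fuses relative-motion "factors" from several odometry sources, each weighted by its noise, into a single weighted least-squares problem. All numbers and noise values are made up; the real system estimates full 6-DoF poses and uses preintegrated IMU factors.

```python
# 1-D toy illustration of factor-graph style sensor fusion (not VILENS itself).
import numpy as np

num_poses = 4
# (from_idx, to_idx, measured displacement, sigma) -- toy values for illustration
factors = [
    (0, 1, 1.00, 0.10), (1, 2, 0.95, 0.10), (2, 3, 1.05, 0.10),   # leg odometry
    (0, 1, 1.02, 0.05), (1, 2, 1.01, 0.05), (2, 3, 0.99, 0.05),   # visual odometry
    (0, 1, 0.98, 0.20), (1, 2, 1.10, 0.20), (2, 3, 1.00, 0.20),   # IMU integration
]

A = np.zeros((len(factors) + 1, num_poses))
b = np.zeros(len(factors) + 1)
A[0, 0], b[0] = 1.0, 0.0                       # prior anchoring the first pose at 0
for row, (i, j, z, sigma) in enumerate(factors, start=1):
    w = 1.0 / sigma                            # weight each factor by its confidence
    A[row, i], A[row, j], b[row] = -w, w, w * z

poses, *_ = np.linalg.lstsq(A, b, rcond=None)  # maximum-likelihood trajectory
print(poses)
```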

GENESIS: Generative Scene Inference and Sampling with Object-Centric Latent Representations

Generative latent-variable models are emerging as promising tools in robotics and reinforcement learning. Yet, even though tasks in these domains typically involve distinct objects, most state-of-the-art generative models do not explicitly capture the compositional nature of visual scenes. Two recent exceptions, MONet and IODINE, [...]
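
The sketch below illustrates the compositional mechanism that object-centric models such as GENESIS build on: each object slot decodes an appearance and a mask, the masks are normalised across slots, and the scene is their pixel-wise mixture. The toy linear decoders and sizes are assumptions for illustration only; they are not the GENESIS architecture.

```python
# Toy scene composition from per-slot latents (illustrative decoders only).
import torch
import torch.nn as nn

K, latent_dim, H, W = 4, 8, 16, 16                  # slots, latent size, image size

appearance_dec = nn.Linear(latent_dim, 3 * H * W)   # stand-in appearance decoder
mask_dec = nn.Linear(latent_dim, H * W)             # stand-in mask decoder

z = torch.randn(K, latent_dim)                      # one latent per object slot
rgb = appearance_dec(z).view(K, 3, H, W)
mask_logits = mask_dec(z).view(K, 1, H, W)
masks = torch.softmax(mask_logits, dim=0)           # slots compete for each pixel

scene = (masks * rgb).sum(dim=0)                    # (3, H, W) composed image
print(scene.shape, masks.sum(dim=0).mean())         # masks sum to one at every pixel
```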

Learning Robust, Distraction-Free Radar Odometry from Pose Information

Masking by Moving: Learning Robust, Distraction-Free Radar Odometry from Pose Information. Abstract - This paper presents an end-to-end radar odometry system which delivers robust, real-time pose estimates based on a learned embedding space free of sensing artefacts and distractor objects. The system deploys a fully differentiable, correlation-based radar matching approach. This provides the same level of [...]
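
To illustrate the correlation-based matching component, the toy sketch below computes a dense cross-correlation between two embedding images with FFTs and reads off the translation at the peak. In the actual system the embedding is learned, the matching is fully differentiable, and rotation is handled as well; correlation_offset and the array sizes here are assumptions for illustration.

```python
# Toy FFT cross-correlation between two embedding images to recover a translation.
import numpy as np

def correlation_offset(emb_a, emb_b):
    """Return the (dy, dx) shift of emb_b relative to emb_a at the correlation peak."""
    corr = np.fft.ifft2(np.fft.fft2(emb_b) * np.conj(np.fft.fft2(emb_a))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = emb_a.shape
    # Wrap indices into the range [-size/2, size/2) so shifts are signed.
    return ((dy + h // 2) % h - h // 2, (dx + w // 2) % w - w // 2)

emb_a = np.random.rand(64, 64)
emb_b = np.roll(emb_a, shift=(3, -5), axis=(0, 1))  # embedding shifted by a known offset
print(correlation_offset(emb_a, emb_b))             # recovers (3, -5)
```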

The Oxford Radar RobotCar Dataset

The Oxford Radar RobotCar Dataset: A Radar Extension to the Oxford RobotCar Dataset. Abstract - In this paper we present The Oxford Radar RobotCar Dataset, a new dataset for researching scene understanding using Millimetre-Wave FMCW scanning radar data. The target application is autonomous vehicles, where this modality remains unencumbered by environmental conditions such as fog, [...]
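
For readers new to scanning radar, the sketch below shows one way to resample a polar scan (azimuths by range bins) onto a Cartesian grid by nearest-neighbour lookup. The function name, bin sizes and scan shape are illustrative assumptions; please refer to the dataset documentation and SDK for the actual raw format and sensor conventions.

```python
# Toy polar-to-Cartesian resampling of an FMCW radar scan by nearest-neighbour lookup.
# Sensor orientation / azimuth-zero conventions are ignored here for simplicity.
import numpy as np

def polar_to_cartesian(polar, range_res_m, out_size=256, out_res_m=0.5):
    n_az, n_bins = polar.shape
    coords = (np.arange(out_size) - out_size / 2) * out_res_m
    xx, yy = np.meshgrid(coords, coords)
    rng = np.sqrt(xx ** 2 + yy ** 2)                    # metres from the sensor
    az = np.mod(np.arctan2(yy, xx), 2 * np.pi)          # azimuth angle in [0, 2*pi)
    az_idx = np.round(az / (2 * np.pi) * n_az).astype(int) % n_az
    rng_idx = np.clip(np.round(rng / range_res_m).astype(int), 0, n_bins - 1)
    return polar[az_idx, rng_idx]

scan = np.random.rand(400, 3768).astype(np.float32)     # illustrative azimuth/bin counts
cart = polar_to_cartesian(scan, range_res_m=0.0438)     # illustrative range resolution
print(cart.shape)
```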

Scrutinizing and De-Biasing Intuitive Physics with Neural Stethoscopes

Predicting the stability of block towers is a popular task in the domain of intuitive physics. Previous work in this area focused on prediction accuracy, a one-dimensional performance measure. We provide a broader analysis of the learned physical understanding of the final model and how the learning [...]
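
The sketch below shows the basic stethoscope mechanism: a small probe network is attached to intermediate features, and the gradient flowing from the probe back into those features is scaled by a factor lambda, so that lambda > 0 promotes an auxiliary cue, lambda < 0 suppresses (de-biases) it, and lambda = 0 yields a purely analytic probe. The tiny networks and data are placeholders, not the models used in the paper.

```python
# Toy "stethoscope": a probe on intermediate features with a scaled backward gradient.
import torch
import torch.nn as nn

class GradScale(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)                     # identity in the forward pass

    @staticmethod
    def backward(ctx, grad_out):
        return ctx.lam * grad_out, None         # scale the gradient into the trunk

features = nn.Sequential(nn.Linear(32, 64), nn.ReLU())   # main network trunk
main_head = nn.Linear(64, 2)                              # e.g. stable / unstable
stethoscope = nn.Linear(64, 2)                            # probe for an auxiliary cue

x = torch.randn(8, 32)
y_main = torch.randint(0, 2, (8,))
y_aux = torch.randint(0, 2, (8,))

h = features(x)
loss_main = nn.functional.cross_entropy(main_head(h), y_main)
loss_aux = nn.functional.cross_entropy(stethoscope(GradScale.apply(h, -0.1)), y_aux)
(loss_main + loss_aux).backward()                         # lambda = -0.1 de-biases
```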

Deep Inverse Sensor Modelling in Radar

In the last decade, systems utilising cameras and lasers have been remarkably successful, raising our expectations for what robotics might achieve in the decade to come. Our robots now need to see further, operating not only in environments where humans can operate, but also in environments where humans cannot! To this end, radar is a [...]
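
As a rough sketch of what a learned inverse sensor model looks like, the toy example below maps a polar radar scan to per-cell occupancy probabilities with a small convolutional network trained by binary cross-entropy against occupancy labels (for example, derived from lidar). The architecture, shapes and random data are assumptions for illustration, not the network from the paper.

```python
# Toy learned inverse sensor model: radar scan in, per-cell occupancy probability out.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 1),                                  # one occupancy logit per cell
)

radar = torch.rand(4, 1, 64, 128)                         # batch of toy polar scans
labels = torch.randint(0, 2, (4, 1, 64, 128)).float()     # toy occupancy supervision

logits = model(radar)
loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
loss.backward()
occupancy = torch.sigmoid(logits)                         # probabilities in [0, 1]
```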

On the Limitations of Representing Functions on Sets

Our recent work on analysing a set of permutation-invariant neural network architectures is probably at the theoretical end of the spectrum of work we do at the A2I lab. Nevertheless, it is equally exciting, as it has concrete implications for real-world robotics, such as working with point clouds from lidars. [...]
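
The family of architectures analysed in this work is the sum-decomposable (Deep Sets style) form: each set element is embedded by a shared network phi, the embeddings are summed, and a second network rho maps the pooled vector to the output; the analysis concerns how the pooled (latent) dimension limits which set functions can be represented. The sketch below is a generic illustration of that form with placeholder networks, not the paper's construction.

```python
# Generic sum-decomposable (Deep Sets style) set function with placeholder networks.
import torch
import torch.nn as nn

class SumDecomposition(nn.Module):
    def __init__(self, in_dim, latent_dim, out_dim):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(in_dim, latent_dim), nn.ReLU(),
                                 nn.Linear(latent_dim, latent_dim))
        self.rho = nn.Sequential(nn.Linear(latent_dim, latent_dim), nn.ReLU(),
                                 nn.Linear(latent_dim, out_dim))

    def forward(self, points):                  # points: (N, in_dim), e.g. a point cloud
        return self.rho(self.phi(points).sum(dim=0))

cloud = torch.randn(100, 3)                     # e.g. lidar points
model = SumDecomposition(3, latent_dim=64, out_dim=1)
print(model(cloud))
```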
