The Path Less Taken: A Fast Variational Approach for Scene Segmentation Used for Closed Loop Control

Abstract – In this paper we propose an on-line system that discovers and drives along collision-free traversable paths, using a variational approach to dense stereo vision. Our system is lightweight, runs on low-cost hardware and predicts scene semantics remarkably quickly. In addition to the scene's path affordance, it yields a segmentation of the local scene as a composite of distinctive labels, e.g. ground, sky, obstacles and vegetation. To estimate the labels, we use a very fast and lightweight (shallow) image classifier that considers informative feature channels derived from colour images and dense depth map estimates. Unlike other approaches, we do not use local descriptors around pixel features. Instead, we combine the predicted label probabilities with a variational approach to image segmentation. Akin to dense depth map estimation, we obtain semantically segmented images by means of convex regularisation. We show how our system rapidly obtains the required semantics and paths at VGA resolution. Extensive experiments on the KITTI dataset support the robustness of our system in deriving collision-free local routes. An accompanying video demonstrates the robustness of the system during live execution in an outdoor experiment.
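The paper's own solver is not reproduced here; the sketch below only illustrates the general idea the abstract refers to: per-pixel class probabilities (such as those from a shallow classifier) are smoothed by convex total-variation regularisation, solved with a standard primal-dual scheme for a relaxed multi-label (Potts) model. All function names, step sizes and parameters are our own illustrative choices, not the authors' implementation.

```python
import numpy as np

def grad(u):
    """Forward-difference gradient with Neumann boundary; u: (L, H, W) -> (L, H, W, 2)."""
    g = np.zeros(u.shape + (2,))
    g[:, :-1, :, 0] = u[:, 1:, :] - u[:, :-1, :]   # vertical differences
    g[:, :, :-1, 1] = u[:, :, 1:] - u[:, :, :-1]   # horizontal differences
    return g

def div(p):
    """Divergence, the negative adjoint of grad; p: (L, H, W, 2) -> (L, H, W)."""
    d = np.zeros(p.shape[:-1])
    d[:, :-1, :] += p[:, :-1, :, 0]
    d[:, 1:, :]  -= p[:, :-1, :, 0]
    d[:, :, :-1] += p[:, :, :-1, 1]
    d[:, :, 1:]  -= p[:, :, :-1, 1]
    return d

def project_simplex(v):
    """Project each column of v (shape (L, N)) onto the probability simplex."""
    L = v.shape[0]
    s = np.sort(v, axis=0)[::-1]                   # sort descending per pixel
    css = np.cumsum(s, axis=0) - 1.0
    ks = np.arange(1, L + 1).reshape(-1, 1)
    k = np.sum(s - css / ks > 0, axis=0)           # size of the support
    thresh = css[k - 1, np.arange(v.shape[1])] / k
    return np.maximum(v - thresh, 0.0)

def segment(prob, lam=1.0, iters=300):
    """Minimise sum_l <u_l, f_l> + lam * TV(u_l) over the pixelwise simplex."""
    f = -np.log(np.clip(prob, 1e-8, 1.0))          # unary costs from probabilities
    L, H, W = f.shape
    u = prob.copy()
    ubar = u.copy()
    p = np.zeros((L, H, W, 2))                     # dual variable per label
    tau, sigma = 0.25, 0.5                         # steps with sigma*tau*||grad||^2 <= 1
    for _ in range(iters):
        p += sigma * grad(ubar)                    # dual ascent on the TV term
        scale = np.maximum(1.0, np.linalg.norm(p, axis=-1, keepdims=True) / lam)
        p /= scale                                 # project onto the lam-ball
        u_old = u
        u = u + tau * div(p) - tau * f             # primal descent on the unary term
        u = project_simplex(u.reshape(L, -1)).reshape(L, H, W)
        ubar = 2.0 * u - u_old                     # over-relaxation step
    return u.argmax(axis=0)                        # harden the relaxed labelling
```

With noisy per-pixel probabilities as input, the regulariser removes isolated mislabelled pixels while preserving genuine region boundaries, which is the role convex regularisation plays in the segmentation described above.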

Further Info – For more experimental details, please read our paper:

  • [PDF] T. Suleymanov, L. M. Paz, P. Piniés, G. Hester, and P. Newman, “The Path Less Taken: A Fast Variational Approach for Scene Segmentation Used for Closed Loop Control,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, South Korea, 2016.
    [Bibtex]
    @InProceedings{SuleymanovIROS2016,
    author = {Suleymanov, Tarlan and Paz, Lina Maria and Pini{\'e}s, Pedro and Hester, Geoff and Newman, Paul},
    title = {{The Path Less Taken: A Fast Variational Approach for Scene Segmentation Used for Closed Loop Control}},
    booktitle = {{IEEE/RSJ} International Conference on Intelligent Robots and Systems (IROS)},
    year = {2016},
    address = {Daejeon, South Korea},
    month = {October},
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/IROS2016Tarlan_PathFollowing.pdf},
    }

For a quick overview, take a look at our video: