Julie Dequaire

Email: julie_at_robots.ox.ac.uk

Julie joined the ORI in July 2014 as a DPhil student and is a member of Pembroke College.

Her main area of research is state estimation and scene understanding for autonomous driving using recurrent neural networks. Her current focus is context-based tracking: understanding the dynamics of a scene given the environment the robot is in, in order to help the planner produce feasible and optimal paths through that environment.
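
By way of illustration, the sketch below shows the general shape of a recurrent occupancy-grid model of the kind used in the deep-tracking line of work listed under Publications: a network consumes a sequence of partially observed 2D occupancy grids and predicts the full, unoccluded scene at each step. This is a minimal sketch in PyTorch; all class names, layer sizes, and shapes here are illustrative assumptions, not the published architecture.

    import torch
    import torch.nn as nn

    class DeepTrackingSketch(nn.Module):
        """Maps a sequence of partially observed 2D occupancy grids to a
        prediction of the full (unoccluded) scene at each time step."""

        def __init__(self, hidden_channels=16):
            super().__init__()
            # Encoder: one input channel holding occupancy probabilities.
            self.encode = nn.Conv2d(1, hidden_channels, kernel_size=3, padding=1)
            # Recurrent update, approximated here with a plain convolution over
            # the concatenated features and previous hidden state.
            self.update = nn.Conv2d(2 * hidden_channels, hidden_channels,
                                    kernel_size=3, padding=1)
            # Decoder: per-cell occupancy logits for the unoccluded scene.
            self.decode = nn.Conv2d(hidden_channels, 1, kernel_size=3, padding=1)

        def forward(self, grids):
            # grids: (time, batch, 1, H, W) partial occupancy observations.
            t, b, _, h, w = grids.shape
            state = grids.new_zeros(b, self.update.out_channels, h, w)
            outputs = []
            for step in range(t):
                features = torch.relu(self.encode(grids[step]))
                state = torch.tanh(self.update(torch.cat([features, state], dim=1)))
                outputs.append(torch.sigmoid(self.decode(state)))
            return torch.stack(outputs)

    # Example: 10 frames of 64x64 grids for a batch of 2 sequences.
    model = DeepTrackingSketch()
    prediction = model(torch.rand(10, 2, 1, 64, 64))
    print(prediction.shape)  # torch.Size([10, 2, 1, 64, 64])

In the published work the recurrence is a gated (GRU-style) unit trained end-to-end against future observations; the plain convolutional update above is only a stand-in to keep the sketch short.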

Her interests lie in assistive robotics, where machine learning and medicine combine to aid patients with disabilities (smart exoskeletons and prosthetics, and interfacing with the brain for stroke rehabilitation).

Bio

Julie holds a Master’s degree in Aerospace Engineering from ISAE-SupAero (Toulouse, France), and previously worked at NASA Ames Research Center (California, USA) studying the climate of Mars.

In her free time, she enjoys playing the piano, engaging in team sports, and exploring the air, the land and the seas.

Publications

2017

  • [PDF] J. Dequaire, P. Ondrúška, D. Rao, D. Wang, and I. Posner, “Deep tracking in the wild: End-to-end tracking using recurrent neural networks,” The International Journal of Robotics Research, 2017.
    [Bibtex]

    @article{DequaireIJRR2017,
    author = {Dequaire, Julie and Ondr{\'u}{\v{s}}ka, Peter and Rao, Dushyant and Wang, Dominic and Posner, Ingmar},
    title = {Deep tracking in the wild: End-to-end tracking using recurrent neural networks},
    year = {2017},
    url = {http://journals.sagepub.com/doi/abs/10.1177/0278364917710543},
    journal = {The International Journal of Robotics Research},
    publisher = {SAGE Publications Sage UK: London, England},
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2017_IJRR_Dequaire.pdf}
    }

  • [PDF] B. Yeomans, H. Porav, M. Gadd, D. Barnes, J. Dequaire, T. Wilcox, S. Kyberd, S. Venn, and P. Newman, “MURFI 2016 – From Cars to Mars: Applying Autonomous Vehicle Navigation Methods To a Space Rover Mission,” in Proceedings of the 14th Symposium on Advanced Space Technologies in Robotics and Automation (ASTRA), Leiden, Netherlands, 2017.
    [Bibtex]

    @inproceedings{BrianYeomansASTRA2017,
    author = {Yeomans, Brian and Porav, Horia and Gadd, Matthew and Barnes, Dan and Dequaire, Julie and Wilcox, Tom and Kyberd, Stephen and Venn, Simon and Newman, Paul},
    title = {MURFI 2016 - From Cars to Mars: Applying Autonomous Vehicle Navigation Methods To a Space Rover Mission},
    booktitle = {Proceedings of the 14th Symposium on Advanced Space Technologies in Robotics and Automation (ASTRA)},
    address = {Leiden, Netherlands},
    year = {2017},
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2017ASTRA_yeomans.pdf}
    }

2016

  • [PDF] J. Dequaire, D. Rao, P. Ondruska, D. Zeng Wang, and I. Posner, “Deep Tracking on the Move: Learning to Track the World from a Moving Vehicle using Recurrent Neural Networks,” ArXiv e-prints, 2016.
    [Bibtex]

    @article{DequaireArXivSeptember2016,
    author = {Dequaire, J. and Rao, D. and Ondruska, P. and Zeng Wang, D. and Posner, I.},
    title = {Deep Tracking on the Move: Learning to Track the World from a Moving Vehicle using Recurrent Neural Networks},
    journal = {ArXiv e-prints},
    archivePrefix = {arXiv},
    eprint = {1609.09365},
    primaryClass = {cs.CV},
    keywords = {Computer Science - Computer Vision and Pattern Recognition, Computer Science - Artificial Intelligence, Computer Science - Learning, Computer Science - Robotics},
    year = {2016},
    month = sep,
    pdf = {https://arxiv.org/abs/1609.09365}
    }

  • [PDF] P. Ondruska, J. Dequaire, D. Zeng Wang, and I. Posner, “End-to-End Tracking and Semantic Segmentation Using Recurrent Neural Networks,” in Robotics: Science and Systems, Workshop on Limits and Potentials of Deep Learning in Robotics, 2016.
    Best Workshop Paper [Bibtex]

    @inproceedings{OndruskaRSS2016,
    author = {Ondruska, Peter and Dequaire, Julie and Zeng Wang, Dominic and Posner, Ingmar},
    title = {End-to-End Tracking and Semantic Segmentation Using Recurrent Neural Networks},
    booktitle = {Robotics: Science and Systems, Workshop on Limits and Potentials of Deep Learning in Robotics},
    year = {2016},
    month = jun,
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2016RSS_ondruska.pdf},
    award = {Best Workshop Paper},
    awardlink = {http://juxi.net/workshop/deep-learning-rss-2016/#papers}
    }

  • [PDF] P. Ondruska, J. Dequaire, D. Zeng Wang, and I. Posner, “End-to-End Tracking and Semantic Segmentation Using Recurrent Neural Networks,” ArXiv e-prints, 2016.
    [Bibtex]

    @article{OndruskaArXivApril2016,
    author = {Ondruska, Peter and Dequaire, Julie and Zeng Wang, Dominic and Posner, Ingmar},
    title = {End-to-End Tracking and Semantic Segmentation Using Recurrent Neural Networks},
    journal = {ArXiv e-prints},
    archivePrefix = {arXiv},
    eprint = {1604.05091},
    primaryClass = {cs.LG},
    keywords = {Computer Science - Learning, Computer Science - Artificial Intelligence, Computer Science - Computer Vision and Pattern Recognition, Computer Science - Neural and Evolutionary Computing, Computer Science - Robotics},
    year = {2016},
    month = apr,
    adsurl = {http://adsabs.harvard.edu/abs/2016arXiv160405091O},
    adsnote = {Provided by the SAO/NASA Astrophysics Data System},
    pdf = {http://arxiv.org/abs/1604.05091}
    }

  • [PDF] J. Dequaire, C. H. Tong, W. Churchill, and I. Posner, “Off the Beaten Track: Predicting Localisation Performance in Visual Teach and Repeat,” in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 2016.
    [Bibtex]

    @inproceedings{DequaireICRA2016,
    address = {Stockholm, Sweden},
    author = {Dequaire, Julie and Tong, Chi Hay and Churchill, Winston and Posner, Ingmar},
    booktitle = {Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
    month = may,
    pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/2016ICRA_dequaire.pdf},
    title = {Off the Beaten Track: Predicting Localisation Performance in Visual Teach and Repeat},
    year = {2016}
    }
