Project dates: 09/01/2019 - 09/01/2024
State-of-the-art Autonomous Vehicles (AVs) are trained for specific, well-structured environments and generally fail to operate in unstructured or novel settings. This project aims to develop next-generation AVs capable of learning and adapting on the fly to environmental novelty. These systems will need to be orders of magnitude more energy efficient than current systems and able to pursue complex goals in highly dynamic, even adversarial, environments.
Biological organisms exhibit the capabilities envisioned for next-generation AVs. From insects to birds, rodents and humans, one observes the fusion of multiple sensing modalities, spatial awareness and spatial memory, all functioning together as a suite of perceptual capabilities that enable navigation in unstructured and complex environments. With this motivation, the project will leverage deep neurophysiological insights from the living world to develop new neuroscience-inspired methods for advanced, next-generation perception and navigation in AVs.
The project will run for three years, extendable to five, and is worth approximately $15M over the full five-year period.
QUT’s Involvement and Research Areas
QUT’s involvement in this project is focused on biologically-inspired place perception and place recognition, both in terms of bio-inspired sensing (including dynamic vision or “event” cameras) and bio-inspired navigation, mapping and machine learning.
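To make the place-recognition idea concrete, below is a minimal, illustrative sketch of visual place recognition by global descriptor matching. It is not the project's method: the publications listed below (e.g. Patch-NetVLAD) use learned descriptors, whereas this toy uses patch-normalised, downsampled images in the spirit of classic VPR baselines, and all function names and parameters here are hypothetical.

```python
# Toy visual place recognition (VPR): describe each reference image with a
# compact global descriptor, then localise a query by cosine similarity.
import numpy as np

def describe(image, size=(32, 32)):
    """Turn a grayscale image (2-D array) into a unit-norm descriptor."""
    h, w = image.shape
    ys = np.linspace(0, h - 1, size[0]).astype(int)
    xs = np.linspace(0, w - 1, size[1]).astype(int)
    thumb = image[np.ix_(ys, xs)].astype(np.float32).ravel()  # downsample
    thumb -= thumb.mean()                   # discount global brightness
    return thumb / (np.linalg.norm(thumb) + 1e-8)

def localise(query_desc, reference_db):
    """Return (best_index, similarity) of a query descriptor against a
    list of unit-norm reference descriptors."""
    sims = np.array([d @ query_desc for d in reference_db])
    best = int(np.argmax(sims))
    return best, float(sims[best])

# Usage with random stand-in "images":
rng = np.random.default_rng(0)
reference_imgs = [rng.random((120, 160)) for _ in range(10)]
db = [describe(im) for im in reference_imgs]
noisy_revisit = reference_imgs[3] + 0.05 * rng.random((120, 160))
idx, sim = localise(describe(noisy_revisit), db)
print(f"matched place {idx} with similarity {sim:.3f}")  # expect place 3
```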
Selected Publications
- S. Hausler, S. Garg, M. Xu, M. Milford, T. Fischer, “Patch-NetVLAD: Multi-Scale Fusion of Locally-Global Descriptors for Place Recognition”, in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2021.
- T. L. Molloy, T. Fischer, M. Milford, G. N. Nair, “Intelligent Reference Curation for Visual Place Recognition via Bayesian Selective Fusion”, in IEEE Robotics and Automation Letters, 6 (2), 2021.
- D. Dall’Osto, T. Fischer, M. Milford, “Fast and Robust Bio-inspired Teach and Repeat Navigation”, in IEEE/RSJ International Conference on Intelligent Robots and Systems, 2021.
- T. Fischer, W. Vollprecht, S. Traversaro, S. Yen, C. Herrero, M. Milford, “RoboStack: Using the Robot Operating System alongside the Conda and Jupyter Data Science Ecosystems”, conditionally accepted to IEEE Robotics and Automation Magazine, 2021.
- S. Garg, T. Fischer, M. Milford, “Where is your place, Visual Place Recognition?”, in International Joint Conference on Artificial Intelligence, 2021.
- M. Xu, T. Fischer, N. Sünderhauf, M. Milford, “Probabilistic Appearance-Invariant Topometric Localization with New Place Awareness”, in IEEE Robotics and Automation Letters, 2021.
- T. Fischer, M. Milford, “Event-based Visual Place Recognition with Ensembles of Temporal Windows”, in IEEE Robotics and Automation Letters, 5 (4), 2020.
- S. Hausler, Z. Chen, M. E. Hasselmo, M. Milford, “Bio-inspired Multi-scale Fusion”, in Biological Cybernetics, 114, pp. 209–229, 2020.
- M. Chancán, L. Hernandez-Nunez, A. Narendra, A. B. Barron, M. Milford, “A Hybrid Compact Neural Architecture for Visual Place Recognition”, in IEEE Robotics and Automation Letters, 5 (2), pp. 993–1000, 2020.
Partners
- Defence Science and Technology
- Boston University
- MIT
- University of Melbourne
- Macquarie University
- UNSW
- US Office of Naval Research