The QUT Centre for Robotics has made significant progress towards robust and reliable algorithms that localise an autonomous agent (such as a robot, autonomous vehicle, or augmented-reality device) in pre-mapped environments. This project will develop the first fully neuromorphic pipeline for robot localisation. A dynamic vision sensor (event camera) will capture changes in the robot’s environment; these sensors output asynchronous spikes that can be processed directly by spiking neural networks. Through collaboration within Intel’s Neuromorphic Research Community, the PhD student will have the opportunity to develop massively parallel algorithms that run on Intel’s highly energy-efficient, low-latency Loihi chip.
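To give a flavour of the event-driven paradigm, the sketch below feeds a stream of event timestamps into a single leaky integrate-and-fire (LIF) neuron, the basic unit of most spiking neural networks. The event format, function name, and all parameter values are illustrative assumptions, not part of the project specification; a real pipeline would process per-pixel event streams on neuromorphic hardware.

```python
import math

# Illustrative sketch only: a single LIF neuron driven by input events.
# All parameters (tau, threshold, weight) are arbitrary example values.

def lif_spikes(event_times, tau=0.02, threshold=1.0, weight=0.4):
    """Feed a sorted list of event timestamps (seconds) into one LIF
    neuron and return the times at which it emits output spikes."""
    v, last_t, out = 0.0, 0.0, []
    for t in event_times:
        # The membrane potential decays exponentially between events,
        # so computation happens only when an event actually arrives.
        v *= math.exp(-(t - last_t) / tau)
        v += weight          # each input event injects a fixed charge
        last_t = t
        if v >= threshold:   # threshold crossing -> output spike
            out.append(t)
            v = 0.0          # reset after spiking
    return out
```

A dense burst of events (e.g. spikes 2 ms apart) drives the neuron over threshold, while the same number of widely spaced events decays away without producing any output, which is what makes this style of processing both sparse and low-latency.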
Key innovations in this PhD could include:
- New feature extraction and matching algorithms for event-camera data, including attention and sensor-fusion mechanisms.
- The first place recognition pipeline that combines neuromorphic sensing, neuromorphic processing, and robot control.
- Novel algorithms for ultra-low-latency place recognition, which could be used in high-speed scenarios.
- An active perception component that selects the robot control actions that maximise the expected information gain for place recognition.
- A computational model of biological neural networks to test theories of animal localisation and navigation.
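The active perception idea above can be made concrete as expected information gain over a discrete belief about the robot's place. The sketch below is a minimal assumption-laden illustration: place beliefs, observation likelihoods, and function names are all hypothetical, and a real system would evaluate this for each candidate control action and pick the best.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Illustrative sketch: expected information gain of one action.
# belief[i]        : prior probability that the robot is at place i.
# likelihoods[z][i]: P(observation z | place i) under that action.
def expected_information_gain(belief, likelihoods):
    h_prior = entropy(belief)
    gain = 0.0
    for lik in likelihoods:
        # Marginal probability of seeing observation z under the prior.
        p_z = sum(l * b for l, b in zip(lik, belief))
        if p_z == 0:
            continue
        # Bayesian posterior over places given observation z.
        post = [l * b / p_z for l, b in zip(lik, belief)]
        gain += p_z * (h_prior - entropy(post))
    return gain
```

With a uniform belief over two places, a perfectly discriminative observation model yields a gain of one bit, while an observation model that cannot tell the places apart yields zero, so an active perception controller would steer the robot towards the former kind of viewpoint.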