Event-based localisation for ultra-fast systems

One of the most significant challenges for robots and autonomous vehicles is the ability to self-localise. Closed-loop localisation and control of ultra-fast systems such as jet aircraft and low-flying drones require extremely low latency, which could be achieved by combining event cameras with spiking neural networks on neuromorphic hardware, in particular Intel's Loihi. You will implement novel spatiotemporal filtering algorithms that extract meaningful features from the event stream. If you are interested in hardware prototype development, you could also design a high-speed test rig to evaluate the proposed algorithms. We expect this project to lead to advances in visual odometry (path integration), change detection (detecting which path is taken, with microsecond latency) and place recognition (detecting where you are in the world). The project will provide significant benefits to industry and defence by delivering vision-only localisation techniques suitable for fast camera motion.
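To give a flavour of the spatiotemporal filtering involved, below is a minimal sketch of one common event-camera denoising technique, a background-activity (spatiotemporal correlation) filter: an event is kept only if a neighbouring pixel fired recently. The function name, event format, and the `dt_us` threshold are illustrative assumptions for this sketch, not the project's actual algorithms.

```python
import numpy as np

def spatiotemporal_filter(events, width, height, dt_us=10_000):
    """Background-activity filter: keep an event only if a pixel in its
    3x3 neighbourhood (including itself) fired within the last dt_us
    microseconds. `events` is a time-sorted list of (x, y, t) tuples,
    with t in microseconds."""
    last_ts = np.full((height, width), -np.inf)  # last event time per pixel
    kept = []
    for x, y, t in events:
        x0, x1 = max(x - 1, 0), min(x + 2, width)
        y0, y1 = max(y - 1, 0), min(y + 2, height)
        # Did any pixel in the neighbourhood fire recently?
        if np.any(t - last_ts[y0:y1, x0:x1] <= dt_us):
            kept.append((x, y, t))
        last_ts[y, x] = t
    return kept

# A correlated burst near (5, 5) survives; an isolated event at (40, 40)
# and the very first event (no prior activity anywhere) are rejected.
events = [(5, 5, 0), (5, 6, 1_000), (5, 5, 2_000), (40, 40, 3_000)]
filtered = spatiotemporal_filter(events, width=64, height=64)
```

Filters of this kind run in constant time per event, which is what makes microsecond-latency pipelines plausible; a spiking-network variant would replace the timestamp array with leaky neuron state on hardware such as Loihi.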

Chief Investigators