Project dates: 01/06/2015 - 31/12/2019
The main objective of any roboticist is to build robots that can act in the world with precision and repeatability, whether that means recognising the environment or moving around in it. One of the main trade-offs is the cost of the sensors involved: the sensors used in autonomous cars can cost tens of thousands of dollars, with the cost of processing their data on top of that. So we can either drive down sensor costs through bulk purchasing, or build better algorithms. Humans have about 21 billion neurons in their neocortex to process inputs and create understanding, whereas mice and rats range from 4 million to 18 million. By modelling how rats and mice navigate, we can create better algorithms to run on top of existing cheap sensors.
This project will revolutionise our understanding of how humans and animals use vision to determine their location in the world. That understanding will lead to new computer algorithms that enable robots to navigate under any environmental conditions using cheap visual sensors, and to breakthroughs in our knowledge of the brain.
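The trade-off described above, cheap low-resolution sensing redeemed by better matching algorithms, can be illustrated with a minimal sketch. This is hypothetical code, not the project's implementation: it matches a query image against stored "places" by sum of absolute differences over heavily downsampled intensity vectors, in the spirit of low-resolution, bio-inspired visual place recognition.

```python
# Illustrative sketch (assumed names, not the project's code): place recognition
# by comparing tiny downsampled images as flat lists of pixel intensities.

def sad(a, b):
    """Sum of absolute differences between two equal-length intensity vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def best_match(query, database):
    """Return (index, score) of the stored place most similar to the query."""
    scores = [sad(query, place) for place in database]
    idx = min(range(len(scores)), key=scores.__getitem__)
    return idx, scores[idx]

# Tiny demo: three "places" as 4-pixel thumbnails, plus a noisy revisit of the first.
places = [[10, 200, 30, 40], [90, 90, 90, 90], [200, 10, 150, 20]]
query = [12, 198, 33, 38]
idx, score = best_match(query, places)
print(idx)  # place 0 is the closest match
```

Even this crude matcher localises correctly when revisited views are similar enough; the research challenge is making such matching robust to changes in lighting, weather, and viewpoint.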
Publications
- Jacobson, A., Chen, Z. and Milford, M., 2018. Leveraging variable sensor spatial acuity with a homogeneous, multi-scale place recognition framework. Biological Cybernetics, 112(3), pp.209-225.
- Yu, F., Shang, J., Hu, Y. and Milford, M., 2019. NeuroSLAM: a brain-inspired SLAM system for 3D environments. Biological Cybernetics, 113(5), pp.515-545.
- Hausler, S., Jacobson, A. and Milford, M., 2019. Multi-process fusion: Visual place recognition using multiple image processing methods. IEEE Robotics and Automation Letters, 4(2), pp.1924-1931.
- Bruce, J., Jacobson, A. and Milford, M., 2017. Look no further: Adapting the localization sensory window to the temporal characteristics of the environment. IEEE Robotics and Automation Letters, 2(4), pp.2209-2216.
- Chen, Z., Lowry, S., Jacobson, A., Hasselmo, M.E. and Milford, M., 2015. Bio-inspired homogeneous multi-scale place recognition. Neural Networks, 72, pp.48-61.
Funding / Grants
- Australian Research Council Future Fellowship Scheme FT140101229 (2015 - 2019)