Robust perception in dusty environments for autonomous drones

While humans can easily recognise dust and distinguish it from other objects such as wires and thin structures, doing so reliably is a very hard problem for a robot. This project will focus on researching novel approaches to autonomous perception and navigation in dusty environments, using different sensor modalities and AI algorithms to classify and filter out dust and/or to develop resilient perception systems.

As a PhD candidate, you will research, implement, and demonstrate advanced visual odometry systems for use on a variety of autonomous platforms, including drones and ground robots such as Spot from Boston Dynamics. You will be part of a world-class autonomy team of researchers and engineers from Emesent and QUT who are pushing the state of the art in autonomous systems and artificial intelligence.

Fully funded and top-up scholarships are available with this project.
