This research program addresses the fundamental problem of how a robot or autonomous vehicle uses perception to build maps and to calculate and track its location in the world. Research questions include how:
- the appearance of a place changes as a function of time, season, weather, viewpoint and environment type
- understanding context and semantics can enhance performance
- lifelong reliability can be achieved as the world continually changes
- navigation relates to the neurological structures and behavioural mechanisms used by animals and humans
- new perception technologies can be applied to this problem.
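To make the core localisation problem concrete, here is a minimal sketch of visual place recognition: matching a query image against a database of reference images via global descriptors and cosine similarity. This is an illustrative toy (hand-crafted grid-of-means descriptors, synthetic images), not the program's actual methods, which use far more sophisticated learned representations.

```python
import numpy as np

def global_descriptor(image: np.ndarray, grid: int = 4) -> np.ndarray:
    """Summarise an image as a grid of mean intensities, then L2-normalise.

    A deliberately simple stand-in for the learned descriptors used in
    visual place recognition research.
    """
    h, w = image.shape
    cells = [
        image[i * h // grid:(i + 1) * h // grid,
              j * w // grid:(j + 1) * w // grid].mean()
        for i in range(grid) for j in range(grid)
    ]
    v = np.asarray(cells, dtype=float)
    v -= v.mean()  # crude illumination invariance
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v

def match_place(query: np.ndarray, reference_db: list) -> int:
    """Return the index of the best-matching reference image (cosine similarity)."""
    q = global_descriptor(query)
    sims = [q @ global_descriptor(r) for r in reference_db]
    return int(np.argmax(sims))

# Tiny synthetic example: three 'places' plus a noisy revisit of place 1.
rng = np.random.default_rng(0)
places = [rng.random((32, 32)) for _ in range(3)]
revisit = places[1] + 0.05 * rng.random((32, 32))  # same place, changed conditions
print(match_place(revisit, places))  # expected: 1
```

The hard research questions listed above arise precisely where this toy breaks down: when appearance changes with season, weather, or viewpoint, descriptor similarity alone is no longer enough.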
This research program builds on considerable existing industry collaborations in high-value sectors, including mining and defence. We aim to expand into logistics, construction, space and remote operations.
We continually build on our extensive engagement program of more than 50 public and sector-based talks, workshops, media appearances, and large-scale events each year.
Projects
- A baseline dataset for performance evaluation of visual detection and classification techniques in mining environments
- AgBot II Robotic Site-specific Crop and Weed Management Tool
- Novel autonomous robotic weed control to maximise agricultural productivity
- AUSMURI: Neuro-Autonomy: Neuroscience-Inspired Perception, Navigation, and Spatial Awareness for Autonomous Robots
- Australian Research Council Industrial Transformation Training Centre for Joint Biomechanics
- Automated Early-Detection of the Invasive Grass African Lovegrass
- Autonomous Mission Planning, Navigation and Geological Feature Recognition using UAVs (Drones)
- Autonomous UAV decision making under environment and target detection uncertainty
- [COMPLETED] ARC Future Fellowship: Superhuman place recognition
- [COMPLETED] Automation-enabling positioning for underground mining
- [COMPLETED] How automated vehicles will interact with road infrastructure
- Contextual Hazard Detection
- Continuum robots for minimally invasive orthopaedic surgeries
- Evaluating the effect of illumination on the performance of visual odometry in underground mining environments
- HD maps for automated driving
- Illuminant invariant cameras and flexible image sensors
- Mini Autonomous Vehicles
- Rheinmetall Defence Australia: Advanced Terrain Detection (ATD)
- Robotic knee arthroscopy
- UAV Navigation using semantic cues
- US Air Force / AOARD: An infinitely scalable learning and recognition network
- Visual Place Recognition for Robotics in Extreme Environments
Team
Led by Professor Michael Milford
- Ahmad Khaliq (PhD Researcher)
- Prof Andry Rakotonirainy (Professor)
- Brandon Richard Webster (Visiting Researcher, Fulbright Scholar from University of Notre Dame)
- Connor Malone (PhD Researcher)
- Dorian Tsai (Research Associate)
- Assoc Prof Felipe Gonzalez (Associate Professor)
- Dr Fernando Vanegas Alvarez (Research Fellow)
- Garima Samvedi (Senior Research Engineer)
- Ian Greyvensteyn (MPhil Researcher)
- James Mount (Research Engineer)
- Prof Jennifer Firn (Professor)
- John Skinner (PhD Researcher)
- Jon Hu (Project Officer)
- Julian Galvez-Serna (PhD Researcher)
- Marvin Chancán (PhD Researcher)
- Prof Michael Milford (Deputy Director, Program Lead (Perception & Localisation), Professor)
- Ming Xu (PhD Researcher)
- Nicolas Mandel (PhD Researcher)
- Scarlett Raine (PhD Researcher)
- Prof Sebastien Glaser (Professor)
- Dr Sourav Garg (Research Fellow)
- Stephen Hausler (PhD Researcher)
- Dr Tobias Fischer (Research Fellow)