Australian Robotic Inspection and Asset Management Hub

Project dates: 01/11/2022 - Ongoing

What We Do

We are an ARC Research Hub developing autonomous robotic systems to transform the way our assets and infrastructure are managed.

Our Hub unites a team of globally recognised leaders in field robotics research and development from academia, government and industry to generate important new knowledge in robotics and associated fields.

We are the Australian Robotic Inspection and Asset Management Hub (ARIAM), an ARC Research Hub hosted by the University of Sydney in partnership with Queensland University of Technology and The Australian National University.

Many of Australia’s infrastructure assets were built in a period of rapid growth in the three decades following World War II. These assets are now rapidly approaching the end of their 50- to 80-year lifespans, creating an “infrastructure cliff”. This is a clear call to action.

ARIAM aims to transform the way assets and infrastructure are managed through the development of intelligent robotic systems with new capabilities for inspection, monitoring, maintenance, and optimisation.

We share a vision of a future in which robots autonomously collect data for integration into a digital twin that provides a real-time representation of the true state of a physical asset.


Work With Us: PhD and Postdoc Positions Available!

Starting in 2023, we will have several fully funded PhD positions available at QUT (further positions are available at our partner universities in Sydney and Canberra).

Available from April 2023: two PhD positions on the topic “Representing and Understanding the Environment through Multi-Modal Spatio-Temporal Implicit Scene Representations”

Project Description

Accurately mapping a large-scale asset with intricate geometry in the presence of changes in structure and appearance over long periods is still exceptionally challenging.

The problem becomes even more difficult when multi-modal data from a range of different sensors (e.g. lidars and cameras, but also more specialised hardware such as gas sensors) needs to be integrated, and when the sensor data is gathered by multiple heterogeneous agents (e.g. robots, drones, or human-operated sensor platforms of different kinds).

Furthermore, extracting insights and knowledge from the created maps is an ongoing challenge, especially when the requested insights are of a semantic or similarly high-level nature, or are not even fully known at the time the representation is created.

The project has two principal aims:

  1. Investigate novel algorithms that can efficiently construct and maintain an implicit neural field representation from diverse sensor data, such as lidars and cameras, but also more specialised hardware such as gas sensors. The resulting representation is spatio-temporal: it not only captures the 3D spatial structure of an environment but also integrates a temporal dimension, allowing sensor data taken at different points in time to be fused. The project will also investigate how sensor data from multiple heterogeneous robotic platforms can be utilised by the implicit representation (a minimal code sketch of such a representation follows this list).
  2. Develop new algorithms to extract high-level, semantically meaningful information and insights from the resulting representation of the environment, including identifying relevant changes in the environment over time. The nature of these insights will vary across different concrete applications.
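
To make the two aims concrete, here is a minimal sketch (in PyTorch) of a spatio-temporal implicit neural field: a small MLP that maps a query point (x, y, z, t) to an occupancy value plus one additional sensing modality (a gas-concentration channel). All architectural choices, variable names, thresholds, and the synthetic training data below are illustrative assumptions for this sketch, not the Hub's actual method:

# Minimal sketch of a spatio-temporal implicit neural field (Aim 1).
# Layer sizes, the Fourier encoding, and the gas channel are illustrative
# assumptions, not ARIAM's actual method.
import torch
import torch.nn as nn


class SpatioTemporalField(nn.Module):
    """Maps a query (x, y, z, t) to occupancy and a gas-concentration value."""

    def __init__(self, num_freqs: int = 6, hidden: int = 128):
        super().__init__()
        self.num_freqs = num_freqs
        in_dim = 4 * 2 * num_freqs  # sin/cos Fourier features of (x, y, z, t)
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),  # [occupancy logit, gas concentration]
        )

    def encode(self, q: torch.Tensor) -> torch.Tensor:
        # Fourier (positional) encoding helps the MLP fit fine geometric detail.
        freqs = 2.0 ** torch.arange(self.num_freqs, dtype=q.dtype, device=q.device)
        angles = q.unsqueeze(-1) * freqs              # (..., 4, num_freqs)
        feats = torch.cat([angles.sin(), angles.cos()], dim=-1)
        return feats.flatten(start_dim=-2)            # (..., 4 * 2 * num_freqs)

    def forward(self, q: torch.Tensor):
        out = self.mlp(self.encode(q))
        occupancy = torch.sigmoid(out[..., 0])
        gas = out[..., 1]
        return occupancy, gas


# Fitting the field reduces to regression: every agent, whatever its platform,
# contributes (query point, time, measurement) tuples. The data here is synthetic.
field = SpatioTemporalField()
optimiser = torch.optim.Adam(field.parameters(), lr=1e-3)

queries = torch.rand(1024, 4)              # random (x, y, z, t) samples
occ_obs = (queries[:, 0] > 0.5).float()    # synthetic occupancy labels
gas_obs = torch.rand(1024)                 # synthetic gas readings

for _ in range(100):
    occ_pred, gas_pred = field(queries)
    loss = (nn.functional.binary_cross_entropy(occ_pred, occ_obs)
            + nn.functional.mse_loss(gas_pred, gas_obs))
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()

# Aim 2 (sketch): frame change detection as querying the same location at two
# timestamps and comparing outputs; the 0.5 threshold is an arbitrary assumption.
p = torch.tensor([[0.2, 0.4, 0.1]])
occ_t0, _ = field(torch.cat([p, torch.tensor([[0.0]])], dim=-1))
occ_t1, _ = field(torch.cat([p, torch.tensor([[1.0]])], dim=-1))
changed = (occ_t1 - occ_t0).abs() > 0.5

The actual research concerns far richer variants of this idea (multi-modal fusion, efficiency, semantics), but the sketch shows the core structure: a single continuous function over space and time, trained from heterogeneous observations and queryable for downstream insights such as change detection.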

We will recruit two PhD students with a strong background in machine learning and robotic vision. Each student will be responsible for working towards one of the two aims described above.

If you are interested, please contact the project leader, Prof Niko Suenderhauf.


Funding / Grants

  • ARC ITRH (2022 - 2027)

Chief Investigators

Partners