OpenCV AI competition project aims to search for early signs of life on other planets

The QUT Space White Belly Sea Eagles Team has submitted its project to the OpenCV AI Competition 2021. The project aims to advance some of the technologies needed to search for early signs of life on other planets, one of the frontiers of the twenty-first century.

The competition is focused on demonstrating the performance of a new family of cameras that run AI inference directly on the device. The team has worked on integrating the camera with an autonomous mission planner on a UAV, which detects biosignatures and approaches them for closer inspection and confirmation of the detection.
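
As an illustration, below is a minimal sketch of what on-device inference with the open-source DepthAI Python API can look like. The model file, input size and output handling are placeholder assumptions, not the team's actual biosignature detector.

```python
# Minimal sketch: running a compiled neural network directly on an OAK
# camera via the DepthAI Python API. The blob path and input size are
# hypothetical placeholders, not the team's actual model.
import depthai as dai

pipeline = dai.Pipeline()

# Colour camera node; its preview output feeds the on-device network.
cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(256, 256)
cam.setInterleaved(False)

# Generic neural-network node running a model compiled to a .blob file.
nn = pipeline.create(dai.node.NeuralNetwork)
nn.setBlobPath("mudcrack_model.blob")  # hypothetical model file
cam.preview.link(nn.input)

# Stream the raw network output back to the host (here, the UAV's
# onboard computer), where a mission planner could consume it.
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("nn")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="nn", maxSize=4, blocking=False)
    while True:
        result = q.get()  # blocks until the device produces an inference
        scores = result.getFirstLayerFp16()  # flat list of output values
        # ... hand the detections to the mission planner ...
```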

In March, 210 teams were announced as first-round winners and received their sets of OpenCV AI Kit (OAK) cameras. Teams from all over the world are competing for regional and global prizes, which will be announced on 6 September 2021.

The camera on the UAV

The project

The project fuses capabilities developed across the QUT Centre for Robotics. Julian Galvez’s expertise in biosignatures for life detection on other planets was combined with autonomous UAV decision making under environment and target uncertainty, under the instruction of Fernando Vanegas and Juan Sandino. Furthermore, the created maps carry an inherent semantic meaning, which is part of Nicolas Mandel’s work.

Under the supervision of Professor Felipe Gonzalez and with the support of students Nam Ly, Vanessa Zepeda and Leonardo Dominguez, the team was able to collect and label a proprietary dataset, construct simulation environments and run field experiments.

Experimental setup, with the drone mid-mission

Julian Galvez collected images of mud cracks, the objects of interest, from a UAV perspective at Forest Lake in Brisbane, as well as in Western Australia under the guidance of Dr. David Flannery on a three-week field trip. Nam Ly supported him with labelling and training an object detector, which was also able to detect mud cracks in NASA images of Mars that were unseen during training.

Nicolas Mandel developed a semantic segmentation training pipeline using PyTorch and PyTorch Lightning on QUT's High-Performance Computing cluster. The pipeline allows for swift retraining with new data and exports the trained model before converting it to a deployable format. Julian Galvez exported the model to the OpenCV camera with the support of Juan Sandino, who deployed it on the drone and integrated the resulting observations into a decision-making pipeline.
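
A minimal sketch of such a Lightning pipeline is given below; the toy network, random placeholder data and file names are illustrative assumptions rather than the team's actual code. Exporting to ONNX is shown as one common first step towards a format deployable on the OAK camera (e.g. via OpenVINO).

```python
# Minimal sketch of a PyTorch Lightning segmentation pipeline of the kind
# described above. The tiny network, random dataset and class count are
# illustrative placeholders, not the team's actual model or data.
import torch
import torch.nn as nn
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

class SegModel(pl.LightningModule):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Toy fully convolutional network; a real pipeline would use a
        # proper encoder-decoder architecture.
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, num_classes, 1),
        )
        self.loss = nn.CrossEntropyLoss()

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        images, masks = batch
        loss = self.loss(self(images), masks)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Placeholder data: 8 RGB images with per-pixel integer labels.
images = torch.rand(8, 3, 256, 256)
masks = torch.randint(0, 2, (8, 256, 256))
loader = DataLoader(TensorDataset(images, masks), batch_size=4)

# Swift retraining amounts to re-running fit() with an updated loader.
model = SegModel()
pl.Trainer(max_epochs=1).fit(model, loader)

# Export to ONNX, a common first step before converting the model into a
# format deployable on the OAK camera.
model.eval()
torch.onnx.export(model, torch.rand(1, 3, 256, 256), "segmentation.onnx")
```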

Conceptual setup

Experiments were conducted at the Samford Ecological Reserve Facility (SERF) with the kind support of Marcus Yates, who supplied tools and local access. The UAVs used for this research were X500 airframes with a Pixhawk autopilot, an Intel UP2 onboard computer and an OAK camera.

All computations, including the network inference, were conducted onboard the UAV, showing potential for future autonomous applications.

Open-source software played an integral part in the success of the project. The PX4 flight controller software, the OAK DepthAI API and the PyTorch Lightning framework are all open source and underline the importance of freely available tools.

Current projects

Autonomous Mission Planning, Navigation and Geological Feature Recognition using UAVs (Drones)

UAV Navigation using semantic cues

Autonomous UAV decision making under environment and target detection uncertainty