Contact tasks such as grinding, polishing and assembly require a robot to physically interact with both rigid and flexible objects. Current methods that rely on force control struggle to achieve consistent finishing results and lack robustness against the non-linear dynamics inherent in material handling. This Project will take a new approach that detects and diagnoses the dynamical process through deep-learning fusion of multi-sensory data, including force/tactile, visual, thermal, sound and acoustic-emission signals, and generates corrective process parameters to achieve the goals of a contact task. The Project will investigate and develop new theories and methods for machine and process modelling, and for model-based robot contact control in time- and space-variant processes. The Project will utilise industrial IoT and cloud computing to develop digital twins that track, predict and adjust the robot's behaviour while executing a contact task such as grinding or finishing.
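As a purely illustrative sketch of the kind of sensor-fusion step described above (not the Project's actual design), the fusion could take the form of a small multi-modal network that embeds each sensor stream and maps the fused features to corrective process parameters; all module names, feature dimensions and outputs below are assumptions.

# Illustrative sketch only: a minimal multi-modal fusion network mapping
# per-modality sensor features to corrective process parameters.
# All modality names, dimensions and outputs are assumptions.
import torch
import torch.nn as nn

class ContactProcessFusion(nn.Module):
    def __init__(self, modality_dims, n_params=3, embed_dim=32):
        super().__init__()
        # One small encoder per sensing modality (force/tactile, vision, etc.)
        self.encoders = nn.ModuleDict({
            name: nn.Sequential(nn.Linear(dim, embed_dim), nn.ReLU())
            for name, dim in modality_dims.items()
        })
        # Fused embedding -> corrective process parameters
        # (e.g. feed-rate or contact-force adjustments; hypothetical outputs).
        self.head = nn.Sequential(
            nn.Linear(embed_dim * len(modality_dims), 64),
            nn.ReLU(),
            nn.Linear(64, n_params),
        )

    def forward(self, inputs):
        # inputs: dict mapping modality name -> feature tensor [batch, dim]
        fused = torch.cat(
            [self.encoders[name](x) for name, x in inputs.items()], dim=-1
        )
        return self.head(fused)

# Example usage with made-up feature sizes for each sensing channel.
dims = {"force_tactile": 6, "vision": 128, "thermal": 16,
        "sound": 64, "acoustic_emission": 64}
model = ContactProcessFusion(dims)
batch = {name: torch.randn(8, dim) for name, dim in dims.items()}
corrections = model(batch)  # shape [8, 3]: corrective parameter deltas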
This project is sponsored by the Australian Cobotics Centre and will be a collaboration with partner organisations WELD Australia, IR4, B&R Enclosures and Infrabuild.