PaintNet: Unstructured Multi-Path Learning from 3D Point Clouds for Robotic Spray Painting

G. Tiboni, R. Camoriano, T. Tommasi
2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)

Abstract
Popular industrial robotic problems such as spray painting and welding require (i) conditioning on free-shape 3D objects and (ii) planning of multiple trajectories to solve the task. Yet, existing solutions make strong assumptions on the form of input surfaces and the nature of output paths, resulting in limited approaches unable to cope with real-data variability. By leveraging recent advances in 3D deep learning, we introduce a novel framework capable of dealing with arbitrary 3D surfaces, and handling a variable number of unordered output paths (i.e. unstructured). Our approach focuses on predicting smaller path segments, which can be later concatenated to reconstruct long-horizon paths. We extensively validate the proposed method in the context of robotic spray painting by releasing PaintNet, the first public dataset of expert demonstrations on free-shape 3D objects collected in a real industrial scenario. A thorough experimental analysis demonstrates the capabilities of our model to promptly predict smooth output paths that cover up to 95% of the surface of previously unseen object instances. Furthermore, we show how models learned from PaintNet capture relevant features which serve as a reliable starting point to improve data and time efficiency when dealing with new object categories.
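The segment-then-concatenate idea mentioned in the abstract can be illustrated with a toy sketch: given short, unordered path segments, repeatedly append the segment whose start point lies closest to the current path's end point. This is illustrative only, not the authors' actual reconstruction procedure, and all names below are hypothetical; real strokes use 6D poses, while the toy uses 3D points.

```python
import numpy as np

def concatenate_segments(segments):
    """Greedy toy reconstruction: chain segments by nearest endpoints.

    segments: list of (T_i, 3) arrays of 3D points. Illustrative sketch
    only -- not the concatenation procedure used in the paper.
    """
    remaining = [np.asarray(s, dtype=float) for s in segments]
    path = [remaining.pop(0)]
    while remaining:
        end = path[-1][-1]
        # pick the segment whose first point is closest to the current end
        dists = [np.linalg.norm(s[0] - end) for s in remaining]
        path.append(remaining.pop(int(np.argmin(dists))))
    return np.concatenate(path, axis=0)

# toy example: three 2-point segments along the x-axis, given out of order
segs = [np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]),
        np.array([[2.0, 0.0, 0.0], [3.0, 0.0, 0.0]]),
        np.array([[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])]
long_path = concatenate_segments(segs)  # points ordered along the x-axis
```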


Authored by Gabriele Tiboni, Raffaello Camoriano, and Tatiana Tommasi from Politecnico di Torino. This work was supported by the EFORT group, which provided the authors with domain knowledge, original object meshes, trajectory data, and access to the proprietary spray painting simulator used during the experiments.

Overview of our method for multi-path prediction of 6D-pose spray painting paths given a raw 3D point cloud as input.

Dataset

We introduce the PaintNet dataset to accelerate research on supervised learning for multi-path prediction conditioned on free-shape 3D objects. PaintNet includes more than 800 object meshes and the associated spray painting strokes collected in a real industrial setting. The data currently covers four object categories of increasing complexity: cuboids, windows, shelves, and containers. All object meshes are provided in a subdivided, smoothed, watertight version to avoid sharp edges and holes. For each object, the associated unordered set of spray painting paths (a.k.a. strokes) is given, each being a sequence of end-effector poses. The 6-dimensional poses encode the 3D position of the ideal paint deposit point (12 cm from the gun nozzle) and the gun orientation as Euler angles. Each pose is collected by sampling the end-effector kinematics every 4 ms during offline program execution.
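As a concrete (hypothetical) reading of the format described above, a stroke can be held as a (T, 6) array whose rows are [x, y, z, rx, ry, rz], and an object then maps to an unordered list of such arrays. The variable names, axis convention, and toy data below are assumptions for illustration, not the dataset's actual file layout or API.

```python
import numpy as np

# One object = an unordered set of strokes; one stroke = a sequence of
# 6D end-effector poses sampled every 4 ms during program execution.
# Rows: [x, y, z, rx, ry, rz] -- 3D position of the ideal paint deposit
# point plus gun orientation as Euler angles (axis convention assumed).
rng = np.random.default_rng(0)
strokes = [rng.standard_normal((T, 6)) for T in (120, 80, 200)]  # toy data

positions = [s[:, :3] for s in strokes]      # 3D deposit-point positions
orientations = [s[:, 3:] for s in strokes]   # Euler-angle gun orientations

# With 4 ms between samples, a stroke of T poses spans (T - 1) * 0.004 s
duration_s = (strokes[0].shape[0] - 1) * 0.004
```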

Download the PaintNet dataset at https://zenodo.org/records/10105273.

Unzip the desired categories from the link above into a local path path/to/dataset/. Then set the environment variable: export PAINTNET_ROOT=path/to/dataset/.
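From Python, the same setup can be sketched as follows; the category folder layout under the root is an assumption based on the dataset description, not a documented structure.

```python
import os
from pathlib import Path

# Equivalent of `export PAINTNET_ROOT=path/to/dataset/` in a shell
os.environ["PAINTNET_ROOT"] = "path/to/dataset/"

root = Path(os.environ["PAINTNET_ROOT"])
# Hypothetical per-category folders (names from the dataset description;
# the exact on-disk layout may differ)
categories = ["cuboids", "windows", "shelves", "containers"]
category_dirs = [root / c for c in categories]
```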

Citing

If you use our code or dataset, please consider citing:

@misc{tiboni2022paintnet,
  title = {PaintNet: Unstructured Multi-Path Learning from 3D Point Clouds for Robotic Spray Painting},
  author = {Tiboni, Gabriele and Camoriano, Raffaello and Tommasi, Tatiana},
  doi = {10.48550/ARXIV.2211.06930},
  keywords = {Robotics (cs.RO), Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences},
  publisher = {arXiv},
  year = {2022}
}