This project is financed by CHIST-ERA and is part of an international research project named "Perception-guided robust and reproducible robotic grasping and manipulation". In this project, the team of researchers will address the problem of autonomous robotic grasping of objects in challenging scenes.
We consider two industrially and economically important open challenges which require advanced vision-guided grasping:
- “Bin-picking” for manufacturing, where components must be grasped from a random, self-occluding heap inside a bin or box. Parts may have known models, but will only be partially visible in the heap and may have complex shapes. Shiny/reflective metal parts make 3D vision difficult, and the bin walls provide difficult reach-to-grasp and visibility constraints.
- Waste materials handling, which may be hazardous (e.g., nuclear) waste, or materials for recycling in the circular economy. Here the robot has no prior models of object shapes, and grasped materials may also be deformable (e.g., contaminated gloves, hoses).
More specifically, Norlab will be responsible for investigating 3D perception solutions robust to harsh environmental conditions, enabling the grasping of reflective, transparent, or flexible objects. The initial solutions will be developed using a UR10e equipped with a flash lidar mounted on the wrist.
- Principal Investigator: Clément Gosselin
- PI on Perception: François Pomerleau
- International Collaborators
- Technical Lead: TBD