MER Lab conducts research on robotic manipulation strategies and their applications. We integrate computer vision, control theory, and machine learning techniques to design skillful and robust robots.
One of the major goals of the MER Lab is to identify environmental problems (e.g., recycling, waste sorting) that robots can alleviate, and to develop solutions for the missing manipulation capabilities. MER Lab also focuses on benchmarking efforts for robotic manipulation, dexterous manipulation, visual servoing, soft robot control, and active vision.
For any questions, please e-mail email@example.com.
Our Undergraduates Implemented a Waste Sorting Robot for Recycling
First Public Dataset From a Recycling Plant
Our paper “ZeroWaste Dataset: Towards Deformable Object Segmentation in Cluttered Scenes” has been accepted to CVPR 2022! We introduce the first publicly available dataset for waste classification collected from an operating materials recovery facility (MRF). We also present baseline classification results with state-of-the-art algorithms. This work will enable researchers to develop vision and robotic systems for recycling applications. Thank you Boston University and Washington University for the excellent collaboration!
Human Robot Partnership for Ship Breaking
In collaboration with the European Metal Recycling company, we are developing a metal scrap cutting robot to break down large metal structures into small recyclable pieces. You can learn more from our paper “Towards Robotic Metal Scrap Cutting: A Novel Workflow and Pipeline for Cutting Path Generation,” published in the proceedings of CASE 2021.
Towards a Cloth Object Set
Our paper “Household Cloth Object Set: Fostering Benchmarking in Deformable Object Manipulation” has been accepted to RA-Letters! In this paper, we discuss how to form a set of cloth objects for developing benchmarking protocols for cloth manipulation research. Thanks to Institut de Robòtica i Informàtica Industrial for leading this effort and to UMass Lowell for the great collaboration.