[arXiv] [Project Website] [Demonstration Dataset]
Manipulating deformable objects remains a challenge in robotics due to the difficulties of state estimation, long-horizon planning, and predicting how an object will deform under a given interaction. These challenges are most pronounced for 3D deformable objects. We propose SculptDiff, a goal-conditioned imitation learning framework that operates on point cloud state observations to directly learn clay sculpting policies for a variety of target shapes. To the best of our knowledge, this is the first real-world method that successfully learns manipulation policies for 3D deformable objects.
Follow the Demonstration Dataset link above, download and unzip the files, and update the dataset path in `dataset.py`.
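As a rough sketch of what that path update might look like (the constant name `DATASET_DIR` and the per-frame `.npy` layout below are illustrative assumptions, not the repository's actual structure):

```python
# Hypothetical sketch of pointing dataset.py at the unzipped demonstrations.
# The constant name and file layout are illustrative assumptions only.
from pathlib import Path

import numpy as np

# Update this to wherever you unzipped the demonstration dataset.
DATASET_DIR = Path("/home/user/data/sculpt_demonstrations")

def load_demo_point_clouds(demo_name: str) -> list[np.ndarray]:
    """Load every point cloud frame (an N x 3 array) saved for one demonstration."""
    demo_dir = DATASET_DIR / demo_name
    return [np.load(f) for f in sorted(demo_dir.glob("*.npy"))]
```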
Follow the Point-BERT installation instructions and download the pretrained Point-BERT model weights.
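If you only want to verify that the downloaded checkpoint loads, a minimal PyTorch check along these lines can help; the checkpoint path and the `base_model` key are assumptions, so defer to the Point-BERT repository's own loading utilities for constructing the actual encoder:

```python
# Hedged sketch: inspect the downloaded Point-BERT checkpoint with PyTorch.
# The path and the "base_model" key are assumptions; use the Point-BERT
# repository's own loading code to build and restore the real encoder.
import torch

CKPT_PATH = "weights/Point-BERT.pth"  # wherever you saved the downloaded weights

checkpoint = torch.load(CKPT_PATH, map_location="cpu")
state_dict = checkpoint.get("base_model", checkpoint)  # some releases nest the weights
print(f"Loaded {len(state_dict)} parameter tensors from {CKPT_PATH}")

# encoder = build_point_bert(config)                # model class from the Point-BERT repo
# encoder.load_state_dict(state_dict, strict=False)
```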
To train a point cloud-based sculpting policy, run `train_policy.py`.
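Conceptually, each training step conditions the policy on embeddings of the current and goal point clouds and imitates the demonstrated action. The sketch below is a simplified stand-in for the actual loop in `train_policy.py`: the encoder, policy head, and plain regression loss are placeholders, not the repository's implementation.

```python
# Simplified, illustrative training step for a goal-conditioned point cloud
# policy. This is NOT the repository's train_policy.py; the encoder, policy
# head, and MSE loss are placeholders for the actual training objective.
import torch
import torch.nn.functional as F

def train_step(encoder, policy, optimizer, batch):
    """batch holds 'state_pc' and 'goal_pc' (B x N x 3) plus 'action' (B x A)."""
    state_emb = encoder(batch["state_pc"])           # B x D current-state embedding
    goal_emb = encoder(batch["goal_pc"])             # B x D target-shape embedding
    cond = torch.cat([state_emb, goal_emb], dim=-1)  # goal-conditioned observation
    pred_action = policy(cond)                       # predicted sculpting action
    loss = F.mse_loss(pred_action, batch["action"])  # imitation learning loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```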
Follow the link to the Hardware CAD to replicate our camera cage.