rh20t.github.io - RH20T: A Comprehensive Robotic Dataset for Learning Diverse Skills in One-Shot



A key challenge in open-domain robotic manipulation is acquiring diverse and generalizable skills. Recent research in one-shot imitation learning has shown promise in transferring trained policies to new tasks from a single demonstration. This capability is attractive for enabling robots to acquire new skills and for improving task and motion planning. However, due to limitations of existing training datasets, the community has so far focused mainly on simple cases, such as pushing or pick-and-place.

This paper aims to unlock the potential for an agent to generalize to hundreds of real-world skills with multi-modal perception. To this end, we have collected a dataset of over 110,000 contact-rich robot manipulation sequences spanning diverse skills, contexts, robots, and camera viewpoints, all recorded in the real world. Each sequence includes visual, force, audio, and action information, along with a corresponding human demonstration video. We have invested significant efforts
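As a rough sketch of what one such multi-modal sequence might look like in code (the class and field names below are illustrative assumptions, not the dataset's actual schema or API):

```python
from dataclasses import dataclass
from typing import List

# Hypothetical record for one RH20T-style sequence. Field names and shapes
# are assumptions for illustration; consult the dataset for the real format.
@dataclass
class ManipulationSequence:
    robot: str                       # robot model used for collection
    task_id: str                     # identifier of the skill being performed
    camera_views: List[str]          # one video path per camera viewpoint
    force_torque: List[List[float]]  # per-timestep 6-axis force/torque readings
    audio_path: str                  # microphone recording for the episode
    actions: List[List[float]]       # per-timestep robot actions
    human_demo_video: str            # paired human demonstration video

seq = ManipulationSequence(
    robot="example-arm",
    task_id="task_0001",
    camera_views=["cam_0.mp4", "cam_1.mp4"],
    force_torque=[[0.0] * 6],
    audio_path="audio.wav",
    actions=[[0.0] * 7],
    human_demo_video="human_demo.mp4",
)
print(len(seq.camera_views))      # number of camera viewpoints in this episode
print(len(seq.force_torque[0]))   # 6-axis force/torque reading per timestep
```

The point of such a structure is that every modality described above (vision, force, audio, actions, plus the human demonstration) travels together with each episode, which is what distinguishes this dataset from vision-only collections.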

We select 48 tasks from RLBench, 29 tasks from MetaWorld, and introduce 70 self-proposed tasks that are frequently encountered in daily life and achievable by robots.
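A quick tally of the task sources listed above:

```python
# Task counts as stated in the text.
task_sources = {"RLBench": 48, "MetaWorld": 29, "self-proposed": 70}
total_tasks = sum(task_sources.values())
print(total_tasks)  # → 147
```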
