Use Case: Mobile ALOHA based on AgileX Robotics
AgileX Robotics

Published on Jan 4, 2024

Introducing 𝐌𝐨𝐛𝐢𝐥𝐞 𝐀𝐋𝐎𝐇𝐀 🏄 -- Learning Bimanual Mobile Manipulation with Low-Cost Whole-Body Teleoperation!

With 50 demos, our robot can autonomously complete complex mobile manipulation tasks:
- cook and serve shrimp 🦐
- call and take an elevator 🛗
- store a 3 lbs pot in a two-door cabinet
- push 5 consecutive chairs
- rinse pan using a water faucet
- give high fives to people

Co-led by Tony Z. Zhao and Chelsea Finn

Our robot handles these tasks consistently:
- succeeds 9 times in a row on Wipe Wine
- succeeds 5 times on Call Elevator
- is robust to distractors on Use Cabinet
- extrapolates to chairs unseen during training

How do we achieve this with only 50 demos? The key is to co-train imitation learning algorithms with static ALOHA data. We found this to consistently improve performance, especially for tasks that require precise manipulation.

Co-training (1) improves the performance across all tasks, (2) is compatible with ACT, Diffusion Policy and VINN, (3) is robust to different data mixtures.
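As a rough illustration of what co-training looks like in practice, here is a minimal Python sketch of mixing the two datasets when building training batches. This is not the released Mobile ALOHA code; the function name, dataset variables, and the 50/50 sampling ratio are assumptions for illustration only.

```python
import random

def cotrain_batches(mobile_demos, static_demos, batch_size=32, mobile_ratio=0.5):
    """Yield batches mixing mobile ALOHA demos with static ALOHA demos.

    mobile_demos / static_demos: lists of (observation, action) pairs.
    mobile_ratio: fraction of each batch drawn from the mobile-manipulation
    data (assumed value; the announcement notes co-training is robust to
    different data mixtures).
    """
    n_mobile = int(batch_size * mobile_ratio)
    n_static = batch_size - n_mobile
    while True:
        batch = (random.sample(mobile_demos, n_mobile)
                 + random.sample(static_demos, n_static))
        random.shuffle(batch)
        yield batch

# Usage: feed these mixed batches to any imitation-learning policy
# (e.g. ACT, Diffusion Policy, or VINN) instead of training on the
# 50 mobile demos alone.
```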

We open-source all the software and data of Mobile ALOHA!

Project Website 🛜: https://lnkd.in/gE6A43fR
Code for Imitation Learning 🖥️: https://lnkd.in/gDCmgy_E
Data 📊: https://lnkd.in/gCJJtmvT

#AgileXRobotics #AI #UGV #AGV #Tracer #MobileALOHA
