
TossingBot: Learning to Throw Arbitrary Objects With Residual Physics

Author(s): Zeng, Andy; Song, Shuran; Lee, Johnny; Rodriguez, Alberto; Funkhouser, Thomas

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr15j97
Full metadata record
dc.contributor.author: Zeng, Andy
dc.contributor.author: Song, Shuran
dc.contributor.author: Lee, Johnny
dc.contributor.author: Rodriguez, Alberto
dc.contributor.author: Funkhouser, Thomas
dc.date.accessioned: 2021-10-08T19:46:35Z
dc.date.available: 2021-10-08T19:46:35Z
dc.date.issued: 2020-06
dc.identifier.citation: Zeng, Andy, Shuran Song, Johnny Lee, Alberto Rodriguez, and Thomas Funkhouser. "TossingBot: Learning to Throw Arbitrary Objects With Residual Physics." IEEE Transactions on Robotics 36, no. 4 (2020): pp. 1307-1319. doi:10.1109/TRO.2020.2988642
dc.identifier.issn: 1552-3098
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr15j97
dc.description.abstract: We investigate whether a robot arm can learn to pick and throw arbitrary rigid objects into selected boxes quickly and accurately. Throwing has the potential to increase the physical reachability and picking speed of a robot arm. However, precisely throwing arbitrary objects in unstructured settings presents many challenges: from acquiring objects in grasps suitable for reliable throwing, to handling varying object-centric properties (e.g., mass distribution, friction, shape) and complex aerodynamics. In this work, we propose an end-to-end formulation that jointly learns to infer control parameters for grasping and throwing motion primitives from visual observations (RGB-D images of arbitrary objects in a bin) through trial and error. Within this formulation, we investigate the synergies between grasping and throwing (i.e., learning grasps that enable more accurate throws) and between simulation and deep learning (i.e., using deep networks to predict residuals on top of control parameters predicted by a physics simulator). The resulting system, TossingBot, is able to grasp and successfully throw arbitrary objects into boxes located outside its maximum reach range at 500+ mean picks per hour (600+ grasps per hour with 85% throwing accuracy); and generalizes to new objects and target locations.
dc.format.extent: 1307 - 1319
dc.language.iso: en_US
dc.relation.ispartof: IEEE Transactions on Robotics
dc.rights: Final published version. This is an open access article.
dc.title: TossingBot: Learning to Throw Arbitrary Objects With Residual Physics
dc.type: Journal Article
dc.identifier.doi: 10.1109/TRO.2020.2988642
dc.identifier.eissn: 1941-0468
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article
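
To give a sense of the residual-physics idea summarized in the abstract above, the sketch below is a minimal, hypothetical illustration, not the authors' implementation: an analytical ballistic estimate of the throwing release speed for a target at a given distance is combined with a learned residual correction. The names used here (ballistic_release_speed, ResidualThrowNet, the feature vector, the fixed 45-degree release angle with equal release and landing heights, and the neglect of drag) are assumptions for illustration; the paper's system predicts per-pixel grasping and throwing parameters with deep networks operating on RGB-D observations.

```python
import math

import torch
import torch.nn as nn

G = 9.81  # gravitational acceleration (m/s^2)


def ballistic_release_speed(distance_m: float, release_angle_deg: float = 45.0) -> float:
    """Physics-based estimate of the release speed for a projectile launched at
    `release_angle_deg` that should land `distance_m` away, assuming equal
    release and landing heights and negligible air drag.
    Range formula: R = v^2 * sin(2*theta) / g  =>  v = sqrt(R * g / sin(2*theta)).
    """
    theta = math.radians(release_angle_deg)
    return math.sqrt(distance_m * G / math.sin(2.0 * theta))


class ResidualThrowNet(nn.Module):
    """Hypothetical residual module: given a feature vector describing the
    grasped object and scene, predict a small correction to the physics estimate."""

    def __init__(self, feature_dim: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feature_dim, 32),
            nn.ReLU(),
            nn.Linear(32, 1),  # scalar residual on release speed (m/s)
        )

    def forward(self, features: torch.Tensor, physics_speed: torch.Tensor) -> torch.Tensor:
        # Commanded speed = analytical estimate + learned residual, so the
        # network only has to model what the simple ballistics misses
        # (mass distribution, friction at release, aerodynamics, ...).
        return physics_speed + self.mlp(features).squeeze(-1)


if __name__ == "__main__":
    net = ResidualThrowNet(feature_dim=64)
    features = torch.randn(1, 64)  # stand-in for learned visual features
    v_physics = torch.tensor([ballistic_release_speed(distance_m=1.5)])
    v_command = net(features, v_physics)
    print(f"physics estimate: {v_physics.item():.2f} m/s, commanded: {v_command.item():.2f} m/s")
```

In a setup like this, the residual would be trained from trial-and-error landing errors, which is the sense in which the abstract describes a synergy between simulation (the analytical estimate) and deep learning (the correction).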

Files in This Item:
TossingBotArbitraryObjectsResidualPhysics.pdf (4.42 MB, Adobe PDF)


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.