Transferring Implicit Knowledge of Non-Visual Object Properties Across Heterogeneous Robot Morphologies

by   Gyan Tatiya, et al.

Humans leverage multiple sensory modalities when interacting with objects and discovering their intrinsic properties. Using the visual modality alone is insufficient for deriving intuition about object properties (e.g., which of two boxes is heavier), making it essential to also consider non-visual modalities such as touch and audio. While robots may leverage various modalities to acquire an understanding of object properties via learned exploratory interactions with objects (e.g., grasping, lifting, and shaking behaviors), a challenge remains: the implicit knowledge acquired by one robot via object exploration cannot be directly leveraged by another robot with a different morphology, because the sensor models, observed data distributions, and interaction capabilities differ across robot configurations. To avoid the costly process of learning interactive object perception tasks from scratch, we propose a multi-stage projection framework, trained for each new robot, that transfers implicit knowledge of object properties across heterogeneous robot morphologies. We evaluate our approach on object-property recognition and object-identity recognition tasks, using a dataset in which two heterogeneous robots perform 7,600 object interactions. Results indicate that knowledge can be transferred across robots, such that a newly deployed robot can bootstrap its recognition models without exhaustively exploring all objects. We also propose a data augmentation technique and show that it improves model generalization. We release our code and datasets.
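The core idea of cross-morphology transfer can be illustrated with a minimal sketch (not the authors' actual framework): if both robots have explored a small shared set of objects, a projection from the source robot's feature space to the target robot's feature space can be fit on those paired observations, and then applied to the source robot's features for objects the target has never explored. All dimensions, variable names, and the linear least-squares projection below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: the source robot produces 12-D non-visual (e.g. haptic)
# features, the target robot 8-D. Both explored 40 shared objects, yielding
# paired feature vectors (one row per interaction).
n_shared, d_src, d_tgt = 40, 12, 8
X_src = rng.normal(size=(n_shared, d_src))            # source-robot features
W_true = rng.normal(size=(d_src, d_tgt))              # synthetic ground truth
X_tgt = X_src @ W_true + 0.01 * rng.normal(size=(n_shared, d_tgt))

# Stage 1: fit a projection from the source feature space to the target
# feature space via least squares on the shared objects.
W, *_ = np.linalg.lstsq(X_src, X_tgt, rcond=None)

# Stage 2: project source-robot features for objects the TARGET robot has
# not explored, giving the target training data in its own feature space,
# with which it can bootstrap a recognition model.
X_new_src = rng.normal(size=(5, d_src))
X_new_projected = X_new_src @ W                       # shape (5, d_tgt)

# Sanity check: relative reconstruction error on the shared objects.
err = np.linalg.norm(X_src @ W - X_tgt) / np.linalg.norm(X_tgt)
print(f"relative projection error: {err:.4f}")
```

A linear map is only the simplest instance; the same paired-object recipe applies with nonlinear projections (e.g., small neural encoder-decoders) when the two robots' sensor distributions differ more substantially.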



