Attention for Robot Touch: Tactile Saliency Prediction for Robust Sim-to-Real Tactile Control

07/26/2023
by Yijiong Lin, et al.

High-resolution tactile sensing can provide accurate information about local contact in contact-rich robotic tasks. However, the deployment of such tasks in unstructured environments remains under-investigated. To improve the robustness of tactile robot control in unstructured environments, we propose and study a new concept: tactile saliency for robot touch, inspired by the human touch attention mechanism from neuroscience and the visual saliency prediction problem from computer vision. In analogy to visual saliency, this concept involves identifying key information in tactile images captured by a tactile sensor. While visual saliency datasets are commonly annotated by humans, manually labelling tactile images is challenging due to their counterintuitive patterns. To address this challenge, we propose a novel approach comprising three interrelated networks: 1) a Contact Depth Network (ConDepNet), which generates a contact depth map to localize deformation in a real tactile image that contains target and noise features; 2) a Tactile Saliency Network (TacSalNet), which predicts a tactile saliency map describing the target areas for an input contact depth map; and 3) a Tactile Noise Generator (TacNGen), which generates noise features to train the TacSalNet. Experimental results in contact pose estimation and edge-following in the presence of distractors showcase the accurate prediction of target features from real tactile images. Overall, our tactile saliency prediction approach gives robust sim-to-real tactile control in environments with unknown distractors. Project page: https://sites.google.com/view/tactile-saliency/.
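The abstract describes a three-stage architecture. A minimal toy sketch of the data flow is given below; the stand-in functions and every threshold in them are invented for illustration and are not the paper's learned networks, which are trained models.

```python
import numpy as np

# Hypothetical sketch of the three-network pipeline from the abstract; the real
# ConDepNet/TacSalNet/TacNGen are learned models, so these stand-in functions
# only illustrate the data flow on a toy 8x8 tactile image.

def con_dep_net(tactile_image):
    # Stand-in for ConDepNet: tactile image -> contact depth map.
    # Treat intensity above a small baseline as local deformation depth.
    return np.clip(tactile_image - 0.2, 0.0, None)

def tac_sal_net(depth_map):
    # Stand-in for TacSalNet: contact depth map -> tactile saliency map.
    # Keep only strong (target-like) responses, then normalise to sum to 1.
    saliency = np.where(depth_map > 0.5 * depth_map.max(), depth_map, 0.0)
    return saliency / saliency.sum()

def tac_n_gen(shape):
    # Stand-in for TacNGen: synthesize a noise feature (a spurious contact)
    # of the kind used to train TacSalNet to ignore distractors.
    noise = np.zeros(shape)
    noise[0, 7] = 0.5  # weak spurious contact away from the target
    return noise

target = np.zeros((8, 8))
target[3:5, 3:5] = 1.0                       # the "target" contact feature
tactile_image = target + tac_n_gen((8, 8))   # real image = target + noise

depth = con_dep_net(tactile_image)           # localise deformation
saliency = tac_sal_net(depth)                # predict target-only saliency
```

In this sketch the saliency map retains only the target contact and zeros out the distractor, mirroring the role the abstract assigns to TacSalNet when fed ConDepNet's depth map.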


