The scaling of the gripper affects the action and perception in teleoperated grasping via a robot-assisted minimally invasive surgery system

by Amit Milstein, et al.

We use psychophysics to investigate human-centered transparency of grasping in unilateral robot-assisted minimally invasive surgery (RAMIS) without force feedback. Instead of the classical definition of transparency, we define a human-centered transparency, focusing on natural action and perception in RAMIS. Demonstrating this approach, we assess the effect of gripper scaling on human-centered transparency in teleoperated grasping of rigid objects. Thirty-one participants were divided into three groups, each with a different scaling between the opening of the gripper of the surgeon-side manipulator and that of the surgical instrument. Each participant performed two experiments: (1) an action experiment, reaching for and grasping different cylinders; and (2) a perception experiment, reporting the size of the cylinders. For two of the three gripper scalings, teleoperated grasping was similar to natural grasping. In the action experiment, the maximal grip aperture of the surgical instrument was proportional to the size of the grasped object, and its variability did not depend on object size, whereas in the perception experiment, consistent with Weber's law, the variability of perceived size increased with size. With the fine gripper scaling, both action and perception variabilities decreased with size, suggesting reduced transparency. These results suggest that in our RAMIS system, provided the gripper scaling is not too fine, grasping kinematics and the gap between action and perception are similar to those of natural grasping. This means that, in the context of grasping, our system induces natural grasping behavior and is human-centered transparent. We anticipate that using psychophysics to optimize human-centered teleoperation control will eventually improve the usability of RAMIS.
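The abstract's key signature of natural behavior is the dissociation it describes: perceptual variability grows with object size (Weber's law, sigma ≈ k·size), while grip-aperture variability stays roughly constant. A minimal simulation sketch of that contrast, with an assumed Weber fraction and motor-noise level (these parameters are illustrative, not taken from the paper):

```python
import random
import statistics

# Illustrative parameters (assumptions, not values from the study)
WEBER_FRACTION = 0.05   # perceptual noise grows as k * object size
ACTION_SD_MM = 2.0      # motor noise on grip aperture, constant across sizes

def simulate(size_mm, trials=10_000, seed=0):
    """Return (perception SD, action SD) for one object size."""
    rng = random.Random(seed)
    # Perception: noise proportional to stimulus magnitude (Weber's law)
    perceived = [rng.gauss(size_mm, WEBER_FRACTION * size_mm)
                 for _ in range(trials)]
    # Action: maximal grip aperture scales with size, but its noise does not
    apertures = [rng.gauss(size_mm * 1.2, ACTION_SD_MM)
                 for _ in range(trials)]
    return statistics.stdev(perceived), statistics.stdev(apertures)

for size in (20, 40, 60):
    p_sd, a_sd = simulate(size)
    print(f"object {size} mm: perception SD = {p_sd:.2f} mm, "
          f"action SD = {a_sd:.2f} mm")
```

Under these assumptions, perception SD roughly triples from the 20 mm to the 60 mm object while action SD barely moves; the paper's "fine scaling" condition broke this pattern, with both variabilities decreasing with size.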

