Eye Gaze Controlled Robotic Arm for Persons with SSMI

05/25/2020
by Vinay Krishna Sharma, et al.

Background: People with severe speech and motor impairment (SSMI) often use a technique called eye pointing to communicate with the outside world. A parent, caretaker or teacher holds a printed board in front of them and interprets their intentions by manually analyzing their eye gaze. This technique is error-prone and time-consuming, and depends on a single caretaker. Objective: We aimed to automate the eye-tracking process electronically using a commercially available tablet, computer or laptop, without requiring any dedicated eye gaze tracking hardware. The eye gaze tracker is used to develop a video see-through based augmented reality (AR) display that controls a robotic device with eye gaze, and the system was deployed for a fabric printing task. Methodology: We undertook a user-centred design process and separately evaluated the webcam-based gaze tracker and the video see-through based human-robot interaction with users with SSMI. We also report a user study on manipulating a robotic arm with the webcam-based eye gaze tracker. Results: Using our bespoke eye gaze controlled interface, able-bodied users could select one of nine screen regions in a median time of less than 2 s, and users with SSMI could do so in a median time of 4 s. Using the eye gaze controlled human-robot AR display, users with SSMI could undertake a representative pick-and-drop task in an average of less than 15 s and reach a randomly designated target within 60 s using a COTS eye tracker, and in an average time of 2 min using the webcam-based eye gaze tracker.

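The paper itself does not include implementation code; the sketch below is only a hypothetical illustration of the nine-region, gaze-based selection scheme mentioned in the abstract. It assumes a gaze estimator (webcam- or COTS-tracker-based) that already yields normalized screen coordinates; the 3x3 grid mapping, the DWELL_SECONDS threshold, and the on_select callback are illustrative assumptions, not the authors' parameters or interface.

import time

# Hypothetical sketch: map a stream of normalized gaze points (x, y in [0, 1])
# onto a 3x3 grid of screen regions and trigger a selection once the gaze has
# dwelled on one region long enough. The gaze estimator itself is assumed to
# exist elsewhere and is not shown here.

GRID_ROWS, GRID_COLS = 3, 3   # nine on-screen regions, as described in the abstract
DWELL_SECONDS = 1.0           # assumed dwell threshold, not taken from the paper

def gaze_to_region(x, y):
    """Return the index (0-8) of the grid cell containing the gaze point."""
    col = min(int(x * GRID_COLS), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS), GRID_ROWS - 1)
    return row * GRID_COLS + col

def run_selection_loop(gaze_stream, on_select):
    """Call on_select(region) once gaze dwells on a single region long enough."""
    current_region, dwell_start = None, None
    for x, y in gaze_stream:              # e.g. per-frame estimates from a webcam tracker
        region = gaze_to_region(x, y)
        now = time.monotonic()
        if region != current_region:      # gaze moved to a new region: restart the dwell timer
            current_region, dwell_start = region, now
        elif now - dwell_start >= DWELL_SECONDS:
            on_select(region)             # e.g. issue the corresponding robot arm command
            dwell_start = now             # reset so the selection does not re-fire every frame

In a real deployment, gaze_stream would be fed by the gaze estimator and on_select would translate the chosen region into a command for the robotic arm over whatever interface the arm exposes.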