Gaze-in-wild: A dataset for studying eye and head coordination in everyday activities

06/18/2020
by Rakshit Kothari, et al.

The study of gaze behavior has primarily been constrained to controlled environments in which the head is fixed. Consequently, little effort has been invested in developing algorithms for the categorization of gaze events (e.g., fixations, pursuits, saccades, gaze shifts) while the head is free and thus contributes to the velocity signals upon which classification algorithms typically operate. Our approach was to collect a novel, naturalistic, and multimodal dataset of eye + head movements as subjects performed everyday tasks while wearing a mobile eye tracker equipped with an inertial measurement unit and a 3D stereo camera. This Gaze-in-the-Wild dataset (GW) includes eye + head rotational velocities (deg/s), infrared eye images, and scene imagery (RGB + D). A portion was labelled by coders into gaze motion events, with a mutual agreement of 0.74 sample-based Cohen's κ. This labelled data was used to train and evaluate two machine learning algorithms, a Random Forest and a Recurrent Neural Network model, for gaze event classification. Assessment involved the application of established and novel event-based performance metrics. The classifiers achieve ~87% of human performance in detecting fixations and saccades but fall short (50%) in detecting pursuit movements. Moreover, pursuit classification is far worse in the absence of head movement information. A subsequent analysis of feature significance in our best performing model revealed that classification can be done using only the magnitudes of eye and head movements, potentially removing the need for calibration between the head and eye tracking systems. The GW dataset, trained classifiers, and evaluation metrics will be made publicly available with the intention of facilitating growth in the emerging area of head-free gaze event classification.
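The abstract's key finding, that gaze events can be classified from just the magnitudes of eye and head velocities, lends itself to a short sketch. The Python snippet below is a minimal, hypothetical illustration using scikit-learn with synthetic stand-in data; it is not the authors' released code, and the feature layout and label scheme here are assumptions rather than the GW schema. It also computes sample-based Cohen's κ, the same agreement statistic quoted above for the human coders.

```python
# Minimal sketch (not the authors' code): train a Random Forest to label
# gaze events from eye and head angular-speed magnitudes, mirroring the
# feature-significance finding described in the abstract. All data here
# is synthetic, so the printed numbers carry no scientific meaning.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Stand-in features: |eye velocity| and |head velocity| in deg/s.
# In the real dataset these would come from the eye tracker and the IMU.
X = np.column_stack([
    rng.gamma(2.0, 20.0, n),   # eye speed (deg/s)
    rng.gamma(2.0, 5.0, n),    # head speed (deg/s)
])
# Stand-in labels: 0 = fixation, 1 = pursuit, 2 = saccade, 3 = gaze shift.
y = rng.integers(0, 4, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)

# Sample-based Cohen's kappa: chance-corrected per-sample agreement,
# the statistic the paper reports between human coders (0.74 above).
pred = clf.predict(X_te)
print("Cohen's kappa vs. held-out labels:", cohen_kappa_score(y_te, pred))
print("Feature importances [eye, head]:", clf.feature_importances_)
```

With real eye-in-head and head-in-world speeds substituted for the synthetic arrays, the feature importances would indicate how much each magnitude contributes, which is the kind of analysis the abstract uses to argue that head/eye calibration may be unnecessary for classification.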


Related research

12/20/2016 · A Statistical Approach to Continuous Self-Calibrating Eye Gaze Tracking for Head-Mounted Virtual Reality Systems
We present a novel, automatic eye gaze tracking scheme inspired by smoot...

05/18/2018 · Recognition of Activities from Eye Gaze and Egocentric Video
This paper presents a framework for recognition of human activity from e...

01/22/2022 · MIDAS: Deep learning human action intention prediction from natural eye movement patterns
Eye movements have long been studied as a window into the attentional me...

03/06/2022 · A Rule-Based System for ADHD Identification using Eye Movement Data
Attention Deficit Hyperactivity Disorder (ADHD) is one of the common psy...

04/12/2023 · Bridging the Gap: Gaze Events as Interpretable Concepts to Explain Deep Neural Sequence Models
Recent work in XAI for eye tracking data has evaluated the suitability o...

06/11/2021 · States of confusion: Eye and Head tracking reveal surgeons' confusion during arthroscopic surgery
During arthroscopic surgeries, surgeons are faced with challenges like c...
