MotionInput v2.0 supporting DirectX: A modular library of open-source gesture-based machine learning and computer vision methods for interacting with and controlling existing software

by Ashild Kummen et al.

Touchless computer interaction has become an important consideration during the COVID-19 pandemic. Despite progress in machine learning and computer vision that allows for advanced gesture recognition, an integrated collection of such open-source methods, combined with a user-customisable, low-cost approach to touchless interaction with existing software, is still missing. In this paper, we introduce the MotionInput v2.0 application. It combines published open-source libraries with additional gesture definitions to take the video stream from a standard RGB webcam as input, and maps human motion gestures to input operations for existing applications and games. The user can choose their preferred way of interacting from a series of motion types, including single and bi-modal hand gesturing, full-body repetitive or extremities-based exercises, head and facial movements, eye tracking, and combinations of the above. We also introduce a series of bespoke gesture recognition classifications as DirectInput triggers, including gestures for idle states, auto-calibration, depth capture from a 2D RGB webcam stream, and tracking of facial motions such as mouth movements, winking, and head direction with rotation. Three use-case areas guided the development of the modules: creativity software, office and clinical software, and gaming software. A collection of open-source libraries has been integrated, providing a layer of modular gesture mapping on top of existing mouse and keyboard controls in Windows via DirectX. With webcams already integrated into most laptops and desktop computers, touchless computing becomes more accessible with MotionInput v2.0, processed federatedly and locally on the user's machine.
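The modular gesture-mapping layer described above can be illustrated with a minimal sketch. The gesture labels, event codes, and class names below are illustrative assumptions, not MotionInput's actual API: a recogniser emits gesture labels, and a mapper translates bound labels into synthetic input events (which, on Windows, would be injected via a DirectInput/SendInput wrapper). Unbound labels, such as an idle state, are deliberately ignored.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass(frozen=True)
class InputEvent:
    """A synthetic input event to forward to the OS input layer."""
    kind: str  # "key" or "mouse"
    code: str  # e.g. "SPACE", "LEFT_CLICK"


class GestureMapper:
    """Maps recognised gesture labels to input events.

    In a real deployment the emitted events would be injected into
    Windows via a DirectInput/SendInput wrapper; here they are only
    recorded, so the mapping logic stays testable in isolation.
    """

    def __init__(self) -> None:
        self._bindings: Dict[str, InputEvent] = {}
        self.emitted: List[InputEvent] = []

    def bind(self, gesture: str, event: InputEvent) -> None:
        """Associate a gesture label with an input event."""
        self._bindings[gesture] = event

    def on_gesture(self, gesture: str) -> None:
        """Handle one recognised gesture; unbound labels are ignored."""
        event = self._bindings.get(gesture)
        if event is not None:
            self.emitted.append(event)


# Example: bind a pinch gesture to a left click and a wink to spacebar,
# then feed in a stream of recognised labels (including an idle state).
mapper = GestureMapper()
mapper.bind("pinch", InputEvent("mouse", "LEFT_CLICK"))
mapper.bind("wink", InputEvent("key", "SPACE"))
for label in ["pinch", "idle", "wink"]:
    mapper.on_gesture(label)
```

Keeping the recogniser and the mapper decoupled in this way is what lets each user rebind their preferred motion type (hand, face, eye, or full-body) to the same underlying mouse and keyboard controls.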


