Sign Language Recognition Using Temporal Classification

01/07/2017
by Hardie Cate, et al.

Devices like the Myo armband available in the market today enable us to collect data about the position of a user's hands and fingers over time. We can use these technologies for sign language translation, since each sign is roughly a combination of gestures across time. In this work, we utilize a dataset collected by a group at the University of South Wales, which contains parameters such as hand position, hand rotation, and finger bend for 95 unique signs. For each input stream representing a sign, we predict which sign class this stream falls into. We begin by implementing baseline SVM and logistic regression models, which perform reasonably well on high-quality data. Lower-quality data requires a more sophisticated approach, so we explore different methods in temporal classification, including long short-term memory (LSTM) architectures and sequential pattern mining methods.
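As a rough illustration of the temporal classification setup described above (not the authors' implementation), the sketch below shows an LSTM that maps a variable-length stream of per-frame hand features to one of 95 sign classes. The per-frame feature size, hidden size, and all hyperparameters here are illustrative assumptions.

```python
# Minimal sketch of an LSTM sign classifier over variable-length sequences.
# FEATURE_DIM is an assumed per-frame feature size (hand position, rotation,
# finger bend); 95 matches the number of unique signs in the dataset.
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence

NUM_CLASSES = 95   # unique signs
FEATURE_DIM = 22   # assumed per-frame feature dimension

class SignLSTM(nn.Module):
    def __init__(self, feature_dim=FEATURE_DIM, hidden_dim=128, num_classes=NUM_CLASSES):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, frames, lengths):
        # frames: (batch, max_len, feature_dim); lengths: true frame counts per sign
        packed = pack_padded_sequence(frames, lengths.cpu(),
                                      batch_first=True, enforce_sorted=False)
        _, (h_n, _) = self.lstm(packed)   # final hidden state summarizes the sequence
        return self.classifier(h_n[-1])   # class logits, one per sign class

# Example forward pass on random data standing in for recorded sign streams.
model = SignLSTM()
batch = torch.randn(4, 60, FEATURE_DIM)      # 4 signs, up to 60 frames each
lengths = torch.tensor([60, 45, 52, 38])
logits = model(batch, lengths)
predicted_sign = logits.argmax(dim=1)        # predicted class index per stream
```

Packing the padded batch lets the LSTM ignore padding frames, so signs of different durations can be classified in a single batch.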
