Deep Learning for Electromyographic Hand Gesture Signal Classification by Leveraging Transfer Learning
In recent years, the use of deep learning algorithms has become increasingly prominent. Within the field of electromyography-based gesture recognition, however, deep learning algorithms are seldom employed. This is due in part to the large quantity of data required for the network to train on. The data sparsity arises from the fact that it would take an unreasonable amount of time for a single person to generate tens of thousands of examples for training such algorithms. In this paper, two datasets are recorded with the Myo Armband (Thalmic Labs), a low-cost, low-sampling-rate (200 Hz), 8-channel, consumer-grade, dry-electrode sEMG armband. These datasets, referred to as the pre-training and evaluation datasets, comprise 19 and 17 able-bodied participants respectively. A convolutional network (ConvNet) is augmented with transfer learning techniques to leverage inter-user data from the first dataset, alleviating the burden imposed on a single individual to generate a vast quantity of training data for sEMG-based gesture recognition. This transfer learning scheme is shown to outperform the current state of the art in gesture recognition, achieving an average accuracy of 98.31% for 7 gestures over 17 able-bodied participants. Finally, a use-case study of eight able-bodied participants is presented to evaluate the impact of feedback on the degradation in accuracy normally experienced from a classifier over time.
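As a rough illustration of the transfer-learning workflow described in the abstract, the PyTorch sketch below pre-trains nothing itself; it assumes a ConvNet whose weights were already fitted on the multi-user pre-training dataset and adapts them to a single target participant. The layer sizes, window length, class count, layer-freezing scheme, and hyperparameters are illustrative assumptions, not the paper's actual architecture or training procedure.

```python
# Minimal sketch, assuming 8-channel sEMG windows and 7 gesture classes.
# EmgConvNet and fine_tune_on_target_user are hypothetical names.
import torch
import torch.nn as nn


class EmgConvNet(nn.Module):
    def __init__(self, n_channels: int = 8, n_classes: int = 7):
        super().__init__()
        # Treat an sEMG window as a 1-D signal with 8 input channels.
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):  # x: (batch, channels, time)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)


def fine_tune_on_target_user(pretrained: EmgConvNet, target_loader,
                             epochs: int = 10, lr: float = 1e-4) -> EmgConvNet:
    """Adapt weights pre-trained on the multi-user dataset to one new user."""
    model = EmgConvNet()
    model.load_state_dict(pretrained.state_dict())
    # Freeze the first convolutional block (an assumption, not the paper's
    # exact scheme) so only the higher layers adapt to the target user.
    for p in model.features[:3].parameters():
        p.requires_grad = False
    opt = torch.optim.Adam(
        (p for p in model.parameters() if p.requires_grad), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in target_loader:  # small per-user labeled set
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model
```

The point of the sketch is the division of labor: the expensive pre-training consumes the 19-participant dataset once, while each new user only supplies the small fine-tuning set consumed by `fine_tune_on_target_user`, which is the burden reduction the abstract describes.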