
Action Recognition for American Sign Language

Authors

Abstract

In this research, we present our findings on recognizing American Sign Language
from series of hand gestures. While most research in the literature
focuses only on static handshapes, our work targets dynamic hand gestures.
Since datasets of dynamic signs are scarce, we collect an initial dataset of
150 videos for 10 signs and an extension of 225 videos for 15 signs. We
apply transfer learning models in combination with deep neural networks
and background subtraction to videos in different temporal settings. Our
preliminary results show accuracies of 0.86 and 0.71 using DenseNet201
and LSTM, respectively, with video sequences of 12 frames.
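The abstract does not specify the framework or exact architecture, but one plausible reading of the pipeline is DenseNet201 used as a frozen per-frame feature extractor (transfer learning) feeding an LSTM over the 12-frame sequence. The sketch below illustrates that reading; the use of Keras, the 224x224 input size, the LSTM width, and the frozen backbone are all assumptions, not the authors' published configuration.

```python
# A minimal sketch (not the authors' exact model): DenseNet201 features
# per frame, an LSTM over a 12-frame sequence, and a softmax over the
# 15 sign classes. Framework, input size, and layer widths are assumed.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import DenseNet201

NUM_FRAMES, HEIGHT, WIDTH, NUM_SIGNS = 12, 224, 224, 15

# Frozen ImageNet backbone as a per-frame feature extractor:
# each frame maps to a 1920-d globally pooled feature vector.
backbone = DenseNet201(include_top=False, weights="imagenet",
                       pooling="avg", input_shape=(HEIGHT, WIDTH, 3))
backbone.trainable = False

model = models.Sequential([
    layers.Input(shape=(NUM_FRAMES, HEIGHT, WIDTH, 3)),
    layers.TimeDistributed(backbone),           # (12, 1920) feature sequence
    layers.LSTM(256),                           # temporal aggregation
    layers.Dense(NUM_SIGNS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Training such a model would take batches of shape (batch, 12, 224, 224, 3), i.e. 12 sampled frames per sign video, optionally after the background subtraction step mentioned in the abstract.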

Keywords

Deep Learning, Action Recognition, American Sign Language, Transfer Learning

Conference

24th Portuguese Conference on Pattern Recognition (RECPAD), October 2018

