Many deaf students whose first language is ASL want to learn English, not only to read and write but also to communicate with people who may not know ASL. To address this problem, our team, DASL (Digital American Sign Language), aims to create an on-device, real-time ASL translator. We have brought this idea to life using Google's MediaPipe framework. In the future we hope to build our own CNN (for image classification) and RNN (for video classification) to translate both static and dynamic gestures, and possibly even full sentences if time permits.
ASL detection using RNN
The entire code base can be found here.
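To make the RNN idea concrete, here is a minimal, illustrative sketch (not the project's actual model): an Elman-style RNN written with NumPy that classifies a sequence of MediaPipe hand landmarks (21 points with x, y, z coordinates, so 63 features per frame) into one of a few gesture classes. The weight matrices, hidden size, and class count are all hypothetical, and the weights are randomly initialised; a real translator would train them on labelled ASL video data.

```python
import numpy as np

N_FEATURES = 63   # 21 hand landmarks * (x, y, z) per frame
N_HIDDEN = 32     # hypothetical hidden-state size
N_CLASSES = 5     # hypothetical number of gesture classes

rng = np.random.default_rng(0)
# Randomly initialised weights stand in for a trained model.
W_xh = rng.normal(0, 0.1, (N_FEATURES, N_HIDDEN))
W_hh = rng.normal(0, 0.1, (N_HIDDEN, N_HIDDEN))
W_hy = rng.normal(0, 0.1, (N_HIDDEN, N_CLASSES))

def classify_sequence(frames: np.ndarray) -> int:
    """Classify a clip given as a (T, 63) array of per-frame landmark vectors."""
    h = np.zeros(N_HIDDEN)
    for x in frames:
        # Recurrent update: combine the current frame with the previous state.
        h = np.tanh(x @ W_xh + h @ W_hh)
    logits = h @ W_hy  # read out a class score from the final hidden state
    return int(np.argmax(logits))

# A fake 30-frame clip standing in for landmarks extracted from real video.
clip = rng.normal(size=(30, N_FEATURES))
pred = classify_sequence(clip)
```

Because the hidden state is carried across frames, the prediction can depend on motion over time, which is what distinguishes dynamic gestures from static ones handled by a CNN.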