American Sign Language

Detecting ASL using an RNN

Posted by Sneha Mahapatra on January 31, 2020 · 1 min read

Overview

Many deaf students whose first language is ASL want to learn English, not only to read and write but also to communicate with people who may not know ASL. To help address this, our team DASL (Digital American Sign Language) aims to create an on-device, real-time ASL translator. The idea has become a reality through Google's MediaPipe framework. In the future we hope to create our own CNN (for image classification) and RNN (for video classification) to translate both static and dynamic gestures, and possibly even sentences if time permits.
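To make the CNN/RNN split concrete, the sketch below shows a minimal Keras RNN that could classify a dynamic gesture from a sequence of hand landmarks (the kind of per-frame output MediaPipe's hand tracking produces). This is only an illustration under assumed shapes and class counts, not the DASL team's actual model.

```python
# Hypothetical sketch: an LSTM classifier over sequences of hand landmarks.
# Shapes and class count are illustrative assumptions, not the project's values.
import tensorflow as tf

NUM_FRAMES = 30        # frames per gesture clip (assumed window length)
NUM_FEATURES = 21 * 3  # 21 hand landmarks, each with (x, y, z)
NUM_CLASSES = 10       # number of gesture labels (assumed)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FRAMES, NUM_FEATURES)),
    tf.keras.layers.LSTM(64, return_sequences=True),  # per-frame temporal features
    tf.keras.layers.LSTM(32),                          # summary of the whole gesture
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

A CNN would play the analogous role for static gestures, classifying a single frame rather than a sequence.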


ASL detection using RNN

PowerPoint Presentation

This PowerPoint explains, at a high level, how the project was created.

Hand Gesture Recognition Calculator

This is one file in the repository that shows how we embedded the ML model in the application and called it at runtime.
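The actual file is a MediaPipe hand gesture recognition calculator; purely as an illustration of the "load the trained model, feed it landmarks, read back a prediction" flow, here is a hedged Python sketch using a converted TFLite model. The file name and tensor shape are assumptions, not the repository's real values.

```python
# Hypothetical sketch of calling a gesture model exported to TFLite.
# "gesture_model.tflite" and the (1, 30, 63) landmark shape are assumed.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="gesture_model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# One clip of hand landmarks: batch of 1, 30 frames, 63 features (dummy data here).
landmarks = np.zeros((1, 30, 63), dtype=np.float32)

interpreter.set_tensor(input_details[0]["index"], landmarks)
interpreter.invoke()
probabilities = interpreter.get_tensor(output_details[0]["index"])
print("Predicted gesture id:", int(np.argmax(probabilities)))
```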

Project

The entire code base can be found here.