Build a model that recognizes the hand sign language used by deaf and mute people. Live video from a mobile or web camera should recognize the sign language in real time.
Right now, there are more than 38 crore (380 million) people in the world who are deaf or mute (we'll address them as specially abled). These people often find themselves amidst a sea of loneliness, as others find it difficult to communicate with them since the sign language they use is hard to grasp.
Also, there is no app on the market that recognizes sign-language gestures in real time and translates them for a common user, so that he/she can understand a specially abled person and also befriend them.
Team D-VPN brings you Gesture Talk: a user-friendly app that uses Machine Learning to decode, in real time, what a specially abled person is trying to convey through hand gestures in sign language.
With an attractive and user-friendly UI in the native app that our Machine Learning model will be incorporated into, we plan to bring everyone closer, regardless of their ability and way of communication. The market validation for such a product is very high, as no solution exists other than learning sign language from scratch.
- Attractive & User-friendly UI
- Model Predicts the Entire Sign-Language Alphabet and All Digits
- Real-time translation into words/letters
- Voice assistant enabled
- Model trained on almost 90,000 images in total
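To illustrate the "real-time translation into words/letters" feature above, here is a minimal, hypothetical sketch (not the app's actual code) of how per-frame letter predictions from a classifier could be smoothed into stable text, assuming the model emits one predicted letter per video frame:

```python
from collections import deque, Counter

class SentenceBuilder:
    """Turns a stream of per-frame letter predictions into text.

    Hypothetical smoothing logic: a letter is accepted only when it
    dominates the last `window` frames, which filters out the jitter
    of frame-by-frame classification.
    """

    def __init__(self, window=5, threshold=0.8):
        self.window = window
        self.threshold = threshold
        self.frames = deque(maxlen=window)
        self.text = []

    def feed(self, letter):
        """Consume one per-frame prediction; commit a letter when stable."""
        self.frames.append(letter)
        if len(self.frames) < self.window:
            return
        top, count = Counter(self.frames).most_common(1)[0]
        if count / self.window >= self.threshold:
            # Skip immediate repeats so a held gesture isn't duplicated
            # (doubled letters would need an explicit "pause" gesture).
            if not self.text or self.text[-1] != top:
                self.text.append(top)
            self.frames.clear()

    def sentence(self):
        return "".join(self.text)
```

In the app, `feed()` would be called with each frame's predicted class; the design choice of a majority vote over a sliding window is a common, simple alternative to more elaborate temporal models.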
Screenshots: Home Page | Predictions | Real-Time Sentence Formation
- Turkey Ankara Ayrancı Anadolu High School's Sign Language Digits Dataset (Digits)
- ASL Alphabet (Alphabets)
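Combining the two datasets above implies a unified label space of digits plus letters. The sketch below is an assumption about how the class indices might be laid out (the actual ordering in the trained model may differ); the "space"/"del"/"nothing" control classes come from the public ASL Alphabet dataset:

```python
import string

# Hypothetical unified label space: the Sign Language Digits dataset
# contributes classes 0-9, and the ASL Alphabet dataset contributes
# A-Z plus its control gestures.
DIGITS = [str(d) for d in range(10)]
LETTERS = list(string.ascii_uppercase)
SPECIALS = ["space", "del", "nothing"]  # ASL Alphabet's extra classes

CLASSES = DIGITS + LETTERS + SPECIALS
INDEX_TO_LABEL = dict(enumerate(CLASSES))
LABEL_TO_INDEX = {label: i for i, label in INDEX_TO_LABEL.items()}

def decode(index):
    """Map a model's output class index back to its gesture label."""
    return INDEX_TO_LABEL[index]
```

A classifier head over this space would have `len(CLASSES)` output units, and `decode()` would translate its argmax into the character shown to the user.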
Use-Case | Technology |
---|---|
Languages | |
Machine Learning | |
Frontend | |
Tools | |
OS | |
All of us are IT engineers in the making at K. J. Somaiya College of Engineering.