Real-Time Sign Detection and Motion Prediction System for ISL Using Deep Learning

  • Shraddha Bhaware, Prof. M. S. Wakode

Abstract

Hand gestures are a nonverbal form of communication used alongside speech to convey information. A system of communication in which hand gestures carry the information being exchanged is called a sign language; in such a language, each letter of the English alphabet is assigned a unique sign. The main objective of this project is to develop a system that supports verbally challenged people. The proposed system uses a neural-network-based mechanism for real-time dynamic gesture recognition on an Indian Sign Language (ISL) dataset of English alphabets: it accepts a live video stream as input and displays the predicted text for each detected, trained sign. The live stream is divided into fragments for feature extraction, and the person performing the gesture is identified as the subject. The background is then disregarded and the hand is detected in every frame of the input video. Once a hand sign is detected, the motion of the hand is tracked and the corresponding text is displayed. This motion data is accumulated from the live video stream using OpenCV and a CNN-based machine learning model, and is represented as interval-valued data. A classifier is then built from these statistics. Testing is performed to measure the efficiency of the system: a test input is checked against the learned interval ranges and, if it falls within them, is confirmed as a particular gesture.
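
A minimal sketch of the pipeline described above, assuming a pre-trained Keras CNN file ("sign_cnn.h5"), an alphabet label list, and a 64x64 input size; these names and parameters are illustrative assumptions, not the authors' actual artefacts.

```python
# Hedged sketch: live capture -> background subtraction -> hand region ->
# CNN prediction -> predicted text overlay. Model file, labels, and input
# size are assumed for illustration.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

LABELS = [chr(c) for c in range(ord('A'), ord('Z') + 1)]  # assumed alphabet signs
model = load_model("sign_cnn.h5")                          # assumed trained CNN
bg_subtractor = cv2.createBackgroundSubtractorMOG2()       # disregard static background

cap = cv2.VideoCapture(0)                                  # live video stream
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Keep only moving (foreground) pixels, which isolates the gesturing hand.
    mask = bg_subtractor.apply(frame)
    hand = cv2.bitwise_and(frame, frame, mask=mask)
    # Prepare the frame for the CNN (assumed 64x64 RGB input, scaled to [0, 1]).
    patch = cv2.resize(hand, (64, 64)).astype("float32") / 255.0
    probs = model.predict(patch[np.newaxis, ...], verbose=0)[0]
    sign = LABELS[int(np.argmax(probs))]
    # Display the predicted text for the detected sign on the live frame.
    cv2.putText(frame, sign, (10, 40), cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
    cv2.imshow("ISL sign prediction", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```

In this sketch the per-frame prediction stands in for the full motion-based, interval-valued classification step; in the proposed system, features accumulated over several frames would be compared against the learned interval ranges before a gesture is confirmed.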

Published
2019-12-25
Section
Articles