Automatic Conversion of Indian Sign Language (ISL) to Natural Language: A Case Study


Indian Sign Language (ISL) serves as a medium of communication for deaf and hard-of-hearing people, who constitute a significant portion of the Indian population. Most hearing people find it difficult to interpret ISL gestures, which creates a communication gap between the hearing and speech impaired and those who do not understand ISL. This project aims to bridge that gap by developing a model that converts Indian Sign Language to text. The MediaPipe Python library provides detection solutions for hands, faces, and more; MediaPipe Hands outputs 21 landmarks per hand, and these landmarks can be used to segment the hand region. A sign language gesture is represented as a video sequence consisting of spatial and temporal features: spatial features are extracted from individual frames, while temporal features are extracted by relating the frames to one another over time. A model can be trained on the spatial features using a CNN and on the temporal features using an RNN to convert Indian Sign Language to text.
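As a rough illustration of the landmark-extraction step, the sketch below uses the MediaPipe Hands solution to read the 21 landmarks it reports for each detected hand in a video frame. The input file name and confidence thresholds are placeholder assumptions, not details from the talk.

```python
# Minimal sketch: extracting 21 hand landmarks per detected hand with MediaPipe Hands.
# The video path and thresholds are illustrative assumptions.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture("isl_gesture.mp4")  # hypothetical input clip
with mp_hands.Hands(static_image_mode=False,
                    max_num_hands=2,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # 21 normalised (x, y, z) landmarks per hand; these points can be
                # used to locate and segment the hand region in the frame.
                coords = [(lm.x, lm.y, lm.z) for lm in hand.landmark]
                print(len(coords), "landmarks, first point:", coords[0])
cap.release()
```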
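The combination of spatial and temporal features could look roughly like the sketch below: a TimeDistributed CNN extracts per-frame (spatial) features and an LSTM relates them across time. The frame count, image size, and sign vocabulary size are assumed placeholders; the talk does not specify the exact architecture.

```python
# Sketch of a CNN + RNN classifier for sign-gesture videos (Keras).
# Shapes and hyperparameters are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

FRAMES, HEIGHT, WIDTH, CHANNELS = 30, 64, 64, 3   # assumed clip shape
NUM_SIGNS = 50                                     # assumed vocabulary size

model = models.Sequential([
    layers.Input(shape=(FRAMES, HEIGHT, WIDTH, CHANNELS)),
    # CNN applied to every frame: spatial features.
    layers.TimeDistributed(layers.Conv2D(32, 3, activation="relu")),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Conv2D(64, 3, activation="relu")),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Flatten()),
    # RNN over the per-frame feature vectors: temporal features.
    layers.LSTM(128),
    layers.Dense(NUM_SIGNS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```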
Speaker(s): Prof. Babu
Virtual: https://events.vtools.ieee.org/m/385082

