To develop an application that enables specially abled people to convey their hand sign or gesture language as speech, and that helps an ordinary person translate speech into sign language, making communication more fluent.
Developed a real-time Indian Sign Language recognition system using Python, TensorFlow, Keras, and OpenCV! 🤖👏 This system can accurately recognize and interpret alphabets and numbers, making communication more accessible for the deaf and hard-of-hearing community.
This is the second portfolio project of Code Institute's Diploma in Full-Stack Software Development course. It uses HTML, CSS, and JavaScript to create an interactive website; Guess the Indian Sign Language Alphabet was chosen as the main idea built on in this project.
WeSpeak is primarily an Indian Sign Language interpretation tool that uses machine learning and image processing to translate sign language in real time.
This project aims to detect and recognize Indian Sign Language (ISL) gestures in real time using the MediaPipe library and an artificial neural network.
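A common way to connect MediaPipe to an ANN classifier is to convert the 21 hand landmarks MediaPipe Hands produces per hand into a normalized feature vector. The sketch below is an assumption about that preprocessing step, not the project's confirmed pipeline; the function name `landmarks_to_features` is hypothetical, and the actual classifier (a Keras model) is omitted.

```python
# Sketch: turning MediaPipe-style hand landmarks into an ANN feature vector.
# MediaPipe Hands yields 21 landmarks per hand; here we use only (x, y),
# translate them relative to the wrist (landmark 0), and scale them so the
# features are roughly position- and size-invariant before classification.

def landmarks_to_features(landmarks):
    """Normalize 21 (x, y) landmark pairs relative to the wrist (landmark 0)."""
    wrist_x, wrist_y = landmarks[0]
    rel = [(x - wrist_x, y - wrist_y) for x, y in landmarks]
    # Scale by the largest absolute coordinate so features lie in [-1, 1].
    scale = max(max(abs(x), abs(y)) for x, y in rel) or 1.0
    return [v / scale for pair in rel for v in pair]

# Example: 21 dummy landmarks along a diagonal line.
features = landmarks_to_features([(i * 0.01, i * 0.02) for i in range(21)])
print(len(features))  # 42 features = 21 landmarks x (x, y)
```

A vector like this would then be fed to a small dense network that outputs one class per ISL alphabet or number.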