We identified a genuine problem in sign language recognition: enabling two-way communication between a hearing person and a person who is deaf or mute. Current sign language recognition applications lack basic features needed for everyday interaction with the environment. Our project focuses on providing a portable and customizable solution for understanding sign language through an Android app. This report summarizes the basic concepts and methods used in creating the Android application, which uses gesture recognition to understand American Sign Language words. The project applies image processing techniques to segment the hand from the rest of the scene and then uses pattern recognition techniques to classify the gesture. A complete summary of the results obtained from the various tests performed is also provided to demonstrate the validity of the application.
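To make the segmentation step concrete, the sketch below shows one common way a hand can be separated from the background before the pattern recognition stage: skin-colour thresholding with OpenCV. The report does not specify the exact tools or thresholds used, so the function name segment_hand and the HSV range here are illustrative assumptions rather than the project's actual implementation.

    # Illustrative sketch (assumed, not the report's exact pipeline):
    # segmenting the hand region with a simple HSV skin-colour threshold.
    import cv2
    import numpy as np

    def segment_hand(frame_bgr):
        """Return a binary mask and the largest skin-coloured contour (assumed to be the hand)."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        # Rough skin-colour range in HSV; real values need tuning per camera and lighting.
        lower = np.array([0, 30, 60], dtype=np.uint8)
        upper = np.array([20, 150, 255], dtype=np.uint8)
        mask = cv2.inRange(hsv, lower, upper)
        # Morphological opening/closing removes noise so the hand forms one connected region.
        kernel = np.ones((5, 5), np.uint8)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        hand_contour = max(contours, key=cv2.contourArea) if contours else None
        return mask, hand_contour

The resulting mask or contour can then be passed to a gesture classifier; on Android, the same steps are available through the OpenCV Java/Kotlin bindings.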