Lip Motion based Alphabet Recognition using Neural Network

Disha George, Yogesh Rathore


Facial expressions in an image sequence convey information about what the speaker is saying, but representing them in an animation system is a challenge. This problem has great impact in the field of audio-visual speech recognition (AVSR). Some people are capable of lip reading by interpreting lip motion. This research is divided into two levels: (i) first, frames are captured and features extracted from these frames are stored in a database as a reference standard; (ii) second, test image samples are fed to a neural network, which compares the spoken alphabet against the trained images stored in the database to recognize what the person has spoken. A lip-reading system has been developed using K-Means clustering on the input images, and 60% success has been achieved on alphabets with similar lip movements (such as u, o, q, b, e, i, l, n).
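The clustering step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature vectors, dimensions, and initialization are invented stand-ins for the lip-region features the paper extracts from video frames.

```python
import numpy as np

def kmeans(features, k, iters=10):
    """Minimal K-Means: group frame feature vectors into k clusters."""
    # Deterministic init: spread the initial centroids evenly through the data
    # (the paper does not specify its initialization; this is an assumption).
    centroids = features[:: max(1, len(features) // k)][:k].copy()
    for _ in range(iters):
        # Assign every feature vector to its nearest centroid.
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of the vectors assigned to it.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = features[labels == j].mean(axis=0)
    return labels, centroids

# Synthetic stand-in for extracted lip-region features of two utterances.
rng = np.random.default_rng(1)
frames_a = rng.normal(0.0, 0.1, (30, 8))  # hypothetical frames of one alphabet
frames_o = rng.normal(1.0, 0.1, (30, 8))  # hypothetical frames of another
labels, _ = kmeans(np.vstack([frames_a, frames_o]), k=2)
```

In the system described, the resulting cluster centroids (rather than synthetic vectors) would serve as the per-alphabet reference features stored in the database, against which the neural network matches test samples.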


Keywords: Lip Motion, K-Means Clustering, Pattern Recognition, Artificial Neural Network


Copyright (c) 2016 International Journal of Advanced Research in Computer Science