SMART APPLICATION FOR THE VISUALLY IMPAIRED


Olumide O. Obe
Akinwonmi A. E.

Abstract

In this work, an application was developed that recognizes objects in images captured by the camera of a mobile device. An Android phone camera was used to photograph objects, which were stored in the application's database together with an audio recording of each object's name. The Scale-Invariant Feature Transform (SIFT) was applied in developing the application, and to improve its performance, one of the fastest corner detection algorithms, Features from Accelerated Segment Test (FAST), was also implemented. Since the algorithm runs on a smartphone, OpenCV for the Android SDK was used. SIFT detects scale-invariant keypoints with a cascade of filters, computing the difference of Gaussians (DoG) on progressively rescaled images. SURF, by contrast, uses a blob detector based on the Hessian matrix to find points of interest: the determinant of the Hessian measures local change around a candidate point, points are chosen where this determinant is maximal, and the same determinant is also used to select scale. Object recognition results are presented to the blind user audibly through a pre-recorded message. The system achieved 97% recognition accuracy.
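The two keypoint responses named in the abstract can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' implementation (which uses OpenCV for Android): `difference_of_gaussians` mimics the SIFT-style DoG response by blurring at two nearby scales and subtracting, and `hessian_determinant` mimics the SURF-style blob response `det(H) = Ixx*Iyy - Ixy^2`; the function names, the scale ratio `k = 1.6`, and the test image are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(sigma):
    """1-D Gaussian kernel, normalised to sum to 1."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur: filter rows, then columns."""
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def difference_of_gaussians(img, sigma, k=1.6):
    """SIFT-style DoG response: blur at two nearby scales and subtract."""
    return blur(img, k * sigma) - blur(img, sigma)

def hessian_determinant(img):
    """SURF-style blob response: determinant of the Hessian of intensity."""
    Iy, Ix = np.gradient(img)          # first derivatives (rows, cols)
    Iyy, Iyx = np.gradient(Iy)         # second derivatives of Iy
    Ixy, Ixx = np.gradient(Ix)         # second derivatives of Ix
    return Ixx * Iyy - Ixy * Ixy

# A bright blob on a dark background: both responses should be strong
# near the blob and near zero in the flat background.
img = np.zeros((32, 32))
img[14:18, 14:18] = 1.0
dog = difference_of_gaussians(img, sigma=1.0)
det = hessian_determinant(blur(img, 1.0))
```

In a full detector, both responses would be computed over a scale pyramid and keypoints taken at local extrema; here a single scale suffices to show why the Hessian determinant peaks on blob centres (both principal curvatures are large with the same sign, so the product dominates the mixed term).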


Author Biography

Olumide O. Obe, Federal University of Technology, Akure, Nigeria

Computer Science Department

Associate Professor
