A CNN sign language recognition system with single & double-handed gestures

Buckley, Neil and Sherrett, Lewis and Secco, Emanuele Lindo (2021) A CNN sign language recognition system with single & double-handed gestures. In: 2021 IEEE COMPSAC, 12-16 July 2021. (Accepted for Publication)

signLanguage_v2_SEL.pdf - Accepted Version
Restricted to Repository staff only until 17 July 2021.


Abstract

This work presents a novel Computer Vision approach to the development of a real-time, web-camera based, British Sign Language recognition system. A literature review is conducted, focused on (1) the current state of sign language recognition systems and (2) the techniques they use. This review serves as the foundation on which a Convolutional Neural Network (CNN) based system is designed and implemented. A bespoke British Sign Language dataset - containing 11,875 images - is then compiled and used to train and test the CNN, which classifies hand gestures performed by a human. The resulting CNN architecture recognizes 19 static British Sign Language gestures, incorporating both single and double-handed gestures. During testing, the system achieved an average recognition accuracy of 89%.
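To make the classification pipeline concrete, the following is a minimal numpy-only sketch of the kind of forward pass such a CNN performs on a gesture image (convolution, ReLU, pooling, then a softmax over 19 classes). The layer sizes, image size, and weights here are illustrative assumptions, not the architecture or parameters reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CLASSES = 19  # number of static BSL gestures recognised in the paper


def conv2d(image, kernel):
    """Valid 2D cross-correlation of a single-channel image with a kernel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out


def relu(x):
    return np.maximum(x, 0.0)


def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that don't fit a full window."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))


def softmax(logits):
    e = np.exp(logits - logits.max())
    return e / e.sum()


# Toy 16x16 grayscale "hand" image and randomly initialised weights
# (a trained system would learn these from the 11,875-image dataset).
image = rng.random((16, 16))
kernel = rng.standard_normal((3, 3))
fc_weights = rng.standard_normal((N_CLASSES, 7 * 7))

features = max_pool(relu(conv2d(image, kernel)))    # (7, 7) feature map
probs = softmax(fc_weights @ features.ravel())      # 19 class probabilities
predicted_gesture = int(np.argmax(probs))
```

A real implementation would stack several convolutional layers and train the weights by backpropagation; this sketch only shows how an input frame is reduced to a probability distribution over the gesture classes.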

Item Type: Conference or Workshop Item (Paper)
Additional Information and Comments: This paper has been accepted for presentation at 2021 IEEE COMPSAC: Intelligent and Resilient Computing for a Collaborative World.
Faculty / Department: Faculty of Science > Mathematics and Computer Science
Depositing User: Emanuele Secco
Date Deposited: 25 May 2021 13:16
Last Modified: 25 May 2021 13:16
URI: https://hira.hope.ac.uk/id/eprint/3289
