Buckley, Neil and Sherrett, Lewis and Secco, Emanuele Lindo (2021) A CNN sign language recognition system with single & double-handed gestures. In: 2021 IEEE COMPSAC, 12-16 July 2021.
Text: signLanguage_v2_SEL.pdf - Accepted Version (416kB)
Abstract
This work presents a novel Computer Vision approach to the development of a real-time, web-camera based, British Sign Language recognition system. A literature review is conducted on (1) the current state of sign language recognition systems and (2) the techniques used. This review serves as the foundation on which a Convolutional Neural Network (CNN) based system is designed and then implemented. A bespoke British Sign Language dataset, containing 11,875 images, is then created and used to train and test the CNN, which classifies gestures performed by the human hand. Finally, the CNN architecture recognises 19 static British Sign Language gestures, incorporating both single and double-handed gestures. During testing, the system achieved an average recognition accuracy of 89%.
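The abstract describes a CNN classifier over 19 static gesture classes but does not specify the network itself. The sketch below is purely illustrative and not the authors' architecture: the input size (assumed 64x64 grayscale frames), layer sizes, and the PyTorch framing are all assumptions made for the example.

```python
# Illustrative sketch only: a small CNN that outputs one logit per each of the
# 19 static BSL gesture classes mentioned in the abstract. All architectural
# details below are assumptions, not the paper's actual network.
import torch
import torch.nn as nn


class GestureCNN(nn.Module):
    def __init__(self, num_classes: int = 19):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 64x64 -> 32x32
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, num_classes),            # one logit per gesture class
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


# Example: classify a single (dummy) 64x64 grayscale webcam frame.
model = GestureCNN()
frame = torch.randn(1, 1, 64, 64)
predicted_class = model(frame).argmax(dim=1)
```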
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Additional Information and Comments: | This paper was presented at 2021 IEEE COMPSAC: Intelligent and Resilient Computing for a Collaborative World. The final version is available from: https://ieeexplore.ieee.org/document/9529449 |
| Faculty / Department: | Faculty of Human and Digital Sciences > Mathematics and Computer Science |
| Depositing User: | Emanuele Secco |
| Date Deposited: | 25 May 2021 13:16 |
| Last Modified: | 05 Sep 2022 13:19 |
| URI: | https://hira.hope.ac.uk/id/eprint/3289 |