Secco, Emanuele Lindo and McHugh, Daniel and Buckley, N (2021) A CNN-based Computer Vision Interface for Prosthetics’ application. In: 10th EAI International Conference on Wireless Mobile Communication and Healthcare, 13th-14th November, 2021, Chongqing, China. (Accepted for Publication)
paper_CNN.pdf - Accepted Version (642kB)
Abstract
In this paper we present a CNN-based interface for the control of prosthetic and robotic hands: a CNN visual system is trained on a set of images of daily-life objects in order to classify and recognize them. This classification provides useful information for configuring the prosthetic or robotic hand: following the training, a low-cost embedded computer, combined with a low-cost camera mounted on the device (i.e. a prosthetic or robotic hand), can drive the device to approach and grasp any object belonging to the training set.
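As a rough illustration of the pipeline described in the abstract (not the authors' implementation), the Python/Keras sketch below builds a compact CNN image classifier whose predicted object class could then be mapped to a grasp configuration for the hand. The architecture, input resolution, number of classes, and the GRASP_LOOKUP mapping are all assumptions made for illustration only.

```python
# Minimal sketch of the abstract's idea: a small CNN classifies daily-life
# objects so the predicted class can select a grasp configuration.
# Architecture, image size, and class count are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5      # hypothetical number of daily-life object categories
IMG_SIZE = (96, 96)  # hypothetical input resolution for a low-cost camera

def build_classifier():
    """Return a compact CNN suited to a low-cost embedded computer."""
    model = models.Sequential([
        layers.Input(shape=IMG_SIZE + (3,)),
        layers.Rescaling(1.0 / 255),                 # normalize pixel values
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),  # object class scores
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_classifier()
    model.summary()
    # At run time, the predicted class would be looked up in a (hypothetical)
    # table of pre-defined grasp configurations for the hand, e.g.:
    # grasp = GRASP_LOOKUP[model.predict(frame[None, ...]).argmax()]
```

In this sketch the classifier is kept deliberately small so that inference remains feasible on a low-cost embedded computer, in line with the hardware constraints described in the abstract.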
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Additional Information and Comments: | This paper has been accepted for presentation at the 10th EAI International Conference on Wireless Mobile Communication and Healthcare, 2021. |
| Faculty / Department: | Faculty of Human and Digital Sciences > Mathematics and Computer Science |
| Depositing User: | Emanuele Secco |
| Date Deposited: | 14 Oct 2021 12:22 |
| Last Modified: | 14 Nov 2021 01:15 |
| URI: | https://hira.hope.ac.uk/id/eprint/3389 |