Latif, Bilawal and Buckley, N and Secco, Emanuele Lindo (2022) Hand Gesture & Human-Drone Interaction. In: Intelligent Systems Conference (IntelliSys) 2022, 1-2 September, 2022, Amsterdam. (Accepted for Publication)
Text: HandGesture_HRI_Bilawal.pdf - Accepted Version (425kB)
Abstract
Human-computer interaction is a wide domain encompassing different modes of interaction, e.g., hand gestures and body postures. Gesture detection is a non-verbal way of delivering control information to a system. In gesture recognition, gestures are first captured by a camera and then read and interpreted. Gesture recognition has a wide range of applications; for instance, it can be used by disabled persons to communicate.
This paper presents a detailed study of controlling drones with hand gestures. The presented system consists of three main blocks: (1) gesture detection, (2) gesture translation and (3) drone control. In the first module, a deep learning algorithm detects hand gestures in real time. In the second, the gesture translator applies image processing techniques to identify the gestures, and control signals are then generated for the drone. The third part covers the implementation of the algorithm using TensorFlow. The accuracy of the system is 95.7%.
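The abstract describes the pipeline but the page does not include source code. As an illustration only, the following is a minimal sketch of how such a gesture-to-drone pipeline might be wired together with TensorFlow and OpenCV; the model file, gesture labels, input size, confidence threshold and the `send_command` drone interface are all hypothetical assumptions, not taken from the paper.

```python
# Illustrative sketch only: maps real-time camera gestures to drone commands.
# The model file, label set and drone interface below are assumptions, not
# the authors' implementation.
import cv2
import numpy as np
import tensorflow as tf

# Hypothetical pre-trained gesture classifier (e.g. a small CNN saved as a Keras model).
model = tf.keras.models.load_model("gesture_cnn.h5")

# Hypothetical mapping from predicted gesture class to a drone control signal.
GESTURE_TO_COMMAND = {0: "takeoff", 1: "land", 2: "forward", 3: "backward", 4: "hover"}

def preprocess(frame):
    """Basic image processing before classification: resize and normalise the frame."""
    roi = cv2.resize(frame, (64, 64))            # model input size is an assumption
    roi = cv2.cvtColor(roi, cv2.COLOR_BGR2RGB)
    return roi.astype(np.float32)[np.newaxis] / 255.0

def send_command(command):
    """Placeholder for the drone control link (e.g. a vendor SDK or MAVLink call)."""
    print("drone <-", command)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    probs = model.predict(preprocess(frame), verbose=0)[0]
    gesture = int(np.argmax(probs))
    if probs[gesture] > 0.9:                     # confidence threshold is an assumption
        send_command(GESTURE_TO_COMMAND[gesture])
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```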
Item Type: | Conference or Workshop Item (Paper) |
---|---|
Additional Information and Comments: | This paper has been accepted for presentation at the Intelligent Systems Conference (IntelliSys) 2022. |
Faculty / Department: | Faculty of Human and Digital Sciences > Mathematics and Computer Science |
Depositing User: | Emanuele Secco |
Date Deposited: | 02 Mar 2022 09:25 |
Last Modified: | 03 Sep 2022 00:15 |
URI: | https://hira.hope.ac.uk/id/eprint/3493 |