A pattern recognition model for static gestures in Malaysian sign language based on machine learning techniques

Ali H. Alrubayi, M. A. Ahmed, A. A. Zaidan, A. S. Albahri, B. B. Zaidan, O. S. Albahri, A. H. Alamoodi, Mamoun Alazab

Research output: Contribution to journal › Article › peer-review

40 Citations (Scopus)

Abstract

This work proposes a pattern recognition model for static gestures in Malaysian Sign Language (MSL) based on Machine Learning (ML) techniques. The proposed model is divided into two phases, namely, data acquisition and data processing. The first phase captures the required sign data, such as the shape and orientation of the hand, to construct a sensor-based sign language (SL) dataset. The dataset is collected with a DataGlove device, which measures the motions of the fingers and wrist; each sign in the dataset is represented by sixty-four features. In the second phase, the collected sensory dataset is cleaned by removing redundant data, and the features are scaled and normalised to exhibit symmetrical behaviour and eliminate outliers. Ten different ML techniques are then applied to real-time data for SL gesture recognition. Experimental results confirm the efficacy of the proposed pattern recognition model compared with previous work.
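
As a rough illustration of the data-processing phase described above, the sketch below uses scikit-learn to drop redundant records, scale the 64 glove features, and compare a few standard classifiers. It is not the authors' implementation: the file name msl_glove_dataset.csv, the label column, and the three classifiers shown (the paper evaluates ten ML techniques) are hypothetical placeholders for illustration only.

```python
# Minimal sketch of the data-processing phase (not the authors' code):
# clean the sensor dataset, scale/normalise the 64 DataGlove features,
# and compare several off-the-shelf classifiers.
import pandas as pd
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical file: 64 feature columns plus a gesture label column.
df = pd.read_csv("msl_glove_dataset.csv")
df = df.drop_duplicates()                      # remove redundant rows
X = df.iloc[:, :64].to_numpy(dtype=float)      # 64 glove features per sign
y = df["label"].to_numpy()                     # static-gesture class

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Three of many possible techniques; the paper compares ten.
models = {
    "SVM (RBF)": SVC(kernel="rbf", C=10.0, gamma="scale"),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}

for name, clf in models.items():
    # Scaling inside the pipeline keeps test data out of the fitted statistics.
    pipe = make_pipeline(StandardScaler(), clf)
    cv_acc = cross_val_score(pipe, X_train, y_train, cv=5).mean()
    pipe.fit(X_train, y_train)
    print(f"{name}: CV acc = {cv_acc:.3f}, test acc = {pipe.score(X_test, y_test):.3f}")
```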

Original language: English
Article number: 107383
Pages (from-to): 1-15
Number of pages: 15
Journal: Computers and Electrical Engineering
Volume: 95
Early online date: 17 Aug 2021
DOIs:
Publication status: Published - Oct 2021

Bibliographical note

Funding Information:
The authors would like to thank the support of NVIDIA Corporation for the donation of the Quadro GPU used for this research.
