This work proposes a pattern recognition model for static gestures in Malaysian Sign Language (MSL) based on Machine Learning (ML) techniques. The proposed model comprises two phases, namely, data acquisition and data processing. The first phase captures the required sign data, such as the shape and orientation of the hand, to construct a sensor-based sign language (SL) dataset. The data are collected with a DataGlove device, which measures the motions of the fingers and wrist; each sign in the dataset is represented by sixty-four features. In the second phase, the collected sensory dataset is cleaned by removing redundant records, and the features are scaled and normalised to obtain a symmetrical distribution and suppress outliers. Finally, ten different ML techniques are evaluated on real-time data for SL gesture recognition. Experimental results confirm the efficacy of the proposed pattern recognition model compared with previous work.
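The second-phase preprocessing described above (removing redundant records, then scaling and normalising the sixty-four features) could be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic sensor matrix, the number of sign classes, and the choice of z-score standardisation are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for DataGlove readings: 100 samples with 64
# sensor features (e.g. finger flexion and wrist orientation values)
# and 5 example sign classes. Real data would come from the glove.
X = rng.normal(size=(100, 64))
y = rng.integers(0, 5, size=100)

def remove_redundant(X, y):
    """Drop exact duplicate feature rows (redundant data)."""
    _, idx = np.unique(X, axis=0, return_index=True)
    idx = np.sort(idx)  # preserve original sample order
    return X[idx], y[idx]

def standardise(X):
    """Z-score each of the 64 features so they share a symmetric,
    zero-centred scale; guards against zero-variance features."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    return (X - mu) / np.where(sigma == 0, 1.0, sigma)

X, y = remove_redundant(X, y)
Xs = standardise(X)
```

After this step, `Xs` would be the input shared by the ten candidate classifiers, so that differences in recognition accuracy reflect the models rather than the feature scales.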