UPSI Digital Repository (UDRep)
Universiti Pendidikan Sultan Idris

Abstract
This work proposes a pattern recognition model for static gestures in Malaysian Sign Language (MSL) based on machine learning (ML) techniques. The proposed model is divided into two phases, namely, data acquisition and data processing. The first phase involves capturing the required sign data, such as the shape and orientation of the hand, to construct a sensor-based sign language dataset. The dataset is collected using a DataGlove device, which measures the motions of the fingers and wrist; sixty-four features represent each sign in the dataset. In the second phase, the collected sensory dataset is cleaned by removing redundant data, and the features are then scaled and normalised to exhibit symmetrical behaviour and eliminate outliers. Finally, ten different ML techniques are applied to real-time data for sign language gesture recognition. Experimental results confirmed the efficacy of the proposed pattern recognition model compared with previous work. © 2021
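The two-phase pipeline described above can be sketched in code. The following is a minimal, illustrative Python example, not the authors' implementation: the synthetic data, the nearest-centroid classifier (a stand-in for the ten ML techniques the abstract does not enumerate), and all variable names are assumptions; only the overall flow (64-feature sensor vectors, duplicate removal, feature scaling, classification) follows the abstract.

```python
# Hypothetical sketch of the abstract's two-phase model. Phase 1 is simulated
# with synthetic 64-feature "DataGlove" readings; phase 2 removes redundant
# rows, scales features, and classifies with a simple nearest-centroid rule.
import numpy as np

rng = np.random.default_rng(0)

# --- Phase 1 (stand-in): synthetic sensor vectors, 64 features per sample.
n_signs, n_samples, n_features = 3, 30, 64
centers = rng.normal(0.0, 5.0, size=(n_signs, n_features))
X = np.vstack([c + rng.normal(0.0, 1.0, size=(n_samples, n_features))
               for c in centers])
y = np.repeat(np.arange(n_signs), n_samples)

# --- Phase 2a: clean the dataset by dropping exact duplicate rows.
X, idx = np.unique(X, axis=0, return_index=True)
y = y[idx]

# --- Phase 2b: min-max scale each feature to [0, 1].
lo, hi = X.min(axis=0), X.max(axis=0)
X = (X - lo) / np.where(hi > lo, hi - lo, 1.0)

# --- Phase 2c: classification; here a nearest-centroid classifier stands in
# for the ten ML techniques evaluated in the paper.
centroids = np.array([X[y == k].mean(axis=0) for k in range(n_signs)])

def predict(samples):
    # Assign each sample to the class of its closest centroid.
    d = np.linalg.norm(samples[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

accuracy = (predict(X) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In practice each of the ten techniques would replace the centroid step, and accuracy would be measured on a held-out split rather than the training data.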
This material may be protected under the Copyright Act, which governs the making of photocopies or reproductions of copyrighted materials. You may use the digitized material for private study, scholarship, or research.