UPSI Digital Repository (UDRep)


Type : article
Subject : L Education
ISSN : 0263-2241
Main Author : Bilal Zaidan
Additional Authors : Altaha, Mohamed Aktham Ahmed
Zaidan, A. A.
Alamoodi, Abdullah Hussein
Title : Based on wearable sensory device in 3D-printed humanoid: a new real-time sign language recognition system
Place of Production : Tanjung Malim
Publisher : Fakulti Seni, Komputeran dan Industri Kreatif
Year of Publication : 2021
Notes : Measurement: Journal of the International Measurement Confederation
Corporate Name : Universiti Pendidikan Sultan Idris

Abstract :
This study proposes a new real-time sign language recognition system based on a wearable sensory glove with 17 sensors providing 65 data channels. We introduce the DataGlove to recognise various, possibly complex, hand gestures of Malaysian Sign Language (MSL). With 65 data channels, the DataGlove satisfies the requirements suggested by an analysis of hand anatomy, kinematics, and gestures. Four groups of sensors were tested to select the optimal sensors for capturing hand-gesture information, and a 3D-printed humanoid arm was used to validate the sensors mounted on the glove. In an extensive set of experiments, five experienced MSL participants performed 75 gestures drawn from MSL numbers, alphabet letters, and words, and the error rate was used to assess system performance. Comparison confirms that the proposed system competes well against an advanced benchmark of previous works, with coverage of up to 100% across 14 criteria in terms of the type of captured signals, recognised gestures, and solved issues. Results show that the system recognises a wide range of gestures, with recognition accuracies of 99%, 96%, and 93.4% for MSL numbers, alphabet letters, and words, respectively. This research contributes to enhancing the lifestyle of people with disabilities and bridges the gap between people with hearing impairment and hearing people. © 2020 Elsevier Ltd
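The abstract reports per-category recognition accuracies and uses the error rate as the performance measure. A minimal sketch of that relationship (error rate = 1 − accuracy) is shown below; the trial counts are hypothetical illustrations, not data from the paper:

```python
# Sketch of the accuracy / error-rate relationship described in the
# abstract. The trial counts here are invented for illustration only.

def accuracy(correct, total):
    """Fraction of gesture trials recognised correctly."""
    return correct / total

# Hypothetical per-category trial counts for the three MSL gesture sets.
trials = {"numbers": (99, 100), "alphabet": (96, 100), "words": (934, 1000)}

for category, (correct, total) in trials.items():
    acc = accuracy(correct, total)
    print(f"{category}: accuracy={acc:.1%}, error rate={1 - acc:.1%}")
```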

References

Abhishek, K. S., Qubeley, L. C. F., & Ho, D. (2016). Glove-based hand gesture recognition sign language translator using capacitive touch sensor. Paper presented at the 2016 IEEE International Conference on Electron Devices and Solid-State Circuits, EDSSC 2016, 334-337. doi:10.1109/EDSSC.2016.7785276 Retrieved from www.scopus.com

Aguiar, S., Erazo, A., Romero, S., Garces, E., Atiencia, V., & Figueroa, J. P. (2016). Development of a smart glove as a communication tool for people with hearing impairment and speech disorders. Paper presented at the 2016 IEEE Ecuador Technical Chapters Meeting, ETCM 2016, doi:10.1109/ETCM.2016.7750815 Retrieved from www.scopus.com

Ahmed, M. A., Zaidan, B. B., Zaidan, A. A., Salih, M. M., & Lakulu, M. M. B. (2018). A review on systems-based sensory gloves for sign language recognition state of the art between 2007 and 2017. Sensors (Switzerland), 18(7) doi:10.3390/s18072208

Altan, A., & Hacıoğlu, R. (2018). The algorithm development and implementation for 3D printers based on adaptive PID controller. Politeknik Dergisi, 21(3), 559-564. Retrieved from www.scopus.com

Arif, A., Rizvi, S. T. H., Jawaid, I., Waleed, M. A., & Shakeel, M. R. (2016). Techno-talk: An american sign language (ASL) translator. Paper presented at the International Conference on Control, Decision and Information Technologies, CoDIT 2016, 665-670. doi:10.1109/CoDIT.2016.7593642 Retrieved from www.scopus.com

Athira, P., Sruthi, C., & Lijiya, A. (2019). Retrieved from www.scopus.com

Borghetti, M., Sardini, E., & Serpelloni, M. (2013). Sensorized glove for measuring hand finger flexion for rehabilitation purposes. IEEE Transactions on Instrumentation and Measurement, 62(12), 3308-3314. doi:10.1109/TIM.2013.2272848

Buczek, F. L., Sinsel, E. W., Gloekler, D. S., Wimer, B. M., Warren, C. M., & Wu, J. Z. (2011). Kinematic performance of a six degree-of-freedom hand model (6DHand) for use in occupational biomechanics. Journal of Biomechanics, 44(9), 1805-1809. doi:10.1016/j.jbiomech.2011.04.003

Bui, T. D., & Nguyen, L. T. (2007). Recognizing postures in vietnamese sign language with MEMS accelerometers. IEEE Sensors Journal, 7(5), 707-712. doi:10.1109/JSEN.2007.894132

Bullock, I. M., Borras, J., & Dollar, A. M. (2012). Assessing assumptions in kinematic hand models: A review. Paper presented at the Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics, 139-146. doi:10.1109/BioRob.2012.6290879 Retrieved from www.scopus.com

Cheng, H., Dai, Z., Liu, Z., & Zhao, Y. (2016). An image-to-class dynamic time warping approach for both 3D static and trajectory hand gesture recognition. Pattern Recognition, 55, 137-147. doi:10.1016/j.patcog.2016.01.011

Chong, T. -., & Kim, B. -. (2020). American sign language recognition system using wearable sensors with deep learning approach. The Journal of the Korea Institute of Electronic Communication Sciences, 15(2), 291-298. Retrieved from www.scopus.com

Elmahgiubi, M., Ennajar, M., Drawil, N., & Elbuni, M. S. (2015). Sign language translator and gesture recognition. Paper presented at the GSCIT 2015 - Global Summit on Computer and Information Technology - Proceedings, doi:10.1109/GSCIT.2015.7353332 Retrieved from www.scopus.com

Harish, N., & Poonguzhali, S. (2015). Design and development of hand gesture recognition system for speech impaired people. Paper presented at the 2015 International Conference on Industrial Instrumentation and Control, ICIC 2015, 1129-1133. doi:10.1109/IIC.2015.7150917 Retrieved from www.scopus.com

Ibarguren, A., Maurtua, I., & Sierra, B. (2010). Layered architecture for real time sign recognition: Hand gesture and movement. Engineering Applications of Artificial Intelligence, 23(7), 1216-1228. doi:10.1016/j.engappai.2010.06.001

Kanwal, K., Abdullah, S., Ahmed, Y. B., Saher, Y., & Jafri, A. R. (2014). Assistive glove for pakistani sign language translation pakistani sign language translator. Paper presented at the 17th IEEE International Multi Topic Conference: Collaborative and Sustainable Development of Technologies, IEEE INMIC 2014 - Proceedings, 173-176. doi:10.1109/INMIC.2014.7097332 Retrieved from www.scopus.com

Kapandji, A. I. (2008). Retrieved from www.scopus.com

Kau, L. -., Su, W. -., Yu, P. -., & Wei, S. -. (2015). A real-time portable sign language translation system. Paper presented at the Midwest Symposium on Circuits and Systems, 2015-September. doi:10.1109/MWSCAS.2015.7282137 Retrieved from www.scopus.com

Kishore, P. V. V., Kishore, S. R. C., & Prasad, M. V. D. (2013). Conglomeration of hand shapes and texture information for recognizing gestures of indian sign language using feed forward neural networks. International Journal of Engineering and Technology, 5(5), 3742-3756. Retrieved from www.scopus.com

Kortier, H. G., Sluiter, V. I., Roetenberg, D., & Veltink, P. H. (2014). Assessment of hand kinematics using inertial and magnetic sensors. Journal of NeuroEngineering and Rehabilitation, 11(1) doi:10.1186/1743-0003-11-70

Kumar, E. K., Kishore, P. V. V., Kiran Kumar, M. T., & Kumar, D. A. (2020). 3D sign language recognition with joint distance and angular coded color topographical descriptor on a 2-stream CNN. Neurocomputing, 372, 40-54. doi:10.1016/j.neucom.2019.09.059

Kumar, P., Gauba, H., Pratim Roy, P., & Prosad Dogra, D. (2017). A multimodal framework for sensor based sign language recognition. Neurocomputing, 259, 21-38. doi:10.1016/j.neucom.2016.08.132

Kumar, P., Gauba, H., Roy, P. P., & Dogra, D. P. (2017). Coupled HMM-based multi-sensor data fusion for sign language recognition. Pattern Recognition Letters, 86, 1-8. doi:10.1016/j.patrec.2016.12.004

Liu, B., Cai, H., Ju, Z., & Liu, H. (2019). RGB-D sensing based human action and interaction analysis: A survey. Pattern Recognition, 94, 1-12. doi:10.1016/j.patcog.2019.05.020

Moreira Almeida, S. G., Guimarães, F. G., & Arturo Ramírez, J. (2014). Feature extraction in brazilian sign language recognition based on phonological structure and using RGB-D sensors. Expert Systems with Applications, 41(16), 7259-7271. doi:10.1016/j.eswa.2014.05.024

Mummadi, C. K., Leo, F. P. P., Verma, K. D., Kasireddy, S., Scholl, P. M., Kempfle, J., & Van Laerhoven, K. (2018). Real-time and embedded detection of hand gestures with an IMU-based glove. Informatics, 5(2) doi:10.3390/informatics5020028

Oz, C., & Leu, M. C. (2011). American sign language word recognition with a sensory glove using artificial neural networks. Engineering Applications of Artificial Intelligence, 24(7), 1204-1213. doi:10.1016/j.engappai.2011.06.015

Pattanaworapan, K., Chamnongthai, K., & Guo, J. -. (2016). Signer-independence finger alphabet recognition using discrete wavelet transform and area level run lengths. Journal of Visual Communication and Image Representation, 38, 658-677. doi:10.1016/j.jvcir.2016.04.015

Pradhan, G., Prabhakaran, B., & Li, C. (2008). Hand-gesture computing for the hearing and speech impaired. IEEE Multimedia, 15(2), 20-27. doi:10.1109/MMUL.2008.28

Pugeault, N., & Bowden, R. (2011). Spelling it out: Real-time ASL fingerspelling recognition. Paper presented at the Proceedings of the IEEE International Conference on Computer Vision, 1114-1119. doi:10.1109/ICCVW.2011.6130290 Retrieved from www.scopus.com

Sadek, M. I., Mikhael, M. N., & Mansour, H. A. (2017). A new approach for designing a smart glove for arabic sign language recognition system based on the statistical analysis of the sign language. Paper presented at the National Radio Science Conference, NRSC, Proceedings, 380-388. doi:10.1109/NRSC.2017.7893499 Retrieved from www.scopus.com

Saleh, N., Farghaly, M., Elshaaer, E., & Mousa, A. (2020). Smart glove-based gestures recognition system for arabic sign language. Paper presented at the Proceedings of 2020 International Conference on Innovative Trends in Communication and Computer Engineering, ITCE 2020, 303-307. doi:10.1109/ITCE48509.2020.9047820 Retrieved from www.scopus.com

Sharma, D., Verma, D., & Khetarpal, P. (2016). LabVIEW based sign language trainer cum portable display unit for the speech impaired. Paper presented at the 12th IEEE International Conference Electronics, Energy, Environment, Communication, Computer, Control: (E3-C3), INDICON 2015, doi:10.1109/INDICON.2015.7443381 Retrieved from www.scopus.com

Shukor, A. Z., Miskon, M. F., Jamaluddin, M. H., Ali Ibrahim, F. B., Asyraf, M. F., & Bahar, M. B. B. (2015). A new data glove approach for malaysian sign language detection. Paper presented at the Procedia Computer Science, 76, 60-67. doi:10.1016/j.procs.2015.12.276 Retrieved from www.scopus.com

Suri, K., & Gupta, R. (2019). Convolutional neural network array for sign language recognition using wearable IMUs. Paper presented at the 2019 6th International Conference on Signal Processing and Integrated Networks, SPIN 2019, 483-488. doi:10.1109/SPIN.2019.8711745 Retrieved from www.scopus.com

Tang, J., Cheng, H., Zhao, Y., & Guo, H. (2018). Structured dynamic time warping for continuous hand trajectory gesture recognition. Pattern Recognition, 80, 21-31. doi:10.1016/j.patcog.2018.02.011

Tanyawiwat, N., & Thiemjarus, S. (2012). Design of an assistive communication glove using combined sensory channels. Paper presented at the Proceedings - BSN 2012: 9th International Workshop on Wearable and Implantable Body Sensor Networks, 34-39. doi:10.1109/BSN.2012.17 Retrieved from www.scopus.com

Tao, W., Leu, M. C., & Yin, Z. (2018). American sign language alphabet recognition using convolutional neural networks with multiview augmentation and inference fusion. Engineering Applications of Artificial Intelligence, 76, 202-213. doi:10.1016/j.engappai.2018.09.006

Vijayalakshmi, P., & Aarthi, M. (2016). Sign language to speech conversion. Paper presented at the 2016 International Conference on Recent Trends in Information Technology, ICRTIT 2016, doi:10.1109/ICRTIT.2016.7569545 Retrieved from www.scopus.com

Vutinuntakasame, S., Jaijongrak, V. -., & Thiemjarus, S. (2011). An assistive body sensor network glove for speech- and hearing-impaired disabilities. Paper presented at the Proceedings - 2011 International Conference on Body Sensor Networks, BSN 2011, 7-12. doi:10.1109/BSN.2011.13 Retrieved from www.scopus.com

Wang, C., Liu, Z., & Chan, S. -. (2015). Superpixel-based hand gesture recognition with kinect depth camera. IEEE Transactions on Multimedia, 17(1), 29-39. doi:10.1109/TMM.2014.2374357

Zamani, M., & Kanan, H. R. (2014). Saliency based alphabet and numbers of american sign language recognition using linear feature extraction. Paper presented at the Proceedings of the 4th International Conference on Computer and Knowledge Engineering, ICCKE 2014, 398-403. doi:10.1109/ICCKE.2014.6993442 Retrieved from www.scopus.com

Zhou, Y., Jiang, G., & Lin, Y. (2016). A novel finger and hand pose estimation technique for real-time hand gesture recognition. Pattern Recognition, 49, 102-114. doi:10.1016/j.patcog.2015.07.014


This material may be protected under Copyright Act which governs the making of photocopies or reproductions of copyrighted materials.
You may use the digitized material for private study, scholarship, or research.


Installed and configured by Bahagian Automasi, Perpustakaan Tuanku Bainun, Universiti Pendidikan Sultan Idris
For enquiries, kindly contact us at pustakasys@upsi.edu.my or 016-3630263 (office hours only).