UPSI Digital Repository (UDRep)
Abstract: Universiti Pendidikan Sultan Idris
The purpose of this study was to propose a low-cost, real-time recognition system using a sensory glove with 17 sensors and 65 channels to capture static sign data of Malaysian Sign Language (MSL). The study used an experimental design. Five participants proficient in MSL were chosen to perform 75 gestures while wearing the sensory glove. This research was carried
out in six phases as follows: Phase I involved a review of literature via a systematic review
approach to identify the relevant set of articles that helped formulate the research questions.
Phase II focused on the analysis of hand anatomy, hand kinematics, and hand gestures to help understand the nature of MSL and to define the glove requirements. In Phase III, the DataGlove was designed and developed based on these requirements to optimize the glove's functionality. Phase IV involved the pre-processing, feature extraction, and classification of the data collected from the proposed DataGlove to identify MSL gestures. New vision-based and sensor-based MSL datasets were collected in Phase V. Phase VI focused on the evaluation and validation process
across different development stages. The error rate was used to check system performance. In addition, a 3D-printed humanoid arm was used to validate the sensors mounted on the glove. The results of the data analysis showed 37 common patterns of similar hand gestures in MSL. Furthermore, the design of the DataGlove based on the MSL analysis was effective in capturing a wide range of gestures, with recognition accuracies of 99%, 96%, and 93.4% for numbers, alphabet letters, and words, respectively. In conclusion, the research findings suggest that grouping MSL gestures into these 37 patterns can increase the recognition accuracy of MSL hand gestures, helping to bridge the gap between people with hearing impairments and hearing people. For future research, a more comprehensive analysis of the MSL recognition system is recommended.
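
To illustrate the kind of evaluation described in Phase VI, the sketch below shows how overall recognition accuracy and error rate could be computed for a glove-based static-gesture classifier. It is not taken from the thesis: the synthetic data, the nearest-centroid classifier, and all names (N_CHANNELS, SAMPLES_PER_GESTURE, predict) are assumptions used only for illustration of the metric, not the author's method.

# Minimal sketch (hypothetical data and names): estimating recognition
# accuracy and error rate for glove-based static-gesture classification.
# Assumes each sample is a 65-channel feature vector from the 17-sensor glove.
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 65           # channels reported for the sensory glove
N_GESTURES = 75           # static gestures performed by each participant
SAMPLES_PER_GESTURE = 20  # assumed number of repetitions per gesture

# Synthetic stand-in for the recorded dataset: one noisy cluster per gesture.
centroids = rng.normal(size=(N_GESTURES, N_CHANNELS))
X = np.repeat(centroids, SAMPLES_PER_GESTURE, axis=0) + 0.1 * rng.normal(
    size=(N_GESTURES * SAMPLES_PER_GESTURE, N_CHANNELS))
y = np.repeat(np.arange(N_GESTURES), SAMPLES_PER_GESTURE)

# Nearest-centroid classifier as a placeholder for the thesis's classifier.
def predict(samples: np.ndarray) -> np.ndarray:
    dists = np.linalg.norm(samples[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

y_pred = predict(X)
accuracy = float(np.mean(y_pred == y))   # fraction of correctly recognized samples
error_rate = 1.0 - accuracy              # metric used to check system performance
print(f"recognition accuracy: {accuracy:.1%}, error rate: {error_rate:.1%}")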
This material may be protected under Copyright Act which governs the making of photocopies or reproductions of copyrighted materials. You may use the digitized material for private study, scholarship, or research.