Modelling of Flex Sensor Resistance Value for Fingerspelling Recognition Prototype Using Sensor-Based Signal
DOI: https://doi.org/10.31436/iiumej.v26i3.3633

Keywords: Fingerspelling, Flex Sensor, Hand Gesture, Sign Language

Abstract
Individuals with hearing and speech impairments often rely on sign language to communicate with one another. While the hearing- and speech-impaired community can understand sign language, individuals with normal hearing generally do not. Communication between a hearing- and speech-impaired individual and a normally hearing individual can nevertheless be made possible by fingerspelling, with the help of a fingerspelling recognition system. Current research on fingerspelling recognition systems aims to improve accuracy rather than the system's ability to convert fingerspelling for real-time communication. A system developed purely for accuracy can be computationally complex, making it unsuitable for real-time communication because of the extensive processing required and the oversized, non-wearable hardware. Hence, this paper presents one option for developing a fingerspelling recognition system prototype that aims for ease of real-time communication using sensor-based signals. The prototype comprises a hand glove equipped with five flex sensors, designed to recognize hand gestures and convert them into fingerspelled letters. Each flex sensor is fitted along the length of a finger to capture the gestured fingerspelling, measuring the change in the sensor's resistance as it bends. The flex sensor resistance values were then modelled for fingerspelling recognition, first through a demonstration with five users in the preliminary characterization, and later with ten users to obtain the system accuracy. An Arduino MEGA 2560 platform processes the sensor data and transmits the recognized fingerspelled letters to an OLED display.
The average recognition accuracy recorded for the prototype is 76.15%, which is 15.38% higher than in the preliminary characterization, indicating the feasibility of using sensor-based signals in a fingerspelling recognition system for real-time communication. This system allows individuals with hearing and speech impairments to make various hand gestures while wearing the glove, which are recognized as the corresponding fingerspelled letters for easy communication with individuals with normal hearing.
ABSTRAK (translated): Individuals with hearing and speech impairments usually rely on sign language to communicate with one another. Although the hearing- and speech-impaired community can understand sign language to communicate among themselves, individuals with normal hearing generally do not understand sign language. Nevertheless, communication between a hearing- and speech-impaired individual and a normally hearing individual can be established through fingerspelling with the help of a fingerspelling recognition system. Current research on fingerspelling recognition systems aims to improve system accuracy rather than focusing on the system's ability to translate the spelling in real time. When the aim of the research is to improve accuracy, the developed system can become very computationally complex, resulting in a system unsuitable for real-time communication because it requires excessive processing and large, non-wearable hardware. This study therefore offers an option for developing a fingerspelling recognition system prototype intended to ease real-time communication using sensor-based signals. The prototype consists of a glove equipped with five flex sensors, designed to identify the fingerspelled letters in hand gestures. Each flex sensor is fitted along each finger to detect the gestured fingerspelling; measurement is made through the change in the flex sensor's resistance when it is bent. These flex sensor resistance values were then modelled for fingerspelling recognition through a demonstration on five users in the preliminary characterization, and subsequently with ten users to obtain the system accuracy. The sensor data were processed using an Arduino MEGA 2560 platform, and the recognized fingerspelled letters were shown on an OLED display.
The average recognition accuracy recorded for this prototype is 76.15%, which is 15.38% higher than in the preliminary characterization, showing the potential of using sensor-based signals in a fingerspelling recognition system for real-time communication. This system allows hearing- and speech-impaired individuals to make various hand gestures while wearing the glove; the corresponding fingerspelled letters are then recognized, easing communication with normally hearing individuals.
License
Copyright (c) 2025 IIUM Press

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.