UNWEARABLE MULTI-MODAL GESTURES RECOGNITION SYSTEM FOR INTERACTION WITH MOBILE DEVICES IN UNEXPECTED SITUATIONS

Authors

H. Elleuch, A. Wali

DOI:

https://doi.org/10.31436/iiumej.v20i2.1000

Keywords:

eye gesture recognition, hand gesture detection, multimodal interaction, fuzzy inference system, human-computer interaction

Abstract

In this paper, a novel real-time system for controlling mobile devices in unexpected situations, such as driving, cooking, and practicing sports, based on eye and hand gestures is proposed. The originality of the proposed system is that it relies on real-time video streamed from the device's front-facing camera. To this end, three principal modules are responsible for recognizing eye gestures, recognizing hand gestures, and fusing these motions. Four contributions are presented in this paper. First, a fuzzy inference system is proposed to determine eye gestures. Second, a new database is collected and used to classify open- and closed-hand gestures. Third, two descriptors are combined to build boosted classifiers that detect hand gestures with an AdaBoost detector. Fourth, eye and hand gestures are merged to command mobile devices using a decision-tree classifier. Several experiments show that the proposed system is efficient and competitive with existing systems, achieving recalls of 76.53%, 98%, and 99% for eye-gesture recognition, fist detection, and palm detection, respectively, and a success rate of 88% for the correlation of eye and hand gestures.

ABSTRAK: This study proposes a real-time system for controlling mobile devices, in unexpected situations such as driving, cooking, and playing sports, based on eye and hand gestures. The novelty of the proposed system is that it uses real-time video captured by the device's front-facing camera. To this end, three main modules are tasked with recognizing eye gestures, hand gestures, and the combination of both motions. Four contributions are presented in this study. First, a fuzzy inference system is proposed to determine eye gestures. Second, a new database was collected for classifying open and closed hand gestures. Third, two descriptors were combined to produce boosted classifiers that detect hand gestures based on an AdaBoost detector. Fourth, eye and hand gestures are used to command mobile devices based on a decision-tree classifier. Various experiments were carried out to show that the proposed system is effective and competitive with existing systems. The results show recall rates of 76.53%, 98%, and 99% for eye-gesture recognition, fist detection, and palm detection, respectively, with an 88% success rate in correlating eye and hand gestures.
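
The abstract describes a three-module pipeline: a fuzzy inference system for eye gestures, boosted (AdaBoost) cascade classifiers for fist/palm detection, and a decision tree that fuses the two modalities into device commands. The sketch below is a minimal illustration of how such a pipeline could be wired together in Python with OpenCV and scikit-learn; the cascade file names (fist_cascade.xml, palm_cascade.xml), the fuzzy membership functions, the iris-offset inputs, and the toy fusion table are assumptions made for illustration only, not the authors' actual implementation (which the paper builds with FuzzyLite and cascades trained on combined descriptors).

# Illustrative sketch only (not the authors' implementation): one possible wiring of the
# three modules described above. Cascade file names, membership functions and the fusion
# table are assumptions; the paper reports FuzzyLite for the fuzzy system and AdaBoost
# cascades trained on combined descriptors for hand detection.
import cv2
from sklearn.tree import DecisionTreeClassifier

# --- Module 1: eye-gesture recognition with a tiny Mamdani-style fuzzy rule base ---
def trimf(x, a, b, c):
    """Triangular membership function over [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def eye_gesture(dx, dy):
    """Map a normalized iris offset (dx, dy in [-1, 1]) to a gesture label.
    The rules below are illustrative; the paper's membership functions may differ."""
    scores = {
        "left":   trimf(dx, -1.0, -0.6, -0.2),
        "right":  trimf(dx,  0.2,  0.6,  1.0),
        "up":     trimf(dy, -1.0, -0.6, -0.2),
        "down":   trimf(dy,  0.2,  0.6,  1.0),
        "center": trimf(dx, -0.3, 0.0, 0.3) * trimf(dy, -0.3, 0.0, 0.3),
    }
    return max(scores, key=scores.get)

# --- Module 2: hand-gesture detection with boosted cascades (Viola-Jones style) ---
# Hypothetical cascade files, assumed to have been trained offline on the fist/palm database.
fist_cascade = cv2.CascadeClassifier("fist_cascade.xml")
palm_cascade = cv2.CascadeClassifier("palm_cascade.xml")

def hand_gesture(gray_frame):
    """Return 'fist', 'palm' or 'none' for one grayscale frame."""
    if len(fist_cascade.detectMultiScale(gray_frame, scaleFactor=1.1, minNeighbors=5)) > 0:
        return "fist"
    if len(palm_cascade.detectMultiScale(gray_frame, scaleFactor=1.1, minNeighbors=5)) > 0:
        return "palm"
    return "none"

# --- Module 3: fusion of the two modalities with a decision tree ---
EYE = {"left": 0, "right": 1, "up": 2, "down": 3, "center": 4}
HAND = {"none": 0, "fist": 1, "palm": 2}

# Toy training pairs (eye gesture, hand gesture) -> device command id.
X_train = [[EYE["left"], HAND["palm"]], [EYE["right"], HAND["palm"]],
           [EYE["center"], HAND["fist"]], [EYE["up"], HAND["none"]]]
y_train = [0, 1, 2, 3]  # e.g. previous, next, select, scroll up
fusion_tree = DecisionTreeClassifier().fit(X_train, y_train)

def command_from_frame(frame, iris_offset):
    """Combine one camera frame and a pre-computed iris offset into a command id."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eye = eye_gesture(*iris_offset)
    hand = hand_gesture(gray)
    return int(fusion_tree.predict([[EYE[eye], HAND[hand]]])[0])

In the actual system, the iris offset would come from eye-center localization on the front-camera stream (for example after Viola-Jones face and eye detection), and the command set would correspond to the controlled mobile application.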


Author Biography

Ali Wali, REGIM-Lab.

Associate Professor of Signal and Image Processing at ISIM, University of Sfax. He received his Ph.D. in Computer Systems Engineering from the National School of Engineers of Sfax in 2013. He is a member of the REsearch Groups on Intelligent Machines (REGIM). His research interests include computer vision and image and video analysis, centered on video event detection and pattern recognition. He is a Graduate Member of IEEE. He was a member of the organizing committees of the International Conference on Machine Intelligence ACIDCA-ICMI2005, the Third IEEE International Conference on Next Generation Networks and Services NGNS2011, the 4th International Conference on Logistics LOGISTIQUA2011, the International Conference on Advanced Logistics and Transport (ICALT'2013, ICALT'2014, ICALT'2015), the 13th International Conference on Document Analysis and Recognition (ICDAR 2015), the International Conference on Hybrid Intelligent Systems (HIS'2013), and the International Conference on Information Assurance and Security (IAS'2013).

References

Nagamatsu T, Yamamoto M, Sato H. (2010). MobiGaze: Development of a gaze interface for handheld mobile devices. In Proceedings of Extended Abstracts on Human Factors in Computing Systems, pp. 3349-3354.

Samadi M R H, Cooke N. (2014). EEG signal processing for eye tracking. In Proceedings 22nd European Signal Processing Conference. pp. 2030-2034

Dhuliawala M, Lee J, Shimizu J, Bulling A, Kunze K, Starner T, Woo W. (2016). Smooth eye movement interaction using EOG glasses. In Proceedings of the 18th ACM International Conference on Multimodal Interaction, 307-311. doi: 10.1145/2993148.2993181.

Bulling A, Ward JA, Gellersen H, Troster G. (2011). Eye movement analysis for activity recognition using electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(4): 741-753.

TheEyeTribe [http://www.theeyetribe.com/]

Chen P, Wang P, Wang J, Yao Y. (2017). Design and motion tracking of a strip glove based on machine vision. Neurocomputing. doi: https://doi.org/10.1016/j.neucom.2017.03.098.

Dipietro L, Sabatini A M, Dario P. (2008). A survey of glove-based systems and their applications. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 38(4): 461-482. doi: 10.1109/TSMCC.2008.923862

Wood E, Bulling A. (2014). EyeTab: Model-based gaze estimation on unmodified tablet computers. In Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 207-210.

Pino C, Kavasidis I. (2012). Improving mobile device interaction by eye tracking analysis. In Proceedings of the Federated Conference on Computer Science and Information Systems, pp. 1199-1202.

Huang Q, Veeraraghavan A, Sabharwal A. (2017). TabletGaze: dataset and analysis for unconstrained appearance-based gaze estimation in mobile tablets. Machine Vision and Applications, 28(5): 445-461. doi: 10.1007/s00138-017-0852-4

Miluzzo E, Wang T, Campbell A T. (2010). EyePhone: activating mobile phones with your eyes. In Proceedings of the Second Workshop on Networking, Systems, and Applications on Mobile Handhelds, pp. 15-20.

Iqbal N, Lee H, Lee S Y. (2013). Smart user interface for mobile consumer devices using model-based eye-gaze estimation. IEEE Transactions on Consumer Electronics, 59(1): 161-166.

Vaitukaitis V, Bulling A. (2012). Eye gesture recognition on portable devices. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, pp. 711-714.

Zhang X, Kulkarni H, Morris M R. (2017). Smartphone-based gaze gesture communication for people with motor disabilities. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 2878-2889.

Elleuch, H., Wali, A., Samet, A., Alimi, A.M. (2016). A real-time eye gesture recognition system based on fuzzy inference system for mobile devices monitoring. In Proceedings of International Conference on Advanced Concepts for Intelligent Vision Systems, pp. 172-180.

Meng X, Cheung C M, Ho K L, Lui K S, Lam E Y, Tam V. (2012). Building smart cameras on mobile tablets for hand gesture recognition. In Proceedings of the Sixth International Conference on Distributed Smart Cameras, pp. 1-5.

Prasuhn L, Oyamada Y, Mochizuki Y, Ishikawa H. (2014). A HOG-based hand gesture recognition system on a mobile device. In Proceedings of the IEEE International Conference on Image Processing, pp. 3973-3977.

Dalal N, Triggs B. (2005). Histograms of oriented gradients for human detection, IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1:886-893

Song J, Soros G, Pece F, Fanello S R, Izadi S, Keskin C, Hilliges O. (2014). In-air gestures around unmodified mobile devices. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, pp. 319-329.

Elleuch, H., Wali, A., Samet, A., Alimi, A.M. (2015). A static hand gesture recognition system for real time mobile device monitoring. In Proceedings of the 15th International Conference on Intelligent Systems Design and Applications (ISDA), pp. 195-200.

Chatterjee I, Xiao R, Harrison C. (2015). Gaze+Gesture: Expressive, precise and targeted free-space interactions. In Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, pp. 131-138.

Hales J, Rozado D, Mardanbegi D. (2013). Interacting with objects in the environment by gaze and hand gestures. In Proceedings of the 3rd International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction, pp. 1-9.

Pouke M, Karhu A, Hickey S, Arhippainen L. (2012). Gaze tracking and non-touch gesture based interaction method for mobile 3D virtual spaces. In Proceedings of the 24th Australian Computer-Human Interaction Conference, pp. 505-512.

Yoo B, Han J J, Choi C, Yi K, Suh S, Park D, Kim C. (2010). 3D user interface combining gaze and hand gestures for large-scale display. In Proceedings of Extended Abstracts on Human Factors in Computing Systems, pp. 3709-3714.

Viola P, Jones M J. (2004). Robust real-time face detection. International Journal of Computer Vision, 57(2): 137-154.

Elleuch H, Wali A, Alimi A M. (2014). Smart tablet monitoring by a real-time head movement and eye gestures recognition system. In Proceedings of International Conference on Future Internet of Things and Cloud (FiCloud); pp. 393-398.

OpenCV [http://www.opencv.org/]

Gonzalez-Ortega D, Diaz-Pernas F, Martinez-Zarzuela M, Anton-Rodriguez M, Diez-Higuera J, Boto-Giralda D. (2010). Real-time hands, face and facial features detection and tracking: Application to cognitive rehabilitation tests monitoring. Journal of Network and Computer Applications. 33(4): 447-466

REHG [http://www.regim.org/publications/databases/rehg/]

MIT [ http://web.mit.edu/torralba/www/indoor.html]

Memo A, Zanuttigh P. (2018). Head-mounted gesture controlled interface for human-computer interaction. Multimedia Tools and Applications. 77(1): 27-53. doi: 10.1007/s11042-016-4223-3.

Everingham M, Van Gool L, Williams C K, Winn J, Zisserman A. (2010). The PASCAL Visual Object Classes (VOC) challenge. International Journal of Computer Vision, 88(2): 303-338.

NUS [https://www.ece.nus.edu.sg/stfpage/elepv/NUS-HandSet/]

Pisharady P K, Vadakkepat P, Loh A P. (2013). Attention based detection and recognition of hand postures against complex backgrounds. International Journal of Computer Vision, 101(3): 403-419.

Elleuch H, Wali A, Alimi M A, Samet A. (2017). Interacting with mobile devices by fusion eye and hand gestures recognition systems based on decision tree approach. In Ninth International Conference on Machine Vision, pp. 103-410.

Dumas B, Lalanne D, Oviatt S. (2009). Multimodal interfaces: A survey of principles, models and frameworks. Human Machine Interaction, LNCS 5440: 3-26. doi: https://doi.org/10.1007/978-3-642-00437-7_1

FuzzyLite [http://www.fuzzylite.com/]

Skodras E, Fakotakis N. (2015). Precise localization of eye centers in low resolution color images. Image and Vision Computing. 36: 51-60

Published

2019-12-02

How to Cite

Elleuch, H., & Wali, A. (2019). UNWEARABLE MULTI-MODAL GESTURES RECOGNITION SYSTEM FOR INTERACTION WITH MOBILE DEVICES IN UNEXPECTED SITUATIONS. IIUM Engineering Journal, 20(2), 142–162. https://doi.org/10.31436/iiumej.v20i2.1000

Issue

Vol. 20 No. 2 (2019)

Section

Engineering Mathematics and Applied Science