
User-Defined Hand Gesture Interface to Improve User Experience of Learning American Sign Language

  • Conference paper
  • In: Augmented Intelligence and Intelligent Tutoring Systems (ITS 2023)

Abstract

Sign language enables effective communication between hearing and deaf people. Despite years of extensive pedagogical research, however, sign language remains difficult to learn: most current systems rely heavily on online learning resources and presume that users will access them regularly, an approach that can feel monotonous and repetitive. Gamification has recently been proposed as a remedy, but that line of research focuses on game design rather than user experience design. In this work, we present a system for user-defined interaction for learning static American Sign Language (ASL). It supports gesture recognition for user experience design and enables users to learn actively, by devising their own gestures, rather than passively absorbing knowledge. Early findings from a questionnaire-based survey show that users are more motivated to learn static ASL through user-defined interactions.
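This preview does not include implementation details, but the core mechanism the abstract describes — recognizing a static gesture that the user has defined, rather than one drawn from a fixed vocabulary — can be sketched as a nearest-neighbour match over normalized hand-landmark positions. The sketch below assumes MediaPipe Hands' 21-point landmark convention (MediaPipe appears in the paper's toolchain); the template-matching approach, class, and function names are illustrative assumptions, not the authors' method:

```python
import numpy as np

def normalize(landmarks):
    """Translate so the wrist (landmark 0) sits at the origin, then
    scale so the hand's overall extent is 1 — this makes templates
    invariant to hand position and distance from the camera."""
    pts = np.asarray(landmarks, dtype=float)
    pts = pts - pts[0]                      # wrist at origin
    scale = np.linalg.norm(pts, axis=1).max()
    return pts / scale if scale > 0 else pts

class GestureTemplates:
    """Stores one normalized landmark template per user-defined
    gesture and classifies new frames by nearest neighbour."""

    def __init__(self):
        self.templates = {}                 # label -> (21, 2) array

    def define(self, label, landmarks):
        """Record the user's own gesture for this label."""
        self.templates[label] = normalize(landmarks)

    def recognize(self, landmarks, threshold=0.5):
        """Return the closest stored label, or None if nothing
        is within the distance threshold."""
        probe = normalize(landmarks)
        best_label, best_dist = None, float("inf")
        for label, tpl in self.templates.items():
            d = np.linalg.norm(probe - tpl)
            if d < best_dist:
                best_label, best_dist = label, d
        return best_label if best_dist <= threshold else None
```

In a full pipeline, the landmark arrays would come from a per-frame hand tracker; the point of the user-defined step is that `define` is called with whatever pose the learner chooses, so recognition adapts to the user instead of the reverse.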




Author information

Correspondence to Jindi Wang.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Wang, J., Ivrissimtzis, I., Li, Z., Zhou, Y., Shi, L. (2023). User-Defined Hand Gesture Interface to Improve User Experience of Learning American Sign Language. In: Frasson, C., Mylonas, P., Troussas, C. (eds) Augmented Intelligence and Intelligent Tutoring Systems. ITS 2023. Lecture Notes in Computer Science, vol 13891. Springer, Cham. https://doi.org/10.1007/978-3-031-32883-1_43

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-32883-1_43

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-32882-4

  • Online ISBN: 978-3-031-32883-1

  • eBook Packages: Computer Science, Computer Science (R0)
