DOI: 10.1145/2070942.2070969

E-Gesture: a collaborative architecture for energy-efficient gesture recognition with hand-worn sensor and mobile devices

Published: 01 November 2011

ABSTRACT

Gesture is a promising mobile user interface modality that enables eyes-free interaction without stopping or impeding movement. In this paper, we present the design, implementation, and evaluation of E-Gesture, an energy-efficient gesture recognition system using a hand-worn sensor device and a smartphone. E-Gesture employs a novel gesture recognition architecture carefully crafted by studying the sporadic occurrence patterns of gestures in continuous sensor data streams and analyzing the energy consumption characteristics of both sensors and smartphones. We developed a closed-loop collaborative segmentation architecture that can (1) be implemented in resource-scarce sensor devices, (2) adaptively turn off power-hungry motion sensors without compromising recognition accuracy, and (3) reduce false segmentations generated by dynamic changes of body movement. We also developed a mobile gesture classification architecture for smartphones that enables HMM-based classification models to better fit multiple mobility situations.
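
To make the closed-loop collaboration concrete, the sketch below outlines one way a sensor-side segmenter and a phone-side classifier could cooperate: a cheap always-on reading gates the power-hungry motion sensor, candidate segments are shipped to the phone, and classifier rejections feed back to tighten the segmentation threshold. This is a minimal illustration under our own assumptions, not the paper's implementation; the class and function names (SensorSideSegmenter, phone_side_classify), the thresholds, and the synthetic trace are all hypothetical.

```python
# Minimal sketch (not from the paper) of a closed-loop collaborative
# segmentation policy between a hand-worn sensor and a smartphone.
from collections import deque
from dataclasses import dataclass, field


@dataclass
class SensorSideSegmenter:
    """Sensor-side stage: gates the power-hungry motion sensor behind a cheap
    activity check and emits candidate gesture segments (illustrative only)."""
    wake_threshold: float = 1.5        # assumed activity level that powers the motion sensor on
    idle_samples_to_sleep: int = 25    # assumed quiet samples before powering it off again
    window: deque = field(default_factory=lambda: deque(maxlen=50))
    sensor_on: bool = False
    _quiet: int = 0

    def feed(self, accel_mag: float, gyro_mag: float):
        """Consume one sample; return a candidate segment (list of samples) or None."""
        if not self.sensor_on:
            # Power-hungry sensor modeled as off; only the cheap reading is consulted.
            if accel_mag > self.wake_threshold:
                self.sensor_on = True      # candidate gesture: power the sensor on
                self.window.clear()
            return None

        self.window.append((accel_mag, gyro_mag))
        if accel_mag < self.wake_threshold:
            self._quiet += 1
            if self._quiet >= self.idle_samples_to_sleep:
                self.sensor_on = False     # motion ended: power the sensor off
                self._quiet = 0
                segment = list(self.window)
                self.window.clear()
                return segment             # ship the candidate segment to the phone
        else:
            self._quiet = 0
        return None

    def feedback(self, accepted: bool) -> None:
        """Closed-loop feedback from the phone: a rejected segment nudges the wake
        threshold up, cutting false segmentations caused by body movement."""
        self.wake_threshold *= 0.98 if accepted else 1.05


def phone_side_classify(segment) -> bool:
    """Stand-in for the smartphone's HMM-based classifier; here it only rejects
    very short segments as non-gestures."""
    return len(segment) >= 10


if __name__ == "__main__":
    # Tiny synthetic trace: quiet, a burst of motion, then quiet again.
    trace = [(0.2, 0.0)] * 5 + [(2.5, 3.0)] * 15 + [(0.2, 0.1)] * 30
    segmenter = SensorSideSegmenter()
    for accel_mag, gyro_mag in trace:
        segment = segmenter.feed(accel_mag, gyro_mag)
        if segment is not None:
            accepted = phone_side_classify(segment)   # would run on the smartphone
            segmenter.feedback(accepted)              # close the loop back to the sensor
            print(f"segment of {len(segment)} samples, accepted={accepted}")
```

In the actual system the phone-side stage would run HMM-based classification models adapted to the user's mobility situation; the placeholder above only indicates where that decision would feed back into the sensor-side loop.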

Supplemental Material

human_sensing_2.mp4 (MP4, 155.4 MB)


      • Published in

        SenSys '11: Proceedings of the 9th ACM Conference on Embedded Networked Sensor Systems
        November 2011
        452 pages
        ISBN: 9781450307185
        DOI: 10.1145/2070942

        Copyright © 2011 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        • Published: 1 November 2011


        Qualifiers

        • research-article

        Acceptance Rates

        Overall Acceptance Rate: 174 of 867 submissions, 20%
