ABSTRACT
Research on hands-free input methods has been actively conducted. However, most previous methods are difficult to use at any time in daily life because they rely on speech sounds or body movements. In this study, to realize a hands-free input method based on nasal breathing with wearable devices, we propose a method for recognizing nasal breath gestures using piezoelectric elements placed on the nosepiece of a glasses-type device. In the proposed method, nasal vibrations generated by breathing are acquired as sound data from the piezoelectric elements. The breath pattern is then recognized based on three factors: breath count, time interval, and intensity. We implemented a prototype system for an initial evaluation. The evaluation results for eight subjects showed that the proposed method can recognize eight types of nasal breath gestures with an F value of 0.89. Our study provides the first wearable sensing technology that uses nasal breathing for hands-free input.
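The recognition step sketched in the abstract (describing each input by breath count, time interval, and intensity) can be illustrated with a short example. The Python sketch below is a hypothetical illustration, not the authors' implementation: it assumes a simple short-time RMS envelope over the piezo signal and illustrative constants (`FS`, `FRAME`, `ON_THRESH`) that do not come from the paper.

```python
# Minimal sketch (illustrative only): detect nasal-breath events in a
# vibration/sound signal and describe them by the three factors named in
# the abstract -- breath count, time interval, and intensity.
import numpy as np

FS = 8000            # assumed sampling rate of the piezo signal [Hz]
FRAME = 256          # frame length for the amplitude envelope [samples]
ON_THRESH = 0.05     # assumed envelope threshold separating breath from silence


def envelope(signal: np.ndarray) -> np.ndarray:
    """Short-time RMS envelope of the raw piezo signal."""
    n = len(signal) // FRAME
    frames = signal[: n * FRAME].reshape(n, FRAME)
    return np.sqrt((frames ** 2).mean(axis=1))


def breath_features(signal: np.ndarray):
    """Return (breath count, onset intervals in seconds, mean intensity)."""
    env = envelope(signal)
    active = env > ON_THRESH
    # Rising edges of the active mask mark the onset of each breath event.
    onsets = np.flatnonzero(np.diff(active.astype(int)) == 1)
    count = len(onsets)
    intervals = np.diff(onsets) * FRAME / FS
    intensity = float(env[active].mean()) if active.any() else 0.0
    return count, intervals, intensity


if __name__ == "__main__":
    # Synthetic example: two short exhalations, the second one stronger.
    t = np.arange(0, 2.0, 1 / FS)
    gate = 0.1 * ((t > 0.2) & (t < 0.5)) + 0.3 * ((t > 1.2) & (t < 1.5))
    sig = gate * np.sin(2 * np.pi * 200 * t)
    print(breath_features(sig))
```

A gesture classifier along these lines could then map the extracted (count, interval, intensity) tuple to one of the eight gesture classes, e.g. with simple rules or a lightweight learned model; the paper's actual recognition procedure may differ.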