
A Touchless Gestural Platform for the Interaction with the Patients Data

  • Conference paper
  • First Online:
XIV Mediterranean Conference on Medical and Biological Engineering and Computing 2016

Part of the book series: IFMBE Proceedings ((IFMBE,volume 57))

Abstract

During surgery, an assistant or a nurse usually operates the mouse and keyboard on behalf of the surgeon, and this indirect manipulation can lead to image misinterpretation, communication problems, or misunderstandings. More recently, the spread of tablets and touchscreens in hospitals has represented a step forward. However, in contexts that require an absolutely sterile environment, such as the operating room, the touchscreen is not a definitive solution. The use of touchless technology in a medical context is therefore motivated by the need for aseptic interaction with computer systems, and it offers the additional advantages of simplicity and intuitiveness of use. The system presented in this paper uses the Microsoft Kinect as the input sensor for detecting the user’s hand movements. The idea is to create an interaction modality that allows doctors to interact with the patient’s data without touching any device, simply by moving a hand in free space. The interaction is based on the movements of a single hand, and specific operations are associated with corresponding gestures. The system allows the user to browse a list of patients and select one of them, consult the patient’s data, display the medical images, and interact with them by translation and zooming in/out in order to highlight specific details of the image.
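The paper does not detail the gesture recognition beyond the mapping of single-hand movements to operations (browsing the patient list, selecting a patient, translating and zooming the medical images). The following Python sketch only illustrates how such a mapping could be organised; the HandFrame structure, the thresholds, and the action names are assumptions introduced here for illustration, not the authors’ implementation, which is built on the Kinect sensor and its middleware.

```python
# Minimal illustrative sketch (not the authors' implementation): maps a stream of
# single-hand positions from a Kinect-style tracker to the operations described
# in the abstract. HandFrame, the thresholds, and the action names are assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class HandFrame:
    x: float  # horizontal hand position (m), sensor coordinates
    y: float  # vertical hand position (m)
    z: float  # distance from the sensor (m)


class GestureInterpreter:
    """Turns single-hand positions into high-level UI actions."""

    SWIPE_DX = 0.25  # horizontal travel (m) interpreted as a swipe
    PUSH_DZ = 0.15   # fast forward travel (m) interpreted as a push/selection
    ZOOM_DZ = 0.05   # slower per-frame depth change (m) mapped to zoom steps

    def __init__(self) -> None:
        self.prev: Optional[HandFrame] = None
        self.swipe_origin: Optional[HandFrame] = None

    def update(self, frame: HandFrame) -> Optional[str]:
        """Return 'next_patient', 'prev_patient', 'select', 'zoom_in',
        'zoom_out', or None for the current frame."""
        action = None
        if self.prev is not None:
            dz = frame.z - self.prev.z
            if dz < -self.PUSH_DZ:        # hand pushed quickly towards the sensor
                action = "select"
            elif abs(dz) > self.ZOOM_DZ:  # slower depth change -> zoom in/out
                action = "zoom_in" if dz < 0 else "zoom_out"
        if self.swipe_origin is None:
            self.swipe_origin = frame
        elif action is None:
            dx = frame.x - self.swipe_origin.x
            if dx > self.SWIPE_DX:
                action, self.swipe_origin = "next_patient", frame
            elif dx < -self.SWIPE_DX:
                action, self.swipe_origin = "prev_patient", frame
        self.prev = frame
        return action


if __name__ == "__main__":
    # Simulated hand moving right (browse) and then pushing forward (select).
    frames = [HandFrame(0.0, 0.0, 1.0), HandFrame(0.3, 0.0, 1.0), HandFrame(0.3, 0.0, 0.8)]
    interpreter = GestureInterpreter()
    for f in frames:
        act = interpreter.update(f)
        if act:
            print(act)  # -> next_patient, select
```

In a real deployment the hand positions would come from the Kinect skeleton tracker and would require smoothing and per-room calibration of the thresholds.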

The original version of this chapter was inadvertently published with an incorrect chapter pagination (874–878) and DOI (10.1007/978-3-319-32703-7_171). The page range and the DOI have been re-assigned: the correct page range is 880–884 and the DOI is 10.1007/978-3-319-32703-7_172. The erratum to this chapter is available at DOI: 10.1007/978-3-319-32703-7_260




Author information

Correspondence to Lucio Tommaso De Paolis.

Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

De Paolis, L.T. (2016). A Touchless Gestural Platform for the Interaction with the Patients Data. In: Kyriacou, E., Christofides, S., Pattichis, C. (eds) XIV Mediterranean Conference on Medical and Biological Engineering and Computing 2016. IFMBE Proceedings, vol 57. Springer, Cham. https://doi.org/10.1007/978-3-319-32703-7_172

  • DOI: https://doi.org/10.1007/978-3-319-32703-7_172

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-32701-3

  • Online ISBN: 978-3-319-32703-7

  • eBook Packages: Engineering, Engineering (R0)
