
Comparison of gesture and conventional interaction techniques for interventional neuroradiology

  • Original Article
  • Published in: International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Interaction with radiological image data and volume renderings within a sterile environment is a challenging task. Clinically established methods such as joystick control and task delegation can be time-consuming, error-prone, and disruptive to the workflow. New touchless input modalities may overcome these limitations, but their value compared to established methods is unclear.

Methods

We present a comparative evaluation to analyze the value of two gesture input modalities (Myo Gesture Control Armband and Leap Motion Controller) versus two clinically established methods (task delegation and joystick control). A user study was conducted with ten experienced radiologists by simulating a diagnostic neuroradiological vascular treatment with two frequently used interaction tasks in an experimental operating room. The input modalities were assessed using task completion time, perceived task difficulty, and subjective workload.
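To illustrate how per-modality measures such as task completion time could be summarized in a study of this kind, the sketch below aggregates timing samples for the four input modalities. The timing values are hypothetical placeholders, not the study's actual data.

```python
from statistics import mean, stdev

# Hypothetical task-completion-time samples (seconds) per input modality.
# Illustrative values only -- not measurements from the study.
completion_times = {
    "task delegation": [12.1, 10.8, 11.5, 13.0],
    "joystick control": [15.4, 14.9, 16.2, 15.0],
    "Myo armband": [13.2, 12.7, 14.1, 13.8],
    "Leap Motion": [17.5, 16.9, 18.2, 17.1],
}

# Report mean and standard deviation for each modality.
for modality, times in completion_times.items():
    print(f"{modality}: mean={mean(times):.1f}s, sd={stdev(times):.1f}s")
```

The same aggregation would apply to perceived task difficulty and subjective workload scores (e.g., NASA-TLX ratings), keyed by modality in the same way.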

Results

Overall, the clinically established method of task delegation performed best under the study conditions. In general, gesture control did not outperform the clinically established input approaches. However, the Myo Gesture Control Armband showed potential for the simple image selection task.

Conclusion

Novel input modalities have the potential to take over single tasks more efficiently than clinically established methods. The results of our user study show the influence of task characteristics such as task complexity on performance with specific input modalities. Accordingly, future work should take task characteristics into account to provide a useful gesture interface for a specific use case instead of an all-in-one solution.




Funding

This work is partially funded by the Federal Ministry of Education and Research (BMBF) within the STIMULATE research campus (Grant number 13GW0095A).

Author information

Corresponding author

Correspondence to Julian Hettig.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical standard

For this type of study, formal consent is not required.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Rights and permissions

Reprints and permissions

About this article


Cite this article

Hettig, J., Saalfeld, P., Luz, M. et al. Comparison of gesture and conventional interaction techniques for interventional neuroradiology. Int J CARS 12, 1643–1653 (2017). https://doi.org/10.1007/s11548-017-1523-7

