
A natural interface based on intention prediction for semi-autonomous micromanipulation

  • Original Paper
  • Published:
Journal on Multimodal User Interfaces

Abstract

Manipulation at micro and nano scales is a particular case of remote handling. Although novel robotic approaches have emerged, these tools are not yet widely adopted because of their inherent complexity and their lack of user-friendly interfaces. To fill this gap, this work first introduces a novel paradigm dubbed semi-autonomous manipulation. Its aim is to combine full automation and user-driven manipulation by sequencing simple automated elementary tasks according to user instructions. To acquire these instructions in a more natural and intuitive way, we propose a “metaphor-free” user interface implemented in a virtual reality environment. A predictive intention extraction technique is introduced, based on a computational model inspired by cognitive science and implemented using a Kinect depth sensor. In a semi-autonomous pick-and-place operation, this model is compared, in terms of naturalness and intuitiveness, to a gesture recognition technique for detecting user actions. It yields an improvement in user performance, both in task duration and in task success, and a qualitative preference for the proposed approach as evaluated by a user survey. The proposed technique may be a worthy alternative to manual operation on a basic keyboard/joystick setup, or even an interesting complement to the use of a haptic feedback arm.
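
To give a concrete sense of the kind of intention prediction the abstract describes, the sketch below infers which candidate object a user is reaching for from recent hand motion captured by a depth sensor. It is an illustrative assumption only, not the paper’s actual model: the function names, the rolling-velocity window, the confidence threshold, and the alignment-and-distance heuristic are all hypothetical choices made for this example.

```python
import numpy as np

def predict_target(hand_positions, candidates, history=5, threshold=0.6):
    """Return the index of the most likely target object, or None.

    hand_positions : (T, 3) array of hand positions over time (e.g. from a depth sensor)
    candidates     : (K, 3) array of candidate object positions in the workspace
    """
    hand_positions = np.asarray(hand_positions, dtype=float)
    candidates = np.asarray(candidates, dtype=float)
    if len(hand_positions) < history + 1:
        return None  # not enough motion observed yet

    recent = hand_positions[-(history + 1):]
    velocity = np.diff(recent, axis=0).mean(axis=0)   # average recent velocity
    speed = np.linalg.norm(velocity)
    if speed < 1e-6:
        return None  # hand is (nearly) still, no directional cue

    heading = velocity / speed
    to_targets = candidates - recent[-1]              # vectors from hand to each candidate
    dists = np.linalg.norm(to_targets, axis=1)
    directions = to_targets / dists[:, None]

    # Score each candidate by how well the hand's heading points at it,
    # softly favouring nearer objects, then normalise into pseudo-probabilities.
    alignment = np.clip(directions @ heading, 0.0, None)
    scores = alignment / (1.0 + dists)
    if scores.sum() < 1e-9:
        return None
    probs = scores / scores.sum()

    best = int(np.argmax(probs))
    return best if probs[best] >= threshold else None
```

In use, such a predictor would be polled on every new sensor frame; once it commits to a target, the semi-autonomous controller could launch the corresponding automated elementary task (e.g. pick or place) without an explicit gesture command.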






Funding

Funding was provided by the French government research program “Investissements d’avenir” through SMART Laboratory of Excellence (Grant No. ANR-11-LABX-65) and Robotex Equipment of Excellence (Grant No. ANR-10-EQPX-44).

Author information

Corresponding author

Correspondence to Sinan Haliyo.


Cite this article

Cohen, L., Chetouani, M., Régnier, S. et al. A natural interface based on intention prediction for semi-autonomous micromanipulation. J Multimodal User Interfaces 12, 17–30 (2018). https://doi.org/10.1007/s12193-018-0259-1

