Abstract
Manipulation at micro and nano scales is a particular case of remote handling. Although novel robotic approaches continue to emerge, these tools are not yet widely adopted, owing to their inherent complexity and their lack of user-friendly interfaces. To fill this gap, this work first introduces a paradigm dubbed semi-autonomous manipulation. Its aim is to combine full automation with user-driven operation by sequencing simple automated elementary tasks according to user instructions. To acquire these instructions in a more natural and intuitive way, we propose a “metaphor-free” user interface implemented in a virtual reality environment. A predictive intention-extraction technique is introduced through a computational model inspired by cognitive science and implemented with a Kinect depth sensor. In a semi-autonomous pick-and-place operation, the model is compared, in terms of naturalness and intuitiveness, with a gesture-recognition technique for detecting user actions. It yields improved user performance in both task duration and success rate, and a user survey shows a qualitative preference for the proposed approach. The proposed technique may be a worthy alternative to manual operation with a basic keyboard/joystick setup, or even an interesting complement to a haptic feedback arm.
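The core idea of predictive intention extraction — inferring which object a user intends to act on from the kinematics of their hand movement, before the movement completes — can be illustrated with a minimal sketch. This is not the authors' computational model; it is a simplified, hypothetical example in which the intended target is scored by how well the hand's instantaneous velocity direction aligns with the direction to each candidate object (function and variable names are illustrative assumptions):

```python
import math

def predict_target(trajectory, targets):
    """Guess the intended target from hand kinematics.

    trajectory: list of (x, y) hand positions sampled over time.
    targets: dict mapping object name -> (x, y) position.
    Returns the name of the candidate whose direction best aligns
    with the current velocity vector, or None if the hand is still.
    """
    (x0, y0), (x1, y1) = trajectory[-2], trajectory[-1]
    vx, vy = x1 - x0, y1 - y0
    speed = math.hypot(vx, vy)
    if speed == 0:
        return None  # no movement yet, no prediction possible
    best, best_score = None, -2.0
    for name, (tx, ty) in targets.items():
        dx, dy = tx - x1, ty - y1
        dist = math.hypot(dx, dy)
        if dist == 0:
            return name  # hand already at this target
        # cosine similarity between velocity and target direction
        score = (vx * dx + vy * dy) / (speed * dist)
        if score > best_score:
            best, best_score = name, score
    return best

# Hand moving rightward, roughly toward object "B"
traj = [(0.0, 0.0), (0.1, 0.01), (0.2, 0.02)]
objs = {"A": (0.0, 1.0), "B": (1.0, 0.1)}
print(predict_target(traj, objs))  # → B
```

A full model along the lines described in the abstract would additionally exploit velocity-profile regularities of human arm movements (e.g., the asymmetric bell-shaped profiles discussed in the references) rather than a single instantaneous direction.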
Funding
Funding was provided by the French government research program “Investissements d’avenir” through SMART Laboratory of Excellence (Grant No. ANR-11-LABX-65) and Robotex Equipment of Excellence (Grant No. ANR-10-EQPX-44).
Cite this article
Cohen, L., Chetouani, M., Régnier, S. et al. A natural interface based on intention prediction for semi-autonomous micromanipulation. J Multimodal User Interfaces 12, 17–30 (2018). https://doi.org/10.1007/s12193-018-0259-1