
Visual computation of egomotion using an image interpolation technique

  • Original Papers
  • Published in: Biological Cybernetics, 1996

Abstract

A novel technique is presented for computing the parameters of egomotion of a mobile device, such as a robot or a mechanical arm, equipped with two visual sensors, each of which captures a panoramic view of the environment. We show that the parameters of egomotion can be computed by interpolating the position of the image captured by one of the sensors at the robot's present location with respect to the images captured by the two sensors at the robot's previous location. The algorithm delivers the distance travelled and the angle rotated without explicit measurement or integration of velocity fields. The result is obtained in a single step, without iteration or successive approximation. Tests of the algorithm on real and synthetic images reveal an accuracy to within 5% of the actual motion. Implementation on a mobile robot shows that stepwise rotation and translation can be measured to within 10% accuracy in a three-dimensional world of unknown structure. The position and orientation of the robot at the end of a 30-step trajectory can be estimated with accuracies of 5% and 5°, respectively.
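
The single-step interpolation scheme described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical illustration, not the authors' implementation: it assumes grey-level panoramic views stored as NumPy arrays and a pure translation along the inter-sensor baseline. The current view is modelled as a linear blend of the two reference views, I_curr ≈ I_a + α(I_b − I_a), and solving for α in closed form gives the fraction of the baseline travelled, with no iteration. The function name `estimate_translation` and its parameters are illustrative; the paper's full algorithm also recovers rotation.

```python
import numpy as np

def estimate_translation(img_prev_a, img_prev_b, img_curr, baseline):
    """Single-step translation estimate by image interpolation.

    Models the current view as a linear blend of the two reference
    views captured at the previous location:
        img_curr ≈ img_prev_a + alpha * (img_prev_b - img_prev_a)
    and solves for alpha in the least-squares sense. alpha is the
    fraction of the inter-sensor baseline travelled, so alpha * baseline
    is the distance moved. Hypothetical sketch, not the paper's code.
    """
    d = (img_prev_b - img_prev_a).astype(float).ravel()
    r = (img_curr - img_prev_a).astype(float).ravel()
    alpha = (d @ r) / (d @ d)          # closed-form 1-D least squares
    return alpha * baseline            # distance travelled in one step

# Synthetic check: 1-D "panoramic" scans of a sinusoidal scene.
# A present view shifted by half the inter-sensor phase offset should
# yield alpha ≈ 0.5, i.e. half the baseline.
theta = np.linspace(0.0, 2.0 * np.pi, 360)
view_a = np.sin(3.0 * theta)            # sensor A, previous location
view_b = np.sin(3.0 * (theta - 0.02))   # sensor B, previous location
view_c = np.sin(3.0 * (theta - 0.01))   # sensor A, present location
print(estimate_translation(view_a, view_b, view_c, baseline=1.0))  # ~0.5
```

A least-squares blend coefficient is used here because it extracts the motion parameter directly from image grey levels, matching the abstract's claim of a single, non-iterative computation that needs no velocity-field measurement.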


About this article

Cite this article

Chahl, J.S., Srinivasan, M.V. Visual computation of egomotion using an image interpolation technique. Biol. Cybern. 74, 405–411 (1996). https://doi.org/10.1007/BF00206707


Keywords

Navigation