
Self-Localization of an Omnidirectional Mobile Robot Based on an Optical Flow Sensor


Abstract

An omnidirectional mobile robot has the advantage that its three degrees of freedom of motion in a 2D plane can be set independently, so it can move in arbitrary directions while maintaining the same heading. For self-localization with onboard sensors, omnidirectional robots, like car-like robots, often use dead reckoning, which estimates motion from wheel velocities measured by motor encoders. However, omnidirectional mobile robots slip easily because their omni-wheels consist of multiple free rollers, and dead reckoning fails if even one wheel loses contact with the ground. An odometry method unaffected by wheel slip is therefore needed to obtain high-quality self-location data for omnidirectional mobile robots. We describe a method for estimating robot ego-motion from camera images via optical flow calculation, i.e., the camera is used as a velocity sensor. In this paper, a silicon retina vision camera, which has a good dynamic range under various lighting conditions, is introduced as the mobile robot's sensor. A Field-Programmable Gate Array (FPGA) optical flow circuit for the silicon retina is also developed to measure the ego-motion of the mobile robot. The developed optical flow calculation system is installed on a small omnidirectional mobile robot, and evaluation experiments on the robot's ego-motion are carried out. In the experiments, the self-localization accuracy of the dead reckoning and optical flow methods is evaluated by comparison with motion capture data. The results show that the optical flow sensor provides more accurate position estimates than dead reckoning.
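To make the two odometry approaches concrete, the sketch below contrasts encoder-based dead reckoning for an omnidirectional base with ego-motion estimated from an optical flow field. It is a minimal illustration under stated assumptions, not the paper's implementation: the wheel layout (three omni-wheels at 120°), the parameters R, L, and meters_per_pixel, and the downward-facing camera geometry are all hypothetical choices for the example.

```python
import numpy as np

# Dead reckoning for a three-wheeled omnidirectional base.
# Assumed geometry (not from the paper): wheels mounted 120 deg apart
# at distance L from the body center, driven tangentially.
R = 0.03   # wheel radius [m] (assumed)
L = 0.10   # center-to-wheel distance [m] (assumed)

angles = np.deg2rad([90.0, 210.0, 330.0])   # wheel mounting angles
# J maps a body twist (vx, vy, omega) to the three wheel rim speeds.
J = np.array([[-np.sin(a), np.cos(a), L] for a in angles])
J_inv = np.linalg.inv(J)

def twist_from_encoders(wheel_omegas):
    """Body-frame velocity (vx, vy, omega) from encoder angular velocities."""
    rim_speeds = R * np.asarray(wheel_omegas, dtype=float)
    return J_inv @ rim_speeds

def integrate_pose(pose, twist, dt):
    """Euler-integrate the pose (x, y, theta) in the world frame."""
    x, y, th = pose
    vx, vy, w = twist
    # Rotate the body-frame velocity into the world frame before integrating.
    return (x + (vx * np.cos(th) - vy * np.sin(th)) * dt,
            y + (vx * np.sin(th) + vy * np.cos(th)) * dt,
            th + w * dt)

# Optical flow as a velocity sensor.
def twist_from_flow(flow, meters_per_pixel, dt):
    """Ground-plane velocity from a dense flow field (H x W x 2, pixels/frame).

    Assumes a downward-facing camera at fixed height, so pixel motion maps
    linearly to ground motion; meters_per_pixel would come from calibration
    and is an assumed parameter here. Averaging the field ignores rotation.
    """
    mean_flow = flow.reshape(-1, 2).mean(axis=0)   # pixels per frame
    return mean_flow * meters_per_pixel / dt       # meters per second

# Example: integrate one control step from encoder readings.
pose = (0.0, 0.0, 0.0)
pose = integrate_pose(pose, twist_from_encoders([5.0, -2.5, -2.5]), dt=0.02)
```

Note how the flow-based estimate never touches the wheel speeds: this is why, as the abstract argues, it remains valid when a wheel slips or lifts off the ground, whereas the encoder-based estimate silently drifts.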



Author information

Correspondence to Atsushi Sanada.


About this article

Cite this article

Sanada, A., Ishii, K. & Yagi, T. Self-Localization of an Omnidirectional Mobile Robot Based on an Optical Flow Sensor. J Bionic Eng 7 (Suppl 4), S172–S176 (2010). https://doi.org/10.1016/S1672-6529(09)60232-8


Keywords

Navigation