
A Hierarchical Vision Architecture for Robotic Manipulation Tasks

  • Conference paper
  • In: Computer Vision Systems (ICVS 1999)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1542)

Abstract

Real-world manipulation tasks vary in the precision they demand and in the number of degrees of freedom that must be controlled; in particular, the complexity may change over the course of a single task. For a robotic hand-eye system, precise tracking and control of the full pose is computationally expensive and less robust than rough tracking of a subset of the pose parameters (e.g. translation alone). We present an integrated vision and control system in which the vision component provides (1) continuous, local feedback at the complexity required for robot manipulation and (2) the discrete state information needed to switch between control modes of differing complexity.
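
As a rough illustration of the switching idea described in the abstract (a minimal sketch, not the authors' implementation), the Python fragment below shows a supervisor that alternates between a cheap translation-only mode and an expensive full-pose mode on discrete events reported by the vision system. All interfaces here (Mode, vision.translation_error(), robot.command_pose(), and so on) are hypothetical names assumed for the example.

    # Minimal sketch with assumed interfaces: the vision component supplies both
    # continuous feedback for the active mode and the discrete events that
    # trigger transitions between control modes of differing complexity.
    from enum import Enum, auto

    class Mode(Enum):
        TRANSPORT = auto()   # coarse: track and control translation only
        ALIGN = auto()       # precise: track and control the full pose

    def supervise(vision, robot):
        """Run the hand-eye loop, switching modes on discrete vision events."""
        mode = Mode.TRANSPORT
        while not vision.task_done():
            if mode is Mode.TRANSPORT:
                # cheap, robust feedback: translation error only
                robot.command_translation(vision.translation_error())
                if vision.near_target():       # discrete event: refine
                    mode = Mode.ALIGN
            else:
                # expensive, precise feedback: full pose error
                robot.command_pose(vision.pose_error())
                if vision.lost_track():        # discrete event: fall back
                    mode = Mode.TRANSPORT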



Copyright information

© 1999 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Dodds, Z., Jägersand, M., Hager, G., Toyama, K. (1999). A Hierarchical Vision Architecture for Robotic Manipulation Tasks. In: Computer Vision Systems. ICVS 1999. Lecture Notes in Computer Science, vol 1542. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-49256-9_19


  • DOI: https://doi.org/10.1007/3-540-49256-9_19

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-65459-9

  • Online ISBN: 978-3-540-49256-6

