Approximating Optimal Information Transmission using Local Hebbian Algorithms in a Double Feedback Loop

  • Conference paper
  • Conference: ICANN ’93 (ICANN 1993)

Abstract

Maximising mutual information (MI) under various constraints has been suggested as a goal for neural networks in a perceptual system. Networks using Hebbian algorithms have been found to be suitable for optimising MI with either input or output noise. In this paper we show that a double feedback loop network, using local Hebbian algorithms, can approximate the characteristics required for optimising MI with both input and output noise. This represents a better approximation than simply orthonormalising the principal subspace.
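The principal-subspace learning the abstract compares against can be illustrated with Oja's subspace rule, a standard local Hebbian update whose fixed points are orthonormal bases of the principal subspace. This is a minimal sketch only: the paper's double feedback loop architecture is not reproduced here, and the network size, learning rate, and toy input covariance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 4-D input with two dominant directions of variance.
C = np.diag([4.0, 2.0, 0.3, 0.1])
X = rng.standard_normal((5000, 4)) @ np.sqrt(C)

# Two linear output units; forward map y = W^T x.
W = 0.1 * rng.standard_normal((4, 2))
eta = 0.01
for _ in range(5):            # a few passes over the data
    for x in X:
        y = W.T @ x
        # Oja's subspace rule: Hebbian term minus a local decay term.
        W += eta * (np.outer(x, y) - W @ np.outer(y, y))

# After training, the columns of W are (approximately) an
# orthonormal basis of the 2-D principal subspace.
print(np.round(W.T @ W, 2))
```

Under this rule the weight columns orthonormalise but need not align with individual principal components, which is the sense in which orthonormalising the principal subspace is only a partial characterisation of the optimal filter.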





Copyright information

© 1993 Springer-Verlag London Limited

About this paper

Cite this paper

Plumbley, M.D. (1993). Approximating Optimal Information Transmission using Local Hebbian Algorithms in a Double Feedback Loop. In: Gielen, S., Kappen, B. (eds) ICANN ’93. ICANN 1993. Springer, London. https://doi.org/10.1007/978-1-4471-2063-6_105

  • DOI: https://doi.org/10.1007/978-1-4471-2063-6_105

  • Publisher Name: Springer, London

  • Print ISBN: 978-3-540-19839-0

  • Online ISBN: 978-1-4471-2063-6
