Mutual Information of Population Codes and Distance Measures in Probability Space

K. Kang and H. Sompolinsky
Phys. Rev. Lett. 86, 4958 – Published 21 May 2001

Abstract

We study the mutual information (MI) between a stimulus and a system of stochastic, statistically independent elements that respond to that stimulus. Using statistical-mechanical methods, we calculate the properties of the MI in the limit of large system size N. For continuous-valued stimuli, the MI increases logarithmically with N and is related to the logarithm of the Fisher information of the system. For discrete stimuli, the MI saturates exponentially with N: its deviation from the maximum decays exponentially, and we find that the exponent of saturation is the Chernoff distance between the response probabilities induced by different stimuli.
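The discrete-stimulus claim can be illustrated with a minimal numerical sketch (not the paper's calculation): take N statistically independent binary (Bernoulli) neurons whose firing probability is p0 or p1 depending on which of two equiprobable stimuli is shown, compute the MI exactly via the spike-count sufficient statistic, and compare the rate at which the MI approaches its maximum, log 2, with the Chernoff distance between the two single-neuron response distributions. The Bernoulli model, the values p0 = 0.2 and p1 = 0.8, and the grid minimization over the Chernoff exponent are all illustrative assumptions.

```python
import math

def chernoff_distance(p, q):
    """Chernoff distance between two Bernoulli(p), Bernoulli(q) distributions:
    C = -min_{0<l<1} log( p^l q^(1-l) + (1-p)^l (1-q)^(1-l) ).
    Minimized on a simple grid here (illustrative, not the paper's method)."""
    best = min(
        p ** l * q ** (1 - l) + (1 - p) ** l * (1 - q) ** (1 - l)
        for l in (i / 1000 for i in range(1, 1000))
    )
    return -math.log(best)

def binom_pmf(n, k, p):
    # Probability of k spikes out of n independent Bernoulli(p) neurons.
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def mutual_info(n, p0, p1):
    """Exact MI (in nats) between an equiprobable binary stimulus and n
    independent binary neurons firing with probability p0 or p1.
    The total spike count k is a sufficient statistic, so summing over k
    gives the same MI as summing over all 2^n response patterns."""
    mi = 0.0
    for k in range(n + 1):
        joint = (0.5 * binom_pmf(n, k, p0), 0.5 * binom_pmf(n, k, p1))
        marginal = sum(joint)
        for j in joint:
            if j > 0:
                # MI = sum P(s,k) log[ P(s,k) / (P(s) P(k)) ], P(s) = 1/2
                mi += j * math.log(j / (0.5 * marginal))
    return mi

p0, p1 = 0.2, 0.8
c = chernoff_distance(p0, p1)  # symmetric case: -log(2*sqrt(0.16)) ~ 0.223
for n in (10, 20, 40):
    deficit = math.log(2) - mutual_info(n, p0, p1)
    # The per-neuron saturation rate approaches c (up to 1/N corrections).
    print(n, -math.log(deficit) / n)
```

The printed rate overshoots the Chernoff distance at small N because of the subexponential prefactor in the deficit, and drifts down toward it as N grows.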

  • Received 11 January 2001

DOI: https://doi.org/10.1103/PhysRevLett.86.4958

©2001 American Physical Society

Authors & Affiliations

K. Kang and H. Sompolinsky

  • Racah Institute of Physics and Center for Neural Computation, Hebrew University, Jerusalem 91904, Israel

Issue

Vol. 86, Iss. 21 — 21 May 2001
