Cooperative dynamics in visual processing

H. Sompolinsky, D. Golomb, and D. Kleinfeld
Phys. Rev. A 43, 6990 – Published 1 June 1991

Abstract

An oscillator neural network model that is capable of processing local and global attributes of sensory input is proposed and analyzed. Local features in the input are encoded in the average firing rate of the neurons while the relationships between these features can modulate the temporal structure of the neuronal output. Neurons that share the same receptive field interact via relatively strong feedback connections, while neurons with different fields interact via specific, relatively weak connections. The model is studied in the context of processing visual stimuli that are coded for orientation. The effect of axonal propagation delays on synchronization of oscillatory activity is analyzed. We compare our theoretical results with recent experimental evidence on coherent oscillatory activity in the cat visual cortex. The computational capabilities of the model for performing discrimination and segmentation tasks are demonstrated. Coding and linking of visual features other than orientation are discussed.
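The connectivity scheme described in the abstract, strong feedback among neurons sharing a receptive field and weak, delayed connections between different fields, can be illustrated with a delayed Kuramoto-style phase-oscillator network. This is a deliberately simplified sketch, not the paper's full model: the function name, parameter values, and the choice to apply the delay only to inter-column coupling are illustrative assumptions.

```python
import numpy as np

def simulate_phase_oscillators(n_columns=2, n_per_column=10, j_strong=1.0,
                               j_weak=0.1, delay_steps=5, dt=0.01,
                               n_steps=5000, seed=0):
    """Euler-integrate delayed Kuramoto-type phase oscillators.

    Oscillators within a column (shared receptive field) couple strongly
    and instantaneously; oscillators in different columns couple weakly,
    with a fixed axonal propagation delay (an assumption of this sketch).
    """
    rng = np.random.default_rng(seed)
    n = n_columns * n_per_column
    column = np.repeat(np.arange(n_columns), n_per_column)
    # intrinsic frequencies: common drift plus small heterogeneity
    omega = 2 * np.pi + 0.1 * rng.standard_normal(n)
    # coupling matrix: strong within a column, weak between columns
    same = column[:, None] == column[None, :]
    k = np.where(same, j_strong, j_weak) / n_per_column
    np.fill_diagonal(k, 0.0)
    # circular buffer of past phases; hist[0] is delay_steps steps ago
    hist = np.tile(rng.uniform(0, 2 * np.pi, n), (delay_steps + 1, 1))
    theta = hist[-1].copy()
    for _ in range(n_steps):
        delayed = hist[0]
        # intra-column terms use current phases, inter-column use delayed ones
        phi = np.where(same, theta[None, :], delayed[None, :])
        dtheta = omega + (k * np.sin(phi - theta[:, None])).sum(axis=1)
        theta = theta + dt * dtheta
        hist = np.vstack([hist[1:], theta])
    # Kuramoto order parameter per column: 1 = fully phase-locked
    r = [np.abs(np.exp(1j * theta[column == c]).mean())
         for c in range(n_columns)]
    return theta, r
```

Running this with the default parameters, each column's oscillators phase-lock to one another (per-column order parameter near 1), while the weak delayed coupling sets the relative timing between columns, a toy analogue of the synchronization behavior analyzed in the paper.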

  • Received 6 February 1991

DOI:https://doi.org/10.1103/PhysRevA.43.6990

©1991 American Physical Society

Authors & Affiliations

H. Sompolinsky and D. Golomb

  • Racah Institute of Physics, Hebrew University, Jerusalem 91904, Israel
  • AT&T Bell Laboratories, Murray Hill, New Jersey 07974

D. Kleinfeld

  • AT&T Bell Laboratories, Murray Hill, New Jersey 07974

Issue

Vol. 43, Iss. 12 — June 1991
