
Vision Research

Volume 50, Issue 22, 28 October 2010, Pages 2200-2212

Encoding natural scenes with neural circuits with random thresholds

https://doi.org/10.1016/j.visres.2010.03.015
Under an Elsevier user license (open archive)

Abstract

We present a general framework for the reconstruction of natural video scenes encoded with a population of spiking neural circuits with random thresholds. The natural scenes are modeled as space-time functions that belong to a space of trigonometric polynomials. The visual encoding system consists of a bank of filters, modeling the visual receptive fields, in cascade with a population of neural circuits, modeling encoding in the early visual system. The neuron models considered include integrate-and-fire neurons and ON–OFF neuron pairs with threshold-and-fire spiking mechanisms. All thresholds are assumed to be random. We demonstrate that neural spiking is akin to taking noisy measurements of the stimulus, for both time-varying and space-time-varying stimuli. We formulate the reconstruction problem as the minimization of a suitable cost functional in a finite-dimensional vector space and provide an explicit algorithm for stimulus recovery. We also present a general solution using the theory of smoothing splines in Reproducing Kernel Hilbert Spaces. We provide examples for both synthetic video and natural scenes, and demonstrate that the quality of the reconstruction degrades gracefully as the threshold variability of the neurons increases.
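The encode/decode loop the abstract describes can be illustrated with a minimal one-dimensional sketch: a stimulus drawn from a space of trigonometric polynomials is encoded by an integrate-and-fire neuron whose threshold is redrawn at random after each spike, and the t-transform turns each interspike interval into a noisy linear measurement of the stimulus, from which the coefficients are recovered by regularized least squares. All numerical parameters here (bias `b`, threshold mean and standard deviation, the ridge weight `lam`) are illustrative assumptions, not the paper's values, and the ridge step merely stands in for the paper's RKHS smoothing-spline formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stimulus: a trigonometric polynomial of order L on [0, T] --------
L, T = 3, 1.0
W = 2.0 * np.pi / T                        # fundamental frequency
c_true = rng.normal(0.0, 0.3, 2 * L + 1)   # [a0, a1, b1, ..., aL, bL]

def basis(t):
    """The 2L+1 trigonometric basis functions evaluated at times t."""
    t = np.atleast_1d(t)
    cols = [np.ones_like(t)]
    for l in range(1, L + 1):
        cols += [np.cos(l * W * t), np.sin(l * W * t)]
    return np.stack(cols, axis=-1)

def u(t):
    return basis(t) @ c_true

# --- Integrate-and-fire encoder with random thresholds ----------------
# dv/dt = b + u(t); a spike fires when v reaches the current random
# threshold delta_k, then v resets to 0 and a fresh threshold is drawn.
def iaf_encode(b, d_mean=0.05, d_std=0.002, dt=1e-5):
    ts = np.arange(0.0, T, dt)
    drive = b + u(ts)                      # assumed positive throughout
    spikes, deltas = [], []
    v, delta = 0.0, rng.normal(d_mean, d_std)
    for t, dv in zip(ts, drive):
        v += dv * dt                       # forward-Euler integration
        if v >= delta:
            spikes.append(t)
            deltas.append(delta)
            v, delta = 0.0, rng.normal(d_mean, d_std)
    return np.array(spikes), np.array(deltas)

b = 5.0
spikes, deltas = iaf_encode(b)

# --- t-transform: interspike intervals are noisy linear measurements --
#   int_{t_k}^{t_{k+1}} u(t) dt = delta_{k+1} - b * (t_{k+1} - t_k)
def basis_integral(t0, t1):
    """Exact integrals of each basis function over [t0, t1]."""
    row = [t1 - t0]
    for l in range(1, L + 1):
        row.append((np.sin(l * W * t1) - np.sin(l * W * t0)) / (l * W))
        row.append((np.cos(l * W * t0) - np.cos(l * W * t1)) / (l * W))
    return row

Phi = np.array([basis_integral(s0, s1)
                for s0, s1 in zip(spikes[:-1], spikes[1:])])
q = deltas[1:] - b * np.diff(spikes)

# Ridge-regularized least squares in the finite-dimensional coefficient
# space (a stand-in for the paper's smoothing-spline solution)
lam = 1e-8
c_hat = np.linalg.solve(Phi.T @ Phi + lam * np.eye(2 * L + 1), Phi.T @ q)

grid = np.linspace(0.0, T, 500)
rel_err = (np.linalg.norm(basis(grid) @ c_hat - u(grid))
           / np.linalg.norm(u(grid)))
print(f"{len(spikes)} spikes, relative reconstruction error: {rel_err:.3f}")
```

Increasing `d_std` relative to `d_mean` raises the per-measurement noise, and the reconstruction error grows smoothly with it — a one-dimensional analogue of the graceful degradation reported for video stimuli.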

Keywords

Neural encoding of natural scenes
Receptive fields
Neural circuits with random thresholds
Reconstruction of visual stimuli
Reproducing Kernel Hilbert Spaces
Regularization
