Neural Networks

Volume 11, Issue 5, July 1998, Pages 951-962

Contributed article
Inhibitory connections in the assembly neural network for texture segmentation

https://doi.org/10.1016/S0893-6080(98)00053-7

Abstract

A neural network with assembly organization is described and applied to the problem of texture segmentation in natural scenes. The network is partitioned into several subnetworks, one for each texture class. Hebb's assemblies are formed in the subnetworks during the process of training the excitatory connections. In addition, a structure of inhibitory connections is formed in the assembly network during a separate training process; these connections produce inhibitory interactions between the different subnetworks. Computer simulation of the network has been performed. Experiments show that an adequately trained assembly network with inhibitory connections is more efficient than one without them.

Introduction

The task of texture segmentation of natural images is important for solving the problems of object–ground separation, recognition of object shapes, and analysis of natural scenes. Its solution is needed in robotics and in the automation of visual information processing in many areas, e.g. medicine and mapping.

Most studies of texture recognition and segmentation use heuristic algorithms (Nothdurft, 1985; Voorhees & Poggio, 1988; Bovik et al., 1990; Wu & Chen, 1993; You & Cohen, 1993; Caelli & Reye, 1993; Hild & Shirai, 1993; Shen et al., 1993), yet in the brains of humans and animals it is a neural network that performs texture recognition and segmentation. According to Hebb's theory, intellectual functions of the mammalian brain are based on the structural and functional organization of neural networks in the form of neural assemblies (Hebb, 1949; Milner, 1957). In neural network-based models, the networks mainly perform classification of textures and nonlinear transformation of filtered input data (Oja, 1989; Malik & Perona, 1990; Kussul et al., 1991; Van Hulle & Tollenaere, 1993). Inhibitory connections are seldom used in neural models for texture recognition and segmentation, although it is known that inhibitory connections take part in information processing in the brain (e.g. Eccles, 1969; Hubel, 1988). In Goltsev (1996), a neural network with assembly organization intended for texture segmentation of natural images is described. The present article continues that study: inhibitory connections are formed in the assembly network, increasing its recognition rate and the reliability of its recognition process.

Section snippets

Description of the assembly network

The model is designed to solve the following task. Consider a natural image containing several texture regions, each of which constitutes a texture class. In each region, a comparatively small number of image patches (training samples of this texture class) are chosen by the teacher for the network's learning. Image samples of the training set of each texture class are distributed among all typical positions within the corresponding region. The problem is to identify the membership of test set
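
As a minimal sketch of this training-set construction, the following Python fragment draws a fixed number of patches from a teacher-labelled texture region; the function name, patch size and sampling scheme are illustrative assumptions, not taken from the paper.

    import numpy as np

    def sample_training_patches(image, region_mask, class_id,
                                patch_size=16, n_samples=20, rng=None):
        """Draw n_samples square patches whose centres lie inside the region
        labelled class_id in region_mask (the teacher's labelling)."""
        rng = rng or np.random.default_rng(0)
        half = patch_size // 2
        # Candidate centres: pixels of the chosen texture region, away from the borders.
        ys, xs = np.where(region_mask == class_id)
        keep = ((ys >= half) & (ys < image.shape[0] - half) &
                (xs >= half) & (xs < image.shape[1] - half))
        ys, xs = ys[keep], xs[keep]
        idx = rng.choice(len(ys), size=n_samples, replace=False)
        return [image[y - half:y + half, x - half:x + half]
                for y, x in zip(ys[idx], xs[idx])]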

Formation of the excitatory connections

The network consists of N neurons, numbered from 1 to N. The neuron numbers are denoted by the indices i and j in the formulae presented later. For a formal description of how the structure of the assembly network is modified, input and output binary vectors are introduced whose one-valued components represent active network neurons. At the intersections of the corresponding rows and columns of the connection matrices, the weights of connections may change as a result of training.

Let there exist U texture
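
As a hedged Python sketch of Hebb's rule in this setting: for every training patch, each pair of simultaneously active neurons of the patch's texture-class subnetwork receives an excitatory connection. The binary weight rule and the variable names below are assumptions for illustration; the paper's exact update may differ.

    import numpy as np

    def train_excitatory(W_exc, g, subnet_slice):
        """W_exc: (N, N) connection matrix shared by all subnetworks.
        g: length-N binary vector encoding the features of one training patch,
        non-zero only inside subnet_slice (the patch's texture-class subnetwork)."""
        active = np.flatnonzero(g[subnet_slice]) + subnet_slice.start
        # Hebb's rule in binary form: connect every pair of co-active neurons.
        W_exc[np.ix_(active, active)] = 1
        np.fill_diagonal(W_exc, 0)   # no self-connections
        return W_exc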

Description of the network dynamics

The activity of each neuron is calculated at every time step, t, synchronously with all other neurons of the network. The output of each neuron has only two values: 0 or 1. A long binary vector of neural activity S(x) is used to describe the activity dynamics of the network at the stage of texture recognition of the xth image patch; it represents the outputs of all network neurons. Let us denote the output of the ith neuron at the tth time step by S_i^t.

At the zeroth time step of a recognition process
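
For illustration only, one synchronous update step of such a binary network could look as follows in Python; the simple fixed threshold theta is an assumption, since the paper's actual activity-selection rule is not reproduced here.

    import numpy as np

    def step(S_t, W_exc, W_inh, theta=0.5):
        """S_t: length-N binary activity vector at time step t.
        Returns S at step t+1: a neuron fires (1) if its net excitation minus
        inhibition exceeds the assumed threshold theta, otherwise it is silent."""
        net_input = W_exc @ S_t - W_inh @ S_t
        return (net_input > theta).astype(np.uint8)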

Formation of negative features

While training the excitatory connections of a given texture class, the model also accumulates the union of the feature sets extracted from all training samples of this texture class. For this purpose, at every training patch x_m of the mth texture class, the model forms the disjunction of the vector G(x_m) with a special short binary vector W_m (all of whose components are zero at the beginning of the process). Let the training set of the mth texture class consist of M patches. During
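
A short Python sketch of this accumulation step is given below: W_m is the componentwise disjunction (logical OR) of the feature vectors G(x_m) over the M training patches of the mth class. Deriving the negative-feature vector H_m as the complement of W_m is an assumption made here for illustration.

    import numpy as np

    def accumulate_class_features(patch_feature_vectors):
        """patch_feature_vectors: iterable of equal-length binary vectors G(x_m)."""
        W_m = None
        for g in patch_feature_vectors:
            W_m = g.copy() if W_m is None else (W_m | g)   # disjunction
        H_m = 1 - W_m   # assumed: features never seen in class m are 'negative'
        return W_m, H_m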

Training the inhibitory connections

In order to train the inhibitory connections directed from the negative features formed in the vectors H, the model successively passes through all image patches of the training set again. The inhibitory connections from each training patch's texture-class subnetwork to another subnetwork are trained by means of a special procedure. This procedure, described later, is performed at each texture patch of the training set. As a result of the inhibitory connections' training process, only those neurons which
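
The exact training rule is not fully quoted above; the Python sketch below encodes one plausible interpretation: for a training patch of class m, those of its active neurons whose features are negative for another class u (H_u = 1) obtain inhibitory links to the neurons of subnetwork u. All names, including the feature_index bookkeeping array, are hypothetical.

    import numpy as np

    def train_inhibitory(W_inh, g, m, subnet_slices, H, feature_index):
        """W_inh: (N, N) binary inhibitory connection matrix.
        g: length-N binary activity vector of one training patch of class m.
        subnet_slices: list of slices, one per texture-class subnetwork.
        H: list of short binary negative-feature vectors, one per class.
        feature_index: maps a neuron index to the feature it encodes (assumed)."""
        sources = np.flatnonzero(g[subnet_slices[m]]) + subnet_slices[m].start
        for u, sl in enumerate(subnet_slices):
            if u == m:
                continue
            neg_sources = [i for i in sources if H[u][feature_index[i]] == 1]
            if neg_sources:
                targets = np.arange(sl.start, sl.stop)
                W_inh[np.ix_(neg_sources, targets)] = 1
        return W_inh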

Restrictions on the negative features

Let us consider the restrictions on using the negative features. The task posed to the network implies that the training set for each texture class must cover all typical texture samples of this class. The reason for this requirement is evident: the network ought to be trained on all feature values and their combinations which may be encountered in the texture class. First, this means that the inner structure of the assemblies, which consists of the excitatory connections, ought to be formed based on the

Results of the computer simulation

A simulation program for the network described above has been created and used to process photographs digitized to 192×160 pixels with 32 grey levels. Fig. 5(a), Fig. 6(a), Fig. 7(a) and Fig. 8(a) show samples of the original images [the same as in (Goltsev, 1996)] used in the experiments. The model was intended to recognize four texture classes; therefore, the network consisted of four subnetworks. There were a total of 2048 neurons in the network. One fully connected connection matrix was used, containing
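
For reference, the dimensions reported above can be written down as a small configuration sketch; splitting the 2048 neurons into four equal subnetworks of 512 is an assumption, since only the total number of neurons and the number of subnetworks are stated.

    import numpy as np

    N_NEURONS = 2048
    N_CLASSES = 4
    SUBNET_SIZE = N_NEURONS // N_CLASSES   # assumed equal partition
    subnet_slices = [slice(u * SUBNET_SIZE, (u + 1) * SUBNET_SIZE)
                     for u in range(N_CLASSES)]
    W = np.zeros((N_NEURONS, N_NEURONS), dtype=np.uint8)   # one full connection matrix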

Discussion

Some points relating to the assembly network need to be clarified. The recurrent, dynamic algorithm of the assembly network's convergence, described above, makes this assembly network a dynamic and nonlinear device. In the present model the network's convergence is performed by means of an associative process of redistribution of neural activity between the assemblies. This algorithm demonstrates the associative capacities of the assembly network. Associative properties of neural

Conclusion

In conclusion, the following should be noted. The inhibitory connections are useful in those cases where high reliability of recognition is required for some or all of the recognized classes and a fully representative training set is available. The inhibitory connections are a tool by means of which it is possible to completely prevent recognition errors in a limited number of classes, provided, of course, that negative features exist in these classes.

The inhibitory connections do not

References (26)

  • Amosov, N.M. & Kussul, E.M. (1969). The possible circuit of the system of reinforcement-inhibition. The Questions of...
  • Amosov, N.M., Baidyk, T.N., Goltsev, A.D., Kasatkin, A.M., Kasatkina, L.M., Kussul, E.M. & Rachkovskij, D.A. (1991)....
  • Bovik, A.C., et al. (1990). Multichannel texture analysis using localized spatial filters. IEEE Transactions on Pattern Analysis and Machine Intelligence.