Blind separation of circularly distributed sources by neural extended APEX algorithm☆
Introduction
Blind Source Separation (BSS) by neural networks and adaptive algorithms has raised much interest in the scientific community (see for example [11], [13], [14], [15], [16], [17] and references therein). Among other approaches, it has been shown that adding non-linearities to linear Principal Component Analysis (PCA) neural networks improves the statistical independence of their outputs, allowing blind separation of real-valued independent sources [13], [14], [15], [16], [17]. Moreover, attention has been paid to blind separation of complex-valued signals because of its useful applications to telecommunications, array processing, and blind separation of convolutive mixtures (see [1], [2], [6], [8], [18], [20], [25] and references therein). Consequently, as for instantaneous real-valued separation, the generalization of classical PCA techniques to complex-valued data processing is of interest. Examples of the usefulness of complex principal component analysis and principal subspace decomposition algorithms may be found, for instance, in [10], [19].
Recently, some attempts have been made to extend the best-known PCA algorithms to the complex case. In [3] Chen and Hou presented a heuristic complex version of the well-known APEX algorithm [9]; in [7] De Castro et al. gave a heuristic complex extension of the GHA algorithm [23]. In this work we formally derive a new learning theory as a generalized complex-valued Hebbian learning rule for a linear feed-forward network with lateral inhibitory connections [9], [21], [22], which relies on non-linear functions that affect the network's signal-processing ability. We then discuss the choice of the set of non-linear functions within the theoretical framework proposed by Sudjianto and Hassoun [24], which we extend to the complex case. A particular non-linearity, shaped like the Rayleigh probability density function, allows the neural network to separate out mixed independent complex-valued source signals, as illustrated by computer simulations.
Notation
In this paper, E[·] denotes mathematical expectation with respect to the statistics of the multivariate random process at hand; superscript T denotes ordinary transpose, while superscript H denotes Hermitian (conjugate) transpose; the symbol * denotes complex conjugation.
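As a concrete illustration of the Rayleigh-shaped non-linearity mentioned above, the following sketch applies a function with the form of the Rayleigh probability density to the modulus of a signal. The exact functional form and the parameter `sigma` are assumptions for illustration; the paper's own definition is not reproduced in this excerpt.

```python
import numpy as np

def rayleigh_nonlinearity(u, sigma=1.0):
    """Non-linearity shaped like the Rayleigh pdf, applied to a
    non-negative modulus u (illustrative form, assumed here):
        g(u) = (u / sigma^2) * exp(-u^2 / (2 * sigma^2))
    It peaks at u = sigma and decays for larger moduli."""
    u = np.asarray(u, dtype=float)
    return (u / sigma**2) * np.exp(-u**2 / (2 * sigma**2))
```

Such a squashing function compresses the dynamic range of the output moduli, which is the role played by the non-linearity discussed in the paper.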
Section snippets
Extended APEX algorithm and blind separation of complex sources
Consider the complex-weighted neural network with input x ∈ C^p and output y ∈ C^m, with m ≤ p, described by the following relationships (equations omitted in this snippet), where the vectors w_j represent the network's direct connections, and the vectors c_j represent the network's inhibitory lateral connections. A schematic of this topology is reported in Fig. 1. The original APEX learning rule for such a network was derived by Kung and Diamantaras by applying Oja's first principal component rule, which is a …
Computer simulations
Suppose the input x is formed as a linear mixture of four independent signals arranged in a vector s. Signal s1 is QAM4, s2 is QAM16, and signal s3 is PSK. Signal s4 is Gaussian noise of variance σ² = 0.25. The first row of Fig. 2 depicts the independent signals, while the second row shows the four obtained mixtures. The bar graphs reported in Fig. 3 represent the histograms of the four sources' moduli squashed by the function QR(u). A close examination of these graphs shows that the QAM4 and PSK …
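A simulation setup of this kind can be sketched as follows. The constellation normalizations, the PSK order (8-PSK), and the mixing matrix are assumptions for illustration; the paper does not specify them in this excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # number of samples per source (assumed)

# QAM4 (QPSK): real and imaginary parts drawn from {-1, +1}, unit power
s1 = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)

# QAM16: real and imaginary parts drawn from {-3, -1, 1, 3}, unit power
lv = np.array([-3, -1, 1, 3])
s2 = (rng.choice(lv, n) + 1j * rng.choice(lv, n)) / np.sqrt(10)

# PSK symbols on the unit circle (8-PSK assumed)
s3 = np.exp(1j * 2 * np.pi * rng.integers(0, 8, n) / 8)

# circular complex Gaussian noise with total variance 0.25
s4 = np.sqrt(0.25 / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

S = np.vstack([s1, s2, s3, s4])   # 4 independent complex sources
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
X = A @ S                          # 4 observed instantaneous mixtures
```

The histograms of Fig. 3 would then correspond to the distributions of `np.abs(S[k])` after squashing by the chosen non-linearity QR(u).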
Conclusion
In this work a new adaptation rule for linear neural networks has been presented as a generalization of APEX learning: it applies to complex-weighted neural networks and introduces non-linear functions into the classical Hebbian learning rule. A particular choice of the non-linearity has been discussed by recalling the Sudjianto–Hassoun interpretation of non-classical Hebbian learning, extended here to the complex case.
References (25)
- Independent component analysis, a new concept?, Signal Process. (1994)
- Independent component analysis by general non-linear Hebbian-like rules, Signal Process. (1998)
- Representation and separation of signals using nonlinear PCA type learning, Neural Networks (1994)
- Generalizations of PCA, optimization problems, and neural networks, Neural Networks (1995)
- Optimal unsupervised learning in a single-layer linear feedforward neural network, Neural Networks (1989)
- J.-F. Cardoso, P. Comon, Independent component analysis, a survey of some algebraic methods, Proceedings of ISCAS, Vol. …
- Equivariant adaptive source separation, IEEE Trans. Signal Process. (1996)
- Y. Chen, C. Hou, High resolution adaptive bearing estimation using a complex-weighted neural network, Proceedings of …
- Adaptive learning algorithm for principal component analysis with partial data, Proc. Cybernet. Systems (1996)
- Improved contrast dedicated to blind separation in communications, Proceedings of ICASSP (1997)
- A complex valued Hebbian learning algorithm, Proceedings of IEEE-IJCNN
☆ This research was financially supported by the Italian MURST.