Order Selection of the Linear Mixing Model for Complex-Valued FMRI Data

Journal of Signal Processing Systems

Abstract

Functional magnetic resonance imaging (fMRI) data are originally acquired as complex-valued images, which motivates the use of complex-valued data analysis methods. Because of the high dimensionality and high noise level of fMRI data, order selection and dimension reduction are important procedures for multivariate analysis methods such as independent component analysis (ICA). In this work, we develop a complex-valued order selection method that estimates the dimension of the signal subspace using information-theoretic criteria. To correct for the effect of sample dependence on information-theoretic criteria, we develop a general entropy rate measure for complex Gaussian random processes to calibrate the independent and identically distributed (i.i.d.) sampling scheme in the complex domain. We show the effectiveness of the approach for order selection on both simulated and actual fMRI data. A comparison of the order selection and ICA results on real-valued and complex-valued fMRI data demonstrates that a fully complex analysis extracts more meaningful components related to brain activation.

Notes

  1. http://www.fil.ion.ucl.ac.uk/spm/software/spm5/

References

  1. Adalı, T., & Calhoun, V. D. (2007). Complex ICA of brain imaging data. IEEE Signal Processing Magazine, 24, 136–139.

  2. Adalı, T., Kim, T., & Calhoun, V. (2004). Independent component analysis by complex nonlinearities. In Proc. ICASSP (Vol. 5, pp. 525–528). Montreal, Canada.

  3. Adalı, T., Li, H., Novey, M., & Cardoso, J. F. (2008). Complex ICA using nonlinear functions. IEEE Transactions on Signal Processing, 56, 4536–4544.

  4. Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on Automatic Control, 19, 716–723.

  5. Bach, F. R., & Jordan, M. I. (2003). Finding clusters in independent component analysis. In Proc. ICA.

  6. Beckmann, C. F., & Smith, S. M. (2004). Probabilistic independent component analysis for functional magnetic resonance imaging. IEEE Transactions on Medical Imaging, 23, 137–152.

  7. Calhoun, V. D., Adalı, T., & Li, Y. O. (2004). Independent component analysis of complex-valued functional MRI data by complex nonlinearities. In Proc. ISBI. Arlington, VA.

  8. Calhoun, V. D., Adalı, T., Pearlson, G. D., & Pekar, J. J. (2001). A method for making group inferences from functional MRI data using independent component analysis. Human Brain Mapping, 14, 140–151.

  9. Calhoun, V. D., Adalı, T., van Zijl, P. C. M., & Pekar, J. J. (2002). Independent component analysis of fMRI data in the complex domain. Magnetic Resonance in Medicine, 48, 180–192.

  10. Cavanaugh, J. E. (1999). A large-sample model selection criterion based on Kullback's symmetric divergence. Statistics & Probability Letters, 44, 333–344.

  11. Correa, N., Li, Y. O., Adalı, T., & Calhoun, V. D. (2005). Comparison of blind source separation algorithms for fMRI using a new Matlab toolbox: GIFT. In Proc. ICASSP (Vol. 5, pp. 401–404). Philadelphia, PA.

  12. Draper, D. (1995). Assessment and propagation of model uncertainty. Journal of the Royal Statistical Society, Series B, 57, 45–97.

  13. Freire, L., & Mangin, J. F. (2001). What is the best similarity measure for motion correction in fMRI time series? IEEE Transactions on Medical Imaging, 21, 470–484.

  14. Hoogenraad, F. G., Reichenbach, J. R., Haacke, E. M., Lai, S., Kuppusamy, K., & Sprenger, M. (1998). In vivo measurement of changes in venous blood-oxygenation with high resolution functional MRI at 0.95 Tesla by measuring changes in susceptibility and velocity. Magnetic Resonance in Medicine, 39, 97–107.

  15. Hyvärinen, A., Himberg, J., & Esposito, F. (2004). Validating the independent components of neuroimaging time-series via clustering and visualization. NeuroImage, 22, 1214–1222.

  16. Lee, J., Shahram, M., Schwartzman, A., & Pauly, J. M. (2007). Complex data analysis in high-resolution SSFP fMRI. Magnetic Resonance in Medicine, 57, 905–917.

  17. Li, X. L., Adalı, T., & Anderson, M. (2010). Principal component analysis for noncircular signals in the presence of circular white Gaussian noise. In The 44th Annual Asilomar Conference on Signals, Systems and Computers (to appear).

  18. Li, Y. O., Adalı, T., & Calhoun, V. D. (2007). Estimating the number of independent components for fMRI data. Human Brain Mapping, 28, 1251–1266.

  19. McKeown, M. J. (2000). Detection of consistently task-related activations in fMRI data with hybrid independent component analysis. NeuroImage, 11, 24–35.

  20. McKeown, M. J., Hansen, L. K., & Sejnowski, T. J. (2003). Independent component analysis of functional MRI: What is signal and what is noise? Current Opinion in Neurobiology, 13, 620–629.

  21. McKeown, M. J., Makeig, S., Brown, G. G., Jung, T. P., Kindermann, S. S., Bell, A. J., et al. (1998). Analysis of fMRI data by blind separation into independent components. Human Brain Mapping, 6, 160–188.

  22. Menon, R. S. (2002). Postacquisition suppression of large-vessel BOLD signals in high-resolution fMRI. Magnetic Resonance in Medicine, 47, 1–9.

  23. Moritz, C. H., Haughton, V. M., Cordes, D., Quigley, M., & Meyerand, M. E. (2000). Whole-brain functional MR imaging activation from a finger-tapping task examined with independent component analysis. American Journal of Neuroradiology, 21, 1629–1635.

  24. Nan, F. Y., & Nowak, R. D. (1999). Generalized likelihood ratio detection for fMRI using complex data. IEEE Transactions on Medical Imaging, 18, 320–329.

  25. Neeser, F. D., & Massey, J. L. (1993). Proper complex random processes with applications to information theory. IEEE Transactions on Information Theory, 39, 1293–1302.

  26. Ogawa, S., Tank, D. W., Menon, R., Ellermann, J., Kim, S., Merkle, H., et al. (1992). Intrinsic signal changes accompanying sensory stimulation: Functional brain mapping with magnetic resonance imaging. Proceedings of the National Academy of Sciences, 89, 5951–5955.

  27. Papoulis, A. (1982). Maximum entropy and spectral estimation: A review. IEEE Transactions on Acoustics, Speech, and Signal Processing, 29, 1176–1186.

  28. Pascual-Marqui, R. D., Michel, C. M., & Lehmann, D. (1994). Low resolution electromagnetic tomography: A new method for localizing electrical activity in the brain. International Journal of Psychophysiology, 18, 49–65.

  29. Pham, D. T. (2002). Mutual information approach to blind separation of stationary sources. IEEE Transactions on Information Theory, 48, 1935–1946.

  30. Phillips, C. G., Zeki, S., & Barlow, H. B. (1984). Localization of function in the cerebral cortex, past, present and future. Brain, 107, 327–361.

  31. Picinbono, B., & Bondon, P. (1997). Second-order statistics of complex signals. IEEE Transactions on Signal Processing, 45, 411–420.

  32. Picinbono, B., & Chevalier, P. (1995). Widely linear estimation with complex data. IEEE Transactions on Signal Processing, 43, 2030–2033.

  33. Rauscher, A., Sedlacik, J., Barth, M., Mentzel, H.-J., & Reichenbach, J. R. (2005). Magnetic susceptibility-weighted MR phase imaging of the human brain. American Journal of Neuroradiology, 26, 736–742.

  34. Rissanen, J. (1978). Modeling by the shortest data description. Automatica, 14, 465–471.

  35. Rowe, D. B. (2005). Modeling both the magnitude and phase of complex-valued fMRI data. NeuroImage, 25, 1310–1324.

  36. Schreier, P. J. (2008). Bounds on the degree of impropriety of complex random vectors. IEEE Signal Processing Letters, 15, 190–193.

  37. Schwarz, G. (1978). Estimating the dimension of a model. Annals of Statistics, 6, 461–464.

  38. Wax, M., & Kailath, T. (1985). Detection of signals by information theoretic criteria. IEEE Transactions on Acoustics, Speech, and Signal Processing, 33, 387–392.

  39. Xiong, W., Li, H., Adalı, T., Li, Y. O., & Calhoun, V. D. (2010). On entropy rate for the complex domain and its application to i.i.d. sampling. IEEE Transactions on Signal Processing, 58, 2409–2414.

  40. Xiong, W., Li, Y. O., Li, H., Adalı, T., & Calhoun, V. D. (2008). On ICA of complex-valued fMRI: Advantages and order selection. In Proc. ICASSP (pp. 529–532). Las Vegas, NV.

Acknowledgements

This work was supported by NSF grants NSF-CCF 0635129 and NSF-IIS 0612076.

Corresponding author

Correspondence to Tülay Adalı.

Appendix

We present the entropy rate of a complex-valued second-order stationary Gaussian random process using a widely linear model, following an approach similar to the one given in [27].

Given a second-order stationary, zero-mean random process \(Z_k\), the covariance function is defined as \(R(m) = E\{Z_{k+m}Z^\ast_k\}\) and the pseudo covariance function [25], also called the relation function [31], as \(\tilde{R}(m) = E\{Z_{k+m}Z_k\}\). Without loss of generality, the random processes and vectors discussed in this paper are assumed to be zero mean. A random process is called second-order stationary if it is wide-sense stationary and its pseudo covariance function depends only on the index difference. The Fourier transform of the covariance function yields the power spectrum (or spectral density) function S(ω). Similarly, we define the Fourier transform of the pseudo covariance function as the pseudo power spectrum function \(\tilde{S}(\omega)\).
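These second-order quantities are straightforward to estimate from a finite sample. The following is a minimal NumPy sketch of such an estimator (our own illustration, not code from the paper; the function name and frequency grid are arbitrary choices, and in practice a lag window would be applied before taking the Fourier transform):

import numpy as np

def second_order_stats(z, max_lag=20, n_freq=256):
    """Estimate R(m), the pseudo covariance R~(m), and their Fourier
    transforms S(w) and S~(w) from a zero-mean complex sequence z."""
    n = len(z)
    lags = np.arange(-max_lag, max_lag + 1)
    R = np.array([np.mean(z[m:] * np.conj(z[:n - m])) if m >= 0
                  else np.mean(z[:n + m] * np.conj(z[-m:])) for m in lags])
    Rt = np.array([np.mean(z[m:] * z[:n - m]) if m >= 0
                   else np.mean(z[:n + m] * z[-m:]) for m in lags])
    w = np.linspace(-np.pi, np.pi, n_freq, endpoint=False)
    E = np.exp(-1j * np.outer(w, lags))   # truncated DTFT kernel e^{-j w m}
    return lags, R, Rt, w, E @ R, E @ Rt  # S(w) = sum_m R(m) e^{-j w m}

# Circular white Gaussian noise: R(m) ~ sigma^2 delta(m), R~(m) ~ 0, S(w) ~ sigma^2
rng = np.random.default_rng(0)
z = (rng.standard_normal(10000) + 1j * rng.standard_normal(10000)) / np.sqrt(2)
lags, R, Rt, w, S, St = second_order_stats(z)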

Entropy rate is a measure of the average information per sample in a random sequence; for a complex random process \(Z_k\), it can be written as

$$ h_c(Z) = \lim\limits_{n \rightarrow \infty} \frac{H(Z_1,Z_2,\ldots,Z_n)}{n} $$
(5)

when the limit exists. As in the real case, \(H(Z_1,Z_2,\ldots,Z_n) \leq \sum_{k=1}^n H(Z_k)\), with equality if and only if the random variables \(Z_k\) are independent. Therefore, the entropy rate can be used to measure sample dependence, and it reaches its upper bound when all samples of the process are independent.
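As a simple worked example (ours, using the complex Gaussian entropy result of [25]): for an i.i.d. sequence of circular complex Gaussian variables with variance σ², each sample has entropy \(H(Z_k) = \log(\pi e \sigma^2)\), so

$$ h_c(Z) = \lim\limits_{n \rightarrow \infty} \frac{n\log(\pi e \sigma^2)}{n} = \log(\pi e \sigma^2), $$

which attains the upper bound; for the same variance, sample correlation or noncircularity can only lower the entropy rate.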

The widely linear filter was introduced in [32], and any second-order stationary complex signal can be modeled as the output of a widely linear system driven by circular white noise, which cannot be achieved by a strictly linear system [31]. Given the input and output vectors \(\mathbf{x}, \mathbf{y} \in \mathbb{C}^N\), a widely linear system is expressed as

$$ \mathbf{y} = \mathbf{F}\mathbf{x}+\mathbf{G}\mathbf{x}^\ast $$

where F and G are complex-valued impulse responses in matrix form. The system function of a widely linear system is the pair of functions [F(ω), G(ω)]. By using entropy rate analysis of multivariate Gaussian random processes [5, 29, 39], the relation between the entropy rates of the input and output of a widely linear system is given by

$$ h_c(Y) = h_c(X) + \frac{1}{4\pi} \int_{-\pi}^{\pi} \log \left\{ \left[ |F(\omega)|^2 - |G(\omega)|^2 \right] \left[ |F(-\omega)|^2 - |G(-\omega)|^2 \right] \right\} d\omega $$
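For a memoryless widely linear system with constant F(ω) = f and G(ω) = g (and |f| > |g|), the integral above reduces to \(\log(|f|^2 - |g|^2)\), which can be checked numerically against the entropy of the resulting noncircular Gaussian output. The following NumPy sketch is our own illustration of this special case (variable names are illustrative, not from the paper):

import numpy as np

rng = np.random.default_rng(1)
n, sigma2 = 200_000, 1.0

# Circular white Gaussian input: E{X^2} = 0, E{|X|^2} = sigma2
x = np.sqrt(sigma2 / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# Memoryless widely linear map y = f x + g x*, with |f| > |g|
f, g = 1.0 + 0.5j, 0.4 - 0.2j
y = f * x + g * np.conj(x)

def h_complex_gaussian(z):
    """Entropy of a zero-mean complex Gaussian fitted to the sample z:
    h = log(pi e) + 0.5 log(E{|Z|^2}^2 - |E{Z^2}|^2), cf. [25, 36]."""
    c = np.mean(np.abs(z) ** 2)   # covariance E{|Z|^2}
    p = np.mean(z ** 2)           # pseudo covariance E{Z^2}
    return np.log(np.pi * np.e) + 0.5 * np.log(c ** 2 - np.abs(p) ** 2)

h_x = h_complex_gaussian(x)                          # ~ log(pi e sigma2)
h_y = h_complex_gaussian(y)
h_y_pred = h_x + np.log(abs(f) ** 2 - abs(g) ** 2)   # widely linear relation
print(h_y, h_y_pred)                                 # agree up to sampling error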

Theorem 1

If \(Z_k\) is a complex second-order stationary Gaussian random process with power spectrum function S(ω) and pseudo power spectrum function \(\tilde{S}(\omega)\), its entropy rate \(h_c\) is given by

$$ h_c = \log(\pi e) + \frac{1}{4\pi} \int_{-\pi}^{\pi} \log \left[ S(\omega)S(-\omega)-|\tilde{S}(\omega)|^2 \right] d\omega. $$

The proof of Theorem 1 is given in [39].

For a second-order circular process, we have \(\tilde{S}(\omega)=0\), thus yielding the entropy rate of a second-order circular Gaussian random process as

$$ h_{{\rm circ}} = \log(\pi e) + \frac{1}{4\pi} \int_{-\pi}^{\pi} \log \left[ S(\omega)S(-\omega) \right] d\omega. $$

For the general case in Theorem 1, \(|\tilde{S}(\omega)|^2 \geq 0\). Hence, for second-order circular and noncircular Gaussian random sequences with the same covariance function R(m), we have

$$ h_{{\rm noncirc}} \leq h_{{\rm circ}}, $$

which can also be verified using the result for complex entropy [25, 36] and the definition of entropy rate given in Eq. 5.
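As a concrete illustration (our own example, not from the paper), take two white sequences with the same covariance \(R(m) = \sigma^2\delta(m)\): a circular one, and a noncircular one with constant pseudo power spectrum \(\tilde{S}(\omega) = \rho\), where \(0 < |\rho| < \sigma^2\). Theorem 1 then gives

$$ h_{{\rm circ}} = \log(\pi e \sigma^2), \qquad h_{{\rm noncirc}} = \log(\pi e) + \frac{1}{2} \log\left( \sigma^4 - |\rho|^2 \right) < h_{{\rm circ}}, $$

so the noncircular sequence carries strictly less entropy per sample, consistent with the inequality above.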

About this article

Cite this article

Xiong, W., Li, YO., Correa, N. et al. Order Selection of the Linear Mixing Model for Complex-Valued FMRI Data. J Sign Process Syst 67, 117–128 (2012). https://doi.org/10.1007/s11265-010-0509-2
