Independent Component Analysis

Theory

Independent component analysis (ICA) is a computational method for separating a multivariate signal into additive subcomponents, under the assumption that the source signals are non-Gaussian and mutually statistically independent. It is a special case of blind source separation. When the independence assumption is correct, blind ICA separation of a mixed signal gives very good results; ICA is also applied, purely for analysis purposes, to signals that are not believed to have been generated by mixing.

A simple application of ICA is the “cocktail party problem”, where the underlying speech signals are separated from recordings of several people talking simultaneously in a room (a short worked example follows the list below). Usually the problem is simplified by assuming no time delays or echoes. Note that if N sources are present, at least N observations (e.g. microphones) are needed to recover the original signals. This constitutes the square case (J = D, where D is the input dimension of the data and J is the dimension of the model); the underdetermined (J > D) and overdetermined (J < D) cases have also been investigated.

ICA finds the independent components (also called factors, latent variables, or sources) by maximizing the statistical independence of the estimated components. There are many ways to define independence, and this choice governs the form of the ICA algorithm. The two broadest definitions of independence for ICA are

  1. Minimization of Mutual Information
  2. Maximization of non-Gaussianity
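As a concrete sketch of the square cocktail-party setup (two sources, two microphones, so J = D = 2), the following example mixes two synthetic non-Gaussian signals and unmixes them with FastICA from scikit-learn; the particular waveforms and mixing matrix are arbitrary choices for illustration:

 import numpy as np
 from sklearn.decomposition import FastICA
 
 rng = np.random.RandomState(0)
 t = np.linspace(0, 8, 2000)
 s1 = np.sin(2 * t)                         # "speaker" 1: sine wave
 s2 = np.sign(np.sin(3 * t))                # "speaker" 2: square wave
 S = np.c_[s1, s2]
 S += 0.05 * rng.standard_normal(S.shape)   # small sensor noise
 
 A = np.array([[1.0, 0.5],                  # example mixing matrix
               [0.5, 1.0]])
 X = S @ A.T                                # observed "microphone" signals
 
 ica = FastICA(n_components=2, random_state=0)
 S_hat = ica.fit_transform(X)               # estimated sources
 A_hat = ica.mixing_                        # estimated mixing matrix

The recovered columns of S_hat match the true sources only up to ordering, scaling, and sign, for the reasons discussed below.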

The non-Gaussianity family of ICA algorithms, motivated by the central limit theorem, uses kurtosis and negentropy as contrast functions. The minimization-of-mutual-information (MMI) family of ICA algorithms uses measures such as the Kullback–Leibler divergence and maximum entropy.
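As a rough sketch of how these two contrasts can be computed for a single estimated component (the log-cosh form is one common approximation to negentropy, and the Gaussian baseline constant below is approximate):

 import numpy as np
 from scipy.stats import kurtosis
 
 def nongaussianity(y):
     """Kurtosis and approximate negentropy of a 1-D signal y."""
     y = (y - y.mean()) / y.std()            # zero mean, unit variance
     excess_kurt = kurtosis(y)               # excess kurtosis; 0 for a Gaussian
     # log-cosh negentropy approximation:
     #   J(y) ~ (E[log cosh(y)] - E[log cosh(v)])^2,  v ~ N(0, 1),
     # where E[log cosh(v)] is approximately 0.3746
     negentropy = (np.mean(np.log(np.cosh(y))) - 0.3746) ** 2
     return excess_kurt, negentropy

Both measures vanish for Gaussian data, so an ICA algorithm can search for directions of the whitened data that maximize them.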
Typical algorithms for ICA use centering, whitening (usually via the eigenvalue decomposition), and dimensionality reduction as preprocessing steps in order to simplify the problem for the actual iterative algorithm. Whitening and dimensionality reduction can be achieved with principal component analysis or singular value decomposition. Whitening ensures that all dimensions are treated equally a priori before the algorithm is run. Well-known algorithms for ICA include infomax, FastICA, and JADE, among many others.
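A minimal NumPy sketch of the centering and whitening steps just described, using the eigenvalue decomposition of the data covariance (dimensionality reduction could be added here by discarding directions with small eigenvalues):

 import numpy as np
 
 def center_and_whiten(X):
     """X: (n_samples, n_features) data matrix.
     Returns centered data whose covariance is the identity."""
     Xc = X - X.mean(axis=0)                 # centering: remove the mean
     cov = np.cov(Xc, rowvar=False)          # feature covariance matrix
     eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalue decomposition
     whitener = eigvecs @ np.diag(eigvals ** -0.5) @ eigvecs.T
     return Xc @ whitener                    # whitened data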
In general, ICA cannot identify the actual number of source signals, a uniquely correct ordering of the source signals, or the proper scaling (including sign) of the source signals. ICA is important to blind signal separation and has many practical applications. It is closely related to (or even a special case of) the search for a factorial code of the data, i.e., a new vector-valued representation of each data vector such that each data vector is uniquely encoded by the resulting code vector (loss-free coding) while the code components are statistically independent.


General Definition

The data are represented by the random vector $x = (x_1, \ldots, x_m)^T$ and the components by the random vector $s = (s_1, \ldots, s_n)^T$. The task is to transform the observed data $x$, using a linear static transformation $W$, as

$s = W x,$

into maximally independent components $s$, measured by some function $F(s_1, \ldots, s_n)$ of independence.
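Equivalently, in the common noiseless generative formulation (an assumption spelled out here for illustration; the definition above only states the unmixing step), the observations are modelled as a linear mixture of the sources, and a successful unmixing matrix $W$ recovers them up to the indeterminacies noted in the Theory section:

$x = A s, \qquad \hat{s} = W x = W A s, \qquad W A = P D,$

where $A$ is the unknown mixing matrix, $P$ a permutation matrix, and $D$ an invertible diagonal matrix, reflecting the ordering and scaling (including sign) ambiguities of ICA.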

Further Reading

An Introduction to ICA

File:ICA Intro.pdf
File:Independent Component analysis, A new concept?.pdf
File:An Information Maximization Approach to Blind Separation and Blind Deconvolution.pdf
File:Independent Component Analysis- Algorithms and Applications.pdf

Group ICA

File:A Method for Making Group Inferences from Functional MRI Data Using Independent Component Analysis.pdf

Constrained ICA

File:Semiblind Spatial ICA of fMRI Using Spatial Constraints.pdf

Reliability of Default Mode Network Identification

File:Interater and Intermethod Reliability of Default Mode Network Selection.pdf

Comparison of different methods

File:Group ICA of resting-state data a comparison.pdf