Book contents
- Frontmatter
- Dedication
- Contents
- List of Algorithms
- Notation
- Preface
- I Classical Methods
- II Factors and Groupings
- III Non-Gaussian Analysis
- 9 Towards Non-Gaussianity
- 10 Independent Component Analysis
- 11 Projection Pursuit
- 12 Kernel and More Independent Component Methods
- 13 Feature Selection and Principal Component Analysis Revisited
- Problems for Part III
- References
- Author Index
- Subject Index
- Data Index
10 - Independent Component Analysis
from III - Non-Gaussian Analysis
Published online by Cambridge University Press: 05 June 2014
Summary
The truth is rarely pure and never simple (Oscar Wilde, 1854–1900, The Importance of Being Earnest).
Introduction
In the Factor Analysis model X = AF + μ + ε, an essential aim is to find an expression for the unknown d × k matrix of factor loadings A. Of secondary interest is the estimation of F. If X comes from a Gaussian distribution, then the principal component (PC) solution for A and F results in independent scores, but this luxury is lost in the PC solution of non-Gaussian random vectors and data. Surprisingly, it is not the search for a generalisation of Factor Analysis, but the departure from Gaussianity, that has paved the way for new developments.
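The loss of independence can be seen in a small simulation. The sketch below (an illustrative construction, not from the text) draws two independent unit-variance uniform sources and applies a 45-degree rotation, which keeps the pair uncorrelated — exactly the property a whitened PC solution guarantees. For these non-Gaussian sources the rotated pair is nonetheless dependent, which the correlation of the squared signals exposes; for Gaussian sources that correlation would vanish.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Two independent, unit-variance uniform (hence non-Gaussian) sources
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, n))

# A 45-degree rotation keeps the pair uncorrelated, as a whitened
# principal component solution would
y1 = (s[0] + s[1]) / np.sqrt(2)
y2 = (s[0] - s[1]) / np.sqrt(2)

# Plain correlation is (essentially) zero: the pair is uncorrelated
print(round(float(np.corrcoef(y1, y2)[0, 1]), 3))

# ...but the squared signals are clearly correlated, so y1 and y2 are
# NOT independent; uncorrelated != independent outside the Gaussian case
print(round(float(np.corrcoef(y1**2, y2**2)[0, 1]), 3))
```

(For unit-variance uniform sources the theoretical correlation of the squares under this rotation is about −0.43, driven by the sources' negative excess kurtosis.)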
In psychology, for example, scores in mathematics, language and literature or comprehension tests are used to describe a person's intelligence. A Factor Analysis approach aims to find the underlying or hidden kinds of intelligence from the test scores, typically under the assumption that the data come from the Gaussian distribution. Independent Component Analysis, too, strives to find these hidden quantities, but under the assumption that the data are non-Gaussian. This assumption precludes the use of the Gaussian likelihood, and the independent component (IC) solution will differ from the maximum-likelihood (ML) Factor Analysis solution, which may not be appropriate for non-Gaussian data.
To get some insight into the type of solution one hopes to obtain with Independent Component Analysis, consider, for example, the superposition of sound tracks.
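A minimal numerical sketch of this sound-track setting (the signals, mixing matrix and one-unit FastICA iteration below are illustrative choices, not taken from the text): two non-Gaussian "tracks" are superposed by an unknown mixing matrix, the mixtures are whitened, and a fixed-point iteration with the tanh nonlinearity recovers directions in which the components are as independent as possible. Each recovered component matches one original track up to sign and scale.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
t = np.linspace(0, 20, n)

# Two "sound tracks": a sinusoid and a square wave (both non-Gaussian)
S = np.vstack([np.sin(2 * t), np.sign(np.sin(3 * t))])

# Superpose the tracks with an (illustrative) mixing matrix A: X = A S
A = np.array([[0.7, 0.3], [0.4, 0.6]])
X = A @ S

# Whiten: rotate and scale X so its sample covariance is the identity
Xc = X - X.mean(axis=1, keepdims=True)
d_, E = np.linalg.eigh(np.cov(Xc))
Z = np.diag(d_ ** -0.5) @ E.T @ Xc

# One-unit FastICA fixed-point iteration (tanh nonlinearity) for one
# unmixing direction; the second follows by orthogonality in 2-D
w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(200):
    wx = w @ Z
    w_new = (Z * np.tanh(wx)).mean(axis=1) - (1 - np.tanh(wx) ** 2).mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(abs(w_new @ w) - 1) < 1e-10
    w = w_new
    if converged:
        break
W = np.vstack([w, [-w[1], w[0]]])  # orthogonal complement completes W
S_hat = W @ Z                      # recovered components

# Each recovered component correlates strongly with exactly one track
c = np.corrcoef(np.vstack([S, S_hat]))[:2, 2:]
print(np.round(np.abs(c), 2))
```

The absolute correlation matrix has one entry near 1 in each row and column, reflecting the permutation, sign and scale ambiguities that are inherent to the IC solution.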
Analysis of Multivariate and High-Dimensional Data, pp. 305–348. Publisher: Cambridge University Press. Print publication year: 2013.