Abstract
Canonical correlation analysis (CCA) is a widely used statistical technique for capturing correlations between two sets of multivariate random variables and has found a multitude of applications in computer vision, medical imaging, and machine learning. The classical formulation assumes that the data live in a pair of vector spaces, which makes its use in certain important scientific domains problematic. For instance, symmetric positive definite (SPD) matrices, rotations, and probability distributions all lie on curved Riemannian manifolds where vector-space operations are, in general, not applicable. Analyzing such data with classical inference models is therefore suboptimal. Using the space of SPD matrices as a concrete example, we present a principled generalization of the well-known CCA to the Riemannian setting. Our CCA algorithm operates on the product Riemannian manifold representing SPD matrix-valued fields to identify meaningful correlations. As a proof of principle, we present experimental results on a neuroimaging data set to show the applicability of these ideas.
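To make the contrast with the Riemannian formulation concrete, the sketch below shows classical (vector-space) CCA applied to SPD-valued data after a common workaround: mapping each SPD matrix to a flat tangent-space representation via the matrix logarithm (a log-Euclidean approximation). This is a minimal illustration under stated assumptions, not the chapter's product-manifold algorithm; all function names, the regularization parameter, and the synthetic data are illustrative.

```python
# Minimal sketch (illustrative only): classical CCA on SPD-valued data that has
# been flattened into a tangent space via the matrix logarithm. This is NOT the
# chapter's Riemannian CCA; it shows the vector-space baseline the chapter improves on.
import numpy as np

def spd_log_vec(S):
    """Matrix logarithm of an SPD matrix via eigendecomposition,
    vectorized over the upper triangle (including the diagonal)."""
    w, V = np.linalg.eigh(S)
    L = V @ np.diag(np.log(w)) @ V.T
    return L[np.triu_indices_from(L)]

def inv_sqrt(S):
    """Inverse square root of a symmetric positive definite matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def classical_cca(X, Y, reg=1e-6):
    """Leading canonical correlation and directions for data X (n, p) and Y (n, q)."""
    n = X.shape[0]
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Sxx = Xc.T @ Xc / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / (n - 1)
    Wx, Wy = inv_sqrt(Sxx), inv_sqrt(Syy)
    # SVD of the whitened cross-covariance gives the canonical correlations.
    U, s, Vt = np.linalg.svd(Wx @ Sxy @ Wy)
    return s[0], Wx @ U[:, 0], Wy @ Vt[0, :]

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def random_spd(d):
        A = rng.standard_normal((d, d))
        return A @ A.T + d * np.eye(d)

    # Two synthetic fields of 3x3 SPD matrices (e.g., diffusion tensors), 50 paired samples.
    X = np.array([spd_log_vec(random_spd(3)) for _ in range(50)])
    Y = np.array([spd_log_vec(random_spd(3)) for _ in range(50)])
    rho, a, b = classical_cca(X, Y)
    print("leading canonical correlation:", round(float(rho), 3))
```

The tangent-space flattening ignores the curvature of the SPD manifold away from the base point, which is precisely the limitation the chapter's product Riemannian manifold formulation is designed to avoid.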
| Field | Value |
|---|---|
| Original language | English |
| Title of host publication | Riemannian Computing in Computer Vision |
| Publisher | Springer International Publishing |
| Pages | 69-100 |
| Number of pages | 32 |
| ISBN (Electronic) | 9783319229577 |
| ISBN (Print) | 9783319229560 |
| DOIs | |
| Publication status | Published - 2015 Jan 1 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © Springer International Publishing Switzerland 2016.
ASJC Scopus subject areas
- General Engineering
- General Computer Science
- General Mathematics