Canonical correlation analysis (CCA) is a widely used statistical technique for capturing correlations between two sets of multivariate random variables and has found a multitude of applications in computer vision, medical imaging, and machine learning. The classical formulation assumes that the data live in a pair of vector spaces, which makes its use in certain important scientific domains problematic. For instance, the set of symmetric positive definite (SPD) matrices, rotations, and probability distributions all belong to certain curved Riemannian manifolds where vector-space operations are in general not applicable. Analyzing such data via the classical versions of inference models is suboptimal. Using the space of SPD matrices as a concrete example, we present a principled generalization of the well-known CCA to the Riemannian setting. Our CCA algorithm operates on the product Riemannian manifold representing SPD matrix-valued fields to identify meaningful correlations. As a proof of principle, we present experimental results on a neuroimaging data set to show the applicability of these ideas.
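The classical vector-space formulation that the paper generalizes can be sketched in a few lines of NumPy. The function name and the Cholesky-based whitening below are illustrative choices for the standard algorithm, not the paper's Riemannian method or implementation:

```python
import numpy as np

def classical_cca(X, Y):
    """Classical vector-space CCA via an SVD of the whitened cross-covariance.

    X is (n, p), Y is (n, q); returns the canonical correlations and the
    canonical direction matrices for each view.
    """
    X = X - X.mean(axis=0)            # center each view
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1)           # within-view covariances
    Syy = Y.T @ Y / (n - 1)
    Sxy = X.T @ Y / (n - 1)           # cross-covariance between the views
    Lx = np.linalg.cholesky(Sxx)      # whitening via Cholesky factors
    Ly = np.linalg.cholesky(Syy)      # (assumes both covariances are full rank)
    K = np.linalg.solve(Ly, np.linalg.solve(Lx, Sxy).T).T  # Lx^{-1} Sxy Ly^{-T}
    U, s, Vt = np.linalg.svd(K)
    rho = np.clip(s, 0.0, 1.0)        # canonical correlations, descending
    A = np.linalg.solve(Lx.T, U)      # canonical directions for X
    B = np.linalg.solve(Ly.T, Vt.T)   # canonical directions for Y
    return rho, A, B
```

On a pair of views that share a low-dimensional latent signal, the leading canonical correlations come out near one while the remaining ones stay small. The Riemannian generalization in the chapter replaces these Euclidean covariance operations with constructs intrinsic to the SPD manifold.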
Title of host publication: Riemannian Computing in Computer Vision
Publisher: Springer International Publishing
Number of pages: 32
Publication status: Published - 2015 Jan 1
ASJC Scopus subject areas
- Computer Science (all)