12.2 Probabilistic PCA - Pattern Recognition and Machine Learning

Feb 23, 2025
19:31

In this video, we start our discussion of probabilistic PCA, where we view our observations as having been produced by a probabilistic generative process. In particular, we assume that the data were produced by an affine transformation of isotropic Gaussian latent variables, corrupted by isotropic Gaussian noise. Because both the latents and the conditional distribution are Gaussian, the marginal distribution of observations is also Gaussian, with mean and covariance that are simple to compute. We discuss how the latent variable approach allows us to explain correlations in data as transformations of uncorrelated latents. Along the way, we point out rotational redundancy in the latent variables, and discuss how to efficiently perform the covariance matrix inversions required to evaluate the probability of a particular observation.
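The video describes the model only in words, so here is a minimal NumPy sketch of the generative process and the efficient covariance inversion mentioned above. The dimensions, variable names (W, mu, sigma2, Mmat), and the use of the Woodbury (matrix inversion) lemma are illustrative assumptions, not code taken from the video, though the formulas C = W W^T + sigma2 I and C^{-1} = (I - W M^{-1} W^T) / sigma2 with M = W^T W + sigma2 I match the standard probabilistic PCA results.

    import numpy as np

    rng = np.random.default_rng(0)

    D, M, N = 5, 2, 1000           # observed dim, latent dim, sample count

    # Model parameters (chosen arbitrarily for illustration).
    W = rng.normal(size=(D, M))    # affine transformation (factor loadings)
    mu = rng.normal(size=D)        # offset
    sigma2 = 0.1                   # isotropic noise variance

    # Generative process: z ~ N(0, I_M), then x | z ~ N(W z + mu, sigma2 I_D).
    Z = rng.normal(size=(N, M))
    X = Z @ W.T + mu + np.sqrt(sigma2) * rng.normal(size=(N, D))

    # Marginal of x is Gaussian: x ~ N(mu, C) with C = W W^T + sigma2 I_D.
    C = W @ W.T + sigma2 * np.eye(D)

    # Efficient inversion via the matrix inversion (Woodbury) lemma:
    # C^{-1} = (I - W Mmat^{-1} W^T) / sigma2, with Mmat = W^T W + sigma2 I_M,
    # an M x M matrix, so the cost is O(M^3) rather than O(D^3).
    Mmat = W.T @ W + sigma2 * np.eye(M)
    C_inv = (np.eye(D) - W @ np.linalg.solve(Mmat, W.T)) / sigma2

    assert np.allclose(C_inv, np.linalg.inv(C))  # agrees with direct inversion

    # Log-density of one observation under the marginal Gaussian.
    x = X[0]
    diff = x - mu
    _, logdet = np.linalg.slogdet(C)
    log_px = -0.5 * (D * np.log(2 * np.pi) + logdet + diff @ C_inv @ diff)

Inverting the M x M matrix rather than the D x D covariance is what makes evaluating the probability of an observation cheap when the latent dimension is much smaller than the data dimension.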

Download

1 format

Video Formats

360p (mp4), 50.3 MB

Right-click 'Download' and select 'Save Link As' if the file opens in a new tab.