In this video, we start our discussion of probabilistic PCA, where we view our observations as having been produced by a probabilistic generative process. In particular, we assume that each observation was produced by an affine transformation of isotropic Gaussian latent variables, corrupted by isotropic Gaussian noise: x = Wz + μ + ε, with z ~ N(0, I) and ε ~ N(0, σ²I). Because both the latent prior and the conditional distribution are Gaussian, the marginal distribution of observations is also Gaussian, with mean μ and covariance WWᵀ + σ²I, both simple to compute. We discuss how the latent variable approach allows us to explain correlations in data as transformations of uncorrelated latents. Along the way, we point out a rotational redundancy in the latent variables, and discuss how to efficiently perform the covariance matrix inversions required to evaluate the probability of a particular observation.
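As a companion to the discussion, here is a minimal NumPy sketch of the ideas above. It samples data from the probabilistic PCA generative model, forms the Gaussian marginal covariance, and inverts it efficiently via the Woodbury matrix identity, which reduces the d × d inversion to a k × k one. The dimensions, variable names, and parameter values are illustrative choices, not taken from the video.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the video)
d, k = 50, 3          # observed and latent dimensionality
n = 1000              # number of samples

# Generative model: x = W z + mu + eps,  z ~ N(0, I_k),  eps ~ N(0, sigma^2 I_d)
W = rng.normal(size=(d, k))
mu = rng.normal(size=d)
sigma2 = 0.5

Z = rng.normal(size=(n, k))                                  # latent samples
X = Z @ W.T + mu + np.sqrt(sigma2) * rng.normal(size=(n, d)) # observations

# Marginal covariance of x: C = W W^T + sigma^2 I_d  (a d x d matrix)
C = W @ W.T + sigma2 * np.eye(d)

# Woodbury identity: C^{-1} = (I_d - W M^{-1} W^T) / sigma^2,
# where M = W^T W + sigma^2 I_k is only k x k, so the inversion is cheap.
M = W.T @ W + sigma2 * np.eye(k)
C_inv = (np.eye(d) - W @ np.linalg.solve(M, W.T)) / sigma2

# Agrees with the direct d x d inversion
assert np.allclose(C_inv, np.linalg.inv(C))

# Matching determinant identity: |C| = |M| * sigma^{2(d-k)}
logdet_C = np.linalg.slogdet(M)[1] + (d - k) * np.log(sigma2)
assert np.isclose(logdet_C, np.linalg.slogdet(C)[1])

# Gaussian log-density of one observation under N(mu, C), using the cheap pieces
diff = X[0] - mu
logp = -0.5 * (d * np.log(2 * np.pi) + logdet_C + diff @ C_inv @ diff)
print(logp)
```

The assertions check the two identities that make evaluation efficient: the Woodbury inverse matches the direct inverse, and the determinant of the d × d covariance reduces to a k × k determinant plus a scalar term.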