Scientific talk by Tom Szwagier
Multilevel Machine Learning with Flags
Abstract: In this talk, I will try to democratize a geometrical object that has not yet been much investigated in Machine Learning. A flag is a sequence of nested subspaces of increasing dimension. Flag spaces unify and generalize Grassmannians and Stiefel manifolds, which are heavily used in Machine Learning and Computer Vision. Many dimension reduction methods, like Principal Component Analysis (PCA), can be rethought as the search for a flag of linear subspaces that successively best approximate the data. This multilevel point of view enriches the previous methods, which usually work at a fixed intrinsic dimension that is often unknown or not even well-defined. Alternatively, a flag can be seen as a sequence of mutually orthogonal subspaces and therefore provides a natural parameterization for the eigenspaces of symmetric matrices. In a recent paper, called Stratified PCA, we investigate parsimonious latent variable generative models extending Probabilistic PCA and parameterized with flags. We show that the covariance eigenvectors are not the best features to analyze the data variability when the number of samples is limited and the eigenvalue gaps are small. Instead, we recommend grouping them into flags of eigenspaces. I hope that after this talk you will see flags everywhere in your daily research!
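To make the two key ideas concrete, here is a minimal NumPy sketch (not the speaker's code, and the gap threshold is a hypothetical choice): the PCA eigenvectors of a covariance matrix define a flag of nested subspaces, and eigenvectors separated by small eigenvalue gaps can be grouped into eigenspace blocks.

```python
import numpy as np

# Illustrative sketch only: PCA as a flag of nested subspaces, plus a
# simple grouping of eigenvectors by eigenvalue gaps. The data, the
# threshold, and the variable names are assumptions for illustration.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5)) * np.array([3.0, 2.9, 1.0, 0.5, 0.1])

cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)              # ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # reorder descending

# A flag: V_1 subset of V_2 subset of ... where V_k is spanned by the
# top-k principal directions. Each V_k best approximates the data at
# dimension k, which is the multilevel view of PCA.
flag = [eigvecs[:, :k] for k in range(1, eigvecs.shape[1] + 1)]

# Group consecutive eigenvectors whose eigenvalue gap is below a
# (hypothetical) threshold into a common eigenspace block; the resulting
# blocks are mutually orthogonal and define a flag of eigenspaces.
gap_threshold = 0.5
groups, current = [], [0]
for i in range(1, len(eigvals)):
    if eigvals[i - 1] - eigvals[i] < gap_threshold:
        current.append(i)
    else:
        groups.append(current)
        current = [i]
groups.append(current)
print(groups)  # indices of eigenvectors, partitioned into eigenspace blocks
```

When gaps are small relative to sampling noise, individual eigenvectors are poorly identified, but the eigenspace blocks (and hence the flag) remain stable, which is the intuition behind grouping them.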
When: Monday, Jan 22 at 2pm
Where: Euler Violet