Abstract
Latent space geometry has shown itself to provide a rich and rigorous framework for interacting with the latent variables of deep generative models. The existing theory, however, relies on the decoder being a Gaussian distribution as its simple reparametrization allows us to interpret the generating process as a random projection of a deterministic manifold. Consequently, this approach breaks down when applied to decoders that are not as easily reparametrized. We here propose to use the Fisher-Rao metric associated with the space of decoder distributions as a reference metric, which we pull back to the latent space. We show that we can achieve meaningful latent geometries for a wide range of decoder distributions for which the previous theory was not applicable, opening the door to 'black box' latent geometries.
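The pullback described in the abstract can be sketched concretely: if a decoder maps a latent code z to the parameters of an output distribution, the latent metric is G(z) = J(z)ᵀ F(θ(z)) J(z), where J is the decoder Jacobian and F is the Fisher information of the output family. The following is a minimal NumPy illustration with a hypothetical linear-sigmoid Bernoulli decoder (all names and the decoder architecture are assumptions for illustration, not the authors' implementation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decoder(z, W, b):
    # Hypothetical linear-sigmoid decoder: z -> Bernoulli means p in (0, 1)^D.
    return sigmoid(W @ z + b)

def decoder_jacobian(z, W, b):
    # Analytic Jacobian dp/dz of the sigmoid decoder above, shape (D, d).
    p = decoder(z, W, b)
    return (p * (1.0 - p))[:, None] * W

def pullback_fisher_rao_metric(z, W, b):
    # G(z) = J^T F(p(z)) J, where F is the (diagonal) Fisher information
    # of D independent Bernoulli pixels: F_ii = 1 / (p_i (1 - p_i)).
    p = decoder(z, W, b)
    J = decoder_jacobian(z, W, b)
    F_diag = 1.0 / (p * (1.0 - p))
    return J.T @ (F_diag[:, None] * J)  # shape (d, d)

rng = np.random.default_rng(0)
d, D = 2, 5                       # latent and output dimensions (toy sizes)
W = rng.standard_normal((D, d))
b = rng.standard_normal(D)
z = rng.standard_normal(d)
G = pullback_fisher_rao_metric(z, W, b)
```

Because G is a pullback of a Riemannian metric, it is symmetric and (when the Jacobian has full rank) positive definite, so it can be used for latent-space lengths and geodesics without ever reparametrizing the decoder distribution.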
| Original language | English |
|---|---|
| Conference proceedings | Proceedings of the 25th International Conference on Artificial Intelligence and Statistics (AISTATS) 2022 |
| Volume | 151 |
| Pages (from-to) | 4872-4894 |
| Number of pages | 23 |
| Publication status | Published - 2022 |
Keywords
- Latent space geometry
- Deep generative models
- Fisher-Rao metric
- Decoder distributions
- Latent variables