Pulling back information geometry

Georgios Arvanitidis, Miguel Gonzalez Duque, Alison Pouplin, Dimitris Kalatzis, Søren Hauberg

Research output: Conference article in journal › Research › peer-review

Abstract

Latent space geometry has shown itself to provide a rich and rigorous framework for interacting with the latent variables of deep generative models. The existing theory, however, relies on the decoder being a Gaussian distribution as its simple reparametrization allows us to interpret the generating process as a random projection of a deterministic manifold. Consequently, this approach breaks down when applied to decoders that are not as easily reparametrized. We here propose to use the Fisher-Rao metric associated with the space of decoder distributions as a reference metric, which we pull back to the latent space. We show that we can achieve meaningful latent geometries for a wide range of decoder distributions for which the previous theory was not applicable, opening the door to 'black box' latent geometries.
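
As a rough illustration of the pullback construction the abstract describes: if the decoder maps a latent point z to the parameters of a distribution whose Fisher information matrix is G, the induced latent metric is M(z) = J(z)^T G(z) J(z), with J the decoder Jacobian. The sketch below is not from the paper; it assumes a toy decoder outputting the success probabilities of independent Bernoulli pixels (for which the Fisher information is diagonal, 1/(p_i(1-p_i))), and the weights W, latent point, and names `decoder` and `pullback_metric` are all hypothetical.

```python
import jax
import jax.numpy as jnp

# Placeholder "decoder" weights; in practice these would come from a
# trained generative model. D = 5 output pixels, d = 2 latent dimensions.
W = jax.random.normal(jax.random.PRNGKey(0), (5, 2))

def decoder(z):
    # Hypothetical decoder mapping a latent code z to the success
    # probabilities of D independent Bernoulli pixels.
    return jax.nn.sigmoid(W @ z)

def pullback_metric(z):
    # Pullback of the Fisher-Rao metric: M(z) = J(z)^T G(p(z)) J(z),
    # where J is the decoder Jacobian and G is the Fisher information
    # of a product of Bernoullis, diag(1 / (p * (1 - p))).
    p = decoder(z)
    J = jax.jacfwd(decoder)(z)      # (D, d) Jacobian of the decoder
    G = 1.0 / (p * (1.0 - p))       # diagonal Fisher information
    return J.T @ (G[:, None] * J)   # (d, d) Riemannian metric at z

M = pullback_metric(jnp.array([0.3, -0.8]))  # symmetric PSD 2 x 2 matrix
```

Because the Fisher information of a product of independent Bernoullis is diagonal, the pullback costs only one Jacobian evaluation plus elementwise scaling; other decoder families would substitute their own Fisher information for G.
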
Original language: English
Journal: Proceedings of the 25th International Conference on Artificial Intelligence and Statistics (AISTATS) 2022
Volume: 151
Pages (from-to): 4872–4894
Number of pages: 23
Publication status: Published - 2022

Keywords

  • Latent space geometry
  • Deep generative models
  • Fisher-Rao metric
  • Decoder distributions
  • Latent variables
