Deep interactive evolution

Philip Bontrager, Wending Lin, Julian Togelius, Sebastian Risi

Publication: Article in proceedings / book or report chapter · Conference contribution in proceedings · Research · Peer-reviewed

Abstract

This paper describes an approach that combines generative adversarial networks (GANs) with interactive evolutionary computation (IEC). While GANs can be trained to produce lifelike images, their outputs are normally sampled randomly from the learned distribution, providing limited control over the resulting output. On the other hand, interactive evolution has shown promise in creating various artifacts such as images, music and 3D objects, but traditionally relies on a hand-designed evolvable representation of the target domain. The main insight in this paper is that a GAN trained on a specific target domain can act as a compact and robust genotype-to-phenotype mapping (i.e. most produced phenotypes do resemble valid domain artifacts). Once such a GAN is trained, the latent vector given as input to the GAN's generator network can be put under evolutionary control, allowing controllable and high-quality image generation. In this paper, we demonstrate the advantage of this novel approach through a user study in which participants were able to evolve images that strongly resemble specific target images.
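As a rough illustration of the latent-vector evolution described above — a minimal sketch, not the authors' implementation — the snippet below assumes a pretrained GAN generator wrapped in a hypothetical `generate(z)` function (left as a stub here), and shows how a population of latent vectors could be bred from user selections via crossover and Gaussian mutation. The names `LATENT_DIM`, `POPULATION`, `MUTATION_STD`, and `next_generation` are illustrative assumptions, not terms from the paper.

```python
import numpy as np

LATENT_DIM = 100     # typical GAN latent size (assumption)
POPULATION = 8       # images shown to the user each generation (assumption)
MUTATION_STD = 0.3   # scale of Gaussian noise added to offspring (assumption)


def generate(z):
    """Stub: run a pretrained GAN generator on latent vector z to get an image."""
    raise NotImplementedError("plug in a trained generator network here")


def next_generation(population, selected_indices, rng):
    """Breed a new population of latent vectors from the user's selections."""
    parents = population[selected_indices]
    children = []
    for _ in range(POPULATION):
        # uniform crossover between two (possibly identical) selected parents
        a, b = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(LATENT_DIM) < 0.5
        child = np.where(mask, a, b)
        # Gaussian mutation keeps the child near the generator's input distribution
        child = child + rng.normal(0.0, MUTATION_STD, LATENT_DIM)
        children.append(child)
    return np.stack(children)


rng = np.random.default_rng(0)
population = rng.normal(size=(POPULATION, LATENT_DIM))  # initial random latents
# Each round: render generate(z) for every z, let the user pick the images they
# like, then breed the next population from those picks, e.g.:
#   selected = np.array([2, 5])
#   population = next_generation(population, selected, rng)
```

Because selection is performed by the user rather than by a fitness function, this is interactive evolution in the usual sense; the GAN only supplies the genotype-to-phenotype mapping from latent vectors to images.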
Original language: English
Title: International Conference on Computational Intelligence in Music, Sound, Art and Design: EvoMUSART 2018
Number of pages: 16
Publisher: Springer
Publication date: 2018
Pages: 267-282
ISBN (Print): 978-3-319-77582-1
ISBN (Electronic): 978-3-319-77583-8
DOI
Status: Published - 2018
Series: Lecture Notes in Computer Science
Volume: 10783
ISSN: 0302-9743

Keywords

  • Generative Adversarial Networks (GANs)
  • Interactive Evolutionary Computation (IEC)
  • Image Generation
  • Latent Space Evolution
  • Genotype-Phenotype Mapping
