Deep interactive evolution

Philip Bontrager, Wending Lin, Julian Togelius, Sebastian Risi

Research output: Conference Article in Proceeding or Book/Report chapter › Article in proceedings › Research › peer-review

Abstract

This paper describes an approach that combines generative adversarial networks (GANs) with interactive evolutionary computation (IEC). While GANs can be trained to produce lifelike images, their outputs are normally sampled at random from the learned distribution, providing limited control over the result. Interactive evolution, on the other hand, has shown promise in creating various artifacts such as images, music, and 3D objects, but traditionally relies on a hand-designed evolvable representation of the target domain. The main insight in this paper is that a GAN trained on a specific target domain can act as a compact and robust genotype-to-phenotype mapping (i.e., most produced phenotypes do resemble valid domain artifacts). Once such a GAN is trained, the latent vector given as input to the GAN's generator network can be put under evolutionary control, allowing controllable and high-quality image generation. In this paper, we demonstrate the advantage of this novel approach through a user study in which participants were able to evolve images that strongly resemble specific target images.
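The procedure the abstract describes, evolving the GAN's latent vectors under user selection, can be sketched roughly as follows. This is a minimal illustration, not the paper's exact implementation: the generator callable, latent dimensionality, population size, and the specific crossover and mutation operators are assumptions made for the example.

```python
# Minimal sketch of latent-variable interactive evolution over a pretrained
# GAN generator. All hyperparameters below are illustrative assumptions.
import numpy as np

LATENT_DIM = 100      # assumed size of the generator's latent input
POP_SIZE = 8          # images shown to the user in each generation
MUTATION_SIGMA = 0.3  # assumed Gaussian mutation strength


def generate_images(generator, population):
    """Map genotypes (latent vectors) to phenotypes (images) via the GAN."""
    return [generator(z) for z in population]


def next_generation(population, selected_indices, rng):
    """Breed a new population from the user-selected latent vectors."""
    parents = [population[i] for i in selected_indices]
    children = []
    while len(children) < POP_SIZE:
        # Uniform crossover between two (possibly identical) parents.
        a, b = rng.choice(len(parents), size=2)
        mask = rng.random(LATENT_DIM) < 0.5
        child = np.where(mask, parents[a], parents[b])
        # Gaussian mutation explores the latent space around chosen designs.
        child = child + rng.normal(0.0, MUTATION_SIGMA, LATENT_DIM)
        children.append(child)
    return children


def interactive_evolution(generator, select_fn, generations=10, seed=0):
    """Run IEC in the GAN's latent space.

    `select_fn(images) -> list[int]` stands in for the human user picking
    the preferred images in each generation.
    """
    rng = np.random.default_rng(seed)
    population = [rng.normal(size=LATENT_DIM) for _ in range(POP_SIZE)]
    for _ in range(generations):
        images = generate_images(generator, population)
        chosen = select_fn(images)
        population = next_generation(population, chosen, rng)
    return population
```

Because each genotype is simply a latent vector, the variation operators stay domain-independent; as the abstract notes, the trained generator acts as the genotype-to-phenotype mapping, so most offspring still decode to plausible images of the target domain.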
Original language: English
Title of host publication: International Conference on Computational Intelligence in Music, Sound, Art and Design: EvoMUSART 2018
Number of pages: 16
Publisher: Springer
Publication date: 2018
Pages: 267-282
ISBN (Print): 978-3-319-77582-1
ISBN (Electronic): 978-3-319-77583-8
DOIs:
Publication status: Published - 2018
Series: Lecture Notes in Computer Science
Volume: 10783
ISSN: 0302-9743
