The Need for Explainability in AI-Based Creativity Support Tools

Antonios Liapis, Jichen Zhu

Research output: Conference article in proceedings or book/report chapter › Article in proceedings › Research › peer-reviewed

Abstract

A long lineage of computer-assisted design tools has established interaction
paradigms that give the designer full control over the software. Introducing
Artificial Intelligence (AI) into this creative process leads to a more co-creative
paradigm, with AI taking a more proactive role. Recent generative approaches
based on deep learning have strong potential as asset creators and co-creators;
however, current algorithms are opaque and burden the designer with making sense
of their output. For deep learning to become a colleague that designers can
trust and work with, better explainability, controllability, and interactivity are necessary. We highlight current and potential ways in which explainability can inform
human users in creative tasks and call for involving end-users in the development
of both interfaces and underlying algorithms.
Original language: English
Title of host publication: Proceedings of the Human Centered AI workshop at NeurIPS 2022
Number of pages: 3
Publication date: 2022
Publication status: Published - 2022

Keywords

  • Computer-Assisted Design
  • Co-Creativity
  • Artificial Intelligence
  • Deep Learning
  • Explainability
