Towards an Empathizing and Adaptive Storyteller System

Byung Chull Bae, Alberto Brunete, Usman Malik, Evanthia Dimara, Jermsak Jermsurawong, Nikolaos Mavridis

    Research output: Article in proceedings › Research › peer-review

    Abstract

    This paper describes our ongoing effort to build an empathizing and adaptive storyteller system. The system under development aims to utilize emotional expressions generated by an avatar or a humanoid robot, together with the listener's responses monitored in real time, in order to deliver a story effectively. We conducted a pilot study and analyzed the results in two ways: first, through a survey questionnaire based on the participants' subjective ratings; second, through automated video analysis of the participants' emotional facial expressions and eye blinking. The survey results show that male participants tend to empathize more with a story character when a virtual storyteller is present than with audio-only narration. The video analysis results suggest that the participants' eye-blink count is inversely related to their attention.
    Original language: English
    Title of host publication: Proceedings of the Eighth AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment, AIIDE-12
    Number of pages: 3
    Publisher: AAAI Press
    Publication date: Oct 2012
    ISBN (Print): 978-1-57735-585-4
    Publication status: Published - Oct 2012

    Keywords

    • Narrative, Storytelling, Emotional Modeling
