Towards an Empathizing and Adaptive Storyteller System

Abstract
This paper describes our ongoing effort to build an empathizing and adaptive storyteller system. The system under development aims to use emotional expressions generated by an avatar or a humanoid robot, together with the listener's responses monitored in real time, to deliver a story effectively. We conducted a pilot study and analyzed the results in two ways: first, through a survey questionnaire based on the participants' subjective ratings; second, through automated video analysis of the participants' emotional facial expressions and eye blinking. The questionnaire results show that male participants tend to empathize more with a story character when a virtual storyteller is present than with audio-only narration. The video analysis results suggest that the participants' eye-blink frequency is inversely related to their attention.
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the Eighth AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment, AIIDE-12 |
| Number of pages | 3 |
| Publisher | AAAI Press |
| Publication date | Oct 2012 |
| ISBN (Print) | 978-1-57735-585-4 |
| Publication status | Published - Oct 2012 |
Keywords
- Narrative, Storytelling, Emotional Modeling