An Affect Detection Technique Using Mobile Commodity Sensors in the Wild

Aske Mottelson, Kasper Hornbæk

Research output: Conference Article in Proceeding or Book/Report chapter › Article in proceedings › Research › peer-review

Abstract

Current techniques to computationally detect human affect often depend on specialized hardware, work only in laboratory settings, or require substantial individual training. We use sensors in commodity smartphones to estimate affect in the wild, with no training time, based on a link between affect and movement. The first experiment had 55 participants do touch interactions after exposure to positive or neutral emotion-eliciting films; positive affect resulted in faster but less precise interactions, in addition to differences in rotation and acceleration. Using off-the-shelf machine learning algorithms, we report 89.1% accuracy in binary affective classification, grouping participants by their self-assessments. A follow-up experiment validated the findings of the first: it collected naturally occurring affect from 127 participants, who again performed touch interactions. The results demonstrate that affect has a direct behavioral effect on mobile interaction and that affect detection using common smartphone sensors is feasible.
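To illustrate the kind of pipeline the abstract describes, the sketch below classifies binary affect labels from per-interaction touch and motion features with an off-the-shelf classifier. It is a minimal sketch under stated assumptions: the feature names (tap duration, target offset, gyroscope and accelerometer magnitudes), the synthetic data, and the choice of scikit-learn's RandomForestClassifier are illustrative and are not the paper's exact features or model.

```python
# Hypothetical sketch: binary affect classification from touch/motion features
# using an off-the-shelf classifier (scikit-learn). Features, data, and model
# choice are illustrative assumptions, not the paper's exact pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_interactions = 500

# Placeholder features per touch interaction, loosely matching the cues the
# abstract mentions: speed, precision (offset from target), and device
# rotation/acceleration statistics.
X = np.column_stack([
    rng.normal(300, 50, n_interactions),   # tap duration / speed proxy (ms)
    rng.normal(12, 4, n_interactions),     # offset from target centre (px)
    rng.normal(0.5, 0.2, n_interactions),  # mean gyroscope magnitude (rad/s)
    rng.normal(9.8, 0.3, n_interactions),  # mean accelerometer magnitude (m/s^2)
])
y = rng.integers(0, 2, n_interactions)     # binary self-assessed affect label

# Off-the-shelf classifier, evaluated with 5-fold cross-validation.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.3f}")
```

In practice the features would come from logged touch events and inertial sensor readings rather than random numbers, and accuracy would be measured against participants' self-assessed affect, as in the experiments described above.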
Original language: Undefined/Unknown
Title of host publication: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery
Publication date: 2016
Pages: 781–792
ISBN (Print): 9781450344616
Publication status: Published - 2016
Externally published: Yes
Series: UbiComp '16

Keywords

  • affect detection
  • smartphone
  • touch
  • crowdsourcing
  • affective computing
