XQM: Interactive Learning on Mobile Phones

Alexandra M. Bagi, Kim I. Schild, Omar Shahbaz Khan, Jan Zahálka, Björn Thór Jónsson

Research output: Article in proceedings · Research · peer-reviewed

Abstract

There is an increasing need for intelligent interaction with media collections, and mobile phones are gaining significant traction as the device of choice for many users. In this paper, we present XQM, a mobile approach for intelligent interaction with the user's media on the phone, tackling the inherent challenges of highly dynamic mobile media collections and the limited computational resources of the mobile device. We employ interactive learning, a method that conducts interaction rounds with the user: in each round, the system suggests relevant images based on its current model, the user provides relevance labels, the system retrains its model on these labels, and the system obtains a new set of suggestions for the next round. This method is well suited to both the dynamic nature of mobile media collections and the limited computational resources. We show that XQM, a full-fledged app implemented for Android, operates on 10K image collections in interactive time (less than 1.4 s per interaction round), and we evaluate user experience in a user study that confirms XQM's effectiveness.
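The interaction rounds described above (suggest, label, retrain) can be sketched as a simple relevance-feedback loop. The sketch below is illustrative only: the Rocchio-style centroid update, the `oracle` labeling function, and all parameter choices are assumptions for demonstration, not XQM's actual model or implementation.

```python
import numpy as np

def suggest(features, w, labeled, k=3):
    """Rank items by the current model's score and return the top-k unlabeled indices."""
    scores = features @ w
    scores[list(labeled)] = -np.inf  # never re-suggest already-labeled items
    return np.argsort(scores)[::-1][:k]

def interactive_learning(features, oracle, rounds=5, k=3, seed=0):
    """Run interaction rounds: suggest -> user labels -> retrain (Rocchio-style update)."""
    rng = np.random.default_rng(seed)
    n, d = features.shape
    w = rng.normal(size=d)           # initial (random) relevance model
    pos, neg = [], []                # indices labeled relevant / irrelevant so far
    for _ in range(rounds):
        batch = suggest(features, w, set(pos) | set(neg), k)
        for i in batch:              # the user provides relevance labels
            (pos if oracle(i) else neg).append(i)
        # retrain: move the model toward relevant items, away from irrelevant ones
        if pos:
            w = features[pos].mean(axis=0)
            if neg:
                w = w - 0.5 * np.asarray(features)[neg].mean(axis=0)
    return pos
```

Because each round retrains only a lightweight model over precomputed features, the per-round cost stays small, which is the property that makes this style of interaction feasible under mobile resource constraints.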
Original language: English
Title of host publication: MultiMedia Modeling - 27th International Conference, MMM 2021, Prague, Czech Republic, June 22-24, 2021, Proceedings, Part II
Number of pages: 13
Place of publication: Prague, Czech Republic (virtual)
Publisher: Springer
Publication date: Jun 2021
Pages: 281-293
ISBN (Print): 978-3-030-67834-0
DOIs
Publication status: Published - Jun 2021
Series: Lecture Notes in Computer Science
Volume: 12573
ISSN: 0302-9743

Keywords

  • Interactive learning
  • Relevance feedback
  • Mobile devices
