XQM: Interactive Learning on Mobile Phones

Alexandra M. Bagi, Kim I. Schild, Omar Shahbaz Khan, Jan Zahálka, Björn Thór Jónsson

Publication: Conference article in proceedings or book/report chapter › Conference contribution in proceedings › Research › Peer-reviewed

Abstract

There is an increasing need for intelligent interaction with media collections, and mobile phones are gaining significant traction as the device of choice for many users. In this paper, we present XQM, a mobile approach for intelligent interaction with the user's media directly on the phone, which tackles the inherent challenges of highly dynamic media collections and the limited computational resources of mobile devices. We employ interactive learning: in each interaction round, the system suggests relevant images based on its current model, the user provides relevance labels, the model is retrained on these labels, and a new set of suggestions is produced for the next round. This approach is well suited to the dynamic nature of mobile media collections and the limited computational resources available. We show that XQM, a full-fledged app implemented for Android, operates on 10K-image collections in interactive time (less than 1.4 s per interaction round), and we evaluate the user experience in a user study that confirms XQM's effectiveness.
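To make the interaction-round structure concrete, the following Kotlin sketch shows one possible shape of the suggest → label → retrain loop over precomputed image features. The class names, the 2-D toy features, and the logistic-regression-style update are illustrative assumptions for this sketch only; they are not XQM's actual classifier, feature pipeline, or API.

```kotlin
import kotlin.math.exp

// Hypothetical image record: an id plus a precomputed feature vector
// (e.g., extracted once on-device); names and dimensions are illustrative.
data class ImageItem(val id: Int, val features: DoubleArray)

// Minimal linear relevance model, updated with logistic-regression-style
// gradient steps on the user's positive/negative labels.
class RelevanceModel(dim: Int) {
    private val weights = DoubleArray(dim)

    fun score(x: DoubleArray): Double =
        1.0 / (1.0 + exp(-weights.indices.sumOf { weights[it] * x[it] }))

    fun train(labeled: List<Pair<DoubleArray, Int>>, epochs: Int = 20, lr: Double = 0.1) {
        repeat(epochs) {
            for ((x, y) in labeled) {
                val err = y - score(x)          // y is 1 (relevant) or 0 (not relevant)
                for (i in weights.indices) weights[i] += lr * err * x[i]
            }
        }
    }
}

// One interaction round: rank images not yet judged by the current model
// and return the top-k as the next suggestion set.
fun suggest(model: RelevanceModel, collection: List<ImageItem>,
            judged: Set<Int>, k: Int = 6): List<ImageItem> =
    collection.filter { it.id !in judged }
        .sortedByDescending { model.score(it.features) }
        .take(k)

fun main() {
    // Toy 2-D features stand in for real image descriptors.
    val collection = List(100) { ImageItem(it, doubleArrayOf(it % 10 / 10.0, it / 100.0)) }
    val model = RelevanceModel(dim = 2)
    val labels = mutableListOf<Pair<DoubleArray, Int>>()
    val judged = mutableSetOf<Int>()

    repeat(3) { round ->
        val suggestions = suggest(model, collection, judged)
        // In the app the user supplies these labels; here we simulate them.
        for (img in suggestions) {
            val label = if (img.features[0] > 0.5) 1 else 0
            labels += img.features to label
            judged += img.id
        }
        model.train(labels)
        println("Round $round: suggested ${suggestions.map { it.id }}")
    }
}
```

Keeping the model linear and retraining only on the small set of user-labeled examples is what keeps each round cheap enough for interactive use on a phone; the actual system's model and features may differ from this sketch.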
Original language: English
Title: MultiMedia Modeling - 27th International Conference, MMM 2021, Prague, Czech Republic, June 22-24, 2021, Proceedings, Part II
Number of pages: 13
Place of publication: Prague, Czech Republic (virtual)
Publisher: Springer
Publication date: June 2021
Pages: 281-293
ISBN (Print): 978-3-030-67834-0
DOI
Status: Published - June 2021
Series: Lecture Notes in Computer Science
Volume: 12573
ISSN: 0302-9743

Keywords

  • Interactive learning
  • Relevance feedback
  • Mobile devices
