Dissimilarity-based ensembles for multiple instance learning

V. Cheplygina, D.M.J. Tax, M. Loog

Research output: Journal article › Research › peer-review

Abstract

In multiple instance learning, objects are sets (bags) of feature vectors (instances) rather than individual feature vectors. In this paper, we address the problem of how these bags can best be represented. Two standard approaches are to use (dis)similarities between bags and prototype bags, or between bags and prototype instances. The first approach results in a relatively low-dimensional representation, determined by the number of training bags, whereas the second approach results in a relatively high-dimensional representation, determined by the total number of instances in the training set. However, an advantage of the latter representation is that the informativeness of the prototype instances can be inferred. In this paper, a third, intermediate approach is proposed, which links the two approaches and combines their strengths. Our classifier is inspired by a random subspace ensemble, and considers subspaces of the dissimilarity space, defined by subsets of instances, as prototypes. We provide insight into the structure of some popular multiple instance problems and show state-of-the-art performances on these data sets.
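To make the idea concrete, below is a minimal Python sketch of a dissimilarity-based subspace ensemble in the spirit of the abstract: each bag is represented by its dissimilarities to prototype instances, and each base classifier is trained on a random subset of those prototypes, i.e., a random subspace of the instance dissimilarity space. The minimum-distance bag-to-instance dissimilarity, the logistic-regression base classifier, and all names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.linear_model import LogisticRegression

# Sketch only (not the paper's code). Bags are lists of (n_i, d) numpy arrays.

def bag_to_instance_dissim(bags, prototypes):
    """Represent each bag by its dissimilarity to every prototype instance.
    Here: minimum Euclidean distance from the bag's instances to each prototype
    (one common MIL choice; assumed, not taken from the paper)."""
    return np.array([cdist(bag, prototypes).min(axis=0) for bag in bags])

class DissimilaritySubspaceEnsemble:
    """Random-subspace-style ensemble over the instance dissimilarity space:
    each base classifier sees only the dissimilarities to a random subset
    of training instances (the prototypes defining its subspace)."""

    def __init__(self, n_estimators=50, subspace_size=100, random_state=0):
        self.n_estimators = n_estimators
        self.subspace_size = subspace_size
        self.random_state = random_state

    def fit(self, bags, y):
        rng = np.random.default_rng(self.random_state)
        all_instances = np.vstack(bags)  # every training instance is a candidate prototype
        self.models_ = []
        for _ in range(self.n_estimators):
            idx = rng.choice(len(all_instances),
                             size=min(self.subspace_size, len(all_instances)),
                             replace=False)
            protos = all_instances[idx]
            D = bag_to_instance_dissim(bags, protos)   # bags x selected prototypes
            clf = LogisticRegression(max_iter=1000).fit(D, y)
            self.models_.append((protos, clf))
        return self

    def predict(self, bags):
        # Majority vote over the base classifiers (labels assumed to be 0/1).
        votes = [clf.predict(bag_to_instance_dissim(bags, protos))
                 for protos, clf in self.models_]
        return (np.mean(votes, axis=0) >= 0.5).astype(int)
```

The subspace size controls where this sketch sits between the two standard representations discussed in the abstract: very small subsets behave like a low-dimensional prototype representation, while using all training instances recovers the full instance-based dissimilarity space.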
Original language: English
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 27
Issue: 6
Pages: 1379-1391
Number of pages: 13
ISSN: 2162-237X
DOI:
Publication status: Published - Jun 2016
Externally published: Yes

Keywords

  • Combining classifiers
  • dissimilarity representation
  • multiple instance learning (MIL)
  • random subspace method (RSM)

