The goal of the SBS 2015 Suggestion Track is to evaluate approaches for supporting users who express their information needs both in a query and through example books when searching collections of books. The track investigates the complex nature of relevance in book search and the role of traditional and user-generated book metadata in retrieval. We extended last year's investigation into the nature of book suggestions from the LibraryThing forums and how they compare to book relevance judgements. Participants were encouraged to incorporate rich user profiles of both topic creators and other LibraryThing users to explore the relative value of recommendation and retrieval paradigms for book search. We found further support that such suggestions are a valuable alternative to traditional test collections based on top-k pooling and editorial relevance judgements. In terms of system evaluation, the most effective systems incorporate some form of learning to rank. The complex nature of the requests and the book descriptions, with multiple sources of evidence, appears to require careful balancing of system parameters.
|Journal||CEUR Workshop Proceedings|
|Status||Published - 1 Jan. 2015|
|Event||16th Conference and Labs of the Evaluation Forum, CLEF 2015 - Toulouse, France|
Duration: 8 Sep. 2015 → 11 Sep. 2015