Repeatability and Workability Evaluation of SIGMOD 2011

Philippe Bonnet, Stefan Manegold, Matias Bjorling, Wei Cao, Javier Gonzales, Joel Granados, Nancy Hall, Stratos Idreos, Milena Ivanova, Ryan Johnson, David Koop, Tim Kraska, René Müller, Dan Olteanu, Paolo Papotti, Christine Reilly, Dimitris Tsirogiannis, Cong Yu, Juliana Freire and Dennis Shasha

    Research output: Journal article · Research · peer-reviewed

    Abstract

    Since 2008, SIGMOD has offered to verify the experiments published in the papers accepted at the conference. This year, we have been in charge of reproducing the experiments provided by the authors (repeatability) and exploring changes to experiment parameters (workability). In this paper, we assess the SIGMOD repeatability process in terms of participation, review process and results. While participation is stable in terms of the number of submissions, we find this year a sharp contrast between the high participation from Asian authors and the low participation from American authors. We also find that most experiments are distributed as Linux packages accompanied by instructions on how to set up and run the experiments. We are still far from the vision of executable papers.
    Original language: English
    Journal: SIGMOD Record
    Volume: 40
    Issue number: 2
    Pages (from-to): 45-48
    Number of pages: 5
    ISSN: 0163-5808
    Publication status: Published - 2011

    Keywords

    • Reproducibility
    • Experimental Verification
    • Repeatability
    • Workability
    • Executable Papers
