Abstract
Since 2008, SIGMOD has offered to verify the experiments published in the papers accepted at the conference. This year, we were in charge of reproducing the experiments provided by the authors (repeatability) and exploring changes to experiment parameters (workability). In this paper, we assess the SIGMOD repeatability process in terms of participation, review process, and results. While participation is stable in terms of the number of submissions, we find this year a sharp contrast between the high participation of Asian authors and the low participation of American authors. We also find that most experiments are distributed as Linux packages accompanied by instructions on how to set up and run the experiments. We are still far from the vision of executable papers.
Original language | English |
---|---|
Journal | SIGMOD Record |
Volume | 40 |
Issue number | 2 |
Pages (from-to) | 45-48 |
Number of pages | 5 |
ISSN | 0163-5808 |
Status | Published - 2011 |