In 2008, ACM SIGMOD became the first conference to propose testing the code accompanying submitted papers. The repeatability and reproducibility efforts between 2008 and 2012 are summarized on this web page.
The goal of establishing reproducibility is to ensure that your SIGMOD 2012 research paper stands as reliable work that future research can build on. The premise is that experimental papers are most useful when their results have been tested and generalized by objective third parties.
SIGMOD 2011 offers authors an experimental repeatability and workability evaluation of their accepted papers. The repeatability & workability process verifies that the experiments published in SIGMOD 2011 papers can be reproduced (repeatability) and possibly extended by modifying some aspects of the experimental design (workability). Participation is voluntary, but authors benefit as well: (i) mention on the repeatability website, (ii) the ability to run their software on other sites, and (iii) often, far better documentation for new members of their research teams.
The repeatability and workability evaluation in conjunction with SIGMOD 2010 continues along the lines of the 2009 edition, with some procedural improvements. On a voluntary basis, authors of accepted SIGMOD 2010 papers can provide their code/binaries, experimental setups, and data to be tested for (i) repeatability of the experiments described in the accepted papers, and (ii) workability, in the sense of running different or additional experiments with different or additional parameters than those shown in the respective papers.
Given the positive experiences with, and feedback on, the SIGMOD 2008 repeatability initiative, SIGMOD 2009 continued the initiative in a slightly modified and extended form. A report on this effort has been published in ACM SIGMOD Record, 38(3):40-43, September 2009.
SIGMOD 2008 was the first database conference to propose testing the code associated with conference submissions against the data sets used by the authors, in order to verify the repeatability of the experiments presented in the submitted papers. A detailed report on this initiative has been published in ACM SIGMOD Record, 37(1):39-45, March 2008.