Repeatable and reliable search system evaluation using crowdsourcing
Proceedings of the 34th international ACM SIGIR conference on Research and development in Information Retrieval (SIGIR '11)
By Daniel M. Herzig, Harry Halpin, Henry S. Thompson, Jeffrey Pound, Peter Mika, Roi Blanco, Thanh Tran Duc
Issue Date: July 2011
The primary problem confronting any new kind of search task is how to bootstrap a reliable and repeatable evaluation campaign, and a crowdsourcing approach provides many advantages. However, can these crowdsourced evaluations be repeated over long periods...