CrowdTerrier: automatic crowdsourced relevance assessments with Terrier


Abstract

In this demo, we present CrowdTerrier, an infrastructure extension to the open source Terrier IR platform that enables the semi-automatic generation of relevance assessments for a variety of document ranking tasks using crowdsourcing. The aim of CrowdTerrier is to reduce the time and expertise required to effectively crowdsource relevance assessments by abstracting away the complexities of the crowdsourcing process. It achieves this by automating the assessment process as much as possible, via a close integration of the IR system that ranks the documents (Terrier) and the crowdsourcing marketplace that is used to assess those documents (Amazon's Mechanical Turk).
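To make the Terrier-to-Mechanical-Turk hand-off concrete, the sketch below shows one plausible piece of such a pipeline: packaging a query/document pair from a ranked list into the HTMLQuestion XML that Mechanical Turk's CreateHIT operation accepts, so a worker can assign a graded relevance label. This is an illustrative assumption, not CrowdTerrier's actual code (CrowdTerrier is a Java extension to Terrier, and the abstract does not specify its HIT format); the function name `build_relevance_hit_question` and the three-level relevance scale are hypothetical.

```python
import xml.sax.saxutils as saxutils

# Schema namespace for MTurk HTMLQuestion payloads.
MTURK_HTMLQUESTION_NS = (
    "http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/"
    "2011-11-11/HTMLQuestion.xsd"
)

def build_relevance_hit_question(query: str, doc_text: str) -> str:
    """Wrap a query/document pair as an MTurk HTMLQuestion.

    Workers see the query and the document text, then choose a graded
    relevance label; the chosen value is posted back to MTurk via the
    form's externalSubmit action. (Hypothetical helper, not part of
    CrowdTerrier itself.)
    """
    # Escape user-supplied text so it cannot break the HTML form.
    html = f"""<!DOCTYPE html>
<html>
 <body>
  <form name='mturk_form' method='post'
        action='https://www.mturk.com/mturk/externalSubmit'>
   <h3>Query: {saxutils.escape(query)}</h3>
   <p>{saxutils.escape(doc_text)}</p>
   <p>How relevant is this document to the query?</p>
   <input type='radio' name='relevance' value='2'/> Highly relevant
   <input type='radio' name='relevance' value='1'/> Relevant
   <input type='radio' name='relevance' value='0'/> Not relevant
   <input type='submit'/>
  </form>
 </body>
</html>"""
    # The HTML is embedded in a CDATA section inside the question XML.
    return (
        f'<HTMLQuestion xmlns="{MTURK_HTMLQUESTION_NS}">'
        f"<HTML><![CDATA[{html}]]></HTML>"
        "<FrameHeight>450</FrameHeight>"
        "</HTMLQuestion>"
    )
```

In a live pipeline this XML string would be submitted with the MTurk API (e.g. boto3's `mturk` client `create_hit` call) once per top-ranked document; the submission and answer-retrieval steps are omitted here since they require AWS credentials.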
