AnnoRank: A Comprehensive Web-Based Framework for Collecting Annotations and Assessing Rankings
Clara Rus, Gabrielle Poerwawinata, Andrew Yates, Maarten de Rijke
Published in CIKM, 2024
We present AnnoRank, a web-based user interface (UI) framework designed to facilitate the collection of crowdsourced annotations in the context of information retrieval. AnnoRank enables the collection of explicit and implicit annotations for a specified query and one or more documents, allowing user-selected items to be observed and relevance judgments to be assigned. Furthermore, AnnoRank supports ranking comparisons, enabling the visualization and evaluation of ranked lists generated by different fairness interventions, along with their utility and fairness metrics.
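To illustrate the kind of utility and fairness metrics such a ranking-comparison view might report, the minimal sketch below computes nDCG and a position-based group-exposure share for a ranked list. The function names, the choice of metrics, and the example data are our own assumptions for illustration, not AnnoRank's actual API or metric set.

```python
import math

def ndcg(relevances, k=None):
    """Normalized discounted cumulative gain of a ranked list of relevance grades."""
    k = k or len(relevances)
    dcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))
    ideal = sum(rel / math.log2(i + 2)
                for i, rel in enumerate(sorted(relevances, reverse=True)[:k]))
    return dcg / ideal if ideal > 0 else 0.0

def group_exposure_share(groups, protected="protected"):
    """Share of position-based exposure (1 / log2(rank + 1)) received by the protected group."""
    exposures = [1.0 / math.log2(i + 2) for i in range(len(groups))]
    protected_exposure = sum(e for e, g in zip(exposures, groups) if g == protected)
    return protected_exposure / sum(exposures)

# Hypothetical ranked list: a relevance grade and a group label per position.
rels = [3, 2, 3, 0, 1]
grps = ["protected", "non_protected", "protected", "non_protected", "protected"]
print(f"nDCG@5: {ndcg(rels):.3f}, protected-group exposure share: {group_exposure_share(grps):.3f}")
```

In a comparison setting, metrics like these would be computed for each ranked list produced by a different fairness intervention, so that changes in utility can be weighed against changes in exposure.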