Sitemap
A list of all the posts and pages found on the site. For you robots out there, there is an XML version available for digesting as well.
Pages
Archive Layout with Content
Posts by Category
Posts by Collection
CV
Markdown
Page not in menu
Page Archive
Portfolio
Publications
Sitemap
Posts by Tags
Talk map
News
Teaching
Terms and Privacy Policy
Blog posts
Jupyter notebook markdown generator
Posts
Future Blog Post
Published:
This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.
Blog Post number 4
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 3
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 2
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 1
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
portfolio
Portfolio item number 1
Short description of portfolio item number 1
Portfolio item number 2
Short description of portfolio item number 2 
publications
Closing the gender wage gap: Adversarial fairness in job recommendation
Clara Rus, Jeffrey Luppes, Harrie Oosterhuis, Gido H Schoenmacker
Published in Recsys@HR, 2022
The goal of this work is to help mitigate the existing gender wage gap by supplying unbiased job recommendations based on the resumes of job seekers. We employ a generative adversarial network to remove gender bias from word2vec representations.
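As a rough illustration of the adversarial idea mentioned above, the sketch below debiases fixed embedding vectors with a small encoder whose output an adversary tries to use to predict gender. This is not the authors' implementation; every module, dimension, and loss weight is an assumption.

```python
# Minimal sketch of adversarial debiasing of embedding vectors.
# Not the paper's code: architectures, sizes, and weights are assumptions.
import torch
import torch.nn as nn

EMB_DIM, FAIR_DIM = 300, 128   # assumed word2vec size / debiased size

encoder = nn.Sequential(nn.Linear(EMB_DIM, FAIR_DIM), nn.ReLU(),
                        nn.Linear(FAIR_DIM, FAIR_DIM))
decoder = nn.Linear(FAIR_DIM, EMB_DIM)            # keeps job-relevant content
adversary = nn.Sequential(nn.Linear(FAIR_DIM, 64), nn.ReLU(),
                          nn.Linear(64, 1))       # tries to predict gender

opt_main = torch.optim.Adam([*encoder.parameters(), *decoder.parameters()], lr=1e-3)
opt_adv = torch.optim.Adam(adversary.parameters(), lr=1e-3)
bce, mse = nn.BCEWithLogitsLoss(), nn.MSELoss()

def train_step(x, gender):
    """x: (batch, EMB_DIM) resume embeddings; gender: (batch, 1) float labels in {0., 1.}."""
    # 1) Train the adversary to recover gender from the debiased representation.
    z = encoder(x).detach()
    opt_adv.zero_grad()
    adv_loss = bce(adversary(z), gender)
    adv_loss.backward()
    opt_adv.step()

    # 2) Train encoder/decoder to reconstruct the embedding while fooling
    #    the adversary (its loss enters with a negative sign).
    z = encoder(x)
    loss = mse(decoder(z), x) - bce(adversary(z), gender)
    opt_main.zero_grad()
    loss.backward()
    opt_main.step()
    return loss.item(), adv_loss.item()
```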
Counterfactual Representations for Intersectional Fair Ranking in Recruitment
Clara Rus, Maarten de Rijke, Andrew Yates
Published in Recsys@HR, 2023
Fairness interventions require access to sensitive attributes of candidates applying for a job, which might not be available due to limitations imposed by data protection laws. In this work we propose using a pre-processing technique to create counterfactual representations of the candidates that lead to a more diverse ranking with respect to intersectional groups.
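As a loose illustration of what a counterfactual pre-processing step can look like, the sketch below removes group-specific mean shifts from candidate features so each candidate is represented as if drawn from a common reference population. The paper's actual causal method is more involved; all names and shapes here are illustrative assumptions.

```python
# Illustrative sketch of a simple counterfactual-style feature adjustment.
import numpy as np

def neutralize_group_shift(X, groups):
    """X: (n, d) candidate features; groups: length-n array of group labels."""
    X = np.asarray(X, dtype=float)
    groups = np.asarray(groups)
    reference_mean = X.mean(axis=0)
    X_cf = X.copy()
    for g in np.unique(groups):
        mask = groups == g
        # Re-center each group's features on the shared reference mean.
        X_cf[mask] = X[mask] - X[mask].mean(axis=0) + reference_mean
    return X_cf

# Tiny example with two intersectional groups.
X = [[1.0, 5.0], [1.2, 5.5], [0.2, 3.0], [0.4, 3.5]]
groups = ["woman_A", "woman_A", "man_B", "man_B"]
print(neutralize_group_shift(X, groups))
```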
A Study of Pre-processing Fairness Intervention Methods for Ranking People
Clara Rus, Andrew Yates, Maarten de Rijke
Published in ECIR, 2024
Fairness interventions are hard to apply in practice when ranking people due to legal constraints that limit access to sensitive information. Pre-processing fairness interventions, however, can be used to create fairer training data that encourages the model to generate fair predictions without access to sensitive information during inference. On two real-world datasets, the pre-processing methods are found to improve the diversity of rankings with respect to gender, while individual fairness is not affected. Moreover, we discuss the advantages and disadvantages of using pre-processing fairness interventions in practice for ranking people.
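To make the idea of a pre-processing intervention concrete, here is a generic sketch (not one of the specific methods evaluated in the paper) that maps each group's training scores onto the pooled score distribution, so that group membership no longer shifts the training signal a ranker learns from.

```python
# Generic pre-processing sketch: group-wise quantile normalisation of
# training scores. Purely illustrative, not the paper's methods.
import numpy as np

def equalize_training_scores(scores, groups):
    """Map each group's scores onto the pooled score distribution."""
    scores = np.asarray(scores, dtype=float)
    groups = np.asarray(groups)
    pooled = np.sort(scores)
    adjusted = np.empty_like(scores)
    for g in np.unique(groups):
        idx = np.where(groups == g)[0]
        # Rank within the group, then read off the matching pooled quantile.
        ranks = scores[idx].argsort().argsort()
        adjusted[idx] = np.quantile(pooled, (ranks + 0.5) / len(idx))
    return adjusted

# Example: scores systematically lower for group "B" in the training data.
scores = [0.90, 0.80, 0.70, 0.50, 0.40, 0.30]
groups = ["A", "A", "A", "B", "B", "B"]
print(equalize_training_scores(scores, groups))
```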
AnnoRank: A Comprehensive Web-Based Framework for Collecting Annotations and Assessing Rankings
Clara Rus, Gabrielle Poerwawinata, Andrew Yates, Maarten de Rijke
Published in CIKM, 2024
We present AnnoRank, a web-based user interface (UI) framework designed to facilitate the collection of crowdsourced annotations in the context of information retrieval. AnnoRank supports explicit and implicit annotations for a specified query and one or more documents, allowing for the observation of user-selected items and the assignment of relevance judgments. Furthermore, AnnoRank supports ranking comparisons, enabling the visualization and evaluation of ranked lists generated by different fairness interventions, along with their utility and fairness metrics.
FairDiverse: A Comprehensive Toolkit for Fairness- and Diversity-aware Information Retrieval
Chen Xu, Zhirui Deng, Clara Rus, Xiaopeng Ye, Yuanna Liu, Jun Xu, Zhicheng Dou, Ji-Rong Wen, Maarten de Rijke
Published in SIGIR, 2025
We introduce FairDiverse, an open-source standardized toolkit to enable comprehensive, reproducible evaluation of fairness- and diversity-aware algorithms across both search and recommendation tasks in IR.
Fairness in information retrieval from an economic perspective
Chen Xu, Clara Rus, Yuanna Liu, Marleen de Jonge, Jun Xu, Maarten de Rijke
Published in SIGIR, 2025
Recently, fairness-aware information retrieval (IR) systems have been receiving much attention. Numerous fairness metrics and algorithms have been proposed. The complexity of fairness and IR systems makes it challenging to provide a systematic summary of the progress that has been made. This complexity calls for a more structured framework to navigate future fairness-aware IR research directions. The field of economics has long explored fairness, offering a strong theoretical and empirical foundation. Its system-oriented perspective enables the integration of IR fairness into a broader framework that considers societal and intertemporal trade-offs. In this tutorial, we first highlight that IR systems can be understood as a specialized economic market. Then, we re-organize fairness algorithms through three key economic dimensions—macro vs. micro, demand vs. supply, and short-term vs. long-term. We effectively view most fairness categories in IR from an economic perspective. Finally, we illustrate how this economic framework can be applied to various real-world IR applications and we demonstrate its benefits in industrial scenarios. Different from other fairness-aware tutorials, our tutorial not only provides a new and clear perspective to re-frame fairness-aware IR but also inspires the use of economic tools to solve fairness problems in IR. We hope this tutorial provides a fresh, broad perspective on fairness in IR, highlighting open problems and future research directions.
Does fair ranking lead to fair recruitment outcomes? A study of interventions, interfaces, and interactions.
Alessandro Fabris, Clara Rus, Jorge Saldivar, Anna Gatzioura, Asia Biega, Carlos Castillo
Published in IPM Journal, 2025
Personnel recruitment is increasingly mediated by Applicant Tracking Systems (ATS), which rank candidates for job positions, making them a central decision-support tool in modern HR processes. Often framed as an information retrieval (IR) problem, the ranking of candidates in ATS is typically driven by relevance to the job position, with algorithms sorting applicants according to a set of predefined criteria. In recent years, fairness-aware ranking methods have emerged to mitigate the risk of indirect discrimination, where the ordering of candidates may inadvertently favor one demographic group over another. These approaches are inspired by browsing models developed for web search and aim to balance candidate exposure based on protected characteristics. However, ATS in recruitment introduce unique challenges due to their high-stakes nature and the decision-making context in which they operate. In this paper, we present a series of user studies that explore the disconnect between fair exposure and fair outcomes in candidate shortlisting. We focus on how factors such as task design (e.g., how recruiters interact with candidate lists), individual representations of candidates (e.g., national origin cues), and ranking order influence both position bias and demographic balance. Our findings show that while demographic balance may be achieved in terms of ranking visibility, this does not necessarily translate to fair outcomes in terms of who gets shortlisted. Through a crowdsourced experiment and in-depth interviews with recruiters, we identify key task-level, individual, and ranking factors that mediate these effects. We conclude that fairness in ATS rankings is contingent not only on algorithmic design but also on the shortlisting tasks they support, as well as the interfaces, strategies, and assumptions that recruiters use when interacting with candidate lists. Based on these insights, we provide implications for the design of algorithms, interfaces, and recruitment processes that support fairer and more equitable recruitment outcomes.
Joint Modeling of Candidate and Recruiter Preferences for Fair Two-Sided Job Matching
Clara Rus, Masoud Mansoury, Andrew Yates, Maarten de Rijke
Under review, 2025
Judiciously Reducing Sub-group Comparisons for Learning Intersectional Fair Representations
Clara Rus, Andrew Yates, Maarten de Rijke
Under review, 2025
talks
My master's thesis research has been featured in a news article!
Published:
Read more about it here.
Participating in the 21st edition of the Dutch-Belgian Information Retrieval Workshop (DIR)
Published:
Organizing and participating in the European Summer School in Information Retrieval (ESSIR)
Published:
Invited speaker at the Annie Romein-Verschoor lecture.
Published:
Read more about it here.
Giving a presentation on software design principles for fair recruitment systems
Published:
Read more about it here.
Giving a talk at the Artificial Intelligence and Labor Market (AI&LM) Workshop on "The Impact of Ranking Interventions and Task-Level Factors in Recruitment Interfaces for Shortlisting"
Published:
Read more about it here.
Presenting the results of the AMS42 team at NTCIR-18 and winning the best poster presentation award.
Published:
Read more about it here.
Giving a tutorial on fairness from an economic perspective at SIGIR’25
Published:
Read more about it here.
Release of FINDHR toolkits for trustworthy design of recruitment systems.
Published:
Read more about it here.
teaching
Teaching experience 1
Undergraduate course, University 1, Department, 2014
This is a description of a teaching experience. You can use markdown like any other post.
Teaching experience 2
Workshop, University 1, Department, 2015
This is a description of a teaching experience. You can use markdown like any other post.
