At FLAIRS-38, I presented our paper “Modeling and Mitigating Gender Bias in Matching Problems: A Simulation-Based Approach with Quota Constraints”, which explores how systemic gender bias impacts algorithmic matching processes—especially in high-stakes scenarios like hiring.
The talk introduced a simulation framework that allows us to model individual preferences using gender-specific Dirichlet distributions. We then inject a configurable bias in favor of one gender and examine how this affects group-specific and overall matching efficiency. Our framework uses Total Variation Distance (TVD) to systematically control and measure how much male and female preferences diverge.
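To make the setup concrete, here is a minimal sketch of how such a simulation could be wired up in Python. The concentration parameters, group sizes, and bias factor below are illustrative assumptions for exposition, not the values used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def total_variation_distance(p, q):
    """TVD between two discrete distributions over the same set of positions."""
    return 0.5 * np.abs(np.asarray(p) - np.asarray(q)).sum()

# Illustrative Dirichlet concentration parameters over 5 positions.
# Pulling the two vectors further apart increases the divergence (TVD)
# between the groups' expected preference profiles.
alpha_female = np.array([5.0, 3.0, 2.0, 1.0, 1.0])
alpha_male   = np.array([1.0, 1.0, 2.0, 3.0, 5.0])

# Each agent's preference weights over the positions are one Dirichlet draw.
prefs_female = rng.dirichlet(alpha_female, size=1000)
prefs_male   = rng.dirichlet(alpha_male, size=1000)

tvd = total_variation_distance(prefs_female.mean(axis=0),
                               prefs_male.mean(axis=0))
print(f"TVD between the groups' average preferences: {tvd:.3f}")

# Configurable bias: the employer side's scores for one group are inflated
# by a fixed factor before the matching is computed (hypothetical scores).
bias_factor = 1.2
scores_male   = rng.uniform(size=1000) * bias_factor
scores_female = rng.uniform(size=1000)
```

Sweeping the concentration parameters and the bias factor lets us study matching efficiency as a function of both preference divergence and injected bias.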
To counteract bias, we compare traditional fixed quota mechanisms with a novel preference-based quota that dynamically aligns with group interests through aggregated top-choice voting. Our results show that while fixed quotas can be effective when group preferences are similar, they become inefficient under high divergence. In contrast, our proposed mechanism consistently delivers better fairness-efficiency trade-offs—even in challenging scenarios with strong bias and divergent preferences.
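The post does not spell out the aggregation rule in detail, but one plausible reading of "aggregated top-choice voting" is to reserve seats per position in proportion to how often the group ranks that position first. The sketch below illustrates that idea; the function name, seat counts, and preference data are hypothetical.

```python
import numpy as np

def preference_based_quota(prefs, total_seats):
    """Allocate reserved seats across positions in proportion to how many
    group members rank each position as their top choice, using
    largest-remainder rounding so the allocation sums to total_seats."""
    top_choices = prefs.argmax(axis=1)                        # each member's #1 position
    votes = np.bincount(top_choices, minlength=prefs.shape[1])
    shares = votes / votes.sum() * total_seats
    quota = np.floor(shares).astype(int)
    # Hand the remaining seats to the positions with the largest remainders.
    remainder = shares - quota
    for idx in np.argsort(-remainder)[: total_seats - quota.sum()]:
        quota[idx] += 1
    return quota

# Example: reserve 10 seats across 5 positions for the disadvantaged group,
# guided by that group's own sampled preferences rather than a flat quota.
rng = np.random.default_rng(0)
group_prefs = rng.dirichlet([5.0, 3.0, 2.0, 1.0, 1.0], size=1000)
print(preference_based_quota(group_prefs, total_seats=10))
```

Because the reserved seats track where the group's own demand actually is, such a quota wastes fewer slots on positions the group rarely wants, which is where fixed quotas lose efficiency when preferences diverge.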
This work provides practitioners with practical insights into the design of fairer algorithmic matching systems and emphasizes the importance of aligning interventions like quotas with actual group preferences.
Check out our paper, the source code, and the slides of my talk.