28.11.2025

When fairness feels different: gender, algorithms and the future of work

by Miriam Klöpper

3 min read

Algorithms are beginning to play a central role in managerial decision-making in the workplace. Systems used to collect and analyse employee data are often grouped under the term »people analytics«. The analyses produced by such systems are presented to managers in the form of actionable insights. These systems are often discussed as tools that can improve fairness by removing human bias from decision-making. But research has shown that the reality is more complex, particularly when it comes to gender, and that such systems are prone to contain and reinforce the same stereotypes and biases as humans.

Examining perceptions of fairness

In a recent study, conducted together with Uwe Messer, I looked at how employees perceive their managers when the latter use algorithm-based systems to inform their decision-making, or even delegate decisions to such systems. We created situations in which participants were, objectively and measurably, treated unfairly. We then surveyed them on their perceptions of unfairness, on whether they experienced a sense of betrayal, and on whether they would want to take action against the manager concerned.

While it was not the focus of the study, we nevertheless noticed that women were consistently less likely than men to feel unfairly treated or betrayed, even though they had every reason to feel that way. The treatment they received was no less severe, yet their emotional responses were noticeably more muted.

This points to a deeper issue. Previous research has already shown that women are sometimes more comfortable with algorithmic decisions than with decisions made by human managers, particularly male ones. If you already expect to be treated unfairly by a person, a system, however impersonal, might feel like the safer option.

Gender matters in perceiving and articulating unfairness at work

However, our findings suggest there could be more going on than a simple preference for objectivity. It may be that many women have become so used to subtle forms of unequal treatment at work that they no longer even see them as unfair. They may also feel unable to speak up when something does not feel right. Even in a controlled research setting, in which participants knew it was safe to share their reactions honestly, this hesitance remained.

In contrast, men in our study were more likely to voice concerns and to question decisions that seemed unjust. They reported stronger emotional responses and were more comfortable expressing the belief that they had been wronged.

This difference matters. As algorithmic systems such as people analytics become more common, they have the potential to shape the future of work in lasting ways. If women are less likely to challenge decisions made with the help of such algorithms, unfair outcomes could go unnoticed or unchallenged, simply because fewer women are raising concerns.

It is important to remember that algorithms are not neutral. They rely on data from past decisions, and if that data contains biases, as workplace data often does, those biases can be built into the system. If the people most affected by those biases are also the least likely to question them, the system can end up reinforcing inequality rather than reducing it.

What does this mean for fairness and the future of work?

First, organisations need to be aware that gender shapes how people interact with algorithmic systems. This includes not only the outcomes of these systems, but also how comfortable people feel questioning them. Fairness is not just about making the right decision; it is also about ensuring that everyone feels heard when something does not seem right.

Second, companies using people analytics should take steps to ensure that they are not reinforcing bias unintentionally. This means regularly reviewing how decisions are made, involving diverse voices in system design and creating spaces in which employees feel safe to speak up, regardless of their gender.

Finally, we need to keep raising awareness of how subtle inequality can become normalised over time, and indeed has already become so. If women do not recognise unfair treatment because it has become part of their everyday experience at work, then no technology, however advanced, can truly fix the problem.

Algorithms may offer a sense of neutrality, but they are not a solution on their own. Fairness at work depends on the social systems we build: on people being willing and able to hold unfair decisions to account. Ensuring that everyone, regardless of gender, feels empowered to do so is key to making progress.

Miriam Klöpper is a postdoctoral research fellow at the Norwegian University of Science and Technology (NTNU) in Trondheim, Norway, where she works at the Faculty of Information Technology and Electrical Engineering. Previously, she was a doctoral candidate in the Information Systems Department at the University of Münster, Germany, and she holds a master’s degree in History of War from King’s College London. Her research focuses on the social and ethical implications of using algorithm-driven systems in personnel management, especially on the impact of such systems on social relations, hierarchies, and power structures in traditional organisations. She also works on workplace surveillance and equality in the workplace. Since June 2025, Miriam has served as the board leader of the ACM Women’s Chapter Trondheim.

Technology, Employment and Wellbeing is an FES blog that offers original insights on the ways new technologies impact the world of work. The blog brings together different views from tech practitioners, academic researchers, trade union representatives and policy makers.

FES Future of Work

Cours Saint Michel 30e
1040 Brussels
Belgium

+32 2 329 30 32

futureofwork(at)fes.de
