Potentials and Risks of Algorithmic Management: The Gender Pay Gap

The term “algorithmic management” was first introduced to describe the management practices of Uber and Lyft. [1] It refers to the use of algorithms and data-driven decision-making processes to manage employees and their work. [1] Platforms increasingly rely on software algorithms to “allocate, optimise, and evaluate work” across their vast workforces. [1] Algorithms assign work to drivers, optimise pricing for services, and evaluate driver performance. [1] While this approach has benefits, such as increased efficiency and objectivity, it can also lead to unintended consequences, such as perpetuating the gender pay gap. [2] The gender pay gap itself is more than just discrimination in pay: it reflects a broad range of inequalities women face in access to work and in career progression. [2] Algorithms learn from the data they are trained on. [3] But what if this data is biased against women and ethnic minorities in the first place? Does it risk further exacerbating existing biases in the workplace?

The gender pay gap at work

In a workplace context, when being considered for male-dominated jobs (i.e. jobs traditionally believed to be for men), female candidates are evaluated more negatively and recommended for employment less often than male candidates. [4] In audit studies, for example, female applicants are less likely than male applicants to be interviewed or called back. In one study, a student’s application for a laboratory manager position was evaluated under either a male or a female name. The male applicant was not only rated as significantly more competent and hireable, but was also offered a higher starting salary and more career mentoring than the otherwise identical female applicant. [4]

Amazon provides another example of gender bias in algorithmic management in a workplace context. Some 60 percent of the company’s global workforce is male, and 74 percent of its managerial positions are held by men. [5] The data used to build its recruiting algorithm consisted of resumes submitted over the previous ten years, which came predominantly from white men. The algorithm was taught to recognise word patterns in the resumes, rather than relevant skills, to determine whether an applicant would fit. As a result, the software penalised any resume that contained the word “women’s” and downgraded the resumes of women who had attended women’s colleges. [5]
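To make this mechanism concrete, here is a minimal, purely illustrative sketch in Python (using scikit-learn). It is not Amazon’s actual system; the resumes, hiring outcomes, and tool choices are invented assumptions. It shows how a classifier trained only on word patterns from historically biased hiring decisions can end up assigning a negative weight to a token like “women’s”.

# Illustrative toy example, not Amazon's real tool: a text classifier trained
# on biased historical hiring outcomes can learn to penalise the word
# "women's" even though gender is never an explicit input feature.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical historical resumes and past hiring decisions (1 = hired).
# In this invented history, resumes mentioning "women's" were rarely hired.
resumes = [
    "captain of chess club, python programming",
    "captain of women's chess club, python programming",
    "software engineering internship, java development",
    "women's college graduate, java development internship",
    "machine learning projects, hackathon winner",
    "women's coding society lead, machine learning projects",
]
hired = [1, 0, 1, 0, 1, 0]

# The model only ever sees word patterns, not skills or gender directly.
vectorizer = CountVectorizer()          # tokenises "women's" to the token "women"
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weight for that token: it comes out negative, meaning
# the word alone lowers the predicted "hireability" of a resume.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print("learned weight for 'women':", round(weights["women"], 3))

The point is not the specific library but the pattern: when the training data encodes past discrimination, the model reproduces it under the guise of neutral word statistics.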

Does Artificial Intelligence foster the gender pay gap?

The main advantage of Artificial Intelligence and algorithms is supposed to be their objectivity. [6] However, software picks up on the existing biases of our society. [5] If algorithms are trained on historical data that reflects biases against women and ethnic minorities, they may perpetuate these biases in their decision-making. This can result in women being passed over for promotions or being paid less than their male counterparts for similar work, as the Amazon example shows. Algorithmic management therefore has the potential to reproduce, if not worsen, the gender pay gap.

A recent study further showed that a recommendation submitted by someone with a female name, like “Mary,” received approximately 25% fewer clicks overall than a recommendation submitted by someone with a typically male name, such as “Matthew.” [7] In a workplace context this means, for example, that a job-matching algorithm may produce different results for two resumes that differ only in the names “Mary” and “Matthew”. [5]
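One simple way to probe for exactly this effect is a “name-swap” test: score two otherwise identical applications that differ only in the first name and compare the results. The sketch below is a hedged illustration; score_resume is a hypothetical stand-in for whatever job-matching model is being audited, and its internals are invented solely to show how such a gap would surface.

# Illustrative "name-swap" check: score two otherwise identical applications
# that differ only in the first name. A non-zero gap suggests the name, a
# proxy for gender, is influencing the model's output.
# `score_resume` is a hypothetical placeholder, not a real job-matching API.

def score_resume(text: str) -> float:
    # Toy stand-in model: counts skill keywords, but has also (wrongly)
    # absorbed a penalty tied to a female name from biased feedback data.
    text = text.lower()
    score = sum(keyword in text for keyword in ("python", "sql", "management"))
    if "mary" in text:                      # learned, unjustified penalty
        score -= 1
    return float(score)

template = "{name}. 5 years of experience: Python, SQL, project management."
gap = score_resume(template.format(name="Matthew")) - score_resume(template.format(name="Mary"))
print("score gap (Matthew minus Mary):", gap)  # anything other than 0 warrants scrutiny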

One consequence of this bias that we can already observe is that women often use nicknames to hide their gender, since they receive lower ratings on average. There are signs that customer ratings, which often affect pay levels, can discriminate on racial and gender grounds, favouring men over women. The Gender Equality Index reports that workers’ race and gender affect the social feedback they receive, although the impact differs from platform to platform. A survey carried out in the United States showed that one third of female platform workers adopted a gender-neutral username in order to remain anonymous. [8]

The Artificial Intelligence Act: a useful policy option?

In order to address these issues, companies should ensure that their algorithms are designed with fairness and equity in mind, not just for women but also for ethnic minorities. This may involve auditing the algorithms to identify potential biases and making adjustments to ensure that they are fair and unbiased. Additionally, companies should prioritise diversity and inclusion efforts to ensure that women are well represented at all levels of the organisation and have equal opportunities for career advancement and pay. [5] In this regard, the impact of the European Commission’s proposed regulation on Artificial Intelligence remains to be seen; it classifies the use of AI systems in employment contexts as a ‘high risk’ to employees’ fundamental rights. [9] Under such rules, algorithmic management should undergo compliance checks and be strictly transparent towards employees, and any implementation that violates labour laws or discriminates against vulnerable groups should be prohibited.

One way to detect potential biases is to compare outcomes across different groups. [5] For example, if a job-matching algorithm’s average score for male applicants is higher than that for female applicants, there should be further investigation (a minimal sketch of such a check follows below). Yet it is still difficult to define and measure fairness. It will not always be possible to satisfy all notions of fairness at the same time, and companies and other operators of algorithms must be aware that there is no simple fairness metric a software engineer can apply, especially when designing algorithms and determining the appropriate trade-offs between accuracy and fairness. Thus, algorithmic decisions that may have serious consequences for people will require human involvement. [5]
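As a minimal sketch of the outcome comparison described above, the snippet below computes the average score a hypothetical job-matching algorithm assigns to male and female applicants and flags the gap for human review. The scores and the threshold are invented assumptions; a real audit would use proper fairness metrics and statistical testing.

# Minimal outcome-comparison audit: compare average algorithm scores across
# groups and flag large disparities for further investigation.
from statistics import mean

# Hypothetical scores produced by a job-matching algorithm, grouped by gender.
scores = {
    "male":   [0.82, 0.74, 0.91, 0.68, 0.77],
    "female": [0.64, 0.71, 0.59, 0.73, 0.66],
}

averages = {group: mean(values) for group, values in scores.items()}
gap = averages["male"] - averages["female"]

print(f"average score, male applicants:   {averages['male']:.2f}")
print(f"average score, female applicants: {averages['female']:.2f}")

# Arbitrary illustrative threshold: any sizeable gap should trigger a human review.
if abs(gap) > 0.05:
    print(f"score gap of {gap:.2f} exceeds the threshold -> investigate further")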

Conclusions: regulating to fill the gap?

Overall, algorithmic management has great potential to increase efficiency in the workplace objectively and without bias; however, it also carries the risk of widening the gender pay gap even further. It is therefore important to raise awareness around this topic and to keep equity and fairness in mind. Important legislative developments are foreseen in the European context, and we shall see their impact on the gender pay gap in the coming years. [9] New technologies have the power and potential to change our lives for the better; their development, however, ought to be guided by principles and measures that prevent the exacerbation of discriminatory practices. It is also something to reflect on as a society, since algorithms always capture a snapshot of societal values. In deciding to create and bring algorithms to the market, the ethics of likely outcomes must be considered, especially in areas with potential for harm and a risk of perpetuating existing biases or making certain groups more vulnerable to existing societal inequalities. [5] That is why everyone should keep asking themselves: will we leave some groups of people worse off as a result of the algorithm’s design or its unintended consequences?

Bibliography

  • [1] Lee, M. K., Kusbit, D., Metsky, E., & Dabbish, L., Working with Machines: The Impact of Algorithmic and Data-Driven Management on Human Workers, Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 2015: 1603-1612. https://doi.org/10.1145/2702123.2702548
  • [2] Gender bias in data: Towards gender equality in digital welfare, Digital Future Society, 2022. https://digitalfuturesociety.com/algorithmic-gender-discrimination-where-does-it-come-from-what-is-the-impact-and-how-can-we-tackle-it/
  • [3] How do algorithms work, University of York, 2022.
  • [4] Stamarski, C. S. and Son Hing, L. S., Gender Inequalities in the workplace: the effects of organisational structures, processes, practices, and decision makers’ sexism, Front. Psychol., Volume 6, Department of Psychology, University of Guelph, ON, Canada, 2015.
  • [5] Turner Lee, N., Resnick, P., and Barton, G., Algorithmic bias detection and mitigation: Best practices and policies to reduce consumer harms, Brookings Institution, 2019.
  • [6] Smith, G. and Rustagi, I., When Good Algorithms Go Sexist: Why and How to Advance AI Gender Equity, Stanford Social Innovation Review, 2021.
  • [7] Botelho, T. L. and Abraham, M, Research: Objective Performance Metrics are not enough to overcome gender bias, Harvard Business Review, 2017.
  • [8] Gender Equality Index 2020: Digitalisation and the future of work: https://eige.europa.eu/publications-resources/toolkits-guides/gender-equality-index-2020-report/gender-pay-gap-ict-and-platform-work
  • [9] The Artificial Intelligence Act, https://artificialintelligenceact.eu
