by Tiago Vieira, Researcher at the European University Institute
Over the last few years, the development and deployment of algorithmic management have drawn significant attention from workers and unions, academics and policymakers. The ideal of algorithmic management emerged wrapped in the promise that Big Data and Artificial Intelligence can automate many managerial tasks, if not entirely, then at least to a considerable extent.
However, as Hilke Schellmann’s recent book, The Algorithm, shows in detail, the promised blessing of a new age of efficient, accurate and neutral managerial decision-making is rapidly showing signs of becoming a curse for workers subject to biased, privacy-wrecking algorithmic tools. One recent, telling example comes from the deployment of so-called AI emotion-detection software to monitor and assess workers’ performance at several fast-food chains in the USA. Alongside a vast scholarship from different disciplines, Schellmann’s book eloquently shows why this might be a terrible idea, starting with the fact that this technology is grounded in pseudo-scientific assumptions and has no scientific validity.
Perhaps more slowly than necessary, European Union-level policymakers have been developing regulatory instruments that can offer workers and unions tools to tackle some of these perils (the GDPR, the AI Act, the Platform Work Directive). While these developments may be broadly seen as positive, several voices have stressed that there is, nonetheless, no room to rest on this front. Drawing on my research, I consider this a call that should be amplified, and one that should cover not only the flashier algorithmic, quasi-automated decision-making tools but also those dedicated to “merely” tracking workers, even if they produce nothing more than descriptive statistics for managers to interpret.
To better understand this, one must take a step back from the AI hype. As Ulrich Leicht-Deobald and colleagues thoroughly describe, algorithms may be predictive and prescriptive – that is, they may point in some direction or, as we have seen in platform work, directly enforce that direction and, for instance, penalize or even terminate a worker without necessarily running such decisions through human supervision. Alarming as this is, algorithms can also be “merely” descriptive – that is, they accumulate and sort data to inform managerial decisions.
Notably, while quantifying workers’ performance has been around for centuries, the novelty of our era is how detailed this quantification can be. Where the recent past offered only rough numbers, current technologies combine fine-grained metrics about every step of the execution of a given task with pervasive surveillance techniques, such as geolocation tracking of workers’ movements. Two aspects of this development deserve special attention.
First, under the still-to-be-proved imperative of increasing companies’ productivity, we observe the gradual demise of workers’ privacy. This is a negative development in itself, since privacy is a crucial right of every person. More than that, in the context of work, it offers new opportunities for employers to spy on workers, track associational patterns, and surveil workers in general and union activists and leaders in particular. In my own research in the food-delivery sector, I found workers feeling the need to ask permission to move away from their motorbikes to go to the toilet, afraid that, in the eyes of a manager sitting in an office on the other side of town, they would appear to be slacking during work time. Still in this regard, while union busting and discriminating against unionized workers – or any workers engaging with colleagues – is clearly illegal, it still happens and constitutes an obvious source of fear among workers. Creating more opportunities for it to happen through the refinement of surveillance tools will therefore hardly make life easier for workers and their representatives.
Second, it is well documented how surveillance curtails workers’ autonomy and how that, in turn, is often associated with occupational health and safety problems as workers rush to fulfil their tasks. What algorithmic quantification and the various forms of tracking bring that is new is workers’ internalization of this imperative. Again, in my own research, this time in the grocery-delivery sector, I have witnessed how some managers have been able to take a step back from direct control over workers, acting instead as “algorithmic coaches” – that is, mainly focusing on helping workers satisfy the imperative of the algorithmic metrics (for a similar conclusion, see Alex Wood’s report for the Joint Research Centre). What stood out to me was how workers internalized this imperative and, despite the exhaustion and even accidents they experienced, would still happily comply with the visibly unrealistic goals imposed by their employer.
Under current regulations, both these examples can hardly be tackled from a legal perspective – if nothing else, because they take place at a very micro level, where evidence is challenging to produce. Overall, the two most significant risks are: first, that the increase in workers’ surveillance brought by descriptive algorithmic tools goes under the radar of legislators more focused on predictive and prescriptive applications of algorithms; and second, that these approaches are normalized and seen as inevitable, and therefore remain unchallenged by workers, unions and authorities.
In conclusion, the debate on introducing new technologies in workplaces needs to interrogate not only the flashiest technologies but also the apparently less sophisticated ones. In this vein, the discussion must cover what, beyond maximizing control, is the actual added value of each new technology and, where there is any true added value, the trade-offs that derive from it. At the end of the day, is productivity more important than workers’ well-being and respect for their fundamental rights?
Tiago Vieira is a PhD candidate at the European University Institute where he researches the impact of algorithmic management on workplace power dynamics.
Technology, Employment and Wellbeing is a new FES blog that offers original insights into the ways new technologies impact the world of work. The blog focuses on bringing together different views from tech practitioners, academic researchers, trade union representatives and policymakers.