by Dr Funda Ustek Spilda, Dr Lola Brittain, Dr Callum Cant and Dr Mark Graham, University of Oxford
8 min read
It is impossible to ignore the apparent magic of AI. Centuries-old texts are translated into the language of your choosing, you can read menus in foreign languages just by hovering your camera over them, and you can write a 1,000-word essay on obscure plants without any knowledge of the subject. The opportunities seem endless. However, as more and more people think, write and work with AI, they are also questioning how the magic happens. The answer has less to do with sorcery, and more to do with human labour.
Few idioms translate across cultures and languages as well as 'looking for a needle in a haystack'. The idea that some things are impossible to find, and that the effort spent finding them will be in vain, is an easy sentiment to relate to. For AI to work the way it should, however, a vast number of people need to continuously look for needles in exceedingly messy haystacks. And given the way digital infrastructures are set up, their work ends up being invisible, undervalued and, as such, poorly compensated.
In their 2019 book, anthropologist Mary L. Gray and computer scientist Siddharth Suri coined the term 'ghost work' to describe the invisiblised, task-based labour facilitating the development of AI. The people performing such work, also described as 'data workers', carry out a myriad of tasks to ensure that the data used to train machine learning algorithms is fit for purpose. Such work can entail finding images online, annotating and labelling them, translating texts, or moderating content so that the internet is, and remains, a safe space – as much as possible.
In recent years, thanks to the work of Gray and Suri, as well as other scholars and critical journalists like Phil Jones and Billy Perrigo, the existence of data workers, and the critical role they play within the supply chains of artificial intelligence production, has gained increasing attention. The curtain has been pulled back, and we, as a society, have been told about the difficult conditions that such workers face as they make these supposedly awe-inspiring AI systems work. Yet much remains unknown about these workers, including the size of this emerging workforce; what we do know is that globally recognised standards of work are far from a reality for the vast majority. That is why Fairwork, in partnership with the Global Partnership on Artificial Intelligence, has established a set of Fairwork AI principles. Developed via a global multi-stakeholder consultation in 2022 and adapted in response to the on-the-ground concerns of workers observed in fieldwork in 2023, the principles cohere around the core Fairwork structure – pay, conditions, contracts, management and representation – to establish basic minimum standards of fairness within the supply chains of AI. It is true that there are already several standards in place for AI, and AI ethics is a topical discussion. However, our intention at Fairwork has been to establish standards that are operationalisable, and that answer to the needs and wants of the very people who work in the AI supply chain.
What exactly does this mean in practice? The answer is not overly technical: despite the undeniable shifts and changes to work brought about by AI, workers still need a sufficient wage, safe conditions, adequate protection and decent contracts. They still deserve management that is fair and non-discriminatory, an assurance of their freedom to associate, and the chance to have a say in their conditions.
As alluded to earlier, this is not yet a reality. Indeed, behind the magic of AI are conditions that are far from dazzling. In 2023, the Fairwork AI project team conducted our first case study exploring working conditions in the supply chains of AI. The company in question was Sama, a well-known impact sourcing company that specialises in data annotation for computer vision. Impact sourcing evolved in the late 2000s as corporate social responsibility initiatives began expanding into business process outsourcing; the practice of hiring marginalised individuals was promoted as a way of helping to lift them out of poverty and boost economic development overall, especially within the data annotation and verification sectors.
The Fairwork team was invited by Sama management to conduct a fieldwork study in their offices in Kenya and Uganda. The initial findings were worrying. A company committed to the double bottom line model (combining a financial profit strategy with social responsibility), which held B Corp status and had a reputation as an ethical operator in impact sourcing, initially appeared to meet none of the Fairwork AI principles. Interviews with workers revealed the effects of insecure contracts, poor management practices, gender-based discrimination, unpaid working time and long working hours on their well-being.
When Fairwork shared these findings with Sama management, the response was commendable. Sama’s senior management took the research findings very seriously and committed to addressing the problems identified. In the consultation period between the sharing of findings and the launch of the Fairwork report, Sama continued to engage with the Fairwork team, sharing evidence, information on the changes being implemented, and detailed explanations of how those changes were coming into effect within the company. This resulted in a significant improvement in fairness at Sama over a very short period, and Sama was eventually awarded 5/10 in the Fairwork AI report published in December 2023.
The company has committed to 24 changes in total to address the issues identified in Fairwork’s research. These range from updating the overtime policy and adapting incentive structures so that workers are not rewarded for overworking, to committing not to pay any employee a base salary below living wage levels; from updating the employment contract to feature more explicit information on working hours, breaks, the maximum hours allowed for overtime and benefits, to updating the Workplace Illness and Injury Policy to include specific coverage of work-related psychological disorders; from introducing a new policy that changes default contracts from 1 month to 12 months (or, where there are no client termination or ramp-down rights, to the end of the client contract – except where there is good reason to believe the contract will not run its full course), to launching a “Zero Tolerance Culture” campaign to train staff on anti-bullying and harassment, alongside specialised out-of-house training on gender-based violence for key HR, legal and management personnel; and, finally, to creating specific employee rights training covering freedom of association, and investigating representation structures to increase employee involvement in decision-making across the organisation and at all levels.
With these changes, working conditions for Sama workers in the Kenya and Uganda offices have undeniably improved, but there is still a lot to be done. Ultimately, Fairwork’s initial findings gave important insights into the working conditions of data workers. Indeed, the workers we interviewed over the course of our fieldwork commented that even though conditions at Sama were poor, they were not the poorest compared to other data companies. The workers drew a rather unmagical picture of AI production chains, where financial instability coupled with work strain and poor management practices were all too common.
Magic is often used as a metaphor for complex technological processes and systems, which is why it has been such a powerful metaphor in the marketing rhetoric of AI. We are told of AI’s amazing, unending capabilities; of its power to both save and ruin the world; and of God-like qualities just around the corner. It is a powerful metaphor that is easy to get swept up in. But a metaphor is all it is. AI is not untethered, immaterial magic. It is structurally reliant on a vast number of people performing a myriad of tasks in not-so-magical working conditions.
Everyone likes to believe in magic. But where AI is concerned, awe should be reserved for the workers performing the tasks behind the curtain. It is only because of them that the systems can do what they do. The least they deserve is basic minimum standards at work.
At Fairwork, we will continue our investigation into AI supply chains in the new year with new studies, shifting our attention to business process outsourcing companies in Latin America with further support from the Global Partnership on AI. There is nothing inevitable about poor working conditions in the digital economy. Despite their claims to the contrary, companies have substantial control over the nature of the jobs that they provide. Fairwork’s aim is to hold them to account.
The Fairwork project is coordinated by the Oxford Internet Institute and the WZB Berlin Social Science Center.
Technology, Employment and Wellbeing is a new FES blog that offers original insights on the ways new technologies impact the world of work. The blog focuses on bringing together different views from tech practitioners, academic researchers, trade union representatives and policy makers.