Content moderation and working conditions

Commercial content moderators are human workers responsible for screening user-generated content for technology platforms, including social media and artificial intelligence companies.

Industrial composition


The content moderation industry is estimated to be worth US$9 billion. While no official figures are published, there were an estimated 10,000 content moderators for TikTok, 15,000 for Facebook, and 1,500 for Twitter as of 2022.[1]

The global value chain of content moderation typically includes social media platforms, large multinational enterprise (MNE) firms, and content moderation suppliers. The social media platforms (e.g. Facebook, Google) are largely based in the United States, Europe, and China. The MNEs (e.g. Accenture, Foiwe) are usually headquartered in the Global North or India, while the suppliers of content moderation are largely located in Global South countries such as India and the Philippines.[2]: 79–81 

Working conditions


Moderators' performance and quality assurance are measured by quantifying the number of tasks completed, for example labeling content as a copyright violation, deleting a post containing hate speech, or reviewing graphic content.[1]

Non-disclosure agreements


Non-disclosure agreements are the norm when content moderators are hired. This makes moderators more hesitant to speak out about their working conditions or to organize.[1]

Psychological


In February 2019, an investigative report by The Verge described poor working conditions at Cognizant's office in Phoenix, Arizona.[3] Cognizant employees tasked with content moderation for Facebook developed mental health issues, including post-traumatic stress disorder, as a result of exposure to graphic violence, hate speech, and conspiracy theories in the videos they were instructed to evaluate.[3][4] Moderators at the Phoenix office reported drug abuse, alcohol abuse, and sexual intercourse in the workplace, and feared retaliation from terminated workers who threatened to harm them.[3][5] In response, a Cognizant representative stated the company would examine the issues in the report.[3]

The Verge published a follow-up investigation of Cognizant's Tampa, Florida, office in June 2019.[6][7] Employees in the Tampa location described working conditions that were worse than the conditions in the Phoenix office.[6][8] Content moderator Keith Utley suffered a heart attack while working for Cognizant in March 2018 and died in a hospital; the Tampa office lacked an on-site defibrillator.[6][9]

Moderators were required to sign non-disclosure agreements with Cognizant to obtain the job, although three former workers broke the agreements to provide information to The Verge.[6][10] In the Tampa office, workers reported bed bugs, unsanitary work conditions, inadequate mental health resources, sexual harassment, workplace violence, and theft.[6][11] As a result of exposure to videos depicting graphic violence, animal abuse, and child sexual abuse, some employees developed psychological trauma and post-traumatic stress disorder.[6][12] Cognizant sanitized the office before The Verge's visit, a practice the publication described as a "dog-and-pony-show phenomenon".[6] In response to negative coverage related to its content moderation contracts, a Facebook director indicated that the company was in the process of developing a "global resiliency team" that would assist its contractors.[6]

Unionization


On 1 May 2023, 150 content moderators who contracted for Meta, ByteDance, and OpenAI gathered in Nairobi, Kenya, to launch the first African Content Moderators Union. The union was launched four years after Daniel Motaung was fired and retaliated against for organizing a union at Sama, a contractor for Facebook.[13]

References

  1. ^ a b c Wamai, Jacqueline Wambui; Kalume, Maureen Chadi; Gachuki, Monicah; Mukami, Agnes (2023). "A new social contract for the social media platforms: prioritizing rights and working conditions for content creators and moderators". International Journal of Labour Research. 12 (1–2). International Labour Organization. Retrieved 21 July 2024.
  2. ^ Ahmad, Sana; Krzywdzinski, Martin (2022), Graham, Mark; Ferrari, Fabian (eds.), "Moderating in Obscurity: How Indian Content Moderators Work in Global Content Moderation Value Chains", Digital Work in the Planetary Market, MIT Press, pp. 77–95, ISBN 978-0-262-36982-4, retrieved 22 July 2024
  3. ^ a b c d Newton, Casey (25 February 2019). "The secret lives of Facebook moderators in America". The Verge. Archived from the original on 21 February 2021. Retrieved 20 June 2019.
  4. ^ Feiner, Lauren (25 February 2019). "Facebook content reviewers are coping with PTSD symptoms by having sex and doing drugs at work, report says". CNBC. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  5. ^ Silverstein, Jason (25 February 2019). "Facebook vows to improve content reviewing after moderators say they suffered PTSD". CBS News. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  6. ^ a b c d e f g h Newton, Casey (19 June 2019). "Three Facebook moderators break their NDAs to expose a company in crisis". The Verge. Archived from the original on 12 September 2022. Retrieved 20 June 2019.
  7. ^ Bridge, Mark (20 June 2019). "Facebook worker who died of heart attack was under 'unworldly' pressure". The Times. ISSN 0140-0460. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  8. ^ Carbone, Christopher (20 June 2019). "Facebook moderator dies after viewing horrific videos, others share disturbing incidents: report". Fox News. Archived from the original on 21 June 2019. Retrieved 21 June 2019.
  9. ^ Eadicicco, Lisa (19 June 2019). "A Facebook content moderator died after suffering heart attack on the job". San Antonio Express-News. Archived from the original on 30 November 2019. Retrieved 20 June 2019.
  10. ^ Feiner, Lauren (19 June 2019). "Facebook content moderators break NDAs to expose shocking working conditions involving gruesome videos and feces smeared on walls". CNBC. Archived from the original on 19 June 2019. Retrieved 20 June 2019.
  11. ^ Johnson, O’Ryan (19 June 2019). "Cognizant Getting $200M From Facebook To Moderate Violent Content Amid Allegations Of 'Filthy' Work Conditions: Report". CRN. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  12. ^ Bufkin, Ellie (19 June 2019). "Report reveals desperate working conditions of Facebook moderators — including death". Washington Examiner. Archived from the original on 20 June 2019. Retrieved 20 June 2019.
  13. ^ Perrigo, Billy (1 May 2023). "150 AI Workers Vote to Unionize at Nairobi Meeting". Time. Retrieved 21 July 2024.