In remote villages across India, a growing workforce of women spends hours each day viewing violent and explicit content. Their task is to label this data to train artificial intelligence systems for global technology companies, but many report severe psychological distress with little to no support.
This digital labor, often marketed as a safe, home-based opportunity, subjects workers to a relentless stream of graphic material, leading to trauma, anxiety, and emotional numbing. It is a hidden, human-powered assembly line at the core of the AI revolution.
Key Takeaways
- Women in rural India are being hired as content moderators to train AI by classifying violent and pornographic material.
- Workers report significant psychological harm, including traumatic stress, anxiety, and sleep disturbances.
- Companies often use vague job descriptions and provide minimal mental health support for this demanding work.
- This labor is part of a growing data annotation market in India, valued at around $250 million in 2021, with most revenue coming from the U.S.
A Day in the Life of an AI Data Trainer
From her family home in Jharkhand state, Monsumi Murmu, 26, logs onto her laptop. Her daily work involves watching and categorizing hundreds of videos and images that automated systems have flagged as potentially harmful. On an average day, she may review up to 800 pieces of content.
The material ranges from graphic violence to sexual abuse. "The first few months, I couldn’t sleep," she explained. "I would close my eyes and still see the screen loading." The images, she says, followed her into her dreams, depicting fatal accidents and sexual violence.
Over time, the constant exposure has had a numbing effect. "In the end, you don’t feel disturbed – you feel blank," Murmu said. This emotional detachment is a common coping mechanism, but she acknowledges the lasting impact. "That’s when you know the job has done something to you."
The Scale of the Work
An estimated 70,000 people in India were working in data annotation as of 2021. Approximately 80% of these workers are from rural, semi-rural, or marginalized backgrounds, with women making up at least half of the workforce.
The Invisible Workforce Powering Global Tech
This type of work is a critical component of machine learning. AI models are only as effective as the data they are trained on, creating a massive demand for human workers to label and classify content that algorithms cannot yet understand on their own.
Tech companies and third-party contractors are increasingly turning to India's smaller cities and rural areas to find this labor. The strategy allows them to tap into a large pool of first-generation graduates seeking employment while keeping labor and operational costs low. Improved internet connectivity has made it possible to integrate these remote locations directly into global AI supply chains.
Why Rural Women?
Companies often view women as a reliable and detail-oriented workforce. Offering home-based contract work is seen as providing a "safe" and "respectable" income source that does not require migration to large cities, making it an attractive proposition for many families in conservative communities.
However, this arrangement can also reinforce a worker's vulnerable position. Priyam Vadaliya, a researcher focusing on AI and data labor, notes that the perceived respectability of the job can create an "expectation of gratitude," which may discourage women from speaking out about the psychological harm they experience.
The Psychological Price of Moderation
Researchers in the field describe content moderation as inherently dangerous work. Milagros Miceli, a sociologist investigating the roles of data workers, says she has yet to see evidence of moderators who escape psychological harm.
"In terms of risk, content moderation belongs in the category of dangerous work, comparable to any lethal industry," she says.
A 2023 study that included moderators in India identified traumatic stress as the most significant psychological risk. Workers commonly report intrusive thoughts, anxiety, sleep disturbances, and behavioral changes like heightened vigilance, even when some workplace support is available.
From Vague Promises to Graphic Realities
Raina Singh, 24, took a data annotation job in her hometown of Bareilly, Uttar Pradesh, for a monthly income of about £330. Initially, her tasks were benign, involving screening text messages for spam. "I felt like I was working behind the AI," she said. "I was seeing what makes it work."
Six months later, her assignment changed abruptly. She was moved to a project for an adult entertainment platform and tasked with identifying and flagging content involving child sexual abuse. When she expressed her shock, her manager's response was simple: "This is God’s work – you’re keeping children safe."
Soon after, her role shifted again to categorizing pornography. "I can’t even count how much porn I was exposed to," she recalled. "It was constant, hour after hour." The experience deeply affected her personal life, causing her to withdraw from intimacy and feel disconnected from her partner. When she complained, she was told her contract for "data annotation" covered this work. She eventually quit the job.
A System Without a Safety Net
Many workers enter these roles with little understanding of what they will entail. Job listings are often ambiguous, promoted online as "easy money" or "zero-investment" opportunities. According to Vadaliya, workers often realize the true nature of the job only after signing contracts.
Furthermore, strict non-disclosure agreements (NDAs) prevent workers from discussing their jobs with family or friends, deepening their isolation. Violating an NDA can result in termination or legal action.
Of eight data annotation and moderation companies in India contacted for comment, only two stated they provided psychological support. Most argued the work was not demanding enough to warrant mental healthcare. Even when support is offered, the burden is typically on the individual to seek it out, a significant barrier for workers from marginalized backgrounds who may lack the language to articulate their trauma.
India's labor laws currently do not offer specific legal recognition for psychological harm from this type of work, leaving employees with few protections.
For Monsumi Murmu, the fear of unemployment outweighs the daily distress. With her contract ending soon, she avoids raising concerns about her mental health. To cope, she finds solace in nature. "I go for long walks into the forest," she said. "I sit under the open sky and try to notice the quiet around me."