

Internet's hidden warriors: Thousands of young Indians are working to remove sexually and racially abusive content

BY SANGHAMITRA KAR & ARITRA SARKHEL, ETTECH | MAY 30, 2017, 06.19 AM IST

BENGALURU: Here's India's hidden IT army — they clean up the internet, and their jobs sometimes take a heavy personal toll.
Take Sharanya. Her new job went from dream to nightmare in just 90 days. When she joined a Hyderabad-based firm as a content moderator in 2012, her job description was simple: sift through the content posted by users on clients' portals and social media channels, and weed out the trash. What she was not prepared for was an avalanche of disturbing visuals, many featuring child abuse.

Soon, Sharanya, who requested that her second name not be disclosed to protect her identity, lost sleep and appetite. She was haunted by those images and quit her job. 

But there are many satisfied soldiers of the army that cleans up online filth. 

Chandan Kumar Nayak, a content moderator for five years, works as a team leader at Bengaluru-based Foiwe Info Global Solutions, which offers content moderation services for a variety of clients.

"I thoroughly enjoy my job," says Nayak. "I never get overwhelmed by the sensitive content. I treat it as an opportunity to help clean the internet and make it a better place for others." 

Sharanya and Nayak are part of the small but booming content moderation business in India. From metropolises such as Delhi, Bengaluru, Hyderabad and Chennai to smaller cities such as Jaipur, thousands of men and women, aged between 18 and 28 years, sit in front of computers looking at graphic content — brutal murders, rapes, beheadings, nude pictures, abusive posts and racist videos — posted on websites and social media channels.

They are paid between Rs 1.5 lakh and Rs 5.6 lakh per annum. 

"We are here to clean the dirt from the Internet. We help build better brands for our clients," says Aravind Rao, co-founder of Hyderabad-based Infoesearch, who has been in the content moderation business for the last five years.

Some moderators, like Sharanya, find it difficult to cope. Debarati Halder, honorary MD of Tirunelveli-based Centre for Cyber Victim Counselling, cites the example of two college students, both in their late teens, who had taken up content moderation jobs. "In the first few months, they saw more than 1 lakh videos of graphic content including rape and molestation," says Halder. "The boys became averse to sexual relationships," she adds. 

"Even though many moderators might enjoy their work, it's bound to take a toll on their mental health," says Dr Shyam Bhat, psychiatrist and founder of Seraniti, a Bengaluru-based mental health service provider. 

Rao says his firm Infoesearch regularly conducts sessions about the nature of the work. "We do not allow women who recently joined to moderate extreme violent content. We sensitise our employees slowly," says Rao. 

Similarly, Suman Howladar, founder of Foiwe Info Global Solutions, says that employees are not pushed to screen violent content from the word go but are put through a two-week training process, sensitising them about the scope of the work. "We give gruesome content to only certain people in the organisation and in case they have issues, our HR helps put them in less sensitive projects," says Howladar.

AMBIGUOUS STANDARDS 
A challenge that content moderators grapple with on a regular basis is ambiguous standards that vary from country to country. "The concept of racism and nudity varies from client to client and even between countries. What may sound abusive to Indians may not be true for Europeans," says Rao. For instance, clients from India are skittish about any form of nudity, while a dating platform in the US is okay with partial nudity, where a user covers his or her private parts with hands. "Clients from the Middle East and India would not accept a lady in hot pants, whereas in the US and Europe, many clients are okay with a certain form of skin show," says Howladar.


Graphic content moderation is just one part of the job. Ecommerce platforms employ content moderation companies to flag and remove "abusive or racist products" posted for sale on their websites. 

"We do not allow campaigns like T-shirt quotes or pictures which promote hatred against sexual orientation, joining hate groups, frontal nude pictures, flags of terrorist organisations, false claims," says Apurv Agrawal, founder, SquadRun, which has operations in San Francisco and Delhi. "At the same time, a picture promoting legalisation of marijuana is okay as it promotes a point of view."

But even product moderation has its sensitivities. "National flags imprinted on products and apparel are accepted outside and part of the day-to-day lifestyle… doormats of the US flag, or shirts of United Kingdom flags. However, it's strictly prohibited on Indian marketplaces."


Interestingly, Agrawal and his team combine artificial intelligence and a workforce of around 75,000 on a single enterprise SaaS platform to drive moderation for leading commerce companies like Sephora, Flipkart and OfferUp.

"We even have an app wherein users from across the world can log in and do content moderation. It's like the Uber model where we don't hire but rent out work," he adds. 

But maybe the clean-up army needs to get bigger. Mishi Choudhary, legal director of the New York-based Software Freedom Law Center, believes the sheer number of photographs and videos is overwhelming, and that companies need to allocate more resources to content moderation and build AI that understands their communities' issues better.
