TikTok Moderators Are Being Trained Using Graphic Images Of Child Sexual Abuse

Nasser expected to be confronted with some disturbing material during his training to become a content moderator for TikTok. But he was shocked when he and others in his class were shown uncensored, sexually explicit images of children. Nasser, who was working for a third-party company, Teleperformance, that moderates content for the social media giant, had been assigned to a special project: teaching TikTok’s AI to spot the worst of the worst posted in the app.

Just a few days into onboarding, he and his colleagues were shown graphic images and videos of children involved in sexual acts—all material that had been removed from TikTok. “I have a daughter, and I don’t think it’s right—just a bunch of strangers watching this,” Nasser, who left Teleperformance in 2020, told Forbes. “I don’t think they should use something like that for training.” (His last name, and some others in this story, have been omitted for privacy reasons.)

Whitney Turner, who worked for Teleperformance’s TikTok program in El Paso for over a year and departed in 2021, also recalled being shown sexually exploitative imagery of kids as part of her training. Whitney was given access to a shared spreadsheet that she and other former employees told Forbes is filled with material determined to be violative of TikTok’s community guidelines, including hundreds of images of children who were naked or being abused.

Former moderators said the document, called the “DRR,” short for Daily Required Reading, was widely accessible to employees at Teleperformance and TikTok as recently as this summer. While some moderators working in unrelated functions were restricted from viewing this material, sources told Forbes that hundreds of people across both companies had free access to the document. The DRR and other training materials were stored in Lark, internal workplace software developed by TikTok’s China-based parent company, ByteDance.

Whitney was so aghast at the casual handling of the material that she reported it to the FBI, and subsequently met with an agent in June. The FBI did not respond to multiple requests for comment on whether it will investigate the matter. “I was moderating and thinking: This is someone’s son. This is someone’s daughter. And these parents don’t know that we have this picture, this video, this trauma, this crime saved,” Whitney told Forbes. “If parents knew that, I’m pretty sure they would burn TikTok down.”

Teleperformance’s Global President of Trust & Safety Akash Pugalia told Forbes the company does not use videos featuring explicit content of child abuse in training, and said it does not store such material in its “calibration tools,” but would not clarify what those tools are or what they do. He declined to answer a detailed list of other questions regarding how many people have access to child sexual abuse material through the DRR and how Teleperformance safeguards this imagery. TikTok is hardly alone in its struggle to purge child sexual abuse material.

The most powerful social media platforms on the planet have long used machine learning and third-party human reviewers to catch and remove such content before it’s widely shared, and many companies work with the National Center for Missing & Exploited Children, or NCMEC, to alert law enforcement of such problematic imagery in their apps. What is unique, however, is the way TikTok and its outside consultants are handling this material—an approach experts say is ham-handed and cavalier at best, and harmful and re-traumatizing at worst. They say showing sexual images of kids in content moderation training, censored or not, only re-victimizes them. And storing the images in a widely accessible document is reckless and unnecessary.

TikTok spokesperson Jamie Favazza said that the company’s “training materials have strict access controls and do not include visual examples of CSAM,” but conceded that it works with third-party firms that may have their own processes. TikTok declined to answer questions about how many employees have access to the DRR and where the document is hosted.

TikTok’s parent company ByteDance, purveyor of the Lark platform, did not respond to repeated requests for comment.

Images of child abuse and exploitation are illegal, and there are strict rules for handling them when discovered. The material must immediately be taken down and reported to NCMEC’s CyberTipline, where staff then analyze the files and work to track down where they came from, so they can alert the appropriate law enforcement agency.

Once a company has reported this material to NCMEC, it’s statutorily granted immunity to retain it for 90 days to aid authorities. But federal law explicitly requires companies to “minimize the number of employees that are provided access” to that content, “maintain the materials in a secure location,” and “ensure that any such visual depiction is permanently destroyed, upon a request from a law enforcement agency.” Top legal and online safety experts said making abusive and exploitative content featuring minors widely available to Teleperformance and TikTok workers with lax safeguards runs counter to that mandate and could cross the line from a safety or privacy problem to a crime.

“You have to have incredibly strict parameters for how it’s being kept, how it’s being viewed, how it’s being shared—it is simply not something that you can just save on a computer or in a training file,” said NCMEC’s general counsel, Yiota Souras. Asked what could make such handling of child sexual abuse material a crime, Souras said it would depend on how the company is preserving and protecting it; who has access to it; and how it’s being distributed, downloaded or otherwise replicated (in a document, for instance). “How locked down is it?” Souras said. “Free and loose is not going to work here.”

TikTok’s Favazza said, “We aim to minimize moderators’ exposure in line with industry best practices.” Asked whether TikTok had provided Teleperformance with guidelines for handling child abuse material, Teleperformance did not respond.

Pugalia, the company’s Trust & Safety leader, said only that Teleperformance follows applicable laws wherever it operates. David Vladeck, faculty director of Georgetown Law’s Center on Privacy and Technology, added that while it’s important for companies to train their employees and contractors to recognize posts that are out of bounds, they should not be using content scraped from the platform to do it. “It’s not hard to just come up with teaching tools that help staff delineate between what’s okay and what isn’t,” said Vladeck, a former director of the Federal Trade Commission’s Bureau of Consumer Protection. “But to use real-life examples and expose those examples to generations of new employees—that just strikes me as being irresponsible.”

Teleperformance is a global customer service titan that has been around since the 1970s. The company saw record growth last year, raking in north of $8 billion in revenue.

It boasts some 1,000 clients spanning nearly every sector, from healthcare to hospitality to retail to telecom—only recently adding social media to its portfolio. Teleperformance launched its “Trust & Safety” arm in 2019, and its El Paso, Texas-based moderation program with TikTok was live by the end of that year, according to former moderators who were brought on at the time. (Neither company would say when the contract began.)

The partnership is a feather in the cap of a decades-old business eager to show it’s keeping up with the times. The company’s most recent annual report touted its partnership with “the fastest growing video platform in the world,” and said many of its employees now fashion themselves “Guardians of the Internet.” Teleperformance told Forbes it has a comprehensive recruitment process for content moderators and that it’s committed to their wellbeing.

But conversations with former employees reveal the extent to which moderators were both under-trained and overworked. As TikTok grew rapidly and the volume of content on the platform surged, the company began punting more and more moderation tasks to Teleperformance, according to Richard, a former moderator and supervisor in El Paso who left the company in July after nearly three years there. He described a chaotic training department rife with communication problems and high turnover that often left young or inexperienced employees leading the work.

TikTok “has hardly any control of the training taking place now,” Richard said. And Teleperformance is “at this point where they can’t manage it accordingly or correctly … They’re just overwhelmed.” Angel, who worked at Teleperformance in El Paso from February 2020 to December 2021, first as a content moderator and then as a manager, echoed those sentiments, noting Teleperformance’s moderation arm simply couldn’t handle TikTok’s explosive growth.

“They were pretty much spread thin, that’s for sure, at least in the training department,” he said. As Teleperformance ramped up its work with TikTok, moderators were often trained and led by supervisors and other higher-ups who themselves did not have the proper clearances, according to Angel. As trainers scrambled to bring on more moderators, “they were pressured by management to go ahead and get the people through” even with inadequate preparation, he explained.

Whitney added that when she asked her bosses “why we couldn’t just be transparent with TikTok to let them know that we needed more time and training, I was repeatedly told it was because the contract was signed [and] we were expected to live up to the contract—or they would simply replace us with people that could.”

The so-called “DRR” is one of several files that TikTok moderators have used as a resource on the job. (Neither TikTok nor Teleperformance would confirm whether it’s still live today, but three former moderators who left this past May, June and July said the DRR was in use when they departed.)

Angel and other former staffers described the DRR as a spreadsheet filled with examples of violative content that had been removed from TikTok. Moderators said they were instructed to refer back to it for guidance on how to “tag” which policies that material violated. The spreadsheet had various tabs for different kinds of content—one related to terrorist organizations, another on nudity, for example.

Child sexual abuse material was also given its own designated tab, Angel said. Whitney estimated that pictures, videos and descriptions of child sexual abuse made up about a quarter of the spreadsheet, which contained thousands of examples of violative material. She believed that the DRR dated back to when TikTok first brought on Teleperformance.

Marjorie Dillow, who until May worked as a moderator for another Teleperformance hub in Boise, Idaho, left with a similar impression. “They don’t really delete anything off of that DRR,” she said. Provided with all of this information, TikTok told Forbes it did not have enough details to confirm the existence of the DRR or answer a list of questions about it.

Teleperformance also declined to answer a list of queries about the document. Neither would say how many people and companies have access to it—given it lives on ByteDance’s Lark software, and ByteDance has had access to U.S. TikTok data in the past—or what company controls access to the material. They also would not say how far back the examples of child sexual abuse material on the DRR go, and what if any process there might be for purging it 90 days after reporting it, as law enforcement requires.

While there is not yet a common set of industry best practices for handling this material, the leader of the Tech Coalition, a group fighting child sexual abuse online that counts TikTok as a member, said there is a framework.

“There are certain laws that are sort of bright lines—like you have to maintain it for 90 days, it has to be embargoed, once you become aware of it, you must report it to NCMEC, it’s generally a felony to possess this stuff,” said Sean Litton, the coalition’s executive director. But “each company has set up their own way to process it.”

Former federal prosecutor Mary Graw Leary, an expert on child pornography and exploitation who previously helped lead NCMEC’s legal work, said child sexual abuse material should be treated as “contraband,” like guns or drugs. “If you came across drugs in the workplace, or drugs at school, and you’re a vice principal, you wouldn’t put it in your desk and whip it out at the next faculty meeting to say, ‘This is what cocaine looks like,’” she said. “Similarly, because the Supreme Court and everyone else has recognized the subjects of this content are re-victimized every time someone looks at it, it shouldn’t be bandied about in a light manner.”

Asked how many people have access to the DRR spreadsheet, Angel estimated hundreds of people at the El Paso office alone, potentially anyone working with TikTok at Teleperformance’s other U.S. locations, like Boise, as well as TikTok employees themselves. Angel also noted that despite the El Paso site being “a no paper kind of office” where moderators caught with a phone “would automatically be fired,” remote work during the pandemic made those privacy and security measures harder to enforce.

Seventy percent of Teleperformance’s 420,000 employees worked from home in 2021, according to the company’s most recent annual report. Nasser said remote employees could access this sensitive spreadsheet from home. “If somebody wanted to, they could pull up their phone and just start recording whatever they’re seeing,” he said. “[They could] send it to their friends or their buddies or whoever they wanted to. I don’t think they would have any way of tracking any of this down in Teleperformance.”

Former moderator AnaSofia Lerma told Forbes that despite being shown a sexual video of a child during training, she never encountered such material while on the job.

She wondered why she’d been required to view it in the first place. “I don’t think it was necessary at all,” said Lerma, who worked at Teleperformance in 2020. “If someone else uploads it, they should just take it down and just erase it from existence. … I did wonder, ‘Why are they saving these videos?’” “Honestly,” she added, “I had to take a few minutes off.”

Whitney from the El Paso office said she jumped at the opportunity to earn $18.50 an hour at Teleperformance—an upgrade from a past hourly wage of $16 working for a company that renovates Walmarts.

A recruiter told Whitney she’d be like “a police officer for TikTok,” and the prospect of protecting people excited her. But after spending just over a year at Teleperformance, “I have been struggling to just function properly,” Whitney said. She frequently loses her train of thought, becomes filled with rage at random and has had suicidal thoughts.

Nasser, a veteran with combat-related PTSD, has struggled, too, having found some of his experiences at Teleperformance to be more challenging than his time with the U.S. Army.

As for the graphic videos of children he was forced to watch to land the job, he said, “Just talking about it, mentioning it alone, would be sufficient.” “I know what we saw, and it was pretty fucked up,” he told Forbes. “We shouldn’t have had to see any of that stuff.”


From: forbes
URL: https://www.forbes.com/sites/alexandralevine/2022/08/04/tiktok-is-storing-uncensored-images-of-child-sexual-abuse-and-using-them-to-train-moderators/
