How TikTok Has Bent Its Rules For Its Top Creators

TikTok has employed a two-tiered moderation system that gives preferential treatment to influencers, celebrities and other VIPs, according to leaked audio recordings of internal TikTok meetings from fall 2021. This more lenient policy enforcement system has used dedicated queues to prioritize and protect the posts of people with more than 5 million followers when they break TikTok’s content rules. Within TikTok’s internal systems, these high-profile accounts have been designated with what are called “creator labels.”

In a meeting from late September 2021, a member of TikTok’s Trust & Safety team explained that the labels were reserved for “special users” whose content was to be moderated separately, and differently, than content from second-class users, according to a recording of the meeting. “We don’t want to treat these users as, um, like any other accounts. There’s a bit more leniency, I’d say,” said one employee on the Trust & Safety team.

Another attendee of that meeting, a consultant from Booz Allen Hamilton, said this corroborated what he had heard earlier from another TikTok employee. That employee, he said, “was pretty blunt that, like, a famous person could post content and I could post content, and if they were both inappropriate, the famous person’s would be able to stay up.”

TikTok, like most other social platforms, is governed by a set of community guidelines, which police everything from child sexual abuse material to election misinformation to dangerous viral challenges.

The company’s website says these rules “apply to everyone and everything on TikTok.” When asked about the existence of a separate system for “creator” content moderation at TikTok, company spokesperson Jamie Favazza said, “TikTok is not more lenient in moderating accounts with more than 5 million followers.” She also said, “we do not have moderation queues based on following size.” Favazza did not immediately answer a follow-up question about whether the company had previously employed such a system.

Social media companies bending their rules for the rich and powerful is not new. Companies like Twitter, YouTube and even Spotify have struggled with rule-breaking by A-list celebrities, media companies and world leaders.

But the company best known for favoring powerful posters is TikTok’s primary rival, Meta. The parent company of Facebook and Instagram developed an elaborate internal system to shield certain accounts from having their content moderated by the company’s usual systems. Called XCheck, the system was used to protect celebrities, journalists and politicians.

An internal memo first reported by the Wall Street Journal criticized the practice, saying “[i]t creates distrust, fuels the narrative that Facebook is ‘biased’ and undermines our goal of building legitimacy with stakeholders and community.” At the time, Facebook confirmed the existence of the system, describing it as a quality assurance system for “content that could require more understanding.” TikTok’s system of favoritism appears to be considerably less elaborate than Facebook’s XCheck system.

It also appears blunter: While eligibility for Facebook’s XCheck is determined in part by whether an account is “newsworthy” or “PR risky,” at TikTok, the cutoff appears to have been purely numerical. In a recording of an October 2021 meeting, a crisis management operations manager on the TikTok Trust & Safety team explained that creator labels are reserved for accounts that have more than 5 million followers. “You can totally understand why a social media company would want to have special review of really high-profile accounts,” said Evelyn Douek, a professor at Stanford Law School and Senior Research Fellow at the Knight First Amendment Institute at Columbia.

“There’s nothing inherently malicious or wrong with doing that, it’s kind of basically just a due diligence or quality assurance method.” But, Douek cautioned, such a system can be easily abused, especially when it is run by people with a financial incentive to leave content up, or take it down. Although they might look similar, “a system meant to ensure that rules are applied consistently is very different from a system meant to ensure that rules are applied inconsistently, say, when the bottom line is at stake,” she said.

Accounts that meet the 5 million-follower threshold have been the subject of content policy issues in the past. Daniel Aronov, an Australian doctor who amassed 13 million followers on the platform, came under fire for posting videos during his surgeries and was eventually ordered by an Australian regulatory board to delete his TikTok account. Issey Moloney, a 17-year-old who makes posts to her 5.9 million followers about mental health conditions, has faced criticism for encouraging self-diagnosis.

In response to questions about these accounts, spokesperson Favazza said, “While we can confirm violative content has been removed from accounts you’ve flagged, we prioritize people’s privacy and therefore do not disclose private information about accounts.” According to the October 2021 recording, employees have also been sensitive to which third-party contractors oversee “creator” accounts that receive special treatment.

The crisis management operations manager asked that content from “creator” accounts not be moderated by a team of contractors in Kuala Lumpur, because those moderators were “very by-the-book,” and “if it needs to go outside of that, they won’t.” “If Addison Rae shows a little bit of a thong, they might mark that, whereas we might be a little more lenient,” he said, referencing one of TikTok’s biggest stars. Addison Rae Easterling, 21, has more than 88 million followers on the platform and was the third-highest paid TikTok star in 2022, according to a Forbes estimate.

(Easterling’s team did not respond to a request for comment.) The crisis management operations manager recommended that when moderation issues arise, the third-party moderators should notify the accounts’ internal partnership managers — the TikTok employees tasked with helping popular creators navigate the platform — but punt final decisions back to him. “I could definitely go in there and just ban or unban — whatever their request is,” he said.



From: forbes
URL: https://www.forbes.com/sites/emilybaker-white/2022/09/20/tiktok-special-treatment-top-creators-bent-rules/
