
How A British Teen’s Death Changed Social Media


Ian Russell’s mind was still reeling when he sat at the family computer back in 2017. His 14-year-old daughter, Molly, had just died from an act of self-harm and Russell was looking for answers about how this could have happened. Scrolling through Molly’s email inbox, he believed he had found them.

Two weeks before her death, the British teenager received an email from Pinterest. “Depression Pins you may like,” it read. The email included an image of a bloody razor.

Instagram was also helping Molly discover new depression content: In the six months before her death, she shared, liked, or saved more than 2,000 posts related to suicide, self-harm, and depression on the site. Last week, Andrew Walker, the senior coroner for north London, concluded that it would not be safe to rule Molly’s death a suicide, and found that posts on Instagram and Pinterest contributed to her death. “She died from an act of self-harm while suffering from depression and the negative effects of online content,” Walker said.

Molly is far from the only child to be exposed to disturbing content online. Almost two-thirds of British children aged 3-15 use social media, and one-third of online children aged 8-15 have seen worrying or upsetting content online in the past 12 months, according to a 2022 report by British media regulator Ofcom. Child protection campaigners say posts showing self-harm are still available, even if they are now harder to find than in 2017.

But Molly’s case is believed to be the first time social media companies have been required to take part in legal proceedings that linked their services to the death of a child. The platforms were found to have hosted content that glamorized self-harm and promoted keeping feelings about depression secret, says Merry Varney, solicitor at Leigh Day, the law firm representing the Russell family. Those findings “captured all the elements of why this material is so harmful,” she adds.

The inquest sought only to establish the official reason Molly died. But unofficially, the two-week hearing put Instagram and Pinterest on trial. Both companies say they’ve changed in the five years since Molly’s death.

But those changes have sent them in different directions, demonstrating two distinct models for how to run a social media platform. Meta, Instagram’s parent company, says it wants to be a place where young people struggling with depression can seek support or cry out for help. Pinterest has started to say some subjects simply don’t belong on its platform.

According to Pinterest, self-harm is one of those subjects. “If a user searches for content related to suicide or self-harm, no results are served, and instead they are shown an advisory that directs them to experts who can help if they are struggling,” says Jud Hoffman, global head of community operations at Pinterest. “There are currently more than 25,000 self-harm-related search terms on the blocked list.”

Varney agrees the platform has improved but says it’s not perfect. “Research that we did with Molly’s family suggested that there is much less of this content on Pinterest [now],” she says. Instagram also hides search terms—but only if the term or phrase itself is promoting or encouraging self-harm, says Tara Hopkins, head of EMEA public policy at Instagram.

“For other search terms related to suicide/self-harm that aren’t inherently violating, we show a message of support before showing any results.” The company declined to share how many search terms were blocked.
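
Neither company has published its implementation, but the two approaches they describe suggest a simple two-tier search gate: a blocked tier that serves no results, only a help advisory, and a softer tier that shows a support message before results. The Python sketch below is illustrative only; the term lists, messages, and function names are all invented:

```python
# Hypothetical sketch of a two-tier search-term safety gate, modeled on
# the behavior Pinterest and Instagram describe above. All terms, messages,
# and names are invented; neither company has published its implementation.

BLOCKED_TERMS = {"example-blocked-term"}      # Pinterest-style: serve no results at all
SENSITIVE_TERMS = {"example-sensitive-term"}  # Instagram-style: support message, then results

HELP_ADVISORY = "If you are struggling, support is available: <helpline details>"

def run_search(term: str) -> list:
    # Stand-in for the platform's real search backend.
    return [f"result for {term}"]

def handle_search(query: str) -> dict:
    normalized = query.strip().lower()
    if normalized in BLOCKED_TERMS:
        # Blocked tier: no results, only an advisory pointing to expert help.
        return {"results": [], "advisory": HELP_ADVISORY}
    if normalized in SENSITIVE_TERMS:
        # Sensitive tier: show a message of support before any results.
        return {"results": run_search(normalized), "advisory": HELP_ADVISORY}
    # Ordinary query: results only, no advisory.
    return {"results": run_search(normalized), "advisory": None}
```

By Pinterest’s account, its equivalent of the blocked list holds more than 25,000 self-harm-related entries.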

Meta says it is juggling concerns about child safety with young people’s free expression. The company admitted that two posts seen by Molly and shown to the court would have violated Instagram’s policies at the time. But Elizabeth Lagone, head of health and well-being policy at Meta, told last week’s inquest that it is “important to give people that voice” if they are struggling with suicidal thoughts. When the Russell family lawyer, Oliver Sanders, asked Lagone if she agreed that the content viewed by Molly and seen by the court was “not safe,” Lagone responded: “I think it is safe for people to be able to express themselves.”

These comments embody what researchers say are major differences between the two platforms. “Pinterest is much more concerned with being decisive, being clear, and de-platforming content that does not meet their standards,” says Samuel Woolley, program director for the propaganda research lab at the University of Texas, Austin. “Instagram and Facebook … tend to be much more concerned with running up against free speech.”

Pinterest has not always operated like this. Hoffman told the inquest that Pinterest’s guidance used to be “when in doubt, lean toward … lighter content moderation.” But Molly’s death in 2017 coincided with the fallout from the 2016 US presidential election, when Pinterest was implicated in spreading Russian propaganda.

Around that time, Pinterest started to ban entire topics that didn’t fit with the platform’s mission, such as vaccines or conspiracy theories. That stands in sharp contrast to Instagram. “Meta platforms, including Instagram, are guided by the dictum of wanting to exist as infrastructural information tools [like] the telephone or [telecoms company] AT&T, rather than as social media companies,” says Woolley.

Facebook should not be “the arbiter of truth,” Meta founder and CEO Mark Zuckerberg argued in 2020. The inquest also highlighted differences between how transparent the two platforms were willing to be. “Pinterest helpfully provided material about Molly’s activities on Pinterest in one go, including not just pins that Molly had saved but also pins that she [clicked] on and scrolled over,” says Varney.

Meta never gave the court that level of detail, and much of the information the company did share was redacted, she adds. For example, the company disclosed that in the six months before her death, Molly was recommended 30 accounts with names that referred to sad or depressing themes. Yet the actual names of those accounts were redacted, with the platform citing the privacy of its users.

Varney agrees both platforms have made improvements since 2017. Pinterest results for self-harm search terms do not contain the same level of graphic material as they did five years ago, she says. But Instagram’s changes have been too little, too late, she claims, adding that Meta did not prohibit graphic images of self-harm and suicide until 2019.

Other organizations have been tracking those changes too. Since 2017, Pinterest has turned off personalized recommendation emails and push notifications for underage users, while Instagram has stopped adults from sending direct messages to children who don’t follow them, says Izzy Wick, director of UK policy at 5Rights, a British group campaigning for a digital environment that protects young people. Children are also no longer recommended as friends to adult strangers, she adds. “A lot of these changes are not announced with much fanfare because people often think, well, why weren’t you doing this before?”

Instagram and Pinterest face different pressures. As a bigger platform, Instagram comes under more political pressure in the US to preserve free speech. Pinterest’s 433 million users are also easier to moderate than Instagram’s more than 2 billion, who are more likely to post about current events.

But the Molly Russell case was a rare example that put the two platforms in stark contrast, revealing their different models of content moderation. Yet the inquest also shows how difficult it is to compare the two when transparency is voluntary. “We don’t have the data, necessarily, to be able to compare platforms and the safety measures that they have in place,” says Hilary Watson, policy and campaign manager at Glitch, a UK charity campaigning against online abuse.

Pinterest and Instagram have tweaked the way their platforms work since Molly’s death, but child safety campaigners hope that looming UK regulation will bring more radical change. The inquest has renewed pressure on the new British government to revive the long-awaited Online Safety Bill, which culture minister Michelle Donelan promised to bring back to Parliament before Christmas this year. “We need the bill to be brought forward as quickly as possible now,” says Hannah Rüschen, senior policy and public affairs officer at the British child protection charity NSPCC.

Young people say that self-harm content is harder to find now than in 2017 but that this content is still available, says Rüschen, adding that people can make minute changes in their search terms to reveal very different results. She says mental health is consistently the main reason young people contact NSPCC’s counseling service Childline. Last year, the service received 24,200 calls about suicidal thoughts or feelings.
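
Rüschen’s point about minute search-term changes highlights a known weakness of exact-match blocklists like the hypothetical one sketched above: a single-character variation can slip past the filter entirely. A toy example, again with an invented term:

```python
# Toy illustration of blocklist evasion; the blocked term is invented.
BLOCKED_TERMS = {"example-blocked-term"}

def is_blocked(query: str) -> bool:
    # Exact lookup after basic normalization, as in the sketch above.
    return query.strip().lower() in BLOCKED_TERMS

print(is_blocked("Example-Blocked-Term"))   # True: normalization catches case changes
print(is_blocked("examp1e-blocked-term"))   # False: a one-character swap evades the list
```

Platforms typically counter this with fuzzy matching and continual additions to the list, which is one reason blocked-term lists keep growing.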

“Given the sheer scale of the impact of the online world on children’s lives, we really need to make sure that the online safety bill brings in tight controls and understandings on what is legal but harmful when it comes to this kind of content,” Rüschen says. Other supporters of the bill believe the new rules will compel companies to be more proactive in cases like Molly’s, forcing them to carry out risk assessments. “A risk assessment is helpful because it means that companies have to look at what’s happening on their platforms, what might happen on their platforms, and have a plan for what to do about it,” says Watson.

“They will [have to] draw a line in the sand about what they will and won’t tolerate.”


From: wired
URL: https://www.wired.com/story/how-a-british-teens-death-changed-social-media/
