
Tech Companies Are Reconsidering an Old Enemy


As the midterm election season kicks into high gear, platforms across the web will begin rolling out enhanced protections to guard against digital threats to the democratic process. While every platform has different policies and approaches—from warnings and educational reminders at the top of news feeds to limitations on replies and reposts—a common strategy lies at the heart of many of the features being rolled out across the web: they’re all prompting users to slow down a bit. These efforts are reversing a long-held course, and they reflect a wider reconsideration of what was once the industry’s enemy number one: friction.

Yasmin Green is the CEO of Jigsaw, a unit within Google that addresses threats to open societies. She leads an interdisciplinary team that researches and develops technical solutions to a range of global security challenges, including violent extremism, repressive censorship, hate and harassment, and harmful misinformation.

In the technology industry, we consider “friction” to be anything that stands between an individual and their goals.

And completely eliminating it was once a common goal. Teams worked for years to shave milliseconds off page load times and system responses, and companies invested millions in developing and testing designs and user flows, all to ensure that every interaction would be as fast and effortless as possible. The emphasis on speed and ease of use makes sense: technology has always promised to help us complete complex tasks faster and more easily.

But as our tools have become more refined, and the information environment more complex, the speed at which information can reach us at times outpaces the rate at which we can fully process it. This point was driven home for me by the results of a study conducted by scholars from MIT several years ago, published in Nature last year. In a survey of American adults, individuals claimed that it was far more important to them that what they shared online was accurate than that it was surprising, funny, aligned with their political views, or even just interesting.

What’s more, respondents were extremely good at identifying accurate and inaccurate headlines, even when those headlines ran counter to their political beliefs. Despite this, when presented with a mix of truthful and misleading headlines and asked which they’d consider sharing online, participants’ choices were virtually unaffected by whether a headline was accurate. A simple design change, however, can substantially alter people’s likelihood of sharing information they believe to be false.

Serving individuals “accuracy prompts,” which ask them to evaluate the accuracy of an unrelated headline before they share, can shift their attention from a knee-jerk reaction to their underlying values, including their own commitments to accuracy. A meta-analysis of 20 experiments that primed individuals to think about accuracy found that these types of interventions can reduce sharing of misleading information by 10 percent. Subsequent research produced by our team at Jigsaw, in partnership with academics from MIT, Macquarie University, and the Universities of Regina and Nottingham, further found that these prompts are effective across 16 countries and all six inhabited continents.
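To make the mechanism concrete, here is a minimal sketch of how an accuracy prompt might be wired in front of a share action. The headline pool, prompt wording, and pipeline are all hypothetical illustrations; the studies above describe the intervention’s effect, not any particular implementation.

```python
import random

# Hypothetical pool of unrelated, neutral headlines shown in the prompt.
NEUTRAL_HEADLINES = [
    "New bridge opens to traffic after two-year construction project",
    "Local library extends weekend opening hours",
]

def accuracy_prompt() -> None:
    """Ask the user to rate an unrelated headline before they share.

    The answer is deliberately discarded: the prompt's only job is to
    shift the user's attention toward accuracy before posting.
    """
    headline = random.choice(NEUTRAL_HEADLINES)
    input(f'To the best of your knowledge, is this headline accurate? '
          f'"{headline}" (y/n): ')

def share(post: str) -> None:
    accuracy_prompt()         # friction: one extra moment of reflection
    print(f"Shared: {post}")  # stand-in for the real publish step

if __name__ == "__main__":
    share("Wild claim I saw online!")
```

Note that nothing here blocks the share: the intervention works by redirecting attention, not by restricting the action.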

Prompts can also encourage individuals to engage more deeply with information in other ways. A feature rolled out by Twitter that prompted users to read an article before retweeting it, if they hadn’t previously visited the site, led to a 40 percent increase in individuals clicking through to the piece before sharing it with their networks. Once you start looking, you’ll notice these small instances of friction everywhere, and there’s strong evidence they work.

In 2020, Twitter began experimenting with a feature that prompted individuals replying to others with rude or abusive language to reconsider their tweet before posting it. According to Twitter, 34 percent of those who received these prompts either edited their original reply or decided not to reply at all. What’s more, users who received the prompt were 11 percent less likely to post harsh replies again in the future.

While these numbers may not seem earth-shattering, with over 500 million tweets sent every day, they add up to a substantially healthier online environment. A similar experiment run by the social engagement platform OpenWeb, leveraging Jigsaw’s Perspective API, found that 34 percent of commenters across a range of sites including AOL, Salon, and Newsweek revised their posts when prompted about their harsh language. Of those who amended their comments, 54 percent changed them enough to be immediately approved.

Users in the experimental group who received these prompts were also more likely to return to the site and engage in the future, and the overall community experience received higher marks. When thoughtfully designed and deployed, these types of digital speed bumps do not attempt to tell individuals what to think or restrict their actions. Instead, they slow the internet down to a more human pace, allowing individuals to take a moment to consider the information they encounter before taking action.
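For a sense of what powers such a nudge, below is a minimal sketch of the kind of toxicity check a platform could run against Jigsaw’s publicly documented Perspective API before deciding to show a “reconsider your reply” prompt. The 0.8 threshold and the surrounding gating logic are illustrative assumptions, not OpenWeb’s or Twitter’s actual implementation.

```python
import requests

# Public endpoint of Jigsaw's Perspective API (requires an API key).
PERSPECTIVE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
)

def should_prompt(comment_text: str, api_key: str,
                  threshold: float = 0.8) -> bool:
    """Return True if the comment scores toxic enough that the platform
    should show a 'reconsider your reply' nudge.

    The 0.8 threshold is an illustrative assumption; real deployments
    tune this value per community.
    """
    payload = {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
        "doNotStore": True,  # ask the API not to retain the comment
    }
    resp = requests.post(PERSPECTIVE_URL, params={"key": api_key},
                         json=payload)
    resp.raise_for_status()
    score = resp.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
    return score >= threshold
```

A production system would typically tune the threshold per community and pair the score with human review, rather than acting on the model’s output alone.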

And while some instances of friction can be bothersome, the benefits are easy to grasp. One need only think of some of the more banal instances of friction that litter our daily digital interactions, from browser warnings on malicious sites that you must click through, to the “undo send” feature now common across email clients, which holds emails for just a few seconds before sending in case you need to fix a typo or add the attachment you forgot—again. Online harms have diverse causes, and comprehensive approaches are needed to address them all, but the promising, real-world results of adding friction should encourage us to look for opportunities to apply it in a wider range of scenarios.

Platforms in the future might look to warn individuals before they share sensitive or private information on public channels, or serve as a circuit breaker to halt online dogpiles, prompting users to consider whether they want to contribute to mass harassment. Digital technologies have allowed us to do more, and to do it more quickly, than at any previous period. And therein lies their power.

But every now and again—particularly at our most critical junctures, like the weeks ahead—we all need to slow down.

WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions here, and see our submission guidelines here.

Submit an op-ed at opinion@wired.com.


From: wired
URL: https://www.wired.com/story/friction-social-media-moderation/
