A complex area of disinformation intervention

If you’re someone who’s “chronically online,” chances are you’ve come across content that is fake or deceptive. This is known as “misinformation,” and according to the nonprofit KFF, during the COVID-19 pandemic 78 percent of U.S. adults believed, or were unsure about, at least one false statement about the pandemic or the vaccines.

The online disinformation ecosystem is complicated, which makes improving the situation challenging. Social, psychological, and technological influences all drive how fake news and other misleading claims spread through the internet.

Here’s one approach: adding source credibility labels, a method used by social media companies and third-party organizations to combat misinformation, although it has been controversial. For example, an MIT study found that placing false tags on some articles led users to give more credibility to unlabeled articles (some of which may not have been fact-checked at all). The scientists who conducted the study called this the “implied truth effect.”

But new research in Science Advances found that, on average, these labels have limited effectiveness in changing where people get their information or in reducing their misperceptions. However, among a small subgroup of individuals who consume a lot of low-quality news, the labels do seem to nudge them toward adding more high-quality news to the mix.

Researchers from NYU and Princeton University set out to study whether a browser extension that indicates the reliability of news sources would affect browsing patterns, deter people from visiting low-quality sites, and change their views on issues such as media trust, political cynicism, and common misperceptions.

[Related: How older adults can learn to effectively spot fake news]

To conduct the study, they recruited about 3,000 YouGov participants representative of the U.S. population across demographic categories such as age, gender, and race, and randomly assigned half of the group to receive the NewsGuard extension. The extension displays a green, red, gold, or gray shield icon next to the URLs of the websites participants visit in their browsers, as well as in Google search results and in their Facebook and Twitter feeds. (The NewsGuard browser extension is available to the general public.)

Websites get a green shield when they generally maintain “basic standards of accuracy.” Green sites include Reuters, the AP, and even Fox News. Websites that users should “read with caution” get a red shield because they fall short in areas such as accuracy and distinguishing news from opinion; these include The Epoch Times and Daily Kos. Satirical sites, like The Onion, get a gold shield. Websites built largely on unvetted user posts, such as YouTube, Reddit, and Wikipedia, get a gray shield.
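The labeling scheme described above boils down to a lookup from a site’s rating category to a shield color. Here is a minimal sketch in Python; the category names and the default behavior are assumptions for illustration, not NewsGuard’s actual data or API:

```python
# Hypothetical sketch of the shield-color scheme described in the article.
# Category names are illustrative assumptions, not NewsGuard's internal labels.
SHIELD_COLORS = {
    "meets_basic_standards": "green",   # maintains "basic standards of accuracy"
    "read_with_caution": "red",         # falls short on accuracy / news-vs-opinion
    "satire": "gold",                   # e.g., The Onion
    "user_generated_platform": "gray",  # e.g., YouTube, Reddit, Wikipedia
}

def shield_for(category: str) -> str:
    """Return the shield color for a site's rating category."""
    # Assumption: unknown categories are treated like unrated platforms.
    return SHIELD_COLORS.get(category, "gray")

print(shield_for("satire"))  # gold
```

The point of the sketch is simply that the intervention is source-level, not article-level: every page from a given domain carries the same shield, regardless of the individual story.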

The other half of the participants did not receive any guidance or information about the news sources they visited online.

The study was conducted in 2020, when the misinformation participants encountered was mainly related to COVID-19 and the Black Lives Matter movement. The researchers gave all participants a survey two weeks before asking the treatment group to install NewsGuard for three to four weeks, and a second survey two weeks after that period.
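The design described above is a standard two-arm randomized experiment: a pre-treatment survey, random assignment of half the sample to install the extension, and a post-treatment survey. A minimal sketch of the assignment step, assuming integer participant IDs and a 50/50 split (both assumptions for illustration):

```python
import random

def assign_arms(participant_ids, seed=42):
    """Randomly split participants into a treatment arm (gets the
    extension) and a control arm (gets no source information).
    A fixed seed makes the assignment reproducible."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"treatment": ids[:half], "control": ids[half:]}

# Roughly the sample size reported in the study.
arms = assign_arms(range(3000))
print(len(arms["treatment"]), len(arms["control"]))  # 1500 1500
```

Randomizing the split is what lets the researchers attribute any difference in news diets or survey answers between the two groups to the extension itself rather than to pre-existing differences.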

“As you can see with Elon Musk buying Twitter and this whole debate on online freedom of speech, there is definitely this fine line between autonomy and giving people the right information,” said Kevin Aslett, a postdoctoral fellow at the NYU Center for Social Media and Politics and the first author of the Science Advances paper. “NewsGuard walks that line without telling people something is a lie. It gives them subtle information by saying, ‘Hey, this is a low-quality news source that doesn’t seem reliable, for these reasons.’ Previous work has found that if we give people this source information, they are less likely to believe misinformation when they see it.”

[Related: Pending any plot twists, Elon Musk will soon own Twitter]

But for most users, the labels did not change online behavior in any measurable way. Overall, the team found no significant effect on the average internet user’s news diet, or on any of the outcomes misinformation is thought to affect, such as polarization, political cynicism, media trust, and common misperceptions. “A possible reason for this is that people just don’t consume a lot of low-quality news,” Aslett says. “About 65 percent of our sample didn’t visit anything unreliable during the treatment period.”

However, the extension did have a significant effect on the individuals who consumed the most low-quality news. “When you look at the 35 percent who do visit unreliable news, a small percentage of that group relies mostly on low-quality news, and that group of individuals is where we saw the main effects,” Aslett says. “It wasn’t drastic, like they stopped visiting unreliable news altogether, but there was definitely a drop in the amount of unreliable news they consumed.”

[Related: Twitter’s efforts to tackle misleading tweets just made them thrive elsewhere]

In addition, the decline was more significant in this high-misinformation group than among individuals who viewed only a few unreliable news items each week. This finding is consistent with previous studies showing that sharing fake news happens much less often than expected, and that a small number of people are responsible for spreading most misinformation online.

“One conclusion from this study is that these interventions are advertised to the average internet user, and probably [to] internet users who already consume the most reliable news,” notes Aslett. “But it seems to have a positive effect only on those who consume the most misinformation, who are unlikely to download or enable these web extensions on their own.”

“Maybe we should change who we target these interventions at, and try to find a way to advertise them to the groups of individuals who consume a lot of misinformation,” he adds.
