Facebook will start nudging users who have liked coronavirus hoaxes

Facebook just announced another step in its effort to set the record straight about misleading information about the novel coronavirus on its platform. In the coming weeks, the company will start directing people who have previously liked, reacted to, or commented on harmful misinformation about Covid-19 to information from more authoritative sources, such as a myth-busting page from the World Health Organization (WHO).

This represents one of the first times Facebook will warn a specific set of users who have interacted with false information in the past. Experts have argued for years that the social network is rife with misleading information that taints public discourse, and they have asked Facebook to take exactly this kind of retroactive approach.

"We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook," wrote Guy Rosen, Facebook's vice president of integrity, in a company blog post released on Thursday. The notifications will apply only to Facebook, not to its other platforms like Instagram and WhatsApp.

Since the beginning of the Covid-19 crisis, social media users have been posting popular and dangerous hoaxes about the virus, including false cures and myths about the origins of the outbreak. In response, companies like Facebook, YouTube, and Twitter have been stepping up their efforts to flag, and sometimes delete, this type of content. In March alone, Facebook says it labeled 40 million posts as false on its network, relying on its team of independent third-party fact-checkers. Still, plenty of bad information that fact-checkers don't catch regularly slips through the cracks, or is caught only after tens of thousands of users have already seen the posts in question.

Facebook included a screenshot of a new News Feed message that will be shown to users who have interacted with false coronavirus information, pointing them to WHO resources.

The design Facebook provided looks more like a gentle nudge than a specific warning, and some have called for stronger notifications that specifically correct the record on individual false claims. When Recode asked what the notifications will look like, a spokesperson for Facebook said the design in the blog post is an early version and that it's also testing more explicit variations.

"We'll continue to iterate on these designs with a goal of ensuring people who've been exposed to harmful misinformation about Covid-19 are connected with the facts," said the spokesperson.

For Facebook's critics, the move toward retroactively notifying users about misleading Covid-19 content, even if incremental, is a welcome development. Many academics and policy experts have scolded Facebook for lax moderation, not only on Covid-related topics but also on misinformation around issues like immigration and politics. Historically, Facebook and other social media companies have been reluctant to flag politically contentious information as false, arguing that over-policing content on their platforms could limit free speech. But with the coronavirus, the company has adopted a more aggressive approach.

"[T]he company has taken a key first step in cleaning up the dangerous infodemic surrounding the coronavirus, but it has the power to do so much more to fully protect people from misinformation," wrote Fadi Quran, campaign director at the nonprofit activist group Avaaz, in a statement to Recode. Avaaz is one of several groups that have been pushing for stronger fact-checking and for corrections to be issued more broadly on the platform, not just on content about Covid-19. New research commissioned by the organization shows that Facebook corrections have a major impact in shaping users' views and can effectively reduce people's belief in misinformation by 50 percent.

Facebook declined to answer a question from Recode about whether it will apply its warnings to other types of misinformation in the future.

For companies like Facebook, it's a lot easier to draw a line in the sand on misinformation about coronavirus topics than around more politically contentious ones, like gun rights, abortion, immigration, or even the 2020 US elections. While there's still plenty of uncertainty about Covid-19 (even the alleged biases of authoritative sources like the WHO have come under question), it's still much easier to prove why a hoax about Covid-19 is wrong than it is to confirm the veracity of a personal attack on a politician.

In the next few weeks, we'll learn more about what these new retroactive notifications to Facebook users who have been exposed to coronavirus misinformation will look like, not to mention how widely they're being sent out. But it may take much longer to see whether Facebook's new moderation strategy will become a lasting mechanism for fighting misinformation online more broadly.
