In an attempt to curb misinformation on its platform, Facebook is launching new ways to inform people when they interact with content that a fact-checker has rated as misinformation, and is taking stronger action against people who repeatedly share it.
From false or misleading content about COVID-19 and vaccines to climate change, elections, and other topics, the tech giant is taking steps to combat misleading information and ensure that fewer people see it on its platform.
What Facebook is saying
Facebook has also redesigned the notifications that pop up when users share content that fact-checkers have labelled as false.
An excerpt from Facebook's blog post states, “We currently notify people when they share content that a fact-checker later rates, and now we’ve redesigned these notifications to make it easier to understand when this happens. The notification includes the fact-checker’s article debunking the claim as well as a prompt to share the article with their followers. It also includes a notice that people who repeatedly share false information may have their posts moved lower in News Feed so other people are less likely to see them.”
How Facebook will notify users of fake news pages
If you try to like such a page, you will see a pop-up saying that the page has “repeatedly shared false information,” and that “independent fact-checkers said the information is false.”
You will then be presented with a choice of going back to the previous page or following the page anyway. This will help people make an informed decision about whether they want to follow the page.
Facebook to penalise repeat offenders
Facebook also said it would expand penalties for individual accounts that repeatedly share misinformation: if a person repeatedly shares content that its fact-checking partners have rated as false, the distribution of all posts from that account will be reduced in News Feed.
In case you missed it
- Facebook has repeatedly reiterated its commitment to fighting the spread of misinformation about COVID-19 and vaccines across all its social networks amid spikes in infections, including removing major groups, accounts, and Instagram pages that repeatedly spread such misinformation.
- Facebook-owned Instagram will also make it harder to find accounts that discourage vaccination, and will remove them if they continue to violate the rules.