Facebook announced it’s launching new ways to inform people if they’re interacting with content that’s been rated by a fact-checker, as well as taking stronger action against people who repeatedly share misinformation on the platform.
Before a user likes a page that has repeatedly shared misinformation, Facebook will show a pop-up warning that includes what fact-checkers have said about false posts shared by the page.
Facebook says it is also expanding penalties for individual accounts. Starting today, the platform will reduce the distribution of all News Feed posts from an individual's Facebook account if they repeatedly share content that has been rated false by one of its fact-checking partners.
"Whether it's false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we're making sure fewer people see misinformation on our apps," the company said.
The company also announced redesigned notifications for when people share fact-checked content. The notification includes the fact-checked article debunking the claim, as well as a prompt to share that article with their followers. It also includes a notice that people who repeatedly share false information may have their posts moved lower in News Feed so that other people are less likely to see them.
Recently, the platform also started testing a "read before you share" pop-up. In an approach similar to Twitter's, Facebook now suggests that users read an article before sharing it.