(ABC4) – Facebook is taking new action against those who repeatedly share misinformation.
The social media giant already attaches notifications to posts that may contain inaccurate information, and it is launching new ways to inform you when you interact with content rated by a fact-checker. It will also take stronger action against users who repeatedly share misinformation.
“Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps,” Facebook says.
Before you like a Facebook Page that has repeatedly shared content rated by fact-checkers, you will see a pop-up when you visit the Page.
You can click the pop-up to learn more, including the fact that fact-checkers found false information in some posts shared by the Page, along with a link to more information about Facebook’s fact-checking program.
While Facebook has taken action against Pages, Groups, Instagram accounts, and domains sharing misinformation, the company says it will now expand penalties to individual Facebook accounts. If an individual repeatedly shares content rated by a fact-checker, Facebook will reduce the distribution of all posts in News Feed from that individual’s account.
Facebook says it has redesigned the notifications sent to individuals when they share content a fact-checker later rates. New notifications include the fact-checker’s article debunking the claim and a prompt to share that article. Those who repeatedly share information labeled as false will also receive a notice that their posts may be moved lower in News Feed.
Facebook has recently been in the spotlight for its handling of misinformation and flagging accounts, especially after its Oversight Board decided to uphold the platform’s decision to restrict former President Donald Trump’s access to his account. The Board did call on Facebook to review its current rules.
Facebook originally created the oversight panel to rule on content on its platforms after widespread criticism that it struggled to respond swiftly and effectively to misinformation, hate speech, and nefarious influence campaigns. Its decisions so far have tended to favor free expression over the restriction of content, according to the Associated Press.