As the coronavirus continues to spread across the United States, Facebook is working to curb the spread of misinformation online.
Guy Rosen, Vice President of Integrity at Facebook, published a blog post explaining how the social media giant will inform users about coronavirus misinformation and stop it from spreading on its platform.
“Ever since COVID-19 was declared a global public health emergency in January, we’ve been working to connect people to accurate information from health experts and keep harmful misinformation about COVID-19 from spreading on our apps,” Rosen writes.
Through its coronavirus information center, Facebook will direct users to sources such as the World Health Organization (WHO) and other health experts.
Rosen reports that Facebook works with fact-checking organizations to identify misinformation.
Once fact-checkers rate a claim as false, Facebook will “reduce its distribution and show warning labels with more context.”
Rosen gives examples of the misinformation Facebook has removed.
“Examples of misinformation we’ve removed include harmful claims like drinking bleach cures the virus and theories like physical distancing is ineffective in preventing the disease from spreading,” Rosen writes.
He also outlines additional steps Facebook is taking to notify its users.
Users who have interacted with coronavirus misinformation through likes or comments will be notified in their News Feed and directed to “myths debunked by the WHO.”
“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” the blog post reads.
Users will begin to receive these notifications within the next few weeks.
To provide more accurate information, Facebook has also added a new feature to its information center that gives users the facts on the coronavirus by surfacing articles that have been fact-checked.