Meta Platforms said on Friday that a policy put in place to curb the spread of misinformation related to COVID-19 on Facebook and Instagram would no longer be in effect globally.
Social media platforms such as Facebook and Twitter came under immense pressure to tackle misinformation related to the pandemic, including false claims about vaccines, prompting them to take stringent measures.
In 2021, Facebook said it had taken down 1.3 billion fake accounts between October and December and removed more than 12 million pieces of content on COVID-19 and vaccines that global health experts flagged as misinformation.
The Facebook parent in July last year sought the opinion of its independent oversight board on changes to its approach, citing the wider availability of authoritative information and greater general awareness around COVID-19.
However, Meta said on Friday that the rules would remain in force in countries that still have a COVID-19 public health emergency declaration, and that the company would continue to remove content that violates its coronavirus misinformation policies there.
"We are consulting with health experts to understand which claims and categories of misinformation could continue to pose this risk," Meta said in a blog post.
In November last year, Twitter also rolled back its COVID-19 misinformation policy.
(Reporting by Tiyashi Datta in Bengaluru; Editing by Anil D'Silva)