Facebook today announced further changes to its policy against fake news on its social networks. The company will now be more aggressive in punishing users who share misleading information on both Facebook and Instagram.

As reported by The Guardian, this is a strong move by Facebook to curb anti-vaccination posts, which have become more frequent as COVID-19 vaccines arrive. The social network has confirmed that it will remove any content that has been debunked by public health experts.

A Facebook spokesperson said these changes will soon be applied to all Facebook and Instagram users around the world.

“Given the recent news that COVID-19 vaccines will soon be rolling out around the world, over the coming weeks we will also start removing false claims about these vaccines that have been debunked by public health experts on Facebook and Instagram.”

Last year, Facebook banned advertisements with misinformation about vaccines in an attempt to stop the spread of vaccine hoaxes.

It’s not only Facebook that has been trying to prevent the dissemination of fake news. Twitter has also become more aggressive about warning users when content contains misleading information, recently adding alerts to disputed tweets.

In the face of a situation as complicated as the COVID-19 pandemic, big tech companies have been investing more in delivering trustworthy news and curbing misleading content.



You’re reading 9to5Mac — experts who break news about Apple and its surrounding ecosystem, day after day.