BBC News

Covid-19: Facebook to take down false vaccine claims

By Alistair Coleman
BBC Monitoring

Image caption: Facebook wants to stamp out content that discourages vaccinations (Getty Images)

Facebook says it will start removing false claims about Covid-19 vaccines to prevent “imminent physical harm”.

The company says it is accelerating its plans to ban misleading and false information on its Facebook and Instagram platforms, following the approval of the first vaccine for use in the United Kingdom.


Among already-debunked claims that won’t be allowed are falsehoods about vaccine ingredients, safety, effectiveness and side-effects.

Also banned will be the long-running false conspiracy theory that coronavirus vaccines will contain a microchip to control or monitor patients.

Facebook has come under fire for what’s been seen as a patchy approach to fake news and false claims, and misleading content about the pandemic is still widely available on its platforms.

What did Facebook say?

It says it will remove false claims about Covid-19 vaccines “that have already been debunked by public health experts”.

Facebook says that since January it has been removing content about the pandemic, such as false cures and treatments or claims that the disease doesn’t exist at all.

In October, it banned advertisements that discouraged people from taking vaccines.

This is a continuation of the policy “to remove misinformation about the virus that could lead to imminent physical harm”, the company said.

“This could include false claims about the safety, efficacy, ingredients or side effects of the vaccines [and] false claims that Covid-19 vaccines contain microchips, or anything else that isn’t on the official vaccine ingredient list.

“We will also remove conspiracy theories about Covid-19 vaccines that we know today are false.”

However, Facebook warned that these policies, which the BBC understands have been brought forward following the approval of the Pfizer/BioNTech vaccine by the British medicines regulator, will take some time to come into effect.

“We will not be able to start enforcing these policies overnight,” a Facebook statement said.

Image caption: Anti-vaccine sentiment has been picked up by believers of other conspiracy theories (Getty Images)
Analysis by Marianna Spring, disinformation and social media reporter

Social media is awash with conspiracy theories about coronavirus vaccines, which resurface whenever news breaks – like the approval of the Pfizer/BioNTech vaccine for use in the UK.

These conspiracies are worlds away from legitimate concerns, making false allegations of microchipping, genocide and DNA alteration.

For that reason, many will welcome the announcement of this crackdown by Facebook – including politicians who have been calling for action from the social media giants.

It’s tougher than previous announcements – this time a commitment to removing conspiracies rather than just fact-checking or labelling these posts as misleading.

But fears remain that yet another commitment to tackle misinformation by the social media site will not translate into effective action – or be the right approach.

Conspiracy narratives about the coronavirus vaccine have become highly prevalent, often originating in anti-vaccination and pseudoscience circles before spilling into parent chats, community forums and Instagram feeds.

Undoing those narratives months after they first began to spread may not be enough to dispel the doubt they have sown in communities around the world.


What response has there been?

While Facebook’s announcement has been broadly welcomed, there are concerns that the company might not follow through on its promises.

Imran Ahmed, CEO of the Center for Countering Digital Hate, said in a statement: “Today’s policy change is long overdue, but there is no guarantee that it will be properly enforced.”

“As we saw in the months following Facebook’s promise to remove misinformation about coronavirus earlier this year, they rarely enforce their own policies.”

Mr Ahmed went on to urge governments to accelerate plans to regulate harmful online content. “They must introduce tough regulations as soon as possible to ensure these policies are enforced, including criminal sanctions for breaches of their duty to remove harmful material that puts lives at risk.”

The UK government has been planning an Online Harms Bill for this very purpose, but it has been criticised over “unacceptable” delays in its publication. Jo Stevens, Labour’s Shadow Secretary of State for Digital, called for emergency legislation “to protect people from this dangerous disinformation”.