Anti-vax groups use carrot emoji to evade Facebook moderation

Anti-vax protesters hold a placard reading "Vaccination kills" during a mass protest against COVID quarantine restrictions in Kiev. (AFP file photo)
Updated 16 September 2022

  • The vegetable emoji was used as a substitute for terms such as "vaccine," "booster" and "COVID-19," making the posts almost impossible for content moderation algorithms to detect
  • A disinformation researcher at a university in Qatar noticed the trend after being invited to join one of the groups

LONDON: Anti-vax groups are using carrot emojis to evade the automated moderation tools that social media networks use to detect content violating their platform policies, the BBC reported on Friday.

An investigation revealed multiple Facebook groups in which the carrot emoji was substituted for the word “vaccine.” Because Facebook’s algorithm normally concentrates on words rather than emojis, members were able to sidestep the platform’s automatic content moderation mechanisms.

According to the report, one Facebook group using this tactic had over 250,000 members.

The groups, which could be joined only by invitation, had clear guidelines urging members to "use code words for everything" and warning them: "Do not use the c word, v word or b word ever," a reference to "COVID," "vaccine" and "booster."

The investigation also said that groups using the carrot emoji were promoting unverified claims that people are being hurt or killed by vaccines.

Marc Owen Jones, a disinformation researcher at Hamad bin Khalifa University in Qatar, noticed the trend after he was invited to join one of the groups and took to Twitter to share his findings.

“It was people giving accounts of relatives who had died shortly after having the COVID-19 vaccine,” he said. “But instead of using the words ‘COVID-19’ or ‘vaccine,’ they were using emojis of carrots.

“Initially I was a little confused. And then it clicked — that it was being used as a way of evading, or apparently evading, Facebook’s fake news detection algorithms.”

After the BBC reported the findings to Meta, the groups were taken down, though some reappeared shortly afterwards.

“We have removed this group for violating our harmful misinformation policies and will review any other similar content in line with this policy. We continue to work closely with public health experts and the UK government to further tackle COVID vaccine misinformation,” Meta said in a statement.

Meta, along with other social media platforms, has been under intense scrutiny in the past two years for failing to remove fake news about COVID-19 and vaccines.

Facebook said last year that it had removed more than 20 million pieces of content containing misinformation about COVID-19 or vaccines since the start of the pandemic.

Emojis are more difficult for algorithms to detect because moderation models are typically trained on words and text, which may explain how these groups managed to go unnoticed for so long.
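The weakness can be illustrated with a minimal sketch. The keyword list, the emoji-to-word mapping and all function names below are hypothetical examples, not any platform's actual moderation rules: a naive word-based filter catches a plainly worded post but misses the same post with the flagged word swapped for a carrot emoji, unless known emoji substitutions are normalized back to words first.

```python
# Hypothetical blocklist for illustration only; real moderation systems
# use far more sophisticated models than exact word matching.
BLOCKED_TERMS = {"covid", "vaccine", "booster"}

# Curated mapping of known emoji code words (hypothetical example).
EMOJI_SUBSTITUTIONS = {"\U0001F955": "vaccine"}  # carrot emoji

def flags_post(text: str) -> bool:
    """Return True if any blocked term appears as a word in the text."""
    words = text.lower().split()
    return any(term in words for term in BLOCKED_TERMS)

def normalize(text: str) -> str:
    """Replace known emoji code words with the terms they stand for."""
    for emoji, term in EMOJI_SUBSTITUTIONS.items():
        text = text.replace(emoji, term)
    return text

plain = "My uncle got sick after the vaccine"
coded = "My uncle got sick after the \U0001F955"

print(flags_post(plain))             # the plainly worded post is flagged
print(flags_post(coded))             # the emoji-coded post slips through
print(flags_post(normalize(coded)))  # normalizing the emoji catches it
```

The sketch also shows why the tactic is fragile once discovered: a single substitution entry is enough to close the gap, but moderators must first learn which emoji a community has adopted as its code word.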

With emoji-based hate posing a growing challenge for automated detection, a team of researchers at the University of Oxford and the Alan Turing Institute created HatemojiCheck, a test suite that exposes weaknesses in existing hate speech detection models by testing how they handle hateful language expressed through emojis.