Facebook takes down China-based network spreading false claims about COVID-19: NPR
Kirill Kudryavtsev / AFP via Getty Images
The parent company of Facebook and Instagram said on Wednesday it had taken down more than 600 accounts, pages and groups linked to a Chinese influence operation that spread COVID-19 misinformation, including one account purporting to be a fictitious Swiss biologist.
The China-based network is one of six that Meta, formerly known as Facebook, removed in November for abusing its platforms, a reminder that bad actors around the world use the social network to spread misinformation and harass people.
Other operations included one supporting Hamas and two others, based in Poland and Belarus, that focused on the migration crisis on those countries' shared borders.
Meta also removed a network linked to an anti-vaccination conspiracy movement in Europe that harassed doctors, elected officials and journalists on Facebook and other internet platforms, as well as a group of accounts in Vietnam that reported activists and government critics to Facebook in an attempt to get them banned from the social network.
The China-based operation came to light after the company was alerted to an account purporting to be a Swiss biologist named Wilson Edwards (no such person exists). The account posted claims on Facebook and Twitter in July that the U.S. was pressuring World Health Organization scientists to blame China for the COVID-19 virus. Posts alleging U.S. intimidation soon appeared in stories in Chinese state media.
Ben Nimmo, who investigates influence operations at Meta, wrote in the company's report that the operation involved individuals in China, including some "associated with Chinese state-owned infrastructure companies around the world."
The China-based activity is an example of what Meta calls "coordinated inauthentic behavior," in which adversaries use fake accounts for influence operations, as Russian agents did by impersonating Americans on Facebook in the run-up to the 2016 U.S. presidential election.
But recently, Meta's security team has broadened its focus to root out accounts of real people who work together to cause harm both on Facebook and offline.
That was the rationale for removing a network of accounts in Italy and France tied to an anti-vaccination movement known as V_V. According to a report from the research firm Graphika, the group coordinates primarily on the messaging app Telegram but "appears to primarily target Facebook, where members display the group's double-V symbol in their profile pictures" and flood the comment sections of pro-COVID-19-vaccine posts with hundreds of abusive messages. Graphika said the group has also vandalized medical facilities and attempted to disrupt public vaccination programs.
Meta said the people behind the network used real, duplicate and fake accounts to comment on Facebook posts and threaten people, which breaks the company's rules against "brigading." Meta said it is not banning all V_V content but will take further action if it finds more rule violations. It did not say how many accounts in the network it removed.
The company acknowledges that even as it gets faster at detecting and removing accounts that violate its rules, it is still playing a cat-and-mouse game.
"We built our defenses with the expectation that they would not stop, but rather adapt and try new tactics," Nathaniel Gleicher, Meta's head of security policy, wrote in a blog post on Wednesday.
Editor’s Note: Meta pays NPR to license NPR content.