Bots were the primary driver behind the spread of COVID-19 misinformation online, a new study suggests.
Researchers examined how pandemic-related links – mainly concerning the use of masks – were shared in more than 300,000 posts made to Facebook groups.
They measured bot activity by analyzing the timing with which links to misinformation were shared into these groups, treating frequent sharing – multiple posts of the same link into different groups only seconds apart – as a sign of bot activity.
The team – led by the University of California San Diego in collaboration with researchers at George Washington University and Johns Hopkins University – are asking for Facebook and other social media giants to tighten restrictions and limit the spread of misinformation.
It’s worth noting that some claims classified as ‘misinformation’ early in the pandemic are now being reexamined, such as questions about the origins of the virus, including the possibility that it may have come from a lab in Wuhan.
Researchers found that a large portion of misinformation about Covid-19, masks and the vaccines was spread by bot accounts on social media.
‘The coronavirus pandemic has sparked what the World Health Organization has called an “infodemic” of misinformation,’ said lead author Dr John Ayers, a scientist who specializes in public health surveillance at the University of California, San Diego.
‘But, bots… have been overlooked as a source of COVID-19 misinformation.’
One of the links the researchers tracked pointed to a study performed in Denmark that found inconclusive evidence on whether wearing a mask reduced transmission of COVID-19.
The study was misinterpreted, and used as a source of misinformation by many on social media, especially Facebook.
Researchers found that the post was often being shared by multiple accounts to multiple groups many times in the span of seconds, a sign that the accounts sharing the post were bots operating in the same network.
Almost 40 percent of the times the post was shared on Facebook, it was shared in groups that the researchers flagged as having heavy bot activity.
One in five of those posts lied about the results of the study, saying that the researchers determined masks were harmful to their wearer – a conclusion that never appears in the research.
Posts of the study in Facebook groups with detected bot activity were 2.3 times more likely to include the false claim that masks harm their wearer.
‘Bots also appear to be undermining critical public health institutions,’ said Brian Chu, study coauthor and medical student.
‘In our case study, bots mischaracterized a prominent publication from a prestigious medical journal to spread misinformation.’
‘This suggests that no content is safe from the dangers of weaponized misinformation.’
The researchers are asking for Facebook and other social media giants to tighten restrictions on the spread of disinformation.
They believe that companies like Facebook can readily detect and remove false information spread by bots, given that the researchers were able to identify much of the disinformation and many of the bot-active groups themselves.
The researchers also fear that bots may be manipulating the platforms’ recommendation algorithms: mass sharing by bots can make a story appear more popular than it is, causing the algorithm to boost it in users’ feeds.
‘Our work shows that social media platforms have the ability to detect, and therefore remove, these coordinated bot campaigns,’ said Dr David Broniatowski, associate director of the GW Institute for Data, Democracy, and Politics, and study coauthor.
‘Efforts to purge deceptive bots from social media platforms must become a priority among legislators, regulators, and social media companies who have instead been focused on targeting individual pieces of misinformation from ordinary users.’
Not all researchers agree, however.
Kamran Abbasi, executive editor of The BMJ, one of the oldest medical journals in England, wrote in an op-ed that social media platforms censoring these stories could be dangerous.
‘It seems 2020 is Orwell’s 1984, where the boundaries of public discourse are governed by multibillion dollar corporations (in place of a totalitarian regime) and secret algorithms coded by unidentified employees,’ Abbasi wrote in reference to Facebook potentially censoring or labeling stories about the Danish study as ‘misinformation’.
‘Where is Facebook’s accountability for the lies and damaging misinformation that it has peddled on controversial topics such as mental health and suicides, minorities, and vaccines?
‘Facebook in particular purports to allow freedom of speech on its platform but acts selectively, seemingly without logic, consistency, or transparency.
‘That is how control of facts and opinions furthers hidden agendas and manipulates the public.’
Misinformation regarding Covid-19 has spread across the world as fast as the virus.
Last week, prominent feminist writer Naomi Wolf was suspended by Twitter after a series of posts spreading misinformation about the Covid-19 vaccines.
Recent claims she made on Twitter included that vaccines were software platforms capable of receiving uploads, and that sewage from vaccinated individuals could contaminate the drinking water supply – both claims with no scientific backing.
Facebook, Instagram and other platforms have also added features to combat vaccine misinformation, linking information about the vaccines automatically to any post made about the shots.
Facebook has even said they will outright remove certain posts making baseless claims about the vaccines.
The study will be available in JAMA Internal Medicine on Monday.