Facebook veteran and Meta’s head of virtual reality Andrew Bosworth says that “individual humans” are to blame for the spread of misinformation.
“If we took every single dollar and human that we had, it wouldn’t eliminate people seeing speech that they didn’t like on the platform. It wouldn’t eliminate every opportunity that somebody had to use the platform maliciously,” he said in an interview with Axios.
“Individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing,” Mr Bosworth continued.
“I don’t feel comfortable at all saying they don’t have a voice because I don’t like what they said.”
Meta’s platforms – Facebook, Instagram, and WhatsApp – have all been used to spread misinformation about the coronavirus pandemic.
Researchers running experiments on the platform found that two brand-new accounts they had set up were recommended 109 pages containing anti-vaccine information in just two days.
A study conducted by the non-profit Centre for Countering Digital Hate and Anti-Vax Watch suggested that close to 65 per cent of the vaccine-related misinformation on Facebook was coming from 12 people.
Facebook, however, said those people were only responsible for 0.05 per cent of all views of vaccine-related content on the platform.
“If your democracy can’t tolerate the speech of people, I’m not sure what kind of democracy it is. [Facebook is] a fundamentally democratic technology,” Mr Bosworth said in the interview.
Recently, it was revealed that Facebook maintained a secret VIP list that allowed high-profile users to break its policies. Approximately 5.8 million celebrities, politicians, and journalists were “whitelisted”, exempting them from enforcement of Facebook’s rules under the “cross check” or “XCheck” system.
“We are not actually doing what we say we do publicly,” said the review from Facebook into XCheck, calling the actions “a breach of trust.”
It continued: “Unlike the rest of our community, these people can violate our standards without any consequences.”
Facebook’s algorithm has also been criticised for inherently promoting inflammatory views. A 2018 internal presentation, leaked last year, showed that the company knew its algorithm encouraged divisiveness but that moves to stop it would be “antigrowth” and require “a moral stance”.
Facebook did not respond to a request for comment from The Independent before the time of publication.