12 state AGs push Facebook, Twitter to crack down on anti-vaxxers


Washington — A group of 12 state attorneys general on Wednesday urged Facebook and Twitter to “take immediate steps” to crack down on online “anti-vaxxer” falsehoods amid the ongoing effort to vaccinate the public against COVID-19.

In a letter to Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey, the attorneys general pressed the social media giants to fully “identify and enforce” the companies’ terms of service to combat vaccine disinformation and misinformation.

“A small group of individuals use your platforms to downplay the dangers of COVID-19 and spread misinformation about the safety of vaccines,” the group said. “These individuals lack medical expertise and are often motivated by financial interests.”

The letter was signed by the attorneys general of Connecticut, Delaware, Iowa, Massachusetts, Michigan, Minnesota, New York, North Carolina, Oregon, Rhode Island, Pennsylvania and Virginia.

“Anti-vaxxer” accounts on Facebook, YouTube, Instagram and Twitter reach more than 59 million followers, the group said, citing a recent report by the Center for Countering Digital Hate. The attorneys general also noted that anti-vaccine social media accounts target individuals of color and Black communities in particular, who have been disproportionately impacted by the virus and whose vaccination rates lag behind other demographic groups.

And while willingness to get the vaccines has increased, according to an Edelman Trust Barometer poll released earlier this month, more than 40% of respondents who identified as Black and Latinx said they could not get an appointment or get to a vaccination site, compared to 34% of Asian respondents and 24% of White respondents. People of color are more likely to contract COVID-19 than White people, and Black, Hispanic and Native Americans are two to three times more likely to require hospitalization, according to Centers for Disease Control and Prevention data. Among minority populations, vaccine hesitancy may be compounded by a history of medical abuse of racial and ethnic minorities.

“‘Anti-vaxxers’ are using social media platforms to target people of color and Black Americans specifically, members of communities who have suffered the worst health impacts of the virus and whose vaccination rates are lagging,” the letter to the tech giants read.

The attorneys general also said that Twitter and Facebook have not taken action against some prominent anti-vaccine accounts that repeatedly violate the companies’ guidelines.

“The only way that we will be able to get back to normal as a country is through widespread use of safe, effective vaccinations, which is why it’s so important that everyone who can be vaccinated is,” Virginia Attorney General Herring said in a statement. “The spread of misinformation about COVID vaccines over social media sites like Facebook and Twitter could be detrimental to the national effort to end the coronavirus pandemic.”

A spokesman for Twitter told CBS News that “making certain that reliable, authoritative health information is easily accessible on Twitter has been a priority long before we were in the midst of a global pandemic.” The spokesman said the company had removed 22,400 tweets under its COVID-19 “misleading information policy” and “challenged” nearly 12 million accounts.

Dani Lever, a spokesperson for Facebook, said the company is “continuing to work with health experts to make sure that our approach and our policies are in the right place.” Lever said Facebook has removed more than 2 million pieces of content since February.

Social media companies have grappled with misinformation about vaccines for years, well before the coronavirus pandemic. In 2019, Facebook began de-prioritizing medical myths across the platform, removing “verifiable vaccine hoaxes” from News Feeds, public and private pages and groups, search predictions and recommendations.

Last month, Facebook widened its ban on vaccine misinformation and pledged to remove false information claiming that COVID-19 is man-made, that vaccines are not effective against the diseases they are meant to prevent, that vaccines cause autism and that it is safer to get COVID-19 than receive the vaccine.

But policing false information about vaccines poses a challenge for social media companies, which must sort through both misinformation and disinformation, as online content runs the gamut from users expressing genuine concerns to others deliberately amplifying falsehoods.

“Vaccine conversations are nuanced, so content can’t always be clearly divided into helpful and harmful,” Kang-Xing Jin, Facebook’s head of health, wrote in an op-ed in the San Francisco Chronicle this month. “It’s hard to draw the line on posts that contain people’s personal experiences with vaccines.”

Facebook has conducted its own study of American users’ vaccine doubts, according to The Washington Post, which found that even content that does not violate the company’s terms and conditions may cause harm in certain communities.

The company launched an effort earlier this month to help users get reliable information about vaccines and schedule appointments to get their shots. Chris Cox, Facebook’s chief product officer, told CBS News that the new tools are meant to help curb the spread of harmful content.

“[Misinformation] is certainly something we’re taking very seriously,” said Cox. “There’s a lot of information out there about COVID, and there’s a lot of questions about COVID. We took the step, as you know, of taking a stand that for anything that was a viral piece of misinformation or a hoax spreading misinformation about COVID, we were going to work with fact-checkers to identify those and that we were going to take it down to remove it from our platforms.”

Earlier this month, Twitter announced it will begin applying labels to tweets that include misleading information about COVID-19 vaccines, introducing a “strike” policy to deter users from repeatedly violating those guidelines.

The company announced in December that it may require users to remove tweets that advance harmful, false or misleading narratives about vaccines, including suggestions that immunizations and vaccines are used to intentionally control citizens.

Both Zuckerberg and Dorsey, as well as Alphabet and Google CEO Sundar Pichai, plan to testify before the House Energy and Commerce Committee on Thursday in a hearing about social media’s role in promoting extremism and misinformation.
