Social media was initially created for people to positively interact with one another, but the dark side has always existed — from cyberbullying to false information that misled or intimidated voters during the 2020 presidential election. This year, misinformation on social networks is among the greatest threats to mass vaccination in BIPOC communities.
According to a new study by Brilliant Corners and the National Urban League (NUL), 40 percent of respondents had heard or read something that made them less likely to take the vaccine when given the chance, a result of misinformation. That number is even higher for Black Americans, as theGrio reports, at 48 percent.
It is no secret that fewer people of color have been vaccinated compared to white Americans. As of March 15, the Centers for Disease Control and Prevention (CDC) reported national COVID-19 vaccinations by race and ethnicity as 66 percent white, nine percent Hispanic, eight percent Black, two percent Asian, and less than one percent Native Hawaiian or Pacific Islander.
Historically, vaccine hesitancy can be traced back to racial bias in medicine that sowed distrust between people of color and their physicians. The Brilliant Corners study for the National Urban League examines racial disparities during the COVID-19 pandemic and how to drive positive attitudes toward vaccination.
Two-thirds of all U.S. adults use social networks, meaning the majority of Americans receive news through posts made on social media. This raises the question of whether social networks like Facebook, Instagram, and Twitter should be held responsible for misinformation promoting vaccine hesitancy.
While Brilliant Corners’ study does not specifically hold social networks responsible for misinformation, platforms like Facebook and Instagram are working against misinformation on their sites.
In February, Facebook announced a list of content that would be removed from its platform during the pandemic for violating its Community Standards. The new policy enables the platform to remove false information regarding COVID-19, COVID vaccines, and claims about vaccines in general. These measures also apply to Instagram, which is owned by Facebook.
Since launching this policy, Facebook has connected two billion people to credible information through the largest online vaccine information campaign in history.
“We’ve removed two million pieces of content on Facebook and Instagram that violate our COVID-19 and vaccine misinformation policies,” said Facebook spokeswoman Dani Lever, who also explained that Facebook is amplifying content directly serving communities where vaccine intent and access may be lower.
“We are providing free ads to health organizations to target and promote reliable information about COVID-19 vaccines,” Lever said. “In the U.S., we’re partnering with Johns Hopkins University to reach BIPOC communities, among others, with reliable information that addresses these communities’ questions and concerns.”
Although Spotify is not a social network, the company says it understands the responsibility it has in driving change and raising awareness about the critical issues society is facing.
A Spotify spokesperson told Marcom Weekly: “Spotify prohibits content on the platform which promotes dangerous, false, deceptive, or misleading content that may cause offline harm or pose a direct threat to public health.”
The audio streaming service, which had 144 million premium subscribers worldwide in 2020, uses various algorithms and human detection measures to ensure content on Spotify is in line with the policy stated above. “When content is found to be in violation, it is removed from our platform,” said the spokesperson.
Recently, Spotify partnered with the Ad Council to launch “It’s Up To You,” a COVID-19 vaccine education initiative for which it will produce custom audio PSAs and messaging for podcast hosts to promote COVID-19 vaccine awareness and education.