When a gunman mounted a camera to his helmet and livestreamed on Twitch as he killed 10 Black people at a grocery store in Buffalo, N.Y., on May 14, it was the latest racist attack amplified through social media platforms.
In fact, the cold-blooded murders were shared like just another “game” on Twitch, a popular game-streaming platform.
The streaming and broadcasting of these horrific acts are part of the “gamification” of such attacks, according to the Royal United Services Institute, a U.K. think tank that studies defense, security and international affairs. It defines “gamification” as the practice of treating fatality counts in attacks as “scores” that mark “achievements” or “high kill counts.” To make matters worse, Homeland Security Today, a news and analysis publication focused on homeland security issues, contends that gamification will likely “continue to inspire future plots.”
Here’s the unfortunate truth:
A growing number of white men feel lost in this world and turn to social media, said Dr. Yong Jin Park, whose research focuses on social media and misinformation. Many sites give them the information they seek, he added. But Park, a professor in the Communication, Culture and Media Studies Department at the Cathy Hughes School of Communications, Howard University, told Marcom Weekly that for all the discussion of social media’s impact on mass violence, the issue of gun control is raised but nothing much happens.
According to police, Payton S. Gendron, 18, who was arrested and charged this past weekend after driving 200 miles from his home in Conklin, N.Y., said he was inspired by the 2019 Christchurch massacre in New Zealand, in which a gunman killed 51 people, most of them Muslims, during prayers at two mosques.
Although Twitch.tv, which is owned by Amazon, immediately took down Gendron’s video, according to the New York Times, a clip from the original video was shared on Meta’s Facebook and on Twitter after the shooting. Twitch, founded in 2011, is a platform that allows gamers to livestream and watch their favorite games in real time. In a statement reported by CNN, Twitch said the user “has been indefinitely suspended from our service, and we are taking all appropriate action, including monitoring for any accounts rebroadcasting this content.”
Gendron is also reported to have posted a 180-page manifesto on Google Docs, though it is not known how widely he might have shared it.
Experts say live broadcasts and the spread of hate speech on social media platforms have played a significant role in the rise of racial hate crimes. The FBI said such incidents have risen to their highest level in 12 years, with an increase in assaults on Black and Asian Americans, and that these domestic terrorists have killed more people in the U.S. than any other group, including the 9/11 terrorists. So far this year, 198 mass shootings have been carried out in the U.S. The FBI is worried that these incidents encourage more copycat mass killings.
Mutale Nkonde, an artificial intelligence policy analyst and founding CEO of AI for the People, told Marcom Weekly, “The internet provides a way for white supremacists to share ideas. They enthusiastically took this up during the early days of the internet, as noted by Jessie Daniels in her book ‘Cyber Racism: White Supremacy Online and the New Attack on Civil Rights.’ She found white supremacist communities bought the URL www.MartinLutherKing.org in the 1990s and used it to defame the civil rights leader.
“This shooter was influenced by Dylann Roof, who spent hours watching YouTube videos promoting disinformation about white-on-Black crime. This does not exist but incited enough anger for him to execute a group of African American people during Bible study at Mother Emanuel Church.
“The video of the Buffalo shooting had been shared 1,000 times on Twitter before being banned, and so social media companies do not know how to respond. We cannot allow for less regulation on online speech as (Elon) Musk is promising because Black lives are at risk.” Musk has been in discussions about a deal to buy Twitter.
It’s not the first time in recent years that a “lone wolf” domestic terrorist with a social media account has threatened minoritized groups. In 2015 in Charleston, S.C., nine worshipers were murdered during Bible study at Mother Emanuel Church after Dylann Roof, the convicted killer, promoted his racist ideas on his website. At an El Paso, Tex., Walmart in 2019, a man killed 23 people, most of them Latino, after posting an anti-immigrant manifesto on 8chan, an online message board that promotes racist and anti-Semitic conspiracy theories. And in 2018, 11 congregants were killed in the Tree of Life synagogue massacre in Pittsburgh, where the killer used the extremist social media site Gab to post messages just before he acted.
As in Buffalo, each of these mass shootings has three things in common: the role of social media in spreading disinformation, conspiracy theories and the horrific acts themselves; a rise in white supremacist extremism; and access to weapons. The spread of misinformation on social media also has been implicated in the violent Jan. 6 insurrection attempt at the U.S. Capitol.
The role of social media in mass shootings and other deadly events has raised questions about the platforms’ responsibility for permitting violent and hateful content to proliferate. Researchers and experts consistently find that negative content drives increased online engagement. According to Scientific American, manipulated algorithms, along with the inherent risks artificial intelligence poses to privacy, bias, inequality, safety and security, allow bad actors to exploit cognitive vulnerabilities in some users. Likewise, search engines can steer users to sites that confirm their suspicions, and social media makes it easy to connect like-minded people.
Automated media accounts, aka bots, and trolls that mimic real people can easily target the vulnerable by overloading them with coercive and exploitative messages. But some researchers say the problem is exacerbated by the deplatforming of far-right users from mainstream social media sites like Twitter, Instagram and Facebook, which sends them to alternative platforms like Gab, Parler and Telegram, an app that allows end-to-end encrypted chats, where white supremacists and neo-Nazis use toned-down, coded language to attract a wider audience.
Gab, for example, is an alt-right social media site known for conspiracy theories; it blends features of Twitter, Reddit and Facebook with a dash of racism, anti-Semitism and misogyny. Parler, founded in 2018, is an alt-right site associated with Donald Trump supporters and implicated in the Jan. 6 insurrection attempt. It also dabbles in conspiracy theories pushed by QAnon, a political conspiracy movement.
What these sites have in common is light-touch content moderation, content laced with racism, anti-Semitism and misogyny, and aggrieved white men looking for action. In these places, users are free to discuss racist tropes like the “Great Replacement,” a conspiracy theory embraced by white supremacists that is increasingly amplified by media outlets such as Fox News and by conservative lawmakers such as Steve King and Matt Gaetz. The theory rails against a “Hispanic invasion” of the United States and was invoked at the white supremacist Unite the Right rally in August 2017, where torch-bearing marchers on the campus of the University of Virginia chanted anti-Semitic slogans. It was also cited in the Buffalo shooter’s manifesto. According to the ADL, it builds on supremacist conspiracy theories that white people are being demographically and culturally “replaced” by a more diverse population.
As for the alleged gunman in the Buffalo assault, Gendron also used Discord, a chat platform with channels dedicated to gun culture, to post his plans for an attack and to find tips on protecting himself with armored gear, just another chilling reminder of the threats posed by social media.