Opinion: It’s Time to Hold Social Media Platforms Accountable
We do not need another TikTok video inviting or suggesting school violence
TikTok: a place where we can watch cooking videos and silly dance moves, and, apparently, where people can post non-credible threats of school violence. Last month, dozens of school districts around the country announced they were closing because of a wave of anonymous TikTok videos spreading shooting and bomb threats.
This is incredibly dangerous for our kids, schools, parents, teachers, and communities. Now more than ever, it is time to hold social media corporations like TikTok accountable for the content posted to their platforms.
TikTok has become one of the most popular social media platforms of this generation. It was launched in 2016 by the Chinese technology company ByteDance and allows users to create, watch, and share short videos. As of late 2021, the app had roughly 1 billion active users. Yes, as in roughly one in eight people in the entire world use TikTok. That is quite a lot of people.
Because the platform is so enormous, it can be tough to regulate what gets posted. Currently, TikTok has a safety team based in the U.S., and any content that is uploaded passes through an automated system that scans for policy violations. The content is then reviewed by a human member of the app's safety team before it is posted. More recently, TikTok has begun using software that can automatically remove videos that appear to violate its guidelines.
Clearly that software is not sufficient.
In mid-December, school districts from Texas to Michigan issued warnings, canceled classes, and increased security presence because of viral TikTok videos warning of impending bombings or shootings. Even the Austin ISD Police Department beefed up security and monitored the national social media trend in late December in an effort to prevent potential harm. Clearly, threats of school violence passed through the safety software, and this is not the first time dangerous content has slipped through.
TikTok, and other social media giants like Facebook and Twitter, have been criticized for allowing harmful content to spread among children and young adults. In 2021, teachers had to ask TikTok to intervene when a challenge to "slap the teacher" went viral. In 2019, some 4,000 people viewed a livestream of a mass shooting posted on Facebook; the video spread rapidly across the internet and was reposted countless times.
In response to these dangerous and jarring pieces of content, social media giants claim they are beefing up their safety measures. I'm not so sure.
A TikTok spokesperson responded to the alleged school violence threats, tweeting that "... we're working with law enforcement to look into warnings about potential violence at schools even though we have not found evidence of such threats originating or spreading via TikTok."
These are empty, broken promises. As dangerous content continues to seep through the so-called safety measures in place at social media behemoths, we will continue to see incidents like the school threats arise. We will continue to see hate speech, incitements to violence, and other toxic content affect our children and endanger our communities.
As such, it is time to hold massive social media corporations accountable.
Currently, under Section 230 of the Communications Decency Act, platforms like TikTok, Twitter, and Facebook are not treated as publishers and are generally not legally responsible for the content their users post. The law, enacted in 1996, was designed to shield websites from lawsuits over illegal material posted by users. President Joe Biden has suggested revoking Section 230 entirely, which would be a good start. The administration could strip that legal immunity from social media giants, especially those that refuse to be proactive in removing dangerous content.
Enough is enough. It is time for legislation that says social media networks can be held liable for damage caused by false information, harmful content, and incitements to violence that are shared on their platforms.
We do not need another video inviting or suggesting school violence. The clock is ticking, TikTok.
Annika Olson is the assistant director of policy research for the Institute for Urban Policy Research & Analysis. Annika is passionate about using research and legislative analysis to inform policies that impact the lives of vulnerable members of our community. She received a dual master’s degree in psychology and public policy at Georgetown University and her bachelor’s in psychology from the Commonwealth Honors College at UMass Amherst. Annika previously served as an AmeriCorps member with at-risk youth in rural New Mexico and Austin.