Opinion: It’s time to hold social media platforms accountable: we don’t need another TikTok video inviting or suggesting violence in schools
TikTok: a place where we can watch cooking videos and silly dance moves — and, apparently, where people can post unverified threats of school violence. Last month, dozens of school districts across the country announced closures amid a wave of anonymous TikTok videos warning of shootings and bomb threats.
This is incredibly dangerous for our children, our schools, our parents, our teachers and our communities. Now more than ever, it’s time for social media companies like TikTok to be held accountable for the content posted on their sites.
TikTok has become one of the most popular social media platforms of this generation. Launched in 2016 by Chinese tech company ByteDance, it allows users to create, watch, and share short videos. At the end of 2021, the app had around 1 billion active users — roughly 1 in 7 people worldwide. That’s a lot of people.
Because the platform is so huge, it can be difficult to regulate what gets posted. Currently, TikTok has a U.S.-based safety team, and all uploaded content passes through automated screening for policy violations. Flagged content is then reviewed by a human member of the app’s safety team before being posted. More recently, TikTok has used software that can automatically remove videos that may violate its guidelines.
It is clear that this software is not enough.
School districts from Texas to Michigan issued warnings, canceled classes and increased security presence in mid-December due to viral TikTok videos warning of impending bombings or shootings. Even the Austin ISD Police Department stepped up security and monitored the national social media trend in late December in a bid to prevent potential harm. Clearly, threats of school violence have slipped past the safety software — and this is not the first time.
TikTok and other social media giants like Facebook and Twitter have come under fire for spreading harmful content among children and young adults. In 2021, teachers had to ask TikTok to step in when a challenge to “slap the teacher” went viral. Two years ago, thousands of people watched a live broadcast of a mass shooting streamed on Facebook, which quickly spread across the internet and has been reposted countless times.
In response to this dangerous and jarring content, the social media giants claim they are continually tightening their safety measures. I am not so sure.
A spokesperson for TikTok responded to the alleged threats of school violence, tweeting that “… we are working with law enforcement to review warnings about potential violence in schools, even though we have found no evidence of such threats originating or spreading through TikTok.”
These are empty and broken promises. As dangerous content continues to find its way past the so-called safety measures in place at the social media behemoths, we will continue to see threats against schools arise. We will continue to see hate speech, incitement to violence and other toxic content affecting our children and endangering our communities.
As such, it’s time for the big social media companies to be held accountable.
Currently, under Section 230 of the Communications Decency Act, platforms like TikTok, Twitter, and Facebook are not treated as publishers and are not legally responsible for the content their users post. The law, passed in 1996, was designed to shield websites from lawsuits when a user posts something illegal. President Joe Biden has suggested revoking Section 230 entirely, which would be a good start. The administration could act by removing that legal immunity from social media giants, especially those that refuse to be proactive in removing dangerous content.
Enough is enough. It’s time to pass legislation making social media networks liable for harm caused by the false information, harmful content and incitement to violence shared on their platforms.
We don’t need another video inviting or suggesting school violence. Time is running out, TikTok.
Annika Olson is Deputy Director of Policy Research at the Institute for Urban Policy Research & Analysis. Annika is passionate about using research and legislative analysis to inform policies that impact the lives of vulnerable members of our community. She received a double MA in Psychology and Public Policy from Georgetown University and her BA in Psychology from Commonwealth Honors College at UMass Amherst. Annika was previously an AmeriCorps member working with at-risk youth in rural New Mexico and Austin.