Opinions

How Free is too Free?

Increasing violence today is beginning to highlight a foundational issue with digital media.

Reading Time: 4 minutes

In light of recent events, a couple of weeks ago I saw fit to download X for the first time since getting a phone. I did so under the presumption that it was simply an online media platform meant for sharing ideas and news, bearing in mind, of course, the negative things I had heard since Elon Musk purchased the platform.


Even so, it’s hard to describe how dismayed I was when I learned the actual nature of the outlet. In late August, Iryna Zarutska, a white Ukrainian refugee, was fatally stabbed on a light rail train by Decarlos Brown, a schizophrenic black man. Immediately after I created an account, my For You page was flooded with pictures of Zarutska and comments about what happened. I was appalled by the blatant racism present in these posts, with people saying things ranging from “he did it because he was black” to even suggesting that the judge who allowed him to walk free from his last felony charge did so because she herself was black.

All of this is an unfortunate reflection of an utter lack of content moderation. Before Musk purchased X, it was a social media platform held by public shareholders, who had significant power over how it was operated. They voted for members of a Board of Directors, which was responsible for overseeing the company and making corporate decisions, and they played a role in more specific matters, such as major sale and acquisition decisions or challenging management to make changes. These checks and balances produced a nuanced outlook on how the company should be run, reducing the amount of extremist racism, homophobia, sexism, and hate speech present on the forum. In the months following his purchase, however, Musk fired over 3,000 moderators, and UC Berkeley News found that hate speech as a whole rose by about 50 percent. Musk has done nothing whatsoever to counter this in the three years since.


One of the worst parts of this issue is the amount of accessible graphic imagery. It was scarily easy for me to find a zoomed-in video of the Charlie Kirk assassination or the full video of the stabbing and its aftermath. Unavoidably, easy access to these violent videos comes with a torrent of people spreading conspiracies about what actually happened. I found comments claiming that Kirk wasn’t actually shot and that you could tell the blood was fake if you zoomed in far enough. Others said that the man in the stabbing was a hallucination and that none of what was captured on the security cameras was real. More devastatingly, though, repeated exposure to this kind of extreme graphic imagery desensitizes the public to physical and political brutality. Not only does this diminish the impact of violent incidents, but it sets a dangerous precedent for the future of the digital age: one where privacy is pushed to the back burner and media outlets become gruesome meat grinders of inappropriately violent content, serving an American audience with little compassion for suffering and human trauma.


The easy access to violence online has created a general attitude of desensitization and disconnection from human pain, an issue that grows more prevalent with each cruel and violent post. This leads to the conclusion that media platforms and news sources shouldn’t be individually owned or purchasable by a small group of like-minded people. When a forum that’s meant to be a place where people can freely express their opinions is run by one person (who presumably has their own opinions) or a group of biased people, it creates a breeding ground for hatred and prejudice, much as we’ve seen in action with X. Corporate leadership that values diversity of opinion and nuance helps counteract this problem, bringing a broader range of perspectives to how the platform is run.


All this being said, the only thing more dangerous than a complete lack of moderation is government-mandated moderation. As bad as private ownership can be, social media platforms need to remain independent and uncontrolled—purely on the basis of the principles of freedom of speech. The Constitution protects an at least partially unregulated marketplace of ideas, and government control eradicates that freedom. It’s easy to argue that, by this logic, media censorship is also unconstitutional, but the First Amendment isn’t absolute; it doesn’t protect every word. “Fighting words,” or words that incite violence, “true threats,” and “offensive harassment,” many of which can be found in day-to-day media posts, are not protected by the First Amendment. This is the content social media moderators are meant to remove while still allowing everyone to express their opinions in a compassionate and kind manner.


While it’s certainly true that other media organizations have similar issues, primarily forums like Reddit, Quora, and other more informal outlets, it’s important to note that even they have some moderation. The video of Kirk’s shooting was accessible on Reddit for only about an hour before content moderators took it down, labeling it unnecessarily graphic. X, on the other hand, has left the video up under the premise that it is “graphic but newsworthy.”


As part of the generation that will inherit this media and social climate, I will acknowledge that, of course, perfection is impossible. That said, it’s imperative that the world work towards creating decentralized media outlets that actually facilitate freedom of speech and expression in an appropriate manner. As the digital age develops further, these kinds of incidents will only become more common, which is why making our digital platforms as safe and welcoming as possible for everyone is more important now than ever.