Opinions

A Job Looking at Abuse, Suicide, and Murder

The career of a social media moderator poses a mental health threat that makes it unsustainable.


Walking down the street, we don’t normally expect to see people eating Tide PODS or bleached strawberries. In newspapers, we don’t expect to see “scientists” telling us that the Earth is flat. However, on social media, these situations seem to be the norm. Radical, extreme content spreads like wildfire, and real people get hurt in the process. Fortunately, as people recognize the problem, they have demanded safe content—content that, at the very least, does not harm people. Unfortunately, the solution that many social media platforms have introduced is riddled with its own problems that make it infeasible as a permanent implementation.

Social media moderators are paid to review content on a platform and remove it when they deem removal appropriate. Even for people familiar with how graphic online content can be, the material these moderators view for eight hours every day is extreme. Animal abuse, cannibalism, suicide attempts, and child exploitation circulate on these platforms, and the general public remains unaware of it because moderators remove this disturbing content before it reaches a larger audience.

Unsurprisingly, these moderators face serious mental health problems. Former Facebook moderator Isabella Plunkett said that after two years of watching horrifying content, she developed anxiety and required medication. Of 14 current and former moderators in Manila, one said that he had attempted suicide because of his trauma.

Though social media companies boast of wellness teams being available to moderators, the reality is often far from ideal. Plunkett stated that Facebook's wellness coaches are not trained psychiatrists, and she did not feel properly supported by them. Subcontracted staff were also limited to 1.5 hours of "wellness" a week. Ashley Velez, a contractor at TikTok, said that a 30-minute meeting with a counselor felt rushed and superficial because counselors are responsible for so many people that they cannot be attentive to each individual. In the Philippines, where many companies outsource their moderation work, moderators were limited to one session with their counselor each month, one every six months, or sometimes none at all.

It is not easy for moderators to seek outside help from their loved ones either, as they are required to sign non-disclosure agreements that prohibit them from sharing information about the content they view with anyone outside their workplace.

Moderators are further encouraged to neglect their wellbeing by working long hours without breaks to meet performance targets, which leaves them desensitized. Velez and another former TikTok moderator, Reece Young, said they were expected to judge whether a video followed guidelines in under 25 seconds, with 80 percent accuracy. They worked 12 hours each day, with two 15-minute breaks and an hour for lunch.

Social media moderators should not be expected to work under such harsh circumstances. Companies need to realize that exposure to graphic content can have a severely damaging effect and that mental health risks are just as real as physical ones.

Since it is clear that the mental health support currently offered isn't enough, companies should restrict working hours and offer more breaks. Wellness teams should be large enough relative to the number of moderators and should be composed of licensed psychiatrists and therapists.

Ultimately, companies should focus on eliminating the risk for human moderators entirely. As important as maintaining proper content guidelines is, it should not be done at the expense of putting people in active danger of mental disorders, especially when an alternative is available. AI technology should replace the function of human moderators in the future, since a career as a social media moderator poses a mental health threat that is not sustainable.