Combating the Dangers of PTSD in Content Moderation

If your job requires consuming horrific and disturbing media, whether in journalism, human rights activism, or content moderation, then you need to be aware of the very real dangers of secondary trauma. Workers who must view an unending stream of violent and traumatic imagery put their mental health at risk every day. Fortunately, there are several options available to mitigate these dangers.

Post-traumatic stress disorder (PTSD) is a persistent condition of emotional and mental distress, generally caused by experiencing a terrifying event and marked by sleep disturbance, flashbacks, and intrusive thoughts. While it was once thought to threaten only those with firsthand experience of such events, newer research suggests that even viewing recorded images of them can produce the same reaction.

Workers can perform risk management themselves. Examples include: using a smaller image window when viewing disturbing content; taking comprehensive notes on the material so that it does not need to be viewed multiple times; and keeping “distraction files” of beautiful or appealing images (think pictures of cute animals or vacation destinations) to view during breaks. All of these viewer-based methods can work, but experts say that the best aid comes from the top.

Managers of these workers must understand the nature and dangers of this kind of content consumption and design an overall workflow that supports mental health. First and foremost is an understanding and acceptance of the importance of mental health as a concept. Many employers are quick to accommodate physical health concerns but fall short when it comes to mental and emotional health. Second, it is imperative for management to create a work environment tailored to this type of work; this means scheduling regular breaks and preparing a comfortable, aesthetically pleasing workspace to combat these stresses. And finally, it is vital to provide regular mental health screening for employees. Don’t simply wait for someone on your staff to ask for counseling; taking the lead on mental health sets the expectation that employees are allowed, and even encouraged, to seek out these services.

Organizations in journalism and human rights activism have been working to implement these kinds of steps. Content moderation, however, faces more complicated hurdles. Social media platforms receive far greater volumes of potentially disturbing images, which requires many more moderators. And while many of these tech companies acknowledge the risks associated with content moderation, their first response to any content-related controversy is usually a promise to hire even more moderators.

Additionally, because most of these companies use contracted moderators, there is constant pressure to reduce overhead in order to remain competitive. Cutting overhead often cuts the safeguards for mental health. Many corporations have begun integrating image moderation APIs into their digital platforms to reduce the overall number of images that need human review, but this can’t solve everything. To truly make these necessary changes, the entire industry needs to acknowledge the danger and work together to make a difference.
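As a rough illustration of that pre-filtering idea, the sketch below triages each uploaded image through a moderation API and routes only the ambiguous cases to a human review queue. The endpoint URL, the violation_score response field, and the thresholds are hypothetical placeholders, not any particular vendor's interface; real services differ, but the pattern of letting the API auto-resolve clear-cut cases is the same.

```python
import requests

# Hypothetical moderation endpoint and thresholds; real services and
# response formats will differ, but the triage pattern is the same.
MODERATION_URL = "https://api.example-moderation.com/v1/check"
AUTO_BLOCK_THRESHOLD = 0.95   # confident enough to remove without review
AUTO_ALLOW_THRESHOLD = 0.05   # confident enough to publish without review


def triage_image(image_bytes: bytes, api_key: str) -> str:
    """Return 'block', 'allow', or 'human_review' for one uploaded image."""
    response = requests.post(
        MODERATION_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        files={"image": image_bytes},
        timeout=10,
    )
    response.raise_for_status()
    score = response.json()["violation_score"]  # hypothetical field, 0.0 to 1.0

    if score >= AUTO_BLOCK_THRESHOLD:
        return "block"          # no human ever has to see this image
    if score <= AUTO_ALLOW_THRESHOLD:
        return "allow"          # clearly benign, skip the review queue
    return "human_review"       # only ambiguous cases reach a moderator
```

The design goal is simply to shrink the queue: every image the API can resolve on its own is one fewer traumatic exposure for a human moderator.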
