LowEndBox - Cheap VPS, Hosting and Dedicated Server Deals

The Invisible Cost of a Clean Internet: Burning Out Poor People's Souls to Moderate Online Content

If you’ve ever reported something on a major social media platform, you may wonder what happens to that report.  Some of it is handled by AI, but quite a lot is handled by humans, who also review material flagged by automated scans of Google Drive, iCloud, and other platforms.

I’ve seen pictures of atrocities, photos from the Holocaust, gruesome car wreck pictures, animals that were abused, etc. and these appalling visuals tend to stick with you.  I mean, I still remember what goatse looks like, and that was not even abuse.

Now imagine it was your job to look at violent, disturbing imagery and CSAM all day long.  You’re watching beheadings, animal abuse, torture videos, and every other depravity…endlessly, for up to 12 hours.

In theory, AI will do this work some day, but today there are over 100,000 workers who do it, most of them contract workers in poor countries.  Lured with promises of “tech jobs” and placed in relatively plush corporate campuses, they sit in front of screens for hours on end as reported images appear, and they’re asked to confirm that the images are, in fact, vile.

I heard a reference in an interview to the Data Workers Inquiry, an advocacy group that seeks to bring the stories of rank-and-file content moderators to light.  They detail what I think are two main areas of concern.

First, there is the typical contract-labor experience in poor countries.  Because the employer is offering steady wages in a depressed economy, they have enormous leverage over employees.  They can work them hard, refuse any accommodations, and dispose of anyone who pushes back because there’s a long queue of impoverished people competing for these jobs.  There are few social protections, abuse of employees is common, and these contracts can be terminated on short notice.

And second, the work itself is emotionally horrifying.  The people who take these jobs are viewing the absolute worst content and filling their minds – for 8 to 12 hours a day – with a constant stream of repugnant images.  There has never been anything like this in human history, and it’s not something human psychology can handle.

Police, EMTs, firefighters, crime scene photographers, etc. often see horrifying things.  But they experience balancing events these moderators don’t.  They catch criminals, save lives, and often get closure.  They see positive outcomes as well as bad ones.  They have tight-knit communities and support around them.  They get to see the entire picture of what’s going on, in full context.  Moderators just see snapshots of peak horror.  Your typical police officer is not seeing tens of thousands of traumatizing images every single work day.

Studies show moderators can develop post-traumatic stress disorder from repeated exposure, and symptoms can include intrusive thoughts, hyper-vigilance, nightmares, and emotional numbing.  That’s quite a price to pay for your job.  According to a TIME magazine article published last week, more than 50% of moderators met criteria for clinical depression.  The same article mentions that 28% turn to substance abuse as a coping mechanism.

Clearly, content moderation is needed, but there needs to be a better way to handle this.  AI is the long-term solution (in theory)…and of course, the meta-solution is to have less of this activity in the world, but the human heart has resisted improvement for at least a couple million years.  The people who do this work should be treated better, with proper psychological support and limited exposure, instead of working in cubicle sweatshops.

The next time you log in to Instagram or Reddit, remember that there’s a human cost for your clean feed.

