socially - content moderation

At some stage of our lives online, we have all called for more moderation on social media, and for more to be done about hate speech and racism.  But have we ever thought about how that moderation actually happens?

When you see an image or video on Facebook that you deem inappropriate, you click the report button and press submit.  Job done, "the internet takes care of the rest".  What really happens when you submit a report is that a person, a content moderator, gets a notification at their desk and has to view the reported content to decide whether or not it needs to be removed.

Sounds like a great job, scrolling through Facebook videos all day?  Well, you wouldn't be saying that once you sit and think about what the job actually entails.  These content moderators HAVE to view each video that gets sent their way, and that video could contain anything, literally anything.

Speaking to a Dáil committee yesterday, Isabella Plunkett, a Facebook content moderator, addressed what exactly she and her colleagues face on a day-to-day basis:

"The graphic violence, the child stuff, the exploitation and the suicides - people working from home don't get that, the burden is put on us."

Isabella goes on to say that she can face up to 100 of these reported videos each and every day.  On taking the job, content moderators are told to sign an NDA (non-disclosure agreement) preventing them from talking about specific content they have seen during their time working, and understandably it takes its toll on the mental health of all of those who have to view it.  The idea is to eventually have content moderation fully automated, but between now and then there needs to be human involvement to "train" the system.

This training comes at a personal cost to those who view this content each and every day.  "I'm now seeing the content I view in work in my dreams. I remember it, I experience it again and it is horrible."

We owe a lot of thanks to people like Isabella and everyone in content moderation; without them, Facebook would be a much scarier place for us and our children.