Facebook and Its Reporting System

Posted by Arnav Gade.

If you have ever used social media like Instagram or Facebook, you may be familiar with the reporting system. Some pictures and videos are simply too disturbing for people to view, so these platforms employ moderators who review reports and confirm whether the content is actually that bad. If the content is merely sensitive rather than in violation of any guidelines, the platform may blur the image and ask the user whether they want to continue viewing it. Although this system seems like a reasonable idea, it has multiple flaws. The current social media dynamic is built around personalized content: For You pages and timelines use algorithms to surface content you want to see and suggest people you might want to follow. This personalization means people often treat these apps as worlds of their own, and they get sucked into that world. So when they report an image or video, they assume the process is automated and never think about the human on the other end. The corporations feel much the same way: they treat moderators as a necessary evil, like janitors who have to sift through all the trash and keep the app clean.

The workers in the article sued Facebook over their working conditions, in what is the first such lawsuit against Facebook brought from outside the U.S. They are seeking a $1.6 billion fund to compensate for poor working conditions, insufficient mental health benefits, and low pay. This case could shake up the entire company and the way we view content moderation. Some of the workers in Kenya say they took the job to escape war back home, yet they have to spend all day viewing war footage and clips that keep them mentally trapped there. Facebook has also taken away their contracts while the court order is pending, and the workers allege that Facebook is ignoring orders to reinstate them.

The workers understand that settling is very common in the U.S., but they want people across the world to see what they are going through. They had originally signed up for consulting work, only to be exploited in a low-income area and dumped once they started to complain. The long-lasting psychological toll these workers face is being ignored not only in Kenya but across the globe. Facebook originally created these moderation hubs in response to the hate speech circulating in Myanmar. The system in place is faulty, and Facebook skirts around this by employing cheap labor overseas, workers it can view as expendable. The company does not really care what happens to the workers who are left mentally scarred; it can simply find more desperate people to use, or throw money in their faces to get them to stop talking.

Arnav is a FinTech major at the Stillman School of Business, Seton Hall University, Class of 2025.

https://www.newser.com/story/337097/kenyan-facebook-content-moderators-job-is-torture.html