
Content Moderation: A Factory of Mental Disorders

by ace

One of the most unpleasant jobs in the technology industry is that of the content moderator. Being bombarded daily with conspiracy theories, videos of bizarre sexual fetishes, and even real footage of murders and mass killings can have devastating effects on these moderators’ mental health.

And while companies like Facebook, Google, and Twitter insist that they provide the protections these people need to have a healthy working life, the reality may not be as safe as these companies try to sell it.

A report published in February by The Verge looks into the lives of some of these moderators who worked for Facebook. They describe an environment whose pressure comes not only from the traumatic images and videos they are exposed to, but also from the way the company manages its employees, leaving them paranoid and with symptoms of PTSD (Post-Traumatic Stress Disorder, a condition most commonly associated with war veterans and survivors of torture and genocide) and, once they leave the company, without professional psychological assistance.

The hiring model itself already shows how these moderators are treated as “second-tier” employees.


Unlike every other part of Facebook, the vast majority of the 15,000 moderators the company relies on are not directly employed by it but outsourced: Facebook hires another company to provide the service, and it is that company that employs the people who actually do the work.

This is a model widely used by technology companies for hardware assembly (Apple, for example, hires companies like Foxconn to handle the entire assembly of its iPhones) and for call centers, but it has also become very common for social networks’ content moderation operations.

And there is a straightforward reason to use a third party’s services: operating costs.

While the average Facebook employee earns the equivalent of about $20,000 per month in salary and benefits, a content moderator hired by Cognizant (the contractor whose Arizona site moderates content for Facebook) receives, on top of benefits, about $2,400 per month.

This kind of contrast on the payroll is what allows Facebook to invest more and more in hiring moderators while keeping its profit margins high: in the last quarter of 2018 the company posted a profit of $6.9 billion, a 61% increase over the same period of the previous year.

Even though these employees do not technically work for Facebook, the contracting companies strive to recreate the friendly atmosphere of Silicon Valley social network offices, with colorful buildings, bright corridors and rooms covered in graffiti and positive imagery, views of the outdoors, and various relaxation activities for employees, such as yoga sessions and an event inspired by the movie Mean Girls, in which the entire staff wears pink on Wednesdays.

In addition, the company provides a special room where nursing mothers can breastfeed their children or pump milk for storage.

But even though there is an effort to resemble Facebook’s workplaces, that effort is only skin-deep: the way work is organized in these places is still much closer to that of call-center companies, where employees are crammed into rooms holding as many people as possible, given daily, weekly, and monthly targets to meet, and have their breaks and bathroom visits timed by supervisors.

According to former Cognizant employees, the company gave each of them a 30-minute lunch break and another two 15-minute breaks during office hours.

These breaks must cover everything the staff needs to do away from the desk: going to the restroom (toilet visits outside the breaks are not allowed), phoning friends and family, or eating something.

And even where the company tried to help, workers found a workaround. In addition to the three rest breaks, the company grants a further nine minutes per day, which can be taken at any time, intended for employees already experiencing symptoms of severe trauma such as PTSD.

Many of them simply claim this “bonus” break at times when the bathroom queue is shorter, which is not only easier for them but also reduces the number of people in line during the scheduled breaks.

It is clear that these people’s work is unlike most others even before it begins, from the rules on what they may bring into the room.

Besides cell phones and other electronic devices, which must be stored in lockers outside the room where they work, moderators are also barred from bringing in any paper, pencil, pen, or other object that could be used to take notes, a ban that extends even to candy and gum.

The ban is a data-protection measure for Facebook’s users, meant to prevent moderators from writing down personal data of the people whose posts they review.

Nor is any packaging allowed whose contents cannot be seen: water bottles and pots of hand cream, for example, are permitted only if their packaging is transparent.

Inside the room, people do not have their own desks; since the whole work system runs in a cloud environment, employees sit wherever there is a free seat, log into their accounts, and begin working.

The system is very simple: a post that has been flagged as problematic (either by a user report or by Facebook’s automatic detection) appears on the moderator’s screen, and the moderator has about 30 seconds to decide whether the post really is inappropriate for the social network and should be removed, or whether, however controversial, it still fits within the platform’s rules on freedom of expression.

The rules on what is or is not considered inappropriate come mostly from two documents: Facebook’s Rules of Conduct (in addition to the public version that any user can access, moderators have a more detailed version that elaborates on points the public document leaves ambiguous) and a “frequently asked questions” document with examples of how to correctly classify some tricky cases.

These manuals set out some rules that are very objective (for example, no video in which a person is stabbed can be posted on the platform) and others that require interpretation or hinge on the notion of “protected groups.”

Thus, in addition to the rules, moderators also follow guidance from two other channels: discussions among themselves, in which they reach a consensus on whether a given piece of content should be allowed or removed, and posts on Workplace (Facebook’s corporate social network, where employees communicate with one another) issued by management and QA managers.

The problem is that Facebook’s insistence on algorithms ends up disrupting this work. Just like the main social network, Workplace has a timeline that does not necessarily follow chronological order; it uses an algorithm to decide which content is most engaging and should appear first.

This creates confusion during catastrophic events, such as school shootings. Because things are unfolding in real time, there is no pre-established rule for them, and the guidance moderators should apply keeps being updated through posts on Workplace.

But because of the algorithm that orders the timeline, the post that appears at the top is not always the most up-to-date position on the case.


And “accuracy” is the key word of the entire business model of these moderation companies.

According to the rules stipulated by Facebook, contractors must maintain a moderation accuracy of at least 95%: that is, for every 20 posts evaluated, a moderator may get the call wrong at most once.

This number is virtually impossible to achieve in practice, and Cognizant’s accuracy usually hovers around 90%, sometimes a little more, sometimes a little less.

Consider that moderators are expected to review about 400 posts per day and 1,500 per week, and that in those 30 seconds they must recall the rules in Facebook’s documents as well as whether their superiors have posted an update to any of them on Workplace.

This accuracy is calculated by another Cognizant employee who works in quality assurance. Each week, this reviewer receives about 50 or 60 posts already handled by each moderator and re-reviews them, deciding independently whether or not they should have been removed.

If the QA reviewer’s decision matches the moderator’s, that review counts as a success; if it differs, it counts as an error.
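
To make the arithmetic behind this quality check concrete, here is a minimal sketch in Python of how an agreement-based accuracy score like the one described could be computed. It is an illustration only: the function name and the sample numbers are assumptions for this example, not Cognizant’s or Facebook’s actual tooling.

    # Hypothetical illustration of the agreement-based accuracy check described above.
    # Each pair holds the moderator's decision and the QA reviewer's decision
    # for the same post: "remove" or "keep".

    def accuracy(decisions):
        """Return the fraction of posts on which moderator and QA reviewer agree."""
        if not decisions:
            return 0.0
        agreements = sum(1 for moderator, qa in decisions if moderator == qa)
        return agreements / len(decisions)

    # A weekly sample of 60 re-reviewed posts with 3 disagreements: 57/60 = 95%.
    sample = [("remove", "remove")] * 40 + [("keep", "keep")] * 17 + [("remove", "keep")] * 3
    score = accuracy(sample)
    print(f"accuracy: {score:.1%}")                                   # accuracy: 95.0%
    print("meets the 95% target" if score >= 0.95 else "below the 95% target")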

This creates a system in which moderators must not only keep thousands of rules in mind and review as much content as possible in under a minute, but also accept that their work is at the mercy of someone who, weighing the same post, may interpret it differently and penalize them for it.

All of these problems might already be enough to leave employees with workplace-induced psychological trauma, as seen in some former call-center and debt-collection workers, but when you add a job that consists of spending all day watching the worst humanity has to offer, the danger rises to a whole new level.

The constant bombardment of this type of content causes many employees to develop “compassion fatigue” (also known as “secondary traumatic stress”), a condition with symptoms similar to PTSD that can arise from witnessing trauma suffered by others, and which is very common among psychologists, psychiatrists, and social workers.

People suffering from this condition may experience symptoms such as heightened anxiety, loss of sleep, feelings of loneliness, and dissociation (a state in which the mind shuts certain thoughts and emotions out of conscious awareness because they are too shocking).


This leads these people, in pursuit of a brief moment of pleasure and relief, to excesses not seen in most other work environments.

According to several former employees, it is quite common for people to drink alcohol during breaks, use drugs such as marijuana and cocaine, or even have sex on the premises.

Among the places staff have used for sex are the bathroom stalls, the building’s emergency stairwell, and even the room set aside for nursing mothers.

Another striking and widespread pattern among moderators is that they have so much contact with conspiracy theories that some come to believe them.

In a way, Facebook knows how traumatizing this work can be, and at each of its contractors’ sites there is a psychologist who helps people cope when their emotions become unbearable; according to various accounts, breakdowns during working hours are common.

But while these professionals do exist, the stories former employees tell point to worrying failures in their work: they are not proactive in identifying people who are “getting lost” at work and referring them to psychological treatment (which, if needed, is also paid for by Cognizant while the person still works for the company); they act only when the employee himself realizes something is wrong and says he needs help.

