‘It scars you for life’: Workers sue Meta claiming viewing brutal videos caused psychological trauma::More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.
Fuck that job.
I’ve heard this is also the case for civilians and officers working for police departments who are responsible for handling evidence related to child abuse. Takes a lot of psychological support, which I’m not sure can ever be enough.
Many places like that have protocols for how long your shifts can be and how long you can do it, with constant psych support while you do it in order to reduce and mitigate the impact of the material. Meta may have been pushing their workers too hard and cutting corners.
How much violent content is even out there that you couldn’t trivially block just by keeping a list of content IDs? How much of it would you need to watch in full to pass judgement?
I seriously don’t get why this is a problem in the first place. Every tiny nip-slip gets you instantly blocked on Facebook and Instagram. They always default to “block” without any closer inspection. They are content moderators after all, not criminal investigators, so there shouldn’t be a need to watch it in every detail. So why are they watching enough violent videos to cause trauma instead of just hitting the block button or letting the computer do the work?
Both lawyers agree that Meta’s policy of forcing employees to watch the entire video in order to explain all the reasons for censorship aggravates the trauma.
It’s in the article.
It’s trivial to circumvent automatic detection
The EU now has a rule that all reports of content must be checked and verified for illegal content like misinformation. They can’t automatically block that content because then people would weaponise reports. At best they can automatically block video and image hashes which have been previously verified as illegal, but these are trivial to circumvent. I think they’ve started using perceptual hashes but these are far from perfect.
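To illustrate why perceptual hashes are “far from perfect”, here is a minimal sketch of an average hash (aHash), the simplest relative of the perceptual hashes mentioned above. The image data, functions, and 8x8 grid size are all illustrative assumptions; real systems like Meta’s PDQ are far more robust, but the same trade-off applies: the hash survives benign changes like re-encoding, yet simple edits can dodge a match.

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale grid: bit = 1 where the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return [1 if p > avg else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes (0 = exact match)."""
    return sum(x != y for x, y in zip(a, b))

# A toy "image": a left-to-right brightness gradient.
image = [[c * 36 for c in range(8)] for _ in range(8)]

# A uniform brightness shift (like compression artifacts) leaves the hash intact,
# because the mean shifts along with every pixel...
brightened = [[p + 3 for p in row] for row in image]
print(hamming(average_hash(image), average_hash(brightened)))  # 0

# ...but a trivial horizontal flip changes every bit, evading the match entirely.
mirrored = [list(reversed(row)) for row in image]
print(hamming(average_hash(image), average_hash(mirrored)))    # 64
```

Production hashes tolerate flips and crops better than this toy, but adversarial uploaders keep finding transformations the hash doesn’t cover, which is why human review remains in the loop.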
I believe they use similar moderation for the US to proactively head off potentially similar legislation to the EU.
Something like 3 billion people actively use Facebook each month. There must be tens of millions of daily reports. I can only imagine the level of planning, staffing, and tools which are required to facilitate that.
They need an AI to curate that kind of content then.
AI is far from perfect, and is unlikely to satisfy the DSA requirements.
I’m pretty sure I have some trauma from watching all the Liveleak executions as a teenager back in the day.
deleted by creator
Why is “Croatia”, a whole country, on your list? Is it that traumatic?
deleted by creator
This is the best summary I could come up with:
More than 20% of the staff of CCC Barcelona Digital Services - owned by Telsus, the company that Meta hired to check the content of Facebook and Instagram, are on sick leave due to psychological trauma.
The images posted on the social networks they were supposed to check showed the worst of humanity: videos of murders, dismemberments, rapes and live suicides.
“He sticks a knife in its chest, rips out its heart and eats it,” Francesc Feliu, lawyer for more than a dozen workers who decided to sue the company, told Euronews.
“The psychologist would listen to them and then tell them that what they were doing was extremely important for society, that they had to imagine that what they were seeing was not real but a film, and that they should go back to work,” says the Spanish lawyer.
Both lawyers agree that Meta’s policy of forcing employees to watch the entire video in order to explain all the reasons for censorship aggravates the trauma.
The original article contains 986 words, the summary contains 191 words. Saved 81%. I’m a bot and I’m open source!
I was over at 4chan looking at some really fucked up videos the other day. Yes, it left a mark in my head. Yes, I’m thinking of not looking at that stuff anymore.
You have to be 18 to check that site.
Over 18 months :)
Real-world brown notes and BLITs
Couldn’t they hire from watchpeopledie or nothingtoxic or ebaum? Those users would probably do overtime for free.
People that are completely desensitized to that kind of stuff would probably not be very good at moderating it really.
Also this is a terrible job and I’d be very worried if a company was paying and enabling people who find that fun. It’s horrible, but trauma is the normal outcome.
Sounds like the perfect job for AI
I feel sorry for whichever researchers are in charge of training and fine tuning those models… ouch
Should there be a disclaimer before hiring? I mean don’t apply for the job if it’s going to bother you. I sure hope they were not forced by the boss to suck it up.
I admit I’m a bit desensitized to gore but I don’t know if I can handle truly brutal videos.
probably there is a disclaimer of some sort, and people think they can handle it. but also, even people who worry they can’t may be desperate for a job.
i work with people who have experienced terrible things, but it’s a little easier to manage because i am hearing about it (only to the extent that people want to talk) and not seeing it, so i get a little psychological distance. my environment is very supportive and i’m hearing things sometimes, not watching it hundreds of times in a day. even then, people in my role are at decent risk of burnout.
like, you never know when you’re going to hear a thing that just gets to you.
doing what these mods do for hours and hours a week without support seems like a recipe for secondary trauma and burnout. edit - and yes, i’m pretty sure their environment amounts to “suck it up and keep going.”
I guess it’s the same when joining the military, no amount of video games or war movies will prepare you for the horrors of war.
Pretty sure that if you see a lot of “Violent Content” on your Facebook, you’re either following the wrong groups or have the wrong friends whose posts you’ve chosen to allow (or you’re specifically looking it up 😅). If there’s anything Facebook is good at, it’s keeping people cosy in their selective information bubble. In the 14 years I was on Facebook, I’ve maybe had 5 posts with shocking content I’d rather not have seen, and I think most of them came from the same person, whom I eventually asked to exclude me if they posted anything like that again… 😅
Not gonna argue that visual shock trauma isn’t a thing, it surely is! (I have sadly collected a small (mind-)museum of those in my lifetime myself.) Nor am I gonna claim Facebook “isn’t that bad”, it definitely is too! But this just has moneyfishing written all over it.
👴🏼 Oh, well,… Back in my day, when we saw something that would scar us for life, we’d immediately call as many friends possible to come look at it too… 👴🏼 😅
This is about the people who have to check if this stuff is violating the rules, not users who happened to see it
In my defense, I was pretty drunk when I commented this…
Don’t drink and Lemmy… 😅
Did you read the article or just skim the headline?
No, I read the headline and skimmed the article. 😅
are you lost?