ghoul2
18 hours ago
A long time ago, I operated a "social network" which allowed image uploads. (India-local; it didn't amount to much and was shut down.)
Immediately at launch, we started getting a huge amount of (image) porn uploaded to the pages. We put in some rate limits etc., but did not want to put any major restrictions on user signups, as that would hurt signup figures (important to the CEO!).
We already had some content review people on site through a temp agency, so we checked with them and they were fine doing this manual filtering of images for us. All young (early-20s) women. While my team built a quick "dashboard" so they could do this image filtering quickly and conveniently, I had a detailed conversation with them, as I was very concerned about having them review this kind of stuff for 8 hours a day, 5 days a week. Truly nasty stuff.
They were _perfectly_ clear that they had no issues with it, and they told me in so many words to not give it a second thought and to let them get back to work.
It was a surprise, but it was a point of realization: there are much worse things they could be doing. And looking at porn has shock value only the first time. I was underestimating these women and assuming they were "snowflakes" who could not deal with something this silly and non-threatening.
Just my own personal anecdata.
ura_yukimitsu
16 hours ago
It's good that you warned them in advance, but I don't think "there are much worse things they could be doing" is really a reason to dismiss those who are hurt by this as "snowflakes," or the content they're exposed to as "silly and non-threatening." From what you're saying, there's also a big difference between what you were dealing with and what this article is talking about. In your case it was only still images, and only "regular" porn (and I assume they had to review every image posted, if it was "a long time ago," so overall they were not exposed only to that content). In this case the workers are reviewing video content that includes not only porn but also violence and sexual abuse, pre-flagged by an AI, so with far less rule-abiding content mixed in.
ThrowawayR2
12 hours ago
People butcher livestock for pay on a daily basis, rendering a live animal dead and cutting it to bits, on a massive scale to provide the food that we eat. Rescue and medical personnel deal with the injured, ill, and dying on a daily basis on a massive scale. Merely watching videos, even of disturbing material, doesn't even come close to being as bad as those and some other professions.
tomcam
7 hours ago
I think watching children being butchered or prisoners being tortured to death could be equally or more traumatic. Killing and processing animals is at least socially acceptable and (arguably) necessary for survival.
D-Machine
10 hours ago
The massive over-reaction and pearl-clutching surrounding "seeing stuff on a screen" today is really bizarre. I think Meehl called this kind of thinking in therapy the "spun-glass theory of the mind," i.e. that people are so fragile and easy to damage that you have to shelter them from even the mildest of harms. Yes, images and videos can be horrific, but that horror absolutely pales in comparison to what the world has to offer. Plus, plenty of people who grew up in the early era of the internet and saw shit on 4chan know that seeing horrifying things on a screen does not have to be some kind of trauma. You just reported the post, hid it, and moved on.
And the horror of a thing is not actually solely a property of the thing. Different people are affected differently by different things. The people most upset, making the most grandiose claims about how damaging these things are, are often projecting their own weakness or engaging in the typical-mind fallacy. They would do well to provide data to quantify and/or support their claims about the degree of harm here.
rramadass
an hour ago
D-Machine
an hour ago
As I've said, images can be horrific, and some people can be traumatized by them. That must not be dismissed.
However, it is also important to carefully and properly quantify these things and not sensationalize. You've linked a 50+ minute documentary, without comment, that seems to show that one person hired to curate content can become traumatized by that process. I can't be certain that is what it is about, because I will not waste time watching documentaries (the vast majority of which are outright propaganda or incredibly biased, while pretending to be objective), but still, I've no doubt the general claim is true, since I never claimed or believed otherwise.
But you've not provided meaningful statistical or scientific evidence properly quantifying such harms in general.
pavel_lishin
8 hours ago
> Yes, images and videos can be horrific, but that horror absolutely pales in comparison to what the world has to offer.
"Eat your dinner, there's kids starving in Africa."
rramadass
an hour ago
lazide
16 hours ago
The alternative is… they don't have a job?
What else is going to happen?
For many people, the alternatives can literally be death, wading unprotected through human excrement, being sold into de facto slavery, etc.
AI can do some of it (now), but it largely isn't going to be more accurate, and it still needs humans to double-check its work.
ura_yukimitsu
15 hours ago
I really don't think it's fair to minimize someone's struggles just because their situation could be worse. Is only the most miserable person on the planet (by what metric, anyway?) allowed to complain about their condition?
I also don't think it's fair to exploit people who are in terrible situations by pushing jobs we don't want onto them, paying them a handful of crumbs, and then saying they should be happy with what they get because their neighbor who does another job gets half a handful of crumbs.
lazide
15 hours ago
The women in this situation aren’t complaining. Very much the opposite.
Why the compulsion to paternalistically ‘protect’ everyone even to the point of making them unemployed? I assume they weighed their options and decided this was the best one. It sounds like you want to stop them from doing that?
Isn’t that the real minimization?
ura_yukimitsu
14 hours ago
You're misrepresenting both what the article says and what I wrote.
The article explicitly mentions that the jobs aren't clearly labeled so they couldn't weigh their options beforehand, that concerns raised by the workers are being dismissed by management, and that several workers have developed mental health problems.
I'm not arguing that these women should be "protected" by taking their jobs away or that they can't make their own choices; of course they're weighing their options and deciding accordingly (the article even mentions that some of them decided to leave). But it's not unreasonable to critique a system where the only choices they have are all horrible in (often more than) one way or another.
lazide
14 hours ago
I’m responding to the GP comment, which I thought was obvious.
TitaRusell
12 hours ago
I actually had a conversation about this with my mom. We were talking about the hotel cleaners in Dubai walking around with toothbrushes to clean the showers, which seemed mildly ridiculous to our European eyes.
But we came to the realisation that these folks were probably happy that they could send money back to their villages. And we left a nice tip.
paulryanrogers
5 hours ago
The risk of developing trauma or mental illness is probably much lower for shower cleaners than for CSAM/gore filterers.
4MOAisgoodenuf
15 hours ago
There is plenty of productive work to do in the world that doesn’t involve streaming through the dregs of the internet for hours on end.
This argument is a false dichotomy.
lazide
15 hours ago
I take it you’ve never lived in India? Visited Pakistan or Bangladesh?
Plenty of people would (literally) kill for those jobs, and have no other useful employment.
In those countries, women are often forbidden from doing most jobs, which makes them particularly vulnerable.
paulryanrogers
5 hours ago
Wouldn't it be better to give women more choice in general and the option of therapy if they take on a job with a risk of PTSD? Dismissing the concerns of the women actually doing the work isn't making things any better.
calendar938
15 hours ago
Wow, this is absolute peak privilege from someone sheltered.
aaaalone
10 hours ago
It's modern slavery.
Just because it's beneficial for them doesn't mean they aren't being exploited.
Looking at porn, or anything similarly or actually really bad, changes people.
Even if it just means they become a lot more insensitive.
paulryanrogers
5 hours ago
IMO it's only slavery if they truly have no other options. Which, if true, is also worth addressing.
rramadass
an hour ago
yunohn
8 hours ago
While one can blame corporations, the most blame lands on the Indian government(s). Decades and decades of corrupt local, state, and central governance have led to dire poverty and high levels of unemployment. Current and past leaders have shown no interest in fixing it, and it’s only getting worse. Their incompetence is what creates these kinds of jobs as alternatives to abject poverty and death.
rramadass
an hour ago
As an Indian living in India's Silicon Valley, I am calling BS on this.
"All young (early 20s) women" doing explicit-content manual filtering is not normal. No Indian family would allow their daughters (or sons) to take up such a job if they knew about it.
As the article itself points out, the workers are drawn in by non-explicit, non-porn image-annotation work and then switched to psychologically harmful work once they cannot leave, due to job needs and contract enforcement.
This is a pure bait-and-switch scam which has severe psychological effects on the women (and men) involved, and it should not be blithely hand-waved away.
For a detailed understanding of the psychological harm of explicit image annotation, see this documentary (not specific to India): The "Modern Day Slaves" Of The AI Tech World. An undercover reporter does annotation of explicit images for AI but is so traumatized by what he sees that he quits in two weeks. - https://www.youtube.com/watch?v=VPSZFUiElls