Case ([personal profile] case) wrote in [community profile] fandomsecrets, 2026-02-03 06:47 pm

[ SECRET POST #6969 ]


⌈ Secret Post #6969 ⌋

Warning: Some secrets are NOT worksafe and may contain SPOILERS.


01. [secret image]

__________________________________________________

02. [secret image]

__________________________________________________

03. [secret image]

__________________________________________________

04. [secret image]

__________________________________________________

05. [secret image]

__________________________________________________

06. [secret image]

Notes:

Secrets Left to Post: 01 pages, 16 secrets from Secret Submission Post #995.
Secrets Not Posted: [ 0 - broken links ], [ 0 - not!secrets ], [ 0 - not!fandom ], [ 0 - too big ], [ 0 - repeat ].
Current Secret Submissions Post: here.
Suggestions, comments, and concerns should go here.

Transcript

(Anonymous) 2026-02-04 12:06 am (UTC)(link)
all but one of the several staunchly anti-AI people on my friendslist are fine with AI when it's them browsing AI-generated images of porn of their ships

‘not for profit!’ ‘personal use!’ ‘not their fault if it gets posted by someone else!’

well. one of them isn't a hypocrite at least?

(Anonymous) 2026-02-04 01:06 am (UTC)(link)
Yep, pack of hypocrites. I wonder how they feel about AI child porn?

(Anonymous) 2026-02-04 01:09 am (UTC)(link)
So... they're not actually anti-AI. Ew.

(Anonymous) 2026-02-04 01:45 am (UTC)(link)
Oh yuck, I'd be so annoyed because that's so hypocritical.

(Anonymous) 2026-02-04 11:30 am (UTC)(link)
Ew, I'd be pissed to find out anything I was enjoying was extruded by AI. And in the case of porn, the datasets that fed into the models that make that shit are guaranteed to be full of not just the gamut of real porn, including some really awful, unethically produced stuff, but also revenge porn photos and videos, CSAM, and other footage of real sex crimes and victims.

The companies behind the big models (at least prior to Grok *spits*) hired poor people for pennies to remove footage and images of actual crimes and atrocities from the "the entire fucking internet" datasets, so that end users couldn't accidentally or on purpose generate CSAM lookalikes, snuff films, porn, etc. Maybe because they cared (yeah right), but mostly because those might scare off advertisers, customers, and payment processors. Pretty sure the rabid muskrat ordered the CSAM put back into Grok, like asking workers at a sewage treatment plant to dump the effluent back into the clean water.

The poor bastards sifting through the worst sewage the internet has to offer basically end up with PTSD from the endless parade of horrific shit they witness.

Even people with well-compensated jobs in Trust and Safety (the people handling anti-troll, anti-spam, anti-CSAM, etc. efforts for websites, companies, social media platforms, and so on) are at high risk of burnout from all the shit they see.

So yeah, AI-generated porn has all the ethical downsides of AI-generated anything, plus lots of uniquely wtf problems.