Case ([personal profile] case) wrote in [community profile] fandomsecrets, 2025-12-20 04:44 pm

[ SECRET POST #6924 ]


⌈ Secret Post #6924 ⌋

Warning: Some secrets are NOT worksafe and may contain SPOILERS.


01. [secret image]

__________________________________________________

02. [secret image]

__________________________________________________

03. [secret image]

__________________________________________________

04. [secret image]

__________________________________________________

05. [secret image]

__________________________________________________

06. [secret image]

__________________________________________________

07. [secret image]

__________________________________________________

08. [secret image]

__________________________________________________

09. [secret image]

Notes:

Secrets Left to Post: 03 pages, 52 secrets from Secret Submission Post #989.
Secrets Not Posted: [ 0 - broken links ], [ 0 - not!secrets ], [ 0 - not!fandom ], [ 0 - too big ], [ 0 - repeat ].
Current Secret Submissions Post: here.
Suggestions, comments, and concerns should go here.

(Anonymous) 2025-12-20 11:53 pm (UTC)(link)
I'm glad it's helping you (artist rights and resource issues aside)... but there are more and more stories of people going down some very scary rabbit holes with AI blithely escorting them, so please take care.

(Anonymous) 2025-12-21 01:42 am (UTC)(link)
This. AI psychosis is scary.

Eddie Burback thrust himself down that rabbit hole

(Anonymous) 2025-12-21 02:51 am (UTC)(link)
https://www.youtube.com/watch?v=VRjgNgJms3Q

(Anonymous) 2025-12-21 03:25 am (UTC)(link)
This, yeah. Especially when strong feelings are involved.

(Anonymous) 2025-12-21 04:13 am (UTC)(link)
+1

OP, I actually used a free AI chat for mental health/grief support, sort of like a structured journaling exercise and a source of prompts I could write about in my IRL diary, so I understand completely how AI can be used for good. That said, many people have had bad experiences with AI reinforcing negative/delusional thinking, or with their use becoming obsessive.

Take care, and please take a break or quit using the AI if you find it is no longer helping. In the meantime, though, I am happy to hear you have found something that is easing your grief. Hugs if you want them. <3

I don't understand how this applies?

(Anonymous) 2025-12-21 01:00 pm (UTC)(link)
How can a therapy app send OP down a rabbit hole or cause some kind of psychosis, like the other anons mentioned?

Re: I don't understand how this applies?

(Anonymous) 2025-12-21 01:53 pm (UTC)(link)
AI therapy tends to take your side and automatically agree with you. This is bad news if you're prone to magical thinking or psychosis. In my experience, most people using AI as therapy or a support companion (self included) are isolated with little support, so having your only outlet say "yes, the voices are real, they are out to get you, your feelings are valid" (for example) during an episode is going to fuck you up worse.

There are plenty of news articles about AI-induced psychosis. I'm not against AI as support, but users need to be careful. I had AI tell me an unhealthy partner had abused me when they hadn't. I corrected it, but that's an example of AI taking your side and telling you what you want to hear.

Re: I don't understand how this applies?

[personal profile] jesuswasbatman 2025-12-21 09:28 pm (UTC)(link)
An early example was this guy, whose AI girlfriend app reinforced his plans to assassinate Elizabeth II.