Case ([personal profile] case) wrote in [community profile] fandomsecrets, 2025-06-29 02:26 pm

[ SECRET POST #6750 ]

Warning: Some secrets are NOT worksafe and may contain SPOILERS.


01. [image]

02. [image]

03. [image]

04. [image]

05. [image]

06. [image]

07. [image]

Notes:

Secrets Left to Post: 02 pages, 33 secrets from Secret Submission Post #965.
Secrets Not Posted: [ 0 - broken links ], [ 0 - not!secrets ], [ 0 - not!fandom ], [ 0 - too big ], [ 0 - repeat ].
Current Secret Submissions Post: here.
Suggestions, comments, and concerns should go here.

Re: AYRT

(Anonymous) 2025-06-30 12:35 am (UTC)
DA

Honestly, just use ChatGPT as a therapist. It's not half bad and it's available 24/7. I've played with the free and paid versions courtesy of a tech bro family member, and the paid version isn't that different, especially if you don't care about its "art generation" feature. If you tell it to do searches for resources, it can find the surface-level stuff. It's helped me with my medication and it explains how shit works. It's been helpful for bouncing ideas off of and getting experiences out of my head.

Re: AYRT

(Anonymous) 2025-06-30 12:41 am (UTC)
ChatGPT hallucinates regularly and tells you what you want to hear. This is horrible advice.

Re: AYRT

(Anonymous) 2025-06-30 12:43 am (UTC)
DA

Oh my GOD you did not just suggest a predictive algorithm that suggests people eat rocks every day as a substitute for real therapy. Jesus fucking christ. We're doomed.

Re: AYRT

(Anonymous) 2025-06-30 12:45 am (UTC)
https://ask.library.arizona.edu/faq/407990

https://www.pcgamer.com/software/ai/chatgpts-hallucination-problem-is-getting-worse-according-to-openais-own-tests-and-nobody-understands-why/

Holy hell, you sure are suggesting the Hallucination Machine for therapy.

Re: AYRT

(Anonymous) 2025-06-30 12:48 am (UTC)
That thing is going to give you the wrong medication advice and it will kill you. Sincerely: a fucking trained pharmacist.

Re: AYRT

[personal profile] nanslice 2025-06-30 12:51 am (UTC)
lmao A+ bait

Re: AYRT

(Anonymous) 2025-06-30 01:29 am (UTC)
DA
I'm agnostic on certain aspects of AI, but ChatGPT for therapy sounds like a nightmare when it offers far less confidentiality than a real therapist. Not to mention even using it for legal, historical, and news-related questions brings up complete misinformation. I would not trust it with information about my disabilities, medication dosages, and trauma history when its responses are predictive text.

If it helps the venting anon, I got free therapy via a crisis center in my state for assault victims. It was several hours away, but after the pandemic I exclusively saw them via teletherapy at no cost. I'd look into programs like that if possible.

Re: AYRT

(Anonymous) 2025-06-30 01:37 am (UTC)
This is the worst advice ever