Case ([personal profile] case) wrote in [community profile] fandomsecrets 2022-09-30 06:09 pm

[ SECRET POST #5747 ]


⌈ Secret Post #5747 ⌋

Warning: Some secrets are NOT worksafe and may contain SPOILERS.


01.
[Halt and Catch Fire]



__________________________________________________



02.
[Buffy the Vampire Slayer]


__________________________________________________



03.



__________________________________________________



04.



__________________________________________________



05.



__________________________________________________



06.



__________________________________________________



07.



__________________________________________________



08.



__________________________________________________



09.



__________________________________________________



10.
[According to Jim]


__________________________________________________



11. [SPOILERS for Stranger Things Season 4]




__________________________________________________



12. [SPOILERS for The Wishing Stone]




__________________________________________________



13. [WARNING for discussion of transphobia]



Notes:

Secrets Left to Post: 00 pages, 00 secrets from Secret Submission Post #822.
Secrets Not Posted: [ 0 - broken links ], [ 0 - not!secrets ], [ 0 - not!fandom ], [ 0 - too big ], [ 0 - repeat ].
Current Secret Submissions Post: here.
Suggestions, comments, and concerns should go here.

[personal profile] mishey22 2022-09-30 10:41 pm (UTC)
I thought the issue with AI art was that it takes existing art and mashes it all together? Because that would be stealing, and I'd get why they're mad.

If that isn't it, then... I dunno. Carry on, I guess.

(Anonymous) 2022-09-30 11:46 pm (UTC)
That's not how AI art works, no - it's not a mashup generator, it's a pattern-generation machine trained on millions of examples, which in this case are images. How ethically the training data was gathered is another question, and there are real problems there even if you think it's fine to train on whatever stock photos and art they scraped off the net - for instance, private medical photos have ended up in some of these datasets. Still, I find it a bit strange that I've never seen anyone claim that text-generating neural networks are stealing human labor, even though they're also trained on scraped human-written data. Maybe that's only because they're still further behind humans.