Case ([personal profile] case) wrote in [community profile] fandomsecrets 2017-04-28 07:06 pm

[ SECRET POST #3768 ]


⌈ Secret Post #3768 ⌋

Warning: Some secrets are NOT worksafe and may contain SPOILERS.

01.
02. [Goodbye to Halos]
03.
04. [Great British Bake Off]
05.
06.
07.
08.
09.
10.
11. [SPOILERS for Yuri on Ice]

Notes:

Secrets Left to Post: 00 pages, 00 secrets from Secret Submission Post #538.
Secrets Not Posted: [ 0 - broken links ], [ 0 - not!secrets ], [ 0 - not!fandom ], [ 0 - too big ], [ 0 - repeat ], [ 1 - I am not sure if this is a troll or not ].
Current Secret Submissions Post: here.
Suggestions, comments, and concerns should go here.

(Anonymous) 2017-04-28 11:51 pm (UTC)(link)
Well, in the sense that actions and thoughts come from a series of complex systems, sure. But whether or not you call it consciousness seems very contentious in the case of robots, because you have to define consciousness in a way that doesn't currently apply to robots - I think it absolutely makes a difference if you're talking about animals with brains vs. robots. My computer is not "conscious", after all; it's just a series of programs. It doesn't seem particularly weird to me to separate life we know is "real" - that is, life like us, because we know we're "real" - from life we don't have a way to qualify. I don't think my childhood pet Furby had valuable life or true feelings even if it emulated them, and I'm not sure why that would change just because a more complex Furby is produced.

(Anonymous) 2017-04-28 11:59 pm (UTC)(link)
Certainly, it's hard to judge whether a particular given entity is conscious or not from outside (it is more or less just the other minds problem, I think, but that's obviously not an easy problem). But the two points I would make are:

1) It seems to me that, if there were an artificial entity that displayed the same kind of behavior and outputs as human beings, it would at least be reasonable to give it the same presumption of consciousness that we presently give to human beings.

2) In principle, from a physical-material standpoint, it's hard to see some philosophical reason why consciousness - or any other quality of mind - should be the exclusive property of humans, or why it should not be replicable by artificial systems. In principle, we should expect there to be some theoretical series of programs running on artificial hardware that would be "conscious", because from an abstract point of view, consciousness in the human case has to be rooted in physical hardware anyway.

(Anonymous) 2017-04-29 12:02 am (UTC)(link)
DA

>I don't think my childhood pet Furby had valuable life or true feelings even if it emulated them, and I'm not sure why that would change just because a more complex Furby is produced.

I don't think insects have emotions, but more advanced biological organisms do. We're all biological, but complexity matters. Your Furby might be a mayfly on the scale of simple to advanced robotics.

(Anonymous) 2017-04-29 12:12 am (UTC)(link)
Okay, but the point is that your typical anime robot with a "soul" is going to be far more complex than a goddamn Furby.

As far as can be proven, humans (and also robots) are just basically electric impulses directed in certain ways. Who's to say a robot couldn't eventually become that goddamn advanced? So OP is limiting their thinking because It's Not Human.

Humans are just, in the end, organic computers. Just highly advanced ones, the likes of which we haven't really reached yet with metal.

(Anonymous) 2017-04-29 12:23 am (UTC)(link)
Advanced really isn't equivalent to "conscious". A computer can beat a human at chess; that really doesn't mean anything.

And if the division is organic or not, you really don't have a point. Organic life is life as we know it. Simulating feelings and consciousness doesn't have anything to do with authentically experiencing feelings and consciousness.

(Anonymous) 2017-04-29 12:34 am (UTC)(link)
That you know of. Again, despite being organic, we are still just electric impulses coursing down wires, sent from a computer. Whether it's metal or not is arbitrary, and I feel it's something small-minded people use to feel comfortable, because the idea that we could eventually create a computer that legitimately functions the same exact way we do, just with metal and silicon and stuff, terrifies them.

There's a reason we're able to make prosthetics now that are actually beginning to function similarly to actual body parts. We're not that different from computers, just... Again, more advanced and self-aware.

(Anonymous) 2017-04-29 01:07 am (UTC)(link)
It literally wouldn't function the same way as organic life. We are born, age, live, die. A robot can't experience life in that way because it is never going to age or die in the same way a human does. To me, that's a pretty huge distinction. It's not about being advanced or self-aware, it's about being an actual organism vs. inanimate matter.

(Anonymous) 2017-04-29 01:58 am (UTC)(link)
Computers die all the time. Just not in the same way. They become outdated (age). They stop functioning.

Your point is moot because you want to be a special goddamn snowflake in the universe.

(Anonymous) 2017-04-29 02:52 am (UTC)(link)
calling organisms "goddamn special snowflakes" lol

you're the one trying to push specialness onto something inanimate.

(Anonymous) 2017-04-29 02:56 am (UTC)(link)
NAYRT but I think part of the point - to me - is that it's a mistake to equate consciousness or emotion with "specialness"

(Anonymous) 2017-04-29 03:05 am (UTC)(link)
But why are we even talking about specialness at all? It seems the opposition to OP's secret here is that it's somehow "human-centric" to not ascribe some kind of spiritual value to robots/computers.

(Anonymous) 2017-04-29 03:52 am (UTC)(link)
I don't know why we're talking about specialness (I wasn't the one who initially used the line about special snowflakes).

The point that I'm making, myself, is that as far as we know, human consciousness and emotionality and being - the things that OP was talking about - are the result of physical processes, which means there's no particular reason that robots or other artificial entities could not have those qualities. Talking about "the artificial simulation of emotion" doesn't really make sense, because in either case, emotion is the result of specific physical processes, whether those processes are carried out in silicon or in neurons. There are no real criteria that I can see - or that anyone has really pointed to ITT - by which you can differentiate the two. If an artificial entity were able to consistently act as though it had volition, consciousness, emotion, etc., it would be sensible to say that it actually had those qualities.

So when OP talks about all these reasons why the emotionality of fictional robot characters is less authentic, it seems to me - and I think this is what other people ITT are also saying - that this is an incorrect way to think about emotion, and there's no real standard by which you can say that one of those kinds of emotion is authentic and the other one isn't, if we're talking about a fictional robot that does display emotion. The whole sentiment doesn't really square with what we know about human beings.

+1

(Anonymous) 2017-04-29 11:18 am (UTC)(link)
This. This is exactly why I brought up "specialness." Because, when you get down to it, humans aren't special and there's really no inherent spiritual value. Even to the most spiritual person, some human lives are worth less than others.

It reeks of not understanding where emotion and consciousness even come from. They're basically electric currents firing off in the right way.

Re: +1

(Anonymous) 2017-04-29 12:34 pm (UTC)(link)
You are the one who frankly doesn't understand where emotion and consciousness come from. "Electric currents firing off in the right way" in an identifiable organic nervous system. Emotion and consciousness are complex concepts that are extremely difficult to define, but we know we have them, and it's reasonable to assume that other animals have them, too. There is zero reason to assume artificial intelligence is capable of this, period, or that even if it simulates emotion, the result is in any way like what we consider actual emotion.

Re: +1

(Anonymous) 2017-04-29 06:17 pm (UTC)(link)
I am NAYRT but I am the poster who AYRT was responding to (the one who made the post beginning "I don't know why we're talking about specialness"). I hope that's clear.

Anyway, what is the actual meaningful connection between the organic-ness of a system and its capacity for having emotion and consciousness? Like, what is the specific reason why organic-ness is important for having emotion and consciousness?

Because unless there is some specific reason, it seems to me that what we know is that emotion and consciousness arise out of specifically structured physical systems. I can't see any reason why organic-ness is an important quality in terms of those structures attaining those states. Therefore, it seems to me reasonable to suspect that consciousness is not distinctively organic. What we have is a system of responses and messages, and out of that arises consciousness. Neurons (it seems likely to me) are merely a medium in which those systems and responses and messages take place.

AYRT

(Anonymous) 2017-04-29 11:14 am (UTC)(link)
I'm saying the exact opposite--that you're NOT special. You and OP want to be special in the universe, probably because it's comforting, but you're really... Not.

Re: AYRT

(Anonymous) 2017-04-29 12:36 pm (UTC)(link)
You're the one who brought up specialness, and humans. We're talking about organic life in general (not exclusively humans) vs. robots. If you think a toaster, a Furby doll, and an AIM bot have value equivalent to living organisms, that's your call. I don't. I do ascribe value to organic life over inanimate objects. If you call that self-centered, I don't know what to tell you.

(Anonymous) 2017-04-29 12:36 pm (UTC)(link)
No, we're not just a series of electrical impulses. That's one feedback system, sure, but we're far, far more complex than that. Robots are not alive, get over it.

(Anonymous) 2017-04-29 06:43 pm (UTC)(link)
first of all, it doesn't matter how many feedback systems we're made of. we're unutterably complex feedback systems, sure, but the point is the same no matter how many or how complex the physical feedback systems are - they're still basically physical feedback systems.

second of all, no one is claiming that actual existing robots are alive. the argument is whether a hypothetical (fictional) robot (or other artificial entity) could be said to have consciousness, emotion, agency, and volition. please stop arguing against the strawman that robots are alive, because that's not what anyone is saying.

(Anonymous) 2017-04-29 12:37 am (UTC)(link)
Organic life is still, like, physical matter operating according to the constraints of physical laws

It's not some special kind of substance

(Anonymous) 2017-04-29 01:08 am (UTC)(link)
like, everything is something. that doesn't mean living organisms aren't any different from rocks.

(Anonymous) 2017-04-29 01:18 am (UTC)(link)
since the emotions and other experiences of living organisms are built up out of physical matter by physical laws, what - in principle - precludes some non-organic structure from replicating them?

(Anonymous) 2017-04-29 03:06 am (UTC)(link)
Wait, what? So you're saying emotions are something physical. So what, processes in organic brain matter/nervous system? If that's the case, it's absolutely logical to say that if this process isn't happening in organic matter, then it's not authentic.

(Anonymous) 2017-04-29 03:57 am (UTC)(link)
So, if emotion in humans is the result of physical processes, it doesn't seem like it should matter what the actual process is, as long as it ends up in the same place. If you have physical process A (which is in organic matter) and physical process B (which is in a robot), and the actions that result from A and B are indistinguishable, there's no philosophical reason I can see why one of those processes is more authentic than the other. They're both just physical processes that result in certain structures.

Like, I just genuinely don't see a reason why it should be true that the process happening in organic matter makes the end-state more authentic or legitimate.