Case ([personal profile] case) wrote in [community profile] fandomsecrets 2017-12-19 06:47 pm

[ SECRET POST #4003 ]


⌈ Secret Post #4003 ⌉

Warning: Some secrets are NOT worksafe and may contain SPOILERS.

01. [secret image]

02. [secret image]

03. [secret image]

04. [secret image]

05. [secret image]

06. [secret image]

07. [secret image]

Notes:

Secrets Left to Post: 01 pages, 21 secrets from Secret Submission Post #573.
Secrets Not Posted: [ 0 - broken links ], [ 0 - not!secrets ], [ 0 - not!fandom ], [ 0 - too big ], [ 0 - repeat ].
Current Secret Submissions Post: here.
Suggestions, comments, and concerns should go here.

[personal profile] thewakokid 2017-12-20 12:29 am (UTC)(link)
Really? She has shown more complexity than a normal machine; I don't think it's unreasonable to believe she may develop human-like affection, or at least a digital equivalent, for John.

(Anonymous) 2017-12-20 01:08 am (UTC)(link)
+10000

(Anonymous) 2017-12-20 01:48 am (UTC)(link)
I don't think it's unreasonable to believe she may develop human-like affection, or at least a digital equivalent, for John.

This hits on something I find really interesting about these sorts of stories, and especially T:SCC, which is: when a machine is so advanced that it develops its own subroutines that function like emotion, like love - how can we say they aren't? I mean, when it's our own programming calling the shots, that's one thing; I can understand humans saying, "No, we designed that programming and put it in there; it's doing what we designed it to do." But if it looks like love and it acts like love, and we didn't program it - who exactly is to say what that function is or isn't?

(Anonymous) 2017-12-20 08:39 am (UTC)(link)
Sorry, but no. It is unreasonable.

If a machine suddenly feels a sense of panic because it thinks its outer shell is too fat, then maybe I'd see what you're getting at in terms of the other side of that human emotion spectrum. But for now? That just sounds like a load of bull.

(Anonymous) 2017-12-20 10:49 am (UTC)(link)
That just sounds like a load of bull.

I feel like you watched a very different show than I did. Because one of the show's largest and most narratively consistent threads was about the slow, strange, ambiguous but increasingly evident development of Cameron's character - of her consciousness.

[personal profile] thewakokid 2017-12-20 07:26 pm (UTC)(link)
Oh, my dude of dudes, I am so happy to be having this argument!

Well, no, a machine will not panic because it worries its outer shell is too fat, because by all accounts the machine probably wouldn't feel that to be a priority. It's not likely to be too attached to its body. It is, as you say, an outer shell; its sense of self would be disconnected from that shell because it is capable of existing beyond it. If it feels its outer shell is not suitable to its goals, it is capable of swapping or modifying the shell with some ease.

BUT as seen in this show, Cameron is capable of panic; it's just that different things trigger it than what makes us panic. For example, the scene where John is going to shut her off. Her human expressions were exaggerated and calculated, but it was very clear she was panicked and did not want to be destroyed.

And then later, when it seemed possible that she might turn, she installed a kill switch, because what worried her more than being destroyed was failing her mission. There is a clear sense of priorities and a clear, if not necessarily recognisably human, set of desires.

(Anonymous) 2017-12-20 11:15 pm (UTC)(link)
+100000 to all of this.

(Anonymous) 2017-12-20 12:42 am (UTC)(link)
i feel like "robots falling in love" is a fairly standard genre thing, though

(Anonymous) 2017-12-20 01:32 am (UTC)(link)
I completely disagree with your interpretation of Cameron's character. She was an atypical machine from the get-go, and she only became more atypical after she was damaged in the explosion at the end of S1 (the creator himself talked about it in one of the audio commentaries).

The bottom line is we don't really know what degree of emotion and personal choice she's capable of. And we have no idea what degree of emotion and personal choice she may become capable of over time.

Cameron's programming is incredibly complex. It is adaptive, and, as the series progresses, we actually see Cameron do things that have no other apparent purpose besides existential self-exploration. Furthermore, Cameron has, at her core, conflicting programming; she is programmed both to terminate and to protect John Connor. And because, according to the series creator, Cameron was never actually "fixed" after the explosion, we're left to conclude that she didn't kill him in 2.02 because she chose the programming that told her to protect him over the programming that told her to terminate him.

So she's adaptive and she's capable of choice. I don't think the audience is given adequate information in the course of the series to know what exactly that means for - and about - Cameron. But it certainly calls a lot of preconceived notions about her mechanistic limitations into question.

By the end of S2, I wouldn't be comfortable saying, definitively, that Cameron loved John - but I also wouldn't be comfortable saying, definitively, that she didn't love him or was incapable of doing so eventually. And given the arc of the show, and the creator's own comments about John and Cameron's relationship, I consider it likely that John and Cameron's relationship was ultimately designed to be a love story - albeit a highly complex, ambiguous one.

(Anonymous) 2017-12-20 03:03 am (UTC)(link)
+1, well said. She certainly has some kind of complex protective feeling towards him, though whether that's "love" or anything identifiable by humans as emotion is a different and super interesting question in itself.

(Anonymous) 2017-12-20 08:47 am (UTC)(link)
she chose the programming that told her to protect him

Nah, I think it could easily be argued that the override just sort of kicked in the way it did before her chip got damaged. She always had the terminator in her, and we saw that once she rebooted after the explosion. Where was her fantastic decision-making and self-awareness during most of the season 2 premiere episode?

I agree that it was probably designed to be a love story, but I don't think it was designed to be a love story in the classic way that we often see between two human characters, and that's what was supposed to be intriguing about it.

Though I say all of this as a John/Cam shipper, lol, so admittedly I may be blinded by the reading I've built up through re-watches that lets me see this ship as canon.

(Anonymous) 2017-12-20 10:40 am (UTC)(link)
I don't think anything but the base-level terminator aspect of Cameron's "mind" was conscious for most of that episode. Then John jammed her head between two vehicles and took out her chip. When he put it back in, both the terminator and the protector parts of her were conscious. Maybe because of the impact of the car, maybe because of the reboot, maybe something else - who knows. On her system screen we see that the predominant command at that moment is to terminate, but then she doesn't, she overrides the command. But the thing is, she was running two subroutines at once, and she acted on the secondary one rather than the primary one. As far as we're ever told in the show, that doesn't happen. Terminators are good as long as their secondary programming holds; when it fails, they revert, end of story. The fact that Cameron was of two minds in that moment was not standard to her programming. It was new. She was conflicted.