
Ad Victoriam, But Why?


Deleted4363562User


Well, for what it's worth, you have my sincere sympathy.

 

Not sure if ALL synths are psychopaths, though. In fact, most don't really seem to fit the Hare Psychopathy Checklist-Revised profile, I would say. But I'm not really qualified to make that diagnosis, so take it with a grain of salt.

 

Just about the only item on that list that applies is deception, but, eh, even taking that as confirmed, that scores a 2 on the checklist, which is the average for the regular population outside prisons. And even then it's only the synths used for infiltration.

 

Plus, honestly, they're forced into it. They're put in a situation where either the Institute or the friends of the victim WILL kill them if they fail the infiltration. The Goodneighbor triggermen aren't exactly taking prisoners, nor does the Compound, nor do other vigilante groups like the UP Deathclaws; plus, see in Diamond City how even the guy's brother pulled a gun on him at the mere suspicion of his being a synth. So I wouldn't say it's an indication that they enjoy deception. They're put in a do-or-die situation, and apparently most just die.

 

Or think of it in justice terms. You can't really establish mens rea (the guilty mind) if the guy was coerced into the evil deed. If the following criteria are met (using American law), and they ARE in the synths' case, the act qualifies as being done under duress:

 

1. The threat must be of serious bodily harm or death

2. The threatened harm must be greater than the harm caused by the crime

3. The threat must be immediate and inescapable

4. The defendant must have become involved in the situation through no fault of his own

 

And, honestly, all four apply and then some.

 

1. The threat IS death.

2. The harm in getting killed is obviously greater than that of spying on someone. Or at least an average person threatened with death would generally rationalize it that way.

3. The threat is rather immediate, as people tend to kill you on the spot.

And it's rather inescapable. Running away will just get you hunted down by a courser (or worse, Kellogg) and killed or mind-wiped, which is to say, erased from existence. And going back to the Institute is hardly an option, seein' as they planned to let even the loyal DC mayor be "decommissioned in the field" (read: killed) rather than extract him. So it's inescapable in that ALL the options have a near-certainty of ending in your ceasing to exist.

4. I don't think synths have the freedom to choose their own missions. In fact, as evidenced in the mayor's case, even asking for a next assignment or pointing out your merits to the Institute is evidence of too much self-awareness and will result in your termination or mind-wipe.

 

So it seems to me a bit unfair to blame it on psychopathy. We don't really have the evidence to blame it on that, when duress would compel one to do the same even without being a psychopath.

 

 

I don't actually consider synths psychopaths because I don't consider them sentient. While I've conceded some points based on what is said and isn't said in Fallout 4, I'm not convinced synths are anything more than advanced machines that highly approximate humans. I just tend to view them through the prism of experience I've had with a psychopath. While it's not highly accurate, there is an analogy to be made. If synths are indeed not sentient, then they have a similar disregard for anything outside their prime directives. They wouldn't experience fear, distress or guilt.

 

I stand by my views in the video. While the Institute scientists could be lying, they regularly deny that Synths are sentient. Both Father and Justin Ayo are very adamant about this. While some of Fallout lore implies that there hasn't been a new CPU developed in 200 years, it doesn't rule it out. Inventing a molecular relay is pretty impressive and seems likely to open up a wealth of possibilities. I mean, that alone would require an absurd amount of processing power.

 

The central question of Fallout 4's narrative is whether or not synths are sentient and deserve human rights and status. If they're just humans with minor hardware additions, there's no story.

 

EDIT: I admit it's partly conjecture, but I still think the synth components house most of their higher functions. The synthetic brain is there to support the synthetic body, which functions as an organic chassis. The mention of neurons, neural pathways, mappings, etc. refers to synthetic versions of those within the synth components.


  • Replies 71

Well... I suppose it COULD actually be some kind of robobrain-like design, where the human brain is just some kind of co-processor. I suppose the Institute could have found that tech and run with it. We don't have evidence that that's the case, but I suppose we also don't have much evidence that it isn't.

 

But regardless of how that architecture may or may not work, and whether those feelings may be programmed or whatever... the problem is that everything else again flies in the face of information that actually is in Fallout 3 and 4.

 

E.g., Harkness from Fallout 3 is a courser who was experiencing remorse and guilt, which is basically why he actually looked for someone to wipe his brain for him. E.g., Chase from Acadia expresses guilt and remorse for failing to save the runaway synth.

 

And synths not experiencing fear? Seriously? From the synths cowering in the basement of Bunker Hill to the scared synth running off in Far Harbor, their being scared is more the norm than the exception.



 

 

This is my point about how the writers ignored one thing to achieve another. Fallout 4 coursers act like the emotionless guys from Equilibrium, yet in Fallout 3 Harkness acts very differently. In general, nothing in Fallout holds up under any level of serious scrutiny. On one hand, there is good evidence that compels us to believe there haven't been any new processors in Fallout. On the other hand, some of the new technology simply demands them. Like you said, Beth isn't overly concerned with lore. I'd extend that and say they're likewise not concerned with technological cohesion. Their aim is a loose, open-ended narrative. They don't want any player decision to be backed up definitively. They want everyone to feel justified in whatever decision they end up making. In some ways that makes the writing admirable.

 

My goal in making the video was mostly to introduce the real-world ramifications of AI and related technology to Fallout players who were unfamiliar with some of it. Fallout 4 has a somewhat politically left-leaning undertone that paints the plight of synths with a broad brush. At the same time, I wasn't trying to get overly political with everything. :laugh: Not sure what I accomplished.

 

Regarding fear, just because something displays the appearance of emotions doesn't mean it actually has them. That's the whole point of the Turing Test. Furthermore, displaying emotions, especially fear, is a great method of manipulating people. I know this from firsthand experience. It's why I liken synths to psychopaths. Again, I know they aren't, but they have some similar features.


1. Actually, in Fallout 3 too you find out that Harkness functioned quite well as a courser for a while, before gradually becoming unable to live with himself. And in FO4 Chase functioned for a while as a courser before she ran away and started helping others escape. So I see no contradiction.

 

It's a journey. The ones still working for the Institute are obviously not yet there. Seeing a contradiction there is like insisting that because you saw people at the starting line of a marathon, others obviously couldn't have made it all the way to the finish line. It just makes no sense.

 

 

2. And yes, the ones still working for the Institute act like drones. Again, you find in the Institute terminals that showing any initiative or signs of self-awareness gets you terminated at worst or mind-wiped at best. So what do you expect them to do? TRY to get themselves erased from existence one way or the other?

 

You'd act like a mindless drone too, if the slightest sign of being anything different got you shot in the face. Hell, I know I would.

 

Hell, I pretty much DID in the army. Do you think it was my dream to crawl through mud when some other guy said everyone crawl through mud? And go "sir, yes sir!" at that, like he's been knighted or something? Nah, you probably don't think I was like that naturally. But if the alternative to being a drone is worse than being a drone, hell, I'm as mindless a drone as they come.

 

 

3. As for emotions, yes, they CAN be faked. And believe me, I know firsthand too that a psychopath can fake them very well. I don't like to talk about it, for reasons, but yes, you're not the only one to have had extended exposure to one. I was pretty much raised by one. Not one of my parents, mind you, but a relative who was more than happy to play nanny (read: completely ignore us kids) as long as she got to bum around the house and stir up drama when she was bored. They're bored all the time, btw. I meant it when I said you have my sincere sympathy.

 

BUT, and here's another big BUT, they won't go and do something stupid and against their best interest just for the sake of faking an emotion. They might manipulate YOU into doing something stupid against your own interests, but they don't deliberately shoot themselves in both feet for no other reason than just to fake an emotion. In fact, the ONLY thing that works to curb antisocial behaviour in one is to point out alternatives that are more advantageous to themselves. Because while they may not care about anyone else, they still care about themselves.

 

Harkness got himself mind-wiped, i.e., erased from existence, because he couldn't live with himself. He essentially committed suicide by doctor. You don't do that just to fake an emotion. Psychopath or not, nobody removes themselves from existence just to fake an emotion.

 

The guy in Far Harbor panicked and ran into the fog, and got himself killed. Again, you don't DO that just to fake an emotion. Psychopath or not, there are better ways to fake it if the goal is to manipulate someone, than actually putting yourself in concrete and real mortal danger.

 

The guys in the Bunker Hill basement cower until the end, whether it's you killing them or the courser taking them for a mind-wipe. You don't DO that if you're not afraid. You try to get away, or manipulate someone else to take you away from the danger, or just grab a gun and try to save yourself. Any chance of getting away is better than certain doom, and if you're not paralyzed by fear, you'll take that chance.

 

Etc.

 

There is no rational reason to think any of those were faking it.



 

 

 

You're not convincing me. Much of your reasoning hinges on synths being self-aware, and I don't accept that. The Institute states synths are not sentient, no matter how much they approximate humans. At this point we're both picking and choosing what we want to take at face value in the game.

 

My perspective is that the general narrative the writers present is that synths are not humans. They're human-like, but not clones. They need to be human-like robots in order to advance the narrative, to present the player with a dilemma, to ask the question: what does it mean to be human? Furthermore, again, if they're just glorified human clones, and the Institute can manipulate neurons, they'd have no reason to build synths and replace humans in the first place. They could just kidnap wastelanders and remap their brains.

 

The Bunker Hill synths might have calculated that a direct fight at that point was futile, and that the only hope they had was to play the sympathy card to the bitter end. The player did just blast through dozens of heavily armored agents with Gauss Rifles, and dozens of BOS soldiers in power armor. They don't have access to the player's brain, so they can't rule out that the sympathy card might work. But again, these situations are just there for the writers to advance plot elements and pull at emotional strings. They never wanted players to know anything definitively. They wanted to get people thinking.

 

We can go back and forth with this indefinitely at this point. You just seem to sympathize with synths, whereas I don't. I approach it from the standpoint that we don't know enough about them to arrive at a solid conclusion, therefore I'm not willing to grant them human status and risk the human race. If I had my way as a player, I'd dismantle every single AI in the Fallout universe. I'd shut down Codsworth, Curie, Nick. Everything.

 

Nick Valentine is another interesting point of discussion. He IS made of synthetic parts with no organic matter. Nevertheless, he has a "complete" personality matrix of a former human. That seems to dispel the notion that the Institute doesn't have the tech for something akin to a positronic brain. Furthermore, since he has no organic matter, he can't possibly have the same experience of emotional biochemistry that a human being does. Yet he imitates it pretty well. If it weren't for the Terminator facade, he'd pass the Turing Test with flying colors.


Well, again, there's a terminal in the SRB saying that the mayor is too self-aware. So we're back to your choosing to believe your own fantasies over what's actually stated in the game.

 

Which, of course, you're free to, if it makes YOU feel better about YOUR choices. But it's about as relevant to the actual lore as my pretending that my character is an elf that's secretly a ninja for the honourable shogun of Akavir. She used to be a high elf, even, but then her dealer got busted :wink:

 

Sad to say, though, there just isn't a good path in FO4. No matter how much people try to rationalize one as the best choice, the thing is, good-option-vs-bad-option kinda choices are bad game design. If everyone takes option A, then it's not much of a choice at all. So no designer will give you a choice that's a clear-cut case of puppies and rainbows vs a kick in the nuts. That's not even a Beth-only thing, although what Beth did do differently was make pretty much all the choices horrible, rather than each being good in some way.

 

And we've had people rationalizing THEIR choice as necessarily good, not evil, ever since. No matter what piece of actual lore they have to override with their own fantasy. Which seems to be what you're doing too, no offense.

 

 

That said, the objection about neurochemistry is probably one of the most bogus ones I've heard in a long time, sad to say. The whole role of that chemistry is to get a certain signal from A to B. Insisting that it HAS to be chemically transmitted is about as absurd as someone saying that my room is dark because it's only light if it's produced the traditional way, by a fire. Electricity doesn't count.

 

As long as it transmits the signal, who the hell even cares if it's chemical or electricity or magic?


Well, there is a way to program a brain in a relatively short amount of time: through the use of stimuli, a person can be conditioned to an amazing degree. Subliminal messaging is another, though it has no scientific merit. The only way a human clone could be given someone else's identity is through some type of genetic transfer. In other words, a human clone is made from one specific set of genetic material, one person's, so that clone can only ever be the person it copied. You cannot swap one person's essence for another's. It all sounds impossible, but the last one especially.

 

The idea of a human brain being "downloaded" or "uploaded" is the most ridiculous nonsense I have ever heard. That level of technology will never exist, because there is no way for anyone or anything to ever access the memories of a gerbil, let alone a human being.

 

The only conclusion left is that a clone or synthetic could be made as a reasonable approximation of a specific living organism. It may be a spot-on match in appearance, but the rest would have to be "programmed" artificially. Even this theory has holes, as has been pointed out. I suppose we must assume that Institute scientists have worked out how to replicate a specific human's personality and such through some advanced means of cellular manipulation or something.

 

There is a scientist who specializes in robotics and AI and is trying to cheat death by transferring his wife's personality and memories into a robot. He believes that he can successfully "upload" her into this thing. It's a real story, but I can't seem to find it again. Saw it on Discovery or something about a year ago. Personally, I don't think it will ever be possible. Ever heard of "21 grams"? That is supposed to be the weight lost by a body at the moment of death. :cool:


Well, that's SF for you. I suppose if it didn't involve some technology that borders on magic, it wouldn't be very interesting.

 

The Brain Uploading and Neural Implanting tropes, or the combination of the two, are both widespread and really not new. Even Mr. Data, since you mentioned him before, gets his creator's mind uploaded into him in one episode. Plus, speaking of Star Trek, the Ocampa in Voyager just download their parents' memories, so to speak.

 

Hell, it's not even SF only. Lovecraft pulled exactly that stunt via magic in "The Thing on the Doorstep" in 1933, for example. And he wasn't the first by far.

 

And, of course, one could argue that ancient stories about people possessed by someone else's spirit are essentially the same thing. They didn't think of the brain as a computer and the mind as the data in it; your consciousness was some form or another of soul or spirit, so, well, that's how you'd get the mind of person A in the body of person B. Those kinds of stories are literally at least as old as writing, as they go back all the way to ancient Mesopotamia and Egypt.

 

Hell, in shamanic cultures as well as in ancient Egypt, you could even get the soul of an animal inside a human. Egyptian magicians would (claim to) deliver prophecies and hidden information by basically getting themselves possessed by a spirit.

 

 

Basically, while I'll agree that it's not very realistic, it seems to be the kind of story that has endured more than 5000 years of recorded history and is present all over the globe. So I guess people must like it enough. Can't really fault Beth for going with it, I suppose.



 

Ok.

 

Why do you ignore Father and Ayo, and place more emphasis on a terminal entry? Why are you acting as if there is simply no question at all about synths, as if the writers never intended there to be one? Granted, the writers seem to encourage players to arrive at the conclusion that synths are sentient and that the Institute just doesn't want to admit it. But it's never spelled out. They leave it open to develop a story with a moral dilemma.

 

Why do you claim strict adherence to lore, yet ignore that Nick Valentine demonstrates how the Institute can clearly pass the Turing Test with synthetic, inorganic materials? I can concede the points you made about Amari and the Robobrains, but given the general narrative of the game, it seems odd that synths would be nothing more than clones with minor hardware add-ons. Again, they'd be human, and there'd be no moral dilemma. Cloning is not new in Fallout. The Institute would just admit it once you're the Director, and the emphasis would be on the fact that they can control human beings' brains down to the neural level.

 

I don't agree with your conclusions, but that doesn't mean I think you're living in a fantasy. You have an intelligent rationale for thinking what you do. I simply don't think I have enough information to declare that synths are either humans or sentient beings. Thus I won't risk actual human lives for them.

 

 

EDIT: Also, getting back to how chemistry functions as a neural transmitter: are you admitting that the Institute has the technology to make a completely artificial brain, then? I thought your point was that they hadn't even made a new CPU in 200 years, let alone a fully functional artificial inorganic brain.



 

 

You should read this. It's long, but totally worth it. Brains are very complex, but they're not magic. I don't think humans could ever do it themselves, but if we invent an AI with a sufficiently good intelligence-improvement algorithm, then pretty much anything is possible. It would only be a matter of time. Not saying I think it's coming any time soon, just that it's possible, perhaps even probable at some point. I think it becomes a philosophical question of whether or not the uploaded brain is still the same person and not just a digital copy. What if you upload your brain and are still alive? There's you and a digital you. Creepy.

 

I'm gonna go with the Jurassic Park guy on this and say we shouldn't be aiming for Artificial General Intelligence at all. Personally, I think we're all doomed. If a misalignment in an AI's prime directive doesn't kill us, class warfare and job automation will. This is one of those things I desperately want to be wrong about. :laugh:

