
Ad Victoriam, But Why?


Deleted4363562User


Sorry, I didn't really address the other synths displaying guilt and mindless fear.

 

I admit you have a point there... to an extent. Or I think so, anyway. I look at these behaviors, and any other behavior that deviates from their core programming, as a run-time error. They have certain behavioral programs and algorithms written by the Institute. The Institute code probably doesn't account for every scenario and/or variable out in the wasteland, so there are bound to be events that alter synth behavior in unintended ways. I more or less said that in the video... I think I did, anyway. Maybe I cut that out? Either way, different synths have different prime directives and protocols. Mayor McDonough is probably just carrying out some kind of survival protocol after his identity is revealed in DC, and his run-time conclusion is that the Institute isn't going to help him. :laugh:

 

I did also want to say again that even if the synths are sentient, I would still be wary of them. I'd be tempted to grant them human rights, but on the other hand, if their higher functions are within the Synth Components and therefore inorganic, then it'd be a matter of time before they'd outpace us in intelligence. I don't consider marginal intelligence increases a threat, but eventually, their intelligence would be such that we'd be defenseless if they decided to act against us. I know you say they have human brains, but I don't agree. Why would the Institute build a completely artificial brain, which would almost certainly process information faster, and then decide to go back to a slower organic brain? BTW, Nick does have increased processing power. I forget the scene, but he basically takes in a ton of information in a few seconds. Sorry, I can't be more precise.

 

Also, this goes against my belief that synths are not sentient, but... if they do indeed have an artificial brain with increased processing speed, then that might explain why they eventually become self-aware. Again, that's assuming their behavioral deviations aren't just the result of unforeseen variables in the real world.


As I read through the various posts, and play through the game, I have become more and more convinced that the Gen3 synths are not synths, are not clones, and are sentient.

 

From the wiki and what can be learned in the game –

Generation 3 synths are physically and mentally indistinguishable from ordinary humans, having lab-grown bodies of real human flesh, bones, and organs instead of plastic and metal.

The “synth component” that can be found in dead Gen3 synths is: “a small device of unknown function found only on synths of all three generations. The actual location of the synth component on the bodies of synths is unknown.” It is assumed, however, to be in the brain or some vital organ.

 

The Gen3 synths are built in a lab and the lab can be visited. It isn’t exactly cloning, as the synths are being manufactured. The synths are being built on Shaun’s uncontaminated DNA and, the way I see it, this is similar to the movie The Fifth Element. The Element is killed in a crash, the small remains (a hand) are placed in a 3D BioPrinter, and the Element is rebuilt. Not cloned, but rebuilt back to the original in a BioPrinter using DNA. This is why the Institute needed Shaun’s DNA. The purpose is to build a better human, and that would not be possible with contaminated DNA. This is also why Shaun is seen as “Father”, as all Gen3 synths come from his DNA.

 

The programming the synths get after being built is like that of Total Recall. The character Quaid/Hauser in the movie is programmed not only with memories, but with a completely different personality. One need only look at the Institute after the capture of the rogue synth in “Synth Retention”. The rogue synth is in the lab being reprogrammed: wipe the “bad” personality that made them a raider and program a new one as a “better” human being.

 

As for the “synth component”, it isn’t a brain or programmed AI; it’s an emergency shutdown the scientists built to deal with synths who become self-aware and run away. A “safe word”, if you will, such as the one used on River Tam in Serenity, that triggers a form of unconsciousness. The reason it is mechanical is to ensure it always works, in case the synth’s brain forgets or eliminates the memory of the safe word due to illness, injury, or the development of other memories/personality.

 

While I am sure fault can be found in the above reasoning, it is after all a game and science fiction where anything is possible.


Why do you ignore Father and Ayo, and place more emphasis on a terminal entry?

I'm not. You're making stuff up. There's a difference.

 

Father never goes into detail about exactly what the differences are. And, about the topic at hand, you can ask Father about Synth self-awareness quite point-blank ("If the synths are intelligent and self-aware, then they have a right to free will.") and he never contradicts it. In fact his answer is simply, "However closely they may approximate human behavior, they are still our creations."

 

And that's a very important clue there. He does NOT say that they're not self-aware or that they can't have free will. His objection is just basically that since the Institute made them, the Institute owns them.

 

Nor does Ayo ever say that the Synths can't be self-aware. In fact, none of his dialogue files even contain ANYTHING about self-awareness or free will. The only evidence on what Ayo knows is exactly that terminal entry on the SRB main terminal, written by a high-ranking SRB member (Alana Secord) and explicitly mentioning talking to Ayo about it. So the notion that Ayo somehow doesn't know that synths can be self-aware is nonsense. The evidence in the game points the other way.

 

Basically if you want to talk lore, then talk actual lore, don't just make it up.

 

Why are you acting as if there is simply no question at all about synths, as if the writers never intended there to be one? Granted, the writers seem to encourage players to arrive at the conclusion that synths are sentient and that the Institute just doesn't want to admit it. But it's never spelled out. They leave it open to develop a story with a moral dilemma.

Actually, the fact that synths are sentient, or have attributes associated with sentience, is mentioned several times in the game. In fact, the only notable figure who denies it is Father, but strangely enough, that's while never denying that they're self-aware or anything else related to sentience. So that's a bit like saying that my pet isn't a cat, while not denying that it's small, furry, tabby, purrs and meows.

 

Why do you claim strict adherence to lore, yet ignore that Nick Valentine demonstrates how the Institute clearly can pass the Turing Test with synthetic, inorganic materials? I can concede the points you made about Amari and the Robobrains, but given the general narrative of the game, it seems odd that synths would be nothing more than clones with minor hardware add-ons. Again, they'd be human, and there'd be no moral dilemma. Cloning is not new in Fallout. The Institute would just admit it once you're the Director, and the emphasis would be on the fact that they can control human beings' brains down to the neural level.

I'm not ignoring it, I'm just not seeing how it's relevant. Pre-war tech can run a human personality and take its own decisions just fine all the time. See, for example, "president" Eden in Fallout 3. Also, Curie, Ada and Codsworth in Fallout 4, among others.

 

But just because that's possible, doesn't mean that the Institute can also design a new human-like brain. It's like saying that if my pal Max can design a car (no, really, that's his job even), then he obviously designed his cat too.

 

I don't agree with your conclusions, but that doesn't mean I think you're living in a fantasy. You have an intelligent rationale for thinking what you do. I simply don't think I have enough information to declare that synths are either humans or sentient beings. Thus I won't risk actual human lives for them.

Nobody said you should risk anything, or really do anything. You can do whatever you wish, if the game allows it. And if it doesn't, well, there are always mods :tongue:

 

There isn't any right or wrong way to play a single player game, and how you play isn't anyone's business anyway.

 

I'm just talking about what the game actually says, not what you should or shouldn't do. I'm not going to tell you what you should do in a single player game.

 

EDIT: Also, getting back to how chemistry functions as a neurotransmitter: are you admitting that the Institute has the technology to make a completely artificial brain, then? I thought your point was that they hadn't even made a new CPU in 200 years, let alone a fully functional artificial inorganic brain.

I'm not saying they can; I'm just saying it's fully irrelevant either way. Whether it's chemical or electrical or by magic is just as irrelevant to whether some mental process exists or not as it is to whether something produces light or not.


A 3rd generation synth's programming enables it to accurately simulate human behavior and thinking. This is not to say that they can act for themselves. They seem to be forever dependent on outside influence in order to adjust responses and the like. There is no evidence in the game to suggest that they have the capacity for original thinking. One example may be looking at a mountain and deciding it would be a worthy challenge to climb. While this is a characteristic of a human, there does not seem to be any way for a synthetic to understand the concept of abstract emotions or thinking. Simply wanting to survive or defend itself does not count. :geek:

 

It's too bad that Beth didn't develop this a little further, so as to give the player a way to ferret out a synthetic through some clever means, or perhaps through closer scrutiny to notice subtle (almost imperceptible) differences. That being said, I think they were intentionally vague on the subject to maintain the suspension of disbelief.

 

Though there is one I can think of: you never seem to see a synthetic drug addict, or even one that uses drugs, aside from the occasional drink of alcohol. Also, they are never sickly, as synths are immune to common human ailments.


In addition to that:

- Glory refused a mind-wipe and wanted to become a Railroad enforcer, unlike most other synths (whom she has no problem goading toward the mind-wipe doctor, which is one thing that makes me call her a hypocrite)

- Harkness decided to essentially erase himself from existence because he couldn't live with himself

- K1-98, the synth in Greentech Genetics, decided to run from both the Institute AND the Railroad and try to make it on her own

- McDonough drew attention to himself by requesting to become a courser

Etc.

 

Hell, it's not just the synths. As I was saying, pre-war tech seems capable of taking its own decisions just as well. E.g.,

- "president" Eden decides to let you free in FO3, and can destroy the Enclave base to let you escape

- Ada decides to enlist help and try to avenge her maker

- Codsworth decides to basically mislead you until he's sure he can trust you, and there's no indication that the original Mr Handy programming was designed to do that

- Miss Edna, the Miss Nanny robot in the DC school, fell in love with the teacher and is aware of how unusual that is

- Curie is aware of her limitation and requests to be put in a human body, something that not only wasn't programmed into her, but wasn't even POSSIBLE at the time

- KLEO decided to run a weapons shop instead of performing her function as an assaultron, and insists that she considers herself a woman

- Both Curie and Codsworth can have empathic responses to your actions, especially actions concerning other robots, and can refuse to come with you if they don't like your actions. There's no indication that ANY pre-war robots were INTENDED to disobey their owners or hate their human masters for how they treat other robots.

Etc.

 

None of that was programmed to happen. In fact, pretty much all of them are the kind of things you do NOT want to even be possible, if you're programming an AI.

 

Now you could say that it's the influence of the environment, e.g., that Edna saw people being in love and extrapolated from there. And you probably wouldn't be wrong.

 

But the same can be said about humans (or for that matter about cats and dogs.) Your actions aren't random. They're based on those existing associations up there in your noggin. That's what intelligence is about. You can learn and you can apply what you've learned.


@ Moraelin, I found your initial argumentation enlightening, until you started engaging in semantic gymnastics.

 

Justin Ayo saying that "synths do not want" is a pretty clear, albeit not explicit, statement that synths don't have free will. Father not explicitly denying it is not the same as him confirming it. Furthermore, approximating free will isn't the same as free will.

 

One of your initial arguments was that Synths are basically clones, because there isn't a CPU in Fallout capable of passing the Turing Test. Then you went on to cite that pre-war tech could run full personality matrices. How is that not a contradiction?

 

Honestly, it's not that I think that your claims are entirely unsubstantiated. You can clearly cite things in the game to support them. I enjoyed reading those. What I have a problem with at this point is your assertion that my claims are completely unfounded, unsubstantiated, and based on fabrications. I freely admit that I cannot prove my claims beyond a reasonable doubt. The game wasn't designed or written so that these sorts of claims could be proven, yet you keep acting as if your perspective on the presented lore, with all its contradictions, gaps, and open-ended nature, is the only valid one.

 

I've really appreciated everyone's feedback here, but at this point I have all I need. I won't be commenting anymore. But regardless of our disagreement, I sincerely thank you for the discussion, Moraelin. I enjoyed your intellect.


Not denying it is in fact not the same as confirming it, but that goes both ways. You accused me specifically of ignoring what Father and Ayo say, in favour of a terminal entry. Your argument was just that... err... what, exactly? That I ignore that they don't directly either confirm or deny it? It's hardly an argument, is it? If there isn't relevant information in there, then exactly what am I ignoring?

  • 4 weeks later...

In my current play-through I am being a complete naturalist and actively destroy ANY Synths or sympathizers with extreme prejudice. I have weighed the debate for a while and finally decided that the Institute, the Railroad, and others were to be eradicated, as per the BOS mission objective. No matter how accurately a synthetic can copy a human, they are still just a copy, and not human! They are an abomination built with malevolent intent by the Institute, coddled by the idiotic RR, and just feared or dismissed by other factions that SHOULD also be opposed to their creation and existence.

 

So in summary, I believe the BOS are correct in their assessment of the situation and in their campaign to destroy the Institute and all synthetics.

 

As a side note, I wonder why Paladin Danse "hates it" when you destroy a synth? This is before his origins are revealed, mind you, and it's not consistent with BOS ideals. If anything, he should love every synth I destroy. Curious. :wink:

