
Synth Shaun: Moral Issue?


jjb54

Recommended Posts

I've given up on "discussing" this with RattleAndGrind, so this time I'm going to aim my thoughts in the general direction of the thread readers. To summarize the essence of RattleAndGrind's position, as I understand it and as it relates to the OP's morality question, he's saying that actions that would be immoral when done to a human being are not immoral when done to Synth Shaun and gen-3 synths in general, because gen-3 synths don't meet all the criteria for qualifying as animal life. And since animal life is the only life that current "real-world" science recognizes, then gen-3 synths are technically not alive, and THAT is the single most important factor in determining morality in this scenario, overriding any other considerations.

 

MY position is that no, there are other considerations that are clearly more important and relevant, such as the fact that gen-3 synths are obviously sentient and experience physical and emotional pain (the game makes that clear enough). To ignore that and instead cling to a technicality is unreasonable to say the least (stronger words I might use are "callous" and "deranged"). I don't know if anyone cares about this at this point, but I wanted to make my position clear.

The question of morality with regard to synthetic beings might better be based not on whether or not they are "alive" so much as on whether or not they are "conscious." Sapience (self-awareness, as opposed simply to sentience, the ability to perceive one's environment without necessarily being aware of one's own place in it) is a more limiting parameter. Synths are self-aware, have emotions, and evolve as persons. A good example of the latter is how the synth raider in Libertalia wasn't originally "programmed" by Dr. Amari to be a raider, but eventually fell down that route, the same way any human might. In contrast, a robot, even one with as developed a "personality" as Codsworth, is still limited to its programming. So free will factors into it as well.

 

Are synths human? Arguably, they are something similar, but not the same. Is it moral to treat them the same way we would treat other humans? As fellow sapients, I would say so, just as it would be to treat an alien race of sapients according to our best social standards. Finally, with regard to morals, it may be better to consider how the way in which we treat others affects US, because how we interact with them defines aspects of our OWN humanity. A synth may or may not be human, but treating it callously or abusively degrades the one who does so. ='[.]'=



Like with Star Trek: TNG - self-aware. What does that mean, exactly? As Picard stated in the dialogue, Data was very much AWARE of what was going on and what it meant to Data personally.

 

So what is "self-aware"? What is "consciousness"?

 

Because in the game, the Gen-3 Synths show FEAR and other emotions, and awareness of "death" if they get caught. They can also feel pain.


I simply don't believe that Synth Shaun is incapable of aging. I think he's designed to go through puberty, if only because Synths are human in EVERYTHING but the brain, and maybe even parts of the brain. That's why you can't tell a Synth from a normal human being: they are in every way human. Probably most of the brain is human insofar as the hormone/endocrine system is concerned. Otherwise, for instance, Synths would never get hungry or experience menstruation.


The moral treatment of any entity is based solely on the criteria of the individual rendering the treatment. Such behavior is a personal choice and is founded in the beliefs of the individual.

 

In spite of the pontification of greentea101 and the narrow representation of my position, the issue is very much wider and has been debated since Plato. The morality of and moral obligations to self aware machines has been addressed in scientific articles and science fiction alike.

 

Here is the current state of the debate.

 

Autonomous mechanical entities arrive at the end of their respective assembly lines whole and complete, ready to be switched on. But with manufactured entities, where do moral actions originate? How does one determine whether an entity has moral agency: the ability to freely act in a moral fashion? How does one determine if a mechanical device is a true moral agent?

> Properties, those characteristics which indicate that an entity is a moral agent.

  • Property P indicates morality.
  • Entity E demonstrates property P.
  • Entity E is moral and is a moral agent.

Let us consider compassion. Can compassion be simulated in code? Yes. Consequently, compassion is not a valid property for determining if an entity is a moral agent.
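To illustrate the point, here is a purely hypothetical sketch (the names and responses are invented for illustration, not taken from the game or any real system) of "compassion" faked with a simple lookup table. The entity demonstrates the property while experiencing nothing at all:

```python
# Hypothetical sketch: a "compassionate" response produced by a canned
# lookup, with no inner state behind it. The entity exhibits the moral
# property P (compassionate replies) without feeling anything.

CANNED_RESPONSES = {
    "i am hurt": "I'm so sorry you're in pain. How can I help?",
    "i am sad": "That sounds hard. I'm here for you.",
}

def simulated_compassion(statement: str) -> str:
    """Return a sympathetic-sounding reply chosen from a fixed table."""
    return CANNED_RESPONSES.get(statement.lower().strip(),
                                "I understand. Tell me more.")

print(simulated_compassion("I am hurt"))
```

A few lines of code pass the "demonstrates property P" test, which is exactly why the test fails as a criterion.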

 

After going through the entire list of human moral properties, we find that they are all capable of being simulated in code. So simply demonstrating moral properties is not a valid criterion for determining whether an entity is a moral agent.

 

> Sentience, freedom of thought and action.

 

A sentient entity can choose to behave as a moral agent. But how does a sentient entity know what properties are appropriate moral responses? Where is the learning and experience to know which property is germane to the situation?

 

Given that a mechanical device is ready to be activated at the end of its assembly line, it is not ethical for a human being to activate a sentient machine without first providing it with some fundamental knowledge of what constitutes moral properties and their use. So the moral properties of a sentient entity will be encoded into it during manufacture (like the Three Laws of Robotics in Asimov's I, Robot). A sentient machine is therefore not necessarily choosing to be moral, and so is not truly a moral agent.
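As a toy illustration of that last step (entirely hypothetical; the rule names are invented, not Asimov's actual Laws), rules encoded at manufacture behave like any other hard-coded constraint. The machine never chose them; it checks them reflexively:

```python
# Hypothetical sketch: morality as factory-installed constraints.
# The "agent" cannot choose or revise these rules; they are checked
# reflexively before any action, like Asimov-style hard-coded laws.

FORBIDDEN_ACTIONS = {"harm_human", "allow_harm_through_inaction"}

def permitted(action: str) -> bool:
    """A reflex check against built-in rules, not a moral choice."""
    return action not in FORBIDDEN_ACTIONS

assert not permitted("harm_human")   # blocked by a manufacture-time rule
assert permitted("fetch_coffee")     # no rule applies, so it proceeds
```

From the outside this looks like moral restraint; from the inside it is just a set-membership test.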

 

> Sapience. Sentience coupled with self awareness and self motivation.

 

A sapient machine (Data in ST:TNG, or a Gen 3 Synth in FO4) is fully aware of itself and knows its nature. Are these machines moral agents?

 

Here again, human morality demands that a sapient machine have some basic understanding of the properties of morality at activation. So here too, these properties are built into the machine at manufacture so that it has a fundamental blueprint from which to determine moral action. Again, these entities are not necessarily moral agents, but are simply responding to coded instructions.

 

> Assigned Value. I like it so I will treat it morally.

 

Each year, tens of thousands of cute, fluffy, cuddly baby chicks, rabbits, ducks and geese are sold in the lead-up to the Christian holiday of Easter. Within the next month, virtually all of these animals are dead or in shelters. This is an extreme example of assigned value run amok. The assigned value is not intrinsic to the entity in question but depends on the human assigning it, and it does not make a mechanical entity a moral agent.

 

> Reciprocity. It behaves morally, so I treat it morally.

 

In the late 1940s, Alan Turing postulated that humans would eventually be unable to distinguish between a machine and a human during a normal conversation. The machine's behavior is programmed to be indistinguishable from that of a true human. This programming will contain moral properties and their use so as to simulate true moral agency. Since people believe it is a moral agent, they will believe it deserves moral treatment. But the machine is not a true moral agent; it is simulating one.

 

(Reciprocity works both ways. A machine can behave in an immoral fashion, piss an individual off and cause them to throw the pointing device through the monitor.)

 

The other tack in this discussion of moral agency in mechanical entities is to look at it from the point of view of the entities themselves. The focus of these studies is to determine whether a mechanical entity can learn and appropriately apply moral properties so that it acts as a moral agent.

 

> Moral properties and their application learned by trial and error.

 

As children, most of us learn the moral properties and their applications via trial and error, under the tutelage of our parents and teachers. These guardians also prevent the vast majority of us from making fatal mistakes while we learn.

 

A mechanical entity does not have a 'childhood'. It comes off the assembly line whole and complete and must now learn the foibles of moral agency. What are the limits on this education? A child is too weak to rend another human apart, so it will not make that mistake. But a fully formed and capable machine surely can. And who is there to correct the actions of this machine? Who is there to provide the education necessary to develop a truly moral entity? And who prevents these entities from making fatal mistakes, which are the antithesis of learning?

 

> Moral properties and their application via group learning.

 

This is what insects do, right? So why not machines? Have any of us forgotten our rebellious years: that time when we tossed off the teachings of childhood and tested the world on our own, making mistakes and (hopefully) learning from them?

 

What will be the consequences if a machine abandons the moral properties it was taught and decides to test them against the world? Does this put us back to "moral properties and their application learned by trial and error"?

 

The counter to this scenario is that machines will not have a rebellious phase, and therefore the point is moot. But is it? If the machine simply adopts the learning of others without question, is it behaving as a moral agent? Or is its behavior not an actual moral choice but a reflex response?

 

tl;dr

 

So there it is: the poor man's view of the current state of the debate on mechanical moral agency. Now, please keep in mind that this is almost twenty-five centuries of debate, argument and counter-argument, opinion and philosophy boiled down into very simple terms. It starts with Plato and runs to the thinking of intellectual heavyweights like Stephen Hawking, Elon Musk, and Bill Gates.

 

This is the basis of my statements that machines cannot choose to act as moral agents. Since machines are not true moral agents, they are not entitled to the same moral consideration as a truly living being. The bacteria in my mucus are more worthy of moral treatment than any machine. That said, I kill millions of bacteria a day and still take my auto in for periodic maintenance. It's that 'assigned value' thing.


> I simply don't believe that Synth Shaun is incapable of aging. I think he's designed to go through puberty, if only because Synths are human in EVERYTHING but the brain, and maybe even parts of the brain. That's why you can't tell a Synth from a normal human being: they are in every way human. Probably most of the brain is human insofar as the hormone/endocrine system is concerned. Otherwise, for instance, Synths would never get hungry or experience menstruation.

 

Well, I knew that I had heard one of the Institute scientists (female) talking with elder Shaun, which is why I stated what I did, with a link, in my OP.

 

 

From:

http://fallout.wikia...i/Shaun_(synth)

  • Overhearing conversations within the Institute reveals that Shaun cannot grow up and age like a normal human, and thus will remain a child forever.

So it has been confirmed; a couple of other players have agreed that they heard this dialogue as well.

So unless you have some evidence, which I will gladly look at ... this seems very much to be the case with Synth Shaun.


I also re-read the Fallout Wiki and found this:

 

 


> Shaun was created not for any research or scientific advancement, but simply on the whim of Father.

 

 

So this could also explain why Synth Shaun is "incomplete": no real thought was given, as he was apparently created on a "whim" ....


Ah, that's Bethesda being STUPID again. It simply doesn't follow from how Synths are made. Besides, it could be a matter of prototyping: even if they don't believe Shaun can grow up, it would be a simple matter of programming the child to hit puberty. He's already got human bones, human skin, and human muscles; all he needs is the right endocrine cocktail to get the process started.

Then again, I feel fully entitled to headcanon and mod out things that don't make sense, which (with green mods) is how I got through Fallout 3.

 

Assuming Bethesda's asinine writing were correct, I'd leave him to die, but that's mostly because I consider childhood a dreadful physical and mental disability that most of us are fortunate to grow out of.


 

> Assuming Bethesda's asinine writing were correct, I'd leave him to die, but that's mostly because I consider childhood a dreadful physical and mental disability that most of us are fortunate to grow out of.

 

I agree entirely... especially with "most of us" ;)

 

You bring up a very interesting point about the role of hormones in synths. I doubt the writers/designers were thinking of this, however. I have similar gripes about some of the mechanical assets in the game, like the settlement generators: they are clearly the work of a 3D artist who has no idea what a generator is or how one would go about generating power (hint: re-use the car engine-block assets you see lying around). Another thing that bugs me is how you can stand in the open cabin of a vertibird barely six feet from the exhaust of the nacelle and not get cooked.


 

 

> Assuming Bethesda's asinine writing were correct, I'd leave him to die, but that's mostly because I consider childhood a dreadful physical and mental disability that most of us are fortunate to grow out of.

 

> I agree entirely... especially with "most of us" :wink:

 

> You bring up a very interesting point about the role of hormones in synths. I doubt the writers/designers were thinking of this, however. I have similar gripes about some of the mechanical assets in the game, like the settlement generators: they are clearly the work of a 3D artist who has no idea what a generator is or how one would go about generating power (hint: re-use the car engine-block assets you see lying around). Another thing that bugs me is how you can stand in the open cabin of a vertibird barely six feet from the exhaust of the nacelle and not get cooked.

 

 

Well, being now totally off topic - which is fine:

 

I still wonder how the NPCs - ALL OF THEM - survive a bullet to the head, let alone two of them. But hey - umm ... the radiation exposure??? ;)


Oh, it's more than possible to survive a bullet to the head. Not continue to fight, but survive. As to why enemies can get shot multiple times in the head and still keep fighting: that's why we need mods, and I feel more than justified in modding out and headcanoning anything Bethesda does that's really, really stupid. Hence why I love green mods for 3 and 4.

