
Synth Shaun: Moral Issue?


jjb54

Recommended Posts

 

 

 

The child synth named Shaun is an artificial being, a machine, a manufactured artifact. Regardless of his apparent intelligence and intellect, these affectations are nothing more than clever programming and sophisticated decision tables. Given this, the child synth named Shaun has no more moral entanglements than a toaster or an unmade bed or an old pair of shoes.

 

Spoken like a "True BoS Soldier"

 

Don't be rude. I HATE the BoS. The only value they have in FO4 is as a source for stealing power armor frames.

 

I also know that 'Synth' is short for 'Synthetic'. According to my Merriam, "Synthetic: made by combining different substances : not natural". So I stand by my evaluation that the morality issues inherent in a Synthetic being are virtually non-existent. And that evaluation includes R. Daneel Olivaw, a hero from my youth and the author of 'The Zeroth Law of Robotics'.

 

 

Why does it matter how a being is "made", when it comes to morality? Human beings are essentially biological machines, put together by the forces of physics. The gen-3 synths are literally flesh-and-blood human beings, with some internal tech augments added to make them easier to control (unfortunately the game doesn't seem to give any details about that). As far as I can tell, they have human brains with all the human "circuitry" intact (emotions, survival instinct, self-determination, etc).

 

 

I am once again reminded of Capt. Picard's defense of Cmdr. Data, when he was thought to be the "property" of Starfleet.


 

 

 


 

 

A tool is just a tool. When a tool's usefulness has been exhausted, it is discarded.

 

A machine is just a machine. When a machine ceases to function, it is discarded.

 

We can use tools and machines in the way they were intended or in ways which were never imagined by the designer. This is the nature of tools and machines. It is not immoral to use a tool in any manner one desires. The level of sophistication or purported intelligence of a machine does not change that.

 

Your analogy of the human body being a machine is perfect. We abuse and torture this machine. We put toxins into this machine. We destroy the resources this machine needs to continue functioning. We mangle and mutilate this machine. We expose this machine to dangerous situations. We destroy these machines by the tens of millions [1]. Obviously, this machine is free from moral entanglements. And so it is with all machines.

 

Now, let me reverse this discussion. What happens when a super-intelligent machine [2] decides that humanity is no longer necessary? Is it then moral to destroy these intelligent machines to preserve humanity? Is morality situational?

 

Or is morality just so much cotton candy or dandelion fluff? Plato [3,4,5], Nietzsche [6], Machiavelli [7] and philosophers today continue to debate the origin and intent of "morality". The most popular positions are that the concept of morality was developed:

  • by the weak to curb the abuses of the strong.
  • by the minority to inhibit the abuses of the majority.
  • by parishioners to diminish the power of the clergy.
  • by the poor to induce charity from the rich.
  • by slaves to persuade their overseers to give them more food.

Whatever the origins and intent, the impact of "morality" is with us today. It exists in 'political correctness' and 'Black Lives Matter' and 'All Lives Matter' and 'life begins at fertilization' and 'fetuses are parasites' and affirmative action and terrorism and the war on drugs and the war on poverty. The Magna Carta was an attempt to impose morality on royalty. The United Nations Universal Declaration of Human Rights is simply an attempt to define 'universal morality'. The entire Bill of Rights appended to the United States Constitution is nothing but an attempt to legislate morality. (DO NOT infer that I oppose or endorse any of those listed here. They are listed here solely as examples of our 'morality'.)

 

A man blows himself up in a bar. He believes that his cause is just and that his actions are moral. The man believes himself moral because he opposes the oppression of his particular minority by the majority and millions who agree with the man are adamant that the man is moral. The 34 injured and the families of the 15 dead have an entirely different opinion. To them, the man is immoral because he killed random people who were not actively oppressing him. This one man is both moral and immoral for the same act.

 

So for me, it is simple. Just as "One man's religion is another man's belly laugh" [8], so it is with morality.

 

 

1. WWII casualties.

2. N. Bostrom, Superintelligence: Paths, Dangers, Strategies

3. Plato, The Republic

4. Plato, Gorgias

5. Plato, Five Dialogues

6. F. Nietzsche, Thus Spake Zarathustra

7. Niccolò Machiavelli, The Prince

8. Robert A. Heinlein, The Notebooks of Lazarus Long


 

 

 


 

 

It was not meant to be rude at all. Poking fun a little, yes. Your post does indeed read as a True Blue BoS.

 

Again, not meant to be rude - was just having fun. :smile:

 

My "Don't be rude" was not a angry statement. It was more the gentle chiding that one would give a child for being ill-mannered toward someone they did not know. :tongue:


Is there a difference between Commander Data from ST: TNG and Synths?

 

 

<snip>

 

 

 

No, not really. And the reason is simple. By definition, neither Data nor Synths are living beings. The dead end for both is:

 

Reproduction: The ability to reproduce and pass genetic information on to their offspring.

 

Without this ability, even a sentient humaniform machine is just clockwork and a battery.


 


 

 

Your logic doesn't make much sense to me. The page you link to gives a purely technical definition that is meant to separate the animal kingdom from the mineral kingdom, and is not suitable as a basis from which to derive morality. According to your logic, a sterile human being is not deserving of moral considerations.

 

You also ignore the possibility that that definition would be expanded if we had Data-like androids running around. That definition only defines life as we know it, for the purpose of categorizing what currently exists in our sphere of experience, which excludes sentient androids and gen-3 synths. As far as I'm concerned, you are unjustifiably re-purposing that definition to fit your preconceived idea, and you are irrationally hanging on to a technicality that is irrelevant in a world that has non-animal sentient beings.


Is there a difference between Commander Data from ST: TNG and Synths?

 

 

 

I love this reference!

 

 

You guys added another layer of complexity to the "synth dilemma" that is affecting the choices in my current playthrough. I used to be pro-Railroad, anti-BoS, but now I'm not so sure. To drop another Star Trek metaphor, the BoS rhetoric about synths being inherently better than humans reminded me of the Eugenics Wars and how strictly gene tampering is regulated in Federation space.


 

 


 

 

Fortunately for me, I am not responsible for your inability to comprehend a simple premise. Neither am I bound by your contortions of my comments so they affirm your lack of understanding. Nor can I be held accountable for your attempts to dismiss a scientific article entitled "Characteristics of living things" because it does not fit what you wish to be true.

 

You may twist the definition of "alive" to include chemical reactions and nuclear explosions and collapsing stars and the rapidly expanding universe and the entirety of the cosmos if it will help you feel better. But such convolutions do not conform to the reality of biology.

 

I look at what exists and contemplate the future based on reality. I do not indulge in flights of fantasy or wishful thinking (not often at least). I do not delve into the fantastic or postulate 'what if' as fact. I eschew the existential and esoteric. You are allowed to do otherwise. When contemplating the future, there should be no limits on one's imagination. However, what one is not allowed to do is violate the laws of physics or the nature of biology.


 

 

 


 

 

>You may twist the definition of "alive" to include chemical reactions and nuclear explosions and collapsing stars and the rapidly expanding universe and the entirety of the cosmos if it will help you feel better.

You're making a caricature of what I said by grossly exaggerating it, effectively putting words in my mouth. My first argument was that your rigid use of a scientific definition to inform your morality is unreasonable, when we're talking about intelligent, sentient, self-aware entities (especially when they're capable of experiencing physical and emotional pain). Apparently it's my second argument you have a problem with here, which states that science has not yet had the chance to seriously consider the issue of artificial life because it doesn't exist at this time, and that the definition of life might be different if there was such a thing as sentient artificial intelligence interacting with the environment through an artificial body. I did not twist any definitions, and I agree that the definition of life that you linked is correct.

 

>I look at what exists and contemplate the future based on reality. I do not indulge in flights of fantasy or wishful thinking (not often at least). I do not delve into the fantastic or postulate 'what if' as fact.

I find that hard to take seriously, when on this thread you just gave your opinion on the morality of a fictional situation that happens in a fictional story set in a fictional universe.

 

>When contemplating the future, there should be no limits on one's imagination. However, what one is not allowed to do is violate the laws of physics or the nature of biology.

OK, but what I've said violates neither the laws of physics, nor the nature of biology. I can't imagine how you figure that it does.

