
Discussion about the rights of androids


Northwain


But it also depends on the biology of the synths... is there a partial brain in there somewhere, hormones, nerves able to transmit pain impulses? Or is it all just hardware and software running a program that imitates human patterns?

An AI, even if unique and self-aware, can only simulate human emotions; it can't fully understand them or suffer from them unless it's partially organic, unlike a dog or a chimp.

I, Robot had the Three Laws, which in a way make sense as synth rights, but I still wouldn't consider wasting a human life to defend a synth, even one that can't be replicated or transferred to another body.

Let's say you lose an arm and replace it with a prosthesis that reproduces the electrical impulses previously generated by your biological arm. The prosthesis is sophisticated and advanced enough that you don't notice any difference from your previous arm.

Does that give you fewer rights than a person with a fleshy arm? Of course not.

 

Now let's take that thought experiment a bit further. Let's say you lose a few neurons in your brain. Not enough to render you dysfunctional, but enough to cause some minor impairment. So you implant a chip with a hypothetical artificial neural network that can simulate those neurons, interface with your biological ones, and restore your old cognitive capabilities. Do you have fewer rights now than before? I'm sure you'll agree that this would not make much moral sense.

 

But now, over time, you gradually replace all your old biological neurons with simulated ones that transmit information in exactly the same way as the biological ones. We don't have artificial neural networks today that can do this, but suppose that by 2277 we do. You also replace your old meaty organs and limbs with more robust synthetic ones.

Where is the point at which you stop being human? After the first artificial neuron? After the last?
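Here's a toy sketch of that functionalist point (my own illustration, with made-up `bio_neuron`/`synthetic_neuron` functions, not anything from the games): if the replacement computes exactly the same input-output mapping, the network's observable behavior is unchanged no matter how many units you swap, so there is no step at which anything detectable changes.

```python
# Toy sketch of the gradual-replacement argument (illustrative only).
# Each "neuron" is just a function from inputs to an output; the synthetic
# one is defined to compute the identical mapping as the biological one.

import math
import random

def bio_neuron(inputs, weights):
    """Hypothetical biological neuron: weighted sum through a sigmoid."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1.0 / (1.0 + math.exp(-total))

def synthetic_neuron(inputs, weights):
    """Drop-in replacement computing the exact same function."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1.0 / (1.0 + math.exp(-total))

random.seed(2277)
weights = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(5)]
inputs = [0.2, -0.7, 0.5]

neurons = [bio_neuron] * 5          # start fully biological
baseline = [f(inputs, w) for f, w in zip(neurons, weights)]

for i in range(5):                  # replace one neuron at a time
    neurons[i] = synthetic_neuron
    outputs = [f(inputs, w) for f, w in zip(neurons, weights)]
    assert outputs == baseline      # behavior is identical at every step
```

Of course this only shows behavioral equivalence; whether the subjective experience carries over is exactly what's at issue in this thread.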

 

What we consider "actual" consciousness versus artificial consciousness is a false dichotomy, in my opinion. There is no difference between simulated emotions and actual emotions if the simulation captures the same patterns as the real emotional processes in the brain.

The suffering would be just as real, and to say otherwise is highly morally questionable in my view.

Edited by Bitflip

 


But that's why I asked about their physiology... Do they have an organic brain? Do we know that for sure? You can replace body parts, but you can't replace your brain. The other organs don't really matter, as they're just executing a function, but without an organic brain, if they are pure AI, they aren't really feeling any of what they "think" they are feeling, only running a program that replicates human behavior.

 

You mention replacing biological neurons with synthetic ones, but in theory you can only enhance the brain, not totally replace it, or you will cease to exist. A fully organic body with an AI is still a robot. A fully robotic body with a human brain is still human. According to the Fallout Wiki, synths are artificial intelligence units designed to look, function, and behave like humans. But they're still completely AI... just a program running its functions. That doesn't mean they don't have the right to exist or protect themselves, but it's all simulated emotion and behavior, not real emotion, just like that mad toaster in OWB.

 

I want to avoid spoilers, but if in-game they explain or hint that synths have some kind of organic brain, I may change my mind. A full AI, in the end, is still just a program.

Edited by avallanche

 


Our brain is just a program running its functions as well; it's just that its hardware is organic rather than silicon-based. The matter the hardware consists of is irrelevant if the functionality is the same.

 

I don't agree with this sort of organic chauvinism :tongue:

 

For the reasons I outlined before, your statement "but it's all simulated emotion and behavior, not real ones" does not, in my opinion, make any sense.

There is no qualitative difference between simulated and "real" emotions if the simulation is sufficiently comprehensive. We can't even know for sure that our own emotions are not simulated, so who are we to say that an emotion whose origin is known to be synthetic should be regarded as qualitatively different from ours?


Well, our brain is a computer as well. You could, at least in theory, build a computer out of rocks, so building one out of organic matter is definitely conceivable.

See https://en.wikipedia.org/wiki/Wetware_computer.
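To make the "computer out of rocks" point concrete, here's a minimal sketch (a standard logic exercise of my own, not taken from that link): anything that can physically implement a NAND gate, in any substrate, can implement all of Boolean logic, and from there arbitrary computation.

```python
# A NAND gate is functionally complete: whatever physically implements
# NAND (silicon, rocks, wetware) can implement any Boolean circuit.

def nand(a, b):
    return 1 - (a & b)

def not_(a):     return nand(a, a)
def and_(a, b):  return not_(nand(a, b))
def or_(a, b):   return nand(not_(a), not_(b))
def xor(a, b):   return or_(and_(a, not_(b)), and_(not_(a), b))

def half_adder(a, b):
    """One-bit addition built entirely out of NAND gates."""
    return xor(a, b), and_(a, b)   # (sum, carry)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        assert s == (a ^ b) and c == (a & b)
```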

 

I guess the question comes down to whether or not the brain can be simulated by a Turing machine. That leads to the question of whether the brain is computable, which in turn raises the question of whether a subset of the physical universe is computable.

To date, there is no real reason to think it isn't, and no reason to believe the brain is non-deterministic.

Of course, we don't know this for sure - I'd just rather err on the side of caution and treat a hypothetical simulated brain no differently than ours.
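For anyone wondering what "simulated by a Turing machine" actually cashes out to, here's a minimal interpreter (a textbook-style toy of mine, nothing brain-specific): a finite rule table plus an unbounded tape. The claim that the brain is computable amounts to saying that some such table, however astronomically large, reproduces its input-output behavior.

```python
# Minimal Turing machine interpreter. "The brain is computable" would mean
# some (astronomically large) rule table like this reproduces its behavior.

from collections import defaultdict

def run(rules, tape, state="start", head=0, max_steps=10_000):
    """rules: (state, symbol) -> (new_state, write_symbol, move in {-1, 0, +1})."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # unbounded blank tape
    for _ in range(max_steps):
        if state == "halt":
            break
        state, cells[head], move = rules[(state, cells[head])]
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Example machine: flip every bit in a binary string, halt at the blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run(flip, "10110"))  # -> 01001_
```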

Edited by Bitflip

Well, I will hold my thoughts for now until I have played the game, but I frankly don't agree with you on this... if it's not organic it's not alive, and if it's not alive it can't really feel anything other than what it was programmed to "feel". It's an illusion of freedom, feelings, and even self-awareness in an essentially inanimate being that doesn't even comprehend the information it's processing... could a synth feel enough sadness to commit suicide? Would it comprehend that concept? I don't think so, nor do I think synths could even die, since they are not alive...

Regulation laws? Yes
Right to protect itself? Maybe
Full Civil Rights? No
Waste my life to save one? Hell No.


I think this hinges on the definition of "alive", and of life in general, which is poorly defined and not at all agreed upon.

So drawing an ethical conclusion from a poorly defined concept seems rather dangerous to me.

 

As far as the "illusion of freedom, feelings and even self awareness" goes, I'd argue that this is no different in a deterministic biological brain. Note that this in no way diminishes the actual subjective experience of pain and joy, and therefore it is irrelevant to an ethical value system that uses these experiences as axioms to determine what should be deemed good or bad.


I feel like I agree with most of those who have posted here. Truly, if someone or something can tell right from wrong, can choose between the two, and can feel empathy, then it should be seen as a sentient being and therefore given rights as such.

 

P.S. Love the Star Trek clip; it definitely helps put some things into perspective.

