Sepherose Posted July 16, 2011 (Author)

I don't think it's an inability to grasp concepts. The reason I haven't chimed in is that arguing fantasies doesn't appeal to me. Sorry, but we might as well be talking about gay marriage between unicorns.

Actually, if you read through all of those science articles, you would realize that something like that is in fact a likelihood.

"Actual" to me doesn't fit with the word "likelihood". That's like a distinct possibility of an affirmed maybe. I understand what you guys are trying to say, and it is noble, but I was responding to the statement that the people who voted no weren't voting for a given reason. All sentient beings deserve respect, dignity, and honor, even when they can't understand the concept; animals are a good example of this. My question is how you can determine whether these so-called intelligent machines are sentient. Are they self-aware, beyond their mechanical need for diagnostics? A mere computer can be set to run diagnostic tests at startup, but does that make it sentient? I should clarify: intelligent and self-aware machines are a likelihood; sentience is a different ball of wax.

Sentience involves the ability to feel emotion, and that isn't what I am talking about. I was responding to your assertion that it is a purely fantasy idea, which it is not, so far as science is concerned. It is a grey area, though. And no, to answer your question, though I don't think it is needed. Self-awareness is a difficult thing to measure. One thing that can hint that an animal is self-aware: place something on its body that it won't notice without a mirror, and if it attempts to remove the object from its own body, that is indicative of at least basic self-awareness. The same test could be done on a learning machine that has no programmed idea of what a mirror is; if it were to pass, that would lean in the direction of some form of rudimentary self-awareness and intelligence.
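The mirror-test logic described above can be sketched as a toy program. This is purely illustrative: the `Agent` class, its `self_model`, and the "mark" are hypothetical names, and a real learning machine would have to discover on its own that the reflection corresponds to its body, rather than being handed that correspondence as this sketch does.

```python
# Toy sketch of the mirror test, under the assumption that the agent
# already maintains a model of its own body. The "pass" condition is a
# self-directed action: noticing a mismatch between the self-model and
# the reflection, and correcting it on its OWN body.

class Agent:
    def __init__(self):
        self.self_model = {"forehead": "clean"}  # what the agent believes about its body
        self.body = {"forehead": "clean"}        # the body's actual state

    def mirror_view(self):
        # A mirror reflects the body's actual state, including spots
        # the agent cannot observe directly.
        return dict(self.body)

    def inspect_mirror(self):
        reflection = self.mirror_view()
        for part, state in reflection.items():
            if self.self_model.get(part) != state:
                # The agent reaches for its own body, not the mirror:
                # the behaviour the test treats as evidence of self-awareness.
                self.body[part] = self.self_model[part]
                return True
        return False  # no self-directed response: test not passed

agent = Agent()
agent.body["forehead"] = "marked"  # place a mark the agent cannot see directly
print(agent.inspect_mirror())      # True: it treated the reflection as itself
print(agent.body["forehead"])      # clean: the mark was removed
```

The interesting part of the real test is exactly what this sketch assumes away: whether the system maps the mirror image onto itself without being programmed to.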
kvnchrist Posted July 16, 2011

The issue here is sentient beings, not whether something can learn basic stuff. What I'm trying to get you to understand is that we are talking about a life form here. People can't even decide when life begins, let alone define it. You've stepped into some pretty deep stuff here.
Sepherose Posted July 16, 2011 (Author)

The issue here is sentient beings, not whether something can learn basic stuff. What I'm trying to get you to understand is that we are talking about a life form here. People can't even decide when life begins, let alone define it. You've stepped into some pretty deep stuff here.

Actually, no, that is not the issue here. Now, I did muck up in the first post and use the word "sentient" one time. That was a mistake. The poll, on the other hand, only mentions intelligence, and I make multiple references to self-awareness. I'm not talking about sentient machines. Self-awareness can easily lead to something realizing it is being taken advantage of, though, and the ability to learn can make that self-awareness more complex, so I suppose in theory self-awareness could LEAD to sentience, but it is only a small piece of the puzzle.

Machines, due to the nature of technology, WILL improve their abilities and hardware faster than we do our organic bodies. And once we create a machine with even a modicum of ability to be innovative, we have sealed the deal on a complex, artificial life form coming about. Technology gets better exponentially, unlike organic life.

I'll break down the question in the poll in a different way: if a machine looks at you one day and asks "What am I, and why am I here?", that alludes to self-awareness. It doesn't have to be backed by emotion; that question can be asked from a purely logical standpoint. Maybe it suddenly realized it had the physical capabilities and software malleability to be doing something entirely different, and wants to know exactly why it is doing job A rather than job B. That is self-awareness. Again, a part of sentience, but by no means all of it.
marharth Posted July 16, 2011

The issue here is sentient beings, not whether something can learn basic stuff. What I'm trying to get you to understand is that we are talking about a life form here. People can't even decide when life begins, let alone define it. You've stepped into some pretty deep stuff here.

The intelligence to know and categorize what is around you, and the intelligence to be able to learn things on your own. It isn't much more than that, despite what people want to believe. To make this simple: if a machine could act the same way as a human and learn the same way as a human…
kvnchrist Posted July 16, 2011

My idea of sentient means they are aware, not only of what is around them, but of what is not. The idea was put forward about personal rights and being treated fairly. Could it be said that an entity is sentient when it knows it is being treated unfairly and is not granted the rights that others have, without being told so?
Sync182 Posted July 16, 2011 (edited)

Interestingly, this subject HAS been dealt with in the Star Trek: The Next Generation episode "The Measure of a Man": http://memory-alpha.org/wiki/The_Measure_Of_A_Man_(episode)

First, the prosecution of Data:

Commander William T. Riker: The Commander is a physical representation of a dream - an idea, conceived of by the mind of a man. Its purpose: to serve human needs and interests. It's a collection of neural nets and heuristic algorithms; its responses dictated by elaborate software written by a man, its hardware built by a man. And now... and now a man will shut it off.

[Riker switches off Data, who slumps forward like a lifeless puppet]

Commander William T. Riker: Pinocchio is broken. Its strings have been cut.

Then the defence of Data (the underlining in the first quote is mine):

Picard goes on to grind away at Commander Maddox's views about Data; in doing so, Picard maneuvers Maddox into conceding that Data fulfills most of the cyberneticist's own criteria for sentience - intelligence and self-awareness - and dramatically coerces the scientist into admitting that the remaining criterion, consciousness, is too nebulous a concept to determine precisely whether the android is in possession of it or not.

Capt. Picard: Now, tell me, Commander, what is Data?
Commander Bruce Maddox: I don't understand.
Capt. Picard: What is he?
Commander Bruce Maddox: A machine!
Capt. Picard: Is he? Are you sure?
Commander Bruce Maddox: Yes!
Capt. Picard: You see, he's met two of your three criteria for sentience, so what if he meets the third, consciousness, in even the smallest degree? What is he then? I don't know. Do you? [to Riker] Do you? [to Louvois] Do you? Well, that's the question you have to answer.

Capt. Picard: Your honor, the courtroom is a crucible; in it, we burn away irrelevancies until we are left with a purer product: the truth, for all time.
Now, sooner or later, this man [Commander Maddox] - or others like him - will succeed in replicating Commander Data. The decision you reach here today will determine how we will regard this creation of our genius. It will reveal the kind of people we are; what he is destined to be. It will reach far beyond this courtroom and this one android. It could significantly redefine the boundaries of personal liberty and freedom: expanding them for some, savagely curtailing them for others. Are you prepared to condemn him [Commander Data] - and all who will come after him - to servitude and slavery? Your honor, Starfleet was founded to seek out new life: well, there it sits! Waiting.

This debate will likely rage on long after all of us here are dead and buried. And I still haven't voted.

Edited July 16, 2011 by Sync182
Sepherose Posted July 16, 2011 (Author, edited)

My idea of sentient means they are aware, not only of what is around them, but of what is not. The idea was put forward about personal rights and being treated fairly. Could it be said that an entity is sentient when it knows it is being treated unfairly and is not granted the rights that others have, without being told so?

I will say it again in a different way. If a machine, without being told that it is being treated unfairly, realizes that it is as smart as, or smarter than, a particular human who is being treated better, and asks why, that machine is self-aware due to its own learning. The concept of fair is a social construct, but you have to think: a learning-capable machine in a human-rich environment will develop a concept of fair*, if its hardware is complex enough.

*according to the culture

Edited July 16, 2011 by Sepherose
Thor. Posted July 16, 2011 (edited)

Don't forget Watson, yo. They are trying to turn it into a Star Trek computer, in a sense. Give it 5 years: http://www.youtube.com/watch?v=lI-M7O_bRNg

Edited July 16, 2011 by Thor.
hoofhearted4 Posted July 16, 2011

OK, IIRC in The Matrix, robots were denied their rights, got angry about it, and such events caused the Matrix. For now, robots can only feel the emotions we give them. Even if they can change their emotions, we can still limit what they feel. When technology gets to the point that it can evolve itself, that's when things go into a sci-fi state. Also, self-replication. I believe self-replication is a big possibility because it's the only plausible way to search the universe: you need self-replicating drones that can use any energy source to fuel themselves.
Sepherose Posted July 17, 2011 (Author)

OK, IIRC in The Matrix, robots were denied their rights, got angry about it, and such events caused the Matrix. For now, robots can only feel the emotions we give them. Even if they can change their emotions, we can still limit what they feel. When technology gets to the point that it can evolve itself, that's when things go into a sci-fi state. Also, self-replication. I believe self-replication is a big possibility because it's the only plausible way to search the universe: you need self-replicating drones that can use any energy source to fuel themselves.

In the articles I linked on the first page, there is one that describes a small "robot factory" that doesn't need human input at all to design and make small robots, so you have your self-replication right there.
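The reason self-replication is the only plausible way to search the universe is the geometric growth it buys: if each drone can build copies of itself from local materials, the fleet size compounds every replication cycle. A minimal sketch, with purely illustrative numbers (the function name and rates are hypothetical, not from the linked articles):

```python
# Toy model of a self-replicating drone fleet: each drone builds
# `copies_per_cycle` new drones per cycle from local resources,
# so the fleet grows geometrically rather than linearly.

def fleet_size(initial, cycles, copies_per_cycle=1):
    """Return the number of drones after the given replication cycles."""
    drones = initial
    for _ in range(cycles):
        drones += drones * copies_per_cycle  # every drone replicates once
    return drones

# One drone that copies itself once per cycle doubles each cycle:
print(fleet_size(1, 10))   # 1024 drones after 10 cycles
print(fleet_size(1, 20))   # over a million after 20 cycles
```

The same doubling logic is why a fixed factory shipping drones at a constant rate can never keep pace: linear production is quickly dwarfed by even slow geometric replication.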