Farm animals don't have a "right to life" but if we find a young boy torturing a chicken, he is disciplined for it -- not for the sake of the chicken, but for our own good. We simply cannot tolerate people in society who take pleasure in the misery of other living things. Otherwise the boy might direct his murderous tastes to his own kind.
My computer can play CDs, and my microwave prepares food. Does that mean they have rights?
What an ass.
Only if those rights extend to my toaster and iPod.
One electronic gadget is as deserving as any other.
Q. When does artificial intelligence demand humane treatment?
A. Never.
Even posing this question opens the curtain on the irrational projection of human identities onto creatures and things which are not human.
Scientists are unwilling to define WHEN life begins, the better to justify the selfish act of infanticide that is abortion.
Some also have a god complex and want to believe that they have created LIFE; so, to treat their creation with RESPECT, they want scientist-given RIGHTS for robots.
Now we're just waiting for them to force the robots to love them “or else”.
Why don’t we wait until a robot asks us, unprompted, about it?
Machine rights.
Interesting.
A robot is just a collection of parts and a program. No program I ever wrote had or deserved rights of any kind. If we’re going to extend rights to a collection of parts, we might as well extend them to a single part. Or a frying pan.
Come on, come on - certainly there are some machines that deserve humane treatment. My girlfriend calls me The Love Machine, yet she overworks me and constantly overtaxes and abuses my mechanism; doesn’t that rate some sort of sanction?
Be kind to your fridge and it’ll be kind to you!
Right, and before you know it they'll be telling you
"F*** you @$$h0le"
PETER — People for the ethical treatment of every robot.
It also describes in vulgar slang the type of people that would join such a group if it existed.
I suppose the matter depends whether they’re considering moral or legal rights. Even corporations have legal rights, and they’re not exactly human.
On the other hand, moral rights that are not divinely given are rather arbitrary and meaningless, just a matter of taste. You might as well struggle over the notion that chocolate tastes better than vanilla.
I suppose some religions might allow full human rights to non-humans. Which one does the author rely on here?
Do robots deserve rights?
That is an easy one... NO.
Having Rights means also having Responsibilities. This is an often forgotten fact in our “I got Rights, man” society. But citizens being willing to take personal responsibility for their actions is an ESSENTIAL part of maintaining a civilized society where everyone has individual Rights.
When dogs, cats, porpoises, robots and other non-human entities are equipped and willing to accept responsibility for themselves, then, and only then, are they deserving of Rights.
Non sequitur. It is known to be impossible, even in theory, to absolutely predict the position and velocity of a subatomic particle -- by the author's "reasoning" that means that subatomic particles are supernatural.
By this reasoning, a human who gives no outward sign of having "transcended his programming" (e.g. someone raised by bad parents who grows up to be a criminal thug) has no soul and may be treated as eighty kilos of meat that emits 310K blackbody radiation and greenhouse gases.
When they develop consciousness and intellect equal to a human mind... give them freedom.
How about we look at it from an even DIFFERENT perspective, because, whether humans care to think about it or not, or realize it, there ARE other intelligences out there.
Some DAY we’re going to come face-to-face with another race of creatures who are intelligent, perhaps from another star system.
And, whether folks care to admit it or not, dolphins, whales, apes and some other animals exhibit great intelligence, perhaps not quite on par with humans, but definitely an intelligence.
So, it is a foregone conclusion that eventually robotic “entities” will exist, and they will have some sort of intelligence. (On the other hand, I know that my computer is very good at playing chess and beating me quite often, but I’ve no qualms about turning it off, because I understand it is a program with mathematical algorithms... which, theoretically is all our own minds are... but on a protein level rather than an electrical-digital level).
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Later, Asimov added the Zeroth Law: "A robot may not harm humanity, or, by inaction, allow humanity to come to harm"; the rest of the laws are modified sequentially to acknowledge this.
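The three laws above form a strict precedence: a lower-numbered law always overrides a higher-numbered one. Here is a minimal, purely illustrative sketch of that ordering in Python; the function name and boolean flags are hypothetical, not from any real robotics system.

```python
def permitted(harms_human: bool, ordered: bool, endangers_self: bool) -> bool:
    """Decide whether a hypothetical robot may take an action,
    applying Asimov's laws in strict priority order."""
    if harms_human:
        # First Law: an absolute veto, overriding everything below.
        return False
    if ordered:
        # Second Law: a human order stands, overriding self-preservation.
        return True
    # Third Law: absent an order, avoid endangering one's own existence.
    return not endangers_self
```

For example, an order to harm a human is refused (First beats Second), while an order that merely endangers the robot itself is obeyed (Second beats Third).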
In general the concept of rights has not been successfully defined, not even in law. It’s fine to talk about rights, but realize that nobody knows the rules of the game.
I would support giving rights to an intelligent machine capable of making moral decisions, harboring desires, and shaping its own identity...if such a machine could be constructed.
I don't think a machine like that could ever be made, but then again I'm not very educated in computer science. Of course, the best minds in human history have been wrong when it comes to predicting the bounds of human achievement.