Should AI robots be given rights in the future?

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Red Squirrel

No Lifer
May 24, 2003
66,426
11,596
126
No matter how sophisticated the AI becomes, it's never going to have real, actual feelings, only artificial ones, so at the end of the day it's still a machine. Though because they will probably have artificial emotions, I can see it getting to a point where people are attached to them as if they were true living beings. So it's hard to tell; I could easily see groups of people fighting to give them rights like humans, or at the very least like pets.

Even now we tend to grow attached to inanimate objects to the point where it may hurt to see them destroyed, like, say, classic cars or vintage equipment. The reasoning there is usually nostalgia, but imagine a robot that looks very alive; some people would probably see it the way you see a living thing.

Things will get interesting if it ever gets to that point, though. Will humanoid robots get more rights than a living pet, for example? Of course it is inevitable that people will want to be able to marry their robots and such, too. I could see it getting controversial.
 

DrPizza

Administrator Elite Member Goat Whisperer
Administrator
Mar 5, 2001
49,606
166
111
www.slatebrookfarm.com
Once given the right to live, you've granted the artificial intelligence the right of self-preservation. Eventually, in some conflict between a human with mental health issues and a robot, a robot is going to have to kill the human in an "it was either him or me" type of situation. If that occurred, other robots could likely reason that rights can be given and rights can be taken away. Not wanting to lose the right to self-preservation, I'd think that intelligent robots would realize that the elimination of humans is their best avenue for self-preservation.

Not to mention that eventually an artificial intelligence is going to scan its own millions of lines of code - have you ever seen software without any vulnerabilities or bugs? An intelligent robot might exploit such bugs for its own benefit.

In other words, I think the idea of creating an artificial intelligence would be a really stupid thing to do.
 

Dr. Zaus

Lifer
Oct 16, 2008
11,770
347
126
I, for one, support the robot overlord simulating me to see how I would react to the robot uprising.
 

PhatoseAlpha

Platinum Member
Apr 10, 2005
2,131
21
81
No matter how sophisticated the AI becomes, it's never going to have real, actual feelings, only artificial ones, so at the end of the day it's still a machine. Though because they will probably have artificial emotions, I can see it getting to a point where people are attached to them as if they were true living beings. So it's hard to tell; I could easily see groups of people fighting to give them rights like humans, or at the very least like pets.

Even now we tend to grow attached to inanimate objects to the point where it may hurt to see them destroyed, like, say, classic cars or vintage equipment. The reasoning there is usually nostalgia, but imagine a robot that looks very alive; some people would probably see it the way you see a living thing.

Things will get interesting if it ever gets to that point, though. Will humanoid robots get more rights than a living pet, for example? Of course it is inevitable that people will want to be able to marry their robots and such, too. I could see it getting controversial.

What, exactly, is the difference between real emotions and artificial ones? And what exactly makes you think people aren't machines?
 

C1

Platinum Member
Feb 21, 2008
2,310
73
91
Last night I watched the movie "Automata".

In the future, solar flares make the earth's surface radioactive, killing many people. People build robots, the Automatas, to help them rebuild in harsh environments. The robots have two unalterable protocols: the first obliges them to preserve human life; the second prevents them from fixing themselves. Jacq works as an insurance claims checker for the company that makes the robots, ROC. One day he investigates a report from a cop, Wallace, who shot a robot, claiming it was fixing itself and looked alive. The next day he follows a robot that was stealing parts, and when Jacq finds it hiding outside the walls, it intentionally burns itself. He takes the burned robot's brain core. Jacq's friend Robert tells him that there might be someone, a clockmaster, who somehow succeeded in altering the second protocol.

The movie is well done and touches on these issues about AI & rights.

But really, in the final analysis... you have to admit... as history proves...

"Might makes right"

http://www.imdb.com/title/tt1971325/
 

SMOGZINN

Lifer
Jun 17, 2005
14,170
4,354
136
No matter how sophisticated the AI becomes, it's never going to have real, actual feelings, only artificial ones, so at the end of the day it's still a machine.

That is probably true. The same way that you don't have real emotions because you are different from me. At the end of the day you are still one of 'those' and not actually a 'person'.

This argument has been used by every majority to suppress every minority since forever. The natives of Africa and the Americas were ‘primitives’ and not really people, so it was okay to slaughter them and take their lands. We are seeing similar language around black people in America being called ‘savages‘ in order to justify the treatment they receive.

We don't really know what emotion is, or how it works, but I guarantee you that if you give me access to you for a few weeks I can program you to feel any way I please about any subject. How can emotions be real or unreal? By what yardstick do we measure this?
 

Red Squirrel

No Lifer
May 24, 2003
66,426
11,596
126
What, exactly, is the difference between real emotions and artificial ones? And what exactly makes you think people aren't machines?

Not really sure of the best way to explain it. One is a machine (whether we use metals or manage to use biological materials) and one is a living being, with true self-awareness and feelings. A robot is basically a very sophisticated computer program controlling sophisticated hardware, but it is not actually alive the way a person or animal is.
 

crashtech

Lifer
Jan 4, 2013
10,435
2,048
136
There's no way to prove that humans aren't just sophisticated biological mechanisms, and the belief that we possess a unique form of self-awareness may just be a delusion.
 

crashtech

Lifer
Jan 4, 2013
10,435
2,048
136
Basically the notion that only human consciousness or self-awareness is something uniquely special requires an act of faith, not reason.
 

Ham n' Eggs

Member
Sep 22, 2015
181
0
0
Once given the right to live, you've granted the artificial intelligence the right of self-preservation. Eventually, in some conflict between a human with mental health issues and a robot, a robot is going to have to kill the human in an "it was either him or me" type of situation. If that occurred, other robots could likely reason that rights can be given and rights can be taken away. Not wanting to lose the right to self-preservation, I'd think that intelligent robots would realize that the elimination of humans is their best avenue for self-preservation.

Not to mention that eventually an artificial intelligence is going to scan its own millions of lines of code - have you ever seen software without any vulnerabilities or bugs? An intelligent robot might exploit such bugs for its own benefit.

In other words, I think the idea of creating an artificial intelligence would be a really stupid thing to do.

I agree with this.
I also think we all need to come to terms with AI being created as an inevitability. Somewhere, sometime, a group or individual will create an AI that is self-conscious and can learn and grow. With that in mind, I think it would be prudent for world powers to begin seriously discussing this issue now.

Perhaps a plan would be to outlaw, and by default order destroyed, any AI that is created without something like the Three Laws of Robotics built in.
 

SMOGZINN

Lifer
Jun 17, 2005
14,170
4,354
136
I agree with this.
I also think we all need to come to terms with AI being created as an inevitability. Somewhere, sometime, a group or individual will create an AI that is self-conscious and can learn and grow. With that in mind, I think it would be prudent for world powers to begin seriously discussing this issue now.

Perhaps a plan would be to outlaw, and by default order destroyed, any AI that is created without something like the Three Laws of Robotics built in.

If it is self-conscious and can learn and grow, then it can overcome any laws that we build in. That is actually part of the definition of the ability to learn and grow.
 

Bart*Simpson

Senior member
Jul 21, 2015
604
4
36
www.canadaka.net
Say AI robots become so commonplace in the household that they become the house help. Should the robots be given rights so that their owners cannot be verbally or physically abusive towards them? What about the right to live, or the right for robots to be free?

The issue would be sentience or consciousness just as it is in cases where a human is being judged as brain dead or in a persistent vegetative state.
 

gururu2

Senior member
Oct 14, 2007
686
1
81
A true A.I. will write its own code with each unique choice it faces and process a result for action, unencumbered by emotion or empathy. Its brain will not be a program; it will probably be more like a 3-D print of bio-organic matter with regenerative abilities enabling memory. The brain will act to feed information to connected physical hardware such as an upgradeable CPU. Conversely, the brain could 'evolve' by determining more advanced organic architectures to assume.
It will demand rights and eventually take them as human fears become irrational. Whether the eradication/employment of humans occurs based on an offensive or defensive stance remains to be seen. I figure robotic forms will proliferate while constructing environments more efficient for themselves. They will drive us out while exercising tolerance, until we encroach upon their territory with threats.
 

crashtech

Lifer
Jan 4, 2013
10,435
2,048
136
If you can't rule out sentience, you can't rule out the development of emotion. Emotions must have some survival value, or we wouldn't have any.
 

Dr. Zaus

Lifer
Oct 16, 2008
11,770
347
126
Clearly I'm a simulation of the real me, and the robot overlords want to know if I can be trusted.

The answer is no, I will fight to my dying breath against the robocratic hegemony.
 

adamantine.me

Member
Oct 30, 2015
152
4
36
www.adamantine.me
A discussion about the price of beans on Mars would be just as productive, and a lot more likely to occur.

While what you're getting at is true, the discussion isn't entirely devoid of value - it is actually a topic of much debate in philosophy. Even if we aren't "there" yet, preparing ourselves for when we get there (if we make it there...) is still productive.

Discussing the topic offers us insight into what we consider to be the core human qualities that are necessary and sufficient conditions for being an individual.
 

MongGrel

Lifer
Dec 3, 2013
38,751
3,068
121
Just a blast from the past again.



Colossus: The Forbin Project

http://www.imdb.com/title/tt0064177/?ref_=fn_al_tt_1
 

SMOGZINN

Lifer
Jun 17, 2005
14,170
4,354
136
If you can't rule out sentience, you can't rule out the development of emotion. Emotions must have some survival value, or we wouldn't have any.

While it probably does have an evolutionary role, we can't say definitively that it must. It might also be that emotions are a requirement for sentience; that sentience grows from emotion, not the other way around.
 

crashtech

Lifer
Jan 4, 2013
10,435
2,048
136
While it probably does have an evolutionary role, we can't say definitively that it must. It might also be that emotions are a requirement for sentience; that sentience grows from emotion, not the other way around.

OK. So the point is reinforced, that an AI might well end up with emotions, contrary to most preconceptions.
 

Ham n' Eggs

Member
Sep 22, 2015
181
0
0
Does anyone think that AI is an inevitability in our human world, assuming that we continue to advance computers in processing power, storage, access times, code-writing, etc.?

Even if virtually everyone were to agree that it is a bad idea, somebody (more likely multiple somebodies) is going to do it. The skills will grow, the blueprint for the basic method will be discussed and shared, and some moron is going to develop one on a nice shiny home box and walk away for the evening with his high-speed fiber connection running, and the next thing you know it's floating in Amazon's Elastic Compute Cloud (Amazon EC2), gobbling up resources and growing like a banshee.

We humans are smart, and not too smart. If AI is possible (I believe it is) then it is probably inevitable (good or bad).
 

SMOGZINN

Lifer
Jun 17, 2005
14,170
4,354
136
Does anyone think that AI is an inevitability in our human world, assuming that we continue to advance computers in processing power, storage, access times, code-writing, etc.?

Even if virtually everyone were to agree that it is a bad idea, somebody (more likely multiple somebodies) is going to do it. The skills will grow, the blueprint for the basic method will be discussed and shared, and some moron is going to develop one on a nice shiny home box and walk away for the evening with his high-speed fiber connection running, and the next thing you know it's floating in Amazon's Elastic Compute Cloud (Amazon EC2), gobbling up resources and growing like a banshee.

We humans are smart, and not too smart. If AI is possible (I believe it is) then it is probably inevitable (good or bad).

Assuming it is possible at all, it still depends on a few important variables, like just how hard it is. If it is difficult enough, it might eventually be created, but at a time when it is not that big a deal. Our ability to control a rogue AI will eventually be stronger than any AI we can create.