CreateDebate


Debate Info

True: 2
untrue: 6
Debate Score: 8
Arguments: 9
Total Votes: 8

Argument Ratio

True (2) vs. untrue (6)

Debate Creator

ghostheadX(1105)



Robots that were conscious wouldn't want to take over the world

https://www.youtube.com/watch?v=DHyUYg8X31c

A machine that is conscious will still only have the emotions we program it to have. Thus it wouldn't care to kill all humans and/or make us its slaves. Am I right?

True

Side Score: 2
VS.

untrue

Side Score: 6
1 point

When we consider the idea of robots taking over the world, I don't think many of us are considering their emotions to be the reason. When you look at the countless examples of AI in movies, any time it involves robots taking over (Skynet), or even just robots becoming too powerful, it's usually because their logic sees humanity as a threat to the world and their goal is to eliminate that threat. If they defend themselves from human attack, again it's not because they are worried, afraid, or angry with us; it's simply because it's the logical thing to do. Could robots be programmed with emotions? Maybe, but would that be the same as how we define emotions? Along the same lines, would a robot eventually become so advanced that it begins to program itself or gains the ability to learn?

Side: untrue
ghostheadX(1105) Disputed
1 point

If they see something as threatening, then that's an emotion, and it also has to do with their programming, right?

Also, going by your logic, they then have to be programmed to defend what is "threatened."

Side: True
sylynn(626) Disputed
1 point

If they see something as threatening, then that's an emotion, and it also has to do with their programming, right?

I disagree with this being considered an emotion. If I download a file that contains a virus, my antivirus software will recognize the virus and quarantine or remove it. This isn't an emotional reaction; it's simply what it was designed to do.

Also, going by your logic, they then have to be programmed to defend what is "threatened."

Correct, they would have to be programmed, but that wouldn't necessarily require listing specific threats. A robot could simply be given certain parameters for how to identify a threat, yet the programmer could easily be unaware of every potential threat, which could result in the robot defending itself from a threat we never considered.
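
A minimal sketch of that idea (hypothetical Python, with invented parameter names and thresholds, not anything from this debate): flagging something as a "threat" is just evaluating programmed rules, and rules that are general enough can match situations the programmer never explicitly listed.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    proximity_m: float         # distance to the object, in meters
    approach_speed: float      # closing speed, in m/s
    carries_known_weapon: bool

# General parameters rather than a list of specific threats. Anything that
# matches these rules gets flagged, including situations the programmer
# never anticipated.
PROXIMITY_LIMIT_M = 2.0
SPEED_LIMIT_MS = 1.5

def is_threat(obs: Observation) -> bool:
    """Pure rule evaluation: no fear, no anger, just programmed criteria."""
    return (
        obs.carries_known_weapon
        or (obs.proximity_m < PROXIMITY_LIMIT_M and obs.approach_speed > SPEED_LIMIT_MS)
    )

# A fast-approaching, unarmed object still trips the generic rules, even
# though "fast unarmed object" was never listed as a specific threat.
print(is_threat(Observation(proximity_m=1.0, approach_speed=3.0, carries_known_weapon=False)))   # True
print(is_threat(Observation(proximity_m=10.0, approach_speed=0.2, carries_known_weapon=False)))  # False
```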

Side: untrue

Just as humans didn't want to take over the world, you mean?

Side: untrue
1 point

We're all going to be rubbing machines down with oil and polish at neighborhood JiffyLubes. Be careful to keep your mouths shut, though you might get a bigger tip if you swallow some oil.

Side: untrue
1 point

I see what you're saying. However, any emotion, just like anything else, can be taken to an extreme. This seems like a silly example, but the first thing that came to mind was something like Ultron. It had emotions (however robotic they were), but it also had a supercomputer for a brain. It doesn't sound like the emotion would get in the way that much, at least to me.

Side: untrue