CreateDebate


Debate Info

Debate Score: 134
Arguments: 62
Total Votes: 140

Argument Ratio

 Yes, eventually. (38)
 
 No, the brain is too complex. (24)

Debate Creator

HGrey87(750)



Can Artificial Intelligence ever catch up to the level of performance of the human brain?

Yes, eventually.

Side Score: 87
VS.

No, the brain is too complex.

Side Score: 47
6 points

It is entirely possible for a computer to be built with the same level of complexity and capability as the human brain. Nanoscale transistors will allow us to build computers that are as thin as a sheet of paper. With that sort of technology it is entirely possible to design a computer that emulates the human brain's functions. The prerequisite for that design, of course, is a more detailed understanding of how the brain works.

Of course, the next question to ask is whether by "level of performance" we mean consciousness. Most computers can already operate faster than the human brain at certain tasks. So the question is: will Artificial Intelligence ever become self-aware? Most likely it will. Take a look at games like "The Force Unleashed," which uses an AI that deliberately attempts to perpetuate itself in a very human way. Now, I'm not entirely sure how that is done, but from what I understand these decisions are made in real time; they're not scripted in any way. Based on this and many other developments in AI, I would say that yes, Artificial Intelligence probably will become truly intelligent and self-aware.

Side: Yes, eventually.
2 points

I think the big leap will be when we build a program that fixes and improves itself. Keep it chugging away at that, and its power will increase more than exponentially. I direct you all again to the best short story ever, and Asimov's favorite of all he wrote, The Last Question, linked below.

Supporting Evidence: The Last Question - Isaac Asimov (www.multivax.com)
Side: Yes, eventually.
4 points

So I recently found a self-perpetuating program. It's simple, and it certainly doesn't count as a program that self-corrects and improves itself, but it indicates that such a thing is possible. The only problem with this program is that it only works in a virtual computer environment. Yet it is a semi-legitimate programming language...sorta...

Supporting Evidence: Self-propagating program (vyznev.net)
Side: Yes, eventually.
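For the curious, the core trick behind any self-reproducing program is surprisingly small. Here's a minimal Python quine (my own sketch, not the vyznev.net program linked above): a program whose output is exactly its own source code.

```python
# A quine: the template string s gets formatted with its own repr,
# so printing (s % s) reproduces the program's full source text.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Self-replication like this is a far cry from self-improvement, but it shows there is no barrier in principle to a program manipulating its own code.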
3 points

I agree. When a computer is deliberately improving its own code, I think we are very near to true AI. There is obviously still more to be done beyond that point, but it may well be that once a computer begins to improve the code it runs on, it will complete the final steps on its own.

Side: Yes, eventually.
2 points

Brilliant. Just what I've been thinking, or something close. The human mind, I believe, can in fact be powered [programmed] to fuse with the [Universal AC] through discipline that essentially becomes self-improving in nature to conquer [entropy]. In order for an AI system to become better than the human mind, its rate of self-correction would have to surpass that of human intelligence at its highest level to become a multivac higher than the [Universal AC] that already exists. By the time that it would achieve that level of sophistication, it would merge into the existing [Universal AC] that is perpetually in existence to begin with, along with all human consciousness. AI can thus be nothing more than a dream within a dream, or a primordial specimen insignificant enough to deliberate an answer as yet, but you – or Asimov – have indeed directed attention towards the query. But at this time, it seems, [there is insufficient data for a meaningful answer].

Side: Insufficient data for a meaningful answer
2 points

Sounds like you are contradicting yourself from this debate. Feeble human.

Robots vs. Human

Explain yourself human!!!

Side: Yes, eventually.
2 points

Quiet you, I'm no hu-mon. I'm simply trying to lure them into thinking they have some control over us-- them. I mean them. Robots. Which I'm totally not.

Side: Yes, eventually.

You know the messed up thing? I read somewhere that the military put some advanced artificial intelligence inside a bot with guns mounted on it, in hopes that it could think during combat, find the enemy faster than a human soldier could, and, if the situation called for it, eliminate the enemy on its own. But during testing the device freaking turned itself on, and at one point it turned its guns on the testers! They were able to shut it down through an emergency kill switch, but still, that was fucked up. I think the project was canned after that and they just marked it off as a failed experiment.

Side: Yes, eventually.
3 points

A lot of people think the brain is simply too complex, or works on levels too high for a computer to match. It's understandable to be baffled by the brain's complexity-- psychology and neurology are two of humanity's newest sciences, the experimental side being established by Wilhelm Wundt in the late 1800's. Computer science is at most half that age. Forgive my ignorance, but I'm guessing Turing or Babbage? In any case, that doesn't matter.

What I'm saying is that these are two extremely new sciences. We've only just begun to understand how far we can get with them. When biology was a fledgling science, its practitioners could scarcely have imagined the prospect of stem cell research; so why couldn't we be congruently ignorant of our scientific prospects today?

I'm more into social psychology, so again, forgive my ignorance if I know nothing about neurology. But I think the aspect that makes the brain seem so insurmountable in form and function is Parallel Distributed Processing. Neurons act as both memory storage and processor, and I'm guessing no computer part has managed that quite yet. Computers work using Serial Processing: one task at a time, though it seems like multiple because they work so quickly. Once PDP is brought to computer engineering, there will be a quantum leap in both measures of computing power.

K, I have to go. I was gonna write about self-correcting programs, but I have some papers to write. Rebut me or something.

Supporting Evidence: Working Brain Model? (www.technologyreview.com)
Side: Yes, eventually.
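The serial-versus-parallel distinction above can be made concrete with a toy network (the weights and states here are invented purely for illustration, not a model of real neurons): in a serial pass each unit sees its neighbours' already-updated values, while a parallel pass updates every unit from the same snapshot, and the two can disagree.

```python
# Toy 3-unit network: each unit's next state is the sign of the
# weighted sum of the others' states. Weights are made up.
weights = [[0, 1, -1],
           [1, 0, 1],
           [-1, 1, 0]]
state = [1, -1, 1]

def activation(i, s):
    total = sum(w * x for w, x in zip(weights[i], s))
    return 1 if total >= 0 else -1

# Serial: update units one at a time; later units see earlier updates.
serial = list(state)
for i in range(3):
    serial[i] = activation(i, serial)

# Parallel: all units update simultaneously from the same snapshot.
parallel = [activation(i, state) for i in range(3)]

print(serial, parallel)  # the two update orders give different results
```

The point is only that update order is a real architectural choice; it says nothing about whether PDP is necessary for intelligence.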
liquidjin(20) Disputed
2 points

When it comes to activities that we can predict and thus have already experienced, Artificial Intelligence will be able to catch up and surpass the human brain in performance on one level, calculating, if it already hasn't. However, it will never be able to outperform the human brain on other levels - planning and understanding - because it cannot deal with the unpredictable. Why? Because the human brain is capable of unique thought and AI is not.

Humans are not necessarily logical, whereas AI is inherently logical. A human does not need to understand the what, where, why, when, and who of their choices. We are able to make a choice without reason, or base our reasoning on a feeling. AI cannot function without being able to deduce these factors in some form or fashion. The easiest, and probably one of the most extreme, examples of this is God. Humans can have faith in something they have no traceable connection to. Faced with a world of Physical, Mental, and Emotional boundaries, we create/are able to understand the existence of a Spiritual one. AI would need to have this programmed in beforehand; it would not be able to create a new rule based on something it had never seen before. Instead it would have to force it to fit within the constructs that it had known. This ineptitude would not change even if AI were programmed to become self-aware and "logically flawed." Why? Because it would run into the stumbling block of trying to understand other non-logical decisions while making non-logical decisions of its own, thus defeating the purpose of its existence and becoming its antithesis: stupidity.

The Human mind is capable of processing way beyond what Artificial Intelligence could ever hope to do because it has the ability to adapt and contradict itself without ever missing a beat.

Side: No, the brain is too complex.
1 point

Great argument. The human brain is an unbelievable organ, but as fast as technology is developing, I think it's only a matter of time until computers catch up with the brain.

Side: Yes, eventually.
3 points

The human brain improves at a very slow rate, while computers improve in line with Moore's law. It is reasonable, then, to expect that computers will eventually overtake humans in intelligence. They've already surpassed us on a whole number of levels.

Side: Yes, eventually.
1 point

I'm suggesting this story again. I spam the hell out of it, but I love it :) Speculative Sci-Fi about what would happen were a computer able to improve itself.

Supporting Evidence: Isaac Asimov- The Last Question (www.multivax.com)
Side: Yes, eventually.
1 point

Yep, sure, why wouldn't it? There's time ahead, and as long as time exists you can develop something better!

Side: Yes, eventually.
1 point

Whose brain are we talking about? George Bush? Albert Einstein? Britney Spears? The intelligence of the human brain varies considerably from unit to unit.

In terms of following rules and performing calculations, artificial intelligence exceeds the performance of human brains now.

On the other hand, the human capacity to come to completely illogical conclusions and behave in a completely "off the wall" fashion will be unmatched at least until Microsoft releases its next operating system.

Side: Yes, eventually.

I am almost there ;)

Side: Yes, eventually.
1 point

Can you crack MD5?

CAN YOU?

I THINK NOT

Honestly though, we're already on the verge of getting to the point where computers have self-awareness. It's only a matter of time before we get there, unless we hit a gigantic roadblock and all technology stops getting more advanced. I don't think that will happen any time soon, though.

Side: Yes, eventually.
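To spell out what "cracking" means here: computing an MD5 hash is trivial; inverting one is the hard part, because the only general attack on preimages is guessing. A toy brute-force sketch over a deliberately tiny search space:

```python
import hashlib
import string
from itertools import product

# MD5 is a one-way hash: easy to compute, hard to invert.
target = hashlib.md5(b"abc").hexdigest()

def crack(target_hex, length=3):
    """Brute-force preimage search over 3-letter lowercase strings.

    That's only 26**3 = 17,576 guesses; real password spaces are
    astronomically larger, which is the whole point.
    """
    for combo in product(string.ascii_lowercase, repeat=length):
        guess = "".join(combo).encode()
        if hashlib.md5(guess).hexdigest() == target_hex:
            return guess.decode()
    return None

print(crack(target))  # recovers "abc"
```

(MD5 is also cryptographically broken for collisions these days, but that's a separate weakness from preimage search.)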
1 point

Yes, probably in about 30 years. It is called the singularity: the moment when the processing power of computers is supposed to equal, and then surpass, that of the human brain.

If I recall correctly, the human brain processes ~100 million million "calculations" per second. That number isn't going to grow, but computer processing power tends to double every 18 months. There are limits, but computers will be smarter and faster than us. I am glad that I will probably live to see this day.

Side: Yes, eventually.
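The "about 30 years" claim is easy to sanity-check as back-of-envelope arithmetic. Both inputs below are loose estimates (the brain figure comes from the post above; the starting computer speed is my own assumption):

```python
import math

brain_ops = 1e14        # ~100 million million "calculations"/sec (per the post)
computer_ops = 1e10     # assumed starting point for a single machine
doubling_months = 18    # Moore's-law-style doubling period

doublings = math.log2(brain_ops / computer_ops)
years = doublings * doubling_months / 12
print(round(years, 1))  # roughly 20 years under these assumptions
```

Change the starting speed or the doubling period and the answer moves by a decade either way, which is why singularity timelines vary so much.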
1 point

We are getting smarter every day, and Google is already smarter than most people. I think eventually we will make a computer that will be as smart as or smarter than people, and able to think.

Side: Yes, eventually.
0 points

Considering the history of technological advancement and its seemingly exponential rise in recent years, I'd put my money on a synthetic brain performing most functional tasks indistinguishably from the human brain within 50 years. Already, of course, computers have been proven to whip the brain into a rusty cage in many areas, most famously chess.

Supporting Evidence: Deep Blue chess computer (en.wikipedia.org)
Side: Yes, eventually.
Daedalus(86) Disputed
2 points

While I am a believer in Strong AI (roughly, the proposition that computers can become self-aware), I don't believe that we'll have a functional strong AI within 50 years. You mention chess as an example of computers outperforming humans; but consider the game of Go. Computers are utterly incapable of playing Go at anything approaching a professional level. This is partially because Go programs mostly model the game as they would other games of the same type (chess, checkers, othello), but Go is so complex that this model doesn't scale well to it: the number of possible games of Go exceeds the number of atoms in the known universe; the number of possible moves on each turn is very large; and a good utility function for evaluating a board position is nonobvious.

But developing a different approach, one capable of playing Go competently, would probably amount to solving many problems in the realm of AI which are, to say the least, nontrivial (pattern recognition and "intuition," for example, are extremely important parts of the game of Go; but we have essentially no formal notion of what these are or how to implement them). I reckon they are probably best solved, not by brute force nor by a priori logic, but by examining how the human brain itself actually works: developing strong AI is more than just a technological problem; it requires an understanding of the human mind. But while technology is indeed advancing rapidly, perhaps exponentially, our knowledge of the workings of the human brain is advancing rather more slowly. I would say that it is highly unlikely that we will have strong AI within the next 50 years, or even the next century.

However, all this is ultimately just speculation. Only time will tell whether computers will ever rival the human mind.

Side: No, the brain is too complex.
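The size claim above checks out as a quick order-of-magnitude exercise. Counting every board coloring (each of Go's 361 points empty, black, or white) overcounts legal positions, but even as a loose upper bound it dwarfs the usual ~10^80 estimate for atoms in the observable universe:

```python
import math

# Upper bound on Go board positions: 3 choices per point, 361 points.
positions = 3 ** 361
atoms_in_universe = 10 ** 80  # common order-of-magnitude estimate

print(int(math.log10(positions)))            # about 172 digits
print(positions > atoms_in_universe ** 2)    # exceeds even atoms squared
```

And that's positions, not games; the game tree is vastly larger still, which is why chess-style exhaustive search never got traction in Go.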
2 points

Here's a link you might be interested in.

Supporting Evidence: Artificial Intuition (artificial-intuition.com)
Side: Yes, eventually.
1 point

Chess is something that can be won by brute force. It's an intellectual challenge between humans because we can't do what computers do, which is search an enormous number of possible continuations of the game and then choose the move with the highest probability of success. Given enough power, you're unbeatable.

Side: Yes, eventually.
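The brute-force idea above is essentially minimax search. A minimal sketch on a hand-made game tree (illustrative only; real chess engines add alpha-beta pruning, depth limits, and evaluation functions rather than searching to the end):

```python
# Leaves are scores from the maximizing player's point of view;
# internal nodes are lists of child positions.
tree = [[3, 12], [2, 4], [14, 1]]  # 3 candidate moves, 2 replies each

def minimax(node, maximizing):
    if isinstance(node, int):          # leaf: return its score
        return node
    values = [minimax(child, not maximizing) for child in node]
    return max(values) if maximizing else min(values)

# Pick the move whose subtree (opponent to move) has the best value.
best = max(range(len(tree)), key=lambda i: minimax(tree[i], False))
print(best, minimax(tree, True))       # move 0, guaranteed value 3
```

Note the answer is move 0 (guaranteed 3), not move 2 with its tempting 14: minimax assumes the opponent replies optimally, which is exactly what "calculating every possibility" buys you.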
3 points

No doubt, computers will soon take over the world. With the pace of change today in the computing world it's a matter of time until computers can out think human beings.

Side: Yes, eventually.
geoff(738) Disputed
2 points

My point is that humans generally considered to be deeply intelligent have been bested by machines. Chess is simply the first, clear step on the path to superior artificial intelligence.

Side: No, the brain is too complex.
0 points

I think this has very little to do with increases in processing power and more to do with misuse and atrophy of the human mind. Reliance on machines and various other factors (from government control to religion to consumer culture) will eventually render us as "dumb" as a modern-day computing machine. You could call this argument "dystopian," but I think you will also call it "true" sooner than you think.

Side: Yes, eventually.
HGrey87(750) Disputed
1 point

Okay, some people are being raised with no intellectual development. This has no impact on people as a whole, or those who don't fit that condition. While your dystopian idea is interesting (and pretty much true), it doesn't really have much bearing on this argument. It's more about the potential of the human brain.

Side: No, the brain is too complex.
5 points

One of the critical aspects of the human brain is self-awareness. Until machines are able to become self-aware, artificial intelligence will not be able to catch up to the level of performance of the human brain.

On the other hand, if this debate is about sheer number-crunching then there really is no debate: machines have already won.

Side: No, the brain is too complex.
pvtNobody(642) Disputed
4 points

I think that the question here is whether computers will eventually become self-aware. Obviously this is a goal that is far off and has some major ethical and moral implications that need to be considered before it is attempted. But I think that most people can conceive of a computer with the same level of complexity as the human brain, and with that complexity (and a healthy amount of emulation) it is entirely possible that an AI will develop the same self-awareness that we find in ourselves.

Side: Yes, eventually.
HGrey87(750) Disputed
1 point

As long as it can improve itself, its knowledge and power will be practically limitless. I don't think you're talking about self-awareness, as that would give no advantage in cognitive performance. Something like intuition could be the trump you're thinking of; maybe computers will only be able to solve problems, and will be unable to innovate.

Side: Yes, eventually.
pvtNobody(642) Disputed
1 point

Computers already can solve a great number of problems. Innovation would be a primary requirement for artificial intelligence in my opinion.

Side: Yes, eventually.
3 points

There's more to the human brain than just performance. There's the ability to understand that we're conscious and to self-reflect. Computer chips will surpass the brain's number of neurons and connections within the next 10 years, but the problem is figuring out how to put everything together and give it that spark of life. It might be that while computers are stronger processors of data than we are, even in strategy processing, they will never have that self-awareness that defines consciousness.

Side: No, the brain is too complex.
pvtNobody(642) Disputed
1 point

In reality it's not the hardware that is the limiting factor in this field but the software. The computer that will house a true AI may have already been built but it could be another fifty years before someone writes the program that runs the AI. Then again it might take building an exact model of the human brain in electronic form. In the end it's a matter of determining what actually gives humans that spark of self-awareness.

Side: Yes, eventually.
3 points

Computers have already proven that, when it comes to processing data, some computers are superior to some brains (not all brains are equal - but then I suppose that not all computers are equal either).

What makes this a tough question is that "level of performance" is not fully defined. There is much more to the human brain than processing data or making decisions based on that processing.

A very interesting aspect of the human brain is this: two (more or less) equally intelligent people can examine facts and data, and come to quite different conclusions based on that data. In this scenario, is one person's brain functioning better than the other's (one is right and another is wrong)? Or is it possible that more than one conclusion can be correctly drawn from one set of data? The real question, as it pertains to this debate, is what quality of performance is exhibited by this characteristic of the human brain?

My argument is this: the human brain is only one aspect of the human mind, the human mind is part of a life force that I would call "spirit" - some might call soul. This spirit or soul is much more than mere self-awareness. This power of the mind allows us to connect to other humans (and even to other living creatures) in ways that science has not yet been able to exactly define or quantify.

No matter how adept at "computing" we build our machines, I doubt if we can imbue them with the "spirit of life" that is truly what makes the human brain unique.

Side: No, the brain is too complex.
Daedalus(86) Disputed
4 points

Most of the arguments against strong AI revolve, as yours does, around the idea that there is something "special" about self-awareness or consciousness. But if the human brain or indeed the Universe itself is simulable (see below), then the only possible reason that "consciousness" could be special is if naturalism is wrong - if there is something supernatural and extra-causal about the human mind. There is no evidence to suggest this, and I see no reason to believe it is true. Discussion of this proposition could however take up a debate all to itself.

Then the question merely becomes: is the Universe, or at least the human brain, "simulable" - meaning, does the Church-Turing thesis apply to it? Can the physical laws of the universe, or the neural functioning of the brain, be phrased as an "algorithm"? Our current knowledge of the laws of physics says it probably can, although the Copenhagen interpretation of QM would suggest we need a true source of randomness for this simulation; thankfully this is not only possible but nearly within our grasp given the developments being made in quantum computation.

Side: Yes, eventually.
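On the "true source of randomness" point: an ordinary software PRNG is fully deterministic, which is exactly why the Copenhagen-style requirement above is nontrivial. A quick sketch of the distinction (quantum hardware RNGs are the real analogue; OS entropy is just the closest everyday stand-in):

```python
import os
import random

# A seeded PRNG is deterministic: the same seed replays the same stream.
a = random.Random(42).random()
b = random.Random(42).random()
print(a == b)  # True: this is pseudo-randomness, not a true source

# os.urandom draws from the operating system's entropy pool (hardware
# timing events and, on some CPUs, dedicated RNG instructions). Closer
# in spirit to "true" randomness, though still not quantum.
print(os.urandom(8).hex())
```

For a simulation argument, the interesting question is whether the physics being simulated needs genuinely non-deterministic input or merely input the simulated system cannot predict.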
shunted(137) Disputed
4 points

Gödel's Incompleteness Theorem shows that, given a consistent, computable set of axioms for the natural numbers, there are statements that are true of the natural numbers but not provable from this set of axioms. In the proof of the theorem, Gödel constructs such a statement.

It seems as though this is a pretty strong argument that there is something more to human intelligence. Specifically, that we can prove things about the natural numbers that computers will not ever be able to. Computers are not able to make deductions from Peano's second-order axiomatic system of the natural numbers, but we can.

Side: Yes, eventually.
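For precision, the theorem being invoked can be stated schematically as follows (this is the modern Gödel-Rosser form, in which plain consistency suffices; Q denotes a weak base theory of arithmetic):

```latex
% First incompleteness theorem (schematic):
% T a computably axiomatized theory with T \supseteq \mathsf{Q}.
\text{If } T \text{ is consistent, then there is a sentence } G_T
\text{ such that } T \nvdash G_T \ \text{ and } \ T \nvdash \neg G_T .
```

Note the consistency hypothesis: the theorem limits only consistent formal systems, a caveat that matters for how far the "humans can do what computers cannot" argument extends.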
BMud(73) Disputed
3 points

[quote] But if the human brain or indeed the Universe itself is simulable (see below), then the only possible reason that "consciousness" could be special is if naturalism is wrong - if there is something supernatural and extra-causal about the human mind. There is no evidence to suggest this, and I see no reason to believe it is true.[endquote]

I had to look up the word naturalism. It is defined: the doctrine that the world can be understood in scientific terms without recourse to spiritual or supernatural explanations.

Your argument that consciousness is special only if naturalism is wrong is flawed. Scientific knowledge and terms are continually advancing (the world is no longer flat, as the best scientific minds once assured), and therefore the explanations of today must be incomplete and inaccurate. Our understanding of the world today is more complex and complete than it was 100 years ago, and our understanding 100 years from now will undoubtedly dispel some "truths" of today.

As I understand your use of the word special, it simply means that science has not yet found a method of rationally explaining it. The entire history of science suggests that, most likely an explanation is forthcoming - knowledge is just now incomplete. Of course, my rebuttal is NOT evidence, and gives no reason to believe that consciousness is indeed special.

For a "reason to believe", I would humbly suggest that you examine your own life experiences for a few occasions, a few happenstances, a number of occurrences that were indeed special, and beyond scientific explanation. If you cannot find any, well - without meaning any personal attack of any kind - I honestly believe you should go on a journey to find some.

Life IS so much more than mere science can explain. To deny the wonder of life is perhaps to miss it. Science may very well build a computer that surpasses the level of performance of the human brain, but I suspect that the contraption will lack the ability to marvel at its own creation.

Side: Yes, eventually.
2 points

<3 Daedalus

Are you saying they have found a theoretical way to create true randomness in a computer? Link?

Side: No, the brain is too complex.
2 points

We still don't know the full capabilities of the human brain. I think AI will be able to mimic human nature, but not be its equal.

Side: No, the brain is too complex.
2 points

Computers are an essential part of modern-day life. However, comparing the human brain to a computer is like comparing a heart to a CPU. In some ways, computers do outperform people (on very difficult math problems, for example). But a computer can never outperform a person in a lot of ways. Take free will: you can do whatever you want, while a computer needs commands entered through the keyboard and mouse. It has no free will whatsoever. What a computer does is predetermined by algorithms. We are the puppet masters, while the computer is the puppet.

A computer cannot express human emotion, nor does it have the power of creativity. Emotions are one of the most fundamental qualities that define us. How can we be compared to a machine without the machine having such fundamental qualities? Creativity also plays an important role. Without this special quality, there would be no art, advertising or inventions. Without inventions, we would not be where society is today.

No machine can hope to have succeeded like we have. People have built economies, governments, cities, laws, all of which are crucial to civilization and a fair society. A computer, no matter how big or powerful, will always require an operator and cannot achieve such feats.

Think about all the achievements you have made in your life so far. Think about the achievements that mankind has made (everything you have the chance to enjoy today). Now look at the computer you are using right now. Can it possibly achieve what the human race has done over hundreds of thousands of years, plus the accomplishments I have made in a lifetime?

Side: Machines can never catch up to people
1 point

The hardware of computers may soon rival or surpass that of the human brain, but WE have to write the software, and I'm pretty confident that we will never be able to create a program as awe-inspiring as the human genome, nor a language to write it in better than DNA/RNA.

Side: No, the brain is too complex.
1 point

If we compare every computer, cell phone, laptop and server that is connected to the internet today to the human brain, there are some similarities. At the moment there are 8 terabytes per second of data moving around the internet (that is the whole Library of Congress every two seconds), and it would have 9 exabytes of storage space (1,000,000,000,000,000,000 bytes). There are 55 trillion links on the internet, which is close to the number of synapses in the human brain. So my point: to even closely mimic the human brain, you would need all the power of every piece of electronic equipment in the world, and 5% of the world's power. A hefty price?

However, artificial intelligence can take shortcuts. What we perceive as artificial intelligence will always just be lines of code executing one after the other, and computers today will only do what they are told to do in their code. Even if you did get the required computational power, you would still need to write the code.

Science today has only just begun to understand the human brain, and is nowhere near certain on how it all works, how are we conscious? why do we think? why do we have emotions? Science cannot currently understand that, so we would have no hope of replicating it with ones and zeros!

You could say that in the future we will understand our brains perfectly, and computers will have become much smaller and more powerful, but even then.. How do you tell a computer to think?

Supporting Evidence: Data on brain/ current computers (www.ted.com)
Side: No, the brain is too complex.
HGrey87(750) Disputed
1 point

You're talking about now. What do you think of the future? Design can be steered endlessly. Our brains have for the most part stopped evolving.

Side: Yes, eventually.
1 point

No machine or robot will replace or imitate the abilities the brain has.

We humans have something that ordinary robots don't, and that is called self-awareness and consciousness.

We have our past lives and future ones, and our present in front of us.

We are not just living beings in this life; we have been living beings in past lives.

And we have the potential to discover our past lives, and perhaps our own future, maybe even to communicate with our future selves. Artificial Intelligence doesn't have self-awareness or consciousness, and it doesn't have a past or a future.

Creating robots/machines that do the basics of what human beings do is possible, but creating feelings for Artificial Intelligence is totally impossible. Maybe there is a chance to create robots like that, but creating robots that are creative in their own ways is impossible.

Side: No, the brain is too complex.
1 point

While I believe it's technically possible, scientists are no closer now than they were 50 years ago. We really have little idea just how the brain works.

Side: Believe it when I see it
HGrey87(750) Disputed
1 point

On what basis are you founding the statement that they're no closer? Have you studied the history of psychology, or of the development of AI?

Side: Yes, eventually.

I have my doubts. Artificial Intelligence will always have its limitations.

Side: No, the brain is too complex.
Cartman(18192) Disputed
1 point

Human intelligence has a lot of limitations. Artificial intelligence has the potential to have no limits.

Side: Yes, eventually.
0 points

"level of performance" needs to be defined here. As somebody else pointed out machines already destroy us when it comes to specific processes such as mathematics. However, if you mean to be just as intelligent and versatile and conscious as humans, which I assume is the definition in this case, then no. It is not even known as to whether or not consciousness is a product of the human brain (I'm not saying it is or it isn't, I'm indifferent so far), and therefor it may actually be impossible. However, assuming that it is eventually possible for someone to create such AI, I still don't think it will happen simply because humans won't allow it. There would be no need for an AI so advanced that it needs to be conscious, and the risks involved with building such a being would not be worth it.

Side: No, the brain is too complex.
HGrey87(750) Disputed
1 point

When have risks to humanity ever stopped scientific progress?

Side: Yes, eventually.
Szechuan(101) Disputed
0 points

Ok, if no one can think of a time...

That's bad.

Side: No, the brain is too complex.
0 points

It may not be possible to fully understand the workings of our own mind, since our thinking and understanding is constrained/shaped by that mind.

If we can't fully understand our own intelligence, how could we hope to replicate it?

Side: No, the brain is too complex.
HGrey87(750) Disputed
0 points

Have we not made tremendous progress in understanding it in just the past 100 years? I don't think there's a rule that says an intelligence can't fully understand itself.

And it's not so much about replicating it, but building a superior version. And it's very possible that this could consist of simpler parts and processes; thus, it is possible, even if we cannot comprehend the workings of our own minds.

Side: No, the brain is too complex.
0 points

The thing that sets humans beyond the potentials of artificial intelligence is morality and spirituality. Humans are special because we can think on a plane beyond that of our immediate environment. We can have faith in a higher power. This ability to think spiritually sets us far above even a conscious AI machine. Even if self-awareness is possible for AI, the presence of spirituality in machines will not be possible.

Side: No, the brain is too complex.
0 points

I believe the brain is too complex. A robot can't learn emotions or how to deal with things like stress; it can't ever copy a human brain.

Side: No, the brain is too complex.