CreateDebate

Debate Score: 35 | Arguments: 15 | Total Votes: 43
Debate Creator: Tamisan

Do you believe there will be a technological Singularity?


No, it's just science fiction.

Side Score: 15
VS.

Yes, Moore's Law predicts it.

Side Score: 20
4 points

As usual, I'm tempted to come down the middle of the argument. Will there be a rapid increase in the development of new technology? Probably yes, for a while. But the thing to consider is that technology is not made for its own sake; it is an economic activity, and so the way it plays out often surprises tech geeks. For example, everyone in 1940 predicted flying cars by 2010. And here's the thing: we could quite easily build flying personal vehicles today. So why don't we see them? Because it's cheaper to use a combination of good urban planning, public transport, and normal cars, since those three already exist. It's a chicken-and-egg problem.

Similarly, AI will definitely get better. Will we make 'conscious' AI? I think that's very possible. But please keep in mind that the majority of humans can't even see that most higher animals are conscious and have emotions, so will the world interact with these 'conscious' AIs as sentient beings? Probably not. In fact, making an absolutely conscious AI serves very few economic purposes, so my bet is barely conscious AI. What do I mean by that?

As long as an AI has two or three layers of cognitive processes (one layer taking inputs from the environment and doing primal pre-processing, then sending signals, i.e. emotions, up to higher layers that form conceptual data structures by inductive reasoning), then in my view the AI is technically conscious. It's not human, but neither are dogs, and dogs seem pretty conscious too.
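The two-to-three-layer scheme described above can be made concrete with a toy sketch. Every function name and label below is hypothetical, invented purely to illustrate the layering, not any real AI architecture:

```python
# Toy sketch of a layered cognitive pipeline (all names are illustrative).

def sensory_layer(raw_inputs):
    """Layer 1: primal pre-processing of environmental inputs."""
    peak = max(raw_inputs)
    return [x / peak for x in raw_inputs]  # crude normalisation

def affective_layer(signals):
    """Layer 2: compress pre-processed signals into a coarse 'emotion' tag."""
    return "alert" if max(signals) > 0.8 else "calm"

def conceptual_layer(emotion, memory):
    """Layer 3: inductive concept formation, here just frequency counting."""
    memory[emotion] = memory.get(emotion, 0) + 1
    return max(memory, key=memory.get)  # the currently dominant concept

memory = {}
for frame in [[1, 2, 9], [1, 1, 1], [0, 5, 10]]:
    signals = sensory_layer(frame)
    emotion = affective_layer(signals)
    concept = conceptual_layer(emotion, memory)
```

The point of the sketch is only that each layer consumes the previous layer's output and produces a more abstract summary, which is the structure the post describes.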

So if you're asking whether we can predict the state of technology after the singularity: no, we can't. But we can't predict the state of technology more than 50 years into the future anyway; we couldn't do that 50 years ago either.

Apparently mobile phones rely on quantum mechanics. Do the people who use them know that? Do they marvel at the scientific 'singularity' that happened 50 years ago?

As for bio-medical advances: again, yes, lots of advances. But will humans allow their basic instincts to be changed? Probably not. And yes, humans have basic instincts; that's why all of humanity shows easily identifiable patterns of behavior.

Lastly, virtual reality. This was something I had a lot of hope for. I could see it happening: brains transferred to VR, people living forever. But it struck me recently that this might be an impossibility, like artificial gravity on a macro scale might be, because the brain's memory storage mechanism might not be in a readable form. We build computer hard drives so they can be read easily from beginning to end; the brain stores data differently, spread out and overlapping, much like a neural network does. For the brain to be transferred to VR, we would essentially need to ghost it, like ghosting a hard drive. But ghosting a multi-million-connection neural network that has had 20+ years of training may be impossible to simplify. It would be an NP-complete problem with a running time on the order of a million factorial, which would take longer than the lifespan of a dying sun to process.
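As a rough illustration of the combinatorial claim above (the "million factorial" figure is the poster's; the brute-force framing below is my own simplification, since checking every possible arrangement of N interconnected units is on the order of N! candidates):

```python
import math

def candidate_mappings(n):
    """Number of ways to order n interconnected units: n! candidates."""
    return math.factorial(n)

# Even at a generous 10**18 checks per second, a network of just 30 units
# is already hopeless if brute-forced:
seconds = candidate_mappings(30) / 1e18
years = seconds / (3600 * 24 * 365)   # several million years
```

Whether brain "ghosting" is really this hard is an open question, but the sketch shows why factorial search spaces defeat raw computing power.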

Technology may advance all it wants, but gravity remains gravity and NP-complete remains NP-complete.

Side: No, it's just science fiction.
4 points

The whole argument is based on the premise that we can create a machine that surpasses human intelligence, which can then design a better machine to surpass its own intelligence, and so forth down the line. The problem is: how can you create something that knows more than you know? You can't give something knowledge you don't have or don't know how to get.

We can create a computer to calculate Pi to the trillionth decimal place, but only because we know how to do it. Even if we create a computer that can learn and obtain new information on its own, the biggest part of such a machine would be its ability to report what it learns to the people in charge of it, progressively increasing our intelligence as well. So wouldn't we both evolve at a similar exponential pace?
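The π point above is easy to make concrete. Here is a minimal sketch of one classical method, Machin's formula with big-integer arithmetic (real record-setting computations use much faster series, such as Chudnovsky's):

```python
def pi_digits(n):
    """Return floor(pi * 10**n) using Machin's formula with big integers."""
    scale = 10 ** (n + 10)  # 10 guard digits absorb truncation error

    def arctan_inv(x):
        """arctan(1/x) * scale via the alternating Taylor series."""
        xsq = x * x
        term = scale // x       # first term: 1/x
        total = term
        k, sign = 3, -1
        while term:
            term //= xsq        # next power: 1/x**k
            total += sign * (term // k)
            k += 2
            sign = -sign
        return total

    # Machin: pi/4 = 4*arctan(1/5) - arctan(1/239)
    pi_scaled = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    return pi_scaled // 10 ** 10  # drop the guard digits
```

The machine only "knows" π because we encoded a method we already understood, which is exactly the poster's point.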

Plus if we're going to create a thinking computer unlike anything we have now, wouldn't it have to be modeled on how the human brain works? Which means we would already know more about how the brain works to a point where we might be able to modify it with technology evolving us even further at the same time as technology.

The Singularity is just Science Fiction.

Side: No, it's just science fiction.
3 points

I admit it, this passage sounds quite sexy:

If machines could even slightly surpass human intellect, they could improve their own designs in ways unseen by their designers, and thus recursively augment themselves into far greater intelligences. (I. J. Good)

It's hard to buy the argument that the machines would instantly augment themselves into far greater intelligences. Where's the evidence?

Side: No, it's just science fiction.

In order to build machines that at least match our diverse abilities, we will need to know more about how the brain works. If we are also able to do genetic engineering, then we can augment ourselves and keep ahead of the machines. Hell, maybe we can augment ourselves and surpass God! I don't think He'll get mad at us, because doesn't every parent want their children to surpass them?

Side: No, it's just science fiction.
0 points

Okay, apparently I can't make a post supporting the view that "this is a stupid false dichotomy".

Moore's Law describes exponential growth, same with all similar computer/AI growth models. Exponentials DO NOT HAVE SINGULARITIES. So Moore's Law doesn't predict any kind of singularity, period.
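The distinction can be stated numerically: exponential growth, as in Moore's Law, is finite at every finite time, while a genuine mathematical singularity requires something like hyperbolic growth, which diverges at a finite time t0. A small sketch (the constants are illustrative, not real forecasts):

```python
T = 1.5    # Moore's-Law doubling period, in years (illustrative)
t0 = 10.0  # hypothetical "singularity date" for the hyperbolic model

def moore(t):
    """Exponential growth: doubles every T years, finite at every finite t."""
    return 2 ** (t / T)

def hyperbolic(t):
    """Hyperbolic growth: actually diverges as t approaches t0."""
    return 1.0 / (t0 - t)

# Approaching t = 10, the exponential stays modest while the hyperbola
# blows up without bound:
samples = [(t, moore(t), hyperbolic(t)) for t in (9.0, 9.9, 9.99)]
```

No matter how far out you run the exponential, it never reaches a point where its value is infinite, which is the poster's objection to calling Moore's Law a "singularity" predictor.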

Even if there were some kind of technological growth model that is singular (I've seen this in a minority of human energy-use models), it would not necessarily correspond to the point of developing "Strong AI". So let's separate all this out a bit.

Now, I believe that we will eventually develop Strong AI, and that singular models of human development do exist, have been surpassed, and will be surpassed again. One singularity we already passed is weapon lethality: the superbomb has an infinite theoretical yield; we raced the Russians to build it in the '50s and '60s, and both sides won by different measures.

The notion that humans can't create something "smarter" or with "more knowledge" than themselves is absurd. Every time we take a digital photograph, it stores "more knowledge" than us (that can be read intelligently by a machine), and every time we use a calculator we are exploiting a tool that is "smarter" in a computational sense. We make things that are better than ourselves all the time, and it doesn't violate any sort of cosmic logic. We've even created systems of logic that are better than our own, for crying out loud.

In conclusion, you're all wrong about something.

Side: bad dichotomy
3 points

I doubt that Moore's Law will remain entirely accurate as we approach what could be called a technological singularity, but I do believe that strong AI will eventually be achieved, whether on purpose or by accident. I've heard it said that the human brain is too complex to simulate, but intelligence does not have to come in the form of a human mind: with a network of enough processors, it would be possible to create a structure similar to the human brain in which "cells" are replaced by individual computers. A machine which contains its own specification, and which can thus modify and extend itself, is capable of starting the technological singularity. I can't say I'm certain such a singularity will occur; we could take steps to prevent it, given the risk such a paradigm shift could pose to humanity as it is. But if we embrace technology to its fullest at all times, I think it is more likely to happen than not.

Side: Yes, Moore's Law predicts it.
3 points

I think we will have a rapid acceleration of intelligence after the creation of the first Strong A.I.

I do think that it will follow the classic S-curve that most exponential phenomena follow and eventually top out at a maximum effective level. What that level would be and what will limit further progress, from this side of the Singularity, is anybody's guess.
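The classic S-curve mentioned above is usually modeled by the logistic function: near-exponential growth at first, then flattening toward a ceiling. A small sketch (the parameters are illustrative):

```python
import math

def s_curve(t, ceiling=100.0, rate=1.0, midpoint=0.0):
    """Logistic curve: looks exponential early, saturates at the ceiling."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Early on the curve is in its near-exponential regime; much later it has
# topped out near the maximum effective level:
early = s_curve(-4)   # small, still growing roughly exponentially
late = s_curve(8)     # within a fraction of a percent of the ceiling
```

Where the real ceiling sits for machine intelligence is, as the post says, anybody's guess; the sketch only shows the shape of the claim.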

My hope is that we approach the Singularity along the human-centric Intelligence Amplification (I.A.) path rather than a pure Artificial Intelligence (A.I.) path.

Side: Yes, Moore's Law predicts it.
Cthulhu(50) Disputed
3 points

I don't think that IA is the optimal path toward the singularity. In order for our brains to reach the capacity needed, we would have to undergo significant changes. Either a group consciousness/hive mind, genetic alteration to improve mental capacity or mechanical augmentation of our brains (which is usually seen as the precursor to the hive mind stage) would be required, and all of those would move us away from our current state - and thus away from 'humanity' as a concept. The singularity, if it occurs, will devastate the human race; whether or not we survive it depends on us seeing it coming, preparing and keeping up with our own technology.

Side: Yes, Moore's Law predicts it.
2 points

Recent predicted technological breakthroughs alone seem to suggest that this is inevitable.

Side: Yes, Moore's Law predicts it.
2 points

"Singularity" is used variously to describe the emergence of self-improving non-human intelligence (Good, Vinge) and a "rupture in the fabric of human history" precipitated by technological change accelerating to a rate beyond human capacity to adapt (Kurzweil). I think the former is likely, though perhaps not in the time frames suggested by many futurists. The latter I find harder to credit: its possible that there is a limit to humanity's ability to adapt, but I doubt it.

Side: Yes, Moore's Law predicts it.

I hope this article complements your argument.

http://www.dailygalaxy.com/my_weblog/2008/06/exponential-tec.html

Side: Yes, Moore's Law predicts it.
1 point

Thanks, I appreciate you supplementing this debate with that article. Although I could easily argue against many of its suppositions, I'll leave that to the other debaters. :)

Side: resources

I think it is every parent's wish that their children or creations surpass them.

Side: Yes, Moore's Law predicts it.
1 point

Moore's Law predicts that processors double in power every 18 months. It also implies that we can only make them so small; but even after the law runs out physically, we can build nano supercomputers until they are made up of only a few atoms.
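Taking the 18-month doubling at face value, the arithmetic looks like this (a toy sketch, not a forecast):

```python
# Moore's-Law arithmetic: one doubling of processor power every 18 months.

def doublings(years, period_months=18):
    """How many full doubling periods fit into the given number of years."""
    return (years * 12) // period_months

def relative_power(years):
    """Processing power relative to today, after the given number of years."""
    return 2 ** doublings(years)

# After 15 years: 10 doublings, i.e. about a thousandfold increase.
```

The atomic-scale limit the post mentions is exactly why this doubling cannot continue indefinitely on conventional transistors.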

Side: Yes, Moore's Law predicts it.