Friday, September 14, 2007

More on the Singularity

For some time now I've been trying to write a thoughtful article on the Technological Singularity. In true blogger style I've decided that rather than expend my energies on creating my own article, I will find someone else's and link to it.

The writer is Ronald Bailey from Reason Online, the online wing of a reasonably popular (by UK standards) libertarian magazine, and he is reporting on the recent Singularity Summit.

Bailey does a good job of summarising the basic ideas surrounding the Technological Singularity. He quotes one of the conference's attendees, Eliezer Yudkowsky, cofounder of the Singularity Institute:

"...the Event Horizon school is just one of the three main schools of thought about the Singularity. The other two are the Accelerationist and the Intelligence Explosion schools..."

My summary of these groups is as follows:

Event Horizon: once an intelligence is developed that is "greater" (setting aside for a moment the difficulty of quantifying intelligence) than ours, we will, by our very nature, be unable to predict what happens next.

Accelerationist: advances in computer hardware (cf. Moore's Law) will continue to accelerate, along with our understanding of our own biology and our ability (via genetic engineering, implants, bioengineering, etc.) to alter it. This means that within a few decades we will merge with our technology and become the greater intelligence. Suggested by Ray Kurzweil in The Singularity is Near.

Intelligence Explosion (not a bomb at Thames House, the other kind of intelligence): technology arises from the application of intelligence to problems. When technology is applied to our own apparent lack of intelligence, we will get marginally better intelligence, which will result in marginally better technology, which will produce even better intelligence. A feedback loop will be created, with "intelligence" increasing with each iteration (see the toy sketch below). Suggested by I.J. Good in a New Scientist article.

The three concepts feed into one another and don't necessarily cancel each other out.
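Good's feedback loop is easy to caricature in a few lines of code. The sketch below is purely illustrative, with made-up numbers and a made-up growth rule (each generation's improvement is proportional to how intelligent that generation already is); it isn't taken from Good, Kurzweil, or anything presented at the Summit.

    # Toy model of the Intelligence Explosion feedback loop (hypothetical numbers).
    def intelligence_explosion(initial=1.0, gain=0.05, generations=12):
        """Each generation improves on its predecessor by an amount proportional
        to its own intelligence, so the growth compounds faster and faster."""
        levels = [initial]
        for _ in range(generations):
            current = levels[-1]
            # smarter generation -> better technology -> still smarter successor
            levels.append(current * (1.0 + gain * current))
        return levels

    for generation, level in enumerate(intelligence_explosion()):
        print(f"generation {generation:2d}: intelligence {level:.2f}")

The point of the toy isn't the numbers, it's the shape: because the size of each improvement grows with every iteration, the curve eventually turns upward far more sharply than a fixed-rate exponential would.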

I suspect that the world sketched out by Kurzweil is not impossible, but the timeframe seems implausible. There is no reason why matter shouldn't be able to support beings that are more durable than we are, longer-lived, faster at learning, with better memories, and that experience the world more slowly and deeply (i.e. each second for them could hold what would amount to a week's worth of thinking time for us, a speed-up of roughly 600,000 times).

However, the current state of our ability to control matter, though significant, doesn't seem to offer the possibility of superhumans within the 50 years or so that Kurzweil suggests.

Even if silicon-based computer chips are currently undergoing exponential increases in transistor density, that doesn't necessarily entail similar progress in other areas such as brain scanning.

In The Singularity is Near, Kurzweil does a good job of pointing out exponential trends similar to Moore's Law, for example in the Human Genome Project (page 510 of the US Penguin hardback copy I have) and in the resolution of non-invasive brain scanning (page 159).
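To make the timeframe worry concrete, here's a minimal sketch of the sort of extrapolation such trends invite, using an assumed two-year doubling period (a common statement of Moore's Law, not a figure taken from the book):

    # Project a quantity forward assuming it doubles every fixed period.
    def extrapolate(initial_value, doubling_period_years, years):
        return initial_value * 2 ** (years / doubling_period_years)

    # ~33,500,000-fold growth over 50 years if the doubling period is 2 years...
    print(f"{extrapolate(1.0, 2.0, 50):,.0f}x at a 2-year doubling")
    # ...but only ~1,000-fold if it turns out to be 5 years.
    print(f"{extrapolate(1.0, 5.0, 50):,.0f}x at a 5-year doubling")

A modest change in the assumed doubling period shifts the 50-year projection by several orders of magnitude, which is roughly why I can accept the trends and still doubt the date.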

My basic problem with Kurzweil's book is my incredulity: the book is compelling whilst you read it, but once you're back in the real world you simply can't imagine a "singularity" of any flavour occurring.

Which, ahem, is pretty much the definition of the Event Horizon style of singularity.

*sigh*

So I suppose I'll just have to wait and see, like everyone else...
