Tuesday, February 10, 2009

Singularity and transhumanism

PZ Myers has written an interesting critique of Ray Kurzweil's thoughts on a possible technological singularity:

...not only is the chart an artificial and perhaps even conscious attempt to fit the data to a predetermined conclusion, but what it actually represents is the proximity of the familiar.

We are much more aware of innovations in our current time and environment, and the farther back we look, the blurrier the distinctions get. We may think it's a grand step forward to have these fancy cell phones that don't tie you to a cord coming from the wall, but there was also a time when people thought it was radical to be using this new bow & arrow thingie, instead of the good ol' atlatl.

We just lump that prior event into a "flinging pointy things" category and don't think much of it. When Kurzweil reifies biases that way, he gets garbage, like this graph, out.

Now I do think that human culture has allowed and encouraged greater rates of change than are possible without active, intelligent engagement—but this techno-mystical crap is just kookery, plain and simple, and the rationale is disgracefully bad. One thing I will say for Kurzweil, though, is that he seems to be a first-rate bullshit artist.

...

Kurzweil tosses a bunch of things into a graph, shows a curve that goes upward, and gets all misty-eyed and spiritual over our Bold Future. Some places it's OK, when he's actually looking at something measurable, like processor speed over time.

In other places, where he puts bacteria and monkeys on the Y-axis and pontificates about the future of evolution, it's absurd. I am completely baffled by Kurzweil's popularity, and in particular the respect he gets in some circles, since his claims simply do not hold up to even casually critical examination.

Calling Kurzweil a bullshit artist is unfair: Kurzweil is a genuinely talented inventor and engineer. His beliefs might be a little kooky to some, but I've always found his writing compelling.

Kurzweil is a spiritualist: there's nothing wrong with that. A belief in the power of some imminent superhuman AI to solve all our problems is slightly less absurd than most religious beliefs, and Kurzweil doesn't come across as the type to build a pyramid of skulls in the meantime.

But really: who honestly cares about the singularity?

Building artificial human minds may be possible within my lifetime, or it may not.

There will still be substantial technological change, even if the prime mover remains good old-fashioned human grey matter.

What I find compelling is the suggestion of where ongoing developments in biology, computing, genetics, and human augmentation may take us over the next few decades.

Among these developments are new ways of combining human intelligence with machine intelligence that result in a substantial increase along all dimensions of intellectual development (what Kurzweil calls the law of accelerating returns).

So although the idea of the singularity has become less compelling to me, what continues to excite me about Kurzweil's writings are his descriptions of posthumans. Partly for the good ol' SFnal sensawunda, and partly because maybe it could happen to me. Maybe I could become a posthuman.

I think the idea, and potential reality, of self-guided human evolution is compelling in itself. I can take or leave the singularity.

Prof Myers also comments separately on the recent pronouncements on the future of humanity made by Juan Enriquez at the TED conference:

Every species also takes control over its own evolution, in a sense; individuals make choices of all sorts that influence what will happen in the next generation. You could rightly argue that they don't do it with planning and intent, but I have seen nothing that suggests that our attempts to modify our species, low tech and high tech together, are any wiser or better informed about the long-term consequences than those of any rat fighting for an opportunity to mate. We do what we do; don't pretend it's part of a long term plan that is actually prepared for all of the unexpected eventualities.

I agree with Myers up to a point: he's basically saying that developments in biotechnology and the progress of transhumanism won't happen in some big, top-down, organised way, but will rather develop as a series of steps through stochastic tinkering in the lab and (eventually) the marketplace.

The beauty of human progress is that it doesn't follow any long-term plan: we do what we do, and we tinker and experiment and find things out.

Juan Enriquez can make all the grand pronouncements about the future of humanity he likes, but what he is actually trying to do is raise investment capital for his company Biotechonomy.

And Biotechonomy will pay scientists to tinker and experiment and find things out.

Such is the nature of technological advancement.

Prof Myers ends on a positive note:

Maybe this information age will have as dramatic and as important an effect on humanity as the invention of writing, but even if it does, don't expect a nerd rapture to come of it. Just more cool stuff, and a bigger, shinier, fancier playground for humanity to gambol about in.

Well I certainly agree with that.
