Monday, May 22, 2006

Kurzweil's dangerous idea

On Saturday, 13 May 02006, Stanford University hosted the Singularity Summit, featuring luminaries such as Ray Kurzweil, Douglas Hofstadter, and Eric Drexler. The Singularity is a fascinating concept, pioneered by mathematician John von Neumann, popularised by Vernor Vinge in a famous 1993 paper, and brought to a wide audience by Kurzweil's last couple of books. It is a hypothetical technological event or scenario in the not-too-distant future in which artificial intelligence exceeds human intelligence. In mathematics a singularity is defined as "a point where a mathematical function goes to infinity or is in certain other ways ill-behaved" (think of 1/x as x approaches zero). Applied to historical change, then, it's a sort of metaphor for the experience of scaling the dizzying heights of the curve of accelerating change, which amounts to an historical discontinuity -- an event so sudden and far-reaching that what things look like on the other side is virtually unimaginable.

None of the speakers at the Stanford Summit, as far as I can recall, offered examples of previous singularity-like events. However, sci-fi author Bruce Sterling, in a very entertaining presentation for the Long Now Foundation in 02004, cited three historical instances that provided a foretaste of singularitarian discontinuity: the atomic bomb, LSD, and computer viruses. These were world-reinventing phenomena. Proponents of The (upcoming technological) Singularity claim it would be the mother of them all, bringing a transformation tantamount to the end of human limits, death included.

Now, some futurists use the concept of "wild cards" (high-impact, low-probability events) to think about possible large-scale changes. This can be a useful way to provoke thinking about sudden events that defy the assumptions and practices which collectively constitute "business as usual". A straightforward example of a wild card is the natural disaster -- tsunami, earthquake, or volcanic eruption. But the technological Singularity is not a wild card in this sense. Yes, its advocates (of whom Kurzweil is the chief prophet) characterise it as an extremely high-impact event, but one with high probability attached -- indeed, they say, it's inevitable. Herein lies a cause for concern about the idea of The Singularity, or rather about the way it is presented in some quarters.

John Brockman's 1 January 02006 Huffington Post article collects a range of prominent intellectuals' responses to the annual Edge question, which was simply, "What is your Dangerous Idea?" The intention was to elicit from respondents ideas that could be considered dangerous not because they might be false, but because they might be true. Kurzweil's answer was "The near-term inevitability of radical life extension and expansion." (There are lots of other interesting responses on very different themes -- take a look.)

I like this suggestion. Radical life extension and expansion is certainly a possibility worthy of careful consideration. Much public discussion coasts uneasily on a tissue of assumptions vulnerable to periodic disruption by unforeseen events. In light of current worries about funding the pensions of baby boomers on the verge of retirement, for instance, consider how that challenge would be reshaped by a series of major medical breakthroughs extending their life expectancy. Many current problems would look very different today if they had been thought about carefully much earlier, when they were still just distant possibilities on the horizon. Our political debates have not yet evolved to expect the unexpected, but the opinions about the future found in mainstream discussion, shaped by the need to seem reasonable, are so often wrong that it behooves us to entertain much more outlandish, radical, and yes, dangerous ideas. (Kurt Vonnegut, Slapstick: "History is merely a list of surprises. It can only prepare us to be surprised yet again.") This rhymes with futurist Jim Dator's Second Law: "Any useful idea about the futures should appear to be ridiculous."

I no longer find the Kurzweilian outlook ridiculous -- though until quite recently I suppose I did. The idea of a technological singularity seemed somehow absurd to me, until I read and heard more about it and realized I had not been taking the acceleration of change in computing power seriously enough. (When change is accelerating, more happens during a short period late in the process than during a long period early on.) I think the singularitarian views of Kurzweil and friends, not despite but because of their extremism, can be highly useful as a provocation -- an example of the surprising or apparently ridiculous ideas that become plausible under extremely rapid change: give Moore's law, the exponential growth in computing power, another twenty or thirty years, and we could expect a lot of things to look different. I applaud singularitarians for challenging us to take accelerating change seriously. There are other, very big doubts I have about this line of thinking -- for instance, I am not convinced that other systems, such as ecological ones, will necessarily support smooth sailing up the curve of computational speed for that long -- but those concerns can wait for another time.
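To put a rough number on that intuition, here is a back-of-the-envelope sketch in Python -- my own illustration, not anything presented at the Summit -- assuming a doubling time of about two years, the figure usually quoted for Moore's law:

    # Rough illustration: under steady doubling, most of the eventual capability
    # arrives in the final stretch of the run. The ~2-year doubling time is an
    # assumption (the figure usually quoted for Moore's law), not a measurement.

    DOUBLING_TIME_YEARS = 2.0

    def growth_factor(years: float) -> float:
        """Multiplicative increase in capability after `years` of steady doubling."""
        return 2 ** (years / DOUBLING_TIME_YEARS)

    for horizon in (10, 20, 30):
        total = growth_factor(horizon)
        late_share = (total - growth_factor(horizon - 5)) / total
        print(f"{horizon} years: about {total:,.0f}x overall; "
              f"{late_share:.0%} of that level is added in the last 5 years")

Whatever horizon you pick, the last five years contribute the overwhelming bulk of the final level -- which is exactly why a trend that looks gentle from here could end somewhere very hard to picture.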

What troubles me is the posited inevitability of the Singularity, and the messianic, end-of-days character with which it seems to be imbued. The idea has been described as "the rapture of the nerds" -- a comical name, but one with a disturbing grain of truth about it -- and the problem is right there in the label "singularity": it's a singular, exclusive, totalising conception of the future of humanity and the world, with a distinctly religious, eschatological air about it. In this light, it's a discourse that brooks no dissent, for the corollary message of inevitability is: get on board or get out of the way -- there's nothing else you can do. This notion of inevitability -- applied to *any* scenario that results from human decisions and behaviours, unlike a volcanic eruption -- truly is a dangerous idea. The danger is that people make it inevitable by accepting it as such; to that extent they forgo, or fail to recognise, the need and the opportunity to imagine and pursue their preferred alternatives. We ourselves set these trends in motion, and then disown responsibility for them as if they were acts of God. This is highly suspect from a political as well as a philosophical perspective.

Let's be clear: this is not a luddite-style argument that the "contents" of The Singularity scenario are undesirable (to the extent that such contents can actually be discerned amid all the fluff and excitement). Perhaps it would be a desirable future, even a magnificent one, at least for some. Nor is it an argument that The Singularity can't or won't happen. Perhaps it will. But these are the wrong questions. Who is making these changes occur? Who wants it to happen, and why? What might we do, or refrain from doing, individually and collectively, to bring about the things we'd like to see and to avoid or mitigate the things we don't? The implications of these possibilities for our own action were not the focus of this particular event, but they absolutely should be. Otherwise, what's the point?

The argument I'm trying to make is that some proponents of The Singularity appear to be preparing themselves, and as many of the rest of us as possible, for The One True Future, and proselytising for recruits. Especially if their enthusiasm springs from a perception that stronger-than-human AI would be an unqualified good, those advertising its inevitability are, wittingly or not, colonising the future for us -- ensuring that debates on the other issues (that scenario's preferability and probability) are framed in terms that marginalise dissenting views. They place a human process beyond the scope of any discussion of preferences or values. Hence the Singularity Summit, while extremely well organised, well attended, and thought-provoking -- all of which made being there more than worthwhile -- felt largely like a rally of the faithful. Two sceptical voices, Douglas Hofstadter and Bill McKibben, were featured in the program, yet neither pressed forcefully enough the point that I think most needs making. History is a human story, one that ought to be written collaboratively and carefully, and the Singularitarian who treats a favourite scenario as a fait accompli is, naively or deviously, denying our responsibility to consider the other paths open to us.

That's why I'm a Singularity Sceptic.
