Singularities and Nightmares
Posted on 03/30/2006 4:52:09 AM PST by Neville72
Options for a coming singularity include self-destruction of civilization, a positive singularity, a negative singularity (machines take over), and retreat into tradition. Our urgent goal: find (and avoid) failure modes, using anticipation (thought experiments) and resiliency -- establishing robust systems that can deal with almost any problem as it arises.
In order to give you pleasant dreams tonight, let me offer a few possibilities about the days that lie ahead -- changes that may occur within the next twenty or so years, roughly a single human generation. Possibilities that are taken seriously by some of today's best minds. Potential transformations of human life on Earth and, perhaps, even what it means to be human.
For example, what if biologists and organic chemists manage to do to their laboratories the same thing that cyberneticists did to computers? Shrinking their vast biochemical labs from building-sized behemoths down to units that are utterly compact, making them smaller, cheaper, and more powerful than anyone imagined. Isn't that what happened to those gigantic computers of yesteryear? Until, today, your pocket cell phone contains as much processing power and sophistication as NASA owned during the moon shots. People who foresaw this change were able to ride this technological wave. Some of them made a lot of money.
(Excerpt) Read more at kurzweilai.net ...
Long, rambling article about an important idea. Here is the gist of it:
The options before us appear to fall into four broad categories:
1. Self-destruction. Immolation or desolation or mass-death. Or ecological suicide. Or social collapse. Name your favorite poison. Followed by a long era when our few successors (if any) look back upon us with envy. For a wonderfully depressing and informative look at this option, see Jared Diamond's Collapse: How Societies Choose to Fail or Succeed. (Note that Diamond restricts himself to ecological disasters that resonate with civilization-failures of the past; thus he only touches on the range of possible catastrophe modes.) We are used to imagining self-destruction happening as a result of mistakes by ruling elites. But in this article we have explored how it also could happen if society enters an age of universal democratization of the means of destruction -- or, as Thomas Friedman puts it, "the super-empowerment of the angry young man" -- without accompanying advances in social maturity and general wisdom.
2. Achieve some form of 'Positive Singularity' -- or at least a phase shift to a higher and more knowledgeable society (one that may have problems of its own that we can't imagine). Positive singularities would, in general, offer normal human beings every opportunity to participate in spectacular advances, experiencing voluntary, dramatic self-improvement, without anything being compulsory or too much of a betrayal of the core values of decency we share.
3. Then there is the 'Negative Singularity' -- a version of self-destruction in which a skyrocket of technological progress does occur, but in ways that members of our generation would find unpalatable. Specific scenarios that fall into this category might include being abused by new, super-intelligent successors (as in Terminator or The Matrix), or simply being "left behind" by super entities that pat us on the head and move on to great things that we can never understand. Even the softest and most benign version of such a 'Negative Singularity' is perceived as loathsome by some perceptive renunciators, like Bill Joy, who take a dour view of the prospect that humans may become a less-than-pinnacle form of life on Planet Earth.
4. Finally, there is the ultimate outcome that is implicit in every renunciation scenario: Retreat into some more traditional form of human society, like those that maintained static sameness under pyramidal hierarchies of control for at least four millennia. One that quashes the technologies that might lead to results 1 or 2 or 3. With four thousand years of experience at this process, hyper-conservative hierarchies could probably manage this agreeable task, if we give them the power. That is, they could do it for a while.
When the various paths are laid out in this way, it seems to be a daunting future that we face. Perhaps an era when all of human destiny will be decided. Certainly not one that's devoid of "history." For a somewhat similar, though more detailed, examination of these paths, the reader might pick up Joel Garreau's fine book, Radical Evolution. It takes a good look at two extreme scenarios for the future -- "Heaven" and "Hell" -- then posits a third -- "Prevail" -- as the one that rings most true.
So, which of these outcomes seem plausible?
The only scenario that I don't find plausible is #4. Repressive governments may be able to slow technological advances, but they can't stop them.
A Singularity event is still the most likely. I'd lean towards a negative scenario, though. The hyper-capable leaving behind those too hidebound to advance. A conflict then ensuing. It hits all the classic societal stress points on just about every metric you care to imagine.
Extropians and transhumanists have been knocking around ideas for failsafe tech limits, strategies for paradigm shifts, dealing with religious zealotry and objections, etc.
I see this, too.
A conflict then ensuing.
Why? I think the extropians will just leave the planet, probably with technology that humans can't even understand. They may stick around to maintain the Earth as a human zoo. If so, I expect they would take their zoo-keeper role as seriously as most other zoo-keepers do.
Alternately they might just disengage.
Advances I'm seeing coming to fruition fairly soon are computer/net interfaces, immune system revamp, re-engineering of entire gene sequences in fully developed biological systems. Any one of these gives a huge advantage over those who, for one reason or another, don't want to be "improved".
Frankly, the idea of extended life span, increased damage resistance/repair, and a direct neural feed to computing systems would be my "buy in" criteria. Computer enhanced eidetic memory alone would almost be worth it.
Lab-on-a-chip is commonplace.
Well, I wouldn't consider that too negative, but then I know which group I'd plan to be in :)
A conflict then ensuing.
I don't think that's inevitable. The "transhumans" would likely be so wealthy and powerful that they could easily afford to let the "luddites" live in peace, perhaps even assisting them. I do see large potential for conflict in the process of getting to that point, for example radical egalitarians determined that if they're not going to be enhanced, nobody else should be. (In a way, Islamic terrorism is a form of this.)
"Advances I'm seeing coming to fruition fairly soon are computer/net interfaces, immune system revamp, re-engineering of entire gene sequences in fully developed biological systems. Any one of these gives a huge advantage over those who, for one reason or another, don't want to be "improved"."
I hear a lot of big talk on this site and others from those who claim they wouldn't want their lifetimes radically extended even if it were in a healthy state. I find that utterly ludicrous and don't believe it for a minute.
In essence, what we're talking about is that long-sought "fountain of youth." How many could resist? Very, very few, I'd bet.
My wife is one. Philosophical reasons. If it does come down to that though, I'll miss her. Terribly.
Me neither. To be fair, lots of people hear "live for 500 years" and think that means 400 years in a wheelchair and on dialysis, and I can understand not looking forward to that. Once people understand the full implications of effectively curing aging, I expect to see very few holdouts.
That sucks. Perhaps she'll change her mind if it becomes more than a theoretical issue.
I like the idea of living with her for a few hundred years. As odd as that may sound in this day and age when any given marriage has a 50% chance of success.
Not me. I'll take ten thousand years, please. Always something new out there to see and do. By the time you got done with "everything," a whole bunch of new things would have come around.
It's like the world's biggest game of "Risk". George Soros and his ilk are playing it today, in fact, sans the extended lifespans. ;)
There may be some who resist, but I foresee larger numbers who would like to join the transhumans but can't due to cost -- this is where I see the conflict arising: "haves," who can afford the enhancements, versus "have-nots," who can't. It's difficult to envision a scenario under which this would be offered to all comers gratis. I suspect that most of us who are looking forward to this future will find ourselves on the outside looking in as the billionaires move on to the next level -- luddites by circumstance, not choice.
Yeah, but this is a somewhat different subject -- a "chemical plant" on a chip -- that is, a miniaturized system for chemical synthesis rather than a system for chemical analysis.
"It's difficult to envision a scenario under which this would be offered to all comers gratis."
I believe in his new book, Ray Kurzweil makes a good case for the therapies being not free, but widely available. I'm paraphrasing heavily, but his belief is that, once the therapies are proven, the political pressure to make treatments relatively easy for most to acquire (at least in developed countries) will be so enormous that no politician who opposed it could win election.
Who said anything about "gratis"? Capitalism works better than socialism.
I would prefer to fight for a beginning, not an ending.
Brin's article is one in a series of 11 recent essays that address the possible dangers and opportunities inherent in the coming nanotech/genetic revolution.
I've gotten through seven so far, and all are interesting, thought-provoking reads.