Escape Artists

Title: Singularity?
Post by: nebulinda on February 13, 2007, 04:13:24 AM
I am a relative newcomer to science fiction, having spent most of my reading time in fantasy. I've seen the term "singularity" come up a lot, and I don't really have a clue what it means. As an astrophysics student, when I think of singularities, I think of collapsed stars, the centers of black holes.

What is a singularity in an SF context?
Title: Re: Singularity?
Post by: SFEley on February 13, 2007, 04:33:57 AM
What is a singularity in an SF context?

Short answer: it refers to a predicted near-future event when intelligence on Earth (either AI or machine-augmented human intelligence) advances to the point where it becomes capable of further improving itself exponentially, resulting in an explosion of new technology and a world that is literally unimaginable to human intelligence today. 
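To make that feedback loop concrete, here's a deliberately crude toy sketch in Python (every number in it is invented; it's only meant to show the shape of the curve):

  # Crude sketch of recursive self-improvement: the smarter the system
  # already is, the bigger the improvement it can make to itself each
  # "generation." All figures are invented for illustration.
  capability = 1.0                    # 1.0 = roughly human-level (arbitrary units)
  for generation in range(1, 11):
      capability += 0.5 * capability  # smarter systems make bigger improvements
      print(f"generation {generation}: about {capability:.1f}x the starting level")

Any loop like that, where the current level of intelligence feeds back into how fast the next level arrives, gives you exponential growth rather than the steady progress we're used to extrapolating.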

The term was first coined by Vernor Vinge in the 1980s and developed in a 1993 paper presented to a NASA symposium (http://rohan.sdsu.edu/faculty/vinge/misc/singularity.html), whose abstract begins:  "Within thirty years, we will have the technological means to create superhuman intelligence.  Shortly after, the human era will be ended."

Long answer:
http://en.wikipedia.org/wiki/Technological_singularity
Title: Re: Singularity?
Post by: JaredAxelrod on February 13, 2007, 07:26:06 PM
Interesting side note: Mr. Vinge is giving a lecture in San Francisco this Thursday, according to Bruce Sterling (http://blog.wired.com/sterling/2007/02/why_singularity.html).

The title? "What If the Singularity Does NOT Happen?"
Title: Re: Singularity?
Post by: Heradel on February 13, 2007, 07:50:28 PM
This is bordering on off-topic, but this is the best place for it, since I don't want to drive a contest story thread off topic. So here goes.

I have to say I've never really liked the concept of the Singularity. I'm not quite sure I can pin down why, but it just seems to me that, well, a big event that will change the world in ways we can't visualize, after which life will be incredibly different... eh, It's Been Done.

Think of the technological advances between 1900 and 1945. Or 1900 and today. No one in 1899 would have been able to imagine the internet, atomic weapons, airplanes, computing, or the ability to order kung pao chicken over the telephone and have it delivered to your front door in less than an hour in a small town. Or the 24-hour news cycle, the rise of megacorporations, MRIs, heart/lung/face transplants, CERN and the new Large Hadron Collider.

I don't like it because it makes it seem like everything leading up to it hasn't been that big a deal. I'm not saying that something like it won't happen, but I'd argue that something very much like it has already happened before. The first sentient AI will be important. Machine intelligence surpassing human intelligence and then improving itself is an interesting concept, and I don't doubt that it will happen at some point. But making it seem like it's the end of humanity... that I don't believe.

Whenever I try to visualize it I always go back to Hitchhiker's Guide to the Galaxy. Sure, there are super-intelligent machines. Heck, they might even be the Earth. Maybe we will all end up in the machine, or just in a reality outside it augmented by it. But I have a hard time believing you can have sentience without becoming... well, Human. And once they become human, well... how long before the technology they bring in just becomes another cool new gadget?

Title: Re: Singularity?
Post by: Russell Nash on February 14, 2007, 07:58:44 PM
I don't think the question is, "What happens when they equal us?" It's, "What happens when they greatly surpass us?"
Title: Re: Singularity?
Post by: SFEley on February 14, 2007, 08:56:46 PM
I don't think the question is, "What happens when they equal us?" It's, "What happens when they greatly surpass us?"

Right. 

Another framing of singularity theory is one made popular by Ray Kurzweil, who talks about exactly what Heradel said: that technology has already transformed the world many times, including several transformations in the last century, and that those world-changing advances are coming faster and faster.  If the acceleration of technological development continues, then sometime within the next couple of decades we'll hit a point where the world becomes unrecognizable from one day to the next.
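Mathematically it's a claim about a geometric series: if each world-changing advance arrives in some fixed fraction of the time the previous one took, the gaps add up to a finite number of years, and everything past that point piles up around a definite date. Here's a deliberately crude toy sketch in Python; every number in it is invented for illustration and has nothing to do with Kurzweil's actual figures:

  # Toy model: each "paradigm shift" arrives in 60% of the time the
  # previous one took (made-up numbers, purely for illustration).
  gap = 40.0      # years between the first two shifts
  ratio = 0.6     # each gap is 60% as long as the one before it
  year = 2007.0   # arbitrary starting point for the toy timeline
  for shift in range(1, 16):
      year += gap
      print(f"shift {shift:2d} lands around {year:.1f}; the next gap is {gap * ratio:.2f} years")
      gap *= ratio
  # The gaps sum to 40 / (1 - 0.6) = 100 years, so in this toy model every
  # shift, no matter how many you count, lands before roughly 2107.

However you tweak those made-up numbers, as long as each gap shrinks by a constant factor the shifts pile up before a fixed date, which is the sense in which the curve has a "singularity" in it.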

Of course, human minds are completely unable to cope with such rapid change.  So either the pace of development has to flatten out before that happens, or the inhabitants who function comfortably in that world will have to have minds with capabilities greater than what we today call "human."

That's the singularity.