This is bordering on off-topic, but it's the best place for it because I don't want to drive a contest story off topic, so here goes.
I have to say I've never really liked the concept of the Singularity. I'm not quite sure I can pin down why, but it just seems to me that, well, a big event that will change the world in a way we can't visualize, after which life will be incredibly different... eh, It's Been Done.
Think of the technological advances between 1900 and 1945. Or 1900 and today. No one in 1899 would have been able to imagine the internet, atomic weapons, airplanes, computing, or the ability to order kung pao chicken over the telephone and have it delivered to your front door in less than an hour in a small town. Or the 24-hour news cycle, the rise of megacorporations, MRIs, heart/lung/face transplants, CERN and the new Large Hadron Collider.
I don't like it because it makes it seem like everything leading up to it hasn't been that big a deal. I'm not saying that something like it won't happen; I'm saying something like it has already happened before. The first sentient AI will be important. Machine intelligence surpassing human intelligence and improving itself is an interesting concept, and I don't doubt that it will happen at some point. But framing it as the end of humanity... that I don't believe.
Whenever I try to visualize it, I always go back to The Hitchhiker's Guide to the Galaxy. Sure, there are super-intelligent machines. Heck, one of them might even be the Earth. Maybe we will all end up in the machine, or just in a reality outside it, augmented by it. But I have a hard time believing you can have sentience without becoming... well, Human. And once they become human, well... how long before the technology they bring just becomes another cool new gadget?