A lot of people like the idea of a "singularity", or similar theories.
Personally, I love the idea. But there is something to remember:
99% of humanity isn't interested in participating.
This creates one hell of a drag! As far as I can see, there are only three ways a "singularity" could occur. On the plus side, I can't see many ways in which a singularity will fail to occur entirely - just ways in which it can be pushed forward or back.
1) The early adopter curve. In theory, a singularity could be steadily "adopted" in, the way the internet has been. This would probably take 20 years per "stage", plus an additional 20 years. However, I don't see this as very likely compared to the other options.
2) A biological upgrade. The reason there's so much cultural drag is that the majority of people are against changing. Offer eternal youth, though, and that 99/1 ratio becomes a 1/99 ratio. Given the remarkable advances in biotech, this is plausible. However, given society's attitude toward such research, providing it would be akin to an act of sheer anarchy. If this route is followed, human society would utterly collapse and be rebuilt in less than twenty years.
3) Carnivores. This is the path I see as most likely. Carnivores don't much care about bringing humanity out of its ancient ways. Instead, we - uh, I mean, they - utilize existing human infrastructure to fund our future, thanks to our vastly improved ability to manipulate it. The end result from the human side is "business is good"... but from our side it's "thanks for the space ship, see you never." The net result is an obsolete, languishing humanity watching the ever-growing exploits of their children, who may be in space or might stay on earth. Please note, this is compatible with type (2), if type (2) bioadvances are restricted, either by law or by cultural norms.
Intelligence amplification and artificial intelligence are both entirely type 3. Most of humanity won't get amped. That's even more true if it's an AI, rather than a human, doing the thinking. I hold that true AI is inevitable, but my short-term bet is on IA, which is happening as we speak. As you read this blog, for example.
The final thought here is a bit unsettling:
All your "learning" is useless, save as practice. What you need to do is adopt practices which increase your capacity for learning and cogitating. You can expand your "vision" by, say, reading really good blogs, but that doesn't actually increase your capacity above what the intellectual excersize provides.
Increase your capacity by every method you can find. For example, making friends with geniuses is a great way to increase both your intelligence and theirs. Unfortunately, this is still very difficult on the internet, due to the lack of peer pressure. Keeping extensive, well-networked data records is another way to increase your intelligence, as is subjecting yourself to extensive memetic control (also known as "reading something incredibly inspiring every day").
Expanding your intelligence in the "correct" direction requires knowing what direction you're expanding in.
I'm still working on that bit...
But if you don't think of your mind as a device that can be improved, you're really missing out. Whatever your intellectual capacity, it can be shoved through the roof if you try.
Don't think about acting. Act about thinking. Then thinking about acting will be so much easier!