
I’m reading a recent book called The Singularity Is Near: When Humans Transcend Biology.

It’s written by Ray Kurzweil, a fairly well-known futurist who predicted, among other things, the Internet as we know it (back in the early 80s) and, essentially, the iPhone, and who has developed a number of interesting technologies himself. Kurzweil makes a pretty interesting case that by 2045 (his predictions are deliberately specific, to invite comparison when the time comes) humans will have become essentially immortal.

Now, putting aside the ramifications of immortality for a moment, how do we get from where we are today to immortality in a mere 40 years?

The answer, it seems, is the exponential growth of knowledge and technology. According to Kurzweil, by 2020 we’ll be well into the nanotechnology economy, where we’ll be able to make very small devices easily and affordably. In addition, the computer of 2020 will have access to so much data at such high speeds that we’ll be able to start integrating humans and technology together.

I’m way oversimplifying here, but the point is that by 2045, we’ll have nanobots as part of our biology that clean up disease, repair injuries, de-age us, and grant us access to limitless amounts of information and computing power that will, by then, be seamless.
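(As a rough illustration of why the exponential framing does so much work in his argument, here's a toy calculation in Python. The doubling period and the capabilities it stands in for are my own illustrative assumptions, not Kurzweil's actual model; the only point is how fast steady doubling compounds.)

# Toy illustration of exponential growth (illustrative numbers, not Kurzweil's model):
# a capability that doubles every 2 years grows roughly a million-fold over 40 years,
# which is the kind of compounding the 2045 argument leans on.

def growth_factor(years: int, doubling_period_years: float = 2.0) -> float:
    """Return how many times larger a capability is after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    for years in (10, 20, 40):
        print(f"after {years} years: about {growth_factor(years):,.0f}x")
    # after 40 years: about 1,048,576x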

It’s a very good read, though I’m pretty skeptical, since humans, by nature, tend to be more conservative in adopting things than what is theoretically possible.

More info

The Singularity Is Near: When Humans Transcend Biology
Comments
on Aug 29, 2009

Surely there will be "anti that" cults though...

on Aug 29, 2009

I dunno if a technological singularity is that close... as I understand it, predictions about elements of the idea tend to be overly optimistic.

I haven't read Kurzweil's stuff (and I've only skimmed across most singularity research... one of these days) but I'm willing to wager that true immortality of the kind you're describing isn't going to be around in 40 years. What I could see, however, is that human lifespan will have been significantly extended by that time in less comprehensive ways, buying time for new research to extend it even further, etc.

That's what Aubrey de Grey calls "longevity escape velocity", and the idea is sound in principle. His assertion that the first thousand-year-old might be 50 years old today seems optimistic (to say the least!), but he does make a surprisingly good case for his ideas.
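(For what it's worth, here's a toy sketch of that "buying time" dynamic in Python. All the numbers, the starting expectancy and the size and growth of the yearly research gains, are made up for illustration and aren't de Grey's figures; the only point is that once research adds more than one year of expectancy per calendar year, remaining lifespan stops falling.)

# Toy model of "longevity escape velocity" (made-up numbers, purely illustrative):
# each calendar year uses up one year of remaining life expectancy, but ongoing
# research adds some years back. If that yearly gain keeps growing and crosses
# 1.0 year per year, remaining expectancy stops shrinking and starts to climb.

def remaining_expectancy(years=60, start_remaining=40.0,
                         initial_gain=0.2, gain_growth=1.08):
    """Track remaining life expectancy under steadily improving therapies.

    initial_gain -- extra years of expectancy added by research in year 0 (assumed)
    gain_growth  -- multiplicative improvement in that gain per year (assumed)
    """
    remaining, gain, history = start_remaining, initial_gain, []
    for year in range(years):
        remaining = remaining - 1.0 + gain   # live a year, bank the research gains
        gain *= gain_growth                  # research keeps compounding
        history.append((year, remaining, gain))
        if remaining <= 0:
            break
    return history

if __name__ == "__main__":
    for year, remaining, gain in remaining_expectancy():
        print(f"year {year:2d}: {remaining:5.1f} years left, +{gain:.2f}/yr from research")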

I guess we'll see.

on Aug 31, 2009

The timing of the “singularity” obviously depends on how fast humans merge with computers. Naturally evolving a smarter human takes millions of years. The sci-fi aliens with big heads to hold their big brains wouldn’t happen in reality, because the inevitable merging with the synthetic would relieve any evolutionary pressure to make their brains larger.

Once we start down a path that can double baseline human intelligence for participants every two years, the singularity won’t be far off. As to why I say inevitable: most people won’t turn away from healthy immortality; our survival instinct is too strong. However, there will be many who reject all or part of it. It will be a dark time for humanity as we begin to separate into three distinct species.

The ones who embrace it will become a new species, both hated and feared by the rest. They will dictate the future.

“They hate us because they know that when the end comes, we will be all that is left.” – AI

The ones who embrace parts of it but refuse to give up their biological nature will not die from natural causes. Because of this longevity, and with the ability to accelerate biological evolution, they will keep up with the first group for some time; however, they’ll eventually be as left behind as the last group.

The purists, who will certainly benefit from the new technologies introduced by this new species, won’t understand them, and while respected and revered, they’ll essentially be a kept race.

As to whether the AI quote comes to pass, that depends on the nature of the intelligence we spawn and what value it sees in us. We would not be a threat to their existence, so I don’t see why they would have a problem with a sentient species following its own path. This is why I believe the merging of the biological and the synthetic is very important, rather than trying to create an AI from scratch. No matter what you became, you would always think of yourself as human.