The impending Singularity... (advance, TVs, cameras, plasma)
To sum it up for those of us who don't have time to read a novel like this: throughout history, change has been exponential, but like any exponential curve it's unnoticeable until a certain point. Kurzweil thinks the point where technological change (the near-vertical part of the graph) becomes almost infinite from our point of view is within 50 years. We may soon see huge advances in genetics, nanotechnology, and biology along with this, as well as machine intelligences that can improve themselves.
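The "exponential change looks flat until it isn't" idea can be sketched numerically. The doubling period below is arbitrary, chosen only to show the shape of the curve, not any claim Kurzweil actually quantifies this way:

```python
# Illustrative only: a quantity that doubles on a fixed schedule.
def capability(year, base=1.0, doubling_period=2.0):
    """Capability that doubles every `doubling_period` years."""
    return base * 2 ** (year / doubling_period)

# Early on, year-over-year change is tiny in absolute terms...
early_delta = capability(11) - capability(10)
# ...but the same relative growth later produces enormous jumps.
late_delta = capability(51) - capability(50)

print(early_delta)  # roughly 13
print(late_delta)   # roughly 13.9 million
```

The relative growth rate never changes; only our perception of the absolute jumps does, which is the whole point of the "knee of the curve" argument.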
Hmm, this is a fun topic to discuss.
There have been numerous movies and other stories based in part on some of these theories: Terminator, I, Robot, 2001: A Space Odyssey, The Matrix, etc. The whole machine-learning thing, such that machines could possibly develop consciousness or at least self-awareness, is possible, just not in my lifetime. The level of complexity required to achieve that is unreal. We don't even have computer vision or computer hearing down. I'm a computer engineer by training, and I remember sitting in a class where the professor discussed the ability of the human brain and eye to "process" vast amounts of visual information in "real time" (depth perception, motion, relative motion of several objects, etc.). What's also very interesting is the human brain's ability to bring a particular conversation to the foreground in a noisy room, while other conversations are going on. Computers are fast and can process the digital information of sight, images, and audio, but they are a very, very long way from being able to do what the human brain can do.
Anyway, all that is to say that there has got to be a breakthrough technology, something on par with the transistor back in the late 1940s, that makes the computer world we live in possible. I just don't see that type of quantum leap on the horizon. Do you?
In regards to genetics though, I do believe we'll see a LOT more advancements (and tampering) in this area. *That* is definitely growing at a very fast pace. I truly believe we'll see real cures for cancers, the ability to correct genetic defects after conception (tampering), etc.
I think the answer to much of this is in evolving nanotechnology in which cells and technology sort of work in unison. As for right now, much of it is within the realm of science fiction.
The thing is that I don't imagine humans will ever be able to "program" a computer program to be as versatile as the human mind. It's just too complicated. However, I firmly think that if you can build a "base" program that evolves with positive and negative feedback loops that you can make headway with it.
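A minimal sketch of the "base program that evolves with feedback loops" idea is a random-mutation hill climber. Everything here (the bit-string target, the fitness function, the mutation scheme) is illustrative, not any specific AI method:

```python
import random

TARGET = [1] * 20  # illustrative goal state

def fitness(candidate):
    """Positive feedback signal: count of bits matching the target."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def evolve(generations=500, seed=0):
    rng = random.Random(seed)
    current = [rng.randint(0, 1) for _ in TARGET]
    for _ in range(generations):
        mutant = current[:]
        mutant[rng.randrange(len(mutant))] ^= 1  # flip one random bit
        # Keep the mutant only if feedback doesn't worsen fitness
        # (positive loop); discard it otherwise (negative loop).
        if fitness(mutant) >= fitness(current):
            current = mutant
    return current

print(fitness(evolve()))  # typically 20, a perfect match
```

The point the post is making holds even in this toy: nobody "programs" the solution directly; the feedback loop finds it.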
I liken it to how you see "code-breakers" work in the movies. A thief hooks a scanner up to the alarm system, and it cycles through the digits individually until the correct number is obtained. It then moves on to the next number, and the next, until the entire code is cracked. Basically, it's a brute-force search made tractable by per-digit feedback.
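The movie scenario can be sketched in a few lines. The key (and unrealistic) assumption is the hypothetical per-digit oracle: because the lock leaks feedback about each position independently, the search is linear per digit instead of exponential over the whole code:

```python
SECRET = "4812"  # illustrative secret code

def digit_matches(position, guess):
    """Hypothetical per-digit oracle, as in the movie scenario."""
    return SECRET[position] == guess

def crack(code_length=4):
    cracked = ""
    for pos in range(code_length):
        for digit in "0123456789":   # cycle digits at this position
            if digit_matches(pos, digit):
                cracked += digit     # lock it in, move to the next position
                break
    return cracked

print(crack())  # "4812" -- at most 10 * 4 = 40 guesses instead of 10**4
```

A real lock gives only a single pass/fail answer for the whole code, which is exactly why real brute force is so much harder than the movie version.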
Kurzweil makes one basic error (among many). He often takes what is being said and reported at face value, without bothering to assess the counterpoints before commenting. An example is the rapid deciphering of the human genome. Yes, it was fast, but there were also some skips and shortcuts involved in that declaration of completion.
His report, written in 2001, missed the subsequent meltdown of the tech sector, shows an almost childlike view of the Fed's impact when it changes interest rates, and, most puzzling of all, discusses how microchips will be three-dimensional in the future. They already are three-dimensional: subject to restrictions from thermal weakness and the need for cooling, chips have had stacked and interconnected circuitry for a long time.
He also neglects the impact of the various religions. The world's spiritual belief system isn't just Christian and Muslim. Opposition to science from some sects of both religions has already been shown to be a factor, and we haven't even scratched the surface of other religions.
Then there is the fact that a substantial, and potentially growing, portion of the world's population exists in abject poverty and, to put it delicately, reasons more from superstition than logic. Of what use is a singularity to a family eating mud and grass to survive?
I'm reminded, when reading his idea of a singularity, of an old science fiction story, where monks in an isolated and forgotten monastery have been working for centuries writing down all the names of God. On the final day of their work, having written every single name, the stars begin, one by one, winking out of existence. The purpose of mankind, according to the precepts of those monks, has been fulfilled.
In Kurzweil's version, he posits a "singularity." If he were correct, and all knowledge became known and available at that instant, what next? The stars winking out would be as good a follow-up as anything else. The purpose has been fulfilled.
An area that Kurzweil touches on holds as much probability of happening as his singularity, perhaps more. As science and industry leverage the ability of an individual or group to impact the world (TNT vs. black powder is an example), more individuals will have access to increasingly dangerous tools as time progresses. If the wrong person or group gains control of the wrong technology, the world or part of it could become history in a very short period of time. His exponential curve will more likely turn into a wave.
We already have a whole host of machine-vision products that use cameras and PCs to look at things in real time, evaluate them, and determine whether a product is good or bad. They are a LOT more efficient than human eyes and brains for this task because it's a hard, set rule that they can follow.
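A toy version of that kind of hard, set rule: the inspection below flags a part as bad if too many pixels fall under a brightness threshold. The image values and thresholds are made up for illustration; real machine-vision systems work on camera frames, not hand-typed lists:

```python
def inspect(part_image, min_brightness=50, max_defect_pixels=3):
    """Pass/fail a part by a fixed brightness rule (illustrative)."""
    defects = sum(1 for pixel in part_image if pixel < min_brightness)
    return "good" if defects <= max_defect_pixels else "bad"

clean_part = [200, 210, 198, 205, 199, 201]
scratched_part = [200, 10, 12, 8, 5, 201]

print(inspect(clean_part))      # good
print(inspect(scratched_part))  # bad
```

This also illustrates the post's broader point: the system is fast and tireless precisely because the rule is rigid, which is the opposite of the open-ended flexibility a human inspector brings.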
Whether or not we'll be able to see "artificial intelligence"... that's debatable. We have algorithms, and computers have the ability to "learn." The easiest examples are Furby dolls, which you can teach words and so on. We can teach computers things, but only within the limits that we specify.
I think it was Mark Twain who, when asked what he thought of Western Civilization, said "I think it would be a good idea."
The news today is that a kid who beheaded a man was found guilty,
the stock market rose, even though real inflation is killing the middle class,
the Pope calls sex abuse scandal a 'deep shame',
and...
Yea, but the dog died of renal failure! A wise use of technology and the media.
Read Isaac Asimov from the '50s. All we are doing is refining everything figured out or invented previously.
Yep, not a whole lot of revolution going on right now, unless you count plasma TVs...
Quantum computing, now that would be revolutionary.