Welcome to City-Data.com Forum!
Old 12-17-2013, 07:09 AM
 
18,547 posts, read 15,586,958 times
Reputation: 16235


Quote:
Originally Posted by Josseppie View Post
I already posted about the new 3D molecular circuits that will be the next paradigm. So you don't have to look back, here is a link to that article:

March 16, 2010 -- The features on computer chips are getting so small that soon the process used to make them, which has hardly changed in the last 50 years, won’t work anymore. One of the alternatives that academic researchers have been exploring is to create tiny circuits using molecules that automatically arrange themselves into useful patterns. In a paper that appeared Monday in Nature Nanotechnology, MIT researchers have taken an important step toward making that approach practical.


Here is another example that proves computers will continue to advance exponentially even after the current paradigm, the integrated circuit, is over.


Notice this is from MIT Technology Review and has nothing to do with Ray Kurzweil.

A new breed of computer chips that operate more like the brain may be about to narrow the gulf between artificial and natural computation—between circuits that crunch through logical operations at blistering speed and a mechanism honed by evolution to process and act on sensory input from the real world. Advances in neuroscience and chip technology have made it practical to build devices that, on a small scale at least, process data the way a mammalian brain does. These “neuromorphic” chips may be the missing piece of many promising but unfinished projects in artificial intelligence, such as cars that drive themselves reliably in all conditions, and smartphones that act as competent conversational assistants.

“Modern computers are inherited from calculators, good for crunching numbers,” says Dharmendra Modha, a senior researcher at IBM Research in Almaden, California. “Brains evolved in the real world.” Modha leads one of two groups that have built computer chips with a basic architecture copied from the mammalian brain under a $100 million project called Synapse, funded by the Pentagon’s Defense Advanced Research Projects Agency.

The prototypes have already shown early sparks of intelligence, processing images very efficiently and gaining new skills in a way that resembles biological learning. IBM has created tools to let software engineers program these brain-inspired chips; the other prototype, at HRL Laboratories in Malibu, California, will soon be installed inside a tiny robotic aircraft, from which it will learn to recognize its surroundings.

The link: Processors That Work Like Brains Will Accelerate Artificial Intelligence | MIT Technology Review
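For readers unfamiliar with what "process data the way a mammalian brain does" means in practice: the basic hardware unit on neuromorphic chips like those in the SyNAPSE project is a spiking neuron. Below is a minimal leaky integrate-and-fire sketch in Python; it illustrates the general model only, not IBM's or HRL's actual circuits, and every parameter value is made up for demonstration.

```python
# A minimal leaky integrate-and-fire (LIF) neuron, the basic unit that
# neuromorphic chips implement in hardware. Illustrative sketch only;
# threshold and leak values below are arbitrary.

def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Simulate membrane potential over an input sequence; return spike times."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(input_current):
        potential = potential * leak + current  # leak, then integrate input
        if potential >= threshold:              # fire when threshold is crossed
            spikes.append(t)
            potential = 0.0                     # reset after a spike
    return spikes

if __name__ == "__main__":
    # Constant weak input: the neuron integrates until it crosses threshold,
    # fires, resets, and repeats -- event-driven, unlike a clocked CPU.
    print(lif_neuron([0.3] * 20))  # -> [3, 7, 11, 15, 19]
```

The point of the design is that computation happens only when spikes occur, which is why such chips can process sensory data far more efficiently than a conventional processor polling on every clock tick.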
That article is very conservative in its claims, using language like "researchers have been exploring" and "could lead to".

You said "proof"

Please explain how you bridge the gap from "possibly", "exploring", etc. to "PROOF".

Also, please explain how you are not "conveniently ignoring" the fact that the researchers have not yet solved the heat-dissipation problem, which must be overcome to get past the limits discussed in Herb Sutter's article "The Free Lunch Is Over".

Finally, how old are you? If you're wrong, will I be able to laugh at you while you're still alive?

 
Old 12-17-2013, 09:38 AM
 
Location: Pueblo - Colorado's Second City
12,262 posts, read 24,461,491 times
Reputation: 4395
Quote:
Originally Posted by ncole1 View Post
That article is very conservative in its claims, using language like "researchers have been exploring" and "could lead to".

You said "proof"

Please explain how you bridge the gap from "possibly", "exploring", etc. to "PROOF".

Also, please explain how you are not "conveniently ignoring" the fact that the researchers have not yet solved the heat-dissipation problem, which must be overcome to get past the limits discussed in Herb Sutter's article "The Free Lunch Is Over".

Finally, how old are you? If you're wrong, will I be able to laugh at you while you're still alive?
That is how researchers talk. I am sure if we went back to when they were developing the integrated circuit, they would have been saying the same thing. However, everything I read says it will be ready by 2020, well before the integrated circuit runs out of steam, so computers should keep advancing exponentially for the foreseeable future.

I am 40, so I will be in my 50s in the 2020s, chronologically at least, but by then, with advancements in biology and genetics, we will have reversed aging, so I will be biologically in my early 20s.
 
Old 12-17-2013, 04:48 PM
 
Location: Pueblo - Colorado's Second City
12,262 posts, read 24,461,491 times
Reputation: 4395
I happened to run across this video put out by IBM on what we will see in the next five years in personalized cancer treatment. Since one of the things I talk about with regard to the singularity is how genetics and biology will advance exponentially, allowing us to cure most diseases and live for a very long time (if not forever) by 2030, I decided to post this. One of the things we will see by 2023 is reversed aging and the ability to turn off the so-called fat gene, all thanks to genetics and biology. If we are going to see this kind of advancement in five years, imagine what we will see in 10-15 years.


 
Old 12-18-2013, 04:40 PM
 
Location: Pueblo - Colorado's Second City
12,262 posts, read 24,461,491 times
Reputation: 4395
Interesting. I need to take the time to watch them...
 
Old 12-18-2013, 04:43 PM
 
Location: Pueblo - Colorado's Second City
12,262 posts, read 24,461,491 times
Reputation: 4395
One of the things I talk about is how each paradigm leads to a better and smaller one. Currently we are going from the smartphone paradigm to wearable computers. The next paradigm, which will take place in the 2020s, will be computers inside us, and that will lead to the singularity. There is an article written on the new wearable-computer paradigm, so I thought I would post it here.

Why Wearable Tech Will Be as Big as the Smartphone


The link: Why Wearable Tech Will Be as Big as the Smartphone | Gadget Lab | Wired.com
 
Old 12-18-2013, 05:30 PM
 
18,547 posts, read 15,586,958 times
Reputation: 16235
Quote:
Originally Posted by Josseppie View Post
That is how researchers talk. I am sure if we went back to when they were developing the integrated circuit, they would have been saying the same thing. However, everything I read says it will be ready by 2020, well before the integrated circuit runs out of steam, so computers should keep advancing exponentially for the foreseeable future.

I am 40, so I will be in my 50s in the 2020s, chronologically at least, but by then, with advancements in biology and genetics, we will have reversed aging, so I will be biologically in my early 20s.
No, this is not necessarily how researchers talk. When actual data already exist (once the technology is mature) they do not.

There is a distinction between speculative lines of investigation and a proven (demonstrated) technology.

What we have are a) an empirical falsification of a continuing exponential on CPU speed; and b) a speculative exploration of new technologies which could mature later and allow continuing improvement in computation.

If the technology does not increase the clock speed of a single-core Intel CPU to more than 5 GHz, or increase performance per unit of heat dissipated by more than 10%, on or before December 31st, 2020, will you accept that you are wrong?
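For context on why the heat-dissipation limit is so hard to get around: dynamic switching power in CMOS scales roughly as P = C·V²·f, so once supply voltage stopped shrinking with each process generation (the end of Dennard scaling), raising the clock raised heat almost linearly. The sketch below illustrates that scaling in Python; the capacitance and voltage figures are made up for demonstration, not taken from any real chip.

```python
# Why clock speed hit a wall: dynamic power in CMOS scales roughly as
# P = C * V^2 * f (switched capacitance x voltage squared x frequency).
# The numbers below are illustrative only.

def dynamic_power(capacitance_nf, voltage_v, freq_ghz):
    """Dynamic switching power in watts (C in nF, f in GHz)."""
    return capacitance_nf * 1e-9 * voltage_v ** 2 * freq_ghz * 1e9

# The same hypothetical core at 3 GHz versus 6 GHz at the same voltage:
p3 = dynamic_power(20, 1.2, 3.0)   # ~86 W
p6 = dynamic_power(20, 1.2, 6.0)   # ~173 W -- double the heat to dissipate
print(round(p3), round(p6))
```

Doubling the frequency at fixed voltage doubles the power, and in practice higher frequencies also demand higher voltage, making it worse than linear. This is why the industry pivoted to multicore instead of faster clocks.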

I'm waiting for the big day when I can come back to this thread with a big laugh...
 
Old 12-18-2013, 07:22 PM
 
Location: Pueblo - Colorado's Second City
12,262 posts, read 24,461,491 times
Reputation: 4395
Quote:
Originally Posted by ncole1 View Post
No, this is not necessarily how researchers talk. When actual data already exist (once the technology is mature) they do not.

There is a distinction between speculative lines of investigation and a proven (demonstrated) technology.

What we have are a) an empirical falsification of a continuing exponential on CPU speed; and b) a speculative exploration of new technologies which could mature later and allow continuing improvement in computation.

If the technology does not increase the clock speed of a single-core Intel CPU to more than 5 GHz, or increase performance per unit of heat dissipated by more than 10%, on or before December 31st, 2020, will you accept that you are wrong?

I'm waiting for the big day when I can come back to this thread with a big laugh...
So if I understand you correctly, your argument is that computers will stop advancing exponentially sometime in the 2020s?
 
Old 12-19-2013, 12:55 PM
 
18,547 posts, read 15,586,958 times
Reputation: 16235
Quote:
Originally Posted by Josseppie View Post
So if I understand you correctly, your argument is that computers will stop advancing exponentially sometime in the 2020s?
No, my argument is that they already have stopped. Even if they restart at some point in the future, the continuous exponential has been broken.
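The break ncole1 describes is easy to see in the clock-speed record. The figures below are rough public numbers for top mainstream Intel desktop parts (about 100 MHz in 1994, about 3.8 GHz in 2004, still about 3.9 GHz in 2013), used only to illustrate the shape of the curve:

```python
# Comparing clock-frequency growth rates before and after ~2004.
# Frequencies are approximate, for illustration only.

def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1.0 / years) - 1.0

growth_1994_2004 = cagr(100, 3800, 10)   # ~100 MHz Pentium -> 3.8 GHz Pentium 4
growth_2004_2013 = cagr(3800, 3900, 9)   # 3.8 GHz -> ~3.9 GHz, nine years later

print(f"{growth_1994_2004:.0%} per year, then {growth_2004_2013:.1%} per year")
```

Roughly 44% annual growth collapsing to well under 1% is what "the exponential has been broken" means for clock speed specifically; whether other metrics (transistor count, performance per dollar) kept an exponential going is the open question in this thread.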
 
Old 12-19-2013, 02:55 PM
 
Location: Pueblo - Colorado's Second City
12,262 posts, read 24,461,491 times
Reputation: 4395
Quote:
Originally Posted by ncole1 View Post
No, my argument is they already have stopped. Even if they will at some point in the future restart, the continuous exponential has been broken.
Your statement has already been proven wrong. Just look at the latest paradigm shift.

This is from the Pueblo Chieftain:


SAN FRANCISCO — The digital domain is creeping off our desktops and onto our bodies, from music players that match your tunes to your heart beat, to mood sweaters that change color depending on your emotional state. “Everyone agrees the race is just beginning, and I think we’re going to see some very, very big leaps in just the next year,” said tech entrepreneur Manish Chandra at a wearable technology conference and fashion show in San Francisco Monday.

Wearable technologies have long been a sideshow to mainstream laptop and smartphones, but this year Google’s glasses and rumors of Apple’s iWatch are popularizing the field.

Analysts forecast swift growth.

See more at: The Pueblo Chieftain

So computers are not only continuing to advance exponentially, they are actually advancing faster than ever before. That is why in the 2020s the paradigm shift will be from wearable computers to computers inside our bodies, and that is when we will merge with computers and reach the singularity by 2030.

Now, as I have posted, Moore's law will come to an end in the 2020s. I have read sometime between 2022 and 2028, when feature sizes are around 5 nm. However, that is not new, as we are already on the fifth paradigm since the first modern computer was built in 1890, so all we will do is move on to the next paradigm, 3D self-organizing molecular structures, and computers will continue to advance exponentially.

I think one of the reasons many people are so shocked at the end of the integrated circuit is that it has been around so long. It was first developed in the 1960s, well before many people were born, so that is all they know. Even for me at 40, the integrated circuit has been around longer than I have been alive. If you look at the first four paradigms, they did not last as long, so I don't think people were surprised when they moved to a new one; plus there was no internet back then, so people were not able to talk about it or research it as much as we can today. So while this might seem like it's going to be a big deal, it really isn't, just like when we went from vacuum tubes to transistors, or transistors to the integrated circuit, and computers will keep doing what they have been doing since 1890: advancing exponentially.
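Josseppie's exponential can be made concrete with an idealized Moore's-law curve starting from the Intel 4004 (about 2,300 transistors in 1971) and assuming the classic doubling every two years. The doubling period is a rule of thumb, not a law of nature; whether it holds through the 2020s is exactly what this thread disputes.

```python
# An idealized Moore's-law projection. The two-year doubling period is
# the textbook rule of thumb and is assumed here, not guaranteed.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Projected transistor count under an idealized Moore's-law curve."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2021):
    print(year, f"{transistors(year):,.0f}")
```

The 2011 figure this produces (about 2.4 billion) is in the right ballpark for high-end chips of that era, which is why the trend felt unstoppable; the thread's disagreement is about whether clock speed and heat, rather than transistor count, are the metrics that matter.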

Last edited by Josseppie; 12-19-2013 at 03:17 PM..
© 2005-2024, Advameg, Inc.