Old 11-18-2011, 12:28 PM
 
24,488 posts, read 41,025,885 times
Reputation: 12919


Quote:
Originally Posted by MediocreButArrogant View Post
Fixed that for you.
If you're trying to be funny, that's fine... but you should leave some sort of clue.

Otherwise someone might take you seriously.

 
Old 11-18-2011, 12:42 PM
 
Location: Silicon Valley
3,683 posts, read 9,833,113 times
Reputation: 3015
Quote:
Originally Posted by NJBest View Post
If you're trying to be funny, that's fine... but you should leave some sort of clue.

Otherwise someone might take you seriously.
It was a joke, but there is an element of truth there too.

Quad-core processors are a necessity for most home users:

One core dedicated to running the virus scanner.
One core dedicated to installing Windows updates.
One core dedicated to running the Aero interface.
One core to do actual useful work.

That was a joke too, but again, there is an element of truth.
 
Old 11-18-2011, 11:57 PM
 
Location: Mableton, GA USA (NW Atlanta suburb, 4 miles OTP)
11,334 posts, read 26,006,593 times
Reputation: 3990
Quote:
Originally Posted by NJBest View Post
If we come to a point where speed doesn't matter to consumers anymore, that just means that we're slacking in software innovation.
Only if "software innovation" to you means "requires a lot more CPU", which these days generally means "using more bloated libraries and frameworks to generate the same core functionality we've had for two decades but in a prettier package".

Video and network/bandwidth seem to be the primary bottlenecks these days.
 
Old 11-19-2011, 01:04 AM
 
24,488 posts, read 41,025,885 times
Reputation: 12919
Quote:
Originally Posted by rcsteiner View Post
Only if "software innovation" to you means "requires a lot more CPU", which these days generally means "using more bloated libraries and frameworks to generate the same core functionality we've had for two decades but in a prettier package".

Video and network/bandwidth seem to be the primary bottlenecks these days.
Nope. Nice try though.

Bandwidth surely is a bottleneck, as is storage (HDD/SSD)... but so is the CPU. Most people don't consider the CPU a bottleneck because software typically doesn't make it to market until the processing power for it is available, which is different from network applications. There are many areas of software that, with the help of faster CPUs, could have a great impact on how the average consumer uses computers today. For example: contextual analysis, perception, and inference.
 
Old 11-19-2011, 01:12 AM
 
Location: Mableton, GA USA (NW Atlanta suburb, 4 miles OTP)
11,334 posts, read 26,006,593 times
Reputation: 3990
Quote:
Originally Posted by NJBest View Post
Nope. Nice try though.

Bandwidth surely is a bottleneck, as is storage (HDD/SSD)... but so is the CPU. Most people don't consider the CPU a bottleneck because software typically doesn't make it to market until the processing power for it is available, which is different from network applications. There are many areas of software that, with the help of faster CPUs, could have a great impact on how the average consumer uses computers today. For example: contextual analysis, perception, and inference.
Can you give me a real world example of an application that would be used by a typical consumer?
 
Old 11-19-2011, 01:38 AM
 
24,488 posts, read 41,025,885 times
Reputation: 12919
Quote:
Originally Posted by rcsteiner View Post
Can you give me a real world example of an application that would be used by a typical consumer?
Currently, you can't go to your computer and say in natural language, "Create a presentation that shows the link between smoking and cancer." There are two issues here. First, computers cannot understand natural language. For example, "call a taxi for me" and "fetch me a cab" mean the same thing in natural language, but computers (at the consumer level) cannot see them as the same thing.
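
To illustrate, here's a toy Python sketch (not any real product's engine; the words(), literal_match(), and CANON names below are invented for the example). Literal keyword matching treats the two requests as unrelated, and only hand-fed synonym knowledge links them. Scaling that knowledge past toy tables is exactly where the compute goes.

Code:
# Toy illustration: literal keyword matching treats paraphrases as
# unrelated strings; a hand-built synonym table links them, but such
# tables don't scale -- real understanding needs far more compute.

def words(phrase):
    return set(phrase.lower().split())

def literal_match(a, b):
    # Naive consumer-level approach: two requests are "the same"
    # only if most of their words overlap.
    return len(words(a) & words(b)) > len(words(a)) / 2

# Hand-built normalization: map words onto canonical concepts,
# dropping filler words entirely.
CANON = {"taxi": "cab", "fetch": "call", "a": "", "for": "", "me": ""}

def concepts(phrase):
    return {CANON.get(w, w) for w in words(phrase)} - {""}

a, b = "call a taxi for me", "fetch me a cab"
print(literal_match(a, b))         # False -- no meaningful word overlap
print(concepts(a) == concepts(b))  # True  -- both reduce to {'call', 'cab'}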

The other issue is that for a computer to effectively gather and organize information about a relationship between two entities, it needs to understand context. Watson has shown that we're getting very close, but Watson is a supercomputer full of many, many CPUs.

We wrap ourselves around the ways computers input and output data. In the future, we'll have computers that understand us in our natural environment using our natural methods of communication. This includes speech, gestures, and even facial expressions.

Technologies such as Siri, the Wii, and Kinect are just the beginning. As software improves and is able to harness all that CPU power effectively, it will be used for more than entertainment.


Another good example is search. I can't effectively search through my pictures. I can't ask my computer to find all the pictures that have my dog in them. No one would argue that this is impossible... but the software and CPU power to do so are not available at the consumer level.

Last edited by NJBest; 11-19-2011 at 01:59 AM..
 
Old 11-19-2011, 02:38 AM
 
Location: Mableton, GA USA (NW Atlanta suburb, 4 miles OTP)
11,334 posts, read 26,006,593 times
Reputation: 3990
Quote:
Originally Posted by NJBest View Post
Currently, you can't go to your computer and say in natural language, "Create a presentation that shows the link between smoking and cancer." There are two issues here. First, computers cannot understand natural language. For example, "call a taxi for me" and "fetch me a cab" mean the same thing in natural language, but computers (at the consumer level) cannot see them as the same thing.
Hmmm. Let's go back in history.

OS/2 Warp 4 was a PC desktop OS released by IBM with trainable voice dictation and voice navigation in the standard package ... in 1996. Some early boxes had a headset microphone in the box. I had two.

That was 15 years ago. It worked rather well, from what I remember, well enough for people who were patient enough to go through the training process to use voice for most of their common desktop operations, to write documents, etc. I played with it some, but I'm faster typing with my hands than I am speaking, so I didn't really find much use for it. I did know two people who completely switched to using it, though.

At that point in time, the high-end boxes were dual-CPU 200 MHz Pentium Pro machines (single-core 686, Socket 8) with perhaps 128MB or 256MB of RAM. IBM's VoiceType Dictation required training, but it was relatively sophisticated, and it would work on a machine with considerably less power than the above.

IBM VoiceType Dictation for Windows 95 Version 3.0

We already have over 10 times the CPU power in each core on existing machines. 200 MHz is only 0.2 GHz, and the Windows 95 version of IBM's product claimed it would run on a 90 MHz Pentium. That's 0.09 GHz on a 586 chip, which is a far cry from a 686 running 32-bit code, so the effective requirement was maybe more like 0.07 GHz.
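
The rough arithmetic, for what it's worth (clock speed only; per-clock efficiency gains since then widen the gap further, and the 3 GHz figure is just an assumed typical modern core):

Code:
# Clock-only comparison; ignores per-clock efficiency gains, which
# make the real gap even larger. The modern figure is an assumption.
modern_core_ghz   = 3.0   # assumed typical desktop core today
warp4_box_ghz     = 0.2   # 200 MHz Pentium Pro
voicetype_min_ghz = 0.09  # 90 MHz Pentium minimum for the Win95 version

print(modern_core_ghz / warp4_box_ghz)      # 15.0 -- 15x per core
print(modern_core_ghz / voicetype_min_ghz)  # ~33x per core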

You really think natural language processing will require that much more than the dictation software back then? Voice recognition software is not my speciality, but I find that hard to believe. What you describe above appears to be the result of a very sophisticated action recognition engine, but the voice processing part of it seems trivial to me, and the database that would associate smoking and cancer and create a presentation would be the main part of the work.

Such associations are prime targets for preprocessing. Generating relational tables is one of the main ways to speed up any processing like that. If you want to do it on the fly all the time, I would probably question your seriousness about solving the problem.
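
A rough sketch of the precomputation idea in Python (the three "documents" are made-up stand-ins for a real corpus): the expensive pass over the data happens once, offline, and the query becomes a cheap table lookup.

Code:
# Precomputed associations: scan the corpus once (the expensive,
# offline part), store co-occurrence counts in a relational table,
# and answer "how strongly are X and Y associated?" with a lookup.
import sqlite3
from collections import Counter
from itertools import combinations

docs = [
    "smoking is a leading cause of lung cancer",
    "cancer risk rises with years of smoking",
    "exercise lowers the risk of heart disease",
]

counts = Counter()
for doc in docs:  # one-time offline pass
    for pair in combinations(sorted(set(doc.split())), 2):
        counts[pair] += 1

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE assoc (a TEXT, b TEXT, n INTEGER)")
db.executemany("INSERT INTO assoc VALUES (?, ?, ?)",
               [(a, b, n) for (a, b), n in counts.items()])

# Query time: the association already exists on disk; no rescan needed.
row = db.execute("SELECT n FROM assoc WHERE a=? AND b=?",
                 ("cancer", "smoking")).fetchone()
print(row)  # (2,) -- "cancer" and "smoking" co-occur in two documents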

Quote:
The other issue is that for a computer to effectively gather and organize information about a relationship between two entities, it needs to understand context. Watson has shown that we're getting very close, but Watson is a supercomputer full of many, many CPUs.
The basic understanding of complex relationships is a very different problem from the ability to generate such relationships on the fly in real time.

I can see that taking a lot of CPU power, but I question how relevant it is to consumer software products in the near future.
Quote:
We wrap ourselves around the ways computers input and output data. In the future, we'll have computers that understand us in our natural environment using our natural methods of communication. This includes speech, gestures, and even facial expressions.

Technologies such as Siri, the Wii, and Kinect are just the beginning. As software improves and is able to harness all that CPU power effectively, it will be used for more than entertainment.
Mainframes have had separate dedicated I/O processors for decades for a good reason ... there's no need to use core CPU power if you can embed a dedicated CPU in your peripheral devices.

You're talking about expansions to the base hardware platform, but those may or may not require more processing power at the core.

Quote:
Another good example is search. I can't effectively search through my pictures. I can't ask my computer to search for all the pictures that have my dog in it. No one would argue that this is impossible... but the software and cpu power to do so is not available at the consumer level.
I would argue that the main bottleneck is media speed in the above instance, since you have to filter through a rather large amount of mass storage during that process. Obviously, building an index dynamically at the point each new file is added is one of the better ways to handle it. But how does one recognize images? That's an issue that requires smart algorithms, yes, but who knows how much CPU would be needed. If well implemented, I suspect not a lot...

I dunno...
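
Something like this index-at-add-time shape, maybe (detect_labels() is a hypothetical stub standing in for whatever recognition engine you'd actually use):

Code:
# Sketch of index-at-add-time: pay the recognition cost once per
# photo as it is imported; searching then only touches the index.
# detect_labels() is a hypothetical stub for a real image recognizer.
import json

def detect_labels(path):
    # Stand-in for the CPU-heavy recognition step, run once per file.
    return ["dog"] if "dog" in path else []

index = {}

def add_photo(path):
    index[path] = detect_labels(path)  # expensive, but only on import

def search(label):
    return [p for p, labels in index.items() if label in labels]

for photo in ["IMG_001_dog.jpg", "IMG_002_beach.jpg", "IMG_003_dog.jpg"]:
    add_photo(photo)

with open("photo_index.json", "w") as f:  # persist so the cost amortizes
    json.dump(index, f)

print(search("dog"))  # ['IMG_001_dog.jpg', 'IMG_003_dog.jpg']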

Last edited by rcsteiner; 11-19-2011 at 02:48 AM..
 
Old 11-19-2011, 03:11 AM
 
24,488 posts, read 41,025,885 times
Reputation: 12919
Quote:
Originally Posted by rcsteiner View Post
Hmmm. Let's go back in history.

OS/2 Warp 4 was a PC desktop OS released by IBM with trainable voice dictation and voice navigation in the standard package ... in 1996. Some early boxes had a headset microphone in the box. I had two.

That was 15 years ago. It worked rather well, from what I remember, well enough for people who were patient enough to go through the training process to use voice for most of their common desktop operations, to write documents, etc. I played with it some, but I'm faster typing with my hands than I am speaking, so I didn't really find much use for it. I did know two people who completely switched to using it, though.

At that point in time, the high-end boxes were dual-CPU 200 MHz Pentium Pro machines (single-core 686, Socket 8) with perhaps 128MB or 256MB of RAM. IBM's VoiceType Dictation required training, but it was relatively sophisticated, and it would work on a machine with considerably less power than the above.

IBM VoiceType Dictation for Windows 95 Version 3.0

We already have over 10 times the CPU power in each core on existing machines. 200 MHz is only 0.2 GHz, and the Windows 95 version of IBM's product claimed it would run on a 90 MHz Pentium. That's 0.09 GHz on a 586 chip, which is a far cry from a 686 running 32-bit code, so the effective requirement was maybe more like 0.07 GHz.

You really think natural language processing will require that much more than the dictation software back then? Voice recognition software is not my speciality, but I find that hard to believe. What you describe above appears to be the result of a very sophisticated action recognition engine, but the voice processing part of it seems trivial to me, and the database that would associate smoking and cancer and create a presentation would be the main part of the work.



Such associations are prime targets for preprocessing. Generating relational tables is one of the main ways to speed up any processing like that. If you want to do it on the fly all the time, I would probably question your seriousness about solving the problem.



The basic understanding of complex relationships is a very different problem from the ability to generate such relationships on the fly in real time.

I can see that taking a lot of CPU power, but I question how relevant it is to consumer software products in the near future.
Yes... it's significantly more complex than voice recognition. Modern databases use relations that are defined by humans after normalization. So the association between every two potential entities would not pre-exist in a database... at least not in any type of database available to us today.

But that's not the issue. Finding the data is not a problem; even relating it is not an issue. Determining what is relevant, and how to organize and display it, is. You seem to be confusing the ability to store and retrieve data with a computer's ability to understand it in context.
Quote:
Originally Posted by rcsteiner View Post

Mainframes have had separate dedicated I/O processors for decades for a good reason ... there's no need to use core CPU power if you can embed a dedicated CPU in your peripheral devices.

You're talking about expansions to the base hardware platform, but those may or may not require more processing power at the core.
The peripheral would be the microphone or sensors. Making sense of the input would have to be done on the CPU for obvious reasons. I don't even know how you could argue that the peripheral would be responsible for making sense of the input.
Quote:
Originally Posted by rcsteiner View Post

I would argue that the main bottleneck is media speed in the above instance, since you have to filter through a rather large amount of mass storage during that process. Obviously, building an index dynamically at the point each new file is added is one of the better ways to handle it. But how does one recognize images? That's an issue that requires smart algorithms, yes, but who knows how much CPU would be needed. If well implemented, I suspect not a lot...

I dunno...
Quite a bit of CPU power, unless you want to wait for the index to build every time you unload your SD card.
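
A quick back-of-envelope (both numbers are assumptions for illustration, not measurements of any particular recognizer):

Code:
# Back-of-envelope for the SD card scenario. The per-photo cost is an
# assumed figure, not a measurement; a slow CPU makes it much worse.
photos_per_card   = 500   # one trip's worth of shots
seconds_per_photo = 2.0   # assumed recognition cost per image

minutes = photos_per_card * seconds_per_photo / 60
print(f"~{minutes:.0f} minutes per import")  # ~17 minutes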
 
Old 11-19-2011, 10:56 PM
 
Location: Maryland's 6th District.
8,358 posts, read 25,175,237 times
Reputation: 6540
Quote:
Originally Posted by NJBest View Post
There are many areas of software that, with the help of faster CPUs, could have a great impact on how the average consumer uses computers today. For example: contextual analysis, perception, and inference.
CPU clock speed, sure. But more cores? What is the advantage of having 6, 8, or 12+ cores when the average consumer barely uses two? For a bit of clarification, by "consumer" I mean the Average Joe. I understand that those in the audio/video/design/scientific/etc. communities, who more often than not benefit from multi-core processors, are consumers too. I'm not talking about them.
 
Old 11-19-2011, 11:03 PM
 
Location: Covington County, Alabama
259,023 posts, read 90,315,976 times
Reputation: 138557
I already have more than I need with this quad core AMD. I'd rather spend the money on good steaks.