Go Back   City-Data Forum > General Forums > Science and Technology > Computers
Old 03-17-2017, 07:32 PM
 
2,924 posts, read 1,587,826 times
Reputation: 2498

In the past, on 16-bit systems, developers made sure they were using memory wisely, because using too much could literally run the system out of memory. Now that we've moved on to 64-bit systems, it seems like they're being downright sloppy, resulting in terrible memory leaks.

(Though maybe it's just, in my case, Flash + Internet = BAD)


One of the worst offenders seems to be browsers. Chrome, Firefox (memory leaks, plus the Brendan Eich incident, were what drove me away from it, and it had been my favorite browser too), etc. are terrible about this, particularly on social media sites like Facebook or Twitter, or on news sites.


Also, at least on Windows 10, Cortana and various system processes seem to hog too much memory as well.

 
Old 03-21-2017, 01:39 PM
 
2,014 posts, read 1,649,540 times
Reputation: 2826
all programmers are bad playwrights and all computers are lousy actors.
 
Old 03-21-2017, 01:51 PM
 
Location: SC
8,793 posts, read 8,164,508 times
Reputation: 12992
There is little point in spending gobs of hours trying to manage every last bit of memory. It is a waste of time on a system that addresses 8+ gigabytes of memory and employs swapping.

If you are doing that, you are concentrating on the wrong areas for enhancement. Your time would be much better spent tuning data retrievals.

When I started, computers had about 20K of memory. Memory management was very important then.
 
Old 03-22-2017, 09:09 AM
 
Location: Birmingham
11,787 posts, read 17,771,707 times
Reputation: 10120
Yes, because memory is cheap and plentiful.
 
Old 03-22-2017, 11:26 AM
 
2,669 posts, read 2,092,040 times
Reputation: 3690
I think the better question is whether the frameworks and runtime environments are efficient at garbage collection. As far as I understand, it is now rare for developers to explicitly free unused memory. Only a few languages support that, in any case...
 
Old 03-22-2017, 11:39 AM
 
Location: Tricity, PL
61,722 posts, read 87,123,005 times
Reputation: 131695
The biggest issue with storage for me is the constant forced updates and service packs. I wish there were an easy way to clean out old updates without reinstalling the whole system.

I have a Surface 3 with 64 GB of storage, and I didn't download anything onto it, nor store any files/games/etc. I actually deleted as much of the preinstalled crap as I could, but the last time I checked, the constant updates had already eaten more than 65% of the storage. Pretty soon the tablet will start to have performance issues because of that.

Last edited by elnina; 03-22-2017 at 11:48 AM..
 
Old 03-22-2017, 11:46 AM
 
Location: Florida -
10,213 posts, read 14,834,115 times
Reputation: 21848
When I started in computers, 128K or 256K of RAM was a big deal (IBM 5100/5110). Programmers had to be efficient and creative. These days, RAM is so readily available at such a low cost that it's easier and less expensive to add more RAM than to spend time trying to use what one has efficiently.

As a sidelaugh, we used to sell a filing-cabinet-sized case that held TWO 8-1/2-inch floppy disks ... each holding an amazing 1.2 GB! The idea was that being able to swap floppy disks in and out gave one virtually unlimited storage. These initial "small business/personal" computers sold for $20K-$25K each. Times have certainly changed.
 
Old 03-22-2017, 11:59 AM
 
8,418 posts, read 7,414,580 times
Reputation: 8767
Memory hogging and memory leaking are two different things.

On the one hand, the easy and cheap availability of RAM on most personal computer systems means that most software developers can't create true memory hogs anymore unless they are deliberately trying to make one.

On the other hand, memory leaks can slip into software when the code allocates memory inside a loop or a recursion and doesn't deallocate it before the next pass of the loop or the next recursive call.
 
Old 03-23-2017, 02:49 PM
 
17,587 posts, read 15,259,939 times
Reputation: 22915
Quote:
Originally Posted by jghorton View Post
When I started in computers, 128K or 256K of RAM was a big deal (IBM 5100/5110). Programmers had to be efficient and creative. These days, RAM is so readily available at such a low cost that it's easier and less expensive to add more RAM than to spend time trying to use what one has efficiently.

As a sidelaugh, we used to sell a filing-cabinet-sized case that held TWO 8-1/2-inch floppy disks ... each holding an amazing 1.2 GB! The idea was that being able to swap floppy disks in and out gave one virtually unlimited storage. These initial "small business/personal" computers sold for $20K-$25K each. Times have certainly changed.
I think you mean 1.2 MEGS

I remember the 8 1/2 drives, but never actually used one. They were a museum piece by the time I got in.. Though I do recall the 5 1/4 and 3 1/2 quite well..

The advance through "K", "Megs", "Gigs" and now "Tera" throws me at times.. We use some SSD drives for our integration that are 8G.. Raised from the 128M CFs that we used to use.. And occasionally, I still make the "Have him put an 8 Meg drive in it" mistake.
 
Old 03-23-2017, 08:35 PM
 
Location: Florida -
10,213 posts, read 14,834,115 times
Reputation: 21848
Quote:
Originally Posted by Labonte18 View Post
I think you mean 1.2 MEGS

I remember the 8 1/2 drives, but never actually used one. They were a museum piece by the time I got in.. Though I do recall the 5 1/4 and 3 1/2 quite well..

The advance through "K", "Megs", "Gigs" and now "Tera" throws me at times.. We use some SSD drives for our integration that are 8G.. Raised from the 128M CFs that we used to use.. And occasionally, I still make the "Have him put an 8 Meg drive in it" mistake.
You are correct! One gets so used to thinking in terms of Gigs and even Terabytes today that it's hard to remember that the early systems (mid-to-late '70s) measured storage in K's and Megs.

© 2005-2024, Advameg, Inc. · Please obey Forum Rules · Terms of Use and Privacy Policy · Bug Bounty
