In the past, on 16-bit systems, developers made sure they were using memory wisely, because using too much could literally run the system out of memory. Now that we've moved on to 64-bit (and perhaps larger) address spaces, it seems like they're being downright sloppy, resulting in terrible memory leaks.
(Though maybe it's just, in my case, Flash + Internet = BAD)
One of the worst offenders seems to be browsers. Chrome, Firefox, etc. are terrible about this, particularly when running social media sites like Facebook or Twitter, or news sites. (The memory leaks, plus the Brendan Eich incident, were what drove me away from Firefox, and it had been my favorite browser too.)
Also, at least on Windows 10, Cortana and other system processes seem to hog too much memory.
There is little point in spending gobs of hours trying to manage every last byte. It is a waste of time on a system that addresses 8+ gigabytes of memory and employs swapping.
If you are doing that, you are concentrating on the wrong areas for improvement. I think your time would be much better spent tuning data retrievals.
When I started, computers had about 20K of memory. Memory management was very important then.
I think the better question is whether the frameworks and runtime environments are efficient at garbage collection. As far as I understand, it is now rare for developers to explicitly free up unused memory. Only a few languages even support doing so, in any case...
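To illustrate the point above, here is a minimal sketch in C (my choice of language; the thread names none) of the explicit memory management that garbage-collected runtimes spare developers from. The `duplicate` helper is hypothetical, but the pattern is the classic one: every allocation has an owner who must eventually call `free`.

```c
#include <stdlib.h>
#include <string.h>

/* Manual memory management: the caller receives ownership of the
   returned buffer and is responsible for freeing it. In a
   garbage-collected language, the runtime would reclaim it
   automatically once it became unreachable. */
char *duplicate(const char *s) {
    size_t n = strlen(s) + 1;
    char *copy = malloc(n);     /* heap allocation the caller must free */
    if (copy != NULL)
        memcpy(copy, s, n);
    return copy;
}
```

A caller would write `char *p = duplicate("hello"); /* ... */ free(p);` and forgetting that final `free` is exactly the kind of leak a garbage collector exists to prevent.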
The biggest issue with memory for me is the constant forced updates and the service packs.
I wish there were an easy way to clean out or delete old updates without reinstalling the whole system.
I have a Surface 3 with 64 GB, and I haven't downloaded anything to it, nor stored any files/games/etc. I actually deleted as much of the preinstalled crap as I could, but the last time I checked, the constant updates had already eaten more than 65% of the storage. Pretty soon the tablet will start having performance issues because of that.
When I started in computers, 128K or 256K of RAM was a big deal (IBM 5100/5110). Programmers had to be efficient and creative. These days, RAM is so readily available at such a low cost, it's easier and less expensive to add more RAM-- than to spend time trying to efficiently use what one has.
As a sidelaugh, we used to sell a filing-cabinet-size case that held TWO 8-1/2-inch floppy disks ... each holding an amazing 1.2 GB! -- The idea was that being able to switch floppy disks in and out gave one virtually unlimited storage. These initial "small business/personal" computers sold for $20K-$25K each. -- Times have certainly changed.
Memory hogging and memory leaking are two different things.
On the one hand, the easy, cheap availability of RAM on most personal computers means that most software can't become a true memory hog anymore unless the developer is explicitly, nefariously trying to make it one.
On the other hand, memory leaks can slip into software when the code allocates memory within an iteration or a recursion and doesn't properly deallocate it before the next pass of the loop or the next recursive call.
I think you mean 1.2 MEGS
I remember the 8 1/2 drives, but never actually used one. They were a museum piece by the time I got in.. Though I do recall the 5 1/4 and 3 1/2 quite well..
The advance through "K", "Megs", "Gigs" and now "Tera" throws me at times.. We use some SSD drives for our integration that are 8G.. Raised from the 128M CFs that we used to use.. And occasionally, I still make the "Have him put an 8 Meg drive in it" mistake.
You are correct! -- One gets so used to thinking in terms of Gigs and even Terabytes today, it's hard to remember that the early systems (mid-to-late '70s) measured storage in K's and Megs.