
Mo’ Memory, Mo’ Problems

March 26, 2017 - General Computing

I’m writing this, right now, on a computer from 1999: a Rev C iMac G3 that I got off Craigslist for $20. This system has 64MB of memory and a 333MHz processor, and I’m using Microsoft Word 2001 running under Mac OS 9.2.2.

Considering the advances we’ve seen in tech, you would expect this system to be entirely unusable, and yet here I am using a relatively full-featured productivity application with seemingly the same responsiveness and capability as a more modern system.

This leads inexorably to a discussion of bloat. As computers grow faster, the software that runs on them expands to use the additional capability. I’ve had modern text editors render what I typed in slow motion, updating a character every second or so, on a 4GHz quad-core system with 32GB of memory that wasn’t otherwise being taxed. There is very little excuse for this, and yet it happens.

As computers have moved forward, we find that the extra capability is, oftentimes, absorbed by developers. That is, a faster processor means they can get the same speed by using C instead of assembly, or by writing the software in a higher-level language like Java or C# instead of C. Those trade-offs are entirely reasonable, as in a way they eventually reduce the cost of software to the consumer. Nowadays the question centres on web applications. We have many pieces of software written in JavaScript, for example, which put a heavy load on the engine under which they run. As an interpreted language, JavaScript reduces performance even further, but this is considered acceptable because faster systems are the norm and a typical system is capable of running it at speed.

But in many respects, we’ve been upgrading our hardware while running in place. While many aspects have certainly improved (entertainment and game software, for example, has higher resolutions, more colours, higher-resolution textures and more polygons than ever before), a lot of otherwise basic tasks have not greatly improved.

At the same time, one is often left wondering exactly what features we have gained in this inexorable forward march. As I type this, the system I’m writing on is not connected to the Internet; I’m not receiving any notifications or tips, and I’m not seeing advertisements for cloud storage or nag screens about installing the latest update to the system software. I can install applications and use them, and in many cases I can use them with much of the same effectiveness and even performance as corresponding modern software. With the exception, of course, of web browsers.

Web browsers are an interesting case in looking at how computing has changed. Your typical system from the late nineties would have had perhaps 64MB of RAM, like the iMac G3 I’m using right now. I can run Internet Explorer and open local web pages (I’m too lazy to move the system to connect it via Ethernet, since it naturally has no wireless capability of its own), and Internet Explorer consumes only 10MB of memory. Compared to my main desktop system, the proportions are similar; oftentimes I find Firefox or Chrome consuming upwards of 1GB of memory! It is easy to blame this on software bloat, to say that browsers have merely started using more memory because they can. Obviously a web browser using upwards of 1GB of memory couldn’t have existed at all in 1999, but it runs without issue on most modern systems, particularly now that 4GB is becoming a “bare minimum” for a usable system.

Blaming it all on bloat would be an oversimplification, though, as the task of browsing has ballooned since the days when it could be done in that much RAM. Browsers now need to support not only more complicated HTML structures and features such as stylesheets, but they are effectively becoming a platform of their own, with web applications running inside them. To put it in terms contemporary with the older Mac systems I’ve been fiddling with lately, a web application is rather like a HyperCard stack. As a result, saying it is entirely due to bloat would certainly be unfair.

Perhaps, then, we need a better benchmark for comparison. I’m writing this in Microsoft Word 2001 for Mac, so perhaps Microsoft Word is a better comparison. As I write this, Word 2001 is using 10MB of memory. Launching Microsoft Word 2013 and opening a blank document, I find that, to begin with, it is using 55MB of memory.

Now, compared to the total amount of memory on each system, Word 2013 is actually using a much smaller share: 10MB is about 16% of the total memory on this Mac, but 55MB is only about 0.3% of the 16GB of memory on the laptop I started Word 2013 on. In that sense, we could argue that applications’ memory usage has shrunk relative to the available hardware. In absolute terms, though, the story is different: a blank document uses over five times as much memory as it takes this older release, on an older computer, to maintain and display a multiple-page document.
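The relative-versus-absolute comparison above is simple arithmetic; here is a quick sketch in Python using the memory figures quoted in this post (the totals are rounded, so the percentages are approximate):

```python
# Memory figures quoted above (approximate).
word_2001_mb = 10            # Word 2001, multi-page document, iMac G3
imac_total_mb = 64           # iMac G3 total RAM

word_2013_mb = 55            # Word 2013, blank document
laptop_total_mb = 16 * 1024  # 16GB laptop, expressed in MB

# Relative usage: each process's share of total physical memory.
old_share = word_2001_mb / imac_total_mb * 100    # ~15.6%
new_share = word_2013_mb / laptop_total_mb * 100  # ~0.34%

# Absolute usage: how many times more memory the newer release uses.
growth = word_2013_mb / word_2001_mb              # 5.5x

print(f"1999: {old_share:.1f}% of RAM, 2013: {new_share:.2f}% of RAM")
print(f"absolute growth: {growth:.1f}x")
```

By the relative measure, Word’s footprint shrank by a factor of roughly forty; by the absolute one, it grew more than fivefold. Which measure matters depends on whether you are the one buying the RAM.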

There are a number of reasons for this sort of difference. For one thing, excessive memory usage by certain components might not come up in testing on more recent machines; as long as the software runs, excess memory usage might go undetected, and even if 55MB is higher in absolute terms, the smaller share of total physical memory on almost any modern system means it won’t be considered an issue. Another reason is that additional capability tends to bring additional effects: features like Aero Glass, or the rendering of the modern Office Ribbon, for example. Also to be considered are modern features like font smoothing, which were less prevalent and less advanced in 1999.

Nonetheless, it is still somewhat humorous that a basic word processor has managed to use that much more memory for what is effectively the same task! The actual word-processing capabilities are largely equivalent between the two releases of the software, which is not something we can say of browsers.

Perhaps it is not too much of a problem. In many respects, it seems that application needs eventually dictate what people consider a “bare minimum” of RAM, even as many productivity tasks remain largely the same, with similar feature sets and capabilities, while those requirements rise. Early versions of Microsoft Word or Excel, for example, contain the bulk of the features that people actually use in the latest release, while using a relatively infinitesimal amount of system memory to do so. This leads to what I find cringeworthy proclamations such as “How can you possibly do anything with only 2GB of memory?”, which make sense in a certain context but are pretty silly when applied broadly; we managed to do many of the same things with desktop computers 20 or 30 years ago, with far less memory and processing power, after all. One could easily imagine bringing somebody forward in time from, say, 1994 to hear such a statement, and having them stand in awe that such an unimaginably large amount of memory (an amount unheard of to them even for hard disk sizes) was being dismissed as far too little RAM for many of the same sorts of tasks they had performed without issue.

Personally, I’ve found popping onto an old computer like this iMac G3 to be a helpful experience. These are computers that, many years ago, were top of the line, the sort of systems some people would drool over. Now they are relegated to $20 Craigslist ads, which are the only thing between them and the dump. Meanwhile, their operating systems are not only responsive but designed in such a way that they are quite easy to use and even, dare I say it, fun! Mac OS 9.2.2 has little audio flourishes, the clicks and pops and swoosh sounds associated with UI interactions, that actually had me missing it when using my more recent systems. Which is not to suggest that I think it wouldn’t become annoying fairly quickly with regular use.

Unfortunately, or I suppose in some ways fortunately, these systems are relics of a bygone era. Computers are commonplace enough that we carry them in our pockets. We have become so accustomed to the devices that they are now a part of daily life, as are the networked services, perhaps even more so. People are constantly updating their Facebook feeds, checking other people’s posts, reading Twitter or Instagram, sending text messages, arguing about politics with some brother of a friend of a friend of a friend on Facebook whom they’ve never met, and so on. We are living in what people in the ’90s could effectively have described as a “Fantasy World”, and yet here we are living it every day, to the point where it is not only mundane, but where things considered unimaginable conveniences only a decade ago are now unacceptable.

This is not intended to suggest we should not strive for, or even demand, progress; just that maybe we should lay off the hyperbole in describing how a lack of such progress is detrimental to our lives. A web portal missing a minor convenience feature is not something to throw a fuss over; software being released at a price point you disagree with is not a reason to go on the warpath against its developer; and just because you have a Tumblr blog with 5 readers doesn’t make you a social media influencer that developers, or anybody for that matter, should cater to in any way.

There is an argument that something ineffable has been lost with the rise and ubiquity of the Internet. While nowadays “research” means loading up Google in a new tab and running a few searches, it used to consist of going to the library and looking through index cards and reference material. Where dealing with, say, a programming conundrum or an unfamiliar program feature once meant looking it up in the hard-copy manual, or in the former case actually working out your own solution based on what you needed, now you just Google it and go directly to the answer on sites like Stack Overflow: you copy-paste the code and use it, or you mindlessly follow the steps outlined to use the program feature you want. Neither way is strictly better than the other, of course; it’s just that the Internet really is the ultimate enabler.

I have about a half dozen books that I’ve barely cracked open which, if I had owned them a decade ago, I would have read cover to cover several times over by now. I’ve had project ideas squashed before I even started them by a quick Google search revealing that programs already existed which performed the function better than I had even imagined. Before, I would have pursued such projects anyway, not knowing anything had already been done, and ended up learning something as a result.

As much as the ubiquity of the Internet has helped us, it has also acted as the ever-present enabler of our addictions. It feeds our addiction to information, our addiction to instant gratification, and our ever-present curiosity, but it does so with what could almost be described as empty calories. It’s like eating hard candies when you are hungry: it leaves many unsatisfied and seeking more, wondering what is missing.

It was the hunt for the information, the trail you blazed while doing research or cross-referencing a Dewey Decimal index. It was the excitement of finding a nice thick book on a subject you were interested in, knowing it would keep your information “mouth” fed for weeks to come as you read and reread it, absorbing all the information it had to offer, even the bits you couldn’t use. I read “Applied Structured BASIC” cover to cover multiple times despite it covering ancient BASIC dialects that had long since stopped mattering.

Now, I find there is a place for the phrase “information overload”. No bookshelf, no matter how full, can compete with the World Wide Web in ease of access to information, in the accuracy (and inaccuracy) of that information, or in its sheer amount, to the point where one could almost argue there is too much. Perhaps the skill in using the Internet for information is having the proper “tunnel vision” to get the information you want and get out when you are looking for something specific. The alternative, of course, is to go looking up how to create a Jump List in Windows 7 and later, and suddenly find yourself reading about how strawberries are harvested.
