
Computers And Schools

March 14, 2011 - General Computing

Needless to say, computers and the internet have become nearly ubiquitous today, and it's no surprise that they have found a spot in our schools. However, the way I see them being taught in school feels somewhat backwards to me.

The common concept is that “children should be exposed to computers at an early age”.

My response is "why?" I mean, it seems reasonable at first glance- there is no doubt computers will be central to both communication and data management of all forms- but people seem to think that only children can learn anything. You don't need an eidetic memory to learn how to use a computer. I didn't even own a computer until I was 16, and that was in 2002 (or something), and it was a 20-year-old computer at the time (a 286).

What did I use it for? I learned basic batch programming. Why? Because that was pretty much all I was capable of doing with it.
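
To give you an idea of what "basic batch programming" means, think of little DOS menu scripts along these lines (a rough sketch of the idea, not an actual file from back then):

    @ECHO OFF
    REM A tiny DOS menu - roughly the extent of what a 286 invited you to do
    ECHO 1 - List the files in this directory
    ECHO 2 - Show how much memory is free
    CHOICE /C:12 Pick one
    IF ERRORLEVEL 2 GOTO SHOWMEM
    DIR
    GOTO DONE
    :SHOWMEM
    MEM
    :DONE

Not much, but it was enough to make the machine do something you told it to do, which is rather the point.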

That, I feel, is the failure of almost all computer education courses: they teach what is current. They teach Microsoft Word. They teach you keyboard shortcuts and how to copy and paste.

None of them ever go into the history of these features, or of the computer as a whole. Computers are treated like appliances- much the way nobody really cares about the history of the toaster.

"But computers are appliances," you might say- and that is true, to some extent; at the very least, you wouldn't be the only one to feel that way.

But you don't find university courses teaching washing machine science, or dryer science, or toaster science, which raises the question of why the computer is different. Clearly it's different from any other appliance- so obviously it will have to be treated differently.

This sort of argues against my previous point (that computers shouldn't be taught in schools)- which it does- but what I really mean is that computers shouldn't be taught in schools as if they were appliances. When we learn math, we start with the basics. When we learn science (outside Texas- I believe it's banned there), we learn about things like the scientific method, and sometimes it even gets mixed in with some history.

Yet now, when children first learn about the computer, they learn how to format text in Word. They are learning the utility, not the means behind that utility.

It feels like the entire point is to teach muscle memory and memorization of the functions of the appliance that is the computer, rather than to teach analytical thinking or critical analysis of problems. For example, I've heard people say that things are "impossible" merely because they aren't implemented in their program of choice. This has two flaws. One is that sometimes the feature is implemented in their program of choice; it just happens not to have been covered in the rudimentary "here is the font combo box, here is the size combo box, good luck" course they took. The other is that sometimes the functionality really isn't in the program- but that is still only one program. For every single problem, there are countless applications.

This actually brings me to another point: it isn't the computer itself that is the appliance- it is the applications the computer runs that are the appliances. In fact, the words themselves sort of give away the connection. What does this mean, exactly? Well, the way it is being taught and learned by many is as if the computer were the appliance. This isn't bad per se, but I feel it's not quite the correct way to go about it for many people. The software is the appliance; people should learn how the software works, but they shouldn't need to know the details of it unless they want to. The very basics of the computer itself, though- the machine that is running these appliances- are something that should be required knowledge for anybody who owns a computer. I don't say this out of selfish elitism, but because confusion over a few terms that could easily be dispelled early on can cause people a myriad of issues later on down the road.

There are some people pushing for improvements in software so that people can "use the computer like an appliance; it should be intuitive, like a toaster or a microwave", and one can see the logic in their reasoning. However, when we realize that the appliance is not the computer but the software itself, that changes the deal. The original "request" was basically to make it so people didn't have to learn about software applications or hardware or memory or any of that stuff; but when the appliance is the application, there is no excuse.

When an instruction manual tells you to use a Phillips screwdriver to remove the four largest screws on the back case, it makes several "assumptions" about what you already know: it assumes you know what a screwdriver is, what a Phillips head screwdriver is, how to judge size, how to tell the back from the front, and so on. This may sound silly, but it really isn't; after all, we weren't born knowing what a screwdriver is or is for, or how to use it, or what the specific screwdriver types are, or how to accurately gauge distance and size. These are all skills we developed over time. In the same vein, a software application manual shouldn't have to explain the difference between memory and hard disk space when it quotes how much of each you need free, any more than the aforementioned instruction manual should have to explain the difference between a flathead, Robertson, or Phillips screwdriver. Neither should a software application be designed to hide these particulars from you any more than an appliance does; aside from design considerations, you can still see and access the screws holding a toaster together.

If anything, software is the appliance. Not the computer.
