“Classic vs. Modern Computing”

Image from the Internet Archive, Cover page of Compute!'s Gazette magazine October 1985 Edition.

I was feeling a bit nostalgic this week and began to think about some of the differences between computers 40 years ago and today. You might find it surprising that there are really only minor differences in capabilities and programming techniques. While we have several new programming languages, the low-level languages of today differ very little from those of the 1980s. The main difference is in the amount of information computers can store and access.

Computer processors work from some fairly simple concepts that have not changed since the 1980s, and probably not much since even earlier systems. The processor consists of a few main components that do the same things today as they did back then. It has an instruction set, which is the list of operations the CPU knows how to perform, for example add, subtract, multiply, divide and compare values. It has registers, which are memory locations local to the processor where values are loaded, calculations are done and results are stored. It has memory access channels, which allow the CPU to store more values than it has registers and recall them later. The architecture, meaning how these functions are built electrically, has changed significantly, but the instructions have not, aside from some additions to handle larger amounts of memory and some specialized arithmetic units designed for faster matrix computations. So really, if you learned programming on a computer 30 or more years ago, everything you learned and remember would still work today.
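To make those three ideas concrete, here is a minimal sketch in Python of a toy processor: an instruction set, a couple of registers, and a small memory it can store to and load from. The mnemonics here are hypothetical, invented for illustration rather than taken from any real chip, but the pattern matches what both a 1980s 8-bit CPU and a modern processor do at heart.

```python
# A toy CPU: instruction set, registers, and memory.
# Mnemonics (LOADI, ADD, STORE, LOAD) are illustrative, not a real ISA.

def run(program):
    regs = {"A": 0, "B": 0}   # registers: values held locally by the "CPU"
    memory = [0] * 16         # memory: more storage than the registers offer

    for op, *args in program:
        if op == "LOADI":     # put an immediate value into a register
            regs[args[0]] = args[1]
        elif op == "ADD":     # add one register into another
            regs[args[0]] += regs[args[1]]
        elif op == "STORE":   # save a register's value to a memory address
            memory[args[1]] = regs[args[0]]
        elif op == "LOAD":    # recall a saved value from memory
            regs[args[0]] = memory[args[1]]
    return regs, memory

# Compute 2 + 3, store the result at address 0, then read it back.
program = [
    ("LOADI", "A", 2),
    ("LOADI", "B", 3),
    ("ADD", "A", "B"),
    ("STORE", "A", 0),
    ("LOAD", "B", 0),
]
regs, memory = run(program)
print(regs["A"], memory[0])   # prints: 5 5
```

Swap the Python dictionary for physical flip-flops and the list for RAM chips, and this is essentially the loop every processor has run since long before 1985.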

I will say that as far as programming and software design go, the biggest change is that we have finally come very close to perfecting artificial intelligence (AI). We have dreamed of computers that think and can assist us in solving more complex problems since the first computer was designed, so AI is nothing new; the big change that makes AI more effective today is the amount of digital data available to the algorithms. You see, just as we learn by exposing ourselves to new information, so must AI. The more information you can access and remember, the more intelligent you become. It is the same for AI: the more information it can access, the smarter it seems to become.

The advancement of AI has brought a significant shift in how I look at computers. In the past I always felt the computer was an excellent tool for learning: it gave us rapid access to new information and, with its limited capabilities, challenged us to write better software to improve that access. I feel exactly the opposite today; the more advanced AI systems and newer computers seem to hinder our ability to learn. In the 1980s, if I wanted to use a computer effectively, I needed to write my own software, usually in BASIC, to teach the computer how to accomplish the task I wanted to speed up. Today I just ask Google how to do something, anything at all, not even related to computers, and an AI will give me step-by-step instructions and maybe even a video on how to do it.

I feel like the roles have shifted: we have become the tool that trains the AI on how to accomplish tasks, so that others can learn how to do things not from us directly, but from the AI. The computer has gone from being a tool for learning to being a tool for avoiding learning. If I can just ask my cell phone how to do something, or have it calculate something for me, why should I memorize anything, or for that matter learn anything myself? I get that question from a lot of the high-school students I work with: “Why do I have to learn math? The computer will do it all for me,” or “Why do I have to write papers when I can just ask Google to write them?”

In the Netherlands, educators discovered that introducing computers too early resulted in much lower-than-average test scores. Unfortunately, this was after they had shifted to fully online learning, even in the classroom. They have since moved back to pen-and-paper learning in their national education system and have seen scores improve. I attribute the original decline to the misconception that computers will always be there, so we can simply let them do things for us. There is no promise that Google, ChatGPT, Claude AI, or Microsoft Copilot will still be around five years from now, and even within my own lifetime I have data on disks from the 1980s that I can no longer access without finding a hardware emulator and an original drive capable of reading the disks. We now seem to rely only on “cloud” storage, and very few people or businesses keep anything on paper. I challenge you this week to take inventory: what would you need if the Internet went away tomorrow? It can happen; in fact, it is happening now in Iran. Make sure you are not caught without critical information if your phone stops working tomorrow. Until next week, stay safe and learn something new.

Scott Hamilton is an Expert in Emerging Technologies at ATOS and can be reached with questions and comments via email to shamilton@techshepherd.org or through his website at https://www.techshepherd.org.
