By Scott Hamilton
This week I am doing something a little different. I came across some excellent free training material in digital design and computer architecture. Atos is constantly seeking computer system architects, and finding well-trained people is difficult. So, I wanted to give a quick overview of digital design and point my readers to some great free material.
Digital design focuses on the underlying circuits that make computers possible. You learn about the logic circuits that make up the brains of the computer; these circuits are commonly referred to as logic gates. There are four basic gates that combine to make all computation possible: “and,” “or,” “xor” and “not.”
First, we need to explain logic signals; a signal is always either “True,” represented by a positive voltage or the number 1, or “False,” represented by a low voltage or the number 0. In classic 5-volt logic circuits, “True” is a voltage near 5 volts and “False” is a voltage near 0 volts.
The “and” gate is fairly simple to understand; it basically means that all the inputs to the gate must be “True” for the output to be “True,” otherwise the output is “False.” The “or” gate is also fairly straightforward; if any input to the “or” gate is “True,” the output is “True,” and if they are all “False” the output is “False.”
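As a quick illustration (my own sketch, not part of the course materials), these two gates can be modeled as short Python functions and exercised over every two-input combination:

```python
# Model "and" and "or" as Python functions over any number of inputs:
# "and" is True only when every input is True; "or" is True when at
# least one input is True.

def and_gate(*inputs):
    return all(inputs)

def or_gate(*inputs):
    return any(inputs)

# Print the two-input truth tables side by side.
for a in (False, True):
    for b in (False, True):
        print(f"a={a!s:5} b={b!s:5} and={and_gate(a, b)!s:5} or={or_gate(a, b)}")
```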
The “xor,” or exclusive or, is a little more complicated. It is “True” only when exactly one input is “True”; if no inputs are “True,” or more than one is, the output is “False.” This is very useful in machine-learning algorithms where one expects only a single match; as a result it has become a very important logic gate in modern computers.
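Continuing the same Python sketch (my own example, not course code), this rule can be written as a gate that is “True” only when exactly one input is “True”:

```python
# "xor" as described above: True only when exactly one input is True.
# With two inputs this is the familiar exclusive-or, which is True
# exactly when the inputs differ.

def xor_gate(*inputs):
    return sum(inputs) == 1  # Python counts True as 1 and False as 0

print(xor_gate(True, False))          # True: exactly one input is True
print(xor_gate(True, True))           # False: more than one input is True
print(xor_gate(False, False, False))  # False: no input is True
```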
The last gate is probably one of the most useful because it allows one to modify the input or output of any other logic gate. The “not” gate inverts its input: if the input is “True,” the output will be “False,” and if the input is “False,” the output will be “True.”
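The inverter composes naturally with the other gates. For example (again my own sketch, not from the course), putting a “not” on the output of an “and” yields the “nand” gate, from which any other logic function can be built:

```python
# The "not" gate simply inverts its single input.
def not_gate(a):
    return not a

# Inverting an "and" output gives "nand", a functionally complete gate:
# every other logic function can be assembled from nand gates alone.
def nand_gate(a, b):
    return not_gate(a and b)

print(not_gate(True))           # False
print(nand_gate(True, True))    # False: the only False row of nand
print(nand_gate(True, False))   # True
```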
Very early in digital design courses, you will learn how to make each of these circuits from standard transistors and how to use them to do simple mathematical operations. These are the foundational principles behind computer architecture, and a solid understanding of these basic principles is critical to understanding computer architecture.
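To see how gates become arithmetic, here is a half adder sketched in Python (my example, not from the course): it adds two one-bit numbers, with “xor” producing the sum bit and “and” producing the carry bit.

```python
# Half adder: the smallest arithmetic circuit. Adding two bits a and b
# gives a sum bit (a xor b) and a carry bit (a and b).

def half_adder(a, b):
    sum_bit = a != b  # two-input xor: True when the inputs differ
    carry = a and b   # carry is produced only when both inputs are True
    return sum_bit, carry

print(half_adder(False, True))  # (True, False): 0 + 1 = 1, no carry
print(half_adder(True, True))   # (False, True): 1 + 1 = binary 10
```

Chaining half adders (with an extra “or” gate for the carries) gives a full adder, and chaining full adders gives the multi-bit adders found in real processors.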
Once these basics are thoroughly understood, the course material usually moves on to teach how modern computers work from the bottom up. You will learn about the tradeoffs between different designs and ideas. You will design a simple microprocessor of your own, usually a two- or four-bit processor, because these can be designed and implemented with simple circuits and demonstrate the workings at a level that can be easily studied.
You go on to learn how to debug increasingly complex systems at the level of the computer processor. This usually involves understanding how to correct hardware errors through software modification. This is because modifying the underlying hardware of the microprocessor is not possible, at least not in modern computers. The early computers were built out of these simple logic circuits and could actually be repaired and modified after manufacturing to correct hardware errors. The integrated circuit, which puts billions of transistors on a chip the size of a dime, makes repairing the individual circuits impossible today.
I highly recommend the freely available course, “Digital Design and Computer Architecture,” taught by my friend, Professor Onur Mutlu, at Eidgenössische Technische Hochschule Zürich (ETH Zurich) in Switzerland. You can find the course materials at http://lnkd.in/dZsaMZz and the videos on YouTube at http://lnkd.in/d-AquTb. Stay safe and learn something new.
Scott Hamilton is a Senior Expert in Emerging Technologies at ATOS and can be reached with questions and comments via email to email@example.com.