How Computers Work
The "How Computers Work" section of this chapter is a bonus section, because it is somewhat more technical than most of the information in this unit. There will be more extra-credit questions on the test than usual, and they will all focus on the information in this part.
The second part of this chapter, "Switches Used in Computers," is not extra credit and must be read and understood fully.
It is important to remember that computers are not "smart" in the way that we think of the word. That is, they do not "think" like people do. They are very literal, straightforward calculators. However, they are extremely complex and very fast, and run on cleverly designed programs, so they give the impression of being "smart."
But how do they function? What allows them to use 0s and 1s to represent logical decision making?
You know about switches; you use them every day. When you turn on a light, you flick a switch. The switch then sends electricity to the light. When you turn the switch off, there is no electricity and the light goes out.
Switches in computers are used to express numbers. If a switch is a "1," it reads as "TRUE." If the switch is a "0," it reads as "FALSE."
This uses only two digits, a 0 and a 1, which is referred to as binary counting. We will learn more about binary in another unit.
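The idea of counting with only two digits can be previewed with a short sketch (Python is used here purely for illustration; any language would do):

```python
# Counting from 0 to 7 using only the digits 0 and 1 (binary).
# format(n, "03b") writes the number n in binary, padded to three digits.
for n in range(8):
    print(n, "->", format(n, "03b"))
```

The number 5, for example, becomes 101 in binary.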
In 1847, mathematician George Boole introduced Boolean Math, a system of mathematical logic, in which all mathematical operations could be worked out using the concepts of AND, OR, and NOT.
In the image below, you can see the three statements. The red areas show what each statement refers to: the red areas are TRUE, and the white areas are FALSE (they are not what the statement refers to).
The first is x∧y, meaning x AND y.
Where x and y overlap (the area shared by both), the value is TRUE, or "1".
Where it is only x, only y, or neither, the value is FALSE, or "0".
The second is x∨y, meaning x OR y.
Anywhere within x, or y, or in both locations is TRUE, or "1".
Outside of x or y, the value is FALSE, or "0".
The final set is ¬x, or NOT x.
In this case, anything outside of x is TRUE, or "1."
Anything inside of x is FALSE, or "0".
Notice that the values for any set are 0 or 1. This works for computers, which use switches, which are either ON or OFF—that is, 1 or 0, or TRUE or FALSE. Mathematical logic can be expressed by turning switches on and off.
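A short sketch can make the switch idea concrete. Here 1 stands for TRUE (ON) and 0 for FALSE (OFF), and the three Boolean operations are evaluated for every possible input:

```python
# NOT evaluated for each input; AND and OR for every pair of inputs.
for x in (0, 1):
    print("NOT", x, "=", int(not x))

for x in (0, 1):
    for y in (0, 1):
        # & and | are Python's bitwise AND and OR, which on 0s and 1s
        # behave exactly like the Boolean operations.
        print(x, "AND", y, "=", x & y, "   ", x, "OR", y, "=", x | y)
```

Every result is again a 0 or a 1, so each answer could itself be fed into another switch.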
Boolean Logic in Computers
In 1937, Claude Shannon showed that you could use electronics to apply Boolean logic to perform any mathematical process. Boolean math could be carried out in binary (using only 0s and 1s) using only electronic circuits. His work laid the foundation for using electronic machines to perform mathematical operations—the basis of modern computing.
As a result of the work of Shannon and others, computers were able to perform complex operations using Boolean logic, which was easily applied to the electronic parts available.
While all computer operations could be performed with only three decision types (AND, OR, NOT), computers use as many as seven (AND, NAND, OR, NOR, NOT, XOR, and XNOR).
The switches in computers can be arranged as Logic Gates. Logic gates are a number of switches which are set up to express Boolean operations: NOT, AND, NAND, OR, NOR, XOR, and XNOR.
These gates will respond in very specific ways to binary input—that is, when you send them 1s or 0s, they will react in different ways, and pass on different results.
To the right of each operation is a Truth Table, which shows the possible results of inputs into the gates.
A gate set up as a NOT gate (¬x) will change a 1 to a 0, or a 0 to a 1.
It has one input (a 1 or a 0) and one output (a 1 or a 0).
This gate is like a "No!" machine; it disagrees with everything.
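A NOT gate is simple enough to sketch in a couple of lines (the function name `not_gate` is ours, chosen for illustration):

```python
def not_gate(x):
    """NOT gate: turns a 1 into a 0 and a 0 into a 1."""
    return 1 - x

print(not_gate(0))  # -> 1
print(not_gate(1))  # -> 0
```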
Other gates have two or more inputs.
An AND gate (x∧y) will produce a TRUE result (a 1) only if both inputs are 1 (1+1).
Any other combination (1+0, 0+1, or 0+0) will result in a "0," or FALSE. In other words, the AND gate will only give you a "1" if both inputs are true.
It's kind of like those movies where two people with keys must turn the keys at the same time.
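The AND gate's behavior, sketched the same way (the function name is ours):

```python
def and_gate(x, y):
    """AND gate: 1 only when both inputs are 1 (both keys turned)."""
    return 1 if x == 1 and y == 1 else 0

# Print the full truth table for the AND gate.
for x in (0, 1):
    for y in (0, 1):
        print(x, "AND", y, "=", and_gate(x, y))
```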
A NAND is a NOT-AND gate. This is the opposite of the AND gate. In a NAND gate, 1+1 returns a FALSE result (0), and everything else returns a TRUE result (1).
A NAND is useful when you want to allow no more than one true input to pass.
For example, it is like a hat: it can be unused (no one wears it), or one person may wear it, but two or more people cannot wear it at the same time.
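A sketch of the NAND gate, again with a name of our own choosing:

```python
def nand_gate(x, y):
    """NAND gate: the opposite of AND. 0 only when both inputs are 1."""
    return 0 if x == 1 and y == 1 else 1

for x in (0, 1):
    for y in (0, 1):
        print(x, "NAND", y, "=", nand_gate(x, y))
```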
An OR gate (x∨y) will produce a TRUE result if at least one of two or more inputs are "1".
So, 1+1, 1+0, and 0+1 will return a TRUE result (a 1), while 0+0 will return a FALSE (0) result.
This is kind of like saying "either or both." You want an ice cream cone; if there are two scoops or one scoop, you're okay, but if there are no scoops, you get upset.
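The OR gate in the same style (function name ours):

```python
def or_gate(x, y):
    """OR gate: 1 if at least one input is 1."""
    return 1 if x == 1 or y == 1 else 0

for x in (0, 1):
    for y in (0, 1):
        print(x, "OR", y, "=", or_gate(x, y))
```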
A NOR gate is a NOT-OR gate.
It gives a TRUE result only if all inputs are "0". So, 0+0 is TRUE (neither input is 1). However, 0+1, 1+0, and 1+1 all produce FALSE results (a 0).
This is kind of like a "neither" or "none" gate.
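The NOR gate, sketched with a name of our own:

```python
def nor_gate(x, y):
    """NOR gate: the opposite of OR. 1 only when every input is 0."""
    return 1 if x == 0 and y == 0 else 0

for x in (0, 1):
    for y in (0, 1):
        print(x, "NOR", y, "=", nor_gate(x, y))
```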
The XOR is the EXCLUSIVE-OR gate. It gives a TRUE result if both inputs are different from each other. So, a 0+1 and 1+0 return a TRUE (1) result, while 1+1 and 0+0 return a FALSE (0) result.
The XOR can take only 2 inputs, not more.
This is where you can only choose one of something, but not both or neither.
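The XOR gate reduces to a single comparison (function name ours):

```python
def xor_gate(x, y):
    """XOR gate: 1 only when the two inputs are different."""
    return 1 if x != y else 0

for x in (0, 1):
    for y in (0, 1):
        print(x, "XOR", y, "=", xor_gate(x, y))
```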
The XNOR is the EXCLUSIVE-NOR gate. It gives a TRUE result if both inputs are the same as each other. So, a 1+1 and 0+0 return a TRUE (1) result, while 0+1 and 1+0 return a FALSE (0) result.
The XNOR can also take only 2 inputs.
This is like "all or nothing," a "package deal"; you can choose both or neither, but not just one.
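And the XNOR, the mirror image of XOR (function name ours):

```python
def xnor_gate(x, y):
    """XNOR gate: 1 only when the two inputs are the same."""
    return 1 if x == y else 0

for x in (0, 1):
    for y in (0, 1):
        print(x, "XNOR", y, "=", xnor_gate(x, y))
```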
Connecting the Gates
This kind of logic is something that you may use often when you do a Google search. Searches are usually set up to be Boolean. You can list search terms to find, and other terms to avoid. For example, if you do a Google search for United States -California, then you are looking for any web pages that have both United AND States, and NOT California.
To get results for this, you need two AND gates and one NOT gate. They would look like:
These gates always behave the same way under the same circumstances. They are limited, and one can learn how they work so you can always figure out, fairly easily, how they will work under whatever circumstances you can imagine.
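The search example above can be sketched by wiring gate functions together. The function and variable names here are ours, invented for illustration, and real search engines are of course far more complicated:

```python
def and_gate(x, y):
    """AND gate: 1 only when both inputs are 1."""
    return 1 if x == 1 and y == 1 else 0

def not_gate(x):
    """NOT gate: flips a 1 to a 0 and a 0 to a 1."""
    return 1 - x

def page_matches(has_united, has_states, has_california):
    """Two AND gates and one NOT gate: United AND States AND NOT California.
    Each input is 1 if the page contains that word, 0 otherwise."""
    both_words = and_gate(has_united, has_states)   # first AND gate
    no_california = not_gate(has_california)        # NOT gate
    return and_gate(both_words, no_california)      # second AND gate

print(page_matches(1, 1, 0))  # has "United" and "States", no "California" -> 1
print(page_matches(1, 1, 1))  # also has "California" -> 0
```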
The Relay
Relays were the first type of technology used in electronic computers to represent ON and OFF, which was equivalent to TRUE and FALSE, or 1 and 0, as needed in Boolean logic.
A relay is a switch. When an electronic signal is sent to a relay, it opens or closes the switch, using mechanical means (often magnets open or close a connection).
Many of the early computers, including the Zuse Z3, used relays. Because they were mechanical (moving) parts, the computers were not fully electronic.
The Vacuum Tube
Vacuum tubes, the second generation of electronic computer technology, were developed around 1907 and are sometimes referred to as “radio tubes” because they were used in old radio sets. The tubes had multiple uses, but in computers, they acted as switching mechanisms. Each vacuum tube could represent a 1 or a 0 and could act as a gate.
The Colossus computer was the first to make full use of vacuum tubes, but because of its secrecy, nobody knew of this for decades. ENIAC, instead, was celebrated as the first full vacuum-tube-powered computer.
Although vacuum tubes were much faster than relays, they had many disadvantages. They were large, created a great deal of heat, and failed too often. The tubes and the equipment needed to support them made many devices, including radios, quite large. The first well-known electronic computers, such as ENIAC and Whirlwind, occupied entire buildings.
Vacuum tubes are electronic, but they are not always considered "solid state."
The Transistor
When transistors were introduced in the early- to mid-1950s, they replaced vacuum tubes in a variety of areas. One change most people noticed was the release of transistor radios, which allowed for smaller, portable units than was previously possible.
Like vacuum tubes, transistors had multiple uses, one of them being a binary switch usable in computers. The first transistor computer prototype appeared in 1953, but transistor-run computers did not become common until the late 1950's, and none of the early models became particularly famous.
Their advantage was that they were smaller, required less energy, and produced less heat. Also, because they were solid state (built of a solid material with no moving parts), they did not break down as much. The transistor made computers much smaller, able to fit into a room rather than requiring a whole building.
Transistors represent the third generation of electronic computer technology. Despite being smaller and cooler than vacuum tubes, transistors were still too large for many applications. In the 1950s and 60s, the space program demanded smaller and smaller computers, so something beyond the transistor had to be developed.
The Integrated Circuit Chip & the Microprocessor
In 1958, engineers created the first very small Integrated Circuit, or IC chip, which placed micro-transistors on a small semiconducting chip. Although the first IC chips had only a few transistors, they began a new generation of technology that would eventually lead to billions of transistors on a tiny piece of silicon. Instead of having to add multiple transistors to a machine, IC chips made it possible to have many transistors within a single solid-state element. This helped make computers smaller, lighter, cheaper, and more energy-efficient.
Later, in the 1960's and early 1970's, IC chips advanced to extremely complex designs. These were called microprocessors, now known as CPUs. Previously, the computer was composed of many small chips which worked together. By putting a large number of circuits on one chip, computers could be built more cheaply, and could operate much faster. The first microprocessor, the Intel 4004, went on sale commercially in 1971.
Gordon Moore was a co-founder of Intel, the company that makes the microprocessors for most computers today. Before he created that company, however, he came up with an estimate of how quickly transistors and other components could be miniaturized. In 1965, he predicted that the number of components in a certain space, especially transistors, would double every year; however, in 1975, he changed his prediction to say that the number would double every two years.
This prediction, called Moore's Law, held roughly true for decades. However, in recent years, as components became so small that their width could be measured in dozens of atoms, Moore's Law began breaking down. As a result, microchip makers have been trying to find other ways to make their products faster and more powerful.
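Moore's revised 1975 prediction, doubling every two years, compounds quickly. Starting from the Intel 4004's roughly 2,300 transistors in 1971, a sketch of the projection (the real industry did not track the doubling exactly, so treat this as an illustration of compound growth, not historical data):

```python
# Project transistor counts under Moore's Law: double every two years.
count = 2300  # approximate transistor count of the Intel 4004 (1971)
for year in range(1971, 1992, 2):
    print(year, count)
    count *= 2
```

By this projection the count passes one million around the end of the 1980s, which matches the general trend of real chips from that era.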