Computer Processor History

As of the writing of this article, the first microprocessor, the Intel 4004, will turn 48 years old when the clock strikes 12:00 a.m. on November 15, 2019. For nearly half a century, processors have changed the way computers handle information. However, that fact only scratches the surface of the long journey the processor took to become what it is today. Its journey starts way back in 1823. You read that right: its journey starts in the early 1800s.

Saying the microprocessor was invented in 1971 would not be the full story and, quite frankly, would downplay the accomplishments of others. In truth, the processor would not be what it is today if key components hadn’t been discovered decades before. It just took the right people making the right decisions to stir the right ingredients together.

The Official Discovery of Silicon, Atomic Number 14

Silicon was placed on the path to greatness when the element made its way into the hands of Jöns Jacob Berzelius in 1823. Before then, silicon, or rather ‘silica,’ had been known for quite some time, even used in ancient civilizations, but thanks to Berzelius’ work, silicon was “discovered” and, in the same year, first isolated.

In fact, it was almost isolated back in 1811 by Louis Jacques Thenard and Joseph Louis Gay-Lussac, but they failed to purify it. Jöns Jacob Berzelius then used the same method 12 years later, purifying it in the process and thereby creating and cataloging amorphous silicon.

Nikola Tesla and Electrical Logic Circuits

Yes, that Nikola Tesla. The great mind himself created electrical logic circuits called “gates” and “switches” back in 1903. He patented them that same year, solidifying the next component of modern-day processors.

The Invention of Transistors

Forty-four years later, in 1947, John Bardeen, Walter Brattain and William Shockley developed the transfer resistor or, as it’s better known, the “transistor.” Built from semiconductors, the transistor laid the groundwork for the processor, a very important step that continued the domino effect started in 1823. The following year, in 1948, its patent was approved.

What makes the transistor such an integral part of a CPU is binary language, which the computer uses to communicate. Vacuum tubes fell by the wayside and transistors took over. Vacuum tubes were large and gave off a lot of heat, whereas you could cram dozens of transistors into the same amount of space without creating a sauna.

With the transistor taking a front-row seat in the development of the microchip, something incredible happens when you group transistors together: they can be used to speak in binary and solve problems using Boolean logic. A grossly simplified explanation of Boolean logic is that a value is either TRUE or FALSE. That’s the main nugget of truth, and it’s thanks to George Boole.

In other words, certain conditions have to be met for a certain outcome to happen. Imagine you’re trying to buy an M-rated game. In America, you generally have to be at least 17 to buy an M-rated game. Now, let’s say you drop the game on the counter and the employee asks, “Are you 17?” and you say “No.” That counts as a FALSE value, and whatever outcome you were hoping for isn’t going to happen.
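
To make that concrete, here is a minimal sketch of the same idea in Python. The function name and the ID check are made up for illustration, not taken from any real system:

    # A minimal sketch of Boolean logic: the sale happens only if every
    # condition evaluates to TRUE. This is a logical AND of two values.

    def can_buy_m_rated_game(age: int, has_id: bool) -> bool:
        # Both checks must be TRUE for the whole expression to be TRUE.
        return age >= 17 and has_id

    print(can_buy_m_rated_game(19, True))   # True: both conditions are met
    print(can_buy_m_rated_game(16, True))   # False: the age check fails

A transistor-based logic gate does the same job in hardware: an AND gate outputs a high voltage (TRUE) only when both of its inputs are high.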

As much of a jump as the transistor was for the glorious processor, it would be another 24 years before Intel dropped its first microprocessor. There was still plenty of roadwork to be done.

The Creation of the First Integrated Circuit

September 12, 1958 hailed the coming of the first integrated circuit, demonstrated by Jack Kilby of Texas Instruments, with Robert Noyce of Fairchild Semiconductor developing his own version shortly after. But here’s a point that confuses many: every microprocessor is an example of an integrated circuit. That might make you think computer processors were invented in the 1950s, not in 1971, right? Well, no. While every microprocessor is an integrated circuit, not every integrated circuit is a microprocessor.

At any rate, that same Robert Noyce would go on to found Intel Corporation with Gordon Moore ten years later, in 1968, and, with the help of Ted Hoff, Federico Faggin and Stan Mazor, magic would happen just three years after that, in 1971.

All Hail the Microprocessor

So, relative to 1971, it took 148 years of discoveries and inventions before Intel and Ted Hoff released the first microprocessor, the Intel 4004. And all those lovely discoveries played important roles in creating it.

Jöns Jacob Berzelius’ discovery of silicon, one of the most abundant materials used in processors; Nikola Tesla’s electrical logic circuits; the first transistor; the first integrated circuit in 1958: all of those incredible inventions funneled into Intel’s microprocessor.

The stars aligned for a momentous occasion, and you, sitting there reading this on your computer or your phone, are using the same technology that was so carefully crafted so long ago, albeit in a far more powerful and sophisticated form.

The Intel 4004 Microprocessor

Released on November 15, 1971, the Intel 4004 was, for its time, a powerful piece of hardware. It contained 2,300 transistors, could perform 60,000 operations every second, and could address 640 bytes of memory. At the time, it cost about US$200. What do you think such an amazing piece of technology would be used for? Why, a calculator, of course: specifically the Busicom 141-PF, also known as the NCR 18-36.

To be fair, the Intel 4004 was specifically designed for Busicom’s calculators. It may not seem impressive now, considering everyone’s walking around with more computing power in their pockets than was available for the Moon landing, but the Intel 4004 deserves the recognition. It’s like the first commercial automobile or the first plane: something later technology would build on and improve. Even that calculator could process math far faster than the human mind could manage.

When you use a calculator and give it a math problem, that act of handing it a job is known as an “instruction.” Every bit of software you use makes use of instructions, whether it’s the browser you’re using to read this article or the pixels of your video game, and depending on the software, it could be using millions of instructions at a time. The Intel 4004, for example, could churn out about 46,250 to 92,500 instructions every second.
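
To give a feel for what an instruction is, here is a toy sketch of the fetch-decode-execute cycle in Python. The three-operation instruction set is invented for illustration and is not the Intel 4004’s real one:

    # A toy processor loop: fetch each instruction, decode its opcode,
    # and execute it. Real processors repeat this millions (now billions)
    # of times every second.

    program = [
        ("LOAD", 7),    # put the value 7 into the accumulator
        ("ADD", 5),     # add 5 to the accumulator
        ("STORE", 0),   # write the result into memory cell 0
    ]

    memory = [0] * 4
    accumulator = 0

    for opcode, operand in program:      # fetch
        if opcode == "LOAD":             # decode and execute
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand
        elif opcode == "STORE":
            memory[operand] = accumulator

    print(memory[0])  # 12

Even asking a calculator for 7 + 5 breaks down into several steps like these, which is why instructions per second is a meaningful measure of a processor’s speed.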

A really good way to show how impressive that was, at least at the time, is to compare the number of instructions the Intel 4004 could pull off against ENIAC, the first electronic general-purpose computer. ENIAC was completed back in 1945 (but the public wouldn’t know about it until 1946) and could pull off about 5,000 instructions per second. Not impressed? Well, ENIAC was the size of a 50-foot room. The Intel 4004? About the size of your fingernail. Today’s Intel processors are pushing past 80 billion instructions a second. That should help put into perspective the leaps processors make every year.

The Intel 8080

The next microprocessor to set the bar was the Intel 8080 in 1974. Intel did release the Intel 8008 in 1972, a year after the Intel 4004, and while it had 1,200 more transistors (3,500 altogether), its clock speed could only ever reach about two-thirds of the 4004’s. For comparison, the 4004 was designed to reach a clock of 1 MHz but could only ever reach 740 kHz, while the 8008 was only hitting 0.5 MHz, or 500 kHz. The Intel 8080, by contrast, was about 10 times faster than the Intel 8008.

The Intel 8080 pushed the boundaries with a clock speed of 2 MHz, even as high as 3 MHz. It made its way into the Altair 8800, which is hotly contested but usually considered the first personal computer. For those of you too young to remember, when you ordered the Altair 8800, it was sent to you as a kit, though you could also buy it already assembled.

What makes the Altair 8800 such a pivotal turn for the next generations of computers, and by extension the Intel 8080, is that it would go on to inspire the minds of Bill Gates and Steve Jobs. And let’s not forget that the Intel 8080 also powered Space Invaders.

The Pentium Processor

Jump forward to 1993, past several generations of Intel’s processors, and you’ll see the birth of a familiar name: the Pentium line. The P5, the very first of the Pentium family of microprocessors, was introduced on March 22, 1993.

As for specs, the P5 ran at 60 MHz, a huge leap compared to the Intel 4004’s 0.74 MHz. And instead of 2,300 transistors, the P5 was packing 3.1 million of them. Its other model, a 567-pin version, had an even faster clock speed of 66 MHz. The P5 also made it possible for a processor to execute two instructions at a time rather than one, a design known as superscalar execution, which essentially doubled the throughput.
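
Here is a toy sketch of that dual-issue idea in Python. It assumes a simplified pairing rule, namely that two instructions share a clock cycle only if neither depends on the other; the P5’s actual pairing rules were more involved:

    # A toy model of dual-issue execution: up to two independent
    # instructions can share one clock cycle. Illustrative only.

    instructions = [
        ("a", set()),        # a = ...    no dependencies
        ("b", set()),        # b = ...    independent of 'a', so it can pair
        ("c", {"a", "b"}),   # c = a + b  depends on both, so it must wait
    ]

    cycles = []
    for name, deps in instructions:
        # Pair into the current cycle if a slot is free and the instruction
        # doesn't depend on anything issued in that same cycle.
        if cycles and len(cycles[-1]) < 2 and not (deps & set(cycles[-1])):
            cycles[-1].append(name)
        else:
            cycles.append([name])

    print(cycles)  # [['a', 'b'], ['c']]: three instructions in two cycles

A single-issue processor would need three cycles for those three instructions; pairing the independent ones gets the job done in two.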

Processors of generations past were certainly processors by definition, but now they were really pushing the boundaries. As a 32-bit processor, the P5 was hailed as one of the most advanced processors of its day and would go on to house itself in many of the computers manufactured at the time.

The AMD Athlon

This microprocessor makes its way here for being the first desktop processor to reach a clock speed of 1 GHz, which it did in early 2000, less than a year after the line launched in 1999. To put that in perspective, just seven years before, processors were pushing 66 MHz. Now they were reaching 1 GHz, which is 1,000 MHz, and, as you may have guessed by now, Intel wasn’t the only one making processors anymore.

AMD had been around ever since 1969, but wasn’t a serious contender against Intel until much later. The AMD Athlon is what made Intel sweat: it was faster than the Intel Pentium III, by then the third generation of the Pentium line of processors.

Of course, Intel would push the envelope again the following year, in 2000, with its own Pentium 4, with clock speeds surpassing 1 GHz and ranging from 1.3 GHz up to 3.08 GHz.

Bottom Line

There are generations of processors that could make their way onto this list, and each generation surpasses the last, whether in clock speed, transistor count or power. As of the writing of this article, Intel is currently pushing its Core i9-9900KF, which sits at a base clock of 3.6 GHz and can boost itself all the way to 5.0 GHz. AMD, on the other hand, is always close to or at Intel’s level.

The microprocessor has had a very long journey, and the kind of power it brings to your computer has led to the development of fans to keep processors cool (because they can overheat to the point of damage). And the processing power you get now will probably be half or less of what a processor might achieve another 10 years down the road. Only time will tell, but it’s possible you’ll see 10 GHz or more; computers tend to boost their capabilities about every two years.

Until then, it’s always worth looking back and comparing the once-celebrated 740 kHz clock speed to Intel’s massive 3.6 to 5.0 GHz processors.