How long has digitization existed?

A journey into the history of computers and digitization

When did digitization begin? That question is not so easy to answer. The term “Industry 4.0” – which is often used synonymously in Germany – only appeared in 2011. In the search for the origin of digitization, however, we have to go back much further. This is the first part of our two-part article “The History of Digitization”.

Image credit: marketwatch.com

Historians like to begin their analyses by taking terms apart and exploring them. The root of the word “digital” is the Latin “digitus”, meaning “finger, toe”, and alludes to counting numbers below ten on one’s fingers. In English, “digit” means both “finger” and “numeral”. Our hands have ten fingers – and ten, written as a number, consists of a 1 and a 0, which are at the same time the two basic states of a computer in the binary system: power off and power on.

The broad use of the term “digital” in today’s sense is much more recent. It came up with the first computers that calculated digitally, that is, with the binary digits 1 and 0. These emerged in the late 1930s and early 1940s. In 1937 the German engineer Konrad Zuse developed the Z1. It was still built entirely mechanically and looked more like the slide rules and calculating machines that had been around for centuries. What made it special: it was the first machine that could calculate with binary numbers and was freely programmable at the same time.

A short time later, computers emerged that were more similar to today’s PCs, such as the American ENIAC (operational in 1945) and the German Zuse Z4 (1945). Neither used the transistors found in today’s machines: ENIAC calculated with electron tubes, the Z4 with relays. The resemblance was limited to the inner workings, though – ENIAC filled a hall measuring 10 by 17 meters! In 1954 the American TRADIC appeared, which used transistors instead of electron tubes for its calculations and was only the size of three refrigerators. TRADIC stands for TRansistorized Airborne Digital Computer – and there we find our term “digital”.

Zero and one – the secret of digitization

But why was binary arithmetic so important for the history of computers? Binary numbers have only two possible states, 0 or 1. That makes the distinction at the hardware level – in an electron tube, a transistor or a relay – much easier than handling a multitude of states. The two states 0 and 1 can be represented electrically simply as “power on” and “power off”, instead of as different voltage levels (e.g. 1 volt = digit 1, 2 volts = digit 2), which would require complex circuitry. This also makes binary data more robust when it is copied and stored. In addition, any kind of information can be stored as binary code, which makes it universally applicable. The mathematician Claude Shannon laid the theoretical groundwork for all of this in his 1948 essay “A Mathematical Theory of Communication” (PDF).
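To make this a little more tangible, here is a minimal Python sketch (our own illustration, not from the original article) showing how an arbitrary piece of information – a short text – can be reduced to nothing but zeros and ones and recovered again:

```python
# Minimal sketch: turn a short text into a string of 0s and 1s and back again.
# Any kind of information (text, images, sound) can be reduced to such bit patterns.

def to_bits(text: str) -> str:
    """Encode the text as UTF-8 bytes and write each byte as 8 binary digits."""
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def from_bits(bits: str) -> str:
    """Reassemble the original text from a string of 0s and 1s."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

bits = to_bits("digit")
print(bits)             # 0110010001101001... – nothing but the two states
print(from_bits(bits))  # -> "digit"
```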

Transistors, the smallest components and arithmetic elements of a modern computer, can do exactly two things: let current through or block it. They are simply “on” or “off”, states that correspond to “1” and “0” in the binary system – and with “power on” and “power off” you can calculate. And because transistors can be made extremely small, they are ideally suited for building powerful computers. For a size comparison: the smallest transistors measure around 20 to 30 nanometers – roughly 250 times smaller than a red blood cell. Today, computer chips the size of a fingernail accommodate billions of transistors and, thanks to this miniaturization, can calculate at enormous speed.
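As a quick plausibility check of that size comparison – the red blood cell diameter of roughly 7 to 8 micrometers is our own assumed figure, not one from the article – the arithmetic looks like this:

```python
# Back-of-the-envelope check of the size comparison (all figures are rough assumptions).
transistor_nm = 25          # smallest transistors: roughly 20-30 nanometers
red_blood_cell_nm = 7_500   # a red blood cell is about 7-8 micrometers across

ratio = red_blood_cell_nm / transistor_nm
print(f"A red blood cell is about {ratio:.0f} times larger than such a transistor.")
# -> about 300 times, the same order of magnitude as the "around 250 times" above
```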

Thus the unstoppable rise of the digital computer began in the 1950s. Since then, the number of transistors on a chip – and with it, roughly, the speed of a computer – has doubled about every two years, the famous Moore’s Law, and we see more and more computerized applications in our everyday lives. With the advent of the Internet at the latest, the exchange and storage of data increased enormously.
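To get a feeling for what doubling every two years adds up to, here is a tiny illustrative calculation (the 70-year span is our own assumption for the sake of the example):

```python
# Illustration of exponential growth under Moore's Law: one doubling every two years.
years = 70                     # e.g. from the mid-1950s until today (assumed span)
doublings = years // 2
growth_factor = 2 ** doublings
print(f"{doublings} doublings correspond to a factor of about {growth_factor:,}")
# -> 35 doublings correspond to a factor of about 34,359,738,368 (roughly 34 billion)
```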

By the way: many other digital technologies that are considered state of the art today also have origins going back a long way. Artificial intelligence, now on everyone’s lips as the next big thing, has its roots in the 1950s and early 1960s. However, because computers were not yet powerful enough at the time, AI was neglected by science and industry for a long time.

One moment please – none of that sounds like the digitization that is so present in the media!

That’s right. What we mean by digitization today – smartphones, the cloud, AI and the like – has little in common with Konrad Zuse and refrigerator-sized computers. But these are without question the origins of digitization, and they are an important building block on the way to answering our initial question: when did digitization begin?

First of all, we should clarify what the term digitization actually means, so that we are all speaking a common language. Today we understand it as a social process: the massive use of digital technologies in our everyday lives, which is changing how society, work and the economy function. None of that was the case in the 1950s, when the “electronic revolution” was more likely to be heard in rock and roll’s electric guitars.

To properly understand digitization and ultimately pin it down, we have to look at a second process that is inextricably linked to it and that we usually mean when we talk about digitization: digitized automation.
