If you’re into the history of computers, you’ll be amazed by these astounding facts from computing’s past.
Computing has been around, in one form or another, for centuries, and its real history is quite unlike what is portrayed in the famous recent film, The Imitation Game.
Computers hit their biggest moment of evolution when they became digitised.
The First Digital Computer
The first digital computer weighed in at 27 tonnes and filled a room of about 1,800 square feet! It cost a whopping $487,000 USD. That’s over $6 million USD in 2019.
ENIAC, or the Electronic Numerical Integrator and Computer, was funded by the US Army and developed at the University of Pennsylvania by John Mauchly and J. Presper Eckert. Built before the invention of the transistor, it relied on roughly 17,500 vacuum tubes, and it marked the beginning of large-scale electronic computing.
The First Hard Drive
The first hard drive ever held only a measly 5MB of data. Introduced in 1956 by IBM, it measured about 1.5 square metres.
Each successive generation of computers replaced larger, more sensitive and more cumbersome devices. The earliest computers were usable only in dustproof, static-proof data centre environments; from there, they progressively moved into factories, offices, and homes.
The platters were initially 24 inches in diameter and were later reduced to the 3.5-inch and 2.5-inch drive sizes we know today.
The First Virus
The first PC virus, Brain, was written in 1986 by the Farooq Alvi brothers of Lahore, Pakistan. The brothers never intended it to spread; they wrote it to stop customers from making pirated copies of their software. The virus infected the boot sector of storage media.
Made only to infect illegal copies of their software, the virus carried the brothers’ names and contact details in its code. But they soon realised it had gone global and was beyond their control. Calls were coming in to Pakistan from the US and the UK, with reports that the virus had wiped data and made computer drives unresponsive.
Now, more than 30 years later, there are well over 100,000 known computer viruses. Read all about the 4 common ways to get a virus.
Doco: Brain, The First Virus In Pakistan
The Origin Of The Word Computer
The word computer was first recorded in 1613. It comes from the Latin “computare”, meaning to calculate or reckon, itself built on “putare”, which meant both to think and to prune. Back then, it was applied to people who performed calculations or computations. The term links to phrases like tidying, setting to rights, balancing an account, and reckoning up. You can learn more about it here: What’s the root of the word computer?
The Difference Engine
The difference engine was invented in 1822 by the Englishman Charles Babbage. It computed several sets of numbers and produced hard copies of the results.
It’s said that Babbage saw a clear need to automate long, tedious astronomical calculations. He penned the paper “On the Theoretical Principles of the Machinery for Calculating Tables.” Automating the calculations assured their accuracy, and Babbage earned one of the world’s first government grants for research and technological development.
With the engine, routine calculations went from 6 digits to 20–30 digits. Each digit sat on a toothed wheel marked 0–9; when a wheel turned past 9, it carried over and turned the next one.
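The engine mechanised the method of finite differences: once the initial differences of a polynomial are set, every further value falls out by addition alone, with no multiplication needed. Here’s a minimal sketch of that idea (an illustration, not Babbage’s actual design):

```python
def difference_table(f, x0, step, order):
    """Build the initial column of finite differences for polynomial f."""
    values = [f(x0 + i * step) for i in range(order + 1)]
    diffs = []
    while values:
        diffs.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    return diffs

def tabulate(diffs, count):
    """Produce `count` successive values using additions only."""
    diffs = list(diffs)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        # Add each difference into the one above it, like a wheel
        # turning its neighbour.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return out

poly = lambda x: x * x + x + 41   # Euler's prime-generating polynomial
print(tabulate(difference_table(poly, 0, 1, 2), 5))  # [41, 43, 47, 53, 61]
```

For a degree-2 polynomial the second differences are constant, so after the first column is loaded, the whole table unrolls from repeated carries and additions, exactly the operations Babbage’s wheels performed.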
The Colossus machines, the first programmable electronic digital computers, were built for British codebreakers during World War 2 to read encrypted German teleprinter messages. Contrary to popular belief, the machine dramatised in The Imitation Game was the earlier electromechanical Bombe used against Enigma, not Colossus.
The improved Colossus Mark 2 then increased decryption speed fivefold. It first worked on the 1st of June, 1944, just in time for the Normandy landings on D-Day.
To maintain secrecy, the machines and their designs were destroyed, and the project was kept secret until the mid-1970s, which prevented the pioneers from exploring the field publicly in their lifetimes.
The Future Of Computing: Quantum Computers?
And now, look where computers are heading. Here’s a glimpse of the first quantum computer marketed for the home. Will it completely disrupt the home computing sector? The key evolution is the qubit: instead of being either 0 or 1, a qubit can exist in a superposition of both at once, so two qubits can hold amplitudes for all four combinations 00, 01, 10, and 11 simultaneously. Learn more about IBM’s new quantum computer here.
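To make the superposition idea concrete, here is a tiny simulation sketch (using NumPy, not real quantum hardware): two qubits are each put through a Hadamard gate, leaving the combined state with an equal amplitude on all four basis states.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
zero = np.array([1.0, 0.0])                    # a qubit in state |0>

# Apply H to each qubit, then combine them with a tensor product.
state = np.kron(H @ zero, H @ zero)

for bits, amp in zip(["00", "01", "10", "11"], state):
    print(f"|{bits}>: amplitude {amp:+.2f}, probability {amp**2:.2f}")
```

Each of the four outcomes carries amplitude 0.5, i.e. a 25% chance of being measured, which is what “reading 00, 01, 10 and 11 simultaneously” really means: the state holds all four possibilities at once until it is measured.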