Ask any millennial about computing, and they’ll likely point you in the direction of a half-eaten fruit logo. Yet the history of computing – the business of carrying out algorithms and other mathematical procedures – far predates smartphones and tablets. Beginning with humble chalk and slate, its applications range from counting crops and livestock to code-breaking. In fact, whilst we now understand a computer as a piece of machinery, the term once referred to a person who performed arithmetic.

In a modern sense, we use computers for myriad purposes – not least the growing variety of entertainment options. However, modern computing hardware developed from far larger and far more basic apparatus, born primarily out of necessity. The original computational device was the abacus, supposedly invented in ancient Babylon circa 2,300 B.C. Sometime later (around 100 B.C.), the Antikythera Mechanism was used to predict astronomical movements decades in advance.

Punching above their weight

Later machines used a system of punched-card data processing. Known as the “father of modern computing”, Charles Babbage conceived of a steam-powered machine capable of computing tables of numbers in 1822, and his later designs adopted punched cards for input. The government-funded project was never completed, but it set the wheels in motion for later generations of computer development.

Due to a rapidly growing population, the 1890 US census was predicted to take seven years to tabulate. Using punched cards that could be read by a tabulating machine, Herman Hollerith reputedly saved the US government $5 million and completed the task in three years – two years faster than the previous census. Some years later, Hollerith’s company would become central to the founding of IBM.

Memory makers

Famed English computer scientist Alan Turing laid down the theoretical principles of modern computing in 1936: his notion of a universal Turing machine held that a single machine could perform the duties of any other machine, given the appropriate programming. A device is considered “Turing-complete” if it can simulate a Turing machine – and by extension, any other Turing-complete system, including any possible real-world computer. Then in 1941, a memorable moment occurred at Iowa State University, when a computer able to store information in its main memory solved 29 equations simultaneously.
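To make that idea a little more concrete, here is a minimal sketch in Python (an illustrative example only – the function and rule names are invented, and this is not drawn from Turing’s own work). The simulator is completely generic; everything the “machine” does comes from the transition table handed to it, which is the essence of one machine imitating another:

    # A minimal Turing machine simulator: the simulator itself is generic;
    # the behaviour comes entirely from the transition table it is given.
    def run_turing_machine(rules, tape, state="start", head=0, accept="halt"):
        tape = dict(enumerate(tape))          # sparse tape; blank cells read as "_"
        while state != accept:
            symbol = tape.get(head, "_")
            write, move, state = rules[(state, symbol)]
            tape[head] = write
            head += 1 if move == "R" else -1
        return "".join(tape[i] for i in sorted(tape))

    # Example: a trivial machine that flips 0s and 1s, then halts at the first blank.
    flip_rules = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }
    print(run_turing_machine(flip_rules, "0110"))   # prints 1001_

Swapping in a different rule table changes what the machine does without touching the simulator – the same principle, scaled up enormously, is what lets a modern computer run any program.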

Making technology more accessible to the general public, in 1964 Douglas Engelbart presented something today’s generation would find more familiar: a prototype of the modern computer, with a mouse and a graphical user interface. Soon, amidst rapidly advancing tech, the mid-1970s saw the dawn of Microsoft and Apple – and the conception of the personal computer. Five years after the first dotcom was officially registered, Hypertext Markup Language (HTML) gave rise to the World Wide Web in 1990, and with advances in graphics, 1994 saw some PCs becoming dedicated gaming machines thanks to Pentium processors.

Quantum leap

We live in the age of Google, Apple, Facebook, and Amazon: the Digital Era.

Recent years have seen developments come thick and fast: new models surpass the old with regularity; tech is smaller, faster and more geared towards our daily lives; and we have apps for work and play, to the extent that these products have become integral to our ability to function – and in some cases, the reason we do. Coming generations will get to grips with the first reprogrammable quantum computer, virtual reality headsets and holographic assistants.

With such rapid and far-reaching developments taking place concurrently, the potential for massive change is at your fingertips. How will you shape the future of computing? A great place to start is with the BSc (Hons) Computing (Mobile Computing) and MSc Data Analytics and IT Security Management programmes.
