Manufacturing and the era of cloud computing

Posted on 20 Aug 2015 by The Manufacturer

The term cloud computing may be commonly uttered in businesses all over the world, but its complex history is less well known. Matillion, a self-service business intelligence and analytics specialist, reveals all.

Matillion - the era of cloud computing

Considering cloud technology has only achieved mainstream adoption in the past decade or so, it may come as some surprise to discover that the origins of the cloud are relatively long-standing.

Far from being a newly discovered technological gimmick, cloud computing can trace its roots back to the 1940s and 50s, developing alongside the personal computer and the World Wide Web.

If you have ever accessed a file or application that was not stored locally on your smartphone, tablet or PC then you’ve benefited from the cloud.

But while this technology has become ubiquitous today, it’s worth remembering that computers themselves have not always been easily available.

It may sound obvious, but for cloud computing to achieve popularity, computers themselves had to enter the mainstream.

The birth of modern computing

It wasn’t until 1948 that the first computer capable of storing a program was created by researchers at Manchester University.

Manchester University developed the world’s first stored program computer, which ran its first program in 1948.

The “Manchester Baby,” as it was dubbed, may have been cumbersome at 17 feet long and limited in its functionality, but it played a pivotal role in the history of computers and software.

The concept of “time sharing,” developed in the 1950s, bears more than a passing resemblance to cloud computing, despite the latter term not coming into existence for decades.

In the early days of computing, devices were large and expensive, meaning any downtime was a huge waste of resources. Time sharing was a concept developed by academics and researchers that allocated users an amount of processing power to ensure computers were used more efficiently.

The idea of sharing resources is the same one that lies at the heart of cloud computing today.
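As a loose illustration (not how any 1950s system was actually coded), time sharing amounts to round-robin scheduling: each user gets a small slice of processing time in turn, so the expensive machine is never idle while work remains. A minimal Python sketch, with made-up user names and workloads:

```python
from collections import deque

def time_share(jobs, quantum):
    """Round-robin scheduler: each job gets at most `quantum` units of CPU per turn."""
    queue = deque(jobs.items())  # (user, remaining work units)
    schedule = []
    while queue:
        user, remaining = queue.popleft()
        used = min(quantum, remaining)
        schedule.append((user, used))
        if remaining > used:
            queue.append((user, remaining - used))  # back of the queue for the rest
    return schedule

# Three users share one machine; no single user monopolises it.
print(time_share({"alice": 5, "bob": 2, "carol": 4}, quantum=2))
```

The output interleaves the three users' work rather than running each to completion, which is exactly the efficiency gain the early researchers were after.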

Hardware advances meant that computers became smaller and more affordable over time, with the Apple II computer launched in 1977 and going on to sell between five and six million units over its lifetime.

The first version of Microsoft’s now-famous operating system, Windows 1.0, launched in 1985.

The launch of the first Windows operating system in 1985 also helped to demonstrate the many benefits that computers could provide for both consumers and businesses, but in order for cloud computing to take off, network technology had to achieve a similar level of progress.

Networks go global

This began in the early 1960s with the development of “packet switching,” which breaks data into small blocks that can be sent independently, allowing many users to communicate over the same network at once.
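The core idea can be sketched in a few lines: a message is split into numbered packets, the packets may arrive in any order, and the sequence numbers let the receiver put them back together. A simplified Python illustration (the message and packet size are arbitrary):

```python
import random

def packetise(message, size):
    """Split a message into small, numbered packets that can travel independently."""
    return [(i // size, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Packets may arrive in any order; sequence numbers restore the original message."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = packetise("HELLO ARPANET", 4)
random.shuffle(packets)     # simulate packets taking different routes
print(reassemble(packets))  # prints "HELLO ARPANET"
```

Because no single sender holds the line for the whole conversation, many conversations can share one network, which is what made large shared networks practical.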

This in turn led to the creation of the ARPANET in 1969, which connected multiple computers and became the first large-scale packet switching network.

When Tim Berners-Lee built on this work to create the World Wide Web, opened to the public in 1991, the foundations for cloud computing were in place.


The bursting of the dot-com bubble and the subsequent growth of digital businesses helped accelerate cloud adoption until it became the multi-billion dollar industry of today.

The future of cloud computing

The likes of Amazon, Google, Microsoft and Netflix all rely on cloud computing to deliver their services, and with countless benefits yet to be tapped, the future remains bright for cloud computing.