It was in the 1960s that general-purpose business computers first became available and kicked off the evolution of business computing. Because they were so large and so expensive, it was impractical for small and medium-sized firms to have their own. However, they could buy “time” on someone else’s computer, and this gave rise to the first time-sharing operating systems – which allowed many users to run their programs on a single computer, apparently simultaneously and without interfering with each other. Of course, these time-sharing users had to get their data in and out of the computer. This was best achieved (remotely) using telephone circuits and modems, which converted digital data into a form that could be transmitted over a network designed for speech.
In the 70s and 80s, the cost of computers dropped, enabling even small businesses to own them. Indeed, not just one per company but one per employee – the personal computer (PC) had arrived. This then created the issue of how to distribute the software that made these PCs useful – for example spreadsheets, word processors, accounting systems, etc. In an era when telephone networks were still painfully slow and unreliable, CDs and DVDs were the media of choice. Every time an update was made to the software – to improve functionality or to fix a bug – the user had to be sent a new CD/DVD so they could update the software without destroying their own data. And if the update did not work as intended, the user had to go back to the previous version. The result was that many users preferred to stay on old versions for years rather than risk an abortive update. Add to that the fact that CDs and DVDs could easily be copied, and software manufacturers responded with so many debilitating licensing mechanisms that it was often impossible to run software you legitimately owned.
By 2000, the practical realization of a worldwide network connecting any type of digital device had extended from academia and the military into the general business world to become what we now call the internet. With the internet in place, it now made practical sense to go full circle and return to the 1960s idea of time-sharing big computers. Sure, by 2000 most people had a computer on their desk, but big computers had become even more powerful, and a whole raft of applications existed that simply would not run on a PC. The most obvious examples are artificial intelligence applications such as speech recognition, but it turned out that many less demanding applications, such as business accounting, worked much better with many users sharing a big computer than with each user having their own small machine.
Nevertheless, it certainly made good sense to use a PC as a glorified terminal connecting to a large computer located remotely and running some super-powerful software. But how does a non-specialist PC user run a remote application from any PC? Once again, a technology developed in academia and government solved this problem – the World Wide Web. In fact, the web is simply a set of rules that programmers stick to so that their software can talk to each other’s. The internet covers the rules for communicating the data, and the web covers the rules for presenting the data to a human.
However, the issue remained that the internet was still not universally fast or reliable. Fast, reliable access needed to become the norm for the average home user to tick the last box for the growth of cloud computing – cloud being the buzzword adopted to describe the set of all computers connected to the internet. Although it did not happen instantaneously, by 2010 we had reached the tipping point, where software developers were now considering developing cloud applications – that is, applications that run on remote computers accessed over the internet – rather than native applications – applications that run on PCs in local area networks.
Almost inevitably, we continue to repeat the cycle of returning to old technologies. Just as we have returned to time-sharing, we are slowly becoming more independent of the World Wide Web and the application that depends on it utterly, the web browser (for example, Internet Explorer, Chrome, Safari, and so on). Great as browsers are at hiding the differences between types of device – Macintosh, Unix, Windows, or phone – they do impose limitations that native applications do not have. So the trend nowadays is to replace web applications with native applications, which still use the internet to communicate with the remote host computer. Examples are YouTube, Office 365 and WhatsApp: these are all cloud applications, but you can access them with either a web browser or a native application. But it is only a trend. It would be impossible to drop support for web applications without losing the freedom to run the cloud service from any device anywhere in the world.
So with that brief review of the background, here are some of the key enablers for cloud computing…
- High-speed, reliable internet – typically with average download speeds exceeding 10 Mbps and 99% uptime.
- Generic service hosting – clearly, cloud computing would not be possible without cloud applications to run. All the major software manufacturers now have significant cloud service offerings.
- The revolution in the design of software for distributed systems – writing software that communicates with somebody else’s software is no easy task, and making this work reliably has required considerable advances in software design.
- Open-source software – this has removed the massive barrier that proprietary (secret) models presented to software development. Now most developers will search for existing software before developing their own – frequently building, in months, applications that would otherwise require tens of person-years of development effort.
- The rental model – most cloud services are monetized by a monthly fee rather than an up-front, one-off payment. This benefits the user in three ways: no large capital payment, no separate support contracts, and the freedom to stop paying when the software is no longer required.
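The economics of the rental model above can be sketched with a simple break-even calculation. All figures here are purely illustrative assumptions, not real market prices:

```python
# Purely illustrative figures (assumptions, not real prices)
UPFRONT_LICENCE = 600.0   # hypothetical one-off purchase price
MONTHLY_FEE = 12.0        # hypothetical cloud subscription per month

def total_rental_cost(months: int) -> float:
    """Cumulative subscription cost after a given number of months."""
    return MONTHLY_FEE * months

# The point at which cumulative rent overtakes the one-off payment
break_even_months = UPFRONT_LICENCE / MONTHLY_FEE
print(f"Rental overtakes the one-off price after {break_even_months:.0f} months")

for months in (12, 36, 60):
    print(f"{months:>2} months: rental = {total_rental_cost(months):7.2f}, "
          f"one-off = {UPFRONT_LICENCE:7.2f}")
```

With these example numbers the subscription stays cheaper for just over four years, which is one reason the model is attractive to users who are unsure how long they will need the software.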
The key benefits…
- Continuous upgrades – in the cloud model, software is continuously updated and upgraded so the user gets immediate benefit with minimal effort.
- Concentration of computer power – as noted above, most applications run more efficiently on one large computer than on many small ones. This is partly because the large computer is intrinsically more efficient, but also because it mops up latency (that is, the time most computers spend doing nothing).
- Vastly lower costs – no need to spend time procuring, servicing, maintaining, updating, and securing physical hardware and native applications.
- Working from anywhere – cloud-based applications not only allow users to work from any computer in any place; it also means that, in the event of a disaster, data is secure and business interruption is minimised.
The key risks…
- Dependence on the internet – cloud services imply significant dependence on the availability of the internet. Although some services do allow “offline” working, this is not always practical.
- Security breaches – with all a user’s data stored in the cloud, it is an ever-present temptation for hackers and competitors in a way that would not be possible if the data were stored onsite. Data can be destroyed, stolen, sold, modified, ransomed, and so on.
- Privacy breaches – a leak of employees’ confidential personal data can be as damaging as a leak of business data.
- Data latency – no matter how fast internet access becomes, it is worth remembering that nothing travels faster than light, and when you are accessing data on a cloud server on the other side of the world, these delays can add up.
- The price of “free” services – many cloud services are available free of charge. As we all know, nothing is ever free, and the price you pay is sharing your data with the service provider. Read their Ts & Cs and you may be shocked at what you have agreed to. And if you don’t pay, you have few rights.
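The latency point above can be made concrete with some back-of-envelope arithmetic. Light in optical fibre travels at roughly two-thirds of its speed in a vacuum, about 200,000 km/s; the distance used below is an assumed example, and real routes add routing and processing delays on top of this physical lower bound:

```python
# Physics lower bound on round-trip latency (distance figure is an assumption)
SPEED_IN_FIBRE_KM_PER_S = 200_000   # light in optical fibre: roughly 2/3 of c
distance_km = 15_000                # e.g. approximately London to Singapore

one_way_s = distance_km / SPEED_IN_FIBRE_KM_PER_S
round_trip_ms = 2 * one_way_s * 1000
print(f"Best-case round trip: {round_trip_ms:.0f} ms")

# A "chatty" application making many sequential round trips pays this cost every time
sequential_requests = 50
total_s = sequential_requests * round_trip_ms / 1000
print(f"{sequential_requests} sequential requests: {total_s:.1f} s of pure travel time")
```

Even in this best case, fifty sequential requests cost several seconds before the server does any work at all, which is why well-designed cloud applications batch requests and cache data locally.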
So in summary, cloud applications are an enormous advance on the early native-application model, although the distribution of work between the local client and the remote server is constantly changing. In almost every respect, cloud-based applications win over applications based on local area networks, but it is important not to forget that there are some risks.