Transforming software products into cloud services

It was in the 1960s that general-purpose business computers first became available and kicked off the evolution of business computing. Because they were so large and so expensive, it was impractical for small and medium-sized firms to own one. However, they could buy "time" on someone else's computer, and this gave rise to the first time-sharing operating systems – which allowed many users to run their programs on a single computer, apparently simultaneously and without interfering with each other. Of course, these time-sharing users had to get their data in and out of the computer. This was best achieved remotely using telephone circuits and modems, which converted digital data into a form that could be transmitted over a network designed for speech.

In the 70s and 80s, the cost of computers dropped, enabling even small businesses to own them – indeed, not just one per company but one per employee: the personal computer (PC) had arrived. This created the issue of how to distribute the software that made these PCs useful – for example spreadsheets, word processors, accounting systems, and so on. In an era when telephone networks were still painfully slow and unreliable, CDs and DVDs were the media of choice. Every time an update was made to the software – to improve functionality or to fix a bug – the user had to be sent a new CD/DVD so they could update the software without destroying their own data. And if the update did not work as intended, the user had to go back to the previous version. The result was that many users preferred to stay on old versions for years rather than risk an abortive update. Add to that the fact that CDs and DVDs could be copied, and software manufacturers responded with so many debilitating licensing mechanisms that it was often impossible to run software you owned.

By 2000, the practical realisation of a worldwide network connecting any type of digital device had extended from academia and the military into the general business world to become what we now call the internet. With the internet in place, it made practical sense to go full circle and return to the 1960s idea of time-sharing big computers. True, by 2000 most people had a computer on their desk, but big computers had become even more powerful, and a whole raft of applications existed that simply would not run on a PC. The most obvious examples are artificial intelligence applications such as speech recognition, but it turned out that many less demanding applications, such as business accounting, worked much better with many users sharing a big computer than with each user running their own small machine.

Nevertheless, it certainly made good sense to use a PC as a glorified terminal connecting to a large computer located remotely and running some super-powerful software. But how does a non-specialist PC user run a remote application from any PC? Once again, a technology developed in academia and government solved this problem: the World Wide Web. In fact, the web is simply a set of rules that programmers stick to so that their software can talk together. The internet covers the rules for communicating the data, and the web covers the rules for presenting the data to a human.
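To make that distinction concrete, below is a minimal sketch in Python (the public test domain example.com is used purely for illustration): the socket connection follows the internet's rules for moving bytes, while the text sent over it follows the web's rules (HTTP) so that the server knows what is being asked of it.

```python
# Minimal sketch: the internet layer moves the bytes, the web layer
# gives them meaning. Uses only the Python standard library.
import socket

# Internet rules (TCP/IP): open a reliable byte stream to the server.
sock = socket.create_connection(("example.com", 80))

# Web rules (HTTP): format a request the server will understand.
request = (
    "GET / HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Connection: close\r\n"
    "\r\n"
)
sock.sendall(request.encode("ascii"))

# Collect the reply and print just the response headers.
response = b""
while chunk := sock.recv(4096):
    response += chunk
sock.close()
print(response.split(b"\r\n\r\n")[0].decode("ascii", errors="replace"))
```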

However, the issue remained that the internet was still not universally fast or reliable. This needed to become the norm for the average home user to tick the last box for the growth of cloud computing – "cloud" being the buzzword adopted to describe the set of all computers connected to the internet. Although it did not happen instantaneously, by 2010 we had reached the tipping point where software developers were considering developing cloud applications – that is, applications that run on remote computers accessed over the internet – rather than native applications, which run on PCs in local area networks.

Almost inevitably, we continue to repeat the cycle of returning to old technologies. Just as we have returned to time-sharing, we are slowly becoming less dependent on the World Wide Web and the application built entirely around it, the web browser (for example, Internet Explorer, Chrome, Safari, and so on). Great as browsers are at hiding the differences between types of device – Macintosh, Unix, Windows, or phone – they impose limitations that native applications do not have. So the trend nowadays is to replace web applications with native applications that still use the internet to communicate with the remote host computer. Examples are YouTube, Office 365 and WhatsApp: all cloud applications, yet accessible through either a web browser or a native application. But it is only a trend. It would be impossible to drop support for web applications without losing the freedom to run a cloud service from any device anywhere in the world.
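As a sketch of why the same cloud service can sit behind both a browser and a native application, consider the following Python fragment. The host name and endpoint are hypothetical, invented for illustration; the point is simply that a native client speaks the same HTTP to the same remote computer that a browser would.

```python
# Hypothetical native client for a cloud service. The URL below is
# invented for illustration; it is not a real API.
import json
import urllib.request

def fetch_messages(user_id: str) -> list:
    """Fetch a user's messages directly from the cloud service,
    with no web browser involved."""
    url = f"https://api.example-service.com/v1/users/{user_id}/messages"
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read())

# A browser pointed at the service's web front end would ultimately
# issue much the same request; only the user interface differs.
```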

So with that brief review of the background, here are some of the key enablers for cloud computing…

  • High-speed, reliable internet – typically with average download speeds exceeding 10 Mbps and 99% uptime (that is, no more than about 3.7 days of downtime a year).
  • Generic service hosting – clearly, cloud computing would not be possible without cloud applications to run, and all the major software manufacturers now have significant cloud service offerings.
  • The revolution in the design of software for distributed systems – writing software that communicates with somebody else’s software is no easy task, and making it work reliably has required considerable advances in software design.
  • Open source software – this has removed the massive barrier that proprietary (secret) models presented to software development. Now most developers will search for existing software before developing their own, frequently building in months applications that embody tens of person-years of development effort.
  • The rental model – most cloud services are monetised by a monthly fee rather than an up-front one-off payment. This benefits the user, who avoids a large capital payment and support contracts, and is free to stop paying when the software is no longer required.

The key benefits…

  • Continuous upgrades – in the cloud model, software is continuously updated and upgraded so the user gets immediate benefit with minimal effort.
  • Concentration of computing power – as noted, most applications run more efficiently on one large computer than on many small ones. This is partly because the large computer is intrinsically more efficient, but also because it mops up latency – that is, the time most computers spend doing nothing.
  • Vastly lower costs – no need to spend time procuring, servicing, maintaining, updating and securing physical hardware and native applications.
  • Working from anywhere – cloud-based applications not only allow users to work from any computer in any place; they also mean that, in the event of a disaster, data remains secure and business interruption is minimised.

The key risks…

  • Dependence on the internet – cloud services imply significant dependence on the availability of the internet. Although some allow “offline” working, this is not always practical.
  • Security breaches – with all a user’s data stored in the cloud, it presents an ever-present temptation to hackers and competitors in a way that would not be possible if the data were stored onsite. Data can be destroyed, stolen, sold, modified, ransomed, and so on.
  • Privacy breaches – a leak of employees’ confidential personal data can be as damaging as a leak of business data.
  • Data latency – no matter how fast internet access becomes, it is worth remembering that nothing travels faster than light, and when you are accessing data on a cloud server on the other side of the world, these delays can add up (see the rough calculation after this list).
  • The price of “free” services – many cloud services are available free of charge. As we all know, nothing is ever truly free, and the price you pay is sharing your data with the service provider. Read the terms and conditions and you may be shocked at what you have agreed to. And if you do not pay, you do not have many rights.
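To put the light-speed point in perspective, here is a rough back-of-the-envelope calculation in Python, using an approximate London-to-Sydney distance:

```python
# Rough lower bound on round-trip latency imposed by the speed of light.
SPEED_OF_LIGHT_KM_PER_S = 300_000   # ~3 x 10^8 m/s in a vacuum
LONDON_TO_SYDNEY_KM = 17_000        # approximate great-circle distance

one_way_s = LONDON_TO_SYDNEY_KM / SPEED_OF_LIGHT_KM_PER_S
round_trip_ms = 2 * one_way_s * 1000
print(f"Theoretical minimum round trip: {round_trip_ms:.0f} ms")
# ~113 ms before any routing, queuing or processing delay is added.
# Light in optical fibre travels at roughly two-thirds of this speed,
# so real-world figures are typically two to three times higher.
```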

So, in summary, cloud applications are an enormous advance on the early native-application model, although the distribution of work between the local client and the remote server is constantly changing. In almost every respect, cloud-based applications win over applications based on local area networks, but it is important not to forget that there are some risks.

Dr John Yardley, Managing Director, JPY Limited and Threads Software Ltd
John began his career as a researcher in computer science and electronic engineering at the National Physical Laboratory (NPL), where he undertook a PhD in speech recognition. In early 2019, John founded Threads Software Ltd as a spin-off from his company JPY Ltd to commercialise and exploit the Threads Intelligent Message Hub, developed originally by JPY Ltd. Today, JPY represents manufacturers of over 30 software products, distributed through a channel of 100 specialist resellers. The Threads software allows easy searching and retrieval of data by uniquely collecting, aggregating and indexing all required digital business communications – both written and spoken.
John brings a depth of understanding of a wide range of the technologies that underpin the software industry. He has a PhD in Electrical Engineering from the University of Essex and a BSc in Computer Science from City University, London.
