Protecting the data driving the industry 4.0 revolution

Sensitive data under attack

Protecting data and software from unauthorized access and modification requires cryptographic algorithms, specifically encryption and digital signature algorithms. Together these ensure that data or software has not been altered, that it comes from a known source, and that it remains confidential.

For the past 20 years, the current generation of encryption and digital signature algorithms has served this purpose well, but with the advent of quantum computers this could be about to change. Various experts predict that quantum computers will be available to state actors within 3-10 years and, once available, could be used to break these algorithms.

What’s being done about it?

To counter this threat, on July 5, 2022, the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) announced the first results of its six-year effort to standardize new quantum-resistant algorithms. Three new digital signature algorithms and one encryption algorithm have been announced, with further encryption algorithms to follow.

To put this into context, the last time such an exercise was conducted was in 1997-2001, when the elliptic curve digital signature algorithm and the AES encryption algorithm were chosen and standardized. The elliptic curve algorithm has since been shown to be breakable by a sufficiently large quantum computer, and AES is significantly weakened (though not outright broken) by quantum search, hence the need for replacements.

What’s the significance of this?

In a nutshell, if you have data or software that relies on these algorithms today, but still needs to be protected from unauthorized access and modification in the future, then you need to think about moving to the new algorithms as soon as possible. Various state actors are already harvesting encrypted data (e.g. intellectual property) so that it can be decrypted in the future. Similarly, safety-critical software that is digitally signed today could be maliciously updated, causing a future safety, environmental or security hazard.

But there’s a new challenge…

Having decided that you need to protect your sensitive data and software with the new algorithms, the new challenge is that it is going to take a long time for the new algorithms to become available in fully tested, trusted forms. In the rush to implement them, it is inevitable that additional vulnerabilities will be introduced, and this in turn could lead to data and software compromise.

Let’s take a step back

What are digital signature and encryption algorithms?  

Put simply, these are sequences of mathematical steps that are used to protect data.  

The data, plus some additional secret information, are combined with the aim of keeping the data confidential (encryption) and of ensuring that it hasn’t been altered (digital signatures).

Encryption makes use of a random number known as a ‘secret key’ and uses this to scramble the data. The same secret key is used to unscramble the data at a later time. Digital signatures are slightly different as there are two keys, one called the ‘private key’ and the other the ‘public key’. The digital signature is created with the private key and can be verified with the public key. Digital signatures can be thought of as acting like a handwritten signature on a document.
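The distinction between a single shared secret key and a private/public key pair can be sketched in a few lines of Python. Both pieces below are educational toys with illustrative parameters only (a repeating-XOR "cipher" and textbook RSA with tiny primes), chosen to show the key relationships, not to be secure:

```python
import hashlib

# --- Symmetric encryption: the same secret key scrambles and unscrambles ---
def xor_cipher(data: bytes, secret_key: bytes) -> bytes:
    # Toy stream cipher: XOR each byte with the repeating key. NOT secure.
    return bytes(b ^ secret_key[i % len(secret_key)] for i, b in enumerate(data))

secret_key = b"0123456789abcdef"
ciphertext = xor_cipher(b"industrial sensor reading: 42", secret_key)
plaintext = xor_cipher(ciphertext, secret_key)  # same key recovers the data

# --- Digital signature: private key signs, public key verifies ---
# Textbook RSA with tiny primes (p=61, q=53), purely for illustration.
n, e, d = 3233, 17, 2753  # modulus, public exponent, private exponent

def sign(message: bytes) -> int:
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)               # created with the *private* key

def verify(message: bytes, signature: int) -> bool:
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest  # checked with the *public* key
```

Note how verification needs only the public key, which is why a digital signature, like a handwritten one, can be checked by anyone but created by only one party.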

In order to keep the secret and private keys protected, they are typically held within secure, tamper-resistant cryptographic hardware, i.e. chips or devices that are resistant to various forms of external attack. For individual users, the chip may be placed within a physical dongle or embedded within another device, e.g. a smartphone. For organizations with multiple keys, protection is usually provided by a larger device known as a Hardware Security Module (HSM).

Cryptographic algorithms are designed to be difficult to break, i.e. to make it impractical to deduce the secret and private keys used to protect the data. The expectation was that deducing these keys would take many years using conventional computers, which is why quantum computers pose such a threat.

D-Sig’s approach

D-Sig is taking two approaches to this:

The first is to adopt a cautious approach when implementing the new algorithms, making sure the implementations are truly robust. In addition, the algorithms will be implemented within protected cryptographic hardware devices, and these devices will have gone through a formal evaluation process.

The second is to give the algorithms time to mature, allowing large-scale cryptanalysis to take place and confirm that they are truly quantum-resistant.

Until D-Sig is confident on both points, it is adopting a hybrid ‘belt and braces’ approach, implementing the new cryptographic algorithms but maintaining the current algorithms in tandem. In practice, this means implementing double encryption and an additional digital signature. The original algorithms are used to sign and encrypt the data, and then the result is signed and encrypted with the new algorithms. This provides a fall-back to the existing algorithms with the potential benefit of the new ones.
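The layering described above can be sketched as follows. The four algorithm calls are hypothetical placeholders (in a real deployment they would invoke actual classic and post-quantum implementations, typically inside an HSM); the sketch only shows the ordering of the two layers:

```python
# Hypothetical placeholders standing in for real algorithm calls.
def classic_encrypt(data: bytes) -> bytes:   # e.g. today's encryption algorithm
    return b"C[" + data + b"]"

def classic_sign(data: bytes) -> bytes:      # e.g. today's signature algorithm
    return b"sigC(" + data[:8] + b")"

def pq_encrypt(data: bytes) -> bytes:        # e.g. a new quantum-resistant algorithm
    return b"Q[" + data + b"]"

def pq_sign(data: bytes) -> bytes:
    return b"sigQ(" + data[:8] + b")"

def hybrid_protect(data: bytes) -> tuple[bytes, bytes]:
    # Layer 1: sign and encrypt with the current (classic) algorithms...
    inner = classic_encrypt(data)
    inner_sig = classic_sign(data)
    # Layer 2: ...then sign and encrypt that result with the new algorithms.
    outer = pq_encrypt(inner + inner_sig)
    outer_sig = pq_sign(outer)
    return outer, outer_sig
```

The design point is that an attacker must defeat both layers: if a flaw is found in a new algorithm's implementation, the classic inner layer still protects the data, and vice versa.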

Supporting this approach requires new forms of public key certificates that can embed the public keys of the new digital signature algorithms. D-Sig is able to provide this capability as it operates its own software for issuing these certificates.

Keeping one step ahead

Technological advances, in particular the opportunity to connect devices and analyze data in real time and/or using artificial intelligence, have the potential to open up a wealth of opportunities across industries as diverse as manufacturing and transport to chemical and process engineering and healthcare. 

Ensuring the security of that data is key to allowing industries to release previously untapped potential and productivity gains. However, those same advances mean that previously hypothetical security threats are now moving into the realms of near-future reality. 

The recent standardization of quantum-resistant algorithms is a key landmark in the fight but staying one step ahead of bad actors is going to be an ongoing challenge for everyone, across the security industry.

Barry Arnott, founder and CTO of D-Sig
