Zero trust architecture is not just ‘nice to have’

Paul German, CEO, Certes Networks, outlines why adopting a Zero Trust mindset to data and cyber security is now an operational necessity, and explores the essential components organizations must embrace to achieve this.
The US National Institute of Standards and Technology’s (NIST) Special Publication SP 800-207 has raised the table stakes for cybersecurity best practice. While not mandatory, the federal agency’s role in enhancing economic security cannot be underestimated. As such, its guidance refining the concept of Zero Trust, and its high-level roadmap for implementing a standardized Zero Trust Architecture, cannot be ignored either.
Zero Trust

The concept of ‘zero trust’ is not new; it was originally defined in Stephen Paul Marsh’s 1994 doctoral thesis on computational trust, and became a key cybersecurity concept when Forrester’s John Kindervag reignited it in the late 2000s. The core idea is that attacks can originate from inside an organization’s network as well as from outside it.

However, until recently the zero trust debate has remained, in my view, focused solely on authenticating the user within the system, rather than taking a more holistic approach that combines user authentication with access to sensitive data through protected micro-segments. NIST’s Special Publication changes this: the focus of zero trust is no longer the network, but the data that traverses it.

At its core, NIST’s Special Publication decouples data security from the network. Its key tenets of policy definition and dynamic policy enforcement, micro-segmentation and observability offer a new standard of Zero Trust Architecture (ZTA) for which today’s enterprise is responsible.

Dynamic policy aligned to business intent

As data owners, organizations are responsible for protecting their sensitive information. Moreover, with increasing regulation that specifically targets the protection of this sensitive data, it is more important than ever that organizations adopt a cybersecurity stance that can ensure – and maintain – compliance, or information assurance. However, not all data has the same level of sensitivity.

Under the latest zero trust standards, data should be classified according to its level of sensitivity and the business intent behind it. That business intent defines the organization’s operational policy for how data is handled and accessed: when, where and by whom. Micro-segmentation then protects each data class from external compromise and isolates it from other classifications.
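To make this concrete, here is a minimal Python sketch of how data classifications and a business-intent policy might be expressed. The classification names, roles, hours and locations are illustrative assumptions, not part of the NIST guidance:

```python
from dataclasses import dataclass

# Hypothetical business-intent policy per data classification.
# Each micro-segment enforces who may access the data, when, and from where.
POLICIES = {
    "public":       {"roles": {"staff", "finance", "contractor"},
                     "hours": range(0, 24), "locations": {"office", "remote"}},
    "confidential": {"roles": {"staff", "finance"},
                     "hours": range(7, 20), "locations": {"office", "remote"}},
    "restricted":   {"roles": {"finance"},
                     "hours": range(9, 18), "locations": {"office"}},
}

@dataclass(frozen=True)
class AccessRequest:
    role: str       # who is asking
    location: str   # where the request originates
    hour: int       # when (0-23)

def is_permitted(data_class: str, req: AccessRequest) -> bool:
    """Evaluate a request against the policy attached to a data class."""
    policy = POLICIES[data_class]
    return (req.role in policy["roles"]
            and req.hour in policy["hours"]
            and req.location in policy["locations"])
```

Access to one classification implies nothing about another: a request that passes for "confidential" data is evaluated afresh, against a different policy, for "restricted" data.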

In addition, enterprises are encouraged to observe and collect as much information as possible about their asset security posture, network traffic and access requests; process that data; and use any insight gained to dynamically improve policy creation and enforcement.

Authentication and authorization

Under NIST’s zero trust standards, access to each enterprise resource is granted on a per-session basis. The decision combines observable component relationships, such as the state of the client identity, the application or service, and the requesting asset, may draw on other behavioral and environmental attributes, and is enforced through operational policy.

Authentication and authorization to one resource does not grant access to another resource. It is also dynamic, requiring a constant cycle of obtaining access, scanning and assessing threats, adapting, and continually re-evaluating trust in ongoing communication.
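That continuous cycle of re-evaluating trust can be sketched in a few lines of Python. The trust signals, weights and threshold below are hypothetical assumptions chosen for illustration, not values from the NIST publication:

```python
# Hypothetical trust signals and weights; a real policy engine would
# draw these from telemetry (device posture, location, MFA state, ...).
SIGNAL_WEIGHTS = {"device_patched": 0.4, "known_location": 0.3, "mfa_recent": 0.3}
TRUST_THRESHOLD = 0.6

def trust_score(signals: dict) -> float:
    """Sum the weights of the signals currently observed to be true."""
    return sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name))

class Session:
    """Per-session access to ONE resource; re-evaluated on every interaction."""
    def __init__(self, resource: str):
        self.resource = resource
        self.active = False

    def evaluate(self, signals: dict) -> bool:
        # Trust is never granted once and kept: each call re-scores the
        # session, and access is revoked the moment the score drops.
        self.active = trust_score(signals) >= TRUST_THRESHOLD
        return self.active
```

Note that a `Session` is bound to a single resource; opening a session to another resource means running the whole evaluation again, mirroring the per-resource grants described above.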

Cybersecurity best practice demands fine-grained policies, with authentication and authorization performed on a per-packet basis so that only the required resources are accessible. Layer-4 encryption protects data as it transits between policy enforcement points; because only the payload is encrypted and the packet header remains in the clear, full observability is retained and security policies can be enforced at a granular level.
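A structural sketch of this payload-only protection is below, assuming a fixed header length for simplicity. The XOR transform is purely a stand-in for a real authenticated cipher such as AES-GCM and must never be used in production:

```python
def encrypt_payload(payload: bytes, key: bytes) -> bytes:
    # Placeholder transform standing in for a real AEAD cipher.
    # XOR is used here only so the sketch is self-contained and reversible.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(payload))

def protect_packet(packet: bytes, header_len: int, key: bytes) -> bytes:
    """Layer-4 style protection: the header stays in the clear so network
    tools retain observability; only the payload is transformed."""
    header, payload = packet[:header_len], packet[header_len:]
    return header + encrypt_payload(payload, key)
```

Because the header survives untouched, monitoring tools between enforcement points can still classify, route and police the flow, which is exactly the observability property described above.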

Network visibility and observability tools are the linchpin: they provide real-time contextual metadata that enables rapid detection of out-of-policy data, and fast response and remediation for any non-compliant traffic flow or policy change, maintaining the required security posture on a continuous basis.


No compromise

Fundamentally, a Zero Trust posture must be achievable without compromising the network’s performance, allowing users with authenticated and authorized access to the data they need to do their jobs seamlessly.

Organizations need to be able to secure data in transit, across any network, with zero impact on performance, scalability or operational visibility. As the latest NIST zero trust standards advocate, decoupling security from network hardware in this way is a unique approach that enables security teams to be confident their organization’s data is assured regardless of what is happening to the network, finally putting the focus for cybersecurity best practice where it belongs: on the data.


Amber Donovan-Stevens

Amber is a Content Editor at Top Business Tech
