Zero trust architecture is not just ‘nice to have’

Paul German, CEO, Certes Networks, outlines why adopting a Zero Trust mindset to data and cybersecurity is now an operational necessity, and explores the essential components organizations must embrace to achieve this.
The US National Institute of Standards and Technology's (NIST) recent Special Publication (SP 800-207) has changed the table stakes for cybersecurity best practice. While not mandatory, the federal agency's role in enhancing economic security cannot be underestimated. As such, its guidance on refining the concept of Zero Trust, and its high-level roadmap for how organizations can implement a standardized approach to a Zero Trust Architecture, cannot be ignored either.
Zero Trust

The concept of ‘zero trust’ is not new; originally defined in Stephen Paul Marsh’s doctoral thesis on computational security in 1994, it became a key cybersecurity concept when Forrester’s John Kindervag reignited it in the late 2000s. The idea is that attacks can originate from inside an organization’s network as well as from outside it.

However, until recently, the debate around zero trust has remained – in my view – focused solely on authenticating the user within the system, rather than taking a more holistic approach that considers both user authentication and access to sensitive data via protected micro-segments. NIST’s Special Publication has changed this: no longer is the network the focus of zero trust; finally, it is the data that traverses the network.

At its core, NIST’s Special Publication decouples data security from the network. Its key tenets of policy definition and dynamic policy enforcement, micro-segmentation and observability offer a new standard of Zero Trust Architecture (ZTA) for which today’s enterprise is responsible.

Dynamic policy aligned to business intent

As data owners, organizations are responsible for protecting their sensitive information. Moreover, with increasing regulation that specifically targets the protection of this sensitive data, it is more important than ever that organizations adopt a cybersecurity stance that can ensure – and maintain – compliance, or information assurance. However, not all data has the same level of sensitivity.

Under the latest zero trust standards, data needs to be classified according to its level of sensitivity and its business intent. That business intent then defines the organization’s operational policy for how data is handled and accessed – when, where and by whom – with micro-segmentation protecting each data class from external compromise and isolating it from other data classifications.
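The article does not prescribe a specific implementation, but the idea of classifying data and binding each class to a handling policy and micro-segment can be sketched as follows. All names here (`DataPolicy`, the classification labels, the segment identifiers) are hypothetical illustrations, not part of any named product or the NIST specification:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each data classification carries its own handling
# policy, including who may access it and which micro-segment isolates it.
@dataclass(frozen=True)
class DataPolicy:
    classification: str        # e.g. "public", "internal", "restricted"
    allowed_roles: frozenset   # roles permitted to access this class
    segment: str               # micro-segment that isolates this class
    encrypt_in_transit: bool   # whether payload encryption is required

POLICIES = {
    "public":     DataPolicy("public", frozenset({"any"}), "seg-public", False),
    "internal":   DataPolicy("internal", frozenset({"employee"}), "seg-internal", True),
    "restricted": DataPolicy("restricted", frozenset({"finance", "hr"}), "seg-restricted", True),
}

def may_access(role: str, classification: str) -> bool:
    # Deny by default: unknown classifications grant nothing.
    policy = POLICIES.get(classification)
    if policy is None:
        return False
    return "any" in policy.allowed_roles or role in policy.allowed_roles
```

The deny-by-default lookup reflects the zero trust premise: an access request that cannot be matched to an explicit policy is refused rather than waved through.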

In addition, enterprises are encouraged to observe and collect as much information as possible about their asset security posture, network traffic and access requests; process that data; and use any insight gained to dynamically improve policy creation and enforcement.

Authentication and authorization

Under NIST’s zero trust standards, access to individual enterprise resources is granted on a per-session basis, determined by the observable state of the client identity, the application or service, and the requesting asset – potentially alongside other behavioral and environmental attributes – and enforced through operational policy.

Authentication and authorization to one resource does not grant access to another resource. It is also dynamic, requiring a constant cycle of obtaining access, scanning and assessing threats, adapting, and continually re-evaluating trust in ongoing communication.
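This per-session, per-resource model can be illustrated with a minimal sketch. The function and field names below are assumptions for illustration only; the point is that every request is evaluated afresh from observable signals, and entitlement to one resource implies nothing about another:

```python
# Hypothetical per-session policy decision: deny by default, grant only
# when identity and device posture check out AND the identity is
# explicitly entitled to this specific resource.
def grant_session(identity_verified: bool,
                  device_posture_ok: bool,
                  resource: str,
                  entitlements: dict) -> bool:
    if not (identity_verified and device_posture_ok):
        return False
    return resource in entitlements.get("resources", set())

# Entitlement to one resource does not carry over to another:
entitlements = {"resources": {"payroll-api"}}
grant_session(True, True, "payroll-api", entitlements)   # evaluated per session
grant_session(True, True, "hr-database", entitlements)   # separate decision, denied
```

In a real deployment the inputs would themselves be re-assessed continuously – a change in device posture mid-session would cause the next evaluation to deny, matching the constant cycle of re-evaluating trust described above.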

Cybersecurity best practice demands that, by creating fine-grained policies, authentication and authorization are performed on a ‘per-packet’ basis, allowing access only to the resources required. Layer-4 encryption protects data as it transits between policy enforcement points. It provides full observability by encrypting the payload only, leaving the packet header in the clear and allowing for granular enforcement of security policies.
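The payload-only property can be shown with a toy sketch. This is not real cryptography and not any vendor’s implementation – the SHA-256-derived keystream is a placeholder, and a real deployment would use an authenticated cipher such as AES-GCM – but it illustrates how the header stays readable for network tooling while the payload is protected:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Illustrative placeholder keystream only; NOT secure cryptography.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_payload(packet: bytes, header_len: bytes or int, key: bytes) -> bytes:
    # Encrypt only the payload; the header is passed through untouched,
    # so observability tools can still read addressing and flow metadata.
    header, payload = packet[:header_len], packet[header_len:]
    ks = keystream(key, len(payload))
    cipher = bytes(p ^ k for p, k in zip(payload, ks))
    return header + cipher

# XOR with the same keystream decrypts, so the same function round-trips.
```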

Network visibility and observability tools are the linchpin: they provide real-time contextual metadata that enables rapid detection of out-of-policy data, and fast response and remediation of any non-compliant traffic flow or policy change, maintaining the required security posture on a continuous basis.

No compromise

Fundamentally, a Zero Trust posture must be achievable without compromising the network’s performance, allowing users with authenticated and authorized access to the data they need to do their jobs seamlessly.

Organizations need to be able to secure data in transit, across any network, with zero impact on performance, scalability or operational visibility. As the latest NIST zero trust standards advocate, decoupling security from network hardware in this way enables security teams to be confident that their organization’s data is assured, regardless of what is happening to the network – finally putting the focus for cybersecurity best practice where it belongs: the data.
