Zero trust architecture is not just ‘nice to have’

Paul German, CEO, Certes Networks, outlines why adopting a Zero Trust mindset to data and cybersecurity is now an operational necessity, and explores the essential components organizations must embrace to achieve this.
The US National Institute of Standards and Technology's (NIST) recent Special Publication (SP 800-207) has raised the table stakes for cybersecurity best practice. While not mandatory, the federal agency's role in enhancing economic security cannot be underestimated. As such, its guidance refining the concept of Zero Trust, and its high-level roadmap for implementing a standardized Zero Trust Architecture, cannot be ignored either.
Zero Trust

The concept of ‘zero trust’ is not new; originally defined in Stephen Paul Marsh’s 1994 doctoral thesis on computational security, it became a key cybersecurity concept when Forrester’s John Kindervag reignited it in the late 2000s. The idea is that attacks can come from within an organization’s network as well as from outside it.

However, until recently the debate around zero trust has, in my view, remained focused solely on authenticating the user within the system, rather than taking a more holistic approach that combines user authentication with access to sensitive data through protected micro-segments. NIST’s Special Publication changes this: the focus of zero trust is no longer the network but, finally, the data that traverses it.

At its core, NIST’s Special Publication decouples data security from the network. Its key tenets of policy definition and dynamic policy enforcement, micro-segmentation and observability offer a new standard of Zero Trust Architecture (ZTA) for which today’s enterprise is responsible.

Dynamic policy aligned to business intent

As data owners, organizations are responsible for protecting their sensitive information. Moreover, with increasing regulation that specifically targets the protection of this sensitive data, it is more important than ever that organizations adopt a cybersecurity stance that can ensure – and maintain – compliance, or information assurance. However, not all data has the same level of sensitivity.

Under the latest zero trust standards, data needs to be classified according to differing levels of sensitivity and the business intent of that data. This business intent needs to define an organization’s operational policy around how data is handled and accessed, when, where and by whom, with micro-segmentation protecting each data class from external compromise and providing isolation from other data classifications.
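As a rough illustration of how classification, business-intent policy and micro-segmentation fit together, the sketch below models a data class bound to a micro-segment and a policy stating who may access it and when. All names, fields and thresholds here are assumptions for illustration, not taken from SP 800-207.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataClass:
    name: str
    sensitivity: int          # e.g. 1 = public .. 4 = restricted (illustrative scale)
    segment: str              # micro-segment isolating this class from others

@dataclass(frozen=True)
class Policy:
    data_class: DataClass
    allowed_roles: frozenset  # who may handle the data
    allowed_hours: range      # when access is permitted (UTC hours, illustrative)

def is_access_allowed(policy: Policy, role: str, hour: int, segment: str) -> bool:
    """Grant access only when role, time and segment all match the policy."""
    return (role in policy.allowed_roles
            and hour in policy.allowed_hours
            and segment == policy.data_class.segment)

# Hypothetical sensitive data class and its business-intent policy
pii = DataClass("customer-pii", sensitivity=4, segment="seg-pii")
pii_policy = Policy(pii, frozenset({"support", "dpo"}), range(8, 18))

print(is_access_allowed(pii_policy, "support", 10, "seg-pii"))  # True
print(is_access_allowed(pii_policy, "support", 10, "seg-hr"))   # False: wrong segment
```

The point of the sketch is that the segment check isolates each data class: even a correctly authenticated role is refused outside the class's own micro-segment.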

In addition, enterprises are encouraged to observe and collect as much information as possible about their asset security posture, network traffic and access requests; process that data; and use any insight gained to dynamically improve policy creation and enforcement.
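One minimal way to picture that feedback loop: aggregate access decisions and flag segments whose denial counts suggest the policy, or the traffic hitting it, needs review. The log format and threshold below are assumptions for illustration.

```python
from collections import Counter

def segments_to_review(access_log, threshold=3):
    """access_log: iterable of (segment, allowed) tuples.
    Returns segments whose denial count reaches the threshold,
    as candidates for policy tightening or investigation."""
    denials = Counter(seg for seg, allowed in access_log if not allowed)
    return sorted(seg for seg, n in denials.items() if n >= threshold)

# Four denied attempts against the PII segment, one elsewhere
log = [("seg-pii", False)] * 4 + [("seg-hr", False), ("seg-pii", True)]
print(segments_to_review(log))  # ['seg-pii']
```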

Authentication and authorization

Under NIST’s zero trust standards, access to individual enterprise resources is granted on a per-session basis, determined by a combination of component relationships (such as the observable state of client identity, application/service, and the requesting asset), possibly including other behavioral and environmental attributes, together with operational policy enforcement.

Authentication and authorization to one resource does not grant access to another resource. It is also dynamic, requiring a constant cycle of obtaining access, scanning and assessing threats, adapting, and continually re-evaluating trust in ongoing communication.
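A hedged sketch of that per-session, continuously re-evaluated model: a grant applies to exactly one resource, expires quickly, and is re-checked on every use against the current threat state. The class and field names are illustrative, not from the NIST publication.

```python
import time

class SessionGrant:
    """A short-lived authorization for one resource only."""

    def __init__(self, user: str, resource: str, ttl_seconds: int = 60):
        self.user = user
        self.resource = resource
        self.expires_at = time.monotonic() + ttl_seconds

    def permits(self, resource: str, threat_detected: bool = False) -> bool:
        """Re-evaluate trust on each request; a grant never
        transfers to another resource and is revoked on threat."""
        return (resource == self.resource
                and not threat_detected
                and time.monotonic() < self.expires_at)

grant = SessionGrant("alice", "payroll-db")
print(grant.permits("payroll-db"))                        # True
print(grant.permits("hr-db"))                             # False: different resource
print(grant.permits("payroll-db", threat_detected=True))  # False: trust revoked
```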

Cybersecurity best practice demands that, by creating fine-grained policies, authentication and authorization are performed on a per-packet basis, granting access only to the resources required. Layer 4 encryption protects data as it transits between policy enforcement points, and it provides full observability by encrypting only the payload, leaving the packet header in the clear and allowing granular enforcement of security policies.
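The payload-only idea can be shown with a toy example: the header stays readable so enforcement points can inspect and apply policy, while only the payload is obscured. The XOR keystream below is NOT real encryption and is only there to make the structure visible; a production system would use an authenticated cipher such as AES-GCM.

```python
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """Toy deterministic keystream built from SHA-256 (illustration only)."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def protect_payload(header: bytes, payload: bytes, key: bytes) -> bytes:
    """Mask the payload while leaving the header in the clear."""
    masked = bytes(a ^ b for a, b in zip(payload, keystream(key, len(payload))))
    return header + masked  # header remains observable for policy enforcement

key = os.urandom(32)
packet = protect_payload(b"SRC=10.0.0.1;DST=10.0.0.2;", b"secret data", key)
print(packet.startswith(b"SRC=10.0.0.1"))  # True: header observable
```

Because XOR with the same keystream is its own inverse, the receiving enforcement point recovers the payload with the shared key, while every hop in between can still read and enforce policy on the header.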

Network visibility and observability tools are the linchpin: they provide real-time contextual metadata that enables rapid detection of out-of-policy data and a fast response to, and remediation of, any non-compliant traffic flow or policy change, maintaining the required security posture on a continuous basis.
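A simple sketch of that detection step: compare observed flow metadata against an allow-list of permitted segment-to-segment paths and surface anything out of policy. The policy table and flow fields are assumptions for illustration.

```python
# Hypothetical allow-list of permitted (source, destination) segment pairs
ALLOWED_FLOWS = {("seg-app", "seg-pii"), ("seg-app", "seg-hr")}

def non_compliant(flows):
    """Return flows whose (src, dst) segment pair is not in policy."""
    return [f for f in flows if (f["src"], f["dst"]) not in ALLOWED_FLOWS]

flows = [
    {"src": "seg-app", "dst": "seg-pii"},
    {"src": "seg-hr",  "dst": "seg-pii"},  # lateral movement: not in policy
]
print(non_compliant(flows))  # [{'src': 'seg-hr', 'dst': 'seg-pii'}]
```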


No compromise

Fundamentally, a Zero Trust posture must be achievable without compromising the network’s performance, giving users with authenticated and authorized access seamless access to the data they need to do their jobs.

Organizations need to be able to secure data in transit, across any network, with zero impact on performance, scalability or operational visibility. As the latest NIST zero trust standards advocate, decoupling security from network hardware in this way is a unique approach that enables security teams to be confident that their organization’s data is assured regardless of what is happening to the network, finally putting the focus for cybersecurity best practice where it belongs: on the data.
