Protecting Data Irrespective of Infrastructure

Cyber security threats have escalated so sharply in recent years that most companies globally now accept that a data breach is almost inevitable. But what does this mean for data protection and compliance officers, as well as the senior managers now personally liable for protecting sensitive company, customer and partner data?

Investing in security infrastructure is not enough to demonstrate compliance in protecting data. Software Defined Wide Area Networks (SD WANs), firewalls and Virtual Private Networks (VPNs) play a role within an overall security posture, but they are infrastructure solutions and do not safeguard data. What happens when the data crosses outside the network to the cloud or a third-party network? How is the business data on the LAN side protected if an SD WAN vulnerability or misconfiguration is exploited? What additional vulnerability is created by relying on the same network security team to both set policies and manage the environment, in direct conflict with Zero Trust guidance?

The only way to ensure the business is protected and compliant is to abstract data protection from the underlying infrastructure. Simon Pamplin, CTO, Certes Networks, insists it is now essential to shift the focus, stop relying on infrastructure security and use Layer 4 encryption to proactively protect business sensitive data irrespective of location.

Acknowledging Escalating Risk

Attitudes to data security need to change fast because today’s infrastructure-led model is creating too much risk. According to IBM’s 2022 Cost of a Data Breach report, 83% of organisations studied have suffered more than one data breach – in other words, most companies now expect breaches to recur. Given this reality, the question has to be asked: why are businesses still reliant on a security posture focused on locking the infrastructure down?

Clearly that doesn’t work. While not every company will experience the catastrophic impact of the four-year-long data breach that ultimately affected 300 million guests of Marriott Hotels, attackers are routinely spending months inside businesses looking for data. In 2022, it took an average of 277 days – about nine months – to identify and contain a breach. Throughout this time, bad actors have access to corporate data: they have the time to explore and identify the most valuable information, and the chance to copy and/or delete it, depending on the attack’s objective.

The costs are huge: the average cost of a data breach in the US is now $9.44 million ($4.35 million is the global average). From regulatory fines – which are increasingly punitive across the globe – to the impact on share value, customer trust, even business partnerships, the long-term implications of a data breach are potentially devastating.

Misplaced Trust in Infrastructure

Yet these affected companies have ostensibly robust security postures. They have highly experienced security teams and extensive investment in infrastructure. But they have bought into the security industry’s long-perpetuated myth that locking down infrastructure, using VPNs, SD WANs and firewalls, will protect a business’s data.

As breach after breach has confirmed, relying on infrastructure security fails to provide the level of control needed to safeguard data from bad actors. For the vast majority of businesses, data is rarely restricted to the corporate network environment. It is in the cloud, on a user’s laptop, on a supplier’s network. Those perimeters cannot be controlled, especially for any business that is part of supply chains and third-party networks. How does Vendor A protect third-party Supplier B when the business has no control over their network? Using traditional, infrastructure-dependent security, it can’t.

Furthermore, while an SD WAN is a more secure way of sending data across the Internet, it only provides control from the network egress point to the end destination. It provides no control over what happens on an organisation’s LAN side. It cannot prohibit data being forwarded on to another location or person. It is also widely accepted that SD WAN misconfiguration adds a risk of breach, leaving data exposed – as shown by the public CVEs (Common Vulnerabilities and Exposures) available to review on most SD WAN vendors’ websites. And while SD WANs, VPNs and firewalls use IPsec as an encryption protocol, their approach to encryption is flawed: the encryption keys and their management are handled by the same group, in direct contravention of the accepted Zero Trust principle of separation of duties.

Protect the Data

It is, therefore, essential to take another approach, to focus on protecting the data. By wrapping security around the data, a business can safeguard this vital asset irrespective of infrastructure. Adopting Layer 4, policy-based encryption ensures the data payload is protected for its entire journey – whether it was generated within the business or by a third party. 

If it crosses a misconfigured SD WAN, the data is still safeguarded: it is encrypted, making it valueless to any hacker. However long an attack may continue, however long an individual or group may be camped out in the business looking for data to use in a ransomware attack, if the sensitive data is encrypted, there is nothing to work with.

Because only the payload data is encrypted, while header data remains in the clear, there is minimal disruption to network services or applications, and troubleshooting an encrypted network becomes easier.
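The distinction can be sketched in a few lines of Python. This is an illustration only, not Certes Networks’ implementation: the segment fields, the shared policy key and the hash-based keystream (a toy stand-in for a production cipher such as AES-GCM) are all assumptions made for the example. The point it demonstrates is that port and sequence fields stay readable while the payload does not:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Segment:
    """A simplified Layer 4 segment: clear-text header fields plus a payload."""
    src_port: int
    dst_port: int
    seq: int
    payload: bytes

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Derive n pseudo-random bytes from key + nonce.
    # Toy construction for illustration only - NOT a secure cipher.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_payload(seg: Segment, key: bytes) -> Segment:
    # Payload-only encryption: ports and sequence numbers remain in the
    # clear, so routing, NAT and troubleshooting tools keep working,
    # while the data itself is unreadable to anyone without the key.
    nonce = seg.seq.to_bytes(8, "big")
    ks = keystream(key, nonce, len(seg.payload))
    cipher = bytes(a ^ b for a, b in zip(seg.payload, ks))
    return Segment(seg.src_port, seg.dst_port, seg.seq, cipher)

key = b"shared-policy-key"
seg = Segment(49152, 443, 1, b"account=1234;balance=999")
enc = encrypt_payload(seg, key)

assert enc.dst_port == seg.dst_port          # header untouched
assert enc.payload != seg.payload            # payload unreadable
assert encrypt_payload(enc, key).payload == seg.payload  # XOR round-trips
```

Even in this toy form, a misconfigured device or an attacker on the path sees intact headers and an opaque payload, which is the property the Layer 4 approach relies on.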

This mindset shift protects not only the data and, by default, the business, but also the senior management team responsible – indeed personally liable – for security and information protection compliance. Rather than placing the burden of data protection onto network security teams, this approach realises the true goal of zero trust: separating policy setting responsibility from system administration. The security posture is defined from a business standpoint, rather than a network security and infrastructure position – and that is an essential and long overdue mindset change.

Conclusion

This mindset change is becoming critical – from both a business and regulatory perspective. Over the past few years, regulators globally have increased their focus on data protection. From punitive fines – such as the European Union’s General Data Protection Regulation (GDPR) maximum of €20 million or 4% of global annual turnover, whichever is higher – to the risk of imprisonment, the rise in regulation across China and the Middle East reinforces a clear global recognition that data loss has a material cost to businesses.

Until recently, however, regulators have not been prescriptive about the way in which that data is secured – an approach that has allowed the ‘lock down the infrastructure’ security model to continue. This attitude is changing. In North America, new laws demand encryption between utilities’ command and control centres to safeguard national infrastructure. This approach is set to expand as regulators and businesses recognise that the only way to safeguard data crossing increasingly dispersed infrastructures, from SD WAN to the cloud, is to encrypt it – and to do so in a way that doesn’t impede the ability of the business to function.

It is now essential that companies recognise the limitations of relying on SD WANs, VPNs and firewalls. Abstracting data protection from the underlying infrastructure is the only way to ensure the business is protected and compliant.

Simon Pamplin

CTO, Certes Networks
