Risk Before Popularity: 4 Factors for Determining Security Vulnerability

Security

Stephen Roostan, VP EMEA at Kenna Security, looks at identifying which kinds of security vulnerability represent the greatest risk.

For all the wrong reasons, security breaches capture the headlines, especially when a high-profile organisation has come under attack. In 2020 alone, some of the world’s best-known brands have fallen victim to external threats, including the likes of Zoom, Twitter and Nintendo. Each incident has generated a huge amount of buzz that has understandably left many businesses worried about their own ability to manage risk effectively.

Of course, an increased focus on an organisation’s own security posture and the external threats it may face can be extremely beneficial, especially in the current environment, where many employees are working remotely and the attack surface has grown significantly. A level of internal reflection is important, but these headlines risk pulling focus away from more dangerous vulnerabilities that don’t command the same level of media attention.

Research from Kenna Security and the Cyentia Institute shows that only 5 percent of vulnerabilities fall into the ‘high-risk’ category, meaning they could be weaponised in some way. There will always be attacks that both garner a large amount of attention and warrant an equal amount of action – such as the Heartbleed bug, which put millions of websites at risk through a flaw in the open-source OpenSSL cryptographic library. However, there are others that can be just as catastrophic yet seemingly go unnoticed.

Broadening Your Gaze


Framing vulnerability management efforts around security news headlines puts security teams in a precarious position. As the news and hype around security vulnerabilities escalates, it is becoming increasingly difficult for security teams to stay current with the threat landscape and determine how best to prioritise their efforts.

To ensure their time and energy yield the biggest dividends in reducing organisational risk, security teams need to prioritise their efforts based on the factors that really matter. Rather than sinking valuable resources into remediating headline-grabbing vulnerabilities that may pose little or no threat to the organisation, identifying the right vulnerabilities to fix depends on embracing an objective and consistent way to prioritise them.

Let’s take a look at the top four factors that security teams should consider when evaluating which vulnerabilities represent the greatest risk to a specific environment.


Remain wary of remote code execution

Remote code execution enables an attacker to access a computing device from anywhere in the world and make damaging changes, so it’s no surprise that it tops the wish list of hackers everywhere. Having established a way to run their own code on a remote system, attackers can inflict all kinds of chaos, including building botnets, stealing data, or infiltrating networks further.


Look out for Metasploit and blackhat exploits

Unfortunately, the same Metasploit framework that security teams use to pen-test their organisation’s defences and identify weaknesses has become the de facto standard for exploit development. When hackers use Metasploit, they’re not just creating tests, they’re creating real attacks. So whenever new modules appear in Metasploit, it’s a given that attackers are, or soon will be, leveraging them to exploit vulnerabilities.

For that reason, any vulnerability identified with a Metasploit module should be at the top of an enterprise’s list of vulnerabilities to patch or mitigate. Regular patching, running applications or processes with least privileges, and limiting network access to only trusted hosts can all play a pivotal role in limiting a hacker’s ability to leverage Metasploit.

Security teams are also well advised to consider blackhat exploit kits. Despite having a much lower proliferation rate than Metasploit, their intent is much clearer: using an exploit from a blackhat kit almost always signals malicious intent, and their presence should therefore be factored into the remediation decision-making process accordingly.


Keep a close eye on the ability to access networks

Network accessibility plays a major role when determining the severity of a security threat and the likelihood of a vulnerability’s exploitation. Today’s attackers will leverage automation to execute attacks at scale and are on the lookout for network-accessibility vulnerabilities that can form the basis of botnets as well as command-and-control communications.

Cross-site scripting, missing function-level access controls and patterns of excessive use are common examples of network-accessible vulnerabilities that should be prioritised for remediation.
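One practical proxy for network accessibility is the Attack Vector metric in a vulnerability’s CVSS v3 vector string, where `AV:N` marks a network-reachable flaw. The sketch below is illustrative only; the vulnerability IDs and vector strings are made-up examples, not real scores.

```python
# Flag vulnerabilities whose CVSS v3 vector marks them as network-accessible (AV:N).
# The IDs and vector strings below are illustrative, not real assessments.

def is_network_accessible(cvss_vector: str) -> bool:
    """Return True if the vector's Attack Vector (AV) metric is Network (N)."""
    metrics = dict(
        part.split(":", 1)
        for part in cvss_vector.split("/")
        if ":" in part and not part.startswith("CVSS")  # skip the version prefix
    )
    return metrics.get("AV") == "N"

vulns = {
    "VULN-A": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",  # network-accessible
    "VULN-B": "CVSS:3.1/AV:L/AC:L/PR:L/UI:N/S:U/C:H/I:N/A:N",  # local access only
}

network_facing = [vid for vid, vec in vulns.items() if is_network_accessible(vec)]
print(network_facing)  # → ['VULN-A']
```

In practice this check would run over vector strings pulled from a scanner export or the NVD feed, surfacing the network-facing subset for earlier remediation.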


Always consider the Exploit Database

The Exploit Database is a comprehensive repository of exploits and proof-of-concept attacks. Unfortunately, just like Metasploit, the Exploit Database is an invaluable resource for security teams and attackers alike. Attackers use it to find an exploit that will help compromise a known vulnerability within a target system.

Until a vulnerability appears in the Exploit Database, it is less likely to emerge as a significant broad-based threat to organisations. As soon as it does appear, however, organisations need to act fast to remediate it.

Straightening Out Priorities


Today’s enterprise security teams have tens of thousands of vulnerabilities to remediate. The reality is that most vulnerabilities that are going to be exploited are exploited within 40 to 60 days, yet it can take security teams up to 120 days to put remediation in place. So the pressure is on to identify the vulnerabilities that pose the biggest risk of exploitation for the organisation and fix those first.

As we’ve seen, while keeping up to date with security news is a great way of staying abreast of how the threat landscape is evolving, a vulnerability doesn’t need to be new or buzzworthy to pose a serious threat to the enterprise. All too often, headlines distract security teams from quickly and efficiently remediating the risks that haven’t made it into the hall of fame. What organisations need to remember is that the most important factor is where a vulnerability sits within their ecosystem: a high-risk vulnerability in a low-risk environment poses less of a threat than a medium-risk vulnerability in a highly accessible environment. Ultimately, visibility and context are everything. Media headlines and a ranking under the Common Vulnerability Scoring System (CVSS) can have little bearing. What matters is the risk the vulnerability poses to the individual organisation.

At the end of the day, effective vulnerability management requires a risk-based approach to prioritising remediation efforts, so that the right vulnerabilities are addressed at the right time. That means streamlining and accelerating efforts by evaluating each vulnerability’s most critical attributes to figure out how much danger it really poses. In this way, a security team’s limited time and resources can be focused on the vulnerabilities that actually pose the most risk to the organisation.
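The four factors discussed above can be folded into a simple, consistent triage score. The weights and sample records below are entirely illustrative assumptions for the sake of the sketch, not Kenna Security’s actual scoring model:

```python
# Toy risk-based triage: rank vulnerabilities by the four factors discussed above.
# Weights and sample records are illustrative assumptions, not a real scoring model.
from dataclasses import dataclass

@dataclass
class Vuln:
    vuln_id: str
    remote_code_execution: bool  # allows remote code execution
    metasploit_module: bool      # a Metasploit module exists
    blackhat_exploit: bool       # seen in a blackhat exploit kit
    network_accessible: bool     # reachable over the network
    in_exploit_db: bool          # listed in the Exploit Database

WEIGHTS = {
    "remote_code_execution": 4,
    "metasploit_module": 5,
    "blackhat_exploit": 5,
    "network_accessible": 3,
    "in_exploit_db": 2,
}

def risk_score(v: Vuln) -> int:
    """Sum the weights of every risk factor present on the vulnerability."""
    return sum(w for factor, w in WEIGHTS.items() if getattr(v, factor))

backlog = [
    Vuln("VULN-A", True, False, False, False, False),  # headline RCE, never weaponised
    Vuln("VULN-B", False, True, False, True, True),    # quietly weaponised and reachable
]

for v in sorted(backlog, key=risk_score, reverse=True):
    print(v.vuln_id, risk_score(v))  # VULN-B outranks the headline-grabbing VULN-A
```

Even this toy model captures the article’s point: the quietly weaponised, network-reachable vulnerability outranks the headline-grabbing one that no attacker has operationalised.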


Stephen Roostan

Stephen has over a decade of experience in cyber security and transformation projects, and his role at Kenna is to rapidly grow the EMEA organisation to meet the customer demand for risk-based vulnerability management. Prior to Kenna he held senior sales roles at Forcepoint, Citrix and Imperva, focusing on IT solutions for complex, enterprise requirements.
