Solving cloud security with observability

Adrian Rowley, Senior Director, Gigamon, looks at the challenges of a hybrid IT infrastructure and advises how businesses can overcome unrealistic expectations and secure their enterprise with zero trust.  

According to recent Gartner research, 88% of businesses will have a “cloud-first” approach by 2025. However, this doesn’t necessarily mean cloud-only. The latest State of the Cloud report from Flexera found that 82% of organizations are taking a hybrid approach, combining the use of public and private clouds as well as on-premises – it seems hybridity is set to stay.

The reality is that on-premises infrastructure is not going to disappear overnight and be replaced entirely by virtual, containerized environments such as the cloud. Instead, both will need to work together. What’s more, a complete migration to the cloud takes an average of two to four years, meaning hybridity is almost impossible to avoid in some form or another. Therefore, security teams are left with the conundrum of balancing visibility between both cloud and on-premises, without creating blind spots or overspending on digital transformation and IT management.

The move to cloud opens the door to a variety of security issues, and the ability to effectively monitor and secure workloads has become more difficult than ever. Factors such as IT complexity, the rate of change, a lack of skills and organizational silos have all contributed, making observability crucial for IT and security teams. As cyberattacks continue to target every environment – on-premises, private, public, hybrid and multi-cloud – it’s important to understand best practices for achieving a safe and secure network in order to guarantee a high return on investment when migrating to a cloud infrastructure.

The challenges of a visibility gap

COVID-19 accelerated the adoption of cloud technology, which became a necessity for organizations shifting towards a remote working model. However, the rush to migrate workloads has often compromised security. It can lead to visibility gaps – where network tools struggle to see into the cloud and vice versa – which prevent NetOps teams from maintaining a holistic view of all data-in-motion. In turn, this creates a situation where each environment operates in a silo, making it challenging for these teams to anticipate and protect against security threats.

Overcoming unrealistic expectations 

The expectation that organizations can rapidly and easily transform and modernize their current infrastructure is often unrealistic. In reality, if businesses jump straight in, or don’t develop a considered strategy for growth, they risk impacting the speed and security of their existing infrastructure. Before organizations can begin reaping the benefits of adopting complex, advanced security solutions, they must ensure a minimum level of visibility across their network. If they set unrealistic outcomes and overlook the importance of a clear view into their data, efforts to transform current infrastructure are likely to fail.

Often, it is a lack of observability that is the crux of the issue, even when dealing with complex intrusions. Establishing a holistic, singular platform view and an ability to analyze observations may not be the silver bullet for defence, but it can give SecOps teams a strong foundation for detecting unusual activity and preventing attacks.

Using telemetry data to enable visibility

As organizations continue to migrate to the cloud, they are becoming increasingly reliant upon using logs to gather telemetry data. This data gives security teams key information on the scope of the incident, its root cause, the systems compromised, the impact of the breach and many other significant factors. If there are gaps in this data, it creates additional complexity for the individuals managing log files. Without clear network-based telemetry, derived from visibility into the network, SecOps teams cannot provide a reliable stream of information, even when systems have been compromised or infiltrated. Organizations looking to successfully move to the cloud need to prioritize visibility and use the data available to them to bolster security, reduce cost and stay compliant.
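The article stays at a conceptual level, but the idea of spotting a gap in telemetry can be sketched in a few lines of Python. The timestamps and the five-minute threshold below are illustrative assumptions, not Gigamon tooling; in practice the timestamps would come from flow records or log files collected across the hybrid estate.

```python
from datetime import datetime, timedelta

# Hypothetical arrival times of flow-log records from a network tap.
timestamps = [
    datetime(2021, 6, 1, 10, 0),
    datetime(2021, 6, 1, 10, 1),
    datetime(2021, 6, 1, 10, 2),
    datetime(2021, 6, 1, 10, 9),   # seven minutes of silence: a visibility gap
    datetime(2021, 6, 1, 10, 10),
]

def find_gaps(times, max_gap=timedelta(minutes=5)):
    """Return (start, end) pairs where telemetry went silent for too long."""
    gaps = []
    for earlier, later in zip(times, times[1:]):
        if later - earlier > max_gap:
            gaps.append((earlier, later))
    return gaps

for start, end in find_gaps(timestamps):
    print(f"Telemetry gap: {start} -> {end}")
```

A silence like this is exactly the kind of blind spot the article warns about: during the gap, a compromise would leave no network-based evidence for SecOps teams to analyze.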

A zero trust framework

For organizations looking to further secure their cloud environment, implementing a zero trust architecture can be a great solution. It works on the basis that all data should be authenticated and eradicates the implicit trust typically given to internal users. This approach is quickly gaining traction both in the security world and further afield. In fact, research by Gigamon found that 61% of senior decision-makers across EMEA believe that zero trust enhances, or would enhance, their IT strategy.

However, zero trust doesn’t just address security issues; it also helps to streamline business processes. The same Gigamon study found that 87% of teams said productivity has increased since they embarked on their zero trust journey, with reports that it has helped with efficiency and reduced the number of breaches. The last 18 months have seen a dramatic rise in attacks – particularly with ransomware becoming the top online threat to the UK – leaving many organizations vulnerable and at risk. Zero trust can help businesses overcome this issue.

Visibility sits at the heart of a zero trust framework. You cannot manage or monitor what you cannot see, and observability is essential for SecOps teams to authorize what is safe, and protect against what is not. When full visibility is achieved, zero trust can help to detect suspicious behaviors and analyze metadata that will contextualize the origin and movement of a cyberattack. Using this insight, security analysts can make more informed decisions and changes to their policies that will help in addressing the challenges of an increasingly complex threatscape.
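The principle of eradicating implicit trust can be illustrated with a minimal, hypothetical access check. The users, resources and function below are invented for illustration and do not represent any real zero trust product; the point is simply that the request's network location plays no part in the decision.

```python
# A toy allow-list pairing verified identities with the resources
# they may reach. Purely illustrative names.
ALLOWED = {("alice", "payments-db"), ("bob", "reporting")}

def is_authorized(user: str, verified: bool, resource: str,
                  internal_network: bool) -> bool:
    # Network location is deliberately ignored: under zero trust,
    # internal traffic receives no implicit trust.
    _ = internal_network
    return verified and (user, resource) in ALLOWED

# An unauthenticated request from inside the perimeter is still denied,
# while an authenticated one from outside is allowed.
print(is_authorized("alice", verified=False,
                    resource="payments-db", internal_network=True))   # False
print(is_authorized("alice", verified=True,
                    resource="payments-db", internal_network=False))  # True
```

In a real deployment the allow-list would be a policy engine and the `verified` flag the outcome of continuous authentication, but the asymmetry shown here – identity over location – is the core of the model.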

Hybridity is here to stay, and enterprises need to make sure they are prepared for it – with strong cybersecurity solutions that protect everything from on-site servers to virtual workloads. Maintaining a clear view into reliable, relevant, real-time data, setting achievable targets in their digital transformation strategy and implementing a zero trust approach will help organizations overcome the security challenges of operating within a hybrid cloud model.

Adrian Rowley

Adrian is Senior Director of Sales Engineering EMEA at Gigamon and has over 15 years of experience in the industry. He joined the Gigamon team in 2017 and has since been a prominent thought leader on the importance of network visibility and, more recently, the challenges of successful cloud migration.
