
Deep Observability: A New Layer to Add to Your Security Arsenal

Written by: Khairul Haqeem, Journalist, AOPG

Many organisations still have reservations about migrating all or part of their infrastructure and digital ecosystem to the cloud due to security concerns over increased exposure to vulnerabilities. Adding deep observability as a security measure strengthens and complements the defences of any enterprise. Instead of relegating security solely to the realm of security operations, it is now possible to “democratise” security activities across cloud and network operations teams.

Bassam Khan, Vice President of Product and Technical Marketing at Gigamon, said, “As of now, observability tools have depended on metrics, events, logs, and traces, or MELT data, for their analytical work.” In addition to extending coverage into unmanaged hosts and introducing new infrastructure security features, deep observability enables clients to continue utilising the same tools they’ve been using for application monitoring.

Security That Accelerates Agility

Given Bassam’s expertise in the field, I wanted to know what he considered the biggest threat to the security of hybrid clouds in Malaysia and the wider Asia Pacific region. As he put it, “The balance of security vs. agility is actually the main concern we hear from our clients.”

As public and hybrid cloud infrastructures mature, business stakeholders may increasingly move their workloads and apps to the cloud without involving infrastructure or security teams. While the new paradigm helps businesses adapt quickly to market changes, it also raises the risk that sensitive information and intellectual property may leak outside the company’s firewall.

Moreover, in today’s environment of rapid Continuous Integration and Continuous Delivery (CI/CD), blunders are inevitable. According to Bassam, workload misconfigurations are among the most common causes of security breaches. With a guardrail in place to monitor and analyse network traffic, security teams can rest assured that their workloads are safe while still allowing for rapid development.

A New Cybersecurity Layer

Bassam explained that deep observability can amplify the power of the monitoring, reporting and observability tools that organisations have already invested in and make them much more “insightful” with deep, real-time network-derived intelligence. The security and operational insight provided by this approach can significantly extend the technical and business value of the existing tools.

With network intelligence feeding their observability tools, DevOps teams can:

  • Evaluate potential security flaws, such as the identification of deprecated SSL ciphers or upcoming TLS certificate expirations (a simplified sketch follows this list).
  • Expedite troubleshooting tasks like distinguishing between network and application problems.
  • Expose malicious activities, such as the presence of unwanted applications or even cryptocurrency mining, either on the customer’s local network or in the customer’s public clouds.
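To make the first of those bullets concrete, here is a minimal, hypothetical sketch using Python’s standard ssl and socket libraries. It actively probes an endpoint rather than deriving the information passively from mirrored traffic the way a deep observability pipeline would, but it illustrates the kinds of findings involved: a deprecated protocol version, a weak cipher, or a certificate nearing expiry. The host name, thresholds and the check_endpoint helper are illustrative assumptions, not part of any Gigamon product.

```python
import socket
import ssl
from datetime import datetime

def check_endpoint(host: str, port: int = 443, warn_days: int = 30) -> list[str]:
    """Actively probe one TLS endpoint and flag deprecated protocols,
    weak ciphers, or soon-to-expire certificates."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()        # parsed certificate fields
            protocol = tls.version()        # e.g. 'TLSv1.2'
            cipher_name = tls.cipher()[0]   # negotiated cipher suite name

    # Convert the certificate's 'notAfter' timestamp into days remaining.
    expires = datetime.utcfromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]))
    days_left = (expires - datetime.utcnow()).days

    findings = []
    # Note: recent Python versions already disable TLS 1.0/1.1 by default,
    # so this branch mainly documents the check a passive pipeline would make.
    if protocol in ("SSLv3", "TLSv1", "TLSv1.1"):
        findings.append(f"{host}: deprecated protocol {protocol}")
    if "RC4" in cipher_name or "3DES" in cipher_name:
        findings.append(f"{host}: weak cipher {cipher_name}")
    if days_left < warn_days:
        findings.append(f"{host}: certificate expires in {days_left} days")
    return findings

if __name__ == "__main__":
    for finding in check_endpoint("example.com"):   # hypothetical endpoint
        print(finding)
```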

However, Bassam clarified that deep observability is less about the intelligence of the network, and more about intelligence derived from the network. “Deep observability is intelligence but it’s not analytics. That function is performed by customers’ existing toolset, from on-prem SIEMs to next-generation, AI- and cloud-based analytics engines. With deep observability these tools are much more capable of addressing security issues for all hosts in an organisation, whether managed or unmanaged,” he explained.

Gigamon not only offers a deep observability pipeline that harnesses actionable network-level intelligence to amplify the power of its customers’ observability tools, but also provides the ability to extract metadata from the traffic for tools that are unable to ingest network packets.

This means that customers’ already-purchased observability tools, such as New Relic, Dynatrace and Datadog, will be able to greatly expand their visibility from just managed hosts to all hosts communicating in their infrastructure, whether it is a BYO device, an unknown API in production or an IoT device, such as a heart monitor or a surveillance camera.
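As a rough illustration of what “extracting metadata from the traffic” can look like, the sketch below assumes Python with the scapy library and a local capture file named flows.pcap (both assumptions made for illustration; this is not Gigamon’s pipeline). It reduces raw packets to compact per-packet metadata records in JSON, a shape that log- and metrics-oriented observability backends can typically ingest even when they cannot process packets.

```python
import json
from scapy.all import rdpcap, IP, TCP, UDP   # requires: pip install scapy

def packets_to_metadata(pcap_path: str) -> list[dict]:
    """Reduce raw packets to small metadata records suitable for log/metric tools."""
    records = []
    for pkt in rdpcap(pcap_path):
        if IP not in pkt:
            continue
        record = {
            "timestamp": float(pkt.time),
            "src_ip": pkt[IP].src,
            "dst_ip": pkt[IP].dst,
            "ip_proto": pkt[IP].proto,   # numeric IP protocol (6 = TCP, 17 = UDP)
            "bytes": len(pkt),
        }
        if TCP in pkt:
            record.update(src_port=pkt[TCP].sport, dst_port=pkt[TCP].dport)
        elif UDP in pkt:
            record.update(src_port=pkt[UDP].sport, dst_port=pkt[UDP].dport)
        records.append(record)
    return records

if __name__ == "__main__":
    # Emit one JSON line per packet, ready for a log- or metrics-based tool.
    for rec in packets_to_metadata("flows.pcap"):
        print(json.dumps(rec))
```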

Lastly, he said Gigamon ensures each tool receives only the traffic it needs to do its job. Irrelevant traffic, such as duplicate packets, is eliminated so that tools operate efficiently. With this efficiency, IT can often save millions in data centre expenses and reinvest the savings in cloud initiatives.
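To illustrate the deduplication idea only, here is a toy sketch that assumes duplicate packets (for example, the same packet mirrored from two TAP points) can be recognised by hashing their bytes within a recent window. Production deduplication engines are considerably more sophisticated; the PacketDeduplicator class and window size below are purely hypothetical.

```python
import hashlib
from collections import OrderedDict

class PacketDeduplicator:
    """Drop byte-identical packets seen within the last `window` packets."""

    def __init__(self, window: int = 10_000):
        self.window = window
        self.seen: OrderedDict[str, None] = OrderedDict()

    def is_duplicate(self, packet_bytes: bytes) -> bool:
        digest = hashlib.sha256(packet_bytes).hexdigest()
        if digest in self.seen:
            return True                      # already forwarded this packet
        self.seen[digest] = None
        if len(self.seen) > self.window:
            self.seen.popitem(last=False)    # evict the oldest fingerprint
        return False

# Usage: forward only the packets a tool actually needs to see.
dedup = PacketDeduplicator()
for raw in [b"pkt-1", b"pkt-2", b"pkt-1"]:   # stand-in for a mirrored packet stream
    if not dedup.is_duplicate(raw):
        print(f"forwarding {raw!r}")
```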

Future Projection

It’s important to have multiple layers as part of an observability strategy. Gigamon’s mission is to assist its clients in gaining access to and deriving actionable insights from all “data in motion,” whether on-premises or in public cloud containers. By doing so, IT can guarantee that its security and monitoring systems can view all traffic and avoid any potential blind spots.

On a hopeful note, Bassam said, “We are seeing a growing adoption of observability tools, particularly being leveraged for security use-cases. As a result, responsibility for security is now shared more than it’s ever been before.”

Another interesting trend he shared is how network operations teams are now being proactively brought into cloud projects by cloud or development operations (DevOps) teams. These teams see the value of network-derived intelligence but prefer to have their network teams deploy and own deep observability. In the next three to four years, Bassam foresees network operators and engineers leveraging deep observability to advance their careers in cloud initiatives.

Khairul Haqeem

Khairul is proficient in writing tech-related pieces for the Asia-Pacific region. Some of his most notable work is focused on emerging technologies, data storage, and cybersecurity. His prior experience includes stints as a writer for two iSaham sites: Crepetoast.com and Solanakit.com. Before beginning his writing career, he worked in the field of education. Aside from studying engineering at the International Islamic University Malaysia, he has also worked as a subtitler for Iyuno Global, serving clients like Netflix. His specialities are:

  • Disruptive Tech.
  • Data Storage.
  • Cybersecurity.
  • Decentralised Tech.
  • Blockchains.
