

Black Hat insights: The retooling of SOAR to fit as the automation core protecting evolving networks

By Byron V. Acohido

In less than a decade, SOAR — security orchestration, automation and response — has matured into an ingrained component of the security technology stack at many enterprises.

Related: Equipping SOCs for the long haul

Since entering the cybersecurity lexicon, SOAR has done much to relieve the security skills shortage. It leverages automation and machine learning to correlate telemetry flooding in from multiple security systems, dramatically reducing the manual labor required for a first-level sifting of the data inundating modern business networks.
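That first-level sifting can be sketched in a few lines. The snippet below is a minimal, illustrative correlation pass — the alert schema and field names are hypothetical, not any vendor's actual API — that groups alerts by a shared indicator and escalates only indicators reported by more than one tool.

```python
from collections import defaultdict

# Hypothetical alerts from different security tools, sharing one
# indicator field (source IP). Schema is illustrative only.
alerts = [
    {"tool": "firewall", "src_ip": "203.0.113.7", "event": "port scan"},
    {"tool": "edr", "src_ip": "203.0.113.7", "event": "suspicious process"},
    {"tool": "siem", "src_ip": "198.51.100.4", "event": "failed login"},
]

def correlate(alerts):
    """First-level sifting: group alerts by indicator, then flag any
    indicator reported by more than one distinct tool for analyst review."""
    by_ip = defaultdict(list)
    for alert in alerts:
        by_ip[alert["src_ip"]].append(alert)
    # Escalate only multi-tool hits; single-tool noise stays in the queue.
    return {ip: hits for ip, hits in by_ip.items()
            if len({h["tool"] for h in hits}) > 1}

escalated = correlate(alerts)
print(sorted(escalated))  # ['203.0.113.7']
```

A real SOAR playbook would of course correlate on many indicator types and feed the result into a response workflow; the point here is only how automation collapses three raw alerts into one analyst-worthy case.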

However, SOAR has the potential to do much more, observes Cody Cornell, chief strategy officer and co-founder of Swimlane. SOAR, he argues, is well positioned to emerge as a tool that can help companies pivot to heavy reliance on cloud-centric IT infrastructure. At the moment, a lot of organizations are in that boat.

“Covid-19 turned out to be the best digital transformation initiative ever,” Cornell says. “It forced us to do things that probably would’ve taken many more years for us to do, in terms of adapting to remote work and transitioning to cloud services.”

Swimlane, which launched in 2014 and is based in Denver, finds itself in the vanguard of cybersecurity vendors hustling to retool not just SOAR, but also security operations centers (SOCs), security information and event management (SIEM) systems, and endpoint detection and response (EDR) tools. A core theme at RSA 2021 earlier this year – and at Black Hat USA 2021, taking place this week in Las Vegas – is that the combining of these and other security systems is inevitable, and will result in something greater than the sum of its parts, i.e. not just more efficacious security, but optimized business networks overall.

Black Hat insights: Will Axis Security’s ZTNA solution hasten the sunsetting of VPNs, RDP?

By Byron V. Acohido

Company-supplied virtual private networks (VPNs) leave much to be desired, from a security standpoint.

Related: How ‘SASE’ is disrupting cloud security

This has long been the case. Then a global pandemic came along and laid bare just how brittle company VPNs truly are.

Criminal hackers recognized the golden opportunity presented by hundreds of millions of employees suddenly using a company VPN to work from home and remotely connect to an array of business apps. Two sweeping trends resulted: one bad, one good.

First, bad actors instantly began to hammer away at company VPNs, and attacks against instances of Remote Desktop Protocol (RDP) spiked dramatically as well. VPNs and RDP both enable remote access that can put an intruder deep inside the firewall. And attempts to break into them have risen exponentially over the past 17 months.

Conversely, Zero Trust has gained some material traction. As Black Hat USA 2021 convenes in Las Vegas this week, consensus is quickening around the wisdom of sunsetting legacy remote access tools, like VPNs and RDP, and replacing them with systems based on Zero Trust (“trust no one”) principles.

One start-up, Axis Security, couldn’t be more in the thick of these trends. Based in San Mateo, CA, Axis publicly announced its advanced Zero Trust access tool in March 2020, just as the global economy was slowing to a crawl.

“We came out of stealth mode right at the beginning of all the big shutdowns, and we got a number of customers, pretty fast, who were looking for solutions to remotely connect users to systems,” says Deena Thomchick, vice president of product marketing at Axis. “These were users who never had remote access before.”

NEW TECH: How the emailing of verified company logos actually stands to fortify cybersecurity

By Byron V. Acohido

Google’s addition to Gmail of something called Verified Mark Certificates (VMCs) is a very big deal in the arcane world of online marketing.

Related: Dangers of weaponized email

This happened rather quietly as Google announced the official launch of VMCs in a blog post on July 12. Henceforth companies will be able to insert their trademarked logos in Gmail’s avatar slot; many marketers can’t wait to distribute email carrying certified logos to billions of inboxes. They view logoed email as an inexpensive way to boost brand awareness and customer engagement on a global scale.

However, there is a fascinating back story about how Google’s introduction of VMCs – to meet advertising and marketing imperatives — could ultimately foster a profound advance in email security. Over the long term, VMCs, and the underlying Brand Indicators for Message Identification (BIMI) standards, could very well give rise to a bulwark against email spoofing and phishing.
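The plumbing behind BIMI is modest: a receiving mail system looks up a DNS TXT record at default._bimi.<domain>, whose tag=value pairs point to the brand's SVG logo and, for Gmail, its VMC. Below is a minimal sketch of parsing such an assertion record; the domain and URLs in the sample record are illustrative, not real endpoints.

```python
def parse_bimi(txt: str) -> dict:
    """Parse a BIMI assertion record into its tag=value pairs.
    Per the BIMI draft spec: v= is the version, l= is the SVG logo URL,
    and a= points to the evidence document (the VMC certificate)."""
    tags = {}
    for part in txt.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

# A record like the one a receiver would look up in DNS at
# default._bimi.example.com (values here are illustrative):
record = "v=BIMI1; l=https://example.com/logo.svg; a=https://example.com/vmc.pem"
print(parse_bimi(record)["v"])  # BIMI1
```

The security leverage comes before this lookup ever matters: a receiver only honors the record if the message first passes DMARC, which is what pushes senders toward the stricter email authentication discussed below.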

I had a chance to sit down with Dean Coclin, senior director of business development at DigiCert, to get into the weeds of this quirky, potentially profound, security development. DigiCert is a Lehi, Utah-based Certificate Authority (CA) and supplier of Public Key Infrastructure services.

Coclin and I worked through how a huge email security breakthrough could serendipitously arrive as a collateral benefit of VMCs. Here are the main takeaways from our discussion:

MY TAKE: Equipping SOCs for the long haul – automation, edge security solidify network defenses

By Byron V. Acohido

Network security is in the throes of a metamorphosis. Advanced technologies and fresh security frameworks are being implemented to deter cyber attacks out at the services edge, where all the action is.

Related: Automating security-by-design in SecOps

This means Security Operations Centers are in a transition. SOCs came on the scene some 20 years ago as the focal point for defending on-premises datacenters of large enterprises. The role of SOCs today is both expanding and deepening, and in doing so, perhaps modeling what it will take to defend IT systems going forward – for organizations of all sizes.

I recently moderated a virtual panel on this topic featuring Scott Dally, director of security operations center Americas at NTT Security, and Devin Johnstone, senior security operations engineer at Palo Alto Networks.

For a full drill down please give a listen to the accompanying podcast version of that discussion. Here are the takeaways:

Pressurized landscape

Organizations today must withstand a constant barrage of cyber attacks. Primary vectors take the form of phishing campaigns, supply chain corruption and ransomware attacks, like the one that recently resulted in the shutdown of Colonial Pipeline.

What’s happening is that digital transformation, while providing many benefits, has also dramatically expanded the attack surface. “An old problem is that many companies continue to cling to the notion that cybersecurity is just another cost center, instead of treating it as a potentially catastrophic exposure – one that needs to be continually mitigated,” Dally says.

MY TAKE: Massive data breaches persist as agile software development fosters full-stack hacks

By Byron V. Acohido

Data leaks and data theft are part and parcel of digital commerce, even more so in the era of agile software development.

Related: GraphQL APIs stir new exposures

Many of the high-profile breaches making headlines today are the by-product of hackers pounding away at Application Programming Interfaces (APIs) until they find a crease that gets them into the pathways of the data flowing between an individual user and myriad cloud-based resources.

It’s important to understand the nuances of these full-stack attacks if we’re ever to slow them down. I’ve had a few deep discussions about this with Doug Dooley, chief operating officer at Data Theorem, a Palo Alto, Calif.-based software security vendor specializing in API data protection. Here are a few key takeaways:

Targeting low-hanging fruit

Massive database breaches today generally follow a distinctive pattern: hack into a client-facing application; manipulate an API; follow the data flow to gain access to an overly permissive database or S3 bucket (cloud storage). A classic example of this type of intrusion is the Capital One data breach.

Suspected Capital One hacker Paige Thompson was indicted for allegedly stealing the data of more than 100 million people, including 140,000 Social Security numbers and 80,000 linked bank account numbers. The 33-year-old former Amazon Web Services (AWS) software engineer was also accused of stealing cloud computing power on Capital One’s account to “mine” cryptocurrency for her own benefit, a practice known as “cryptojacking.”

Thompson began pounding away on Capital One’s public-facing applications, supposedly protected by an open-source Web Application Firewall (WAF), and succeeded in carrying out a “Server-Side Request Forgery” (SSRF) attack. By successfully hacking the client-facing application, she was then able to relay commands to a legacy AWS metadata service to obtain credentials.
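To make that SSRF pivot concrete: the attack tricks a server-side “fetch this URL” feature into requesting the cloud metadata endpoint (169.254.169.254) and relaying the temporary credentials it returns. One basic defensive screen is to refuse fetch targets on internal or link-local addresses. The sketch below is a minimal illustration under the assumption that the app only ever needs external targets; the sample IPs are arbitrary.

```python
from ipaddress import ip_address
from urllib.parse import urlparse

def is_safe_target(url: str) -> bool:
    """Reject fetch targets that point at the instance metadata service
    or other internal addresses. (A real deployment must also resolve
    hostnames and re-check after redirects; this only screens literal IPs.)"""
    host = urlparse(url).hostname or ""
    try:
        addr = ip_address(host)
    except ValueError:
        return True  # a hostname, not a literal IP; would need DNS checks too
    return not (addr.is_link_local or addr.is_private or addr.is_loopback)

# The Capital One-style pivot targeted the link-local metadata address:
print(is_safe_target("http://169.254.169.254/latest/meta-data/"))  # False
print(is_safe_target("http://8.8.8.8/report"))                     # True
```

AWS has since introduced IMDSv2, which requires a session token obtained via a PUT request, precisely to blunt this class of one-shot SSRF relay.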

Password and token harvesting is one of the most common techniques in hacking. Using valid credentials, Thompson was able to gain access using APIs … more

MY TAKE: Why monetizing data lakes will require applying ‘attribute-based’ access rules to encryption

By Byron V. Acohido

The amount of data in the world topped an astounding 59 zettabytes in 2020, much of it pooling in data lakes.

Related:  The importance of basic research

We’ve barely scratched the surface of applying artificial intelligence and advanced data analytics to the raw data collecting in these gargantuan cloud-storage structures erected by Amazon, Microsoft and Google. But it’s coming, in the form of driverless cars, climate-restoring infrastructure and next-gen healthcare technology.

To get there, one big technical hurdle must be surmounted: a new form of agile cryptography must be established to robustly preserve privacy and security as all this raw data gets put to commercial use.

I recently had the chance to discuss this with Kei Karasawa, vice president of strategy, and Fang Wu, consultant, at NTT Research, a Silicon Valley-based think tank which is in the thick of deriving the math formulas that will get us there.

They outlined why something called attribute-based encryption, or ABE, has emerged as the basis for a new form of agile cryptography that we will need in order to kick digital transformation into high gear.
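A toy model helps show the access-control idea behind ABE: the ciphertext carries a policy over attributes, and a key can decrypt only if its attribute set satisfies that policy. The sketch below models just the policy evaluation, not the pairing-based cryptography that actually enforces it, and the attribute names are hypothetical.

```python
def satisfies(attributes: set, policy) -> bool:
    """Evaluate an ABE-style access policy against a key's attributes.
    A policy is either a bare attribute string, or a tuple of the form
    ("AND", ...) / ("OR", ...) over sub-policies."""
    if isinstance(policy, str):
        return policy in attributes
    op, *clauses = policy
    combine = all if op == "AND" else any
    return combine(satisfies(attributes, clause) for clause in clauses)

# Encrypt once under a policy; many differently-keyed parties may decrypt.
policy = ("AND", "oncology", ("OR", "physician", "researcher"))
print(satisfies({"oncology", "physician"}, policy))    # True
print(satisfies({"cardiology", "physician"}, policy))  # False
```

The appeal for data lakes is exactly this shape: data is encrypted once under a policy, rather than re-encrypted per recipient, so fine-grained access rules travel with the data itself.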

For a drill down on our discussion, please give the accompanying podcast a listen. Here are the key takeaways:

Cloud exposures

Data lakes continue to swell because each second of every day, every human, on average, is creating 1.7 megabytes of fresh data. These are the rivulets feeding the data lakes.

A zettabyte equals one trillion gigabytes. Big data just keeps getting bigger. And we humans crunch as much of it as we can by applying machine learning and artificial intelligence to derive cool new digital services. But we’re going to need the help of quantum computers to get to the really amazing stuff, and that hardware is coming.

As we press ahead into our digital future, however, we’ll also need to retool public key infrastructure. PKI is the authentication and encryption framework … more

ROUNDTABLE: Experts react to DHS assigning TSA to keep track of cyber attacks on pipelines

By Byron V. Acohido

The same federal agency that makes you take your shoes off and examines your belongings before boarding a flight will begin monitoring cyber incidents at pipeline companies.

Related: DHS begins 60-day cybersecurity sprints

The Department of Homeland Security on Thursday issued a directive requiring all pipeline companies to report cyber incidents to DHS’s Transportation Security Administration (TSA).

This, of course, follows a devastating ransomware attack that resulted in a shutdown of Colonial Pipeline.

It can be argued that this is one small step toward the true level of federal oversight needed to protect critical infrastructure in modern times. I covered the aviation industry in the 1980s and 1990s when safety regulations proved their value by compelling aircraft manufacturers and air carriers to comply with certain standards, at a time when aircraft fleets were aging and new fly-by-wire technology introduced complex risks.

We’re a long way from having regulatory frameworks for data privacy and network security needed for critical infrastructure — akin to what we have to keep aviation and ground transportation safe and secure. However, the trajectory of ransomware attacks, supply chain corruption, denial of service attacks and cyber espionage is undeniable.

It seems clear we’re going to need more regulations to help guide the private sector into doing the right things. The discussion is just getting started, as you can see by this roundtable of comments from industry experts:

Edgard Capdevielle, CEO, Nozomi Networks

Most critical infrastructure sectors don’t have mandatory cyber standards, and until now that included oil and gas. The requirement for mandatory breach reporting will help shine a light on the extent of the problem in this sector. Cybersecurity is a team sport.