
The economic value of prevention in the cybersecurity lifecycle

Larry Ponemon

Ponemon Institute is pleased to present the findings of The Economic Value of Prevention in the Cybersecurity Lifecycle, sponsored by Deep Instinct. The cybersecurity lifecycle is the sequence of activities an organization experiences when responding to an attack. The five high-level phases are prevention, detection, containment, recovery and remediation.

We surveyed 634 IT and IT security practitioners who are knowledgeable about their organizations’ cybersecurity technologies and processes. Within their organizations, most of these respondents are responsible for maintaining and implementing security technologies, conducting assessments, leading security teams and testing controls.

“If we could quantify the cost savings of the prevention of attacks, we would be able to increase our IT security budget and debunk the C-suite’s myth that AI is a gimmick. I believe AI is critical to preventing attacks.” —  CISO, financial services industry.

The key takeaway from this research is that when attacks are prevented before they cause any damage, organizations save money, time and resources and protect their reputation.

To determine the economic value of prevention, respondents were first asked to estimate the cost of one of the following five types of attacks: phishing, zero-day, spyware, nation-state and ransomware. They were then asked to estimate what percentage of that cost is spent on each phase of the cybersecurity lifecycle, including prevention. Because there are fixed costs associated with the prevention phase, such as in-house expertise and investments in technologies, there is a cost even if the attack is stopped before doing damage. For example, the average total cost of a phishing attack is $832,500, and of that amount 82 percent is spent on detection, containment, recovery and remediation; respondents estimate the remaining 18 percent is spent on prevention. Thus, if the attack is prevented, the total cost saved would be $682,650 (82 percent of $832,500).
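The savings arithmetic above is simple enough to sketch in a few lines of Python (the figures are from the survey; the function name is my own):

```python
def prevention_savings(total_cost: float, prevention_share: float) -> float:
    """Cost avoided when an attack is stopped outright: everything
    except the fixed prevention-phase spend (expertise, technologies)."""
    return total_cost * (1 - prevention_share)

# Phishing example from the survey: $832,500 average total cost,
# 18 percent of which is spent on prevention.
saved = prevention_savings(832_500, 0.18)
print(f"${saved:,.0f}")  # $682,650
```

The same function applies to any of the five attack types once its average total cost and prevention share are known.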

Seventy percent of respondents (34 percent + 36 percent) believe the ability to prevent cyberattacks would strengthen their organization’s cybersecurity posture. However, 76 percent of respondents (40 percent + 36 percent) say they have given up on improving their ability to prevent an attack because it is too difficult to achieve.

The following are the most noteworthy findings from the research.

  • Organizations are most effective in containing cyberattacks. Fifty-five percent of respondents say their organizations are very or highly effective at containing attacks in the cybersecurity lifecycle. Less than half of respondents (46 percent) say their organizations are very or highly effective in preventing cyberattacks. Organizations are also allocating more of the IT security budget to technologies and processes in the containment phase than in the prevention phase. 
  • Prevention of a cyberattack is the most difficult phase of the cybersecurity lifecycle to achieve. Eighty percent of respondents say prevention is very difficult to achieve, followed by recovery from a cyberattack. The main reason for the difficulty is that it takes too long to identify an attack; other reasons are outdated or insufficient technologies and a lack of in-house expertise. The technology features considered most important are the ability to prevent attacks in real time and across different file types.
  • Automation and advanced technologies increase the ability to prevent cyberattacks. Sixty percent of respondents say their organizations currently deploy AI-based technologies for cybersecurity or plan to deploy them within the next 12 months. Sixty-seven percent of respondents believe the use of automation and advanced technologies would increase their organizations’ ability to prevent cyberattacks. Further, 67 percent of respondents expect to increase their investment in these technologies as they mature.
  • Deep learning is a form of AI inspired by the brain’s ability to learn. In the context of this research, deep learning is defined as follows: just as identification of an object becomes second nature once a human brain learns to recognize it, deep learning’s artificial brains, which consist of complex neural networks, can process large amounts of data to reach a profound and highly accurate understanding of the data analyzed. The top three reasons to incorporate a deep learning-based solution are to lower false positive rates, increase detection rates and prevent unknown, first-seen cyberattacks.
  • Perceptions that AI could be a gimmick and a lack of in-house expertise are the two main challenges to deployment of AI-based technologies. Fifty percent of respondents say that when trying to gain support for the adoption of AI, there is internal resistance because it is considered a gimmick. This is followed by the inability to recruit personnel with the necessary expertise (49 percent of respondents).
  • Organizations are making investments in technology that do not strengthen their cybersecurity posture because they are based on the wrong metrics. Fifty percent of respondents say their organizations are wasting limited budgets on investments that don’t improve their cybersecurity posture. The primary reasons for the failure are system complexity, personnel and vendor support issues. Another reason is that most organizations use return on investment (ROI) to justify investments, rather than the technology’s ability to increase prevention and detection rates.
  • IT security budgets are considered inadequate. Only 40 percent of respondents say their budgets are sufficient to achieve a strong cybersecurity posture. The average total IT budget is $94.3 million, and of this 14 percent, or approximately $13 million, is allocated to IT security. Of the security budget, 19 percent, or approximately $2.5 million, will be allocated to investments in enabling security technologies such as AI, machine learning, orchestration, automation, blockchain and more.
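The budget figures in the last bullet can be verified with a short Python sketch (the percentages are from the survey; the variable names are my own):

```python
# Budget allocation reported by respondents.
total_it_budget = 94_300_000   # average total IT budget
security_share = 0.14          # share of the IT budget allocated to IT security
enabling_tech_share = 0.19     # share of the security budget for AI, ML, automation, etc.

security_budget = total_it_budget * security_share
enabling_tech_budget = security_budget * enabling_tech_share

print(f"IT security budget: ${security_budget:,.0f}")          # $13,202,000, roughly the $13 million cited
print(f"Enabling technologies: ${enabling_tech_budget:,.0f}")  # $2,508,380, roughly the $2.5 million cited
```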

Sample finding:

With the exception of the exploitation phase, zero-day attacks are considered very difficult to prevent at every stage of the cyber kill chain. The cyber kill chain is a way to understand the sequence of events involved in an external attack on an organization’s IT environment. Understanding the cyber kill chain model is considered helpful in putting strategies and technologies in place to “kill” or contain the attack at various stages and better protect the IT ecosystem. Following are the seven steps in the cyber kill chain:

  1. Reconnaissance: the intruder picks a target, researches it and looks for vulnerabilities
  2. Weaponization: the intruder develops malware designed to exploit the vulnerability
  3. Delivery: the intruder transmits the malware via a phishing email or another medium
  4. Exploitation: the malware begins executing on the target system
  5. Installation: the malware installs a backdoor or other ingress accessible to the attacker
  6. Command and Control (C2): the intruder gains persistent access to the organization’s systems/network
  7. Actions on Objective: the intruder initiates end goal actions, such as data theft, data corruption or data destruction

Respondents were asked to rate the difficulty in preventing a zero-day attack in every phase of the cyber kill chain on a scale of 1 = not difficult to 10 = very difficult. Figure 16 presents the very difficult responses (7+ on the 10-point scale). The most difficult phase to prevent the zero-day attack is the command and control phase (80 percent) in which the intruder gains persistent access to the organization’s systems/network followed by the delivery phase of the kill chain (78 percent).


Read the full report by visiting Deep Instinct’s website





New Podcast: Erin and Noah on the run, why Americans carry tracking devices everywhere now

Alia “followed” me around through cyberspace during one day in Los Angeles.

Bob Sullivan

Erin and her son Noah think they’ve finally found a safe place to live, in a quiet Ohio town, invisible to Erin’s abusive ex-husband.  But that life is shattered by a disturbing voice mail after a single photo of Noah accidentally appears on a school website. The call sends mother and child on the run again, but not before a near-disaster at Noah’s school.

Sometimes, privacy is a matter of life and death. And while Erin and Noah’s story is fiction, the story of how privacy advocate Brian Hofer ended up in a police car, with a gun pointed at his brother’s head, is chillingly real.

This week we begin this second season of No Place to Hide. You’re going to hear something very new, and very different: a combination of fiction and non-fiction storytelling. Each episode begins with a scene from the story of Erin and Noah, a mom and her son on the run from his abusive father.  The story is designed to make listeners feel the way victims feel when they are stalked through cyberspace. Then, Alia and I take on the big privacy topics of our day, concluding with a look at the world in 2030 if nothing is done to manage the coming tsunami of privacy invasive technologies.

No Place to Hide is sponsored by Intel Corp.

I’m really proud of this concept, and this series. Privacy isn’t some esoteric idea, or a first-world problem. Privacy is about freedom, and free will, and personal safety, and creativity.

This week’s episode is about location data, a topic that’s in the news a lot right now. Authoritarian regimes around the world are trying to stem the tide of coronavirus by tracking citizens’ movements via their cell phones. Well, every country is trying to do that. In places like China, there is no pretense of worry about civil liberties. In the U.S., Apple and Google have announced a system that uses Bluetooth to alert people who’ve been near a patient who has tested positive. Theoretically, that limits the information to the small group who really needs it. Still, plenty of firms and governments are bragging about the use of cell phone location data as a public health tool — the state of New Mexico, for example, is using data to see how well residents are social distancing. The data is also being examined nationally.

Only a fool wouldn’t try all available tools to beat back coronavirus. But what are the long-term implications of these more aggressive steps by governments to track citizens via their cell phones? And how did we all end up with tracking devices in our pockets in the first place?

This week’s episode of No Place to Hide delves deep into the history of location data; I hope it will help inform public discussion as we move forward to the next step in this crisis, which is sure to include a lot of arm wrestling between the good technology can do and the potential harms.

Ep 4: On location Summary

Erin and Noah — Dad finds them in Ohio because an errant photo ended up on a school website. They have to drop everything and flee, just as Noah’s father shows up at school.

Bob and Alia: Cell phones track our every move, in perhaps the biggest attack on privacy of our time. On location in Los Angeles, Bob and Alia discuss the past ten years of location-specific data hoarding by large companies. Then we hear why Oakland Privacy Commission chair Brian Hofer ended up in a police squad car, and his brother had a gun pointed at his head ‘executioner-style,’ all over a database error.


Partial transcript

BOB: A single piece of location information doesn’t seem that distressing. But when you can put it all on a map, over time, and build a picture of someone’s life, that’s when you’ve really, really invaded their privacy.

ALIA: You know, it kind of reminds me of this person I knew a long time ago, Bob. And I remember one day we were having coffee, and he was telling me about how, uh, assassinations worked. And I thought that was really creepy, but do you know what the first rule was to figure out how to assassinate someone? The rule was you get to know their habits, and you get to know their days, and you watch them. Where they go, how they get there, when they get there, every single day. Because if you know their habits, then you know where the holes are when you might do the deed.

BOB: …That’s what Liam Youens did to Amy Boyer…

ALIA: Yeah… that’s really scary. So what you’re talking about, in like learning someone’s habits– their daily whereabouts– you can look for opportunities to do something terrible potentially. And he was talking about it in like the old school sense of, you know, like stakeouts. You’re watching this person. And what you’re talking about is, basically, you don’t have to do that anymore, because Google does it for you.

BOB: And not just Google of course, any cellphone does this for you.

ALIA: Right. Ugh. 

BOB: Mobile devices are tracking devices, and so who has access to that information? Maybe through that Terms & Conditions box you checked? Your mobile provider.  Your apps. Hundreds of companies in between that are collecting these incredibly detailed profiles of your movements. You know, I recently wrote about a selfie app that teenagers love — it has 300 million downloads. And sends all their location information to the developer…in China.

ALIA: And there’s that NYTimes exposé on location data, that we’ve both been obsessed with. Someone gave the reporters at the Times a copy of a location database with a year’s worth of data.  Using that, they were able to track specific people, like a secret service agent, someone protecting the president, from their home to the White House to their church. And they had this location data for over 12 million people.

BOB: Just imagine what our fictional angry ex-husband from the opening could do with data like that.

ALIA: That’s so scary. 

BOB: When we talk about issues like privacy and data security, I get emotional and philosophical about civil liberties. And maybe you don’t care if Google knows what websites you visit or Amazon knows what kind of dog food you buy. But location data is next level. And as our little experiment showed, as the NYT story showed, something incredible happened in the past decade. The advent of smartphones means that most Americans, and about half the people on Earth, now carry small, incredibly accurate tracking devices with them at all times. And… I don’t remember anyone having a great, open, honest debate about the wisdom of that.

ALIA: Me neither. But I think we should.


BRIAN HOFER: Yeah, I, you know, I, I can’t get half of my friends to use like Signal or other encrypted software, or to, you know, have two factor authorization, cause you know, we’ll trade anything for convenience and speed. 

ALIA: That’s Brian Hofer. He’s a community activist in Oakland, California. We’ll hear a lot more about his activism later, but for now, he paints an amazing picture about the importance of location information.

BRIAN HOFER: It only takes four geospatial data points. So that’s time and location. Four different geospatial data points to identify over 95% of humans. Why? Cause I drive to work the same way, I drive to the gym the same way, I go to the same grocery store. We’re creatures of habit. So you know, whether it’s your scooter, uh, whether it’s even public transit that now mostly use like electronic payments, uh, obviously license plate readers, and obviously cell phones, you only need a couple of, you know, four or five data points and you, and your, you can map somebody out, you can figure out who it is.

The question usually is, Well, I have nothing to hide, so I have nothing to fear. And that, and that’s totally wrong. And I like how Edward Snowden, uh, flipped that on its head and said, you have something to protect. What if we just did have an abortion and there’s cameras right outside of that clinic and a license plate scanner, uh, and you’re tracking my phone calls, you know, to the clinic and my location? Or what if I keep driving and parking in front of the same cancer doctor’s office? Maybe I didn’t want to tell you I had cancer. Um, what if I am exploring my sexuality and there is facial recognition on the front of, uh, bars, you know, a same-sex bar that I wanted to walk into, but now I’m scared because there’s facial recognition. So all these little data points by themselves, probably not a civil liberty threat, probably not, uh, invading my privacy, but together because of the nature of all the commingled data and databases together, what we now call it, and you’ve seen it in, uh, Sotomayor’s, uh, uh, some of her opinions, we call it the Mosaic Theory, that there’s all these little tiles, these little pieces…

BOB: So, Mosaic theory. This is really important.  It’s super creepy that in an instant, you could see everywhere I went all morning. But it should be even creepier to think that with just a few details, I could pretty much size up your whole life. I mean, imagine you are Erin and Noah, trying to get away from an angry ex husband. In just a moment, with data like this, he would know exactly when to show up at school to snatch a child. You see, most people’s lives aren’t really that complex. We only go to a few places 95% of the time. 

We talked to Marc Groman about this — he was the first-ever chief privacy officer at the FTC and a senior advisor for privacy in the Obama White House.

MARC GROMAN: If you have my precise location over, say, a couple of weeks, you essentially can draw highly sensitive inferences about my entire life. You will understand my religious beliefs, my political beliefs, my health issues potentially. And by the way, it’s so precise now we know not just that you’re in the hospital, but if you’re in a 12-story building, we know what floor in the hospital.

ALIA: Wow. 

ALIA:  Susan Grant of the Consumer Federation of America. We talked to her for a while about location data and I gotta say, when she talked about the creation of ‘megaprofiles’ I got chills.

SUSAN GRANT: Location is just one of the many, um, very revealing things about you that can be compiled into a mega profile about you. So it’s not just where you are at any given moment, but it’s where you go most frequently. Um, which can tell a lot about you. Um, you know, uh, uh, where you go to church reveals what your religion is, for instance. Um, these are things that people have a right to keep private if they want. Um, and uh, yet this information is being collected when it’s not needed.

ALIA: Ok, this all sounds pretty awful. Tracking gadgets in our pockets and purses. Really precise data being sent to companies we work with, all around the world, available to the government…but I have to ask a question I know you love, Bob. So, Bob …who’s making money off all this location information?

BOB:  That’s always the important question to ask. And, we have to credit Buzzfeed for a great story explaining how valuable location information is.

BOB: “Location-sharing agreements between app developers and app brokers – where apps can send your GPS coordinates up to 14,000 times per day – can bring in a lot of revenue. With just 1,000 users, app developers can get $4/month. If they have 1 million active users, they can get $4,000/month. And that’s from just one broker. If they work with two app brokers with similar payouts, and have at least 10 million active monthly users, they could stand to make $80,000/month.”

BOB: Quote: “With more dangerous permissions given by the user, they will get more sensitive data, which means they’ll make more money.” End Quote. 
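The payouts in the Buzzfeed quote scale linearly with active users, which a quick Python sketch makes explicit (the rates are from the quote; the function name is my own):

```python
def monthly_payout(active_users: int, rate_per_1000_users: float = 4.0,
                   brokers: int = 1) -> float:
    """Estimated monthly revenue from selling app users' location data,
    at the per-1,000-user rate Buzzfeed reported for one broker."""
    return active_users / 1000 * rate_per_1000_users * brokers

print(monthly_payout(1_000))                  # 4.0
print(monthly_payout(1_000_000))              # 4000.0
print(monthly_payout(10_000_000, brokers=2))  # 80000.0
```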

ALIA: So…that selfie app we talked about. It had 300 million downloads! OMG, how much money they must be making.

BOB: Exactly. But to me, it’s important to remember that big fish eat little fish metaphor from the first half of the series.

ALIA: Bob, I was waiting for a metaphor.

BOB: So a consumer group in Norway recently investigated dating apps like  Grindr, Tinder, OkCupid, and so on, and found they were selling sensitive data like location data into this ecosystem…but one of the buyers was a firm named MoPub. Which is owned by Twitter.

ALIA: Ahh, Twitter. Because someone has to be writing those big checks, driving this whole ecosystem. And again, when did we decide as a society that we were ok with this? We didn’t. It just kind of…happened.

BRIAN HOFER: And what is so scary, you know, back when we all read, like, 1984, we thought the government was just going to force everything on us–

ALIA: Here’s Brian Hofer again–

BRIAN: and what the American business genius was is nah, just ask people to do it voluntarily, you know, we’ll just offer them convenience and they’ll do all these things on their own and most people don’t look beneath the hood and don’t really look to see what the ramifications are.