Monthly Archives: June 2018

While negligence causes the most breaches, insiders do the most damage

Larry Ponemon

Ponemon Institute and ObserveIT have released The 2018 Cost of Insider Threats: Global Study, which examines what companies spend to deal with data breaches caused by a careless or negligent employee or contractor, a criminal or malicious insider, or a credential thief. While the negligent insider is the root cause of most breaches, the bad actor who steals employees’ credentials is responsible for the most costly incidents.

The first study on the cost of insider threats was conducted in 2016 and focused exclusively on companies in the United States. In this year’s benchmark study, 717 IT and IT security practitioners in 159 organizations in North America (United States and Canada), Europe, Middle East and Africa, and Asia-Pacific were interviewed.

According to the research, if the incident involved a negligent employee or contractor, companies spent an average of $283,281. The average cost more than doubles if the incident involved an imposter or thief who steals credentials ($648,845). Hackers cost the organizations represented in this research an average of $607,745 per incident.

Here are the main findings of the research:

Imposter risk is the most costly.

The cost ranges significantly based on the type of incident. If it involves a negligent employee or contractor, each incident averages $283,281. The average cost more than doubles if the incident involves an imposter or thief who steals credentials ($648,845). Hackers cost the organizations represented in this research an average of $607,745 per incident. The activities that drive these costs are monitoring and surveillance, investigation, escalation, incident response, containment, ex-post analysis, and remediation.

The negligent insider is the root cause of most incidents.

Most incidents in this research were caused by insider negligence. Specifically, the careless employee or contractor was the root cause of 2,081 of the 3,269 incidents reported. The most expensive incidents, those involving imposters stealing credentials, were also the least frequently reported: a total of 440 incidents involved stolen credentials.

Organizational size and industry affect the cost per incident.

The cost of incidents varies according to organizational size. Large organizations with a headcount of more than 75,000 spent an average of $2.081 million over the past year to resolve insider-related incidents. To deal with the consequences of an insider incident, smaller organizations with a headcount below 500 spent an average of $1.80 million. Companies in financial services, energy & utilities, and industrial & manufacturing incurred average costs of $12.05 million, $10.23 million, and $8.86 million, respectively.

All types of insider risk are increasing.

Since 2016 the average number of incidents involving employee or contractor negligence has increased from 10.5 to 13.4. The average number of credential theft incidents has tripled over the past two years, from 1.0 to 2.9.

Employee or contractor negligence costs companies the most.

In terms of total annual costs, it is clear that employee or contractor negligence represents the most expensive insider profile. While credential theft is the most expensive on a unit cost basis, it represents the least expensive profile on an annualized basis.
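As a back-of-the-envelope check, the figures reported above bear this out: multiplying each profile’s average cost per incident by the study’s average yearly incident frequency shows negligence dominating on an annualized basis. A minimal sketch in Python (the variable names are mine, not the study’s):

```python
# Rough annualized-cost check using the per-incident costs and the
# per-year incident frequencies reported in this study.

cost_per_incident = {
    "negligence": 283_281,        # careless employee or contractor
    "credential_theft": 648_845,  # imposter who steals credentials
}

incidents_per_year = {
    "negligence": 13.4,       # up from 10.5 in 2016
    "credential_theft": 2.9,  # up from 1.0 in 2016
}

# Annualized cost per profile = unit cost x yearly frequency
annualized = {
    profile: cost_per_incident[profile] * incidents_per_year[profile]
    for profile in cost_per_incident
}

for profile, total in annualized.items():
    print(f"{profile}: ${total:,.0f} per year")
```

Even though each credential-theft incident costs more than twice as much, negligence incidents happen so much more often that they cost roughly twice as much per year.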

It takes an average of more than two months to contain an insider incident.

It took an average of 73 days to contain an insider incident, and only 16 percent of incidents were contained in less than 30 days.

We conclude that companies need to intensify their efforts to minimize insider risk, because both the cost and the frequency of incidents are rising and these incidents are not resolved quickly.

Click here to read the rest of this study.

 

Privacy problems? Think of them as side effects

Bob Sullivan

Not long ago, I was approached by someone to help write a book about the race to cure cancer. It was an intriguing idea, and it sent me down a rabbit hole of research so I’d be able to understand what I’d be getting into. What I found was one Greek myth-like tale after another, of a wonderful breakthrough followed by a tragic outcome. An incredibly promising development followed by a crushing consequence. Of treatments that killed cancer but also killed patients. Of cures that are worse than the disease.

Sometimes, these are stories about egos blinded by a God complex, refusing to see they are hurting instead of helping. Usually, they are stories about people who spend decades in service to humanity and the slow, very unsteady, very unsure march of progress.

And these are stories about damned side effects.

I usually tell people that I’m a tech reporter, but that I focus on the unintended consequences of technology — tech’s dark side.  Privacy, hacking, viruses, manipulation of consumers via big data. These things are kind of like the nuclear waste of “progress.” But lately I’ve been thinking about changing that description.

Now, I think the problem is a lot more like the medical concept of side effects.

Companies like Facebook, Uber, and Google are full of brilliant engineers who spend all their time and energy trying to solve some of the world’s great problems, and they often do.  Uber and its imitators are wonderful at solving vexing transportation problems.  Facebook *has* connected billions of people, and let millions of families share baby photos easily.  These are good tools. Amazing tools.

But tech firms aren’t built to think about side effects.  Long before the Russian trolls in 2016, plenty of people warned Facebook about the crap its service was spewing, about how its tool had been hijacked and weaponized. But Facebook didn’t listen. The firm was too focused on the “cure” it was inventing — maybe too arrogant, maybe too naive — to see the damage it was doing.

There are similar tales all across tech-land.

Banking apps let us pay our friends instantly; they also let criminals steal from us instantly. Talk to banks about this, and you can almost hear the mad-scientist approach. (I hear “Well, consumers really should protect themselves” as “We can’t let a few victims get in the way of progress!”)

Cell phone companies have created amazing products. And now, we know, they also make it easy for law enforcement to track us.

There’s a cynical way to view this, of course. Facebook is only concerned with making money, Google doesn’t really care about making the world a better place, just making its balance sheet a better place. If you believe that, I’m not trying to talk you out of it.  Corporations are people after all, our Supreme Court says, and greedy people at that. It’s illegal for them to act otherwise; it would be negligent not to maximize shareholder value.

I’ve spent 20 years talking to people in the tech industry, however, and there are plenty of folks in it who don’t think that way. I think most folks in tech who fail us are better described as naive Utopians than as greedy bastards.

In the coming months, I’ll be working on a new set of initiatives around this notion. The effort really started this year with the re-release of Gotcha Capitalism. My podcast “Breach” is also part of this. So are some new audio projects I’m working on. I’m being vague because I have to be, for now. You might see a bit of a slowdown in posts as I ready these projects, but rest assured, I’m on the beat.

In the new introduction to Gotcha Capitalism, I sum up what I feel is the civil rights issue of our time: Big Data being used against consumers. It fits the Failed Utopia model to a T. Folks wanted to remove the human element — often susceptible to racial and other forms of bias — from important decisions in realms like credit and criminal punishment. So credit scores are now used to grant mortgages, and formulas are used in sentencing decisions. Unfortunately, as my Dad taught me in the 1970s, “Garbage In, Garbage Out” is still the primal rule of computing. Algorithms can suffer from bias, too. What makes this scary, however, is that many folks haven’t woken up to this fact yet. Just as, once upon a time, people believed that photographs couldn’t lie, today many blindly believe that data can’t lie.

It can, and does. More important, in the wrong hands, data can be abused.  So now we have the even-worse story of a powerful tool built by a Utopian falling into the wrong hands and being abused by an evil genius.

This is the story of tech today.

I’m hardly the only one who recognizes this. Organizations like the Center for Humane Technology are springing up all over. This is promising. But the forces aligned against such thoughtful use of tech are powerful, and billions of dollars are at stake. Sometimes, it can feel like the onslaught of tech’s takeover is a force of nature, like gravity. Just ask anyone who’s ever tried to convince a startup to think about security or privacy while it’s racing to release new features.

Not unlike someone racing to invent a cure, side effects be damned.

I hope you’ll join me in this effort. Little things mean a lot — such as this woman’s suggestions for getting people to put down their smartphones when she wants to talk.  Mere awareness of the issue helps a lot. Think about how much news you get from Facebook or Twitter today compared to five years ago. Would your high school civics teacher be proud?

When tech is released into the world, side effects like privacy and security issues shouldn’t be an afterthought. They should be considered and examined with all the rigor that the medical profession has long practiced. That’s how we’ll make sense out of our future.