Handle with Care: Protecting Sensitive Data in Microsoft SharePoint, Collaboration Tools and File Share Applications

Larry Ponemon

With the plethora of collaboration and file sharing tools in the workplace, the risk of data leakage due to insecure sharing of information among employees and third parties is growing. As discussed in this report, Handle with Care: Protecting Sensitive Data in Microsoft SharePoint, Collaboration Tools and File Share Applications in US, UK and German Organizations, sponsored by Metalogix, although security concerns about the use of collaboration and file sharing tools are high, companies are not taking sufficient steps to protect their sensitive data.

Without appropriate technologies, data breaches in the SharePoint environment can go undetected. Almost half of respondents (49 percent) say their organizations have had at least one data breach in the SharePoint environment in the past two years. A further 22 percent of respondents believe their organization likely had a data breach but cannot know this with certainty.

This research reveals that employees frequently and accidentally share files or documents with other employees or third parties not authorized to receive them. Employees also receive content they should not have access to, or fail to delete confidential materials as required by policy.

Although respondents express concern about the risk of a data breach stemming from use of collaboration and file sharing technologies, they are struggling to meet the challenge using their existing security processes and tools. Seventy percent of organizations believe that if their organization had a data breach involving the loss or theft of confidential information in the SharePoint environment they would only be able to detect it some of the time or not at all.

Most companies are not taking steps to reduce the risk through training programs, routine security audits or deployment of specific technologies that discover where sensitive or confidential information resides and how it is used. The survey found that important data governance practices are not in place for collaboration applications in general, and that when it comes to SharePoint specifically, security tools and practices are even more lacking.

We surveyed 1,403 individuals in the US, UK and Germany who are involved in ensuring the protection of confidential information. Respondents work in IT and IT security as well as lines of business in a variety of industries. On average, respondents say they spend approximately 28 percent of their time on the protection of documents and other content assets in SharePoint.

All companies represented in this research use SharePoint solutions for sharing confidential documents and files. Other solutions include Office 365 and cloud-based services such as Dropbox and/or Box. Other means of collaboration include shared network drives and other file sync and share solutions.

Key findings

In this section, we provide a deeper analysis of the findings. The complete audited findings are presented in the Appendix of this report. The report is organized according to the following seven topics:

  1. Sensitive content within the organization
  2. Risky user behavior
  3. Lack of collaboration in security and governance practices and tools
  4. Challenges in controlling risks in the SharePoint environment
  5. Country differences: United States, United Kingdom and Germany
  6. Industry differences
  7. Conclusions and recommendations


  1. Sensitive content within the organization

 Not knowing who is sharing sensitive data or where such data is stored increases the likelihood of a breach — 63 percent say the inability to know where sensitive data resides represents a serious security risk. Further, only 34 percent of respondents say their organizations have clear visibility into what file sharing applications are being used by employees at work.

These findings demonstrate the need for automated technologies that enable organizations to discover and classify sensitive or confidential information and monitor how it is used.
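As a rough illustration of what such discovery-and-classification tooling does, here is a minimal sketch. It is not any vendor's product: the regex patterns, category names and file names are hypothetical, and real tools add validation (for example, Luhn checks on card numbers) to cut down false matches.

```python
import re

# Hypothetical patterns; a production discovery tool would use many more,
# plus validation steps, to reduce false positives.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text):
    """Return the set of sensitive-data categories detected in a document."""
    return {name for name, rx in PATTERNS.items() if rx.search(text)}

def scan(documents):
    """Map document name -> detected categories, keeping only documents
    that contained something sensitive."""
    report = {name: classify(body) for name, body in documents.items()}
    return {name: cats for name, cats in report.items() if cats}
```

For example, `scan({"hr.txt": "Employee SSN 123-45-6789", "memo.txt": "lunch at noon"})` would flag only `hr.txt`, giving an organization a first map of where sensitive content resides.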

  2. Risky user behavior

Employee and third party use of SharePoint are greater security concerns than external threat agents.

The pressure to be productive sometimes causes individuals to put sensitive data at risk. Negligent employees are inviting data loss or theft by accidentally exposing information (73 percent of respondents). Eighty-four percent of respondents are worried about third parties having access to data they should not see. Based on the findings, third parties and negligent insiders are more worrisome than external hackers (28 percent of respondents) or malicious employees (19 percent of respondents).

  3. Lack of collaboration in security and governance practices and tools

 Despite the volume of sensitive content stored in collaboration and file sharing tools and the acknowledgement of risky employee behavior, respondents do not have sufficient policies or security tools in place to prevent either accidental exposure or intentional misuse of information.

Only 28 percent of respondents rate their organizations as being highly effective in keeping confidential documents secure in the SharePoint environment. Consequently, as reported previously, almost half of respondents (49 percent) report their companies had at least one data breach resulting from the loss or theft of confidential information in the SharePoint environment in the past two years and 22 percent of respondents say they are not aware of a data breach, but one is likely to have occurred.

  4. Challenges in controlling risks in the SharePoint environment

If companies are aware of the risk of data breaches due to insecure collaboration and they don’t believe their current approaches are sufficient to keep content safe, what is preventing them from deploying more effective security solutions?

 A lack of integration is the biggest challenge to reducing SharePoint security risks.

Seventy-nine percent of respondents say they do not have the right tools in place to support the safe use of sensitive or confidential information assets in SharePoint. They believe their tools are only somewhat effective (41 percent of respondents) or not effective (49 percent of respondents), or they do not have enough information to know (10 percent of respondents).

  5. Country differences: United States, United Kingdom and Germany

The study identifies clear differences in attitudes and behaviors related to file sharing and collaboration tools among respondents in the United States (US), United Kingdom (UK) and Germany. As shown in Figure 17, German respondents are less concerned than US or UK respondents about the potential for security breaches in their SharePoint environment, regardless of whether the source of the breach is internal or external to their organization.

  6. Industry differences

 In addition to differences among respondents in the different countries represented in this research, we provide an analysis of respondents in nine different industries in the study. Two industries of particular interest are financial services and health and pharma.

Consistent with previous studies conducted by Ponemon Institute, financial services seems to be most effective in dealing with security vulnerabilities. Awareness of information security concerns is consistently high in the financial services industry. A possible reason is that the myriad of compliance requirements obliges financial services companies to invest in security tools and develop governance processes at a higher rate than other industries. Typically, financial services companies employ a larger security team with a more diverse set of skills.


7. Conclusions and recommendations

 Despite evidence of data breaches and the increasing pressure from regulators, customers and shareholders to protect confidential data from accidental exposure, companies in this study do not seem to be taking security in file sharing and collaboration environments as seriously as they should.

Following are recommendations for creating a more secure environment for sensitive content.

  • Use automated tools to improve the organization’s ability to discover where sensitive or confidential information resides within SharePoint, file sharing and collaboration tools.


  • Instead of relying upon document owners to classify sensitive or confidential information, use automated tools to improve the ability to secure data in the SharePoint environment. Assign centralized accountability and responsibility for securing documents and files containing confidential information to the department with the necessary expertise, such as IT security.


  • Be aware that personnel and organizational changes can trigger security vulnerabilities. According to respondents, negligent or malicious behaviors can occur when employees leave the organization or there is downsizing. Consider the use of automated user access history with real time monitoring.


  • Conduct meaningful training programs that specifically address the consequences of negligent or careless file sharing practices. These types of behaviors include keeping documents or files no longer needed; receiving and not deleting files and documents not intended for the recipient; forwarding confidential files or documents to individuals not authorized to receive them; using personal or unauthorized file sharing apps to exchange confidential documents and files in the workplace; and sending confidential files or documents to unauthorized individuals outside the organization.


  • Address the risks created by third parties, contractors and temporary workers by monitoring and restricting their access to sensitive or confidential information.


  • Have policies that restrict or limit the sharing of confidential documents and enforce those policies, especially to reduce the risks associated with allowing workers to have confidential information on their home computers and devices.


  • Conduct audits to uncover security vulnerabilities and non-compliant sharing and access practices among employees and third parties. The research shows that such audits can reveal security vulnerabilities in the protection of confidential documents and files.
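The access-monitoring and audit recommendations above can be sketched in a few lines. This is a hypothetical illustration, assuming access events have already been exported from the platform's audit log; the user names, file names and dates are invented.

```python
from datetime import datetime

def audit_access(access_log, terminated):
    """Flag any access event that occurred after the user's termination date.

    access_log: iterable of (user, document, timestamp) tuples.
    terminated: mapping of user -> termination timestamp.
    """
    findings = []
    for user, doc, when in access_log:
        end = terminated.get(user)
        # Access by a departed user is exactly the personnel-change
        # vulnerability the respondents describe.
        if end is not None and when > end:
            findings.append((user, doc, when))
    return findings
```

Run periodically against the access history, a check like this turns the abstract recommendation "monitor user access after organizational changes" into a concrete, reviewable list of findings.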

Download the full report, with accompanying infographics, at this link.

WannaCry a symptom of much deeper problems

Bob Sullivan

For a long time, many health care providers have been worried about the wrong thing — compliance rather than patient safety. Last week, we saw the most frightening example yet of the devastating consequences.

So far, one of the worst cyberattacks in recent memory has hit computers in 150 countries, Europol said, with WannaCry encrypting files and demanding ransom from victims. The software can run in 27 different languages, according to U.S. cybersecurity officials.

“Our emergency surgeries are running doors open, we can access our software but ransomware window pops up every 20-30 seconds so we are slow,” wrote @fendifille in a post about the attack from a U.K. medical center. 

A feared second spike of attacks from the WannaCry ransomware virus didn’t materialize on Monday, but there’s still plenty to worry about. New variants of the malware have been released, others are most certainly under development, and a Twitter account logging ransom payments shows victims are indeed coughing up roughly $300 in bitcoins to recover their files. As of Monday morning, payments totaled just over $50,000 — tiny compared to the damage caused, but a tidy sum for the criminals. Meanwhile, the required ransom jumped to $600 this week, according to security firm F-Secure.

A confluence of events led to discovery of and then spread of the devastating malware. The technology behind WannaCry was actually developed by the National Security Agency in the U.S., then stolen by hackers using the moniker Shadow Brokers. It attacks unpatched Microsoft Windows computers. Most modern Windows PCs were automatically updated to prevent the exploit, but older computers — those running Windows XP, for example — are no longer routinely supported by Microsoft. Many of those were unpatched, and an easy mark for WannaCry.

U.K. hospitals had thousands of these older machines; that’s why the virus hit hard there. I’ve reported earlier on why health care providers often have older computers. Many run single tasks, and are rarely updated, or even noticed, by IT staff.

Spread of the malware slowed for a variety of reasons during the weekend (including this heroic effort by a security researcher). But as workers returned Monday morning, a fresh round of infections was possible, authorities warned.

“It is important to understand that the way these attacks work means that compromises of machines and networks that have already occurred may not yet have been detected, and that existing infections from the malware can spread within networks,” wrote the U.K.’s National Cyber Security Centre. “This means that as a new working week begins it is likely, in the UK and elsewhere, that further cases of ransomware may come to light, possibly at a significant scale.”

Microsoft has now offered security patches for older Windows machines, and technicians have spent the weekend racing to update those computers.

The real legacy of WannaCry will be the malware’s government-based origins. During the weekend, Microsoft called out the NSA for researching and hiding vulnerabilities, comparing the incident to the theft of a U.S. missile.

“This attack provides yet another example of why the stockpiling of vulnerabilities by governments is such a problem. This is an emerging pattern in 2017,” Brad Smith, Microsoft’s president and chief legal officer, wrote in a blog post. “We have seen vulnerabilities stored by the CIA show up on WikiLeaks, and now this vulnerability stolen from the NSA has affected customers around the world. Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage. An equivalent scenario with conventional weapons would be the U.S. military having some of its Tomahawk missiles stolen.”

Does NSA bug hunting (and hoarding) make the world safer, or more dangerous?  WannaCry certainly hints at the answer.


Corporate cyber-resilience — bad and getting worse

Larry Ponemon

Resilient, an IBM Company, and Ponemon Institute are pleased to release the findings of the second annual study on the importance of cyber resilience for a strong security posture. In a survey of more than 2,000 IT and IT security professionals from around the world, only 32 percent of respondents say their organization has a high level of cyber resilience—down slightly from 35 percent in 2015. The 2016 study also found that 66 percent of respondents believe their organization is not prepared to recover from cyber attacks.

In the context of this research we define cyber resilience as the alignment of prevention, detection and response capabilities to manage, mitigate and move on from cyberattacks. It refers to an enterprise’s capacity to maintain its core purpose and integrity in the face of cyberattacks. A cyber resilient enterprise is one that can prevent, detect, contain and recover from a myriad of serious threats against data, applications and IT infrastructure.

Cyber resilience supports a stronger security posture. In this report, we look at those organizations that believe they have achieved a very high level of cyber resilience and compare them to organizations that believe they have achieved only an average level of cyber resilience. This comparison reveals that high level cyber resilience reduces the occurrence of data breaches, enables organizations to resolve cyber incidents faster and results in fewer disruptions to business processes or IT services. The research also shows that a cybersecurity incident response plan (CSIRP) applied consistently across the entire enterprise with senior management’s support makes a significant difference in the ability to achieve high level cyber resilience.

Despite its importance for cyber resilience, the research demonstrates the continued challenges to implementing a CSIRP. Seventy-five percent of respondents admit they do not have a formal CSIRP applied consistently across the organization. Of those with a CSIRP in place, 52 percent have either not reviewed or updated the plan since it was put in place or have no set plan for doing so. Additionally, 41 percent of respondents say the time to resolve a cyber incident has increased in the past 12 months, compared to only 31 percent of respondents who say it has decreased.

Key components of cyber resilience are not improving. The key components of cyber resilience are the ability to prevent, detect, contain and recover from a cyber attack. As shown in Figure 1, respondents’ confidence in achieving these components has changed very little since last year’s study.

Last year, 38 percent of respondents rated their ability to prevent a cyber attack as high; this year 40 percent of respondents rated their ability to prevent a cyber attack as high.

Confidence in the ability to quickly detect and contain a cyber attack increased slightly from 47 percent of respondents to 50 percent of respondents and from 52 percent of respondents to 53 percent of respondents, respectively.

Confidence in the ability to recover from a cyber attack declined slightly. Last year, 38 percent of respondents rated their ability as high and this year, only 34 percent of respondents rate their ability as high.

Other key research findings

Investments in training, staffing and managed security services providers improve cyber resilience. In the past 12 months, only 27 percent of respondents say their cyber resilience has significantly improved (9 percent of respondents) or improved (18 percent of respondents). These respondents say if cyber resilience improved it was due to an investment in training of staff (54 percent of respondents) or engaging a managed security services provider (42 percent of respondents).

Business complexity is having a greater impact on cyber resilience. However, insufficient planning and preparedness remain the biggest barriers to cyber resilience. In 2015, 65 percent of respondents said insufficient planning and preparedness was the biggest barrier. This increased to 66 percent in 2016.

Complexity is having a greater impact on cyber resilience. In 2015, 36 percent of respondents said the complexity of IT processes was a barrier to a high level of cyber resilience and this increased significantly to 46 percent of respondents in 2016. More respondents also believe that the complexity of business processes has increased (47 percent of respondents in 2015 and 52 percent of respondents in 2016).

Incident response plans often do not exist or are ad hoc. Seventy-nine percent of respondents rate the importance of a CSIRP with skilled cybersecurity professionals as very important, and more organizations represented in this research have a CSIRP. However, only 25 percent of respondents say they have a CSIRP that is applied consistently across the enterprise (yet this does represent an increase from 18 percent in 2015). Similarly, the percentage of respondents who say their organizations do not have a CSIRP declined from 31 percent to 23 percent of respondents.

Cyber resilience is affected by the length of time it takes to respond to a security incident. Forty-one percent of respondents say the time to resolve a cyber incident has increased significantly (16 percent of respondents) or increased (25 percent of respondents). Only 31 percent of respondents say the time to resolve has decreased (22 percent of respondents) or decreased significantly (9 percent of respondents).

Human error is the top cyber threat affecting cyber resilience. When asked to rate seven IT security threats that may affect cyber resilience, the biggest threat is human error, followed by advanced persistent threats (APTs). Seventy-four percent of respondents say the incidents experienced involved human error. IT system failures and data exfiltration were also significant according to 46 percent of respondents and 45 percent of respondents, respectively.

Malware and phishing are the most frequent compromises to an organization’s IT networks or endpoints. Forty-four percent of respondents say disruptions to business processes or IT services as a consequence of cybersecurity breaches occur very frequently (16 percent of respondents) or frequently (28 percent of respondents).

A lack of resources and no perceived benefits are reasons not to share. Why are some companies reluctant to share intelligence? The 47 percent of respondents who do not share threat intelligence say it is because there is no perceived benefit (42 percent of respondents), there is a lack of resources (42 percent of respondents) or it costs too much (33 percent of respondents).

Senior management’s perception of the importance of cyber resilience has not changed. A trend that has not improved is the recognition of how cyber resilience affects revenues and brand reputation. In 2015, 52 percent of respondents said their leaders recognize that cyber resilience affects revenues and this declined slightly to 47 percent in 2016. Similarly, in 2015, 43 percent of respondents said cyber resilience affects brand reputation, and this stayed virtually the same in 2016 (45 percent of respondents). Almost half (48 percent of respondents) recognize that enterprise risks affect cyber resilience, a slight increase from 47 percent of respondents in 2015.

Funding increases slightly for cybersecurity budgets. In 2015, the average cybersecurity budget was $10 million. In 2016, this increased to an average of $11.4 million. More funding has been allocated to cyber resilience-related activities. In 2015, 26 percent of the IT security budget was allocated to cyber-resilience activities. This increased to 30 percent in 2016.

Global privacy regulations drive IT security funding. When asked about regulations that drive IT security funding, most respondents believe it is the new EU General Data Protection Regulation (51 percent of respondents) or international laws by country (50 percent of respondents). Only 22 percent of respondents rate their organization’s ability to comply with the EU General Data Protection Regulation as high.

To read the rest of this report, visit ResilientSystems.com

Hacked Dallas sirens, maintained by office furniture movers, show U.S. not serious about critical infrastructure

We’d better not ignore these sirens.

Bob Sullivan

It’s tempting to ignore the warning sirens that blared Dallas out of bed Saturday night — but that would be a very serious mistake.

We hear so much about the importance of securing America’s critical infrastructure systems. Then you find out that the company responsible for maintaining the Dallas outdoor warning siren network — the one that was hacked Saturday night — is also an office furniture moving company.

In case you missed it, Dallas’s outdoor sirens screeched continuously overnight Saturday, harassing many of the city’s residents with the ultimate false alarm. The blaring was initially believed to be a malfunction, but by Sunday city officials conceded it was a hack.

The sirens are supposed to warn residents about immediate danger, like tornadoes.

They did their job.

America just received perhaps the clearest warning ever that our essential services are comically easy to attack, putting our citizens in serious peril.  Will we listen, or just go back to sleep?

One can’t say it any plainer: When bricks start falling off a bridge into the water, you fix the bridge.  (Maybe.) That’s what we have here.

No one died Sunday morning. There was no blood, so there weren’t any dramatic pictures.  But there will be. It doesn’t take much imagination to see how easily this hacker prank (or, was it a test?) could have gone very wrong. For starters, it served as a denial of service attack on the city’s 911 system, which was overwhelmed with calls.

More than 4,400 911 calls were received from 11:30 p.m. to 3 a.m., the city said.  About 800 came right after midnight, causing wait times of six minutes. As far as we know, no one died because of this.  But that could have happened.

But that’s only the tip of the iceberg. Security experts I’ve chatted with have warned for years of a hybrid attack that could easily cause panic in a big city. Imagine if this hack had been combined with a couple of convincing fake news stories suggesting there was an ongoing chemical attack on Dallas.  Without firing a shot, you could easily see real catastrophes.  Take it a step further, and combine it with some kind of physical attack, and you have a serious, long-lasting incident on your hands. Death, followed by massive confusion, then panic, then a bunch of sitting ducks stuck in traffic.

Playing the “what…if” game sometimes leads to exaggeration. But it is called for when someone is about to ignore a warning sign.  So I asked security consultant Jeff Bardin of Treadstone71 to tell me why the Dallas incident should be taken seriously.

For one, it could have been a diversionary tactic.

“Testing the emergency systems, getting to a ‘cry wolf’ state of affairs, getting authorities into a full state of chaos and confusion while hacking and penetrating something else.  Kansas City shuffle,” he said.  “This looks to me to be a test of the systems. Could also be more than a test meaning what was hacked during this fake emergency?”

Dallas has been hit by “prank” hacks before.  Last year, traffic signs were hijacked to display funny messages like “Work is Canceled — Go Back Home.”  Very funny. But this means we know the city’s systems are being actively probed.  Any intelligent person has to consider what other systems this person or gang has toyed with. And, more important, what other cities have they toyed with.

“If I as a hacker can control the emergency systems, alarms, building security, HVAC, traffic lights, first responder system, medical facility interfaces, law enforcement, etc., all the normal physical systems that now have internet interfaces, I can control the whole of the city,” Bardin said. “What else was penetrated during this ‘test?’  How many other major cities in the US operate the same way? What was injected into these systems during the test for later access?”

Hopefully, the Dallas siren hacker is a kid who found flaws in a very old, insecure system and had some fun for a night, Bardin said. Perhaps it was someone trying to “prove a point,” though in a careless way that put lives in danger.

Point not made.  Life is full of disasters averted, then ignored. The planes that almost collided. The car accident narrowly averted. The key that was lost (without a duplicate!) but is found.

It’s 48 hours after a major U.S. city had its sirens blaring all night long. Are you hearing about federal investigations? Are you hearing about executive orders around critical infrastructure? (You did. But then, you didn’t.)

“Amazing this is not getting headlines,” Bardin said. “Not amazing that they have the uninitiated managing the systems who have a side job in furniture. Perfect. Just f**ing perfect.”

As for the furniture-moving company behind the sirens, it’s probably unfair to blame them.  The Dallas Morning News reported that Michigan-based West Shore Services was in charge of maintaining the system.

Indeed, here is the resolution from the city council back in 2015 authorizing payment of $567,000 to West Shore during a six-year period.  Yup, that’s around $100,000 annually, for repair and maintenance. And that’s a MAXIMUM.  I suspect it includes the price of replacing broken equipment. I’d think it doesn’t include penetration testing. I’m sure it doesn’t include overhauling the system from its old, practically indefensible architecture.

No wonder the firm needs a side business.

An operations manager for West Shore told The Dallas Morning News he didn’t know anything about the incident. The firm didn’t respond to my questions sent via email.

But the biggest question of all:  Will anyone hear this warning siren? Or will we all go back to sleep, like Dallas did?

UPDATE 6:30 p.m. 4/10/17 – Federal Signal Corporation, which made the Dallas sirens but does not currently manage them, said it was working with authorities to determine what happened.

“The City of Dallas, Texas, has multiple outdoor warning sirens installed across the Dallas area. The outdoor warning sirens were manufactured by and purchased from Federal Signal Corporation …  Although, Federal Signal does not currently have the contract to maintain the City of Dallas outdoor warning siren system, the company is actively working with the Dallas Office of Emergency Management to determine the cause of the unintended activation,” the firm said in a statement emailed to me.

Dallas Mayor Mike Rawlings seemed to get it, and called for serious investment in the wake of the attack.

“This is yet another serious example of the need for us to upgrade and better safeguard our city’s technology infrastructure,” he wrote on his Facebook page. “It’s a costly proposition, which is why every dollar of taxpayer money must be spent with critical needs such as this in mind. Making the necessary improvements is imperative for the safety of our citizens.”

Let’s hope someone listens, and those sirens are heard far outside Texas.

When Seconds Count: How Security Analytics Improves Cybersecurity Defenses

Larry Ponemon

When Seconds Count: How Security Analytics Improves Cybersecurity Defenses sponsored by SAS Institute was conducted to evaluate organizations’ experiences with security analytics solutions. Specifically, how have these solutions impacted organizations’ security postures? Where have security analytics initiatives succeeded or encountered roadblocks?  

Ponemon Institute surveyed 621 IT and IT security practitioners who are familiar and involved with security analytics in their organizations. Eighty-seven percent of these respondents have personally been using the security analytics solution in their organizations, and 80 percent of these organizations have solutions that are fully implemented.

Although many respondents cite deployment challenges, they still believe security analytics has been effective. They report a major improvement in reducing the number of false positives in the analysis of anomalous traffic. Before implementation, 80 percent of respondents say it was very difficult to reduce false positives. After implementation, only one-third of respondents say reducing the number of false positives is very difficult.
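To make the false-positive contrast concrete, here is a toy sketch of the per-entity baselining idea behind many analytics tools. It is not SAS's actual method or any particular product: the host names, counts and threshold are illustrative, and real solutions combine many more signals.

```python
import statistics

def build_baseline(history):
    """Per-host baseline (mean, population std dev) of historical
    daily connection counts."""
    return {
        host: (statistics.mean(counts), statistics.pstdev(counts))
        for host, counts in history.items()
    }

def is_anomalous(baseline, host, count, threshold=3.0):
    """Flag only traffic far outside the host's own baseline, instead of
    applying one static threshold to every host, which is what inflates
    false positives."""
    mean, stdev = baseline[host]
    if stdev == 0:
        return count != mean
    return (count - mean) / stdev > threshold
```

Under a single static threshold, a busy server would trip alerts constantly; scoring each host against its own history is one simple way analytics tooling cuts the false positives the respondents describe.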

Key findings

In this section of the report, we provide an analysis of the findings. The complete audited research results are presented in the Appendix of this report. We have organized the report according to the following topics.

  • Organizations’ security analytics experiences
  • Results of organizations’ security analytics initiatives
  • The future of security analytics: the integrated security intelligence platform
  • Tips for successful security analytics initiatives

Organizations’ security analytics experiences

Most organizations adopt security analytics after an attack. As shown in Figure 2, 68 percent of respondents say the main driver to implement a security analytics solution was a cyber attack or successful intrusion and 53 percent of respondents say their organization was concerned about becoming a victim of a cyber attack or successful intrusion. Only 33 percent of respondents say their organizations are proactive and regularly update their cyber defenses with new technologies.


Organizations use a variety of security analytics solutions, but in-house developed tools are most popular. According to Figure 3, 50 percent of respondents use in-house developed tools with a data lake, followed by 47 percent of respondents who use a Security Information and Event Management (SIEM) solution. Thirty-nine percent of respondents say their solution is delivered and managed by a third party.

Security analytics solutions are most often deployed both on premises and in the cloud (40 percent of respondents). Thirty-three percent of respondents say the solution is deployed on premises and 23 percent say it is deployed in the cloud.


Most respondents say the initial deployment of security analytics was challenging. Fifty-six percent of respondents say it was very difficult (26 percent) or difficult (30 percent) to deploy security analytics.

According to Figure 4, 67 percent of respondents who feel the deployment was difficult cite extensive configuration and/or tuning before it was usable. Fifty-one percent of respondents felt there was too much data to deal with and 45 percent of respondents say they had issues getting access to the required data.

Data is a critical component of security analytics initiatives. According to Figure 5, 65 percent of respondents say data challenges are a barrier to success followed by lack of in-house expertise (58 percent of respondents) and insufficient technologies (50 percent of respondents).

Only 40 percent say insufficient resources is a challenge. The findings reveal the average cybersecurity budget is $12.5 million and an average of 22 percent of this budget is earmarked for big data analytics.

The quality of data collected and used for security analytics is the biggest data challenge. As shown in Figure 6, 66 percent of respondents say data quality is an issue followed closely by the ability to integrate data (65 percent of respondents) and data volume (55 percent of respondents). Only two percent of respondents say they have no data challenges.

Most organizations are looking to security analytics to learn what is happening in their networks now. Each one of the objectives listed in Figure 7 is considered important. Seventy-two percent of respondents say it is important or essential to be able to detect security events in progress followed by the ability to determine the root cause of past security events or forensics (69 percent of respondents).

Also important are to: provide advance warning about potential internal threats and attackers (65 percent of respondents), provide advance warning about potential external threats and attackers (62 percent of respondents), prioritize alerts, security threats and vulnerabilities (62 percent of respondents) and analyze logs and/or events (61 percent of respondents).

To read the rest of this research, visit the SAS website.

Howard Schmidt, America's digital guardian angel, served as cyberczar to two Presidents — a memorial

Howard Schmidt

Bob Sullivan

Howard Schmidt had an incredible American life.  He was cyberczar to two presidents – a Republican and a Democrat.  Before that, he ran security at Microsoft, and later practically rescued eBay when it was turning into a cesspool of fraud.  He was a soldier (Air Force, then the Army Reserves), a cop (in Arizona), a genius, and a gentleman. He was one of the first law enforcement officers in America to understand how computers could be used to catch criminals.  He won a Bronze Star in Vietnam. He was an in-demand speaker everywhere on the planet.  I saw him dazzle crowds everywhere from Seattle to Romania.

But I knew him as the guy who always wanted to help. Everyone, all the time.

He died today, “in the presence of his wife and four sons…a loving husband, father and grandfather peacefully passed away following a long battle with cancer,” according to a statement posted on his Facebook page.

I first met Howard Schmidt in the late 1990s when he was the big-deal keynote speaker at a computer conference I had attended as a cub reporter.  I was a nobody. But good fortune had us both stranded in an airport when our flights were canceled, both trying to get back to Seattle. I worked up the courage to talk to him in the waiting area about our options for getting home.  When we ended up on the same flight, and he discovered I wasn’t traveling in first class, he stopped me at the gate.


“No colleague of mine sits in the back while I sit up front,” he said, a kindness so genuine I never forgot the tone of voice he used.  He upgraded me to first class so we could sit together.  During the next three hours, I enjoyed a graduate-school class in cyber-security as I picked his brain about everything.

Howard was a natural giver.

The most important thing to know about Howard is that the job of White House cyberczar is awful.  All the responsibility, none of the power.  Herding cats. Pick your cliché.  Making America’s computers secure is the job of private industry. They own all the hardware; they write all the software; they hire all the best people.  All a government official can do is “coordinate.”  Cajole. Beg and plead.  It sounds like a glamorous job. In fact, the pay stinks, compared to what someone like Howard could earn in the tech world. And it’s kind of humiliating to go around begging companies to share what they know about hackers.

But it had to be done. Howard was always doing what had to be done.

Along the way, he always took my calls.  He would message me from half-way around the world, and apologize if it took him 10 hours to get back to me.  Sometimes, he even dragged me along, as in the case of a banking security conference in Bucharest where Howard and I both spoke. A few years later, I ended up getting a plum invitation to speak in Malta at a similar conference. Howard never admitted it, but I’m virtually sure he set me up for the gig because it was one of the few times he had to turn something down.

Whenever we spoke, I would get tired just hearing about Howard’s grueling travel schedule. When he finally started to slow down, he spent his last years traveling, of course…this time via motorcycle. Sometimes to see America’s beauty, but mostly to see his grandchildren.

“Ride my bikes as much as possible in Milwaukee…our second home (grandkids),” he messaged me once.

Howard was always interested in what I was doing, and cheered me on as I had some success writing books. So it was natural that the day he retired from the White House, we chatted about doing a book together.

“I get approached all the time about doing one,” he said.

“Let’s chat some time and see if there isn’t a good fit? Before the months disappear,” I pleaded.  It was one of those conversations we never finished, one of those dream projects that you never get to.

I didn’t know Howard was sick until recently.  I reached out to him when President Donald Trump *almost* signed an executive order on cybersecurity. If anyone could make sense of it, he could.  I messaged him on Facebook.

“Hi Bob, This is Howard’s wife,” the response came. “Howard is fighting a brain tumor and apologizes for not being able to help.”

I was stunned.  But also, not stunned. I could picture Howard lying there, as ill as a human being can be, apologizing because he couldn’t help.  Perhaps the words he used suggested he meant “help you with your story.” But I know what he really meant:  he felt badly he couldn’t help the country.

I said I would pray for him and asked if there was anything I could do. Then, true to form, he tried once more.

“Howard said he will call in a little while,” his wife wrote to me.

He never did call; I figured he’d had a bad day and I didn’t want to be a pest.  I’m so sad it was my last chance.  Let me tell you: I am much more sorry that Howard was unable to help us this one last time. Heaven knows we need it.

I’ll console myself with the thought that Heaven’s networks are much more secure now, and the Devil is no longer spreading viruses up there.

Like all women and men who work in the protection field — computer security people, health department inspectors, fire marshals — Howard spent a lifetime toiling tirelessly and invisibly, saving people from dangers they never knew existed.  Countless crushing hacker attacks didn’t happen because of Howard’s work.  He was America’s digital guardian angel for many decades. In fact, his work lives on, and you will continue to enjoy the protections from policies that Howard created and pushed for years, if not decades.

Now, he’s a real Guardian Angel. I suspect we’ve yet to see his best work.


Survey: Half of small firms hit by ransomware, paid an average $2,500 in 'ransom'

Larry Ponemon

We are pleased to present the findings of The Rise of Ransomware, sponsored by Carbonite, a report on how organizations are preparing for and dealing with ransomware infections. As of September 2016, the Justice Department reported more than 4,000 ransomware attacks daily since January 1, 2016. This is a 300-percent increase over the approximately 1,000 attacks per day seen in 2015.

You can read the full research at Carbonite.com.  Here is a summary:

We surveyed 618 individuals in small to medium-sized organizations who have responsibility for containing ransomware infections within their organization. These individuals, as revealed in this study, dread a ransomware infection and many of them (59 percent of respondents) would rather go without WiFi for a week than deal with a ransomware attack. Furthermore, 77 percent of respondents believe that those who unleash ransomware should pay for the crime. Specifically, 47 percent of respondents say criminals should face criminal prosecution and 27 percent of respondents say they should be subject to civil prosecution.

There is a significant gap between the perceptions of the seriousness of the threat and the ability of a company to prevent ransomware in the future. While 66 percent of respondents rate the threat of ransomware as very serious, only 13 percent of respondents rate their companies’ preparedness to prevent ransomware as high.

Fifty-one percent of companies represented in this research have experienced a ransomware attack. The following explains how these companies were affected.

  • Companies experienced an average of 4 ransomware attacks and paid an average of $2,500 per attack.
  • If companies didn’t pay ransom, it was because they had a full and accurate backup. Respondents also believe a full and accurate backup is the best defense.
  • Companies suffered financial consequences such as the need to invest in new technologies, the loss of customers and lost money due to downtime.
  • Cyber criminals were most likely to use phishing/social engineering and insecure websites to unleash ransomware. Respondents believe the cyber criminal specifically targeted their company.
  • Compromised devices infected other devices in the network. Very often, data was exfiltrated from the device.
  • Companies were reluctant to report the incident to law enforcement because of concerns about negative publicity.

Following are the key takeaways from this research.

Many companies think they are too small to be a target. Perceptions about the likelihood of an infection affect ransomware prevention and detection procedures. Fifty-seven percent of respondents believe their company is too small to be a target of ransomware and, as a result, only 46 percent of respondents believe prevention of ransomware attacks is a high priority for their company. Even though it is not a high priority, 59 percent of respondents believe a ransomware attack would have serious financial consequences for their company, and 53 percent would consider paying a ransom if their company’s data were lost (the complement of the 47 percent who say they would never pay a ransom).

 Current technologies are not considered sufficient to prevent ransomware infections. Only 27 percent of respondents are confident their current antivirus software will protect their company from ransomware. There is also concern about how the use of Internet of Things connected devices will increase their risk of ransomware.

Inability to detect all ransomware infections puts companies at risk. According to 44 percent of respondents, an average of one or more ransomware infections go undetected per month, bypassing their organization’s IPS and/or AV systems. However, 29 percent of respondents say they cannot determine how many ransomware infections go undetected in a typical month.

 One or more ransomware attacks are believed to be possible in the next 12 months. Sixty-eight percent of respondents believe their company is very vulnerable (30 percent) or vulnerable (38 percent) to a ransomware attack. Relative to other types of cyber attacks, 67 percent of respondents say ransomware is much worse (35 percent) or worse (32 percent).

The severity and volume of ransomware infections have increased over the past 12 months. Sixty percent of respondents say the volume or frequency of ransomware infections has significantly increased (22 percent) or increased (38 percent). Fifty-seven percent say the severity of ransomware infections has significantly increased (18 percent) or increased (39 percent) over the past 12 months. In a typical week, the companies represented in this research experienced an average of 26 ransomware alerts. An average of 47 percent of these alerts are considered reliable.

 Negligent and uninformed employees put companies at risk. Fifty-eight percent of respondents say negligent employees put their company at risk for a ransomware attack. Only 29 percent of respondents are very confident (9 percent) or confident (20 percent) their employees can detect risky links or sites that could result in a ransomware attack.

 To prevent ransomware infections, employees need to become educated on the ransomware threat. Fifty-five percent of respondents say their organizations conduct training programs on what employees should be doing to protect data. However, only 33 percent of respondents say their companies address the ransomware threat.

Most companies experience encrypting ransomware. Fifty-one percent of respondents had a ransomware incident, ranging from within the past three months to more than one year ago. Eighty percent of respondents say they experienced encrypting ransomware and 20 percent of respondents say their company experienced locker ransomware. These companies have experienced an average of 4 ransomware incidents. Most respondents (59 percent) believe the cyber criminal specifically targeted them and their company.

The consequences of ransomware are costly. The top consequences of a ransomware attack are financial. Attacks required companies to invest in new security technologies (33 percent of respondents), customers were lost (32 percent of respondents) and money was lost due to downtime (32 percent of respondents). Moreover, the ransomware incident is believed to make their company more vulnerable to future attacks (49 percent of respondents).

By far, most ransomware incidents are unleashed as a result of phishing and insecure websites. Forty-three percent of respondents say the ransomware was unleashed by phishing/social engineering and 30 percent of respondents say it was unleashed by insecure or spoofed websites. Desktops/laptops and servers were the devices most often compromised at 55 percent and 33 percent of respondents, respectively.

 According to 56 percent of respondents, the compromised device was used for both personal and business purposes. The compromised device infected other devices in the network (42 percent of respondents) and the cloud (21 percent of respondents).

 Many companies paid the ransom. Forty-eight percent of respondents say their company paid the ransom. The average payment was $2,500. A key element in making ransomware work for the attacker is a convenient payment system that is hard to trace. The ransom was most often paid using Bitcoin (33 percent of respondents) or cash (25 percent of respondents). Fifty-five percent of respondents say once the payment was made, the cyber criminal provided the decryption cypher or key to unlock compromised devices.

 Attackers demand speedy payment. Forty-six percent of respondents say the attacker wanted payment in less than two days. Only 16 percent did not place a time limit for payment.

 Data was exfiltrated from the compromised device. Fifty-five percent of respondents say with certainty or it was likely that the ransomware exfiltrated data from the compromised device(s). On average companies spent 42 hours dealing with and containing the ransomware incident.

Full and accurate backup is a critical ransomware defense. Fifty-two percent of respondents did not pay the ransom, most often because they had a full backup (42 percent of respondents). Sixty-eight percent of respondents in companies that experienced a ransomware incident say it is essential (30 percent) or very important (38 percent) to have a full and accurate backup as a defense against future ransomware incidents.

 Fear of publicity stops companies from reporting the incident to law enforcement. Despite the FBI’s pleas to report the incident to law enforcement, 49 percent of respondents say their company did not report the ransomware attack. As shown in Figure 16, the primary reason was to avoid the publicity.

Read the rest of this research at Carbonite.com.

Treason, arrests, a suspicious death, the vanishing executive order — Trump's cyber-mystery

Bob Sullivan

A suspicious death related to a British spy. Accusations of treason.  Arrests — including one, during a meeting, where the suspect was marched out with a bag over his head.  Election interference and ‘Kompromat.’

These are some of the things that, while hanging in the air, weren’t mentioned in the Trump administration’s first cautious steps into managing the cyberworld this week.

Like almost everything in the cyber-spook world, the Trump Administration’s first step into computer security is now shrouded in mystery, intrigue and speculation.

Trump’s team trotted out a series of experts and officials on Tuesday — including former New York City Mayor Rudy Giuliani — at an event marking an executive order Trump planned to sign. It was to be a sign that Trump wanted to get tough on computer security.

Then, without explanation, the order signing was canceled, leaving cyber-folks to do what they often do best: Guess at what it all means.

On the surface, Trump’s executive order and the spy-novel-like intrigue happening in Russia’s cyberworld have nothing to do with each other.  It’s hard not to connect them, however.

Here’s a quick scorecard to catch you up on what’s going on.  Three, or possibly four, Russians with ties to law enforcement have been arrested and charged with treason.  One suspect was grabbed at a meeting and had a bag thrown over his head in a clear show of force.

Another suspect, Ruslan Stoyanov, was a researcher at respected antivirus firm Kaspersky, and previously worked in Moscow’s cybercrime unit. He had stopped crime rings that were targeting Russian banks. I have been told he is accused of snooping on and sharing data with outside entities — perhaps the U.S., though that isn’t clear. My source requested anonymity, but others have confirmed that basic story.

Brian Krebs has painstaking amounts of additional detail on that here.

It’s easy to connect these arrests with the accusations of Russian meddling in U.S. elections, but there are other explanations.  For one, Russian officials are upset that secret information keeps making its way to a blog called Shaltay Boltay (Humpty Dumpty) in Russia that’s a bit like Wikileaks.

Meanwhile, a former KGB official was found dead a few weeks ago in his car under mysterious circumstances. The man, Oleg Erovinkin, was reportedly a source for Christopher Steele, the former British spy who authored the notorious dossier of allegedly embarrassing information about President Trump.

When Trump assembled the folks who will be in charge of making U.S. computer systems safer, none of this came up.

On the surface, a draft version of the order that was widely shared showed it would primarily call for a 60-day review of the most critical U.S. networks, including military command and control systems.  It also asked for a review of America’s cyber enemies; a review of computer security education; and asked for proposals to create incentives for private firms to improve their security.

It is unclear why the president didn’t sign the order as planned.

The draft order got, expectedly, mixed reviews from industry.

“What I like about it is that it creates a sense of urgency and seriousness that we really have to double down on security,” said Eric Geisa, vice president of products at Tempered Networks, discussing the draft order.

Morey Haber, vice president of technology at BeyondTrust, was far more critical.

“We already do all this (vulnerability assessment). The only difference is that it’s (to be) reported to the president,” he said.  Prior to BeyondTrust, Haber spent 10 years as a contractor providing vulnerability assessment to the Department of Defense.  “It ignores attack vectors that have actually been exploited before. It’s almost a knee-jerk reaction, similar to the ban on immigration from certain countries.”

Haber pointed out that most hacks involve the human element, like an employee responding to a phishing email.

“We should be making sure the front doors are locked before we change the combination on the safe,” he said. “We are targeting the wrong things here. We do need to look at these things, but this is not typically how attacks have occurred. We should be targeting the lowest hanging fruit, like phishing emails, USB sticks left in parking lots.”

Perhaps because of this kind of feedback, the order was delayed.  Or something entirely unrelated is the cause.

Geisa said this moment in time gives the administration an opportunity to succeed where others have failed.

“This isn’t something new. After the (Office of Personnel Management) hack, Obama signed an executive order…but what I’ve seen from the government in the past is you get high-level guidelines, but there isn’t a lot of prescription. They might say you need encryption, for example. Well, no kidding,” he said. “The time is now to get very specific.”

The Internet has suffered from a “fundamental flaw” since its earliest days, he said —  the use of IP addresses to authenticate computers, which makes it easy for machines, and criminals, to lie about who they are. Changing that will require a very heavy-handed implementation of new protocols that define how computers talk to each other.  Perhaps Trump’s administration could lead that charge, Geisa said.
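The flaw Geisa describes can be illustrated with a minimal sketch (the trusted address and shared secret below are hypothetical, purely for illustration): a check based on a source IP address trusts anything that can claim that address, while a cryptographic check, such as an HMAC over the message, ties trust to a secret the sender must actually hold.

```python
import hashlib
import hmac

# Naive "authentication" by source IP: anything that spoofs or shares
# the address is trusted. This is the weakness described above.
TRUSTED_IPS = {"10.0.0.5"}  # hypothetical trusted address

def ip_auth(source_ip: str) -> bool:
    return source_ip in TRUSTED_IPS

# A cryptographic alternative: the sender proves knowledge of a shared
# secret by signing the message, so identity no longer rests on the
# spoofable network address.
SECRET = b"example-shared-secret"  # hypothetical key, for illustration only

def sign(message: bytes) -> str:
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, signature: str) -> bool:
    # compare_digest avoids timing side channels in the comparison
    return hmac.compare_digest(sign(message), signature)

# A spoofed packet claiming the trusted address passes the naive check...
assert ip_auth("10.0.0.5")
# ...but without the secret, a forged signature fails the HMAC check.
msg = b"open the control valve"
assert verify(msg, sign(msg))
assert not verify(msg, "00" * 32)
```

Real protocol-level fixes (IPsec, TLS mutual authentication) are far more involved, but they rest on this same shift from "where the packet says it came from" to "what the sender can prove."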

On the other hand, it’s important to understand how different Internet security is from other kinds of security.  The “weapons” of cyberspace are mainly controlled by civilians. Instead of bombs stored in silos that the government can secure, ‘cyber-bombs’ can be hacked servers, private computers, even webcams — as we all learned last year when an army of zombie webcams knocked a large portion of the Internet offline.  They cannot be secured without massive efforts and cooperation by private industry.

And that brings us back to the Russian hacks.  I’ve spent years attending international security conferences where the real work of rescuing the Internet happens.  Naturally, private firms are reluctant to share information with government officials and with each other — many see this very expensive and difficult research as competitive advantage.  Still, informal exchanges happen all the time. Secret cyberheroes rescue us from digital doomsdays on a regular basis, in conversations we’ll never hear about or see in a press release.  Often, these involve “hackers” with a past, who have spent time in the murky world between white and black hat. That’s precisely why they know what’s going on.  But that can also make them very “shy” when speaking to law enforcement.

You can bet Russian cyber-experts are getting more shy by the minute. That hurts everyone except the criminals.

But it’s a good reminder of how hard U.S. officials must work to keep the information flowing between private industry and government workers fighting to keep our water dams and power grid safe.   That’s going to take a lot more than an executive order.

Complexity is the enemy of security

Larry Ponemon

We are pleased to present the findings of The Cost & Consequences of Security Complexity, sponsored by MobileIron. The purpose of this research is to understand the reasons behind the growing complexity of companies’ IT security architecture and how it is affecting their ability to respond to cyber threats. We surveyed 589 individuals involved in securing, overseeing and assessing the effectiveness of their organizations’ information systems or IT infrastructure.

While some complexity in an IT security architecture is expected in order to deal with the many threats facing organizations, too much complexity, as shown in this research, can impact the ability to respond to cyber threats. Participants in this research understand the negative impact IT security complexity has on their organizations’ security posture. In order to be able to protect their organizations from cyber threats, 68 percent of respondents believe it is essential (33 percent) or very important (35 percent) to reduce complexity within their IT security architecture.

According to respondents, employees’ access to cloud-based apps and data and use of mobile devices in the workplace are the biggest drivers of complexity. The growth in unstructured data is making it increasingly difficult to deal with cyber threats.

Complexity seems unstoppable. As shown in Figure 1, complexity is a growing problem. Fifty-eight percent of respondents say in the past two years the complexity of their organizations’ IT security architecture increased significantly (28 percent) or increased (30 percent) and 66 percent believe in the next two years complexity will increase.

Following are eight consequences of complexity.

  • Inability to integrate security technologies across different platforms.
  • Inability to ensure policies and governance practices are applied consistently across the enterprise.
  • Too many active endpoints.
  • Poor investments in overly complex security technologies that are difficult to operate and financial loss due to the scrapping of these complex technologies.
  • Inability to see vulnerabilities in the system.
  • Difficulty in communicating the organization’s security strategy and approach to deal with cyber threats to senior management.
  • Decline in productivity of IT security staff due to complexity.
  • Lack of accountability for IT security practices.

Part 2. Key findings

Here is a sampling of key findings. These will be explored in more detail during a webinar held on Jan. 17. Click here to register for the webinar.

Most IT security architectures are very complex. Sixty-seven percent of respondents say their organizations’ IT security architecture is very complex.

What are the consequences of complexity? Only 35 percent of respondents rate their ability to hire and retain qualified security personnel as high (7+ on a scale from 1 = no ability to 10 = strong ability). Also problematic is the ability to integrate security technologies across different platforms (only 29 percent rate their ability as high) or to ensure policies and governance practices are applied consistently across the enterprise (only 21 percent rate their ability as high).

Employees’ use of cloud-based apps and mobile devices is considered most responsible for IT security complexity.  Some 64 percent say it is access to cloud-based applications and data and 56 percent say it is the use of mobile devices (including BYOD and mobile apps) that increase the complexity of dealing with IT security risks. The rapid growth of unstructured data and constant changes to the organization as a result of mergers and acquisitions, divestitures, reorganizations and downsizing also increase complexity.

Investments in security technologies have contributed to complexity. In the survey, 61 percent of respondents say enabling security technologies have made it more complicated to deal with threats, and 72 percent say they have lost money on poor investment in enabling security technologies.

Current security architectures are overly complex. According to 71 percent of respondents, the complexity of their companies’ IT and IT security architecture makes it difficult to see vulnerabilities in the system and 51 percent of respondents say simplified policies and processes are needed to improve the ability to respond to a changing threat landscape.

Companies shelved or scrapped enabling security technologies because of complexity. Sixty-five percent of respondents say their company has had to frequently (27 percent) or sometimes (38 percent) scrap or shelve one or more enabling security technologies because they did not effectively moderate cyber threats or were too complex to operate. The primary reason for not deploying technologies purchased is that they were too complicated to operate (63 percent of respondents). Other reasons are the lack of in-house expertise to deploy and manage the technology (54 percent of respondents) and poor vendor support and service (48 percent of respondents).

Complexity makes it difficult to explain the approach taken to reduce IT security risks to senior management. Some 67 percent of respondents believe their company’s approach to dealing with cyber threats is too complex to explain to senior executives. Such difficulty in communicating IT security practices to senior management leads to difficulty in achieving goals and objectives set by senior management (49 percent of respondents). As a result, 62 percent of respondents say their company needs to simplify and streamline its security architecture.

Complexity affects the staffing of knowledgeable IT security professionals. As discussed previously, only 35 percent of respondents rate their companies’ ability to hire and retain qualified security personnel as high; 56 percent of respondents say they do not have the necessary expertise to deal with the complexity of their IT and IT security processes and 52 percent of respondents say their companies’ current IT security infrastructure is too complicated and, as a result, decreases the productivity of their IT security staff.

Ineffective IT security architectures are costly. Respondents estimate an average potential total cost exposure from IT security failures of $77 million. The most significant financial impact results from the organization’s response to information misuse or theft followed by costs associated with reputation and brand damage because of IT security failure.

To learn more about these findings, check out the webinar.

Here's what millions of leaked passwords look like, and other scenes from inside The Glass Room

Bob Sullivan

It’s very hard to make privacy and security sexy. The folks at Mozilla and the Tactical Technology Collective have done just that this month with a clever art installation/pop-up shop in lower Manhattan called “The Glass Room.”

The Glass Room aims to inform and challenge visitors by making them see and touch real-life representations of digital risks, the same way you might wander through an art gallery and ponder other life mysteries.

Visitors there are forced to look at an encyclopedia-style pile of books in which every password stolen from LinkedIn is printed. They are listed alphabetically, so every few minutes someone exclaims when they find their password printed in the volumes.


The point is really the sheer size of that hack…which was indeed quite a bit smaller than Yahoo’s hack announced this week.

Other works include a Fitbit attached to a metronome, designed to fool the gadget’s supposed health predictive abilities; Where the F&^&* was I, a printed book showing all the places the artist had been during a year, according to the cloud; and a screen showing data leaked by smartphones as people walk by outside.

Maya Indira Ganesh gave me a tour of the place.

“It’s an art exhibition that’s trying to shine a light on what it means to live in the data society,” she told me.  It’s also trying to scare folks a little bit.

Not all surveillance technology is bad, of course. The Glass Room tells both sides of the story. Video monitors can help you check in on elderly family members, for example.  But you should always wonder: Who else is watching, and why?

Thankfully, The Glass Room includes a detox bar in the back, with Apple-store-like “geniuses” there to help you fix the privacy settings on your gadgets.  They also offer an 8-day data detox kit, which I’ll be sharing in the future.

The Glass Room first popped up in Germany before making its way to Manhattan this month.  The store closes this weekend, but you can browse the entire exhibit online.  And, better yet, you can watch the videos I’ve attached to this story.