Where’s the data? Firms increasingly fret about governance; join us for a free webinar

Larry Ponemon

There will be a free live webinar discussing these results on Oct. 18 at 11 a.m. Click here to register for this webinar.

Organizations are becoming increasingly vulnerable to risks created by the lack of oversight, visibility and controls over employees and other insiders who have access to confidential and high-value information assets. The 2018 Study on the State of Data Access Governance, sponsored by STEALTHbits Technologies, reveals the importance of a Data Access Governance program that can effectively reduce the risk created by employees’ and privileged users’ accidental and conscious exposure of confidential data.

In the context of this research, Data Access Governance means making access to data exclusive: limiting the number of people who have access to data, and their permissions to that data, to the lowest levels possible. Ponemon Institute surveyed 991 IT and IT security practitioners in the United States (586) and United Kingdom (405).
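To make “lowest levels possible” concrete, here is a minimal Python sketch of a least-privilege grant and audit. The permission levels, names, and data structures are hypothetical illustrations, not anything prescribed by the study:

```python
from dataclasses import dataclass, field

# Hypothetical permission levels, ordered least to most privileged.
PERMISSION_LEVELS = ["none", "read", "write", "owner"]

@dataclass
class Resource:
    name: str
    grants: dict = field(default_factory=dict)  # user -> permission level

def grant_least_privilege(resource: Resource, user: str, needed: str) -> None:
    """Grant only the level the user actually needs, never a broader one."""
    current = resource.grants.get(user, "none")
    if PERMISSION_LEVELS.index(needed) > PERMISSION_LEVELS.index(current):
        resource.grants[user] = needed

def audit_overprivileged(resource: Resource, needed_by_user: dict) -> list:
    """List users whose current grant exceeds their documented need."""
    flagged = []
    for user, level in resource.grants.items():
        needed = needed_by_user.get(user, "none")
        if PERMISSION_LEVELS.index(level) > PERMISSION_LEVELS.index(needed):
            flagged.append((user, level, needed))
    return flagged

payroll = Resource("payroll-share")
grant_least_privilege(payroll, "alice", "read")
grant_least_privilege(payroll, "bob", "owner")
print(audit_overprivileged(payroll, {"alice": "read", "bob": "read"}))
# [('bob', 'owner', 'read')]
```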

To ensure these respondents have an in-depth knowledge of how their organizations manage users’ access to data, we asked them to indicate their level of access to their organizations’ IT networks, enterprise systems, applications and confidential information. If they had only limited end user access rights to IT resources, they were not included in the final sample of respondents.

While the study reveals companies are taking some steps to manage the risk, respondents who are knowledgeable about access rights in their organizations perceive that the risk will either increase (48 percent) or stay the same (41 percent) over the next year.

Key Findings

Following is an analysis of the key findings. The complete audited findings are presented in the Appendix of this report. We have organized the findings according to the following topics:

  • The risk of end user access to unstructured data
  • Data Access Governance tools used to limit access to sensitive data
  • Current practices in assigning privileged user access
  • Effectiveness of Data Access Governance programs
  • Recommendations for improving Data Access Governance programs

The risk of end user access to unstructured data

Organizations lose track of where employees and other insiders are storing unstructured data. In the context of this research, end users are employees, temporary employees, contractors, consultants and others who have limited or “ordinary” access rights to their organizations’ IT resources.

Unstructured data is defined as information that either does not have a pre-defined data model or is not organized in a pre-defined manner. Unstructured data tends to be user-generated or user-manipulated data that lives in documents, such as spreadsheets or even scanned and signed contracts. Typically, this data originates in a structured format in an application and is exported to a document for use by a person or team of people.

Respondents were asked to rate their confidence that their organization knows where users are storing unstructured data, from 1 = no confidence to 10 = high confidence. Only 19 percent of respondents rate their confidence as high (7 or above). This lack of confidence indicates that much of a company’s sensitive unstructured data is not secured.

Organizations lack visibility into how users are accessing unstructured data. As discussed above, respondents have little confidence they know where unstructured data resides. They also don’t know for certain which end users are accessing sensitive unstructured data.

Half of respondents (50 percent) say their organizations rely upon platform capabilities, such as the access controls built into Dropbox, to determine who has access to sensitive unstructured data. Only 37 percent of respondents say they use role-based access enforced through AD groups, even though many rate AD as very important. Just 31 percent of respondents monitor compliance with policies, and only 28 percent use information from specialized file activity monitoring.
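As a rough illustration of what role-based access through directory groups looks like, here is a short Python sketch. The group names, share paths, and hard-coded membership are hypothetical; a real deployment would resolve membership from Active Directory itself (for example, over LDAP) rather than a dictionary:

```python
# Hypothetical AD groups and their members. In production, membership
# would be queried from the directory, not hard-coded.
AD_GROUPS = {
    "finance-readonly": {"alice", "bob"},
    "finance-editors": {"carol"},
}

# Each share maps required groups to the access level they confer.
SHARE_POLICY = {
    r"\\corp\finance": {"finance-readonly": "read", "finance-editors": "write"},
}

ORDER = ["none", "read", "write"]

def effective_access(user: str, share: str) -> str:
    """Resolve a user's access to a share purely from group membership."""
    best = "none"
    for group, level in SHARE_POLICY.get(share, {}).items():
        if user in AD_GROUPS.get(group, set()) and ORDER.index(level) > ORDER.index(best):
            best = level
    return best

print(effective_access("alice", r"\\corp\finance"))  # read
print(effective_access("carol", r"\\corp\finance"))  # write
print(effective_access("dave", r"\\corp\finance"))   # none
```

The design point of group-based enforcement is that access is never granted to individuals directly; auditing who can reach a share reduces to auditing group membership.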

Documents and spreadsheets are the unstructured data most secured today. Some 71 percent of respondents say documents and spreadsheets are most often secured and 64 percent of respondents say emails and text messages are secured.

Confidence in safeguarding unstructured data is low. Because of the volume of unstructured data that needs to be protected and the difficulty of determining who has access to sensitive unstructured data, only 25 percent of respondents rate their confidence in discovering unstructured data containing sensitive information as very high (7 or above on a scale of 1 = no confidence to 10 = high confidence). Only 12 percent of respondents are highly confident in their organizations’ ability to discover where unstructured data is stored in the cloud.
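For a sense of what discovering unstructured data containing sensitive information involves at its simplest, here is a toy Python sketch that walks a directory tree and flags files matching sensitive-looking patterns. The patterns and the scanned path are hypothetical, and real discovery tools are far more sophisticated (keyword dictionaries, exact-data matching, machine-learned classifiers):

```python
import os
import re

# Hypothetical patterns for sensitive content.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_tree(root: str) -> list:
    """Return (path, pattern_name) pairs for files matching any pattern."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as handle:
                    text = handle.read()
            except OSError:
                continue  # unreadable file: skip rather than abort the scan
            for label, pattern in SENSITIVE_PATTERNS.items():
                if pattern.search(text):
                    hits.append((path, label))
    return hits

for path, label in scan_tree("/shares/finance"):  # hypothetical path
    print(f"{path}: possible {label}")
```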

Inappropriate behaviors by end users put organizations at risk. Fifty-nine percent of respondents say users access sensitive or confidential data because of curiosity and 52 percent of respondents say users share their access rights with others.

False positives and too much data are the biggest challenges in determining if an event or incident is an insider threat. Organizations find it difficult to determine whether inappropriate access to sensitive data was caused by a negligent or malicious insider. The biggest challenges in making that determination are that security tools yield too many false positives (63 percent of respondents) and that security tools yield more data than can be reviewed in a timely fashion (60 percent of respondents).

To continue reading, download the full report at the STEALTHbits website.

There will be a free live webinar discussing these results on Oct. 18 at 11 a.m. Click here to register for this webinar.


What should college students know about ethics and technology? Help us make a 101 course, here

Bob Sullivan

What should computer science students — all college students — learn about the intersection of ethics and technology? @ethicaltechorg, founded by two Duke University students (I’m an adviser), is crowdsourcing the curriculum for Tech Ethics 101. Thoughts here, or at the link: https://ethical-tech.org/request-for-collaboration/

Algorithms run our lives today. They decide what homes we should buy, who we should date, what jobs we are qualified for, what updates and Tweets we see, and even welfare payments, mortgage loans, and how long convicts must remain in prison. Complex formulas make all these decisions in darkness, their calculations unknown to their subjects, often even beyond the understanding of their data scientist creators. Operating beyond reproach inside a black box, computers have become our puppet-masters, as consumers buy things, choose mates, and make political decisions based on realities calculated on their behalf.

But like all systems that operate in secret, algorithms have a dark side. They can lie. They remain vulnerable to hacking and reverse-engineering. And they reinforce some of society’s worst elements, like racial, class, and gender bias.

I’m really concerned about this; I believe everyone in the world should be. So today I’m announcing that I’ve joined a new group called Ethical Tech, which collaborates with groups like the Duke University Center on Law and Technology; I’m a member of the organization’s advisory board.  Founders Cassi Carley and Justin Sherman, both of Duke, have ambitious plans for the organization.

We join a rich set of organizations springing up lately — long overdue — to deal with runaway technology and its unintended consequences. The Center for Humane Technology opened its doors earlier this year, born out of frustration with Facebook, promising to help programmers think more about what they are making. Just this week, my pal Julia Angwin announced a publication called The Markup, funded by Craigslist founder Craig Newmark. It will seek to add journalistic accountability to the world of technology. So, energy around this topic is brewing.

At Ethical Tech, our first project involved bias in algorithms used by judges around the country to decide how long convicted criminals should spend in prison. Several other projects are in the works, including design of a tech-ethics class for college students.

I hope you will consider helping. What should future programmers know? What should future digital citizens know? How can we arm them for this ongoing information war, and how can we convince engineers to use their math skills for good instead of evil?

I often ask a basic question when I am speaking to groups: “The Internet — good or bad?” Yes, yes, it’s done an amazing job spreading information around the world. But it’s done an even better job spreading BAD information around the world. Some research suggests that more people think the world is flat today than 10 years ago. So, that’s bad. But it doesn’t have to be that way. (And anyway, I think the Internet is good, but it’s more a 60-40 thing.) We can’t afford to be passengers in this digital journey any longer. We have to make deliberate choices, every day, to make sure tech enhances our humanity instead of destroying it. That will require concentrated effort across all sorts of party, racial, gender, and ideological lines. We’re going to have to talk to each other. So, let’s get started.

What should computer science students — all college students — learn about the intersection of ethics and technology? @ethicaltechorg, based at Duke (I’m an adviser), is crowdsourcing the curriculum for Tech Ethics 101. Thoughts here, or at the link: https://ethical-tech.org/request-for-collaboration/