Monthly Archives: July 2014

THAT Facebook study; yes, you should be concerned

Bob Sullivan

There was another round of confused Facebook outrage this month when a story in The Atlantic revealed the social media giant had intentionally toyed with users’ moods — allegedly for science, but in reality, for money. Facebook turned up the sadness dial on some users’ news feeds and found that sadness was contagious (happiness was, too).

The study that’s produced the outrage did qualify as science. It was published in the prestigious journal Proceedings of the National Academy of Sciences.  It has plenty of folks scrambling.

What does the nation’s leading privacy researcher, and a frequent but even-handed Facebook critic, think of the Facebook mood manipulation study controversy?  It’s a case of “shooting the messenger,” Alessandro Acquisti of Carnegie Mellon University told me.  It’s also a rare peek “through the looking-glass” at the future of privacy.

Acquisti has been involved in designing hundreds of studies, and he has a deep research interest in privacy, so I asked him for his reaction to the Facebook research dust-up. His response might surprise you.

“The reaction to the study seems like a case of shooting the messenger,” he said. “Facebook (and probably many other online services) engages daily in user manipulation. There is no such thing as a ‘neutral’ algorithm; Facebook decides what to show you, how, and when, in order to satisfy an externally inscrutable objective function (Optimize user interactions? Maximize the time they spend on Facebook, or the amount of information they disclose, or the number of ads they will click? Who knows?) The difference is that, with this study, the researchers actually revealed what had been done, why, and with what results. Thus, this study offers an invaluable wake-up call – a peek through the looking glass of the present and future of privacy and social media.

“Those attacking the study may want to reflect upon the fact that we have been part of the Facebook experiment since the day any of us created an account, and that privacy is much more than protection of personal information —  it is about protection against the control that others can have over us once they have enough information about us.”

Here’s my take on it.  It’s a bad idea to harm people for science.  In many cases it’s merely unethical, not illegal, but it’s a really bad idea.  When harm is unavoidable — say you are trying an experimental drug that might cure a person with cancer, or might kill them faster — scientists must obtain “informed consent.” Now, informed consent is a very slippery idea.  A patient who is desperate and ready to try anything might not really be in a position to give informed consent, for example. Doctors and researchers are supposed to go the extra mile to ensure study subjects *truly* understand what they are doing.

Meanwhile, in many social science studies, informed consent prior to the experiment would wreck the study.  Telling Facebook users, “We are going to try to manipulate your mood” wouldn’t work.  In those cases, researchers are supposed to tell subjects as soon as is feasible what they were up to. And the “harm” must be as minimal as possible.

Everyone who’s conducted research (including me!) has a grand temptation to bend these rules in the name of science — but my research has the power to change the world! — so science has a solution to that problem. Study designs must be approved by an Institutional Review Board, or IRB.  This independent body decides, for example, “No, you may not intentionally make thousands of people depressed in order to see if they will buy more stuff. At least if you do that, you can’t call it science.”


Pesky folks, those IRB folks. My science friends complain all the time that they nix perfectly good research ideas.  A friend who conducts privacy research, for example, can’t trick people into divulging delicate personal information like credit card numbers in research because that would actually cause them harm.

Facebook apparently decided it was above the pesky IRB. Well, the journal editor seemed to say the research was IRB-approved, then later seemed to say only part of it was, all of which suggests no IRB really said, “Sure, make thousands of people depressed for a week.”

And while a billion people on the planet have clicked “Yes” to Facebook’s terms of service, which apparently includes language that gives the firm the right to conduct research, it doesn’t appear Facebook did anything to get informed consent from the subjects. (If you argue that a TOS click means informed consent, send me $1 million. You consented to that by reading this story).

Back to Facebook researchgate.  The problem isn’t some new discovery that Facebook manipulates people. Really, if you didn’t realize that was happening all the time, you are foolish.  The problem is the incredible disconnect between Facebook and its data subjects (i.e., people).  Our petty concerns with the way it operates keep getting in Facebook’s way. We should all just pipe down and stop fussing.

Let’s review what’s happened here. Facebook:

1) Decided it was above the standard academic review process

2) Used a terms of service click, in some cases years old, to serve as “informed consent” to harm subjects

Think carefully about this: What wouldn’t Facebook do? What line do you trust someone inside Facebook to draw?

If you’d like to read a blow-by-blow analysis of what went on here – including an honest debate about the tech world’s “so-what” reaction – visit Sebastian Deterding’s Tumblr page.

Here’s the basic counter-argument, made with the usual I’m-more-enlightened-than-you sarcasm of Silicon Valley:

“Run a web site, measure anything, make any changes based on measurements? Congratulations, you’re running a psychology experiment!” said Marc Andreessen, Web browser creator and Internet founding father of sorts. “Helpful hint: Whenever you watch TV, read a book, open a newspaper, or talk to another person, someone’s manipulating your emotions!”

In other words, all those silly rules about treating study subjects fairly that academic institutions have spent decades writing – they must be dumb.  Why should Silicon Valley be subject to any such inconvenience?

My final point: When the best defense for doing something that many people find outrageous is you’ve been doing it for a long time, it’s time for some soul-searching.

 

Who's in charge at power plants? Many don't know

Larry Ponemon

An unnamed natural gas company hired an IT firm to test its corporate information system. POWER Magazine reported, “The consulting organization carelessly ventured into a part of the network that was directly connected to the SCADA system. The penetration test locked up the SCADA system and the utility was not able to send gas through its pipelines for four hours. The outcome was the loss of service to its customer base for those four hours.”

As stories like these become more common, we wanted to study how well utility firms are preparing for what seems like the inevitable: a major, successful attack.  The answer is a mixed bag.

This month, we release the results of Stealth Research: Critical Infrastructure, sponsored by Unisys. The purpose of this research is to learn how utility, oil and gas, alternative energy and manufacturing organizations are addressing cybersecurity threats.

Among the more alarming findings: 67 percent of those surveyed said they’d suffered at least one security compromise, yet one quarter don’t actually know who’s in charge of security.

As the findings reveal, organizations are not as prepared as they should be to deal with the sophistication and stealth of a cyber threat or the negligence of an employee or third party. In fact, the majority of participants in this study do not believe their companies’ IT security programs are “mature.” For purposes of this research, a mature stage is defined as having most IT security program activities deployed. Most companies have defined what their security initiatives are but deployment and execution are still in the early or middle stages.

Key findings of this research

Most companies have not fully deployed their IT security programs. Only 17 percent of companies represented in this research self-report that most of their IT security program activities are deployed. Fifty percent of respondents say their IT security activities either have not yet been defined or deployed (7 percent) or have been defined but are only partially deployed (43 percent). A possible reason is that only 28 percent of respondents agree that security is one of the top five strategic priorities across the enterprise.

The risk to industrial control systems and SCADA is believed to have substantially increased. Fifty-seven percent of respondents agree that cyber threats are putting industrial control systems and SCADA at greater risk. Only 11 percent say the risk has decreased due to heightened regulations and industry-based security standards.

Security compromises are occurring in most companies. It is difficult to understand why security is not a top priority, because 67 percent of respondents say their companies have had at least one security compromise that led to the loss of confidential information or disruption to operations over the last 12 months. Twenty-four percent of respondents say these compromises were due to an insider attack or negligent privileged IT users.

Upgrading existing legacy systems may result in sacrificing mission-critical security. Fifty-four percent of respondents are not confident (36 percent) or unsure (18 percent) that their organization would be able to upgrade legacy systems to the next improved security state in cost-effective ways without sacrificing mission-critical security.

Many organizations are not getting actionable real-time threat alerts about security exploits. According to 34 percent of respondents, their companies do not get real-time alerts, threat analysis and threat prioritization intelligence that can be used to stop or minimize the impact of a cyber attack. Among those that do receive such intelligence, 22 percent of respondents say it is not effective. Only 15 percent of respondents say threat intelligence is very effective and actionable.

More than half, hit. The majority of companies have had at least one security compromise in the past 12 months. Sixty-seven percent of companies represented in this research have had at least one incident that led to the loss of confidential information or disruption to operations. Twenty-four percent of security incidents were due to a negligent employee with privileged access. However, 21 percent of respondents say they were not able to determine the source of the incident.

Who’s in charge? When asked if their company has dedicated personnel and/or departments responsible for industrial control systems and SCADA security, 25 percent say they do not have anyone assigned. The majority (55 percent) say they have one person responsible.

Out of control. Nearly one-third of respondents say that more than a quarter of their network components, including third-party endpoints such as smartphones and home computers, are outside the direct control of their organization’s security operations.
