Monthly Archives: August 2020

Consumers very worried about privacy, but disagree on who’s to blame

Privacy and Security in a Digital World: A Study of Consumers in the United States was conducted to understand the concerns consumers have about their privacy as more of their lives become dependent upon digital technologies. Based on the findings, the report also provides recommendations for how to protect privacy when using sites that track, share and sell personal data. With sponsorship from ID Experts, we surveyed 652 consumers in the US. For the majority of these consumers, the privacy of their personal information does matter.

Consumers are very concerned about their privacy when using Facebook, Google and other online tools. Consumers were asked to rate their privacy concerns on a scale of 1 = not concerned to 10 = very concerned when using online tools, devices and online services. Figure 1 presents the very concerned responses (7+ responses).

The survey found that 86 percent of respondents say they are very concerned when using Facebook and Google, 69 percent of respondents are very concerned about protecting privacy when using devices and 66 percent of respondents say they are very concerned when shopping online or using online services.
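For readers who want to reproduce this kind of tabulation on their own data, here is a minimal sketch in Python. The ratings below are invented for illustration (they are not the study's data); the snippet simply counts 1-to-10 responses of 7 or higher, the report's "very concerned" threshold.

```python
# Minimal sketch: derive a "very concerned" share from 1-10 ratings.
# The ratings below are invented for illustration, not the study's data.
ratings = [9, 3, 7, 10, 6, 8, 2, 7, 5, 9]  # one 1-10 rating per respondent

VERY_CONCERNED_THRESHOLD = 7  # the report counts responses of 7 or higher

very_concerned = sum(1 for r in ratings if r >= VERY_CONCERNED_THRESHOLD)
share = 100 * very_concerned / len(ratings)

print(f"Very concerned: {very_concerned} of {len(ratings)} respondents ({share:.0f}%)")
```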

When asked if they believe that Big Tech companies like Google, Twitter and Facebook will protect their privacy rights through self-regulation, 40 percent of consumers say industry self-regulation will suffice. However, 60 percent of consumers say either government oversight alone (34 percent) or a combination of government oversight and industry self-regulation (26 percent) is required.

Following are the most salient findings:

  • The increased use of social media and greater awareness of potential threats to their digital privacy have made consumers more concerned about their privacy. In fact, social media websites are the least trusted (61 percent of consumers), followed by shopping sites (52 percent of consumers).
  • Consumers are most concerned about losing their civil liberties and having their identity stolen if personal information is lost, stolen or wrongfully acquired by outside parties (56 percent and 54 percent of respondents, respectively). Only 25 percent of consumers say they are concerned about marketing abuses if their personal information is lost or stolen.
  • Seventy-four percent of consumers say they rarely (24 percent) or never (50 percent) have control over their personal data. Despite this belief, 54 percent of consumers say they do not limit the data they provide when using online services. Virtually all consumers believe their email addresses and browser settings & histories are collected when using their devices, according to 96 percent and 90 percent of consumers, respectively.
  • Home is where the trust is. When asked the one location they trust most for shopping, banking and other financial activities online, 46 percent of consumers say it is their home. Only 10 percent of consumers say it is when using public WiFi.
  • Consumers believe search engines, social media and shopping sites are sharing and selling their personal data, according to 92 percent, 78 percent and 63 percent of consumers, respectively. To increase trust in online sites, consumers want to be explicitly required to opt in before the site shares or sells their personal information, according to 70 percent of consumers.
  • Consumers reject advertisers’ use of their personal information to market to them. Seventy-three percent of consumers say advertisers should allow them to “opt out” of receiving ads on any specific topic at any time, and 68 percent of consumers say advertisers should not be able to serve ads based on their conversations and messaging. Sixty-four percent of consumers say they do not want to be profiled unless they grant permission.
  • Online ads and the “creepy” factor. Sixty-six percent of consumers say they have frequently (41 percent of consumers) or rarely (25 percent of consumers) received online ads that are relevant but not based on their online search behavior or publicly available information. Sixty-four percent of consumers say they think it is “creepy” when that happens.
  • Forty-five percent of consumers are not aware that their devices have privacy controls they can use to set their level of information sharing. Of the 55 percent of consumers who are aware, 60 percent say they review and update settings on their computers and 56 percent say they review and update settings on their smartphones.
  • Fifty-four percent of consumers say online service providers should be held most accountable for protecting consumers’ privacy rights when going online. Forty-five percent of consumers say they themselves should be most accountable.

Download the full report at the ID Experts website.

Is smartphone contact tracing doomed to be a privacy killer? Or can tech really help?

An app that tells you if you were exposed to someone with Covid? Sounds great. But, as usual, tech-as-silver-bullet ideas come full of booby-traps. There’s been a lot of scattershot discussion around smartphone contact tracing during the past several months, with privacy advocates saying the harms far outweigh the benefits, but many governments and technology companies are plowing ahead anyway.

But if tech *could* make us safer during this crisis, shouldn’t we try? Under what conditions might it actually be feasible, and fair? Prof. Jolynn Dellinger (Duke and UNC law professor, @MindingPrivacy) has put it all together in a thoughtful analysis, creating a 5-part test that could be considered before implementing contact tracing. Will it *really* work? Will it do more harm than good? Is there enough trust in institutions to ensure it won’t be abused later? Her structure would be useful for the launch of almost any new technology, and it deserves a careful reading on its own. It also deserves more discussion, so I reached out to Prof. Dellinger and Prof. David Hoffman at Duke’s Sanford School of Public Policy and invited them to a brief email dialog with me. I hope you’ll find it illuminating.

Disclosure: I was recently a visiting scholar at Duke, invited by Prof. Hoffman.



FROM: Bob
TO: David
CC: Jolynn

David: Jolynn’s piece is such an excellent state-of-play analysis. Not to put words in her mouth, but I read it as a polite and smart “this’ll never work.” We can’t even get Covid test results in less than a week, so why are we even talking about some kind of sci-fi solution like smartphones that warn each other (or, gulp, tell on each other)? Every dollar and moment of attention spent on contact tracing apps should be redirected to finding more testing reagents, if you ask me. Still, this discussion is inevitable, because the apps – working or not – are coming. So I really welcome her criteria for use.

One thing I’ve thought a lot about, which she mentions in passing: Alert fatigue. I’d *definitely* want a text message if someone I spent time with got Covid, were that possible. But if I got five of these in one day I’d turn it off, especially if they proved to be false alarms. Or if I got none in the first 10 days, I’d probably turn it off, or it would age off my smartphone. Fine-tuning the alert criteria will be a hell of a job.

Meanwhile, my confidence level that data like this would *never* be used to hunt for immigrants, or deadbeat dads, or terrorists, or journalists, is about zero. It’s hard to imagine a technology more ripe for unintended consequences than an app that makes such liberal use of location information.

That being said, I sure wish something like this *could* work. Let’s imagine an alternative universe where the trust, law, and technology were already in place when Covid hit, so tech was ready and willing to ride in and save the day. How do we create that world, if not now, then at least in time for the next pandemic/terrorist attack/asteroid strike/etc.? We might have to reach back to the days after 9/11, as Jolynn hints, and start a 20-year effort at lawmaking and trust building. The best way to start a journey of 1,000 miles is with a single step. How would we get started?


FROM: David
TO: Bob
CC: Jolynn


Thanks, Bob. With any of these uses of technology, the first question that should be asked is “what problem are we trying to solve?” Are we using the technology to trace infections? Or are we allowing people to increase their chances that they will be notified if they have had exposure to the virus? Or are we using the technology to have individuals track whether they are having symptoms? Or to enforce a quarantine? Or to have people volunteer to donate plasma? Or just to provide people with up-to-date information about the virus? Depending on the problem we are attempting to solve, we will want to design very different technology implementations. For many of these problems we will likely need to merge other data with whatever data is collected through the technology. Based on what we have seen done in other countries, these other data feeds can include information from manual contact tracers, credit card data, CCTV camera feeds and clinical health care data. Once we define what problem we are trying to solve and what data is necessary to solve it, then we can conduct a privacy assessment to determine the level of the risks.

Many of the smartphone apps that have been created have been described as “contact tracing apps”, but it is not clear to me that they will actually help much with contact tracing. To properly do contact tracing through manual efforts, with technology, or using a combination of both, we will need to have enough data about whether people have contracted COVID-19 (this presumes broad and quick testing) and a mechanism to accurately measure whether people have been in close contact with each other for long enough to warrant a recommendation that they quarantine themselves, get tested, or both. Unfortunately, solutions that rely just on Bluetooth data from smartphones are likely to result in a large number of both false negatives and false positives. However, a system that integrates Bluetooth data with information learned from manual contact tracers has a higher likelihood of success. Manual contact tracing, though, suffers from a lack of centralized guidance, is under-resourced, and in most areas has not made clear what privacy protections will be put in place for the collected data. The US urgently needs a national strategy on contact tracing, with clear recommendations on what data to collect, what technology to use, and what cybersecurity and privacy protections to put in place.
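To make the false-positive/false-negative point concrete, here is a rough sketch in Python of how a Bluetooth-only exposure check might work. It is not any real app's algorithm; the 2-meter and 15-minute thresholds, the signal-to-distance formula, and its parameters are illustrative assumptions.

```python
# Illustrative sketch of a Bluetooth-only exposure check; the thresholds and the
# signal-to-distance model are assumptions, not any real app's algorithm.
CLOSE_CONTACT_METERS = 2.0
CLOSE_CONTACT_MINUTES = 15.0

def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Rough log-distance estimate from Bluetooth signal strength (RSSI).
    RSSI is a noisy proxy for distance (walls, pockets and bodies distort it),
    which is one source of the false positives and false negatives noted above."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def exposure_flagged(sightings):
    """sightings: list of (rssi_dbm, duration_minutes) observations of one device.
    Flags an exposure when cumulative close-range time exceeds the threshold."""
    close_minutes = sum(
        minutes for rssi, minutes in sightings
        if estimate_distance_m(rssi) <= CLOSE_CONTACT_METERS
    )
    return close_minutes >= CLOSE_CONTACT_MINUTES

# Example with invented readings: two close sightings (~18 minutes total) and one far one.
print(exposure_flagged([(-55, 10), (-60, 8), (-80, 30)]))  # True
```

Merging estimates like these with what manual contact tracers learn, as suggested above, is one way to reduce both kinds of error.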


FROM: Jolynn
TO: Bob
CC: David


Bob, Thank you so much for reading the post and for your thoughtful comments and questions. The covid crisis highlights the numerous ways data and emerging tech could be used to benefit society.  Benefitting society while preventing harm to individuals is not an unobtainable goal, but it will take concerted effort. We have long recognized as a society the sensitivity of health information and we are getting there (slowly but surely) on location data. Acting on what we know by taking proactive (as opposed to merely reactive) steps to protect the privacy of personal information – through design, policy and law – is the place to start. A reactive step at this moment is passing a limited law dealing with the privacy of information collected for covid-19 purposes — and this is absolutely better than nothing.  A proactive step would be passing comprehensive privacy legislation that circumscribes collection and use of data more generally and contributes to the creation of an environment in which people can trust companies and governments not to repurpose, exploit or misuse their personal data. (Arguably, because we have waited so long to take obvious necessary legislative action, even a comprehensive privacy law could be broadly characterized as “reactive” at this point, but that is a topic for another post).

Regarding the original post, my personal view is that voluntary digital contact tracing apps are not likely to be worth the existing privacy and security risks at this time given our failure to implement the other necessary elements of a comprehensive, holistic response to the health crisis and the likelihood that they will not be used by sufficient numbers of citizens to make the notifications helpful or reliable. You mentioned in your introductory comments “feasibility” and the relevance of the dollars spent on contact tracing.  I did not cover this topic adequately in my original post but certainly think it is a crucial consideration. Budgets are limited and strained, and every response we choose to invest in necessarily represents another option we do not pursue. So the question of whether to pursue digital contact tracing apps should not be considered in a vacuum but rather should be analyzed in terms of bang for the buck, so to speak. Is an investment in such apps the best, most effective use of our limited funds? And what potentially more useful responses are we foregoing? This question further highlights one of the downsides of the state by state approach the US is currently taking. How much more economically efficient might it be to have regional approaches or, sigh, leadership at the federal level? I strongly agree with David’s comment that the US needs a national strategy with clear recommendations on what data to collect, what technology to use, and what cybersecurity and privacy protections to put in place. I would add that these guidelines, like the 5 question analysis proposed in the blog post, should be applied to any and all personal data collected for the purposes of managing the covid crisis.


FROM: Bob
TO: Jolynn
CC: David

So is there one thing that readers might urge their leaders to do, or urge technology companies to do, during the next couple of months that might bring us closer to these goals? It seems like a federal privacy law is probably off the table between now and election day, so that won’t come in time to help with Covid. Is there something else that might? Could a state pass a law? Could a tech firm adopt a model privacy policy around contact tracing apps? What kind of steps might any of these interested parties take that would at least move us a bit in the right direction? Sadly, I’m quite sure we’ll be dealing with Covid long after November.


FROM: Jolynn
TO: Bob
CC: David


State legislatures could pass laws or, in the alternative, Governors might issue executive orders to accomplish immediate goals. States can work to ensure that all local and state level health departments are on the same page and are employing similar privacy and security protections for data collected by manual contact tracers and any digital contact tracing apps or other technologies designed to manage Covid issues.

Tech firms and app developers should certainly have privacy policies in place, but those entities should also make explicit, affirmative guarantees that any data collected for purposes of responding to the Covid crisis (health, location or other personal data) will not be used for any other purpose or monetized, and will not be sold to or shared with any third parties, including law enforcement of any kind. Google and Apple could also bar apps from inclusion in the Google Play Store or App Store if they do not make such explicit commitments.

Want to participate in this dialog? Leave your comments below. We’ll keep the conversation going.