An app that tells you if you were exposed to someone with Covid? Sounds great. But, as usual, tech-as-silver-bullet ideas come full of booby-traps. There’s been a lot of scattershot discussion around smartphone contact tracing during the past several months, with privacy advocates saying the harms far outweigh the benefits, but many governments and technology companies are plowing ahead anyway.
But if tech *could* make us safer during this crisis, shouldn’t we try? Under what conditions might it actually be feasible, and fair? Prof. Jolynn Dellinger (Duke and UNC law professor, @MindingPrivacy) has put it all together in a thoughtful analysis, creating a 5-part test to apply before implementing contact tracing. Will it *really* work? Will it do more harm than good? Is there enough trust in institutions to ensure it won’t be abused later? Her framework would be useful for the launch of almost any new technology, and it deserves a careful reading on its own. It also deserves more discussion, so I reached out to Prof. Dellinger and Prof. David Hoffman at Duke’s Sanford School of Public Policy and invited them to a brief email dialog with me. I hope you’ll find it illuminating.
Disclosure: I was recently a visiting scholar at Duke, invited by Prof. Hoffman.
FROM: Bob
TO: David
CC: Jolynn
David: Jolynn’s piece is such an excellent state-of-play analysis. Not to put words in her mouth, but I read it as a polite and smart “this’ll never work.” We can’t even get Covid test results in less than a week; why are we even talking about some kind of sci-fi solution like smartphones that warn each other (or, gulp, tell on each other)? Every dollar and moment of attention spent on contact tracing apps should be redirected to finding more testing reagents, if you ask me. Still, this discussion is inevitable, because the apps – working or not – are coming. So I really welcome her criteria for use.
One thing I’ve thought a lot about, which she mentions in passing: Alert fatigue. I’d *definitely* want a text message if someone I spent time with got Covid, were that possible. But if I got five of these in one day I’d turn it off, especially if they proved to be false alarms. Or if I got none in the first 10 days, I’d probably turn it off, or it would age off my smartphone. Fine-tuning the alert criteria will be a hell of a job.
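To picture what that fine-tuning might involve, here is one tiny, purely hypothetical knob – a cooldown window so the app never pings you more than once a day. The class name and threshold below are my own illustration, not any real app’s design:

```python
# A purely hypothetical sketch of one alert-tuning knob: a cooldown window
# so a user is not notified more than once per day. Names and values here
# are illustrative assumptions, not any real app's design.
from datetime import datetime, timedelta
from typing import Optional

ALERT_COOLDOWN = timedelta(hours=24)  # assumed: at most one exposure alert per day

class AlertThrottle:
    def __init__(self) -> None:
        self._last_alert: Optional[datetime] = None

    def should_notify(self, now: datetime) -> bool:
        """Allow an alert only if none has been sent within the cooldown window."""
        if self._last_alert is None or now - self._last_alert >= ALERT_COOLDOWN:
            self._last_alert = now
            return True
        return False
```

Every one of those numbers is a judgment call, which is exactly the point.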
Meanwhile, my confidence level that data like this would *never* be used to hunt for immigrants, or deadbeat dads, or terrorists, or journalists, is about zero. It’s hard to imagine a technology more ripe for unintended consequences than an app that makes such liberal use of location information.
That being said, I sure wish something like this *could* work. Let’s imagine an alternative universe where the trust, law, and technology were already in place when Covid hit, so tech was ready and willing to ride in and save the day. How do we create that world, if not now, then at least in time for the next pandemic/terrorist attack/asteroid strike/etc.? We might have to reach back to the days after 9/11, as Jolynn hints, and start a 20-year effort at lawmaking and trust building. The best way to start a journey of 1,000 miles is with a single step. How would we get started?
FROM: David
TO: Bob
CC: Jolynn
Thanks, Bob. With any of these uses of technology, the first question that should be asked is “what problem are we trying to solve?” Are we using the technology to trace infections? Or are we allowing people to increase their chances that they will be notified if they have had exposure to the virus? Or are we using the technology to have individuals track whether they are having symptoms? Or to enforce a quarantine? Or to have people volunteer to donate plasma? Or just to provide people with up-to-date information about the virus? Depending on the problem we are attempting to solve, we will want to design very different technology implementations. For many of these problems we will likely need to merge other data with whatever data is collected through the technology. Based on what we have seen done in other countries, these other data feeds can include information from manual contact tracers, credit card data, CCTV camera feeds, and clinical health care data. Once we define what problem we are trying to solve and what data is necessary to solve it, then we can conduct a privacy assessment to determine the level of the risks.
Many of the smartphone apps that have been created have been described as “contact tracing apps,” but it is not clear to me that they will actually help much with contact tracing. To properly do contact tracing through manual efforts, with technology, or using a combination of both, we will need enough data about whether people have contracted COVID-19 (this presumes broad and quick testing) and a mechanism to accurately measure whether people have been in close contact with each other for long enough to warrant a recommendation that they quarantine themselves, get tested, or both. Unfortunately, solutions that rely just on Bluetooth data from smartphones are likely to result in a large number of both false negatives and false positives. However, a system that integrates Bluetooth data with information learned from manual contact tracers has a higher likelihood of success. Manual contact tracing, though, suffers from a lack of centralized guidance, is under-resourced, and in most areas has not made clear what privacy protections will be put in place for the collected data. The US urgently needs a national strategy on contact tracing, with clear recommendations on what data to collect, what technology to use, and what cybersecurity and privacy protections to put in place.
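To make the false-positive/false-negative point concrete, here is a minimal, purely illustrative sketch of a Bluetooth-only exposure check. The signal-strength and duration thresholds are assumptions for illustration, not the logic of any real app or of the Apple/Google framework:

```python
# A minimal, illustrative sketch of a Bluetooth-only exposure check.
# The thresholds are assumptions for illustration, not any real app's logic.
from dataclasses import dataclass
from typing import List

@dataclass
class BeaconSighting:
    rssi_dbm: int        # received signal strength; higher (e.g. -60) usually means closer
    duration_min: float  # minutes this signal level was observed

RSSI_CLOSE_THRESHOLD = -65   # assumed rough proxy for "within a couple of meters"
MIN_EXPOSURE_MINUTES = 15    # assumed minimum cumulative close contact

def is_possible_exposure(sightings: List[BeaconSighting]) -> bool:
    """Flag an exposure if cumulative time above the signal threshold exceeds the minimum.

    Why errors are baked in: a strong signal through an apartment wall can produce a
    false positive, while a phone buried in a backpack can attenuate the signal of the
    person sitting next to you and produce a false negative.
    """
    close_minutes = sum(s.duration_min for s in sightings if s.rssi_dbm >= RSSI_CLOSE_THRESHOLD)
    return close_minutes >= MIN_EXPOSURE_MINUTES
```

Merging signals like this with what manual tracers learn is one way to weed out the wall-and-backpack cases, which is why a combined approach has a higher likelihood of success.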
FROM: Jolynn
TO: Bob
CC: David
Bob, thank you so much for reading the post and for your thoughtful comments and questions. The Covid crisis highlights the numerous ways data and emerging tech could be used to benefit society. Benefitting society while preventing harm to individuals is not an unobtainable goal, but it will take concerted effort. We have long recognized as a society the sensitivity of health information, and we are getting there (slowly but surely) on location data. Acting on what we know by taking proactive (as opposed to merely reactive) steps to protect the privacy of personal information – through design, policy and law – is the place to start. A reactive step at this moment is passing a limited law dealing with the privacy of information collected for Covid-19 purposes – and this is absolutely better than nothing. A proactive step would be passing comprehensive privacy legislation that circumscribes collection and use of data more generally and contributes to the creation of an environment in which people can trust companies and governments not to repurpose, exploit or misuse their personal data. (Arguably, because we have waited so long to take obvious necessary legislative action, even a comprehensive privacy law could be broadly characterized as “reactive” at this point, but that is a topic for another post.)
Regarding the original post, my personal view is that voluntary digital contact tracing apps are not likely to be worth the existing privacy and security risks at this time, given our failure to implement the other necessary elements of a comprehensive, holistic response to the health crisis and the likelihood that they will not be used by sufficient numbers of citizens to make the notifications helpful or reliable. You mentioned in your introductory comments “feasibility” and the relevance of the dollars spent on contact tracing. I did not cover this topic adequately in my original post but certainly think it is a crucial consideration. Budgets are limited and strained, and every response we choose to invest in necessarily represents another option we do not pursue. So the question of whether to pursue digital contact tracing apps should not be considered in a vacuum but rather should be analyzed in terms of bang for the buck, so to speak. Is an investment in such apps the best, most effective use of our limited funds? And what potentially more useful responses are we forgoing? This question further highlights one of the downsides of the state-by-state approach the US is currently taking. How much more economically efficient might it be to have regional approaches or, sigh, leadership at the federal level? I strongly agree with David’s comment that the US needs a national strategy with clear recommendations on what data to collect, what technology to use, and what cybersecurity and privacy protections to put in place. I would add that these guidelines, like the 5-question analysis proposed in the blog post, should be applied to any and all personal data collected for the purposes of managing the Covid crisis.
FROM: Bob
TO: Jolynn
CC: David
So is there one thing that readers might urge their leaders to do, or urge technology companies to do, during the next couple of months that might bring us closer to these goals?
It seems like a federal privacy law is probably off the table between now and election day, so that won’t come in time to help with Covid.
Is there something else that might? Could a state pass a law? Could a tech firm adopt a model privacy policy around contact tracing apps? What kind of steps might any of these interested parties take that would at least move us a bit in the right direction? Sadly, I’m quite sure we’ll be dealing with Covid long after November.
FROM: Jolynn
TO: Bob
CC: David
State legislatures could pass laws or, in the alternative, Governors might issue executive orders to accomplish immediate goals. States can work to ensure that all local and state level health departments are on the same page and are employing similar privacy and security protections for data collected by manual contact tracers and any digital contact tracing apps or other technologies designed to manage Covid issues.
Tech firms and app developers should certainly have privacy policies in place, but those entities should also make explicit, affirmative guarantees that any data collected for purposes of responding to the Covid crisis (health, location or other personal data) will not be used for any other purpose or monetized, and will not be sold to or shared with any third parties, including law enforcement of any kind. Google and Apple could also bar apps from inclusion in the Google Play Store or App Store if they do not make such explicit commitments.
Want to participate in this dialog? Leave your comments below. We’ll keep the conversation going.