It's a little over a year since I last blogged anything. Partly that's because I had other things on my mind, and nothing struck me as being the kind of topic about which I had much to say. And then Covid-19 came along, and I still didn't have anything to say.
[Image caption: Gratuitous virus image! Yay!]
I think that a lot of people don't have much to say about it; but that doesn't seem to have stopped them saying it anyway. There's been a lot of nonsense. And there's also been a lot of stuff that isn't nonsense, but that is nonetheless trivial, as people find an excuse to make their rather quotidian thoughts about consent or resource allocation or whatever relevant by sticking "... in the Time of Coronavirus" at the end of the title and bunging it off to a journal. Hey-ho. The REF's coming. What do you expect?
The upshot is that I've been ignoring most of the CV-19 stuff; but every now and again, something catches my eye - such as this piece by Deena Davis on the Hastings Center blog: "Before We Turn to Digital Contact Tracing for Covid, Remember Surveillance in the Sixties", the conclusion of which is that "for me, digital contact tracing [through phone apps] is a bridge too far".
Why would this be? The concern articulated has to do with the misuse of data. Once you're being traced, who knows what'll happen to the data generated? There are precedents for such data being put to uses that one might not welcome:
Do you remember when we discovered that Uber’s passenger app not only traced you to your destination, but continued to trace where you went after you exited the car?
And so the worry is that by installing a tracing app, one would potentially be handing over vast amounts of information to the government, which might use it for sinister ends. For example, it might allow information to be gathered about immigration lawyers having met clients - presumably, not something that one would want to see. Correspondingly, we might see other instances of governments prying into personal lives. And this is at the root of the reason not to install the app.
In order to place such an app on my phone I would have to believe at least the following things: that the promised anonymity would be respected, that the government would not get hold of it, and that it would not be used to trace contacts for other reasons, e.g., to discover an immigration lawyer’s clients.
[W]e would need important safeguards against mission creep, whereby the surveillance app did not de-activate just because the pandemic was over. Perhaps the government discovers a new use for it; perhaps we kind of get used to it, the way we are used to the idea that our E-ZPass keeps a record of every toll booth we have gone through and our grocery store loyalty card keeps a record of the foods we buy. [Ryan] Calo [has] noted that “clear, explicit rules are critical,” but what point are rules if the government clandestinely subverts them?
Well, OK. But it's one thing to raise concerns about what a government might do nefariously with data gathered through a tracing app - quite another to take those concerns as settling matters. For one thing, we have to ask ourselves whether governments actually would do that. Perhaps they would. Perhaps not. It's notable that Davis doesn't really go beyond the "But what if..." stage of argument. But that's a really cheap move. We need to know more about the likelihood of this or that outcome. ("But what if my writing this blog inspires a white supremacist murder?" Well, I suppose there is a non-zero chance that it could, somehow. But it's not likely. It's not a reason not to write it. The example is hyperbolic, but I hope it gets the point across.)
Let's stick with the immigration lawyer example. It strikes me that there are a number of problems with this. The first is that people who are concerned about deportation are likely to be among those least likely to download any kind of track-and-trace app to begin with. And so - assuming I've understood the technology correctly and it'll be required for two phones to have it installed and active for it to work - there won't be a particular concern there. Even if I've misunderstood the technology, anyone who is particularly worried would be able simply to turn their phone off for a bit, or leave it at home, or something like that. This does undermine the efficacy of the app, for sure - and I'll come back to that point in a moment. (The qualifier would have to do with instances in which apps are installed automatically, like U2 albums. But if that's the case, there's nothing special about CV-19 apps, because presumably governments could install such apps anyway, and much more surreptitiously.)
In the meantime, it's also worth noting that there are rules about legal privilege that militate against the government making use of data gathered from such an app. And while I'm not sure how powerful this point is - it's not obvious that there'd be a way to distinguish reliably between a lawyer and a client meeting in the office (which would be privileged) and their meeting in Starbucks half an hour later (which wouldn't) - the principle applies; and, of course, if people are bumping into each other in the coffee shop, then this is a "civilian" interaction anyway, so whether people are immigrants or lawyers would be neither here nor there. They could easily be just people who happened to be close by at a given point.
"Ah, yes," the response might go, "It's true that the government would be in trouble if it subverted the rules on legal privilege; but so what? By then it's too late. And as Davis says, what point are rules if the government subverts them?" And, superficially, this has a certain attraction as an argument - except for two considerations. First, it militates against having rules on anything: if you've decided that they won't stop nefarious behaviour, you might as well not have them; and if you don't have the rules, then there's nothing about which to complain.
And it's a non sequitur, second: though the rules might not stop people determined to be bastards, their being there does give you a way to resist that bastardy. That's an important point of principle. So, for example, if you're an immigrant who's facing deportation because of the rule-breaking way an app was used, you'd have grounds for that deportation decision to be overturned or suspended, because there was (in effect) a due process violation. Now, admittedly, it might be that when push comes to shove, this makes no practical difference. But, as indicated, the point of principle would stand, and at the very least it would improve the prospects of other immigrants in a similar situation.
But a point that's more important than any of that is, I think, this: that the reason for having the app is important. Davis doesn't say much about that, concentrating instead on the reasons to be suspicious of it. But the examples she offers are examples of institutions using data for what we can take to be bad reasons. It is prima facie undesirable for Uber to scrape data for commercial reasons. It is prima facie undesirable for governments to try to sidestep legal norms when it comes to things like immigration (or anything else). There does not seem to be a particularly compelling, or even good - by which I mean morally defensible - reason for either of those things to happen.
An app for exposure to CV-19 is different. The underlying reason for that seems to be prima facie decent. So even if there are reasons con as well, matters are different from how they appear in the examples offered. We can't make sense of the desirability or otherwise of installing the app unless we give a full account of the reasons pro and con. It might even be that we find the reasons for a CV-19 app unconvincing all told. Nevertheless, there is a qualitative difference between it and the other examples Davis offers, precisely because there is a weighty moral reason for a CV-19 app that there isn't in other cases.
And this is where I go back to the point about lawyers switching off their phones for a while. There is a reason not to do that: it blunts the app's effectiveness, and the app serves a decent purpose. But maybe, if the government is untrustworthy enough, there is a reason to do it all the same.
If there are no particularly good reasons to install the app, then don't install it. If the positive reasons not to install it are strong enough, then don't install it. But slightly vague appeals to the risk of state surveillance don't seem to me to carry much weight.