Questions we should ask about COVID-19 contact-tracing apps

Three ways to judge contact-tracing apps from the point of view of "openness."

One of the cheering things about the pandemic crisis in which we find ourselves is the vast upswell of volunteering that we are seeing across the world. We are seeing this equally across the IT sector, and one of the areas where work is being done is in apps to help track COVID-19. Specifically, there is an interest in COVID-19 contact-tracing or -tracking apps for our mobile[0] phones. These aren't apps that keep an eye on whether you've observed lockdown procedures; rather they attempt to work out who has been in contact with whom and, from that, once we know that one person is infected with COVID-19, what the likely spread of the virus will be.

There are lots of contact-tracing initiatives out there, from PEPP-PT from the European Union to Singapore's TraceTogether, from the University of Washington's PACT to MIT's PACT.[1] Google and Apple are—unprecedentedly—working on an app together. There are lots of ways of comparing these apps and projects, but in today's article, I want to suggest three measures that can help you consider them from the point of view of "openness."

As regular readers of my blog will know, I'm a big fan of open source—not just for software, but for data, management, and the rest—and I believe that there's also a strong correlation here with civil or human rights. There are lots of ways to compare these apps, but these three measures are not too technical and can help us get a grip on the likelihood that some of the apps (and associated projects) may impinge on privacy and other issues about which we care. I don't want the data generated from apps that I download onto my phone to be used now or in the future to curtail my—or other people's—civil or human rights, for blackmail, or even for unapproved commercial gain.

1. Is it open source?

Our first question must be: "is the app open source?" If the answer is "no," then we have no way to know what is being captured and, therefore, how it is being used. If the app is closed source, it could be collecting data from pretty much any sensor or capability on our phones, including the camera, microphone, Bluetooth, WiFi, thermometer, GPS, and accelerometer. We can try restricting access to these measurements, but such controls have not always been effective, understanding the impact of turning them off is rarely simple, and people frankly rarely bother to check them anyway. Equally bad is the fact that with closed source, you can't have any idea of how good the security is, nor any chance to criticise and improve it. This is something about which I've written many times, including in my articles Review by many eyes does not always prevent buggy code and Trust and choosing open source. Luckily, it seems that the majority of contact-tracing apps are open source, but please be careful and reject any that are not.

2. Is the data centralised or distributed?

In order to make sense of all the data that these apps collect, there needs to be a centralised[2] store where it can be processed, right? It's common sense.

Actually, no. Although managing and processing data in one place can be much easier, there are ways to store data in a distributed manner and allow the sorts of processing needed for contact tracing to take place. It may be more complex, but it also makes it much, much more difficult for governments, corporations, or malicious actors to misuse this information. And we should be clear that misuse is what will happen if the data is made available. Maybe the best governments and the best corporations will be well-behaved by their standards, but a) those are not necessarily the standards that I or others will endorse, and b) what about malicious actors and governments and corporations that are not "the best"?
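To make the distinction concrete, here is a minimal Python sketch of the decentralised approach, loosely inspired by designs such as DP-3T: each phone keeps only a local log of the anonymous identifiers it has heard nearby, and matching against reported cases happens on the device itself. All of the class and function names below are illustrative assumptions, not any particular project's protocol.

```python
# Minimal sketch of decentralised exposure checking (illustrative only).
import secrets

def new_ephemeral_id() -> bytes:
    """A random, unlinkable identifier a phone could broadcast over Bluetooth."""
    return secrets.token_bytes(16)

class Phone:
    def __init__(self):
        self.broadcast_ids = []   # identifiers this phone has announced
        self.heard_ids = set()    # identifiers heard from nearby phones, stored locally

    def broadcast(self) -> bytes:
        eid = new_ephemeral_id()
        self.broadcast_ids.append(eid)
        return eid

    def hear(self, eid: bytes):
        self.heard_ids.add(eid)

    def check_exposure(self, published_positive_ids: set) -> bool:
        # Matching happens on the device: only the identifiers volunteered by
        # people who report a positive test are ever published centrally.
        return bool(self.heard_ids & published_positive_ids)

# Alice and Bob spend time near each other; Carol does not.
alice, bob, carol = Phone(), Phone(), Phone()
bob.hear(alice.broadcast())
alice.hear(bob.broadcast())

# Alice later tests positive and uploads only her own broadcast identifiers.
published = set(alice.broadcast_ids)
print(bob.check_exposure(published))    # True  - Bob is warned, locally
print(carol.check_exposure(published))  # False - nothing about Carol leaves her phone
```

The point of such a design is that the only thing ever published centrally is the set of identifiers volunteered by people who test positive; the social graph of who met whom never leaves individual phones.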

3. How is location or proximity tracking done?

This might seem like another obvious choice: if you want to find out who was in contact with whom, then the way to do it is to see who was where and when. GPS tracking—and associated technologies like WiFi access point location tracking—combined with easily available time data would give the ability to work out who was in a particular place at the same time as other people. This is true, but it also provides enormous opportunities for misuse, particularly when the data is held centrally (see above). An alternative is to use sensors like Bluetooth or NFC[3] to allow phones to collect information about other phones (or devices) with which they have been in contact and when. This is more easily anonymised—or pseudonymised—allowing information to be passed to the owners of those phones while, at the same time, being more difficult for governments, corporations, and malicious actors to misuse.
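Pseudonymisation in this context typically means broadcasting short-lived identifiers that cannot easily be linked back to a person, or to each other, by anyone listening. The sketch below shows one way this could work, deriving rotating identifiers from a device-local daily key with HMAC; the key names, sizes, and rotation interval are assumptions for illustration rather than any real app's specification.

```python
# Illustrative sketch of rotating, pseudonymous Bluetooth identifiers.
import hmac, hashlib, secrets

def daily_key() -> bytes:
    """Generated on the device each day; it never needs to leave the phone
    unless the owner chooses to report a positive test."""
    return secrets.token_bytes(32)

def ephemeral_id(key: bytes, interval: int) -> bytes:
    """A short identifier broadcast for one time slot (say, 15 minutes).
    Derived with HMAC, so observers cannot link slots together
    without knowing the daily key."""
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

key = daily_key()
# The phone rotates what it broadcasts throughout the day...
broadcasts = [ephemeral_id(key, i) for i in range(96)]  # 96 x 15-minute slots
# ...and if the owner reports a positive test, publishing the daily key lets
# other phones re-derive these identifiers and check their own local logs.
assert ephemeral_id(key, 10) == broadcasts[10]
```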

There are other issues to consider, one of which is that these sensors were not designed for this type of use, and we may be sacrificing accuracy if we choose this option. On the other hand, many interactions between people occur indoors, where GPS is much less effective anyway, and these types of technologies may help.

You could argue that this measurement is not about "openness" in itself, but it is a key indicator of whether the information collected can be used in ways that are far from open.

Other questions

There are many other questions we can ask about COVID-19 contact-tracing apps, some of which are related to openness and some of which are not. These include:

  • Coverage
    • Not all demographics have—or use—phones as much as the rest of the population, including the poor, the elderly, and certain religious groups. How effective will such projects be if they have reduced access to these groups?
    • Older devices may have less accurate sensors or not have some of the capabilities required by the apps. What is more, there may be a correlation between the use of these older devices and some of the demographics noted above.
    • Some people rarely update the apps on their phones, so even if they load an initial version of an app, newer versions with functionality or security improvements are likely to be unequally distributed across the set of devices.
  • Removal: How easy will it be to remove the application fully, what are the consequences of not doing so, and how likely are people to do so anyway?[4]
  • Will the use of these apps be mandatory or voluntary? If the former, there are serious concerns about civil or human rights, not to mention the problems noted above about coverage.

All of these questions are important, but not directly related to the question of the "openness" of the apps and projects. However, we have, right now, some great opportunities to work with and influence some really important projects for public health and well-being, and I believe that it is important that we consider the questions I've raised about openness before endorsing, installing, or using any of the apps that are being created.


0. Or "cell," if you're in North America.

1. Yes, they chose the same acronym. Yes, it is confusing.

2. Or, I suppose, "centralized," depending on your geography.

3. "Near Field Communication" – the same capability used when you do contactless payment with your phone or credit/debit card.

4. How many apps do you still have on your phone that you've not even opened for three months? Yup, me too.


This originally appeared on Alice, Eve, and Bob—A security blog and is republished with permission.


This work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.