COVID-19: some concerns about Contact Tracing apps

The Electronic Frontier Foundation, one of the most respected organizations for the protection of privacy and digital rights, which has fought abuses of digital technology since its founding, has published a comprehensive article that takes stock of anti-pandemic tracking apps, with an excellent introduction to the basic concepts of the topic.

These apps, which use proximity or location data to alert mobile phone users when they have come into contact with an infected person, could become essential tools for countries looking to reopen their economies after lengthy lockdowns. However, there are growing tensions over the best approach to coronavirus contact-tracing apps and whether the technology can live up to its promise.

Andrew Crocker, Kurt Opsahl, and Bennett Cyphers collected the main doubts and tried to address them [1].

Below are some highlights:

How Do Proximity Apps Work?

There are many different proposals for Bluetooth-based proximity tracking apps, but at a high level, they begin with a similar approach. The app broadcasts a unique identifier over Bluetooth that other, nearby phones can detect. To protect privacy, many proposals, including the Apple and Google APIs, have each phone’s identifier rotated frequently to limit the risk of third-party tracking.
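The rotating-identifier idea can be sketched as follows. This is a simplified illustration, not the actual Apple/Google Exposure Notification scheme: the key size, the HMAC construction, and the interval length are all assumptions chosen for clarity.

```python
import hashlib
import hmac
import os

def make_daily_key() -> bytes:
    """A fresh random secret, regenerated once per day (assumed key size)."""
    return os.urandom(16)

def rolling_identifier(daily_key: bytes, interval_index: int) -> bytes:
    """Identifier broadcast during one short interval (e.g. ~10 minutes).

    Derived with HMAC so identifiers from different intervals look
    unrelated to an observer who does not hold the daily key.
    """
    msg = b"rolling-id-" + interval_index.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

key = make_daily_key()
id_now = rolling_identifier(key, 0)
id_next = rolling_identifier(key, 1)
print(id_now != id_next)  # identifiers change every interval
```

Because each broadcast identifier changes frequently and cannot be linked back to the phone without the secret key, a third party passively listening to Bluetooth traffic has a much harder time tracking a device over time.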

When two users of the app come near each other, both apps estimate the distance between each other using Bluetooth signal strength. If the apps estimate that they are less than approximately six feet (or two meters) apart for a sufficient period of time, the apps exchange identifiers. Each app logs an encounter with the other’s identifier. The users’ location is not necessary, as the application need only know if the users are sufficiently close together to create a risk of infection.
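The distance-and-duration decision described above might look like this in code. The path-loss constants and thresholds here are assumptions for illustration; real apps must calibrate signal-strength models per device model, and the log-distance formula used is only a rough approximation.

```python
import time

TX_POWER_AT_1M = -59      # expected RSSI in dBm at 1 metre (assumed)
PATH_LOSS_EXPONENT = 2.0  # free-space propagation exponent (assumed)
CLOSE_METERS = 2.0        # the ~six feet / two metres from the text
MIN_DURATION_S = 600      # "sufficient period of time" (assumed: 10 min)

def estimate_distance_m(rssi_dbm: float) -> float:
    """Rough distance estimate from Bluetooth signal strength
    (log-distance path-loss model)."""
    return 10 ** ((TX_POWER_AT_1M - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

def should_log_encounter(rssi_dbm: float, duration_s: float) -> bool:
    """Log only sustained, close-range contacts -- no location needed."""
    return (estimate_distance_m(rssi_dbm) <= CLOSE_METERS
            and duration_s >= MIN_DURATION_S)

encounters = []  # (peer identifier, timestamp) pairs, kept on-device
if should_log_encounter(rssi_dbm=-65, duration_s=900):
    encounters.append(("peer-identifier", time.time()))
```

Note that the app stores only the peer's rotating identifier and a timestamp: nothing about where the encounter happened, which is exactly the privacy property the text highlights.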


Would Proximity Apps Be Effective?

Traditional contact tracing [2] is fairly labor-intensive, but can be quite detailed. Public health workers interview the person with the disease to learn about their movements and people with whom they have been in close contact. This may include interviews with family members and others who may know more details. The public health workers then contact these people to offer help and treatment as needed, and sometimes interview them to trace the chain of contacts further. It is difficult to do this at scale during a pandemic. In addition, human memory is fallible, so even the most detailed picture obtained through interviews may have significant gaps or mistakes.


Would Proximity Apps Do Too Much Harm to Our Freedoms?

Any proximity app creates new risks for technology users. A log of a user’s proximity to other users could be used to show who they associate with and infer what they were doing. Fear of disclosure of such proximity information might chill users from participating in expressive activity in public places. Vulnerable groups are often disparately burdened by surveillance technology, and proximity tracking may be no different. And proximity data or medical diagnoses might be stolen by adversaries like foreign governments or identity thieves.

To be sure, some commonly used technologies create similar risks. Many track and report your location, from Fitbit to Pokemon Go. Just carrying a mobile phone brings the risk of tracking through cell tower triangulation.
Stores try to mine customer foot traffic through Bluetooth [3].


Security

An application running in the background on a phone and logging a user’s proximity to other users presents considerable information security risks. As always, limiting the attack surface and the amount of information collected will lower these risks. Developers should open-source their code and subject it to third-party audits and penetration testing. They should also publish details about their security practices.


Addressing Bias

…contact tracing applications will leave out individuals without access to the latest technology. They will also favor those predisposed to count on technology companies and the government to address their needs. We must ensure that developers and the government do not directly or indirectly leave out marginalized groups by relying on these applications to the exclusion of other interventions.


Expiration

When the COVID-19 crisis ends, any application built to fight the disease should end as well. Defining the end of the crisis will be a difficult question, so developers should ensure that users can opt out at any point. They should also consider building time limits into their applications themselves, along with regular check-ins with the users as to whether they want to continue broadcasting. Furthermore, as major providers like Apple and Google throw their weight behind these applications, they should articulate the circumstances under which they will and will not build similar products in the future.
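A built-in sunset plus periodic re-consent could be as simple as the sketch below. The end date and the re-consent interval are hypothetical values, used only to illustrate the "expiration" idea described above.

```python
from datetime import datetime, timezone

SUNSET = datetime(2021, 12, 31, tzinfo=timezone.utc)  # assumed end date
RECONSENT_DAYS = 30  # re-ask the user this often before broadcasting

def may_broadcast(now: datetime, last_consent: datetime) -> bool:
    """Stop broadcasting after the sunset date, or when the user's
    most recent opt-in is older than the re-consent window."""
    if now >= SUNSET:
        return False
    return (now - last_consent).days < RECONSENT_DAYS
```

Checking consent age on every broadcast decision, rather than once at install time, is what makes the opt-out effective at any point, as the text recommends.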

I highly recommend taking a look at the original post [1] for more detailed information.


References

  1. The Challenge of Proximity Apps For COVID-19 Contact Tracing
  2. Contact tracing – Wikipedia
  3. In Stores, Secret Surveillance Tracks Your Every Move
