NHSX finally released the Data Protection Impact Assessment (DPIA) for their contact tracing app at the weekend. DPIAs are risk analyses, meant to identify and address potential privacy and security issues prior to the deployment of a high-risk application like the NHSX app. They also play a fundamental role in fostering public trust, as they ultimately allow us to know whether we can trust the software we are asked to install on our phones.
At the time of writing, the documents are unavailable: some material was accidentally published alongside them, as highlighted by Matt Burgess at Wired.
We have already highlighted the Government's lack of due diligence so far (link). Although the public release of this DPIA is a positive development, significant shortcomings persist, as outlined below.
Risk assessment is unsatisfactory
Assessing privacy risks is the most important aspect of a DPIA, as it should allow adequate safeguards to be put in place. Unfortunately, what we have read so far gives cause for concern: some threats are merely listed, with no appreciable safeguards in place, such as:
re-identification of app users;
re-use of contact tracing data for other purposes;
potential impact of measures taken against individuals as a result of the data which has been collected.
Other risks have been downplayed. For instance, the DPIA admits that "malicious or hypochondriac" use of the self-diagnosis tool risks generating false alerts, an occurrence which NHSX claims it will prevent simply by forbidding such use in the app's terms and conditions.
Finally, NHSX decided not to submit this DPIA to the Information Commissioner's Office for prior consultation, which they are obliged to do if a DPIA reveals significant residual risks. It is clear to us that this is the case, and NHSX could certainly use the ICO's help.
NHSX claims that the app neither profiles you nor interferes with your liberties
According to the DPIA, contact tracing will not result in any deprivation of your rights and freedoms, nor involve profiling or evaluation of your personal aspects. However, this app is designed to:
evaluate your likelihood of being infected, by means of a risk-scoring algorithm that profiles your daily encounters;
influence your behaviour, e.g. by pushing you to self-isolate; and
inform the strategic management of Public Health responses, e.g. by allowing the Government to identify geographic areas to put into quarantine.
Thus, this is yet another example of NHSX improperly downplaying the risks associated with the use of its app.
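To see why the first point amounts to profiling, consider what any contact-based risk scorer must do: take a log of a person's encounters and derive an evaluation about that person. The sketch below is purely illustrative — the field names, weights, and threshold are our assumptions, not anything published in the DPIA — but it shows that even a trivial scorer builds a judgement about an individual from their behavioural data.

```python
from dataclasses import dataclass

@dataclass
class Encounter:
    # Hypothetical fields; the DPIA does not publish the real algorithm.
    duration_minutes: float  # how long two phones were near each other
    avg_distance_m: float    # proximity estimated from Bluetooth signal strength

def risk_score(encounters):
    """Toy weighted score: longer, closer contacts contribute more."""
    score = 0.0
    for e in encounters:
        # Weight falls linearly to zero beyond ~4 metres (illustrative choice).
        proximity_weight = max(0.0, 1.0 - e.avg_distance_m / 4.0)
        score += e.duration_minutes * proximity_weight
    return score

def should_self_isolate(encounters, threshold=15.0):
    """Advice is triggered once the cumulative score crosses a threshold."""
    return risk_score(encounters) >= threshold
```

Whatever the real weights are, the structure is the same: the score is a function of a person's logged encounters, and the output steers that person's behaviour — which is precisely an evaluation of personal aspects.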
Your rights may be limited by design choice
The contact tracing app will transmit your personal information to a central database managed by NHSX. The GDPR gives you the rights of access, erasure, and objection regarding this data, and obliges the NHS to justify any limitation of those rights.
However, the app currently lacks this functionality, and the DPIA states that enabling these rights to be exercised will depend on technical practicality.
This claim is unfounded: the app could allow you to exercise your rights by design, the same way it allows you to link your medical condition to your data stored on the centralised server.
NHSX goes on to state that, if the decision to limit your rights were taken, it would rest on an "overriding legitimate interest" and on the "fact" that your data is not identifiable. These reasons are groundless: an overriding legitimate interest must be assessed on a case-by-case basis and cannot be relied upon unconditionally. At the same time, data about your encounters can be shown to you without directly identifying you, just as it can be linked to your self-diagnoses. If the data can sometimes be presented and linked to you, it is unclear what technical obstacle prevents providing access and deletion rights.
The DPIA improperly describes anonymisation, and leaves room for ambiguity
Data is anonymous when it cannot be used to identify a given person. On the other hand, the NHSX app works with pseudonyms — i.e., by “including identifiers which are unique to individuals”, in order to uniquely recognise the activity logs generated by a given person.
It follows that claims such as “The App is designed to preserve the anonymity of those who use it” are inherently misleading, yet the term has been heavily relied upon by the authors of the DPIA. On top of that, many statements leave ambiguities, such as this:
“Data may be fully anonymised for public health planning, research or statistical purposes, if those purposes can be achieved without the use of personal data. Otherwise, data may be linked […] with such additional protection that is required as an output of data protection impact assessment.”
We cannot decipher the meaning of this phrase: will our data be used for research purposes only if and insofar as non-personal data suffices? Or will our data be used regardless, and anonymised where possible? Furthermore, the law imposes strict safeguards as a precondition for using data for research purposes, and the DPIA does not address their details or even their existence.
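The distinction the DPIA blurs can be made concrete. A sketch, under our own assumptions (the hashing scheme and log layout below are hypothetical, not the app's actual design): a stable per-user pseudonym makes every record from the same person linkable, so the data remains personal data under the GDPR, whereas a true anonymous aggregate discards the identifiers entirely.

```python
import hashlib
from collections import defaultdict

def pseudonym(user_id: str, salt: str = "install-time-salt") -> str:
    """A stable per-user identifier: not a name, but still unique to one person."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:16]

# Event logs keyed by pseudonym remain linkable: every record from the
# same person carries the same identifier, so this is pseudonymous data,
# not anonymous data.
logs = defaultdict(list)
for user, event in [("alice", "contact@09:00"),
                    ("bob", "contact@09:05"),
                    ("alice", "self-diagnosis@18:00")]:
    logs[pseudonym(user)].append(event)

# Both of Alice's events collapse onto one key — her activity is linkable,
# even though her name never appears in the dataset.
assert len(logs[pseudonym("alice")]) == 2

# By contrast, an aggregate count discards the identifiers and can no
# longer be traced back to any individual.
total_events = sum(len(events) for events in logs.values())
```

This is why "the App preserves anonymity" and "the App uses identifiers unique to individuals" cannot both be accurate descriptions of the same system.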
Many questions are left unanswered
There are many other areas of significant concern for people’s privacy and security which are left untouched.
NHSX hit the news for its decision to rely on a centralised approach, contrary to most other European countries (link) and to the design choices of Apple and Google (link). A centralised solution is not a bad option in itself, but it is much more intrusive and must therefore be justified: the Parliament's Joint Committee on Human Rights has stated that "Without clear efficacy and benefits of the app, the level of data being collected will not be justifiable" (link). NHSX should therefore clearly explain the necessity on which the decision to adopt a centralised model was based, as well as the metrics and KPIs by which it plans to measure that model's added value over the decentralised alternative.
Additionally, the DPIA mentions only NHS England and NHS Improvement as entities with whom your data may be shared. Although this would be good news, we also note that the DPIA states that data will be shared for research and statistical purposes, yet leaves us without a clue as to which entities may receive the data for those purposes.
Finally, will any of the data generated and collected by NHSX be available to law enforcement and security agencies, such as the Home Office or GCHQ? Indeed, the UK government has recently asked to be granted more investigatory powers (link), and is relying on a surveillance company to analyse health data (link). The NHS will also have broad discretion under the GDPR to release data to the police when a crime is being investigated; sometimes they may feel this is justified, for instance in the case of a missing child or terrorism. If the data might be re-identified and released in certain circumstances, this should be spelled out.
Thus, we believe this should be a matter of concern, and a clear answer concerning any possible involvement of such agencies should be put on record.