We have heard that contact tracing is key to easing the lockdown and getting Britain back on its feet. In fact, other countries like Germany are already trialling their own apps, and Apple and Google have now released a software tool to assist the release and use of coronavirus contact-tracing apps around the world.
We were also told that the NHS Covid-19 contact tracing App would be rolled out nationwide by mid-May 2020, although the Government then extended this date to the beginning of June. Meanwhile, the pilot of the App on the Isle of Wight has now entered its third week of testing. Given that we have already seen security breaches relating to Serco, the outsourcing company the UK Government is using to assist with its test, track and trace programme, and as we await the results of the pilot, we take a look at the privacy and data protection issues that surround the controversial App.
It is clear that the aims and objectives of the App are of the highest order – it could help to save thousands of lives and provide much needed insight into how we might control the spread of this deadly virus. However, there are serious concerns around privacy and data protection rights that need to be addressed before the App is rolled out across the UK. The most prominent concerns relate to the information held and/or used by the App about your location, your health data and personal identifiers such as your name or date of birth.
How does the NHS Covid-19 Contact Tracing App work?
The App uses Bluetooth Low Energy, the same type of technology that devices such as fitness trackers use to communicate with smartphones. In the case of this App, once it has been downloaded onto a user’s mobile device, a random user ID will be generated. The App will broadcast its device’s user ID and will look for other mobile devices using the App within a 2-metre radius, allowing the App to identify the people from whom the user has been close enough to catch the virus. The App will log those devices as contacts by collecting their user IDs, together with the distance between the two devices and the duration of the interaction.
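The logging mechanism described above can be sketched in a few lines of Python. This is purely illustrative: the class and field names are hypothetical, and the only detail taken from the description is the random user ID and the 2-metre proximity cut-off.

```python
import secrets
from dataclasses import dataclass


@dataclass
class ContactEvent:
    """One logged encounter with another device running the App."""
    contact_id: str      # the other device's random user ID
    distance_m: float    # estimated from Bluetooth signal strength
    duration_min: float  # how long the devices stayed in range


def generate_user_id() -> str:
    # A random identifier, not derived from any personal data
    return secrets.token_hex(16)


class ContactLog:
    def __init__(self) -> None:
        self.user_id = generate_user_id()
        self.contacts: list[ContactEvent] = []

    def record(self, contact_id: str, distance_m: float, duration_min: float) -> None:
        # Only log devices within the 2-metre radius described above
        if distance_m <= 2.0:
            self.contacts.append(ContactEvent(contact_id, distance_m, duration_min))
```

In this sketch, an encounter outside the 2-metre radius is simply discarded, while anything closer is stored with its distance and duration for later upload.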
In order for contact tracing to work as an epidemiological tool, the NHS must be able to identify those people who may have caught the virus. Users of the App will be encouraged to log their symptoms using the self-reporting tool. If a user becomes unwell, they will also be encouraged to upload their user ID and the data about the interactions they have had with their contacts to a centralised NHS system. Using a clinical algorithm, the centralised database will assess the uploaded information. Any contact who is deemed to have had a high-risk interaction with the user will be notified that they should isolate for 7 or 14 days. The individual with symptoms must then be tested and if the test returns a negative result, a further alert is sent out to their contacts to instruct them that self-isolation is no longer necessary.
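The notification step can be sketched as a simple threshold rule. The actual NHS clinical algorithm has not been published, so the thresholds below (2 metres, 15 minutes) are assumptions chosen for illustration only.

```python
def is_high_risk(distance_m: float, duration_min: float,
                 max_distance_m: float = 2.0,
                 min_duration_min: float = 15.0) -> bool:
    # Illustrative rule only: the real NHS clinical algorithm is not public.
    # Close and prolonged contact is treated as a high-risk interaction.
    return distance_m <= max_distance_m and duration_min >= min_duration_min


def contacts_to_notify(uploaded_contacts: list[tuple[str, float, float]]) -> list[str]:
    # uploaded_contacts: one (contact_id, distance_m, duration_min) tuple
    # per interaction uploaded to the centralised system
    return [cid for cid, dist, dur in uploaded_contacts
            if is_high_risk(dist, dur)]
```

A centralised design means this scoring runs on the NHS server against the uploaded interaction data, rather than on each user's device.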
If this can be done, the UK may be able to contain the virus in a controlled area, whether at a national or regional level, which would obviously be a great result. However, it is important, if UK citizens are being encouraged to use the App, that they understand what that means for the information gathered about their location and their health and medical data.
There are a number of core legal hurdles relating to the Human Rights Act 1998 and the Data Protection Act 2018, and the accompanying GDPR, that the NHS will need to overcome in order to use the App lawfully.
Human Rights Act
Contact tracing involves the State gathering information on its citizens. As such, Article 8 of the European Convention on Human Rights, the right to a private life, is quite clearly engaged. One of the requirements of this type of interference is that it must be necessary in the interest of, inter alia, public safety and for the protection of health. What is clear is that contact tracing cannot work unless everyone who logs Covid-19 symptoms is tested. In order to determine its necessity, it must, therefore, first be established whether the rate of testing can keep up with the number of people identified by the App as requiring testing. If the users who self-report using the App cannot subsequently be tested, this undermines the necessity argument. The App developers also need to ensure that its use doesn’t have perverse consequences, such as complacency if users receive no notifications to self-isolate, or hysteria if users are constantly being told that at least one of their contacts, whose identity will be unknown to the user, has contracted the virus.
If the App is not effective, either because there is not enough take-up to provide an epidemiological basis or because there is sufficient take-up but insufficient testing capacity, the App may start its life as lawful and then quickly become unlawful because some other aspect renders it unnecessary or disproportionate.
Data Protection Act and GDPR
The data collected includes special category personal data in the form of health and medical information. If the data is anonymised, which we are told it is, it would not be classed as personal data. However, if, upon looking at the data, a data subject can be singled out and treated differently from others, it may still be classified as personal data.
It should also be noted that once you upload your flu-like symptoms into the App, you may be anonymised, but as soon as you get tested you are no longer anonymous and neither is any of the previous information entered into the App.
In addition, in relation to the location information, the GDPR provides strict rules regarding the processing of location data. It is entirely possible that the location data together with the medical data, albeit anonymised, might allow an individual to be identified.
The user will be tracked everywhere they go, such as on the tube, to work and home again. When a user first signs up to the App, they will be asked to enter the first part of their postcode; we are told this is to assist the NHS with resource planning. However, this information goes beyond anonymous location data, and although a partial postcode cannot be narrowed down as precisely as GPS, it could still lead to the user being identified if they live in a small village, for example. Once two contacts are in close proximity to one another, the App will also record for how long. It is therefore possible that, if two contacts spend significant periods of time together at specific times of the day in specific locations, and one of them self-reports and gets tested, and is therefore no longer anonymous, that user’s contact may be identified as, for example, their partner or colleague.
Under the GDPR, it is necessary to identify a lawful basis for this type of processing. Although users choose to download the App, and explicit consent will be required, consent should not be the basis relied upon. The App has been described as voluntary, implying that the user can opt out if they wish. Although the user can stop using and delete the App, they cannot in reality withdraw consent or exercise their right to erasure, as their data will not be deleted.
Given that Google and Apple, who have been the champions of privacy during this pandemic, have developed systems that are already in use elsewhere in the world, it is unclear why the Government has decided to create its own system. This is especially so given that the Google and Apple model does not necessitate centralisation of the data, thereby removing the danger of identification of the data subject. Instead, the Government has enlisted NHSX and VMware Pivotal to develop and control the app. There are a number of data security questions surrounding both companies, and although NHSX has received advice from Government security experts at the National Cyber Security Centre (NCSC), given that it is a new app, it is still possible that there are security flaws, some of which will not be seen until the App is fully up and running at mass scale.
It is also understood that GCHQ, who are overseeing the use of the App alongside the NHS, will be issued a ‘master key’, so that once the App is on your phone, GCHQ will have a way to re-identify the data at a later date. At the moment, we assume that this would be for as long as the data exists on the servers accessible to GCHQ – not just during the pandemic. This could lead to the data being used for other purposes, such as criminal investigations. The NHS has also said that it intends to retain the data indefinitely because of its value. What we do not know at this stage is what the NHS intends to do with it in the future – will it be shared with other nations, sold, or used internally for research purposes? It is not clear. The NHS may delete some data once the pandemic is over, but it will, in essence, be reluctant to let it go unless required to do so.
At the moment, what is intended to be put in place is a form of large-scale surveillance, with infrastructure that will be very difficult to remove. The App is being billed as voluntary; what remains to be seen, however, is whether the Government will attempt to enforce its usage on the nation as a way to control how we emerge from lockdown, and whether in the coming months there will be restrictions placed on those who refuse to use the App.
To determine what users’ rights are, it is critical that the nation is given clarity and transparency in terms of how user data is being used now and what is going to be done with it in the future. It is clear that contact tracing and testing is needed in order to get us out of lockdown, and that this is required sooner rather than later. However, if it is going to take time to develop an app that assuages privacy and data protection concerns, we need to ask whether we want to sacrifice our privacy rights, as citizens agreed to do in South Korea, in order to beat the virus.