The UK Government COVID App: Should We Use It?

by Matthew Cunliffe | 3rd May 2020 | COVID-19, Development, Mobile Platforms, News, Security

Updated on 5th May 2020 to reflect further information on data security.

The UK government has been working on a COVID tracking app for the last month or so, designed to notify you if you have come into close proximity with someone who has been diagnosed with the Coronavirus. A self-diagnosis would flag a yellow alert, with a subsequent positive medical test sending a red alert – which can only be triggered if you have been provided with a verification code.
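The two-tier alert logic described above can be sketched in a few lines. This is a hypothetical illustration, not the app's actual code: the code format (`NHS-1234`) and the function names are assumptions made for the example.

```python
# Hypothetical sketch of the alert logic: a self-diagnosis raises a
# yellow alert, while a red alert requires a verification code that is
# only issued alongside a positive medical test.

VALID_CODES = {"NHS-1234"}  # hypothetical verification codes issued with test results

def alert_level(self_diagnosed, verification_code=None):
    if verification_code is not None:
        if verification_code not in VALID_CODES:
            raise ValueError("invalid verification code")
        return "red"     # confirmed by a positive medical test
    if self_diagnosed:
        return "yellow"  # unverified self-diagnosis
    return "none"
```

The key design point is that the stronger (red) alert cannot be raised by self-reporting alone, which limits malicious or mistaken escalation.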

This, in itself, is a useful tool, both to ensure your own safety and that of others, should you or a close family member be diagnosed. However, to be truly effective, the government estimates that it needs at least a 60% take-up across the country.

One could argue that this is too little, too late – after all, the Prime Minister has himself declared that the worst of the pandemic is over in the UK. This, however, is not only about the current pandemic, but also about guarding yourself in the future, should another pandemic occur.

That Sounds Great – So What’s The Problem?

Both Google and Apple have created APIs within their smartphone operating systems, which allow apps to use Bluetooth to identify who you have been in contact with, and warn you if someone has tested positive for COVID-19, keeping all the information stored locally on your phone. However, the British version, designed by NHSX – the health service’s digital innovation team – is designed to capture this information in a centralised database, ostensibly to provide a better view of the state of the virus across the UK.
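The decentralised approach that Google and Apple designed can be sketched as follows. This is a heavily simplified illustration, not the real Exposure Notification protocol: the class and method names are invented for the example, and real systems use cryptographically derived, regularly rotating identifiers rather than plain random strings.

```python
import os

# Simplified sketch of decentralised contact tracing: each phone
# broadcasts short-lived random identifiers and records the identifiers
# it overhears. Matching against confirmed cases happens on the device,
# so no central database of contacts is ever built.

class Phone:
    def __init__(self):
        self.my_ids = []        # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers overheard from nearby phones

    def new_broadcast_id(self):
        # Rotate to a fresh random identifier (real systems rotate roughly
        # every 15 minutes to prevent long-term tracking).
        rolling_id = os.urandom(16).hex()
        self.my_ids.append(rolling_id)
        return rolling_id

    def hear(self, rolling_id):
        self.heard_ids.add(rolling_id)

    def check_exposure(self, published_positive_ids):
        # Decentralised matching: compare the published identifiers of
        # confirmed cases against what this phone overheard, locally.
        return bool(self.heard_ids & set(published_positive_ids))

alice, bob = Phone(), Phone()
bob.hear(alice.new_broadcast_id())  # Alice and Bob were in proximity

# Alice tests positive and publishes only her own broadcast identifiers;
# Bob's phone discovers the match without telling anyone.
exposed = bob.check_exposure(alice.my_ids)
```

The NHSX design differs precisely at the last step: instead of publishing identifiers for on-device matching, contact events are uploaded to a central server, which then performs the matching and holds the resulting contact graph.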

On top of that, the app will ask to upload all of your contacts so that it can identify who is most likely to be at risk and then contact them using the information you have provided.

This is where it gets murky and steps beyond the bounds of keeping you and your family safe, and into the area of government monitoring.

Certainly, it would be incredibly useful to the NHS to identify where hotspots are occurring, and potentially what kind of person is catching the virus (for example, based on age, ethnic lines, gender etc). Unfortunately, both the NHS and the UK government (and not just this current one) have a dubious history in using (and abusing) our data.

Technical Constraints

The technology used to identify people you have been in contact with is Bluetooth Low Energy (LE), a lightweight protocol which can reach up to 100 metres outdoors, and easily 10-15 metres indoors, even through thick brick walls.

Whilst the app can also implement proximity sensing (the Bluetooth Proximity Profile, PXP) to estimate how close you are to another user, this isn't overly reliable. In a heavily populated area such as London, you are likely to pick up responses from neighbours in a block of flats (adjacent, above, and below), or from people simply walking past your front door. Even in less populous suburbs this is likely to be an issue. Then consider that, on a trip to the supermarket, you might get a response from everyone in the building.
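The unreliability is easy to demonstrate. Proximity is usually inferred from received signal strength (RSSI) via a log-distance path-loss model, but the model's environment exponent is unknown in practice. A minimal sketch, assuming a typical calibration value of -59 dBm at 1 metre (the exact figure varies by handset):

```python
# Why Bluetooth proximity sensing is unreliable: the same measured
# signal strength maps to very different distances depending on the
# (unknown) environment.

def estimate_distance(rssi_dbm, tx_power_dbm=-59, n=2.0):
    """Log-distance path-loss estimate in metres.

    tx_power_dbm: expected RSSI at 1 m (device-dependent assumption).
    n: path-loss exponent (~2 in open air, 3-4 through walls and bodies).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

# One measurement, two plausible environments, two very different answers:
rssi = -75
open_air = estimate_distance(rssi, n=2.0)  # ~6.3 m if outdoors
indoors = estimate_distance(rssi, n=3.5)   # ~2.9 m if through walls
```

A reading of -75 dBm could mean a safe 6 metres in the open, or under 3 metres indoors – which is exactly why a signal punching through a flat's wall can look like a close contact.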

Quite obviously, your phone is going to suffer from data overload, and the app will become more of a burden than of any use.

Can You Trust The Government With Your Data?

Several years ago, the NHS attempted to centralise all of our health records, but was insufficiently clear as to who would be able to access that data – which seemed to include private companies. Despite protestations of anonymity, it was shown that the data could easily be traced to specific individuals. There was also a provision that researchers could apply for the anonymity to be lifted in exceptional circumstances – for example, during a pandemic.

Meanwhile, Google purchased DeepMind and worked with five NHS trusts to analyse patient data. Unfortunately for them, the Information Commissioner's Office ruled that the protection of 1.6 million patients' data was lacking.

Then there are the numerous cases of data loss by public services: from unencrypted USB drives containing patient records or details of high-risk offenders, to CDs containing child benefit claimants' details.

If their lack of care over the security of your personal data doesn't worry you, then who will be able to see it (and who shouldn't, but still will) should. Over the last few years, more and more laws have been introduced, such as the "Snooper's Charter", giving the government the right to track your online activity should it so wish. The Regulation of Investigatory Powers Act, passed in 2000 under a Labour government, was less about regulation and more about giving virtually all public services access to data.

For example, the following can access data under RIPA if they see fit – and they haven’t been shy in doing so:

  • Any county, district or borough council
  • HM Revenue and Customs
  • Department of Trade and Industry
  • The Office of Fair Trading
  • The NHS
  • The military
  • The Gambling Commission
  • OfCom
  • And the list goes on…

The app, developed under NHSX's purview, has also been a collaborative effort with an AI company called Faculty – which was also involved in the Vote Leave campaign, notoriously lax in its application of data protection law – and Palantir, a data-mining company founded by Peter Thiel, a strong supporter of Donald Trump.

Recently, it was also discovered that the Department for Education had an agreement, dating from 2015, to share student data from schools with the Home Office in pursuit of its hostile environment policy for migrants. This was only stopped once it was made public. Bear in mind that, as with the NHS, none of this was communicated to the public.

What Is Being Said?

Matt Hancock, the Health Secretary, has said:

If you become unwell with the symptoms of coronavirus, you can securely tell this new NHS app and the app will then send an alert anonymously to other app users that you’ve been in significant contact with over the past few days, even before you had symptoms, so that they know and can act accordingly.

All data will be handled according to the highest ethical and security standards, and would only be used for NHS care and research.

And we won’t hold it any longer than is needed.

Professor Ross Anderson from the University of Cambridge has this view:

Personally I feel conflicted. I recognise the overwhelming force of the public-health arguments for a centralised system, but I also have 25 years’ experience of the NHS being incompetent at developing systems and repeatedly breaking their privacy promises when they do manage to collect some data of value to somebody else.

Meanwhile, the Chief Executive of NHSX, Matthew Gould, told MPs on 4th May that if users give the app permission to upload data to the NHS, they can no longer request that the data be deleted, and it could be used for ongoing research. That would be all well and good if the data were anonymised, but we already know that it isn't: it will be linked to a particular person, and to a general location, using a unique ID.

What Do We Say?

The idea of an app to warn you of potential contact with someone who may have been diagnosed with COVID-19 is a good one – after all, it helps you and all of us to stay safe, and to help beat this pernicious virus a lot more quickly. However, the UK government has overstepped the mark, which should come as no great surprise given their previous activities.

Mr Hancock’s statement that they won’t hold the data any longer than is needed means absolutely nothing – they could easily claim that it is needed right up until your great-grandchildren are paying their last respects.

And the British government has proven time and time again that their ethical and security standards fall well short of anything we would, or should expect.

The issue of identifying your proximity to someone else with the app, together with the likelihood of data overloading in populous areas, or false positives from quarantined neighbours, also makes it less useful.

This could have been a useful tool in helping to rid the world of Coronavirus: unfortunately, given their history, I don’t trust the government one iota with the ethical use or the security of my personal data.


The app isn't likely to be reliable; it will suffer from too many false positives and data overload; and based on past evidence, you can't trust the government with your data, particularly in light of recent statements about what data will be uploaded, who can access it, and whether it will ever be deleted.

Don't download the app unless the centralised data capture is removed – and even then, the app is not going to prove very useful in the long run.



Matthew Cunliffe

Matthew is an IT specialist with more than 24 years' experience in software development and project management. He has a wide range of interests, including international political theory; playing guitar; music; hiking, kayaking, and bouldering; and data privacy and ethics in IT.


