Google and Apple announced a joint emergency project on Friday. It’s an urgent, complex effort with huge implications for privacy and public health. Similar projects have been successful in Singapore and other countries.
We covered the basic outlines of the project here, but there is a lot more to dig into, starting with the technical documents published by the two companies. They reveal a lot about what Apple and Google are trying to do with this sensitive data, and where the project falls short.
When a person tests positive, public health workers try to contain the spread by tracking down and quarantining everyone that infected person has been in contact with. This is called contact tracing, and it’s a crucial tool in containing outbreaks.
Apple and Google are building an automated contact-tracing system that can operate at a far greater scale than conventional contact tracing. Some of this functionality will also be built into Android phones and iPhones.
What Apple and Google are working on together is a framework and not an app. They’re handling the plumbing and guaranteeing the privacy and security of the system, but leaving the building of the actual apps that use it to others.
The system lets your phone log other phones that have been nearby. As long as this system is running, your phone will periodically blast out a small, unique, and anonymous code. Other phones in range receive that code and remember it.
If someone using the system tests positive, the codes their phone sent out while they were contagious are uploaded to a central database. When your phone checks back with that database, it runs a local scan to see whether any of the codes in its log match the IDs in the database. If there’s a match, you get an alert on your phone saying you’ve been exposed.
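In rough terms, the broadcast-and-match flow described above might look like the sketch below. All of the names and structures here are illustrative; nothing in this sketch comes from Apple or Google’s actual API.

```python
import secrets

def new_code() -> bytes:
    """A small, unique, anonymous code, regenerated periodically."""
    return secrets.token_bytes(16)

class Phone:
    def __init__(self):
        self.sent = []      # codes this phone has broadcast
        self.heard = set()  # codes received from nearby phones

    def broadcast(self, nearby):
        # Blast out a fresh code; phones in Bluetooth range log it.
        code = new_code()
        self.sent.append(code)
        for other in nearby:
            other.heard.add(code)

def check_exposure(phone: Phone, central_db: set) -> bool:
    # Local scan: do any logged codes match codes uploaded by
    # people who later tested positive?
    return bool(phone.heard & central_db)

# Two phones spend time near each other:
alice, bob = Phone(), Phone()
alice.broadcast([bob])

# Alice tests positive; her recent codes go to the central database.
central_db = set(alice.sent)

print(check_exposure(bob, central_db))      # Bob was exposed
print(check_exposure(Phone(), central_db))  # a stranger was not
```

Note that the match happens on the phone, not the server: the database only ever sees codes from people who tested positive, never the contact logs of everyone else.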
The system lets you record points of contact (exactly what contact tracers need) without collecting any precise location data, and it keeps minimal information in the central database.
It’s assumed that only legitimate healthcare providers will be able to submit a diagnosis, to ensure only confirmed diagnoses generate alerts. It’s not entirely clear how that will happen, but it seems like a solvable problem.
The system uses a version of the Bluetooth Low Energy (BLE) beacon system that’s been in use for years. It works off the same antennas as your wireless earbuds.
Engineers on the project are optimistic that they can tweak the range at the software level through “thresholding.” But since there’s no actual software yet, most of the relevant decisions have yet to be made.
Social distancing rules recommend staying six feet away from others in public. Officials will also be wary of sending out so many alerts that the app becomes useless.
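The companies haven’t said how that thresholding would work, but a common BLE technique is to estimate distance from received signal strength (RSSI) using a log-distance path-loss model, then discard contacts beyond a cutoff. A minimal sketch, with calibration values that are assumptions rather than anything from the project:

```python
# None of these numbers come from Apple or Google; they are
# typical ballpark values for BLE signal-strength estimation.
TX_POWER = -59            # assumed RSSI at 1 meter, in dBm
PATH_LOSS_EXPONENT = 2.0  # free-space propagation assumption

def estimated_distance_m(rssi: float) -> float:
    """Log-distance path-loss model: rough distance from signal strength."""
    return 10 ** ((TX_POWER - rssi) / (10 * PATH_LOSS_EXPONENT))

def within_contact_range(rssi: float, threshold_m: float = 1.8) -> bool:
    """Keep only contacts closer than roughly six feet (~1.8 m)."""
    return estimated_distance_m(rssi) <= threshold_m

print(within_contact_range(-55))  # strong signal: likely a close contact
print(within_contact_range(-85))  # weak signal: probably too far away
```

In practice RSSI is noisy (walls, pockets, and phone orientation all distort it), which is part of why the range-tuning decisions are still open.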
The system will be built into official public health apps, which will send out the BLE signals in the background. The agencies will be in charge of a lot of important decisions about how to notify users and what to recommend if a person has been exposed.
Eventually, the team hopes to build that functionality directly into the iOS and Android operating systems. The system will still prompt users to download an official public health app if they need to submit information or receive an alert.
The system doesn’t personally identify you, and it doesn’t log your location. But the health apps that use it will eventually need to know who you are in order to upload your diagnosis to health officials.
The central database stores all the codes sent out by infected people while they were contagious, and the engineers have done a good job of ensuring that you can’t work backward from those codes to a person’s identity. Still, it’s possible to envision some scenarios in which those protections break down.
Each of these steps is performed through a cryptographically robust one-way function: you can generate a proximity ID from a daily key, but only if you start with the daily key in hand. There’s no way to work backward from the ID to the key.
The log on your phone is a list of proximity IDs (the lowest level of key), so the entries aren’t much good on their own. When an infected person’s daily key is published, your phone regenerates the proximity IDs that key produces; if any of the proximity IDs in your log came from that daily key, it generates an alert.
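The published spec defines its own primitives, but the one-way structure can be sketched with HMAC standing in for the derivation functions. The function names, key sizes, and interval counts below are assumptions for illustration, not the real protocol.

```python
import hmac, hashlib, secrets

def daily_key(personal_key: bytes, day: int) -> bytes:
    # One-way: easy to compute from the personal key, infeasible to invert.
    return hmac.new(personal_key, f"day-{day}".encode(), hashlib.sha256).digest()

def proximity_id(dkey: bytes, interval: int) -> bytes:
    # Lowest-level key: the short code actually broadcast over Bluetooth.
    return hmac.new(dkey, f"interval-{interval}".encode(), hashlib.sha256).digest()[:16]

# An infected person's phone derived its broadcast IDs from a daily key
# (here, a new ID every 10 minutes gives 144 intervals per day):
personal = secrets.token_bytes(32)
dkey = daily_key(personal, day=100)
broadcast_ids = [proximity_id(dkey, i) for i in range(144)]

# Your phone logged one of those IDs during a close contact,
# alongside IDs from other, uninfected phones:
my_log = {broadcast_ids[42], secrets.token_bytes(16)}

# When the daily key is published, your phone regenerates every
# proximity ID that key produces and checks them against the local log:
regenerated = {proximity_id(dkey, i) for i in range(144)}
exposed = bool(my_log & regenerated)
print(exposed)  # True: a logged ID came from that daily key
```

Because HMAC (like the real derivation functions) can’t be inverted, publishing the daily key reveals which broadcasts it produced without exposing the personal key behind it.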
It’s possible to imagine a malicious app that collects proximity IDs in advance, connects them with specific identities, and later correlates them to daily keys. Even then, all the attacker would get from the server is the last 14 days’ worth of codes.
It’s hard to guarantee someone’s anonymity if they share that they’ve tested positive through this system. In some ways, that tradeoff is inherent to contact tracing.
It’s basically impossible to build a completely anonymous contact-tracing system; the conventional version involves people interviewing you and asking exactly who you’ve been in contact with.
If a malicious actor were collecting your proximity IDs and you tested positive, they could link you to a specific location where one of your proximity IDs had been spotted in the wild.
Neither Apple nor Google is sharing information that could directly place you on a map. Google has a lot of that information, and the company has shared it at an aggregated level, but it’s not part of this system. An attacker might be able to work back to that information, but they would still know less than most of the apps on your phone.
As long as your specific log stays on your specific device, it’s protected by the same device encryption that protects your texts and emails.
Without a daily key to work from, an attacker would have no clear way to correlate one proximity ID with another, and the robust cryptography makes it impossible to directly derive the associated daily key or the personal ID number behind it.
Apple and Google insist that participation is voluntary: unless you take proactive steps to participate in contact tracing, you should be able to use your phone without getting involved at all.
Public health work is full of this kind of medical surveillance, simply because it’s the only way to find infected people who aren’t sick enough to go to a doctor. The hope is that people will accept this level of surveillance as a temporary measure to stem further spread of the virus.
It matters a lot that the system is voluntary and that it doesn’t share any more data than it needs to. But it remains to be seen whether governments will try to implement this idea in a more invasive or overbearing way.