Real World Crypto 2021 - Session 8: Invited Talks
Carmela Troncoso (EPFL) / Privacy by Design – From Theory to Practice in the Context of COVID-19 Contact Tracing
Discussion of security & privacy concerns of contact-tracing apps
They were built under time pressure yet had to be robust, reliable and scalable
Had to keep scope simple, avoiding too much new tech, and using existing infra where possible
She worked with a group to specify Decentralized Privacy-Preserving Proximity Tracing (DP3T)
Key ideas were:
- BLE (bluetooth low-energy) beacons
- crypto for unlinkability
- decentralisation of the matching operations to preserve privacy and limit purpose
They had plans to design fancy crypto (DP3T paper) for all this, but then reality bit:
OS vendors had to be involved because of constraints: the app must run in the background (Apple must be involved); it must preserve CPU and battery (Google & Apple); it must be compatible with old OS versions (G & A again). It also cannot consume much bandwidth on users' devices: users with limited data quotas will uninstall if they think the tracing app is hogging it.
G & A both implemented the protocol and the API that tracing apps must use. This had implications for privacy. The protocol is public. Privacy concerns dictated that the broadcast random identifiers (RPIs) rotate frequently, derived from a key that changes daily.
Apps collect RPIs they see and record bluetooth signal power as a proxy for distance.
If a user tests positive they are given an access key and upload their device's recent daily keys. Other users get a daily download of all positive users' keys, so the app can raise an alert if a derived identifier matches one the user was in contact with.
No one ever has identity, location, or other info about other users. There's no info available for abuse. The system automatically sunsets itself by design as users stop participating, because the data the servers retain is useless for any other purpose.
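The decentralised matching above can be sketched as follows. This is a simplified model, not the real GAEN/DP3T derivation (which uses HKDF and AES); the HMAC-based identifier derivation and all names here are stand-ins:

```python
import hashlib
import hmac
import os

def derive_rpis(daily_key: bytes, intervals_per_day: int = 96) -> list:
    # Stand-in derivation: one 16-byte identifier per time interval,
    # deterministically derived from the daily key.
    return [hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
            for i in range(intervals_per_day)]

# Phone A broadcasts RPIs derived from its daily key.
key_a = os.urandom(16)
broadcast = derive_rpis(key_a)

# Phone B records the RPIs it hears (here: two of A's, plus unrelated noise).
observed = {broadcast[10], broadcast[11], os.urandom(16)}

# A tests positive and uploads only key_a. B downloads it, re-derives the
# RPIs locally, and checks for overlap with what it observed.
matches = observed.intersection(derive_rpis(key_a))
print(len(matches))  # → 2, so B raises an exposure alert
```

The point of the design is visible in the last step: the server only ever sees daily keys of positives, while the matching against observed contacts happens entirely on the user's device.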
Crypto design has some crucial properties:
- only users with a true positive test are allowed to upload, to prevent abuse and false alarms. This is achieved using commitments.
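One way to picture the commitment idea: the app binds itself to its keys ahead of time, so a key revealed at upload time can be checked against the earlier commitment. This is a generic hash-commitment sketch, not the exact DP3T construction:

```python
import hashlib
import os

def commit(secret: bytes, nonce: bytes) -> bytes:
    # Hash commitment: binding and hiding under standard hash assumptions.
    return hashlib.sha256(secret + nonce).digest()

# Earlier, the app registers a commitment to its daily key.
daily_key, nonce = os.urandom(16), os.urandom(16)
registered = commit(daily_key, nonce)

# On a positive diagnosis the app reveals (daily_key, nonce); the server
# accepts the upload only if the opening matches the registered commitment.
assert commit(daily_key, nonce) == registered

# A key fabricated after the fact will not open the commitment.
assert commit(os.urandom(16), nonce) != registered
```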
The crypto seems good, but realizing it in different national health regimes had mixed success due to the different levels of digitization in their processes.
Privacy sometimes requires transmissions to be delayed to avoid identifying a user, but in this app they had to upload as soon as a diagnosis was available for epidemiological reasons
Dummy strategy - they wanted dummy traffic to the servers, intermingled with users' legit upload traffic, to make it hard to identify users who have a positive diagnosis. That was made hard for them by the choices A & G made in their API design, which didn't have this kind of privacy in mind.
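The dummy-traffic idea can be sketched as below. This is only a sketch of the strategy they wanted, not what shipped; the function, parameters, and the per-day dummy rate are all hypothetical:

```python
import random

def schedule_uploads(has_diagnosis: bool, days: int = 14,
                     dummy_rate: float = 0.1) -> list:
    """Return (day, payload) upload events. Dummy and real uploads use an
    identical payload format, so the server cannot tell which is which."""
    uploads = []
    for day in range(days):
        if random.random() < dummy_rate:
            uploads.append((day, "payload"))   # cover traffic
    if has_diagnosis:
        real_day = random.randrange(days)
        uploads.append((real_day, "payload"))  # real upload, indistinguishable
    return sorted(uploads)
```

Because every device emits occasional cover uploads, observing an upload from a device no longer reveals that its owner just received a positive diagnosis.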
A study in Switzerland shows that users of the app were notified of exposure 1 day quicker on average than those reached by manual contact tracers. Other lessons learned:
- They had to consider carefully what was logged at load balancers in the Swiss cloud realization of the server infra: coarse logging only, counts only for stats, etc. to avoid identifying users in server or LB logs
- Privacy eng goes well beyond crypto; doing good privacy eng in an agile & service world is “exhausting”
- Effective integration of the tech with the people who must use it is essential for success (i.e. not just users, but doctors must be able to use it properly)
Vanessa Teague (Thinking Cyber Security / Australian National Uni) / Not as Private as We Had Hoped – Unintended Privacy Problems in Some Centralized and Decentralized COVID-19 Exposure Notification Systems
Social graph inference - centralized apps, without enough care, could allow govts to reconstruct social graphs. This was the big concern being voiced early in 2020 about contact tracing. But it turned out that user privacy was damaged much more by buggy implementations:
- AU’s COVIDsafe app was centralized, using BLE beacons.
- initially it killed battery and caused uninstalls (on Google Play)
- not running the app is great for privacy!
- iPhone app: buggy use of encryption caused message truncation; messages were dropped before hitting the DB. Also great for privacy!
The iPhone app, once it had maxed out the number of concurrent BT connections, would drop all future ones. This also killed other apps wanting to use BT: great for privacy, but bad for users with BT devices such as continuous glucose monitors. Two locked iPhones will not record proximity to each other.
Good observation: in most apps, security or privacy problems become well known. But in the contact tracing apps, failures are generally undetectable by users: indistinguishable from never having contacted COVID+ people.
These failures mean that Australia has little idea how many people still use the app or what fraction of proximity events it detects.
For Android, CVE-2020-12856 allowed an attacker to track your phone and obtain its long-term identity key (IRK), which:
- permits phone to be tracked,
- even after the user had uninstalled the app!
Massive privacy failure.
A different app leaked social-graph edges if users opted in to uploading their additional detailed exposure info
Description of a bug in an early version of the A&G exposure notification API: if a malicious server has access to the contacts disclosed to human tracers, it can identify whether the user failed to disclose an additional contact, by ordering the list of known contacts uploaded by the discloser in a certain way. This was fixed by ensuring the lists are shuffled on receive and again on send.
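The shuffle mitigation is simple to sketch. The function name is hypothetical and this is only an illustration of the idea, not the actual API code:

```python
import random

def sanitize_contact_list(contacts: list) -> list:
    """Return a shuffled copy, so the upload order can no longer encode
    side information (e.g. which contacts were already disclosed)."""
    out = list(contacts)
    random.shuffle(out)
    return out

# Applied once on receive and again on send, so that neither endpoint's
# ordering survives into what the server observes.
received = sanitize_contact_list(["rpi1", "rpi2", "rpi3"])
sent = sanitize_contact_list(received)
```

The double shuffle matters: shuffling only at one endpoint would still let the other endpoint's ordering leak through.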