Go read this investigation into a flawed contact-tracing app used by one US college

Security problems were discovered with the mandatory app

Illustration by Alex Castro / The Verge

When Albion College announced in June that it would reopen, it said it would put a number of health measures in place to help reduce the spread of COVID-19, including reduced lecture sizes and virus tests for staff and students. But as TechCrunch reports in a new investigation, it also introduced a mandatory contact-tracing app with a number of privacy issues. The report highlights the problems facing these apps and the institutions introducing them, and it’s well worth a read as a case study.

The app, called Aura, is designed to alert the school when a student tests positive for the virus and to let students know when they may have come into contact with someone else who has it. But rather than relying on local Bluetooth proximity signals to tell when contact has occurred (as Apple and Google’s system does), Aura instead uses location data, a practice that’s been criticized for creating privacy problems. As well as being bad for privacy generally, the approach also means the college can keep tabs on where students are going, and place restrictions on their movements:

In addition to having to install the app, students were told they are not allowed to leave campus for the duration of the semester without permission over fears that contact with the wider community might bring the virus back to campus.

If a student leaves campus without permission, the app will alert the school, and the student’s ID card will be locked and access to campus buildings will be revoked, according to an email to students, seen by TechCrunch.

The investigation also revealed other privacy oversights. Secret keys for the app’s backend servers were found in the app’s code, allowing one researcher to access patient data stored in the app’s databases and cloud storage. TechCrunch also discovered an issue with the QR codes the app generates, which are designed to confirm whether someone has tested negative for the virus.

Our network analysis tool showed that the QR code was not generated on the device but on a hidden part of Aura’s website. The web address that generated the QR code included the Aura user’s account number, which isn’t visible from the app. If we increased or decreased the account number in the web address by a single digit, it generated a QR code for that user’s Aura account.

In other words, because we could see another user’s QR code, we could also see the student’s full name, their COVID-19 test result status and what date the student was certified or denied.
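What TechCrunch describes is a classic insecure direct object reference: the server handed out a record based solely on a guessable, sequential account number in the URL, with no check that the requester owned that account. As a rough sketch of the flaw and one common fix, here is a toy Python example (the account numbers, fields, and token scheme are illustrative, not taken from Aura):

```python
import secrets

# Toy records standing in for per-student accounts (illustrative data only).
ACCOUNTS = {
    1001: {"name": "Student A", "status": "negative"},
    1002: {"name": "Student B", "status": "positive"},
}

def vulnerable_qr_data(account_number: int) -> dict:
    """Insecure direct object reference: anyone who increments or
    decrements a sequential account number gets that student's record,
    exactly as described in the TechCrunch report."""
    return ACCOUNTS[account_number]

# One common mitigation: tie each record to an unguessable token and
# require it alongside the identifier, so enumeration alone is useless.
TOKENS = {acct: secrets.token_urlsafe(16) for acct in ACCOUNTS}

def safe_qr_data(account_number: int, token: str) -> dict:
    """Reject the request unless the caller presents the secret token
    issued for this specific account."""
    if TOKENS.get(account_number) != token:
        raise PermissionError("token does not match account")
    return ACCOUNTS[account_number]
```

In practice the server would verify an authenticated session rather than a bare token, but the core point is the same: a record identifier that appears in a URL must never be the only thing gating access to the record.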

Although the most egregious issues have since been fixed by the app’s developers, one security researcher quoted by TechCrunch said they pointed towards the app being a “rush job.” The incident raises serious questions about the contact-tracing software being rolled out at other institutions around the world, and TechCrunch’s investigation sheds important light on the problems it can cause.