One of the more frustrating aspects of the ongoing COVID-19 pandemic has been the frankly haphazard manner in which too many folks are tossing around ideas for bringing it all under control without fully thinking things through. I’m as guilty of this as anyone, desperate as I am for life to return to normal. “Give me the option to get a vaccine candidate even though it’s in phase 3 trials,” I have found myself saying more than once, each time immediately realizing how stupid and selfish it would be to not let the scientific community do its work and do it right. Challenge trials, some people say, should be considered. There’s a reason we don’t do that, actually.
And contact tracing. While contact tracing can be a key part of containing the spread of a virus as infectious as COVID-19, how we contact trace is immensely important. Like many problems we encounter these days, there is this sense that we should just throw technology at the problem. We can contact trace through our connected phones, after all. Except there are privacy concerns. We can use dedicated apps on our phones for this as well, except this is all happening so fast that it's a damn-near certainty that mistakes are going to be made in those apps.
This is what Albion College in Michigan found out recently. Albion told students two weeks prior to on-campus classes resuming that they would be required to use Aura, a contact tracing app. The app collects a ton of real-time and personal data on students in order to pull off the tracing.
Rather than the more privacy-preserving Bluetooth proximity approach many exposure notification systems use, Aura goes all in on real-time location-tracking, as TechCrunch reports. The app collects students' names, location, and COVID-19 status, then generates a QR code containing that information. The code either comes up "certified" if the data indicates a student has tested negative, or "denied" if the student has a positive test or no test data. In addition to tracking students' COVID-19 status, the app will also lock a student's ID card and revoke access to campus buildings if it detects that a student has left campus "without permission."
TechCrunch used a network analysis tool to discover that the code was not generated on the device but rather on a hidden Aura website—and that TechCrunch could then easily change the account number in the URL to generate new QR codes for other accounts and gain access to other individuals' personal data.
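What TechCrunch describes is a textbook insecure direct object reference (IDOR): the server trusts whatever account number appears in the URL instead of checking who is actually asking. A minimal sketch of the bug and the fix, with entirely hypothetical record data and function names (the real Aura endpoints are not public):

```python
import secrets

# Hypothetical server-side store of per-student status records,
# keyed by sequential account numbers like the ones in Aura's URLs.
RECORDS = {
    "1001": {"name": "Student A", "status": "certified"},
    "1002": {"name": "Student B", "status": "denied"},
}

def insecure_qr_payload(account_id: str) -> dict:
    """Vulnerable pattern: returns whatever record the URL asks for.
    Anyone who increments the account number reads someone else's data."""
    return RECORDS[account_id]

def secure_qr_payload(account_id: str, session_account_id: str) -> dict:
    """Fixed pattern: the server verifies that the authenticated
    session actually owns the requested record."""
    if account_id != session_account_id:
        raise PermissionError("not your record")
    return RECORDS[account_id]

# Better still: hand out unguessable per-user tokens rather than
# sequential account numbers, so enumeration gets you nothing.
TOKENS = {secrets.token_urlsafe(16): acct for acct in RECORDS}
```

The authorization check has to live on the server; generating the QR code in a "hidden" web page is obscurity, not security.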
It gets worse. One Albion student discovered that the app's source code also included security keys for Albion's servers. Using those, other researchers examining the app found that they could gain access to all kinds of data from the app's users, including test results and personally identifying information.
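Shipping server credentials inside a client app means every user effectively holds a copy of the keys; anyone who decompiles the package can pull them out. A sketch of the anti-pattern and the standard remedy, using a made-up key and environment variable name:

```python
import os

# Anti-pattern: a secret baked into client source code. This string
# ships to every phone that installs the app and is trivially
# extractable from the distributed package.
HARDCODED_KEY = "sk_live_EXAMPLE_DO_NOT_SHIP"

def server_key_from_env() -> str:
    """Remedy: keep server secrets on the server, loaded at runtime
    from the environment or a secrets manager -- never bundled into
    the client. 'AURA_SERVER_KEY' is a hypothetical variable name."""
    key = os.environ.get("AURA_SERVER_KEY")
    if key is None:
        raise RuntimeError("server key not configured")
    return key
```

The client should only ever hold short-lived, per-user credentials; the keys that unlock everyone's test results belong nowhere near the app bundle.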
Now, Aura's developers fixed these security flaws…after the researchers brought them to light and after the school had made the app mandatory. If anyone would like to place a bet that these are the only two privacy and security flaws in this app, then they must certainly not like having money very much.
To be clear, plenty of other schools are trying to figure out how to use technology to contact trace as well. And there’s probably a use for technology in all of this, with an acceptable level of risk versus the benefit of bringing this awful pandemic under control.
But going off half-cocked isn’t going to help. In fact, it’s only going to make the public less trustful of contact tracing attempts in the future, which is the last thing we need.