As the world tries to fight the coronavirus pandemic, contact tracing is a key practice for preventing outbreaks. Contact tracing is the practice of identifying and notifying everyone an infected person may have had close contact with and getting them tested so they don’t go out and unknowingly spread the virus. While much of this tracing is done manually by calling those potentially exposed people, several countries have turned to mobile contact tracing apps to do the work.

Because your phone likely goes with you wherever you go, on paper this seems like a good method for stopping the spread of the virus. Some countries, like South Korea, have actually gotten the virus under control while using contact tracing apps; others, such as Italy, France, and Australia, have apps in use or in development.

However, despite the potential benefits of this technology in helping us combat coronavirus, many people are concerned about the privacy of such apps. Even if the creators of the apps aren’t abusing them, malicious actors can, and in some countries already have. For example, a vulnerability (CVE-2020-12856) was detected in Australia’s COVIDSafe app that allows an attacker to silently bond with a phone running the app. The vulnerability has since been fixed, but, as with any software, some people may still be running the vulnerable version.

There are a number of privacy and security concerns surrounding these apps. First is the question of what data the app is actually collecting. And as a follow-up to that, how can users ensure that the app isn’t collecting more than what it says it’s collecting? 

Anne Hardy, CISO at Talend, believes this is an impossible thing to know. As a user, she explained, you can’t really track where your data is going. “You just trust what companies tell you they do and then you hope that they do what they say,” Hardy said. “So I don’t think there is much that the consumer can do just to make sure that this is not happening, except counting on people to monitor that the app is not doing things right. Me or any user cannot really look at the traffic that the app is sending and check that no personal information is actually sent without you knowing.”

Something else to keep in mind is how privacy and security are incorporated into the development process under normal circumstances. Even during normal times these things tend to not be top of mind, Kelvin Coleman, executive director of the National Cyber Security Alliance, explained. “So you can imagine as you rush to create some of these things, security, privacy protocol, probably are still not at the very top of the list,” said Coleman.

He explained that it’s a bit of a double-edged sword. There is a case for getting these apps out into the public as soon as possible to help deal with the current situation and prevent as many deaths as possible. “You want to contain it and make sure that people have an opportunity to know that they’re in the vicinity or have been in the vicinity of someone who’s had the virus, and so there is a rush to get it out there,” said Coleman. “But we have to make sure that security and privacy protocols are thought of not second, third on the priority list. It needs to be at the top.”

Hardy believes that given the accelerated timeline, it’s likely these apps will have privacy and security flaws. She said this will create a lot of suspicion around these apps, which might lead many people to refuse to download one.

Tom Pendergast, chief learning officer at MediaPro, added that in order for people to be willing to download contact tracing apps, there is a need to build trust between people whose data is being collected and those who will later gain access to that information. “We’ve seen that people will do rational things—like isolate themselves from loved ones—when they believe it will lead to good outcomes. We’ll need people with high levels of trust and authority—we’ll need more Anthony Fauci’s—to support this contact tracing system, and then it stands a good chance of working,” said Pendergast.

There are a number of parties who will need to establish that trust, Pendergast explained. These include API makers, like Apple and Google; public health authorities; application developers, pen testers, and application analysis companies; and more. “This list could and should go on, and we should expect transparency and visibility from all parties,” said Pendergast.

Another concern is the potential for a government to misuse this information. “We’ve known throughout history that sometimes a government can use [something that has] an official purpose to help people, and then they find this other mission-creep purpose,” said Coleman. It will be important to make sure that government agencies stay in their lane and don’t use contact tracing apps as a mechanism for surveillance, unrelated to COVID-19.

In a similar vein, it’s important to think about a kill switch for when an app would no longer be needed. “It can’t fall into the wrong hands and it can’t be used for mission creep by the government,” Coleman continued. “Who’s gonna have the kill switch to say that’s done, it’s served its purpose and now we have to go back to where we were in terms of not tracing folks.”

The technology underneath
Google and Apple have addressed some of these concerns with their Exposure Notification API, which public health agencies can use to build contact tracing apps for Android and iOS devices. It works by using rotating keys and identifiers, and it doesn’t collect location data. Only when a person confirms that they are sick is their key shared with anyone they might have been around.

“Each user gets to decide whether or not to opt-in to Exposure Notifications; the system does not collect or use location from the device; and if a person is diagnosed with COVID-19, it is up to them whether or not to report that in the public health app. User adoption is key to success and we believe that these strong privacy protections are also the best way to encourage use of these apps,” Google and Apple wrote in a joint statement.
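To make the mechanics a bit more concrete, here is a minimal Python sketch of how a rotating-identifier scheme of this kind can work. It is a simplification for illustration only, not the actual Exposure Notification specification: the key size, rotation interval, and function names are all assumptions.

```python
import hmac, hashlib, os

ROTATION_SECONDS = 15 * 60  # assumed rotation interval for this sketch

def new_daily_key() -> bytes:
    """Generate a fresh random key for the day; it never leaves the phone
    unless the user reports a positive diagnosis."""
    return os.urandom(16)

def rolling_id(daily_key: bytes, timestamp: int) -> bytes:
    """Derive a short-lived identifier from the daily key and the current
    time window. Nearby phones record these IDs, not the key itself."""
    window = timestamp // ROTATION_SECONDS
    return hmac.new(daily_key, window.to_bytes(8, "big"), hashlib.sha256).digest()[:16]

def check_exposure(observed_ids, reported_keys, day_start, day_end):
    """Re-derive every identifier a diagnosed user's keys could have produced
    and compare against the IDs this phone heard over Bluetooth. Matching
    happens entirely on the device; no location data is involved."""
    for key in reported_keys:
        for ts in range(day_start, day_end, ROTATION_SECONDS):
            if rolling_id(key, ts) in observed_ids:
                return True
    return False
```

The important property is that only the daily keys of users who report a diagnosis are ever published; every other identifier a phone broadcasts stays meaningless to anyone who records it.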

Hardy believes that from a privacy perspective, Google and Apple’s approach is better than some others because personal information stays on the device until a person reports that they are sick. However, at least in the United States, few states seem to be leveraging this API or planning to. An analysis by 9to5mac.com found that only four U.S. states currently had plans to use the API: Alabama, North Dakota, South Carolina, and Virginia.

As for the database model underlying these apps, distributed databases will likely be used for the majority of them. According to Asya Kamsky, principal engineer at MongoDB, a distributed database is a database that is “designed to store massive amounts of data across thousands of servers.”

This model is beneficial for contact tracing because it provides low latency and high availability of data. “Your reads and writes will be faster if data is closer to where the queries are taking place,” said Kamsky. 

Distributed databases also allow for apps to comply with different countries’ data regulations because where the data is stored can be specified based on each individual requirement, Kamsky added. 
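As a rough sketch of what that data locality can look like at the application level, the Python snippet below (using the pymongo driver) routes each record to a regional cluster based on the user’s country. The connection strings and the country-to-region mapping are hypothetical, and a real deployment would more likely rely on the database’s own zone or sharding configuration than on routing in application code.

```python
from pymongo import MongoClient

# Hypothetical regional clusters; each keeps its data in-region to satisfy
# local data-residency rules (e.g., GDPR for EU users).
REGION_CLUSTERS = {
    "EU": MongoClient("mongodb://eu.example.internal:27017"),
    "US": MongoClient("mongodb://us.example.internal:27017"),
    "APAC": MongoClient("mongodb://apac.example.internal:27017"),
}

COUNTRY_TO_REGION = {"DE": "EU", "FR": "EU", "US": "US", "AU": "APAC"}

def store_exposure_event(country_code: str, event: dict) -> None:
    """Write the event to the cluster closest to the user, which keeps reads
    and writes fast and the data inside the required jurisdiction."""
    region = COUNTRY_TO_REGION[country_code]
    REGION_CLUSTERS[region]["tracing"]["events"].insert_one(event)
```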

The fact that data is being stored in multiple different locations using this model shouldn’t have an impact on privacy. “From a privacy standpoint, it’s important the person or team writing the app follow best practices. You can utilize end-to-end encryption to make sure your data is secure before it hits the cloud,” said Kamsky. 
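A minimal sketch of that practice, here using the Fernet recipe from Python’s widely used cryptography package; the record format is invented for illustration, and in a real app the key would be generated once and held in the device’s secure storage rather than created inline.

```python
from cryptography.fernet import Fernet

# In practice the key would live in the device's secure keystore;
# generating it here is just for illustration.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b'{"rolling_ids": ["a1b2..."], "day": "2020-06-01"}'

# Encrypt on the device before anything is sent to the cloud...
ciphertext = cipher.encrypt(record)

# ...so the server only ever stores opaque ciphertext; only the holder
# of the key can recover the original record.
assert cipher.decrypt(ciphertext) == record
```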

Are apps like these effective?
Given the privacy concerns surrounding a data collection app like this, there is likely to be a lot of risk-benefit analysis. For example, a person might be willing to give up some of their privacy if there is the potential to save lives. But there are a number of issues with these apps that could reduce their effectiveness.

Just like vaccines only protect a population from a particular disease if enough people get that vaccine, thus creating herd immunity, a contact tracing app will only work if a majority of the population is properly using it. If not enough people download or use these apps effectively, they might not be the best chance of stopping the spread of coronavirus, compared to current tactics like social distancing, wearing masks, and testing.

Apart from people simply not wanting to download such an app, Hardy brings up a number of technical reasons why these apps might not work 100% of the time: unreliable Bluetooth, phones that can’t handle the app, or a battery that dies while a person is out and about.

If the app can’t properly track a person’s whereabouts and know who they’ve encountered, then it won’t be able to tell them if they need to get tested. 

“I think contact tracing apps seem to be a very reactive way to look at COVID,” said Hardy. “I think there are probably better things to do, like making sure that people are tested frequently instead of giving them the opportunity to be notified that they have been close to someone, and then what’s next? They have to be tested. I have some doubts about the usefulness of those apps … We need to continue to educate people as we discover how this virus is transmitted from one to the other. So I think communication and education is probably more important.”