
Contact tracing is one tool used to track and try to stem the spread of a disease. To combat the COVID-19 outbreak, many companies and governments have deployed contact tracing apps that can alert you if someone with whom you've been in contact tests positive for the virus. But such apps have triggered privacy worries, since they rely on recording your proximity to other people in order to function. A new report from mobile app security provider Guardsquare looks at several contact tracing apps to see if these worries are justified.

SEE: How tech companies are fighting COVID-19 with AI, data and ingenuity (TechRepublic)

Back in June, Guardsquare examined 17 different COVID-19 contact tracing apps distributed by governments around the world. The company found that the apps were not well protected against reverse engineering and possible exploitation, making them easy for hackers to attack and clone.

For its latest research, Guardsquare rechecked 14 of the original 17 Android apps (three are no longer available), looked at 38 new ones, and extended its scope to include iOS apps. The new analysis also incorporated six additional features to broaden the definition of security protection. The research included global contact tracing apps and apps from two US states and two US territories, for a total of 52 Android apps and 43 iOS apps: 95 apps in all.

Drilling down further, 60% of the apps analyzed use the Exposure Notification API from Apple and Google, which the two companies added to their respective mobile operating systems earlier this year. That API was designed to address privacy fears, so Guardsquare focused its report on the 40% of apps that were built without it. Among those, significant security and privacy concerns remain, according to the report.

SEE: Meet the hackers who earn millions for saving the web, one bug at a time (cover story PDF) (TechRepublic)

The company did analyze the apps that took advantage of Apple and Google's API using the same methods it applied to the DIY apps. Guardsquare chief scientist Grant Goodes told TechRepublic that these apps aren't so much "more secure." Rather, they have a much lower need for additional security since they're designed to minimize their exploitability. They neither gather nor expose privacy-sensitive data: instead of pulling location data, they rely on the API's Bluetooth proximity detection, which was built for tighter privacy.
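The privacy benefit of Bluetooth proximity detection comes from broadcasting short-lived pseudonymous identifiers rather than any location or stable device ID. The sketch below is a deliberately simplified stand-in for that idea (the real Exposure Notification scheme derives identifiers with HKDF and AES; here an HMAC-SHA256 truncated to 16 bytes plays that role, and the function and key names are hypothetical):

```python
import hashlib
import hmac
import secrets

def rolling_proximity_id(daily_key: bytes, interval: int) -> bytes:
    """Derive a short-lived pseudonymous ID from a daily key and a
    10-minute interval number. Nearby devices record only this ID,
    never a location or a stable device identifier."""
    msg = b"EN-RPI" + interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

daily_key = secrets.token_bytes(16)  # rotated every 24 hours

# The broadcast ID changes every interval, so passive observers cannot
# link sightings of the same device across time.
id_a = rolling_proximity_id(daily_key, 100)
id_b = rolling_proximity_id(daily_key, 101)
assert id_a != id_b

# Derivation is deterministic per (key, interval): after a positive test,
# the daily keys are published and other devices can re-derive the IDs
# they observed to check for exposure.
assert id_a == rolling_proximity_id(daily_key, 100)
```

The key point of the design is that exposure matching happens on the device, against published keys, so the server never learns who met whom.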

The DIY apps, however, were a different story. Taking the apps through their paces, Guardsquare found that they either used GPS tracking or their own custom Bluetooth proximity detection (or both), methods considered less secure and less private than the Bluetooth feature in the API from Apple and Google.

Many of the apps that use GPS tracking ask people to share their phone numbers or passport information before using them. Some of these apps do encrypt location data, but some also store that data in an unencrypted database or even leak it in an HTTP cache, which renders the encryption ineffective.

Some of the DIY apps analyzed also capture device information such as IP address, MAC address, device name, RAM size, OS version, time spent in the app, carrier, GPS location, and timestamp. Calling this overreach, Guardsquare said that just an IP address and timestamp should be enough if a government wants to link an individual to a device.

Next, the research cited six different types of protections that contact tracing apps should have to secure private data. Three of these protections were Name Obfuscation to obscure human-readable names, Data-at-rest Encryption to encrypt data residing on the device, and App Attestation to establish the integrity of the app and ensure that requests to the server are coming from the actual app.

Only 35% of the Android apps and none of the iOS apps used Name Obfuscation. Some 20% of the Android apps and 22% of the iOS apps employed Data-at-rest Encryption. And just 5% of the Android apps and none of the iOS apps used App Attestation.
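The App Attestation protection the report describes is, at its core, a way for the server to reject requests that don't come from the genuine app. Real attestation relies on platform services (SafetyNet/Play Integrity on Android, App Attest on iOS); the sketch below shows only the request-signing shape with a shared secret, and every name in it is hypothetical:

```python
import hashlib
import hmac
import json

# Hypothetical key provisioned to the genuine app at enrollment.
ATTEST_KEY = b"shared-secret-provisioned-at-enrollment"

def sign_request(body: dict) -> dict:
    """The app signs a canonical form of the request body."""
    payload = json.dumps(body, sort_keys=True).encode()
    sig = hmac.new(ATTEST_KEY, payload, hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def server_accepts(request: dict) -> bool:
    """The server recomputes the signature and rejects mismatches,
    e.g. from a cloned app that altered the payload."""
    payload = json.dumps(request["body"], sort_keys=True).encode()
    expected = hmac.new(ATTEST_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, request["sig"])
```

This directly addresses the cloning risk Guardsquare flagged in its June research: without some attestation check, any reverse-engineered copy of the app can submit well-formed (including fraudulent) exposure reports.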

Image: Guardsquare

While the Apple and Google API is available only to public health authorities and just for the purpose of contact tracing, security and privacy should be embedded into any mobile app, especially one that impacts your personal health, according to the report.

SEE: Top 5 programming languages for mobile app developers to learn (free PDF) (TechRepublic)

“Properly securing contact tracing apps is not just a citizen privacy and security issue,” the report said. “It’s not just a government trust issue. Most importantly, it’s a public health concern. The makers of contact tracing apps owe it to their citizens to offer a secure, reliable method to trace known COVID exposures and reduce the risk of catching the virus without forcing them to sacrifice their privacy and security in the process.”

How should developers of contact tracing apps better secure and protect the privacy of their users? Guardsquare offers the following suggestions:

  • Developers should use code hardening to protect code at rest and runtime application self-protection (RASP) to protect apps in use.
  • To be truly bulletproof, apps should implement hook detection, tamper detection, and debugger detection as well.
  • They should also employ real-time mobile threat intelligence tools to understand when hackers go after apps and stop them as quickly as possible through blocking or vulnerability management strategies. Many industry standard best practices are well known and relatively easy to implement.
  • Further, these apps should not be gathering certain types of information, and certainly not be storing it for any length of time. Wherever possible, the information the apps process should be treated as highly sensitive. Every effort should be taken to ensure that the collection and potential exposure of personally identifying information is minimized.
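The last recommendation, not gathering certain data in the first place, is the easiest to enforce mechanically with an allowlist applied before anything is stored. A minimal sketch, using the report's own observation that an IP address and timestamp should suffice (field names are illustrative):

```python
# Hypothetical allowlist: only the fields the report says a government
# needs to link an individual to a device.
ALLOWED_FIELDS = {"ip_address", "timestamp"}

def minimize(record: dict) -> dict:
    """Drop every field not on the allowlist before the record is stored,
    so overreach (MAC address, device name, carrier, ...) never persists."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
```

An allowlist is preferable to a blocklist here: new data fields added by a later app version are dropped by default instead of silently collected.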

“We recognize that many governments employ third-party contractors to develop these apps, but this does not absolve them of responsibility,” Guardsquare said. “Anyone disseminating contact tracing apps must impose minimum standards of quality and security on the third parties or internal teams who are developing them.”