How businesses could be exposed to security risks from employees using FaceApp

The seemingly harmless fun of AI-based apps such as FaceApp can actually subject individuals and businesses to security breaches.


FaceApp debuted in 2017, but it surged back into the spotlight this week thanks to the now-viral FaceApp old-age filter. Celebrities such as Miley Cyrus, Mindy Kaling, Kim Kardashian, the Jonas Brothers, and Carrie Underwood are posting their aged faces online using the #faceappchallenge tag, and regular folks are doing it, too. And therein lies the problem: They're doing it from home, they're doing it from work, and they're likely doing it from company-owned devices.

Stan Lowe, Zscaler global CISO, said, "Companies should be concerned about users downloading these types of apps, mainly because they have not been examined for undocumented features and they may occasionally be utilizing the devices for activities that are not readily apparent, such as bitcoin mining. Any app that asks you to provide any data, biometric or otherwise, is going to use it for some reason. Companies and individuals should guard their privacy and data in all forms, including biometrics. We were all told not to tell strangers where we live—this holds true in an age where apps are collecting all kinds of data. Your privacy and data are valuable. As the old saying goes, beware of strangers bearing gifts."


FaceApp is an AI-based app that can change hair and age.

Image: FaceApp

The app, developed by a Russia-based company, uses artificial intelligence to morph faces and make them look older or younger, depending on the filter the user selects. Among the security concerns are how and where the facial images are stored.

FaceApp CEO Yaroslav Goncharov talked to TechRepublic about the safety of his app. "FaceApp performs most of the photo processing in the cloud. We only upload a photo selected by a user for editing. We never transfer any other images from the phone to the cloud. We might store an uploaded photo in the cloud. The main reason for that is performance and traffic: we want to make sure that the user doesn't upload the photo repeatedly for every edit operation. Most images are deleted from our servers within 48 hours from the upload date."

Goncharov said, "We accept requests from users for removing all their data from our servers. Our support team is currently overloaded, but these requests have our priority. For the fastest processing, we recommend sending the requests from the FaceApp mobile app using 'Settings->Support->Report a bug' with the word 'privacy' in the subject line. We are working on the better UI for that."

Goncharov added, "Even though the core R&D team is located in Russia, the user data is not transferred to Russia." He also said that the only photo that is uploaded is the one that is selected for editing, not all photos in the user's gallery.  

SEE: FaceApp says it's not uploading all your photos (CNET)

Security experts warn that companies need to be careful with FaceApp and similar apps.

"Any organization that plans to rely on biometrics for security is going to need to rethink their strategy. Since a biometric cannot be 'reset', the best approach is to enforce strong cybersecurity policies that address potential vulnerabilities such as passwords," said Craig Lurey, CTO and co-founder of Keeper Security.

Sam Bakken, senior product marketing manager for OneSpan, said, "Any mobile app is a potential privacy issue regardless of its use of artificial intelligence or biometric data. Applying artificial intelligence and biometric data to authentication use cases can also make an app MORE secure and do a better job of protecting a user's privacy – in ways that are much better than a simple username and static passwords."

Pankaj Srivastava, COO of privacy-first company FigLeaf, said it's definitely a risk to use apps such as FaceApp. "Clever companies are finding new and different ways to couch data collection in 'fun' or 'viral-sharing' experiences. It may seem fun to manipulate your photos, but what you're really doing is giving away your entire photo album to a company with no traceable address, location or history. Oddly, users don't have to opt in to use the app, which seems a violation of the European Union's GDPR laws."

To better protect your privacy, Bakken said, "individuals need to decide whether the functionality provided by the app improves their quality of life to a degree that it's worth sharing information with the app developer."

Srivastava said, "If consumers are going to own privacy or take back control of privacy they are going to have to start reading the fine print, like FaceApp's terms and conditions. That is something individuals have to own and this is a part of online privacy that they should be responsible for."

He added, "Our company recently surveyed more than 4,000 online users across the US and UK, and over 75% believe they have to change their behavior online in the wake of recent privacy scandals. Nearly half agreed that privacy should be a shared responsibility. All of this points to a growing recognition that consumers are waking up to the idea that they need to take more ownership of their own privacy, and apps that use data in ambiguous ways, over time, will be less and less tolerated by the general consumer."



By Teena Maddox

Teena Maddox is Associate Managing Editor at TechRepublic. She oversees production of TechRepublic's downloads and TechRepublic Premium. Teena also covers hardware devices, IoT, smart cities and wearables.