Deploying a mobile app securely is complicated by the third parties involved: the app stores that distribute it and the vendors whose code ships inside it.
Enterprises build and deploy mobile apps for many reasons. Some apps are the product the enterprise sells. Some help staff carry out internal processes. Others are lightly customised versions of commercial products, tailored to the enterprise's needs.
In any of these situations, mobile apps have security concerns that other forms of software lack. You need to be aware of these three risks.
1: App store delays affect your security response times
Mobile apps are generally distributed through app stores. Some enterprises distribute apps, even internal ones, through the official public app stores (e.g., Apple's App Store or Google Play). Others run their own internal app stores where they load the apps. If you sell or distribute through the public app stores, it is important to remember that approval times are fairly long and not guaranteed. Why does this matter for security?
If a mobile app has a security vulnerability in its code (i.e., not in the server-side code that supports the app), how quickly can a fixed version be deployed to end users? If the public app store is involved, it could take weeks before the fixed version is available. Even then, how will end users be encouraged (or forced) to upgrade? Will the application quit working because it has been blacklisted at the server? Will that have a negative impact on the user, or will that impair important business processes? If there's no force or push, end users may take weeks or months to upgrade, and some won't upgrade at all. Depending on the vulnerability, this could be a big problem.
Apps that have a server-side component should include a version check and a graceful way of disabling the app when the user must upgrade. Any plan to respond to vulnerabilities in the app must account for the lag between finding the problem, fixing it, and getting the fixed app onto users' devices. If deploying the fix to 100% of users is a requirement, that will be very difficult to accomplish.
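The version check described above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the class name, the hard-coded minimum version, and the idea that the minimum supported version arrives from the server at startup are all assumptions for the example.

```java
// Sketch of a client-side version gate. On startup, the app compares its own
// version against a minimum supported version obtained from the server, and
// refuses to run (gracefully) if it is too old. The server call itself is
// omitted; the minimum version is hard-coded here for illustration.
public class VersionGate {

    // Returns true if 'installed' is older than 'minimumSupported', comparing
    // dotted numeric versions segment by segment (e.g. "2.1.0" vs "2.3").
    public static boolean mustUpgrade(String installed, String minimumSupported) {
        String[] a = installed.split("\\.");
        String[] b = minimumSupported.split("\\.");
        int len = Math.max(a.length, b.length);
        for (int i = 0; i < len; i++) {
            int x = i < a.length ? Integer.parseInt(a[i]) : 0;
            int y = i < b.length ? Integer.parseInt(b[i]) : 0;
            if (x != y) return x < y;
        }
        return false; // identical versions are still supported
    }

    public static void main(String[] args) {
        String installed = "2.1.0";        // from the app's own build metadata
        String minimumSupported = "2.3.0"; // hypothetical value from the server
        if (mustUpgrade(installed, minimumSupported)) {
            System.out.println("Blocking app: please upgrade to continue.");
        }
    }
}
```

The key design point is that the server, not the client, decides which versions remain acceptable, so a vulnerable release can be retired without waiting for every user to notice an update.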
2: Third-party libraries can create a privacy liability
Mobile apps are frequently built with lots of third-party libraries. Analytics, crash reporting, authentication, advertisements, cryptography, and push messaging are just some of the features that apps implement using someone else's code. That code gets incorporated into your app and executes with your app's permissions.
The actions of third-party code may expose the enterprise to unknown or unexpected legal risks. In the UK, for example, the Information Commissioner's Office (ICO) has released guidance for mobile app developers that makes it clear that IMEIs, MAC addresses, and other technical identifiers are covered by the Data Protection Act and must be treated as private data by firms.
Some third-party libraries check what permissions the app has, and send private information to their online services if it's available. An advertising library, for example, might check to see if it has access to the IMEI or the Bluetooth MAC address, and if it does, it might include that in its ads requests. The app is sharing private data on individuals with third parties. This activity may or may not be covered adequately in the terms and conditions that the user agrees to.
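The pattern described above can be simulated in plain Java. This is a hypothetical sketch, not the code of any real advertising SDK: the class, the permission name, and the request format are invented for illustration, and a `Set` of strings stands in for Android's real permission API.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Hypothetical simulation of an ad library that opportunistically attaches a
// device identifier whenever the host app happens to hold the permission.
// The library never asks for anything; it just inherits the app's permissions.
public class AdLibrarySim {

    public static Map<String, String> buildAdRequest(Set<String> grantedPermissions,
                                                     String imei) {
        Map<String, String> request = new HashMap<>();
        request.put("placement", "banner-1");
        // If the host app can read the IMEI, so can the library, and the
        // private identifier silently leaves the device in the ad request.
        if (grantedPermissions.contains("READ_PHONE_STATE")) {
            request.put("imei", imei);
        }
        return request;
    }

    public static void main(String[] args) {
        Set<String> granted = new HashSet<>(Arrays.asList("READ_PHONE_STATE"));
        Map<String, String> request = buildAdRequest(granted, "0000-EXAMPLE-IMEI");
        System.out.println(request.keySet());
    }
}
```

The point of the sketch is that nothing in the app's own code mentions the IMEI; the data flow exists only inside the opaque third-party library.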
Of course, a vulnerability in this third-party code may be the reason that a firm needs to update its app in the app store.
3: Shifting sands of third-party code
Third-party code can change its behaviour without any action by the enterprise. Imagine that when an online shopping app runs, the crash reporting software checks for permission to get the device's location, finds that it cannot, and sends a crash report that does not include the device's location. Now the shopping app adds a feature to find the nearest store. The app requires the user's location and receives permission from the user. This is fine in itself: the firm knows the location is being sent to its store-searching service online, and it handles the location data safely there. But now when the app crashes, the crash reporting library can also retrieve the user's location, because it runs with the app's newly expanded permissions.
The enterprise may now be receiving the user's location in crash reports, stored unprotected in a log file alongside technical details that normally aren't subject to privacy regulations. The developers don't realise that their change created this issue, because the third-party library is opaque to them; they can't see that it changed its behaviour in response to the increased permissions.
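The scenario above can be sketched as a small simulation. Everything here is hypothetical (the class, the permission name, the report fields); a `Set` of strings again stands in for the platform's permission API, to show how the same library call produces different data before and after the store-finder feature ships.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Hypothetical crash reporter. It never requests location itself; it only
// checks what the host app is allowed to do. Granting the app location access
// for an unrelated feature silently widens what ends up in crash reports.
public class CrashReporterSim {

    public static Map<String, String> buildCrashReport(Set<String> appPermissions,
                                                       String lastKnownLocation) {
        Map<String, String> report = new HashMap<>();
        report.put("stacktrace", "NullPointerException (example)");
        if (appPermissions.contains("ACCESS_FINE_LOCATION")) {
            report.put("location", lastKnownLocation);
        }
        return report;
    }

    public static void main(String[] args) {
        // Before the store-finder feature: no location permission.
        Set<String> before = new HashSet<>();
        // After the feature ships: the user granted location to the app.
        Set<String> after = new HashSet<>(Arrays.asList("ACCESS_FINE_LOCATION"));

        System.out.println(buildCrashReport(before, "51.50,-0.12").containsKey("location"));
        System.out.println(buildCrashReport(after, "51.50,-0.12").containsKey("location"));
    }
}
```

Note that `buildCrashReport` itself never changed; only the app's permission set did, which is exactly why the developers' diff shows nothing suspicious.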
Building and deploying a mobile app involves a lot more parties than simply the firm that writes the app; the app stores and all third parties that contribute code can create risk. Keeping track of which third-party code is in the app is an important first step. Knowing which libraries are involved allows a firm to monitor for vulnerability announcements in those libraries, incorporate non-vulnerable versions, and push updated versions to the app stores.