
Convenience or security: Who decides which is more important?

A fascinating discussion in the IT Security Blog raises an equally fascinating question: Who decides when convenience is more important than security?

On June 15, 2011, in the TechRepublic IT Security Blog, Donovan Colbert reported a somewhat disturbing convenience feature in some Android devices that creates, at the very least, a potential security problem. The subsequent discussion thread is very enlightening.

There is a setting on certain Android devices that backs up some of your personal data to Google servers unless you decide to opt out. The official wording:

"Check to back up some of your personal data to Google servers, with your Google Account. If you replace your phone, you can restore the data you've backed up, the first time you sign in with your Google Account. If you check this option, a wide variety of your personal data is backed up, including your Wi-Fi passwords, Browser bookmarks, a list of the applications you've installed, the words you've added to the dictionary used by the onscreen keyboard, and most of the settings that you configure with the Settings application. Some third-party applications may also take advantage of this feature, so you can restore your data if you reinstall an application. If you uncheck this option, you stop backing up your data to your account, and any existing backups are deleted from Google servers."

(Thank you, Michael Kassner.)
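For context, the "third-party applications" mentioned in that wording hook into Android's Backup Manager API. The sketch below is purely illustrative - it is not from Donovan's report, and the file name and key prefix are placeholders - but it shows how an app registers a SharedPreferences file with the Backup Manager; whatever the app registers rides along on the same opt-out backup setting quoted above.

    import android.app.backup.BackupAgentHelper;
    import android.app.backup.BackupManager;
    import android.app.backup.SharedPreferencesBackupHelper;
    import android.content.Context;

    // Hypothetical backup agent; "user_settings" and "prefs" are placeholder names.
    public class SettingsBackupAgent extends BackupAgentHelper {
        static final String PREFS_FILE = "user_settings";
        static final String PREFS_BACKUP_KEY = "prefs";

        @Override
        public void onCreate() {
            // Register the preferences file with the Backup Manager; its contents
            // are copied to the device's backup transport on the next backup pass.
            addHelper(PREFS_BACKUP_KEY,
                    new SharedPreferencesBackupHelper(this, PREFS_FILE));
        }

        // Call after settings change to flag that new data is ready for backup.
        public static void requestBackup(Context context) {
            new BackupManager(context).dataChanged();
        }
    }

The agent is declared with android:backupAgent in the app's manifest; the user's only control over where that data ends up is the single checkbox quoted above.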

At first glance, all of that sounds very convenient, but as Donovan points out, when you start to consider what information is actually being saved on Google's servers, you may wonder if your convenience is coming at too high a price with regard to security.

For example, consider the information Google is saving if your Android device connects to your corporate WiFi network - after a successful connection, the access codes to your enterprise network are now stored on a Google server. Now, if you access that Google server with another device on another WiFi network using only the credentials required to access your Google account, those corporate access codes will be "restored" to your new device. A malicious user could potentially acquire access codes to a corporate network by merely knowing a Google username and password. On public networks, that information is not difficult to come by.

As you can imagine, this does not sit well with IT professionals who are responsible for maintaining secure networks. Such potential security vulnerabilities will likely cause many network administrators to make policy changes with regard to the use of Android-based devices on their corporate systems.

Potential versus actual threat

Now, the scenario described above and the other issues raised by Donovan in his article are serious, but they are mostly in the potential-threat category at the moment. We know of no actual abuses of this Google-provided feature for Android devices. But that is not the real issue, is it?

The problem with the feature is that Google has chosen to make it an opt-out transaction. The default on many devices is to check the box in the settings control that allows Google to store this sensitive data. However, that is not the secure and responsible approach. Users should have to actively opt-in to the feature. Google should be erring on the side of security over convenience. The only person who has the right to make the decision to override security in favor of a convenience feature is the user - not the vendor, service provider, or software developer.

Do you agree or disagree? Do you think the potential security risks are overstated or understated? Do older IT professionals and computer nerds like me need to chill or do we provide sage wisdom?

About

Mark Kaelin is a CBS Interactive Senior Editor for TechRepublic. He is the host for the Microsoft Windows and Office blog, the Google in the Enterprise blog, the Five Apps blog and the Big Data Analytics blog.

20 comments
NHS Tony

Surely this SHOULD be a question asked by the device when initially setting it up... ...setting it up, the clue is in there somewhere, I think...

Mark W. Kaelin

I agree wholeheartedly that we should all take responsibility for securing every piece of technology we use, but we are well-versed in the technology itself. We spend lots of time thinking about technology - it's in our blood, so to speak. But does that level of sophistication extend to all users in an enterprise environment? Consider Donovan's story - he noticed something was amiss because he expected a request for a password to access a WiFi network, but didn't get that request. How many of your users would have noticed that?

dcolbert

I work at Company X. I have a Google device. I have permission from IT to connect my Android phone over the company wireless access points, which I do. But the Google account associated with my Android device is my *own* account. I think this is a fairly common scenario in places that have allowed Android devices onto their networks. The problem comes when Company X lets me go. They take back their company phone and reset it - but like most of us, they're not aware that Google has opted the device in to backing up its data to the cloud. The terminated employee goes home upset, pulls out his Android tablet and wonders how he is going to pay for it now that he has no income, and then notices that even though he has never taken this device into the office, it has all the hotspots at work listed. So he goes back, parks in the parking lot, and finds out that he has complete wireless access to the corporate network of his former employer. This is an *all* too real possible situation. Some people have asked where this would really pose a threat.

Ricky Tandiono

I think it all comes back to the security policy. Google might choose convenience to grasp the market. If a company is going to allow the devices to be used on the company's network, then it should be part of the company security policy to have IT brief the staff or do some checks before allowing it. True, it would be good if Google could do something about it, but instead of shifting the responsibility, each of us should also do our homework.

mail2ri

This may be a tricky issue at the enterprise level, but at a more personal level, I feel comfortable letting the service provider / vendor decide to keep me secure even if it compromises some amount of convenience for me. For instance, one of the reasons I use the Chrome browser over IE or FF is that Chrome gets updated automatically whenever a new version is made available by Google, assuring me of a secure browsing experience.

At an enterprise level, left to themselves, users would rarely worry about security and upgrades, leaving gaping holes in the enterprise IT infrastructure, of which a company desktop / laptop is very much a part. Should staff be permitted to use their personal devices, even for official purposes? My take is NO, by default - unless there are clear policies on ownership and support for such devices. For example, a personal laptop with no anti-virus (or an expired one) and an unlicensed OS, besides other pirated apps installed, poses a huge security risk to the enterprise security setup and can be a nightmare for Tech Support staff - all of which is completely avoidable.

In this scenario, since the company IT staff don't "own" the personal devices of staff, though they are expected to support them technically, they are limited in providing permanent solutions to patch up issues on the device, as they wouldn't be authorized to tinker with it (here I mention the device as generic equipment, as many firms allow their staff to use even their own smartphones and iPads for company work), nor can they install officially licensed s/w on a staff member's personal device, which would be a breach of license terms. Hence, unless personal devices meet certain minimum criteria (licensed OS and security s/w, for starters), I would say it's a no-no to allow personal devices at workplaces, in the best interests of the larger group.

CIOs are often caught between allowing staff to use personal devices at workplaces and ensuring enterprise IT security despite the greater risks posed by such devices. Often it is influenced by the organization's culture and who calls the shots, more so in firms where IT is perceived as a mere service function. So, convenience over security always comes at a price - which many are not willing to pay.

dogknees

When it comes to my own security, I make the decisions. If I make a decision that adversely affects others, I accept that I have some ethical and legal responsibility for it, and I'll take whatever punishment or opprobrium is appropriate. In a work environment, I have no argument with the Systems team making these decisions. Where I often disagree is when they (Systems) take the sledge-hammer approach. You mention that some might bar Android devices from their networks. This is a sledge-hammer approach. Why not do the research and block only those actions that cause problems? A classic example of this approach is managing user access to their C: drives to stop installation of software and changes to settings. The only problem is that access to the Temp folder can also be blocked, breaking legitimate procedures. The "block everything" approach is a problem, not a solution!

wizard57m-cnet

Standard operating procedure... Just in the past year or so, Google has unleashed Buzz, which harvested all of a Google member's contacts and broadcast activity to everyone on the list; then there was Street View in Europe "war driving" and gathering all kinds of info regarding wireless APs; now this. To me it seems that Google is in such a rush to grab users' eyes that they push questionable software and policies onto an unsuspecting, and many times unknowledgeable, public without any regard to potential security violations.

seanferd

You should have to turn them on yourself, and understand what they are. In general, who decides? The vendor, as always. So, you should understand your device or service before you start using it. Opt in, or opt out, you are responsible for your use of devices or services.

Alpha_Dog

...yet I tread a fine line in securing systems. I ensure the customer is apprised of threats (often with the use of allegory or pictures) as well as the possible actions. The customer must make the decision. It is our job to train, explain, protect, and fix, not arbitrarily make decisions for our clients unless they specifically ask. Google needs to keep its mitts to itself. It is paid to search and do other things that we entrust them to do to the best of their ability, using funds derived from selling advertising to pay for it all. To do anything else is questionable. To do something questionable in secret is evil.

SKDTech

Don't Be Evil. Is that not the core philosophy that Google and its defenders trot out whenever someone questions them? This is not evil, but it is definitely treading the line. Don't get me wrong, I like Google and a lot of the things they have done. But every so often they do something so naively idiotic that it makes you wonder if they live in the same world as the rest of us.

Spitfire_Sysop

By default, Google collects the GPS co-ordinates and passkeys for every WiFi AP touched by an Android. This creates a Google map that could give you directions to every AP in the world and then give you access without the owner's permission. My WiFi skeleton key scenario could be created by any unscrupulous Google employee or potentially one of the Chinese hackers who like to snoop around the Google servers. This is a danger that could pose a threat to national security. I think we should move straight from justified concern to legal action.

Mark W. Kaelin

Do you agree or disagree? Do you think the potential security risks are overstated or understated? Do older IT professionals and computer nerds like me need to chill or do we provide sage wisdom?

dcolbert

How many users would notice it, and *question* it? Most would go, "Awesome! I.T. must have put more magic Unicorn dust in the WiFi boxes so that it logs on automagically now!"

DigiTechDude

If they are foolish enough to not eliminate access to that user after firing him, they get what they deserve. If they are foolish enough to authenticate wireless access to the internal network with a shared key, they get what they deserve. In even a marginally secure network this scenario could never happen.
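DigiTechDude's underlying point is the difference between a shared pre-shared key, which cannot be revoked for just one departing employee, and per-user 802.1X credentials, which can. Strictly as an illustration - it uses the WifiEnterpriseConfig API that Google added in later Android releases, and the SSID, EAP method, and credentials are placeholders - a WPA2-Enterprise profile on the client side looks roughly like this:

    import android.net.wifi.WifiConfiguration;
    import android.net.wifi.WifiEnterpriseConfig;

    // Sketch of a WPA2-Enterprise (802.1X) Wi-Fi profile: each employee
    // authenticates with individual credentials that IT can revoke on the
    // authentication server, unlike a key shared by every device on the network.
    public class EnterpriseWifiProfile {
        public static WifiConfiguration build(String username, String password) {
            WifiConfiguration config = new WifiConfiguration();
            config.SSID = "\"CorpNet\""; // placeholder SSID
            config.allowedKeyManagement.set(WifiConfiguration.KeyMgmt.WPA_EAP);
            config.allowedKeyManagement.set(WifiConfiguration.KeyMgmt.IEEE8021X);

            WifiEnterpriseConfig enterprise = new WifiEnterpriseConfig();
            enterprise.setEapMethod(WifiEnterpriseConfig.Eap.PEAP);
            enterprise.setPhase2Method(WifiEnterpriseConfig.Phase2.MSCHAPV2);
            enterprise.setIdentity(username); // per-user credential
            enterprise.setPassword(password);
            config.enterpriseConfig = enterprise;
            return config;
        }
    }

The profile is added with WifiManager.addNetwork(); disabling the former employee's account on the authentication server then locks out his devices without changing anything for anyone else.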

Spitfire_Sysop

It's not only a solution, it's the best solution. It's called whitelisting. You first make an implicit deny or "block everything". Then you make your "white list" which is the list of things you want to work. This is full control. You cannot have control when something can be added without approval. This is called "change management". No changes are allowed without your approval. This is enterprise security. So it doesn't matter if you disagree with the Systems administrators because they are following a corporate policy based on enterprise level security. Your "block only the problems" does not account for the new problems. This is a basic security concept. The solution to your classic example is to add the temp folder to the white list. No problem.
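For readers who haven't run into the term, the pattern Spitfire_Sysop describes boils down to a default-deny check: nothing is permitted unless it appears on the approved list. A minimal sketch, with resource names invented for the example:

    import java.util.Set;

    // Whitelisting / implicit deny: anything not explicitly approved is blocked.
    public class Whitelist {
        private final Set<String> allowed;

        public Whitelist(Set<String> allowed) {
            this.allowed = allowed;
        }

        public boolean isPermitted(String resource) {
            return allowed.contains(resource); // default deny
        }

        public static void main(String[] args) {
            Whitelist acl = new Whitelist(Set.of("C:\\Temp", "C:\\Apps\\Approved"));
            System.out.println(acl.isPermitted("C:\\Temp"));              // true - on the list
            System.out.println(acl.isPermitted("C:\\Windows\\System32")); // false - implicitly denied
        }
    }

Under change management, the only way something new starts working is that someone approves adding it to the list - which is how the Temp folder in the classic example above gets handled.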

dcolbert

Doing something questionable, openly, but only if the person having it done to them goes out of their way to *search* for that disclosure? That boils down to the "defense" of Google I've heard in the forum on my original story - that it is the user's responsibility and that the information is OUT there - even if it isn't obvious.

dcolbert

in the forum on my original post. I'm not sure if it is intentional and done in pride and arrogance, or just Google's naivety and idealism crashing on the rocks of reality - but either way, it is dangerous. In that sense, where I've been accused of promoting anti-Google FUD, I don't think that is the case at all. I think this issue needed discussion in the light of day.

dogknees

... is that the network guys don't think it through and add Temp to the whitelist.

Alpha_Dog

It's not the disclosure or discovery that determines the action, but rather the effect and intent of it.