Despite the fact that I really like the Google Chrome Web browser, I gave up using it last month around the same time I gave up Coke Zero. I gave up Coke Zero simply because of the caffeine. My reason for giving up Chrome was a little more complicated.
Chrome is a terrific Web browser, maybe even the best browser available. It's secure. It's fast. It's simple to configure and use. And, it makes very efficient use of screen real estate. In February, Chrome was the only one of the top five Web browsers to gain market share, and it did so at the expense of Internet Explorer, Firefox, Safari, and Opera. While it still holds only about 5% of the market and is a distant third behind IE and Firefox, Chrome has lots of momentum.
I've never had a problem with Chrome itself, but I've become more and more wary about trusting Google with my data. Those feelings of mistrust intensified recently when the company released Google Buzz, as I explained in my article Why Google Buzz confirmed our two worst fears about Google.
Google's haphazard attitude toward data privacy was my biggest concern, and it made me want to limit the amount of data I gave to Google. Since I was already doing most of my Web searches through Google.com and handling most of my personal email through Gmail, I decided to avoid the Chrome Web browser and the Android smartphone platform.
If Google had all four of those platforms (search, email, browser, and smartphone), it would have had a near-complete digital footprint of my online activities. Limiting it to two seemed wise. I was especially concerned with Chrome saving and storing my entire Web browsing history and sending it back to Google.
However, for people like me -- along with concerned IT leaders and technology professionals -- Google recently published a YouTube video (see below) explaining Google's privacy approach in Chrome. In this video Google says, "Using Chrome doesn't mean sharing any more information with Google than using any other browser."
As simple as this statement may sound, it has important implications for the future of both Chrome and Google itself. Chrome needed the statement because power users needed the reassurance that Google wasn't going to use Chrome to further mine our data and manipulate it in ways that we never intended or consented to.
The statement is also important for Google in general because the company wants to make inroads with businesses (see the recent announcement of the Google Apps Marketplace), and businesses and IT departments need a much stronger level of trust and confidence in Google's privacy and security policies.
I applaud Google for making this statement, and I also intend to hold Google to its promise by occasionally running scans to make sure Chrome really isn't "phoning home" to the mothership with any of my data.
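Those scans needn't be elaborate. One lightweight starting point, before reaching for a packet sniffer like Wireshark, is to check whether Chrome's own usage-metrics reporting flag is switched on, which the browser records in its "Local State" preferences file. The sketch below is a minimal check under that assumption; the `user_experience_metrics.reporting_enabled` key and the file's location can vary by Chrome version and platform, so treat it as illustrative rather than definitive.

```python
import json

def metrics_reporting_enabled(local_state_path):
    """Return True if Chrome's usage-metrics reporting flag is set.

    Assumes the 'Local State' file is JSON with a
    user_experience_metrics.reporting_enabled key; if the key is
    absent, we conservatively report False.
    """
    with open(local_state_path) as f:
        state = json.load(f)
    metrics = state.get("user_experience_metrics", {})
    return bool(metrics.get("reporting_enabled", False))
```

On Windows the file typically lives under the Chrome user-data directory (e.g., `%LOCALAPPDATA%\Google\Chrome\User Data\Local State`); checking this flag tells you what Chrome is configured to send, while a network capture tells you what it actually sends.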
Because Google is sitting on one of the largest collections of human information on the planet, the company still needs to become even more transparent and make similar statements about its policies for:
- Ensuring its data centers are secure from attackers, both internal and external
- Locking down user data so that Google employees cannot violate user privacy
- Anonymizing data so that usability and UX information is not tied to specific users
That's the kind of stuff Google needs to share if it wants to win over the enterprise.
Jason Hiner is Editor in Chief of TechRepublic and Long Form Editor of ZDNet. He writes about the people, products, and ideas changing how we live and work in the 21st century. He's co-author of the upcoming book, Follow the Geeks (bit.ly/ftgeeks).