Imagine standing on a city street corner in the not-too-distant future, and an app on your phone can determine that you’re angry, based on your facial expression. On your wrist is a wearable device that shows your heart rate is elevated, and your arm is raised. The smart glasses you’re wearing have facial recognition, and identify the person you’re with. The app on your companion’s smartphone, and their own wearable device, show a similar angry expression and an elevated heart rate. An inference can be made that you’re fighting with this person, and that data is stored away for future reference; if the scene is ever needed in court for a criminal case, nearby building cameras could be accessed to show it.
If this sounds farfetched, it’s not. The amount of data being collected by just the wearable device on your wrist is simply astounding. Damien Mehers, a wearables developer who built the Evernote app for Pebble and the Samsung Galaxy Gear, said, “Especially with the fitness [devices], if you read the license agreements, if people really realized what they are signing up for, they might be horrified at what they’re allowing the companies to do with the data. I think there needs to be more clarity and perspective from the user.”
Types of data being collected
Because of this mass collection of data from all sources, which includes wearables, the Federal Trade Commission released a report to Congress in May 2014 after an in-depth study of nine data brokers. The FTC explained in the report, “Data Brokers, A Call for Transparency and Accountability,” that data brokers collect personal information about consumers from a wide range of sources and provide it for a variety of purposes to companies who pay for the information. “Because these companies generally never interact with consumers, consumers are often unaware of their existence, much less the variety of practices in which they engage,” the report said.
And someone isn’t immune just because they avoid social media and wearables. The FTC said data brokers collect an average of 3,000 data segments on nearly every US consumer.
Here is the type of data that brokers collect, according to the FTC report:
- Identifying data: name, address, etc.
- Sensitive identifying data: Social Security number, driver’s license number, etc.
- Demographic data: age, gender, race, languages spoken, employment, religion, etc.
- Court and public record data: bankruptcies, criminal convictions, marriage licenses, voting registration, etc.
- Social media and technology data: purchases, level of usage, Facebook and Twitter usage, number of friends in social networks, online influence, etc.
- Home and neighborhood data: dwelling type, home loan, interest rate, etc.
- General interest data: apparel preferences, attendance at sporting events, gambling, magazine subscriptions, media channels used, pets, preferred movie and music genres, etc.
- Financial data: ability to afford products, credit card user, credit worthiness, financially challenged, discretionary income level, net worth, tax return transcripts, etc.
- Vehicle data: brand preferences, propensity to purchase new or used vehicle, motorcycle owner, intent to purchase vehicle, etc.
- Travel data: highest price paid for travel purchase, cruises booked, preferred vacation destination, date of last travel purchase, etc.
- Purchase behavior data: amount spent on goods, buying activity, method of payment, buying channel preference (internet, mail, phone), shooting game purchases, guns and ammunition purchases, purchase of plus-sized clothing, average days between orders, novelty Elvis purchases, etc.
- Health data: tobacco usage, allergies, prescription purchases, brand name medicine preference, contact lenses user, weight loss supplements, reported interest in various health topics, etc.
The data brokers combine online and offline data to market to consumers online. It’s a big business – in 2012, the nine data brokers analyzed generated $426 million in annual revenue for these products. Consumers have very little ability to know, let alone control, what data is being collected, so in the report, the FTC asked Congress to enact legislation to allow people to know what data is being collected about them and who is collecting it. Each data broker source provides just a few data elements about each person, but data brokers put all of these items together to form a detailed composite about a specific consumer, the report explained.
The data is used to make inferences about consumers, and they’re placed into categories based on their interests and habits. Some categories target race and income, with “Urban Scramble” and “Mobile Mixers” including a high concentration of Latinos and African Americans with low incomes. Other categories highlight age, with “Rural Everlasting” a group of single men and women over the age of 66 with low educational attainment and low net worth. “Married Sophisticates” are upper middle-class couples in their 30s with no children.
Some of the data can be used for purposes that consumers wouldn’t necessarily agree to. The report said, “Moreover, marketers could even use the seemingly innocuous inferences about consumers in ways that raise concerns. For example, while a data broker could infer that a consumer belongs in a data segment for ‘Biker Enthusiasts,’ which would allow a motorcycle dealership to offer the consumer coupons, an insurance company using that same segment might infer that the consumer engages in risky behavior. Similarly, while data brokers have a data category for ‘Diabetes Interest’ that a manufacturer of sugar-free products could use to offer product discounts, an insurance company could use that same category to classify a consumer as higher risk.”
Terms of service in wearables
Wearables open an entirely new avenue for data collection, with the user’s heart rate, activity level and sleeping habits among the data points being stored, depending on the device used. As Mehers said, the consumer signs away a lot of rights when they accept the terms of service.
Tatiana Melnik, an attorney who works in healthcare IT, data privacy and security, said there are a number of privacy and security concerns with wearables.
“You have to read the terms of service for a wearable device. But on the other hand, you want to use the device. You’re not really in a position where you can’t accept the terms. You don’t exactly have a choice,” she said. “It’s a fundamental notion of fairness and what’s fair for you might not be fair for me. At what point do I make a decision that you can’t punish me because I don’t want to play by your rules.”
The language is vague in most terms of service agreements. Even the phrase “third party” is up for analysis, since it could mean practically anyone who is in contact with the company. “There are a lot of people who are maybe okay with sharing their heart rate with their doctor, but not with an advertiser or a pharmacy,” she said.
“But it’s true that sometimes consumers don’t realize what they’re giving up by way of using those devices. For example, consider GPS. A lot of the devices now are GPS enabled. That means you’re giving a third-party company the ability to track your every move. Is that something you’re comfortable doing?
“One of the other issues is that in many of these agreements you’ll find something that says something like, ‘In the event of the sale of our company, or if we’re in bankruptcy, we can sell your data.’ They reserve the right to sell the data. People don’t realize that their data is an asset. There is value to knowing all this data about a person. You know their buying habits, their likes and dislikes. Certainly I’m okay with Company X having my data, but not Company Y. But you can’t undo that. Once you agree, there’s no way to undo it.”
And many companies, such as Facebook, are frequently updating their terms of service, so it’s hard to keep up with the changes, even if they were originally read in their entirety. Facebook recently made the news when it was discovered that the company had “emotionally experimented” on its users, as reported on ZDNet.
Security of wearables
And then there’s the overall security of the data being collected. Melnik said, “When you’re looking at wearables you have other security issues. The software is only as good as the developer. If someone makes a mistake and there’s a huge loophole in the software, who patches that? Who has the responsibility?”
Paco Hope, principal consultant at Cigital, said that it’s very possible to hack into a wearable device at its weak point – when it sends data to the cloud. “Every device like that tends to work in a client server environment where it’s not very clever. It’s uploaded to the cloud or somewhere else, and that’s where you spoof these things. You pretend to be the server, the mothership. These devices are so small that they can’t do terribly sophisticated embedding to see who that is on the other end.
“There are perfectly reasonable ways to secure that communication, but that’s when it’s running on a pretty full-featured device like a phone or a PC or a tablet. On a tiny computer meant to fit into a watch or sunglasses or a shoe, the power you have is a lot more limited,” Hope said.
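The kind of protection Hope describes for a full-featured device can be sketched in a few lines. The idea is certificate pinning: in addition to the normal CA-chain and hostname checks, the client rejects any server whose exact certificate doesn’t match a known fingerprint, so a spoofed “mothership” is refused even if it presents a certificate a trusted CA signed. This is an illustrative sketch, not any vendor’s actual code; the host name and pinned fingerprint below are hypothetical placeholders.

```python
import hashlib
import socket
import ssl

# Hypothetical values for illustration only; no real wearable vendor
# or fingerprint is represented here.
VENDOR_HOST = "sync.example-wearable.com"
PINNED_SHA256 = "0" * 64  # placeholder for the expected certificate hash

def cert_fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded server certificate."""
    return hashlib.sha256(der_cert).hexdigest()

def connect_with_pinning(host: str, pin: str, port: int = 443) -> ssl.SSLSocket:
    """Open a TLS connection, then reject any server whose certificate
    does not match the pinned fingerprint, even if its CA chain is valid."""
    context = ssl.create_default_context()  # standard CA-chain + hostname checks
    sock = context.wrap_socket(socket.create_connection((host, port)),
                               server_hostname=host)
    der_cert = sock.getpeercert(binary_form=True)
    if cert_fingerprint(der_cert) != pin:
        sock.close()
        raise ssl.SSLError("server certificate does not match the pin")
    return sock
```

The catch, as Hope notes, is resources: maintaining a CA store and doing full TLS handshakes is cheap on a phone but expensive on a watch-sized computer, which is one reason this checking usually lives on the paired phone rather than on the wearable itself.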
Brian Brown, vice president of technology and security for RenewData, said the use of wearables in the enterprise makes it tougher to restrict data flow in and out of an organization when employees wear them to work. It increases the attack surface by allowing more points of entry for hackers, so he has to weigh the risk versus benefit.
And then the development process itself allows for security breaches, Brown said.
“It’s a typical technology development scenario. You have speed to market with the technology, then you have to balance your risk and cost of developing that technology and all those factors combined into a security profile of a device. Most of the wearables are not standalone devices. A lot of them back into a mobile phone of some sort, so you have other ways to compromise that data,” Brown said.
Steven LeBoeuf, president of Valencell, agreed that the cloud is the weak point. “When the data is being sent to the cloud, that’s when the consumer loses control because the data goes to a company that has a service and you’re relying on that company and that service to keep your data. The more data you share with the companies you trust the better shape you’re in. But the more data, the more wary you must be.”
However, many people don’t worry about privacy and security of their personal information, either because they aren’t aware of the risks, or they don’t know how much data is truly being collected.
Conan Dooley, security analyst with Bishop Fox, said, “People are so willing to give up privacy for functionality because functionality is a tangible benefit and privacy is an immaterial one.”