
Privacy concerns about data collection may lead to dumbing down smart devices

A new wave of smart devices, sensors, and the Internet of Things collecting data will make it hard to remain anonymous offline. Will the public wake up to the risks all of that data poses to their privacy?
 
Image: iStock/maxkabakov

Should we do something just because we can? That simple question has bedeviled many leaders over the centuries, and has naturally arisen more often as the rate of technological change (e.g., chemical weapons, genetic engineering, drones, online viruses) has increased. In many cases, scientists and engineers have been drawn, as if by siren song, to create something that never existed because they had the power to do so.

Many great minds in the 20th century grappled with the consequences of these decisions. One example is theoretical physicist J. Robert Oppenheimer:

"When you see something that is technically sweet, you go ahead and do it and argue about what to do about it only after you've had your technical success," he said, in a Congressional hearing in 1954. "That is the way it was with the atomic bomb."

In the decades since, with the development of thermonuclear warheads, intercontinental ballistic missiles, and the Cold War arms buildup, humanity has had to live with the reality that we possess the means to end life on Earth as we know it, a prospect that has spawned both post-apocalyptic fiction and paranoia.

In 2014, the geostrategic imperative to develop the bomb ahead of the Nazis no longer drives development. Instead, there are a host of decisions that may not hold existential stakes for life on Earth, but that will shape how it is lived by the billions of humans on it.

This year, monkeys in China became the first primates born with edited genomes. The technique used, CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats), has leapt quickly from lab to industry and holds immense potential for genome surgery. CRISPR could enable doctors to heal genetic disorders like sickle-cell anemia or, in the future, more complex diseases. Genome surgery is, unequivocally, an extraordinary advance in medicine. There will be great temptations in the future, however, to apply it beyond disease.

Or take a technology that has become a lightning rod: Google Glass. Google banned facial recognition on Google Glass in the name of privacy, though it had included the feature in Google+ years before.

While Google turns facial recognition off by default, Facebook turns it on, suggesting people to tag when users upload photos and thereby increasing the likelihood that people will be identified. As always, the defaults matter: such tagging adds more data to Facebook's servers, including "shadow profiles" of people who may never have created accounts on the service but whom Facebook knows exist.
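To make the point about defaults concrete, here is a minimal sketch in Python of how a default setting changes aggregate data collection. The platform size and the share of users who ever change a setting are illustrative assumptions, not figures from either company:

```python
# Illustrative sketch: how a default setting changes aggregate data collection.
# The rates and user count below are assumptions, not real figures.

OPT_IN_RATE = 0.05   # share of users who enable a feature that is off by default
OPT_OUT_RATE = 0.05  # share of users who disable a feature that is on by default

def users_with_feature_enabled(total_users: int, default_on: bool) -> int:
    """Estimate how many users end up with facial-recognition tagging enabled."""
    if default_on:
        # Feature stays on unless a user actively opts out.
        return int(total_users * (1 - OPT_OUT_RATE))
    # Feature stays off unless a user actively opts in.
    return int(total_users * OPT_IN_RATE)

if __name__ == "__main__":
    users = 1_000_000_000  # hypothetical platform size
    print("Default off:", users_with_feature_enabled(users, default_on=False))
    print("Default on: ", users_with_feature_enabled(users, default_on=True))
    # With the same 5% of users changing the setting either way, the default
    # alone determines whether ~50 million or ~950 million accounts feed
    # the recognition system.
```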

Over time, the increasing reach of both technology companies will make it harder than ever to be anonymous in public or formerly private spaces. Even if these two companies agreed not to integrate facial recognition by default into their platforms or tethered devices, what will the makers of future wearable computing devices or services choose? Government agencies face similar choices; in fact, U.S. Customs and Border Protection is considering scaling facial recognition systems at the U.S. border.

Several news stories from the past week offer more examples of significant choices now before society, their long-term impact, and the lack of public engagement that preceded them.

The New York Times reported that a new system of "smart lights" installed at Newark Liberty International Airport is energy efficient and is also gathering data about the movements of the people the lights "observe." The lights are part of a wireless system that sends the data to software able to detect long lines or recognize license plates.
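As a rough illustration of how such a pipeline might work, here is a minimal sketch in Python of per-fixture sensor readings flowing to software that flags a long line. The event schema and the threshold are hypothetical, not details of the Newark installation:

```python
# Hypothetical sketch of a "smart lights" analytics pipeline: fixtures report
# how many people their sensors currently see, and central software flags
# congestion. The schema and threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class SensorReading:
    fixture_id: str    # which light fixture reported
    zone: str          # area of the terminal it watches
    person_count: int  # people detected in the zone

LONG_LINE_THRESHOLD = 40  # arbitrary cutoff for this sketch

def detect_long_lines(readings: list[SensorReading]) -> list[str]:
    """Return the zones whose combined counts suggest a long line."""
    totals: dict[str, int] = {}
    for r in readings:
        totals[r.zone] = totals.get(r.zone, 0) + r.person_count
    return [zone for zone, count in totals.items() if count >= LONG_LINE_THRESHOLD]

if __name__ == "__main__":
    readings = [
        SensorReading("light-17", "security-A", 25),
        SensorReading("light-18", "security-A", 22),
        SensorReading("light-31", "baggage-3", 6),
    ]
    print(detect_long_lines(readings))  # ['security-A']
```

Even in this toy version, the privacy tension is visible: the same per-fixture counts that enable the congestion alert are also a record of where people were, and when.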

The story is an instructive data point. The costs of gathering, storing, and analyzing data through sensors and software are plunging, and they are paired with strong economic incentives to save energy and time. As The New York Times reported, such sensors are being integrated into infrastructure all around the world, under the rubric of "smart cities."

There are huge corporations (including Cisco, IBM, Siemens, and Philips) that stand to make billions installing and maintaining the hardware and software behind such systems, many of which I saw on display at the Smart Cities Expo in Barcelona years ago. Many of the potential benefits are tangible: lower air pollution through reduced traffic congestion, early detection of issues with water and sewage systems, and lower energy costs in buildings and streetlights.

Those economic imperatives likely mean that the questions legislators, regulators, and citizens increasingly grapple with will focus on how such data is used and by whom, not on whether it is collected in the first place, although parliaments and officials may decide to go further. "Dumbing down" such systems once they're installed, or removing them entirely, will take significant legal and political action.

The simple existence of a system like that in the airport in Newark should be a clarion call to people around the country to think about what collecting that data means, and whether it's necessary. How should we weigh the societal costs of such collection against the benefits of efficiency?  

In an ideal world, communities would be given the opportunity to discuss whether installing "smart" streets, stoplights, parking meters, electric meters, or garages--or other devices from the much larger Internet of Things--is in the public interest. It's unclear whether local or state governments in the United States or other countries will provide sufficient notice of proposed installations to support such debate.

Unfortunately, that may leave residents to hope that watchdogs and the media will monitor and report on such proposals. At the federal level, there are sufficient resources to do so, as happened last week when The Washington Post reported that the Department of Homeland Security (DHS) was seeking a national license plate tracking system. After the subsequent furor, the DHS canceled the plan, citing privacy concerns. Data collection that would support such a system may still occur, however, with private firms arguing a First Amendment right to collect license plate data.

What will happen next on this count is unclear, at least to me. While the increasing use of license plate scanners has attracted the attention of the American Civil Liberties Union, Congress and the Supreme Court will ultimately have to guide their future use and application.

They'll also be faced with questions about the growing use of sensors and data analysis in the workplace, according to a well-reported article in the Financial Times. The article's author, Hannah Kuchler, wrote, "More than half of human resources departments around the world report an increase in the use of data analytics compared with three years ago, according to a recent survey by the Economist Intelligence Unit."

Such systems can monitor behavior, social dynamics, or movement around workspaces, much like the system at Newark's airport. All of that data will be discoverable: if email, web browsing history, and texts on a workplace mobile device can be logged and used in e-discovery, data gathered from sensors around the workplace may well be too.

There's reason to think that workplace data collection, at least, will gain some boundaries in the near future. A 2010 Supreme Court decision on sexting, which upheld a 1987 decision recognizing the workplace privacy rights of government employees, offers some insight.

"The message to government employers is that the courts will continue to scrutinize employers' actions for reasonableness, so supervisors have to be careful," said Jim Dempsey, the Center for Democracy and Technology's vice president for public policy, in an interview. "Unless a 'no privacy' policy is clear and consistently applied, an employer should assume that employees have a reasonable expectation of privacy and should proceed carefully, with a good reason and a narrow search, before examining employee emails, texts, or Internet usage."

Just as a consumer would do well to read the Terms and Conditions (ToC) for a given product or service, so too would a prospective employee be well advised to read his or her employment agreement. The difference, unfortunately, is that in today's job market, only a minority of people have the economic freedom to refuse to work at an organization that applies such monitoring.

If the read rate for workplace contracts that include data collection clauses is anything like that for End User License Agreements (EULAs) or ToC, simply reapplying last century's "notice and consent" model won't be sufficient. Expecting consumers to read documents that run dozens of pages on small mobile device screens may be overly optimistic. (The way people read online suggests that many visitors to this article never made it this far. Dear reader, I am glad that you are still with me!)

All too often, people treat the long EULAs, ToC, or privacy policies they encounter online as "TL;DR"--something to be instantly scrolled through and clicked, not carefully consumed. A 2012 study found that a consumer would need 250 hours (a month of 10-hour days) to read all of the privacy policies she encountered in a year. The answer to the question of whether most consumers read the EULA, much less understand it, seems to be a resounding "no." That means it will continue to fall to regulators and Congress to define the boundaries for data collection and usage in this rapidly expanding arena, as in other public spaces, and to suggest to the makers of apps and other digital services that pursuing broad principles of transparency, disclosure, usability, and "privacy by design" is the best route for consumers and businesses alike.
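The arithmetic behind that 250-hour figure is easy to check. Here is a minimal sketch in Python using round inputs chosen to match the spirit of the estimate; the policy count and per-policy reading time are assumptions, not the study's exact numbers:

```python
# Back-of-the-envelope check on the "250 hours a year" figure.
# The inputs are round assumptions, not the study's exact numbers.

policies_per_year = 1500   # distinct sites/services whose policies a user encounters
minutes_per_policy = 10    # time to read one policy

hours_per_year = policies_per_year * minutes_per_policy / 60
workdays = hours_per_year / 10  # at ten hours of reading per day

print(f"{hours_per_year:.0f} hours = {workdays:.0f} ten-hour days")
# -> 250 hours = 25 ten-hour days: roughly a month of workdays.
```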

While some officials, like FTC commissioner Julie Brill, are grappling with big data and consumer privacy (PDF), the rapid changes in what's possible have once again outpaced the law. Until legislatures and regulators catch up, the public has little choice but to look to Google's and Mark Zuckerberg's stances on data and privacy, the regulation of data brokers and telecommunications companies, and the willingness of industry and government entities to submit to some measure of algorithmic transparency and auditing of data use.

There's hope that in the near future the public will be more actively engaged in discussing what data collection and analysis mean to society, perhaps through the upcoming public workshops on privacy and big data convened by the White House at MIT, NYU, and the University of California at Berkeley. But public officials at every level will need to do much better at engaging the consent of the governed. The signs from Newark and Chicago are not promising.

 

About

Alex Howard writes about how shifts in technology are changing government and society. A former fellow at Harvard and Columbia, he is the founder of "E Pluribus Unum," a blog focused on open government and technology.

Comments
adornoe

Here's a great idea for a great application.


There are far too many people using the internet and other technology, such as banking systems, who don't really care about the footprint they leave behind. There are also a lot of people who are unaware, or ignorant, of how their data and browsing habits are being used to form a profile of them.

We know about services that tell us our broadband speed if we suspect we're being shortchanged. Somebody needs to come up with a similar service that tells people how big a footprint they've left behind: a meter-like service that informs people how much of their privacy is being invaded, with or without their knowledge. A gauge that reaches the red level would indicate that a person is in high danger from data collection and spying. (A rough sketch of such a meter appears after this comment.)

Once a person sees that meter reach the red, or even approach it, it would be a reminder or warning that they had better pull back and try to take back control of what they do. Clicking for further details about what the service "knows" would be a wake-up call if those details reveal a lot more than the user even suspected. If a service were to reveal to me that it knows who my friends and family members are, the different places I've lived, where I went to school, where I bank, how many accounts I have, where I've been on the internet (including all the social media sites I regularly visit), which people I video-chatted with in the last year, what kind of searches I conducted, and so on, I would be scared as hell, and I'd probably get the heck off the internet and go become a hermit in a cave or on a mountain somewhere.

I know there are services that do similar research for people, but creating a detailed analysis of one's trail on the internet and elsewhere would be a much bigger deal, one that people would be tempted to use.
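A minimal sketch in Python of what such a "footprint meter" might look like, with exposure categories, weights, and gauge thresholds that are entirely invented for illustration:

```python
# Hypothetical "privacy footprint meter": scores how much of a person's life
# is reconstructable from collected data. Categories, weights, and thresholds
# are all invented for illustration.

EXPOSURE_WEIGHTS = {
    "social_graph": 3,      # friends and family are known
    "locations": 3,         # home, school, workplaces
    "financial": 4,         # banks and accounts
    "browsing_history": 2,
    "search_history": 2,
    "communications": 4,    # who you chatted with and when
}

def footprint_score(exposed: set[str]) -> int:
    """Sum the weights of every category known to be exposed."""
    return sum(EXPOSURE_WEIGHTS[c] for c in exposed if c in EXPOSURE_WEIGHTS)

def gauge(score: int) -> str:
    """Map a score to the green/yellow/red gauge the comment imagines."""
    if score >= 12:
        return "RED: high exposure, time to take back control"
    if score >= 6:
        return "YELLOW: significant exposure"
    return "GREEN: limited exposure"

if __name__ == "__main__":
    exposed = {"social_graph", "locations", "financial", "search_history"}
    score = footprint_score(exposed)
    print(score, gauge(score))  # 12 RED: high exposure, time to take back control
```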

Adam Blackie

Taking a slightly wider view... I have also noticed that the definition of privacy, and the need for it, has changed over time.

As a baby boomer, I was brought up to expect it. As a parent of Generation Y offspring, I notice that they don't understand the concept in the same way I do.

The concept of and need for privacy seem to be changing over time in a similar way to, for example, language use, social manners, and gender roles, all of which are very different from 40 years ago but have changed subtly and incrementally.

Whether this is better or worse for society is a value judgement based on what use is made of the data. Only time will tell.

Michael Berg

Who will read and analyse the data? There's simply too much. Humans like making complex things, thinking it is progress. It is not, for true progress would see the disappearance of any thought of spying on people. Simply switch the device off.

cartmit

Here is an idea that is "out of the box": instead of debating whether private firms have a First Amendment "right" to collect license plate data, ask the question "Should the state have the power to require vehicle registration and display of those license plates?"

I would argue that if the founders of this nation were included in the debate, their answer would be a resounding NO! That such power was specifically NOT included in the very limited set of powers granted to government. And for those who point out that such powers would then be reserved to the "states or the people", those who value liberty above security would prohibit such power to the individual states as well.

Jimmy Young

Seems to me we should revamp the way ToCs and EULAs are communicated to the general public. Two versions of each document should be created: one base document for legal purposes, and another translated into what matters to the "everyday user" who isn't aware of the impact their data has on the online environment around them.

C-3PO

It would seem to me that in most cases it is not that the data is used, but that it is collected. For instance, facial recognition would be fantastic for my ailing memory, to remind me who a person is. That could be pulled from data already known, i.e., the person's facial metrics. The problem comes in when the location where the person was seen is recorded upon recognition of the face. Which is the private information? Why do they need to know that this person was spotted in such and such a place? Perhaps airport security needs to keep a record of who passed through a space, but then that information needs to be kept under strict control.

Does the company that allows me to remotely change my thermostat need to keep records of what it was set to? Probably not. If they need to do some "research" in this regard, there can be a subset of people who are asked to contribute that data.

License plate recognition? Something similar to facial recognition. Hopefully it does not become a "pay a premium to get through the lights more quickly" scheme that would favour the rich, though allowing emergency vehicles quick passage through lights would be a real boon to health care.

Yes, it's complicated. Again, the problem is not that the data is collected, but how it is used and by whom. Benevolent use can benefit us all, but nefarious characters will always try to profit by it, either legitimately or illegitimately.

whitewolf60

@cartmit Long story short...my father has for years tooled around with "I.D. plates" on his personal cars, trucks, and motorcycles; plates which he had made at a local graphics shop. Note that he also has no "driver's license".

Any law enforcement officer ignorant enough to attempt to "cite" him for various imagined "infractions" found that he had bitten off more than he could chew. After all, he was not a "driver" (a person employed to conduct a motor vehicle) and he was not in a "motor vehicle" (commercial transportation device). He is just a private individual, traveling for personal reasons...and you don't need a license to travel.

I can't avail myself of this benefit as I currently hold a "Commercial Drivers License" and expect to drive in commerce in the future.

The key: acquiring a "drivers license" makes you subject to the traffic (commerce) laws. Registering your personal conveyance as a "motor vehicle" declares to the state your intent to utilize your transportation device in commerce. You can be compelled to do neither.

Both ignorant and free cannot be!
