Adobe Digital Editions has raised privacy concerns among users. Find out why encryption is not the answer.
Privacy concerns have surfaced around Adobe Digital Editions, an eBook reader popular with libraries, universities, and individuals. An Ars Technica story on the issue describes how the software appears to send Adobe details of the digital books a user opens, along with some activity about how the user navigates them. The article emphasises that this data is sent to Adobe in the clear, not protected by encryption. That is not the problem, nor is encryption the answer.
Encryption in transit doesn't protect stored data from the law
Although it is true that the data can be read by "anyone who can monitor network traffic (such as the National Security Agency, Internet service providers and cable companies, or others sharing a public Wi-Fi network)", the NSA is not the concern here. Adobe is the concern. US law enforcement has lawful mechanisms to compel Adobe to turn over any data that Adobe possesses. So, while encryption would mitigate exposure to wholesale snooping on public WiFi, it would not protect the data from law enforcement. If Adobe collects the data, then the data can be subject to lawful disclosure. These are not the droids you are looking for.
We're not in Kansas any more
The data around eBook use is part of an overall digital rights management (DRM) programme. Although it appears Adobe is communicating with concerned parties like the American Library Association (ALA), it has not published anything publicly laying out its side of the story. One of the important concerns, not just in the US, is that the company giving you an eBook reader is recording every book that you read, when, for how long, and how far you've read it.
There exist eBooks, possibly even some protected with Adobe LiveCycle DRM, that are politically or legally sensitive. We have to consider the laws and governments beyond the US. In some countries, the Internet is policed by the government, and what the government learns about reading habits can be used to police the citizenry. It is easy to imagine a regime sufficiently technological and sufficiently repressive that it can and will use such data against its citizens. In America, consider a terrorist suspect accused of planning to build a bomb or do some other terrible thing. Given legal authorisation (e.g., a search warrant or similar approval) and the IP addresses used by an accused person, the FBI can compel Adobe to turn over reading records related to those IP addresses. In some states, there are laws protecting the privacy of a citizen's reading choices at a library. With Adobe collecting this data, there may be a legal loophole that the government can step through in the case of eBooks.
Sloppy design for privacy
Any software with a "phone home" mechanism jeopardises user privacy. Even something as simple as automatically checking for updates will reveal the fact that a particular piece of software is installed on a specific IP address. Moreover, it reveals the version number of that software, which might be out of date and contain significant vulnerabilities.
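To make the leak concrete, here is a minimal sketch of a typical update check. The endpoint, product name, and parameters are invented for illustration; this is not Adobe's actual protocol. The point is that even an innocuous-looking request reveals the client's IP address (implicitly, to the server) plus the installed product and its version:

```python
# Hypothetical update check -- the endpoint and parameter names are
# invented for illustration, not taken from any real product.
from urllib.parse import urlencode

APP_NAME = "ExampleReader"
APP_VERSION = "4.0.1"  # an out-of-date version number advertises its vulnerabilities

def build_update_check_url(base="https://updates.example.com/check"):
    # The server necessarily sees the requesting IP address; the query
    # string additionally reveals which product is installed and which
    # version it is running.
    query = urlencode({"product": APP_NAME, "version": APP_VERSION})
    return f"{base}?{query}"

print(build_update_check_url())
```

Anyone who can read this request, whether the vendor or a network observer, learns that this machine runs this software at this version.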
Note that encrypting the data between your computer and Adobe, while a nice idea, does not address the problem that Adobe knows what everyone is reading. Furthermore, consider the various data breaches that have happened lately where whole databases full of passwords have been disclosed. Do we imagine the list of books being read will be protected more or less strongly than the passwords for a web site?
Three principles for privacy in software
Assuming that information gathering is a necessary and important part of a system like Adobe's, there are some principles its designers could apply.
- Don't send data you don't need. Data that exists can be the subject of law enforcement. As a software firm, the less data you have, the fewer headaches this will cause you.
- Aggressively destroy data when it has been used. As soon as a decision has been made using the data (e.g., "is this most recent usage allowable?"), discard it. Collect fresh data to make new decisions. If repeated collections cause problems, then cache and optimise.
- It is really hard to anonymise data. As we have seen with NYC taxis, some things that look anonymous to a lay person are trivial for a knowledgeable person to deanonymise. The best practice is not to collect the data at all, rather than try to anonymise it.
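The first two principles can be sketched in a few lines. The usage-check below is hypothetical (the function names, fields, and licence rule are invented): the event carries only what the decision needs, and it is discarded the moment the decision has been made.

```python
# Hypothetical DRM usage check illustrating data minimisation: the
# field names and the licence rule are invented for this sketch.

def is_usage_allowable(event):
    """Decide whether this single usage event is permitted.

    `event` carries only what the decision requires -- a licence
    expiry and a timestamp -- not the book title, the reader's
    identity, or their position in the text.
    """
    return event["licence_expiry"] > event["timestamp"]

def handle_usage(event):
    allowed = is_usage_allowable(event)
    # Principle 2: drop the event as soon as the decision is made;
    # only the boolean outcome survives, so there is nothing on the
    # server for a later subpoena to reach.
    del event
    return allowed

print(handle_usage({"licence_expiry": 2000, "timestamp": 1500}))  # prints True
```

A system built this way has nothing interesting to disclose: no database of titles, timestamps, and reading positions ever accumulates in the first place.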