Chad Perrin analyzes the inherent problems of DRM, its security, and its negative effect on business.
Back in November 2007, I hinted at the inherent problems of DRM software in the article Radiohead knows more than Microsoft about security. I didn't really address DRM itself in any detail, however. I'll address it now.
As you might have gathered from the Radiohead article, DRM is essentially ineffective. Its only successes are in treating legitimate customers like criminals. A determined (and competent) security cracker can always find a way to circumvent DRM.

In April 2007, Ars Technica reported on statements by one of the people involved in the Xbox-based AACS key crack that allowed them to circumvent the HD-DVD format's DRM. Before that, AACS cracks had been "fixed" by "expiring" compromised content decryption keys and issuing new keys with new content. This meant that already cracked and released keys could be used to circumvent DRM on older content, but not on anything released after the new keys were issued. Such solutions don't address the real problem, though — that the new keys can be extracted as well, as the Xbox-based crack demonstrated.
Ultimately, there's no way to really protect content from its users without simply preventing the users from accessing it at all. DRM "protects" content by encrypting it and preventing the user from accessing it in unauthorized ways — including copying it. To be worth selling, though, it has to be accessible in authorized ways — including actually playing the content on an authorized player. This means there must be a way for the player to decrypt the content.
There are at least two different ways to leverage the necessity of decryption to circumvent DRM:
- You can access the content after it is decrypted. AACS standards impose specific requirements on authorized players to make it difficult to capture content after decryption for unauthorized copying, but ultimately the only reason this circumvention technique is not used more often is that it is usually easier to get the decryption key than to capture the content between decryption and display.
- You can access the decryption key, then use it to decrypt the content and capture it at your leisure. For the decryption key to be used in an authorized manner, it has to actually be used — which means it has to be accessible to the decryption software. If it's available to the decryption software, it's available for a security cracker to discover.
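Both attack paths above can be sketched with a toy model. This is purely illustrative: single-byte XOR stands in for real encryption (AACS actually uses AES), and all names here are made up. The point it demonstrates is structural — an authorized player must hold the key to do its job, so anyone who can inspect the player can take either the decrypted output or the key itself.

```python
# Toy model of the DRM dilemma: the player must carry the key it uses.
# XOR stands in for real encryption; EMBEDDED_KEY and AuthorizedPlayer
# are illustrative names, not part of any real DRM scheme.

EMBEDDED_KEY = 0x5A  # shipped inside every authorized player

def encrypt(content: bytes, key: int) -> bytes:
    """XOR each byte with the key; applying it twice decrypts."""
    return bytes(b ^ key for b in content)

class AuthorizedPlayer:
    def __init__(self) -> None:
        # The key must be present, or the player cannot play anything.
        self._key = EMBEDDED_KEY

    def play(self, protected: bytes) -> bytes:
        # Decrypts for playback; XOR is its own inverse.
        return encrypt(protected, self._key)

protected = encrypt(b"feature film", EMBEDDED_KEY)
player = AuthorizedPlayer()

# Attack 1: capture the content after decryption, at the player's output.
plaintext = player.play(protected)
assert plaintext == b"feature film"

# Attack 2: lift the key from the player, then decrypt at your leisure.
lifted_key = player._key
assert encrypt(protected, lifted_key) == b"feature film"
```

Nothing about a more sophisticated cipher changes the shape of this: hardening the player only raises the cost of the two attacks, it cannot remove either of them.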
The entire AACS saga highlights the core weakness of DRM. The point of DRM is to simultaneously prevent the user from accessing content and allow the user to access it. DRM like AACS does this by encrypting the content, then providing the decryption keys needed to access it and trusting that users are too stupid to access those keys in an unauthorized manner.
As long as you want your customers to access the content at all, you have to resign yourself to the harsh reality — that once you give the customers access, you can't take it back. Any other approach to it just means you're lying to yourself.
Since DRM is ultimately ineffective at stopping the people the purveyors want to stop, it's not really protection against copyright infringement. That leaves two things that DRM could be:
- If you subscribe to the notion that information is and should be the property of the person first disseminating it, DRM is just an insult to your customers. It restricts the ability of end users to access the content in legitimate ways by treating them like criminals, interfering with fair use, and even preventing customers from doing something as simple as watching a movie without getting a new DVD player just to satisfy your paranoia. This, of course, assumes that your customers won't just circumvent DRM.
- If you subscribe to the notion that "information wants to be free", or that the possessor of information should be able to do whatever he or she wants to do with it, or that copyright law is simply wrong, DRM is worse than an insult — it's a violation of the rights of every single customer.
Either way, it's just a bad way to do business.
People react negatively to the way content providers are treating their customers. People who would otherwise just buy content and use it the way the content providers would like them to are becoming irate, boycotting the worst offenders among content providers and even infringing copyright themselves in some cases. I'm sure Sony/BMG isn't even aware of how much damage it has ultimately done to its own business by mistreating its customers.
Some groups are even making concerted efforts to make life difficult for DRM users by preventing them from effectively using resources that are generally accessible to everyone else. The third version of the GPL, for instance, requires DRM software licensed under its terms to make any "authorization keys" available with the source code:
"Installation Information" for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source.
The GNU Quick Guide to GPLv3 puts it this way:
It's always possible to use GPLed code to write software that implements DRM. However, if someone does that with code protected by GPLv3, section 3 says that the system will not count as an effective technological "protection" measure. This means that if you break the DRM, you'll be free to distribute your own software that does that, and you won't be threatened by the DMCA or similar laws.
As I've already pointed out, there are negative consequences to saddling your customers with DRM. DRM isn't the only problem here, though. Anything that attempts to restrict what people can do with what you've given them can have similar negative consequences. I need to be perfectly clear here: I'm talking about what you can do — not what you should do. When you try to restrict what people can do by applying rules across the board to anyone and everyone, you sometimes end up stopping people from doing what they should do.
The problem is unintended consequences — in trying to restrict what people can do, you may end up having the opposite of the intended effect. This is the problem with attempts in Germany and England, and even in parts of the United States, to outlaw network administration tools that could be used to crack the security of others' networks: those same tools are used by the people who secure our networks against malicious security crackers. By the same token, it is also the problem with trying to prevent people from writing DRM software, which cannot be done without destroying the effectiveness of the software at the same time.
By mandating in version 3 of the GPL that any "authorization keys" must be provided with the source code, the GNU Project is effectively saying that certain types of software built from GPLed code cannot be allowed to be as effective as their developers could otherwise make them. This discourages certain types of security software research with GPLed code, discourages greater adoption of open source software by commercial entities, and could easily have further unintended negative consequences that are not yet as obvious.
This is probably where some of my readers expect me to say "April Fools!" It's the first of April, after all. I just got done saying, first, that DRM used to "protect" content is bad, and second that we shouldn't try to prevent anyone from creating DRM systems — which probably seems contradictory.
None of this is a joke, though. I guess I just can't come up with a good one for April Fools' Day this year. There's no contradiction in what I said.
It's all based on ideas like Kerckhoffs' Principle and Shannon's Maxim — a rephrasing of the basic concept in Kerckhoffs' Principle that says "The enemy knows the system." Ultimately, it all just means that trying to interfere with the way people use what they have by keeping its internal workings secret is doomed to failure. The common thread is that security cannot be bought with attempts to restrict how people might use what you've given them.
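A minimal sketch of what "the enemy knows the system" means in practice: once the algorithm is public, the only secret left is the key, and any scheme whose keyspace (or key handling) is weak falls immediately. Here single-byte XOR is the "system" — chosen purely for illustration, not as a real cipher — and a known-plaintext check recovers the key by brute force.

```python
# Shannon's Maxim in miniature: the attacker knows the algorithm
# (single-byte XOR, illustrative only), so only the key is secret --
# and a 256-value keyspace falls to brute force with known plaintext.

def xor(data: bytes, key: int) -> bytes:
    """XOR each byte with a one-byte key; self-inverse."""
    return bytes(b ^ key for b in data)

SECRET_KEY = 0x3C  # the defender's only secret
ciphertext = xor(b"attack at dawn", SECRET_KEY)

# The attacker tries every possible key, checking a known prefix.
recovered = next(
    key for key in range(256)
    if xor(ciphertext, key).startswith(b"attack")
)
assert recovered == SECRET_KEY
assert xor(ciphertext, recovered) == b"attack at dawn"
```

Real ciphers survive this only because their keyspaces are astronomically large — the secrecy of the mechanism itself buys nothing, which is exactly why DRM's attempt to hide keys inside the mechanism is doomed.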
Someone who intends to circumvent your security measures will not be stopped by the attempt to convince them to ignore what's already in plain view.