Security

Microsoft to block keys less than 1024 bits in August software update

This is your last chance to prepare for an August update from Microsoft that will reject cryptographic keys shorter than 1024 bits.

An important update from Microsoft is coming in August, and for those who missed the announcement earlier this year, it could break a number of things for businesses that haven't prepared for the change. The update will block anything that uses a cryptographic key shorter than 1024 bits. Here are some of the repercussions for those who haven't prepared in advance:

  • Error messages when browsing to websites whose SSL certificates use keys shorter than 1024 bits
  • Problems enrolling for certificates when a certificate request attempts to use a key shorter than 1024 bits
  • Problems creating or consuming email (S/MIME) messages that use keys shorter than 1024 bits for signatures or encryption
  • Problems installing ActiveX controls that were signed with keys shorter than 1024 bits
  • Problems installing applications that were signed with keys shorter than 1024 bits (unless they were signed prior to January 1, 2010, in which case they are not blocked by default)

For a little background: with encryption, security is often measured by how long it would take to break that encryption, or more practically, by key length. The genius of modern encryption is that some math functions are very fast in one direction but extremely slow in reverse. By using a key to produce ciphertext from the data you wish to encrypt, you get the result very quickly, but figuring out the key, and thus breaking the encryption, would take an attacker a very long time. It is still possible (all encryption can be broken eventually), but we call it secure because of how long breaking it would take. That time depends on two factors: how much processing power the attacker has, and how long the key is. For example, an encryption system that used a 20-bit key would be horribly insecure, because breaking it would take a modern computer seconds. If the key length is 1024 bits, those seconds instead become several billion years.
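To make that exponential growth concrete, here is a small back-of-the-envelope sketch in Python. It estimates how long an exhaustive search of an n-bit keyspace would take at an assumed rate of one billion guesses per second; the rate is an arbitrary assumption, and real attacks on RSA keys work by factoring the modulus rather than guessing keys, so treat the numbers as illustrative only.

    # Rough illustration (not a benchmark): time to exhaust an n-bit keyspace
    # at an assumed one billion guesses per second. Real attacks on RSA factor
    # the modulus instead of guessing keys; the point is the exponential growth.
    GUESSES_PER_SECOND = 1_000_000_000   # assumed attacker speed
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365

    for bits in (20, 64, 512, 1024):
        years = 2 ** bits / GUESSES_PER_SECOND / SECONDS_PER_YEAR
        print(f"{bits:>4}-bit keyspace: about {years:.3g} years to exhaust")

Even with a generous guess rate, the 20-bit case finishes in well under a second, while the 1024-bit figure is astronomically larger, which is the intuition behind the "seconds versus billions of years" comparison above.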

A few years ago, most certificates were issued with 512-bit keys. With the computers of the time, brute forcing a private key was considered unfeasible because it would take a ridiculously long time. Most security experts now consider that length too short, given how quickly processing power has grown and with things like GPU arrays being used to crack passwords. As attack capabilities evolve, so must security, and any modern certificate is now issued with a minimum of 1024 bits. But many businesses and corporations create their own certificates for a variety of purposes, from signing emails to encrypting corporate websites, or even for their own internal login systems. Until now, Microsoft products such as Windows Server 2003 and 2008 allowed you to create certificates with shorter keys, but after this update those keys will no longer work.
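If you issue your own certificates, the practical fix is simply to generate new key pairs at a modern length. As a minimal sketch, and assuming the third-party Python "cryptography" package is available (this is not the Microsoft CA tooling the article refers to), generating and checking a 2048-bit RSA key looks like this:

    # Minimal sketch: generate a 2048-bit RSA private key and confirm its size.
    # Requires the third-party "cryptography" package (pip install cryptography).
    from cryptography.hazmat.primitives.asymmetric import rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    print(key.key_size)  # 2048, comfortably above the new 1024-bit minimum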

Discovering usage of keys less than 1024 bits

So who will be affected? If your systems are kept up to date and you've been following safe security practices, chances are this won't change a thing for you. But many businesses run a large array of systems and server software, some of it dating back several years, and may well have older certificates that no longer meet the new requirement. After the update, you may start seeing errors such as being unable to browse to a site that uses a certificate with a key shorter than 1024 bits, problems enrolling older certificates to new client machines, failures to sign email, and so on. Again, most businesses won't have any of these issues, but networks that rely on older systems will be affected. To find out whether that includes yours, the Microsoft TechNet post linked above has a series of commands you can run with the Certutil tool, which you may already use to manage certificates.
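If you would rather script your own inventory than rely solely on Certutil, a short scan of exported certificates can flag weak keys. The sketch below is an illustration only, assuming a local "certs" folder of PEM-encoded certificates (a hypothetical path) and the third-party Python "cryptography" package; it is not the procedure from the TechNet post.

    # Flag PEM certificates whose RSA public key is shorter than 1024 bits.
    # Assumes a local "certs" directory of .pem files (hypothetical path) and
    # the third-party "cryptography" package.
    from pathlib import Path
    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import rsa

    for pem in Path("certs").glob("*.pem"):
        cert = x509.load_pem_x509_certificate(pem.read_bytes())
        key = cert.public_key()
        if isinstance(key, rsa.RSAPublicKey) and key.key_size < 1024:
            print(f"WEAK ({key.key_size}-bit): {pem.name} - {cert.subject.rfc4514_string()}")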

To check an external website, a useful tool is the Qualys SSL Labs site, which tells you not only the key length used by the site but also other information about the certificate used to encrypt your connection. One bad outcome here would be your corporate website starting to show error messages to users, which you obviously want to avoid. A worse case is relying on older hardware and services: for example, some older DKIM deployments and KVM switches use 512-bit keys, and the same is true of some cheap routers. These may or may not produce errors after the Microsoft update, but it's still worth looking into them and replacing them with more secure alternatives.
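For a quick scripted check of a single site, you can also pull the server certificate yourself and report its key size. This is a minimal sketch using Python's standard ssl and socket modules plus the third-party "cryptography" package; "example.com" is a placeholder hostname, and a full scanner such as SSL Labs will of course tell you much more.

    # Connect over TLS and report the server certificate's public key size.
    # "example.com" is a placeholder; requires the "cryptography" package.
    import socket
    import ssl
    from cryptography import x509

    host = "example.com"
    ctx = ssl.create_default_context()
    with socket.create_connection((host, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    print(f"{host}: {cert.public_key().key_size}-bit key")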

So how much of a threat is this? In most cases, not a huge one. 512-bit keys have been broken, and while doing so is not trivial, it will only get easier as processing power keeps increasing. So how much more secure is a 1024-bit key? Remember that the relationship is exponential, so doubling the key length far more than doubles the security. Of course, the holy grail for attackers would be an algorithm that can factor these keys as quickly as they can be generated; such an algorithm may exist, and it would render modern public-key encryption useless, but most security experts agree that this is unlikely. So for now, as long as all your corporate systems are already on 1024-bit keys, you can rest assured that this August update won't break anything and your data remains secure.
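For what it's worth, the "far more than doubles" claim is easy to quantify: going from 512 to 1024 bits multiplies the raw keyspace by a factor of 2^512. The short Python check below verifies that arithmetic; it says nothing about the actual cost of factoring RSA keys, which is what governs their real-world strength.

    # Doubling the key length squares the keyspace: 2**1024 == (2**512) ** 2,
    # so the 1024-bit keyspace is 2**512 times larger (about 1.34e154).
    print(2 ** 1024 == (2 ** 512) ** 2)            # True
    print(f"factor: about {float(2 ** 512):.3g}")  # ~1.34e+154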

About

Patrick Lambert has been working in the tech industry for over 15 years, both as an online freelancer and in companies around Montreal, Canada. A fan of Star Wars, gaming, technology, and art, he writes for several sites including the art news commun...

22 comments
r.j.thomas

Okay, I think I see where you're coming from: that MS are acting as producer, seller and lawmaker, which is a fair point. No company should be taking matters into their own hands (especially "law making"), but the flip side to that is that law making takes a loooong time. A really long time. We could wait twenty years for a government to ban weak certificates, during which time thousands of people would have been defrauded by dodgy sites, or we could act now. And don't forget, fraud (the prevention of which I'm assuming is the main drive behind this) is also illegal, so you could argue MS are just complying with the law... possibly :). Although not generally with security issues, I have the same thing in my job. We pretty much go straight to the latest version of Windows Server when it's available as we have SA on our license. So most of our kit is now on 2008 R2. This has caused us significant problems as most of our estate is still on XP. Group Policy Preferences? Not without an update. So we can either keep using old technology (batch files) while we wait for the desktops to be running something newer (which still hasn't happened), or update them with the GPP client and start using the newer technology across the board. Which makes life a lot easier, and means that any Vista and 7 clients can use their native GPP client for all kinds of things. I totally understand your Post Office analogy, but I think this just shifts the unchosen responsibility back a level. Even if the Post Office is merely implementing law, I still didn't choose that law but have to abide by it. Also, the Post Office will have some responsibility for implementing new technology without waiting for law; the law will be vaguely worded ("explosives" or something). If someone develops a new type of explosive (e.g. a hypothetical Blu-tack bomb), the Post Office would be able to implement Blu-tack-finding technology without hanging about waiting for the government to specifically outlaw Blu-tack bombs, because this would already be covered under existing law as an explosive device.

redave

While I realize there is a difference in time and cost of installation, the cost of installing an SSL certificate should be the same no matter what the key length. Once you have SSL installed, how hard is it to upgrade? Is it time and effort, shoddy prior installation, lack of tools, or something else? From the user's viewpoint, they're now going to see 1024 or more bits as secure and everything less as not secure. This fits in with reality: with today's computers that have a GPU on them, yesterday's SECURE is GONE!!!

r.j.thomas

Nice to see people getting confused about the great Microsoft Censorship monster. I'm with the pro-security people here. Microsoft aren't imposing anything other than common-sense security procedures here, and to protect people, not their assets. Microsoft aren't asking us to have multi-point locks on our sheds; they're trying to protect us so that when we buy off Amazon, we know Amazon have multi-point locks on their website and not a Christmas-cracker padlock. It makes perfect sense: Deadly Ernest can carry on having no SSL (and nothing that needs it) on their website, but if you deem your website needs SSL then MS are just making sure it's strong enough. Example: I saw a review on Amazon UK the other week written by someone complaining that they had a nightmare time getting their self-signed OWA cert to work on a Lumia 900. Of course. Self-signing goes against all the principles of a secure web because there's no chain of trust. It wasn't the Lumia at fault but the certificate. People accept security impositions in every sphere of life, but as soon as evil MS try to do anything they're... well, evil-er. Think about it: what's your car's M.O.T. test for? Security (in a safety context). To protect you and everyone else on the road by making sure your car meets minimum safety standards. Why do you have a username and password to log onto the work's network? To protect you from other people masquerading as you and getting you into trouble. Why do UK banknotes have such complex security features? To protect you (and the UK economy) from using fraudulent money. I don't see this as any different: MS aren't forcing me to do anything, just protecting me from sites that might not be trustworthy (and I don't think they'll ever require SSL full stop, because most web content doesn't need it).

mark1408

I read the TechNet blog referred to in the post (and the 2nd part at http://blogs.technet.com/b/pki/archive/2012/07/13/blocking-rsa-keys-less-than-1024-bits-part-2.aspx) and if I understand correctly it will be possible to override the effect of the update (due for release on 14 August) by modifying the registry on each affected machine. I think I'm going to take the approach of wait and see what happens and troubleshoot afterwards if necessary. I'm not aware of any 512-bit keys on our system, and the proactive logging / analysis described in the TechNet blog looks pretty complex compared to the workaround.

Deadly Ernest

Yes, it's better security, but it should be up to the user to decide what level of security they want, not Microsoft. If people want lower security, they should be allowed to have it. The next step we can see Microsoft taking along this line is to have Internet Explorer NOT permit you to see a website that doesn't have a valid SSL certificate, and thus force everyone to get an SSL certificate - even when they don't need one. My website has no data that needs protection, so I have no security on it, and that's my choice and should always remain my choice.

JCitizen

that the SSL doesn't work on. I used to do business with them, but won't miss them for their lack of security! I'm tired of vendors getting cracked and compromising my secure credit card numbers. It doesn't cost me anything, but who wants to do business with a laggard like that?!

zloeber

Too bad they don't disable these ancient protocols while they are at it (each has known vulnerabilities/issues).

Kieron Seymour-Howell

Most users are clueless about security. Unsecured systems cause billions of dollars of lost time and problems each year. Allowing ignorant people to manage their own levels of security is like allowing everyone to design their own braking systems for their cars. As you know, most people cannot even change the oil. The same level of incompetence exists among people who have a computing device that is networked or on the Internet. It became blatantly obvious to me years ago that even most IT managers and business managers have no idea how to implement or use effective security. To assume that the average home user has this knowledge is hilarious indeed. Thanks for the laugh. :P

m9jlogistics

Seriously? First, you're arguing a make-believe point. This has nothing to do with web censorship and that's certainly not the "next step". It's been shown time and again how small keys and poorly implemented crypto offer no barrier to penetration so why defend a useless security measure? If you're going to have an SSL certificate then it should meet some minimum standard, no? We also don't live in a bubble. Compromised systems have real life impacts for user data and other networks.

JCitizen

by the cert companies to get site owners to pay more. ?:|

Deadly Ernest

every lock you have is a high-security one like a bi-lok set, even for your push-bike cable to stop it being stolen. The people who decide on the security for their website are those who design it. If they want to have different levels of security they should be able to do so. We've had 1024-bit keys available for some time and many are using them, and I say good luck to them. If that's what they want, well, let them. I also agree that the security experts should be saying you need 1024-bit keys for high-level data. What I object to is a software company, and I don't care if it's Microsoft, Apple, or anyone else, forcing it down people's throats. I will admit I'm very wary of anything Microsoft has to say is a good or better security measure, as their past performance in this field leaves a hell of a lot to be desired. New releases of Windows continue to be patched for security-related code problems in earlier versions, some of them going back to Win 95, yet they do nothing about fixing the underlying code that causes them. Then add in what they said and promoted as Palladium back in the late 1990s, plus what they've been doing since towards sneaking in the Palladium set-up so they can arrange total vendor lock-in, and you just have to have a huge block of salt on hand when MS speak of security, especially forced security like what they're doing with Win RT.

Deadly Ernest

level of security they want to use is up to the user, not Microsoft. This decision is akin to the Yale lock company saying you MUST have a steel core door with a deadlock or you can't have a lock on your front door.

Deadly Ernest

a new version of software to having the same hole being patched time and time again when the company releases a new version. That's the biggest difference between Apple and MS: MS patch over the hole while Apple fix the hole itself. However, that's a separate issue. Re the post office, it's obeying a law, and they're the ones carrying the mail, not the company selling you the box to put it in, which in that analogy is what Microsoft are doing. As to standards, I'll pay more attention when Microsoft starts to pay attention to the international and industry standards that have existed for years and that they ignore; some of those are what cause some of the security issues. Security standards for banks should be coming from either the government or the banking industry itself, not from Microsoft saying their software won't work with anything below a certain level any more. Until Microsoft fix up their own long-standing security issues they have no right to try and force any other security aspects onto people via changes in their software.

r.j.thomas

To quote a certain Mr Ian Thorpe, look: most of these changes aren't forcing anything down anyone's throat (with the possible exception of the mail certificate issue). MS are simply trying to protect people from dodgy SSL-encrypted sites that should know better (our site has a 2048-bit cert that's been in place for years). And as for the MS security bashing: every OS is written by humans and is therefore flawed. MS should actually be praised, as they were the first company to make it really easy to patch their flawed OS. For example, if you want to know just how buggy Mac OS X is, subscribe to this list: https://lists.apple.com/mailman/listinfo/security-announce and you'll soon realise that every OS is full of holes. "The Internet is just like a telephone or the mail system, it's a way to communicate". Yup, totally agree. But both these systems have security measures in place that we didn't choose; that's why you can't send a bomb through the post. I didn't ask the Post Office to impose that particular rule, but I still have to live with it and it makes a lot of sense. I just can't see where the problem lies; the house/shed analogy falls apart because these are single-person/family issues. If you leave all your doors open, that's no skin off my nose. But once a resource becomes shared, different security comes into play. Why else are banks hard to get into? You might be a millionaire who could afford to lose half your cash, but I'm not and therefore can't, so the bank has to be protected well enough to serve everyone (quite besides the economic chaos that would ensue if banks started leaking money that visibly). I really do think this is just another extension of the "reasonable security" argument; no one is being forced to run their websites over SSL, but those who do need to comply with certain standards. This "imposition" of security standards happens over and over again in our daily lives, except most people don't see it or think about it long enough.

Deadly Ernest

administer servers included training on assessing security. I also agree that a lot of corporate data needs high security, but that should still be the decision of the people in charge of the data, not the sales staff and techs over at Microsoft. For the vast majority of users, high security is not needed, and the corporate ones who need it do use it, but now we have Microsoft saying everyone gets it whether they want it or not. Well, what we'll see is more wide-open systems, because people who are currently using low-end security will not have the skill or interest to go with the high-end stuff. Whether people have the skills to accurately assess security issues or not, it should always remain their choice as to what level they want, except where decreed by law; it should NOT be a software company forcing a high level of security on them. Kieron, I'm not saying there isn't a place for high-end security; what I am saying is it should be my choice, not someone else's, if there isn't a law about it.

Kieron Seymour-Howell

Security should be transparent: either on or off. I agree that not everything requires security, but the problem remains that most people do not even understand the concept of how to make an informed decision. Case in point: say you have a server and people can access it; it needs to be secured to the maximum available to stop people from picking away at it and possibly gaining a semi-secured higher level, perhaps where poorly chosen data was stored, perhaps the keys for the next higher level. I am sure you see where I am going. Recently there was a news article about a person who thought that compressing files was the same as encrypting them, and of course the flash key with those files on it was lost, and compromised. When you are dealing with unknowns, like a person's level of competence, always assume the worst, unless you like being blamed for a fool and paying a heavy cost later on. I believe that all security should be the maximum available, given the complexity and interconnectedness possible in any modern digital device. Sure, if the computer world was like sheds and houses that would be fine, but online the back of someone's shed may be the side of someone else's vault. Because of coding errors or premeditated schemes, everything should be treated as the weakest link. If the systems were not connected, like a shed in a field with nothing else in sight, then varying security levels would work fine. But such is not the case with interconnected digital systems. There is always going to be a fool out there who will store munitions in his "shed" and cause a problem.

Deadly Ernest

on guard. Sure, if I have extremely important data I should protect it in an appropriate way. But that doesn't mean I need the same level for stuff I just don't want left lying around but that isn't important. The level should be my choice. As to the average user, make it too hard and there will be NO security at all, as they won't go through the rigmarole to maintain it - that's been proven. Also, the great majority of people don't need it. The Internet is just like a telephone or the mail system, it's a way to communicate. The fact that companies are trying to save money by connecting their businesses and business databases to the Internet is irrelevant to the general usage of the Internet to let people find publicly available information and communicate. One place I worked was very concerned about security, so concerned that the corporate internal network was NOT physically connected to the Internet at all. Most communications were internal, and the external stuff was handled by sending the emails via a special mail server that was disconnected from the internal system and then connected to the Internet at regular intervals to handle external e-mail processing. If anything was more urgent than that, people just had to call. The fact people are using the Internet in ways it was NOT intended or designed for does NOT give any software company the right to force their ideas down people's throats. As I've said elsewhere, I've no problem with using appropriate security for the situation; what I do have a problem with is someone forcing it down my throat without the authority to do so. Some organisations have high security because there are government laws on handling that data, but this is nothing to do with that. When I set up security for my system or my website, I decide what level of security I want; it should not be taken out of my hands by a software company.

Kieron Seymour-Howell

The problem here is that most people have no idea of, or inclination to learn, how to differentiate between the shed and the house. Also, the systems that we all use have shared doors and gates into those places. People toss everything into a huge jumble and do not even know how to create and organize basic folders. Most people allow thousands of files, multiple copies, and outdated files to accumulate everywhere. Unfortunately, in the computer world, and to continue with your analogy, having bad security on one house will create a threat to a neighbour's house, shed, or yard. In the computer world, a terrorist or hacker will enter your shed, climb up on the roof and start firing grappling hooks over the fence into the neighbour's yard, or worse. I learned a long time ago that if you assume that people know what they are doing, it creates a lot of problems. I STILL maintain that only people who have a valid and verified level of competence should be allowed to connect a device such as a computer to the Internet at all. For good reason, we do not allow preschoolers to drive front-end loaders and cars along public roads, but that is precisely what IS allowed in the virtual world.

Deadly Ernest

force a higher level of security down people's throats even when they don't want or need it. There are some cases where high security is needed, and there are some cases where high security is a total waste of time and resources. Maybe this example will help you understand the point. I have a house and a garden shed. The exterior doors on the house are very strong and have very solid deadlocks on them because I've got valuable gear inside, in monetary value (TV, fridge, etc.), sentimental value (photos, etc.), and IP value (stories in the middle of being written). The shed has a few old tools and other things in it, so it has a basic door with a cheap latch lock. Even if everything is stolen, the total value is below the insurance company's minimum claim threshold. I, as the user, decide what level of security I want on each. What MS is doing is equivalent to forcing me to put a solid-core door and deadlock on the garden shed, because they now say it's either top-end security or no security, no middle ground. I believe the level of security to apply to my system and data is my choice, not something for a software company to decide and force me to use. It's that simple - user choice or corporate dictatorship.

m9jlogistics

No, I don't think that analogy is accurate. There are standards everywhere and best practices to follow - why is this different to you? Why is it wrong to require that keys meet a minimum security level? There is absolutely no reason why anyone should be using a key of less than 1024 bits these days anyway, so really MS is closing a loophole that is otherwise a security concern. Sorry if it interferes with your right to have a crappy implementation of security.
