
Linux hit with Phalanx 2: Is there a Linux double-standard when it comes to security?

Is there a double standard applied to Linux security vs. Windows security? Phalanx 2, a rootkit that attackers are installing after getting in through compromised cryptographic keys on Debian-based systems, is on the loose and has unleashed some interesting responses from Linux advocates.

Back in May, TechRepublic bloggers Chad Perrin (Security) and Vincent Danen (Linux) covered the Debian OpenSSL flaw quite thoroughly. Vincent showed you how to find and fix your weak keys, and Chad provided some additional methods for patching up any problems with cryptographic keys. Well, apparently not everyone heeded the security warning and applied the patches, because Phalanx 2 is on the loose.
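To make that advice concrete, here is a minimal sketch (mine, not the procedure from either post) of how an admin might flag blacklisted keys in an authorized_keys file. The blacklist filename is an assumption: it stands in for a locally downloaded list of known-weak MD5 fingerprints. In practice, the ssh-vulnkey tool shipped with Debian/Ubuntu's openssh-blacklist package, or Debian's dowkd.pl script, does this job properly.

```python
import base64
import hashlib
from pathlib import Path

AUTHORIZED_KEYS = Path.home() / ".ssh" / "authorized_keys"
# Hypothetical local file: one colon-separated MD5 fingerprint of a known-weak key per line.
BLACKLIST = Path("weak-debian-fingerprints.txt")
KEY_TYPES = ("ssh-rsa", "ssh-dss")

def md5_fingerprint(b64_blob: str) -> str:
    """Classic OpenSSH fingerprint: MD5 of the base64-decoded key blob, as colon-separated hex."""
    digest = hashlib.md5(base64.b64decode(b64_blob)).hexdigest()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

def main() -> None:
    weak = {line.strip() for line in BLACKLIST.read_text().splitlines() if line.strip()}
    for line in AUTHORIZED_KEYS.read_text().splitlines():
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue
        # Skip any leading options field(s); the base64 blob follows the key type.
        while parts and parts[0] not in KEY_TYPES:
            parts = parts[1:]
        if len(parts) < 2:
            continue
        fingerprint = md5_fingerprint(parts[1])
        verdict = "KNOWN WEAK -- regenerate and replace" if fingerprint in weak else "not on the list"
        print(f"{parts[0]} {fingerprint} {verdict}")

if __name__ == "__main__":
    main()
```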

According to the US Computer Emergency Readiness Team (US-CERT), attackers are using compromised SSH keys to gain access and then a local kernel exploit to escalate to root.

...once attackers have control of the system, they install a Linux kernel rootkit called 'phalanx2'. This steals more SSH keys, which are then sent to the intruders. (ZDNet UK)

ComputerWorld's Steven J. Vaughan-Nichols is hopping mad in his post, "Linux security idiots," in which he rails against Linux system administrators for being slackers and suggests a career in the fast-food industry.

...for attacks like phalanx2, where simply being aware of recent major security problems and updating systems would have stopped the assault in its tracks, there is no excuse.

Over at CNET's Open Road blog, Matt Asay says it all in his headline, "Linux servers under the Phalanx gun: A problem with people, not code." Now, what is interesting in both Vaughan-Nichols' and Asay's posts is that the blame is placed squarely on those dumb old admins who didn't patch correctly (not on the flawed code), whereas -- as at least one commenter pointed out in Asay's blog -- when a security flaw wreaks havoc on Windows-based systems, it's all Microsoft's fault, due to the OS's inherent weakness rather than careless administrators. Is there a double standard at work here?


37 comments
pgit

Chad Perrin pointed out in several comments (I can't locate them atm) that Windows has vulnerabilities that are well known and that Microsoft refuses to fix. BIG difference. Yeah, Windows deserves the rap it gets. Lazy admins and coders who don't think (or ask) deserve the rap they get. And whenever there's a flaw in the code, Linux deserves the rap. But being open source, it's very unusual for a Linux exploit to be discovered first by a true black hat. In contrast, Windows users are at the corporate whim. You cannot know, and cannot assume either way, whether there are exploits lurking in your system this moment that MS has known about for a year but elected to do nothing about. "We'll leave that to the antivirus people..."

eliwap

The difference between Linux and Microsoft is that bugs get patched quickly in Linux, sent out the door, and packaged by the relevant distributions. It's then up to corporate policy and administrators to implement what becomes available. So with Linux it usually is the fault of a lazy administrator, not the code. Whereas with Microsoft it frequently (not always) takes months, if not a year or more, before a fix gets out. By that time a minor bug turns into a major nightmare and everyone has forgotten about it. So... with Microsoft it is the code and the administrator.

TripleII-21189418044173169409978279405827

You can read about many exploits being used against holes in other OS software that have long been fully patched, with bulletins published, etc., and there is always a small group of IT folks who just don't deploy the fix. The Debian SSH keys problem was severe, but anyone in IT who hasn't heard of it or taken the corrective actions should probably be fired. Literally, it is like an admin emailing their root passwords around the web (regardless of OS) and then being surprised when someone accesses their system. TripleII

Selena Frye

Do you think Microsoft gets blamed differently when viruses and malware attack, but Linux gets a pass? Is bad security more the fault of careless admins or flawed systems? What do you think of the perspectives of the bloggers cited in the post? I'm just full of questions!

jwise

Since my OS of choice is z/OS (IBM mainframe), perhaps my view of Linux versus Windows is more objective. I use Windows because my job requires it. I experiment with Linux and would like to do more. My impression is that Windows definitely has more security problems, but I perceive that as a result of its market share. My theory is that Linux exploits (and Mac too) will increase if their market share increases. I also believe that open source systems ultimately are less secure than closed source: the bad guys can read the code and develop their malware. Obviously, there are a lot of talented malware developers.

Andy J. Moon

When the Blaster worm spread like wildfire, I was in Las Vegas. I saw the story on the cover of USA Today when I was walking through the Paris hotel and, being the system administrator for a nearly pure Microsoft environment, was worried. I called a coworker immediately and asked if any of my servers had been affected, but they were running fine. Blaster exploited a vulnerability that had been patched months earlier by Microsoft and, since I assiduously applied patches to my systems, my machines were protected. There are comparatively few "zero-day" exploits that hit the wild before the vulnerabilities are patched, so no matter the OS you are dealing with, a big part of a system administrator's job is keeping up with vulnerability reports and patching those vulnerabilities. Operating systems are going to have holes and it doesn't matter who wrote the OS. The blame for systems that are compromised by this bug belongs squarely on the admins in charge of them.

Tearat

Yes, if you can fix the problem yourself without creating additional problems.
Yes, if MS, who own the rights to the software, will let you.
Yes, if the type and number of problems are close to being the same as on any other OS.
No, if you cannot fix the problem.
No, if you can fix the problem but are not allowed to because of legal mumbo jumbo.
No, if the types of problems are worse and more frequent on an MS OS.
Yes, yes, and yes if they try to cover it up or ignore it.

mar_hunzar

Microsoft or Linux or OSX, it matters not what operating system it is when there is a flaw in the code. However, it does seem that people like to blame Microsoft when a virus / trojan / flaw is discovered, but when it's OSX or Linux then it must be something else. Come on? Really?

What some people fail to realise is that Microsoft currently dominates the desktop market; Linux and OSX do not. Linux is gaining ground in the server market, but does not yet have as big a population of servers as Windows. Taking this into account, the installed base of operating systems, servers or desktops, is heavily biased towards Windows, therefore there will be more of a tendency to try and attack Windows because: a) Windows is always seen as insecure, and b) Windows is the dominant platform.

Now, if you were to develop code to gain financial information in an age where AV patches are released daily, in order to gain maximum return in minimum time because you know it will get patched, what would YOUR choice be... Windows (lots of targets) or Linux (substantial but not overwhelming) or OSX? Hmmm, I think that would be Windows then.

A bad admin is a bad admin, bad practice and procedure is bad practice and procedure. Bad code is bad code. Come on, guys, just because it's Linux does not make it all an admin fault... does it? And when the day comes when Linux rules the desktop market and there is a PC in every home and every business that runs it, who will you blame then?

Merlin the Wiz

There is no double standard. There is, and always has been, a tendency to lump all of the many different versions of the 'nix operating systems under the broad banner of "Linux". This is mostly due to the perception of the "Windows" community and partially the fault of the 'nix community. The "Windows" community has never considered the different versions of 'nix to actually be different operating systems. This, IMHO, is the same as the 'nix community calling all of the different versions of Windows "Windows", when there are many differences between "versions" or "releases" -- oops ... I meant "kernels". If there is a double standard, IMHO this is where it exists: all 'nix is "LINUX" and all Windows is "Windows". I cannot wait until IPv6 gets fully implemented and nearly every device that has a microprocessor is attached to the internet. Then everything in our daily lives (all of our appliances and vehicles, etc.) will be subjected to daily onslaughts of this or that piece of malware. Then MAYBE we won't worry about which operating system is best; we will just worry about keeping what we have operational 24/7/366.

Neon Samurai

It's not an issue in the kernel. It's not an issue in Red Hat, Suse, Mandriva, PCLinuxOS... It's a Debian issue. It's a flaw that affects that distribution and the forks that draw there base package source for OpenSSL from Debian (Ubuntu and such). A Debian developer did not follow the Debian policy and tried to fix what he thought was a cryptography flaw. If he'd actually talked to the OpenSSL developers (ie. cryptography experts), they would have told him that what appeared to be a flaw was intentional and provided a specific increase in randomness. Debian is at fault and that's fine; let's hold them at fault just like we hold Microsoft at fault when its distribution (OS) presents flaws from its own poor coding.

The other contrast was that the flaw was fixed when it was realized. It's probably one of the longest outstanding flaws, but it was fixed quickly when discovered. Microsoft still denies flaws in there software design that continue to allow third party software to be exploited. I don't see that as a double standard; let fault fall where it is deserved.

The same goes for Red Hat. They screwed up with server administration. That's not a flaw in the kernel "Linux" or any other distribution that happens to use that kernel and the GNU stack on top of it. That's human error, a screw up. Don't dismiss Red Hat of there responsibility in leaving there servers open to penetration. They are a huge company for whom third party pentesting is a minimal expense; if you're an SLA subscriber, hold Red Hat's feet to the fire. If you're not Red Hat but manage repository servers: pay attention, take note and check your servers.

"Linux is broken but gets a double standard." That's a political party line only of use to the detractors threatened by anything that is not there own pet OS. These are the same people that think any distribution that happens to use Linux and other GNU commodity lego pieces should be referred to as one big "Linux" so they can cause confusion and attribute a flaw in one separate OS/userspace to every similar OS/userspace combination.

The same goes for Windows and any other platform. If a flaw in Windows is caused by a third party driver developer, hang that developer out to dry until they fix there code. If the flaw's root cause is the Windows code, then hold Microsoft accountable. No difference; blame where it belongs.

I think Apple should have every flaw publicized in equal proportion to there "we're bulletproof" marketing party line for the very same reason: blame where it is due. Apple says that osX is impenetrable, denies any problems in the network stack, then quietly releases patches to the network stack over six months later hoping no one will notice. Not acceptable. Apple's bulletproof security that does not allow specific programs to run does so by looking at the name the program tells it, and can be easily defeated by editing the identifier text file within the application directory tree; that should be publicized to the point where Apple fixes there "security theater" and replaces it with a real security mechanism (if that is the intent anyhow).

Hold the responsible party accountable.
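For a sense of scale on that Debian flaw, here is a back-of-the-envelope sketch (the per-architecture key-variant count below is only illustrative): with the process ID as the sole remaining input to the PRNG, the whole keyspace is small enough to pre-generate, which is exactly why published blacklists of weak keys were feasible in the first place.

```python
# Rough arithmetic behind the weak-key blacklists: with the Debian patch applied,
# the only entropy left feeding the OpenSSL PRNG at key-generation time was the
# process ID, which on a default Linux box is at most 32767.
POSSIBLE_SEEDS = 32767   # PIDs 1..32767
KEY_VARIANTS = 3         # e.g. a few common key types/sizes -- illustrative figure only
print(POSSIBLE_SEEDS * KEY_VARIANTS)  # ~98k candidate keys: trivial to enumerate and fingerprint
```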

ChewyBass

Some of the comments are on par with what I would expect from the Linux community. I've heard for so many years about how secure Linux is and that you don't even need a firewall or antivirus when browsing the internet. Yet there is no problem with the OS, it is just the stupid/lazy admins.

CharlieSpencer

Having said this, the blame lies with developers of malware.

TripleII-21189418044173169409978279405827

"Do you think Microsoft gets blamed differently when viruses and malware attack, but Linux gets a pass? Is bad security more the fault of careless admins or flawed systems?" The problem is 100% bad admins. With corrected keys, SSH is secure; it doesn't have flaws. I am reminded of a Seinfeld episode. He goes on to describe how he has the best lock he could buy, but it does have one fatal flaw: "the door must be closed for it to work." Many hacked Windows sites: bad admins not deploying fixes and patches. Same goes for Linux: bad admins not deploying fixes and patches, in this case. What is really telling, though, is the number of hacked systems that are up to date and fully patched. That is the best metric for judging the relative security of systems. TripleII

Neon Samurai

The obscurity of the source code for Windows does not make it any more secure. Research into Windows platform security, for good or evil ends, doesn't rely on source code at all, yet flaws are constantly found and rediscovered. We're talking about people that work with binary auditing software, dump files and raw assembly code; source doesn't matter.

If you look at cryptography, that area of research remains open specifically because the crypto can be reviewed by peers and improved upon. If I make a crypto program and release only the binary code saying "it's great, it's unbreakable", nobody cares and nobody should. It's my word only and maybe I missed something. If I release the source, I suddenly have crypto experts saying "there's a flaw here and you can fix it like this" or confirming that it is in fact strong encryption. Likewise, in the FOSS world there is peer review by anyone who cares to get involved. I'm not a developer so I'm limited to bug reporting mostly. Others who know code can take part in patching and improving the code. Peer review works in an actively developed chunk of source. If it's not actively developed, Darwinism works; there is probably something better suited to the need.

Back to the security bit. The security of a thing is not in its obscurity. Any perceived increase in security by obscuring the mechanism is "security theater" no better than the TSA camped out in every airport. It's security to feel good, not security to be safe. For real security, you should be able to leave the mechanism wide open to inspection. Here's how it works, here's the source... off you go. That mechanism should stand up to rigorous scrutiny while never allowing a user past without valid authentication. This means publish the source code for OpenSSH, not publish your personal certificates used to authenticate you against OpenSSH. I suspect your mainframe is using Mandatory Access Controls. The mechanism is well known and wide open to scrutiny, but unless the user account has the appropriate access level, they're not getting into it. If they can get around the MAC mechanism, that's a serious flaw in the mechanism and it needs to be addressed promptly; that's where availability of source expedites the process, rather than relying on a budget-restricted single developer to eventually get to it, and only if it is going to affect the profit margin less than leaving it wide open.

In terms of the recent Debian OpenSSL screwup, a developer with no cryptography expertise thought they found a bug and uploaded there "fixed" version to the repositories. That's a failure in process, as they ignored the normal process for vetting Debian patches and uploads along with ignoring the good sense to contact the OpenSSL developers directly with the perceived flaw and patch. The crypto experts would have told him it wasn't a flaw and was supposed to be there, increasing randomization. Humans will always be the weak link though.

vmaatta

I do agree that market share certainly affects the amount of viruses etc. being made for an OS. But software being open source certainly doesn't make it less secure. Quite the opposite. Most (if not all) vulnerabilities are found through reverse engineering, and finding them has nothing to do with actually seeing the source code. And closed source project code can only be fixed by the limited number of developers who actually have access to the source. Add to that some political idiocy at the project management level where code is not patched even though it's known to be buggy. How about open source? As I said, vulnerabilities are found whether the source is open or not. But in open source there is the possibility that code is viewed and tested by many more eyeballs than closed source. Also, anyone can take an active part in the project, point out vulnerabilities and actually supply a patch. And before anyone says this: not any and all code is accepted into an OSS project. There's peer review and the usual "management" processes in place in OSS projects.

cearrach

Most open source developers are palpably aware that the source code is available, and so take extra care to make it secure. Closed source developers can and do sometimes assume that since their code is hidden they can be more relaxed with respect to security. I'll take open source any day.

Tearat

"Now, if you were to develop code to gain financial information in an age where AV patches are released daily, in order to gain maximum return in minimum time because you know it will get patched, what would YOUR choice be... Windows (lots of targets) or Linux (substantial but not overwhelming) or OSX? Hmmm, I think that would be Windows then."

You forget there is more than one criminal. Most criminals will take whatever they can get. Windows may be the biggest target, but it is not the only target. The fact there are attacks on OSX and Linux based systems shows this. One percent worldwide is still a lot.

This is a mistake people make when looking at security. You are not defending from one person or one type of attack. You are defending from all persons, who will attack with everything they can use. They will attack every target they can find.

Neon Samurai

Your comment seems to break down to "Windows is a target because it is popular." If we look at success rates, why is it that Microsoft has the higher rate of successful exploits? Also, how is it that the popularity of Unix-like platforms in the server market does not equate to more successful attempts? How does Microsoft's monopoly share of the desktop market equate to the other platforms being of lesser quality? osX is actually an embedded OS sold on Apple hardware, so it's really outside the scope of comparison.

b) Windows is the dominant platform - not relevant; it measures marketing quality and domination of supply chains, not product quality. a) Windows is always seen as insecure - only because MS continues to leave poor design decisions in place, blaming repeatedly exploited flaws on third party software.

So, are criminals only trying to exploit desktops then? We're not seeing this overwhelming flood of success against servers. If the flaw that is exploited against a BSD, Linux or any other Unix-like platform is in the software, then that distribution (if paid) or project maintainer (if community) should be made aware so it can be fixed. The flaw in Debian's OpenSSL was all on Debian's shoulders. A developer chose to fix what looked like a flaw rather than consult the upstream crypto specialists who knew it was supposed to be there. That's a software issue that affects Debian and the forks below it. The flaw in Red Hat's servers was due to configuration; human error. That one is all on Red Hat and the server administrators.

Unless you have figures that say otherwise, the usual flaw that Unix-like platforms are exploited through is configuration errors. I'll take weakness to configuration errors over weakness to bad code and long waits for patches any day; I can fix the first issue myself rather than wait for the developer to fix the second. "Come on, guys, just because it's" the popular pick for selling to consumer desktops doesn't make it a well developed product; it only makes it well marketed.

jim.wyse

For heavens' sake, learn the difference between "there" and "their"! It would make your text somewhat more credible!

jlwallen

windows is hit with this kind of thing all the time. so windows admins are USED to patching and repairing. Linux admins aren't so much used to this so they can get lazy. it's about reliability. and when something is too reliable, you take it for granted.

Neon Samurai

Viruses are made for other platforms also, but I've found that success rate makes a big difference. When exploit code works only for a short amount of time before it's obsolete, you don't get the same virus over and over. It's like keeping your volcanic vent bacteria in a cold, oxygen-deprived fridge; they can exist for a short while, but it's a pretty hostile environment for them with little chance to survive, let alone procreate. By contrast, when exploit code works well on a given platform and can be easily modified to get around AV signatures and third party software fixes, re-exploiting the same underlying flaw, they tend to get more variations and spread much further. Again, we take those little volcanic vent bacteria and store them in a nice warm, oxygen-rich oven and they prosper, procreate and overrun the place. Mutations easily appear between generations as the weaker strains die off and better adapted strains flourish.

I sincerely wish it was just about market share measured through retail channels; it would make so many things easier. Heck, I'd simply love to see accurate figures from MS private data even if it meant promising never to tell anyone. I'd also love to see a true measurement of the other platforms in use, but that's equally mythical. What-a-you-gonna-do. As long as the platforms I have to care for continue to be actively developed, I'm happy. (That includes the steady flow of updates from Microsoft too.)

vmaatta

I agree. The last paragraph especially points out an important thing for everyone to think about. Way too often the talk about market share concentrates on the desktop. Outside that area, Windows is not that big of a player. And it's true that there's a lot more gain in cracking a Linux server than some home computer running Windows. And still, for some reason, most viruses and malware are made for Windows.

Neon Samurai

I put more importance on success rates rather than attempts if I'm measuring the potential level of security in a platform. In the desktop market the majority use Windows for various reasons. There is also a lower level of knowledge among administrators, since everyone with a Windows box is an admin whether they care to understand how the toaster works or not. In the server market, the majority market share is not Windows. Every home user does not run a server either, so on average the knowledge of the admins is higher. If it is market share, then where is the flood of successful exploits against the majority share? Sure, there is some money to be made in breaking into home users' machines, but this is chump change. The real money is on the servers (much larger credit card lists, for example).

Market share also does not give any indication of how a platform responds to vulnerabilities and exploits. You have to look at times between discovery and correction, along with being aware of whether the correction is for a harmless flaw or an exploitable vulnerability. You also have to look at success rates. Every machine attached to the internet is taking hits constantly; home, server, business workstation... it doesn't matter. Looking at where those hits are successful gives a far more accurate picture of platform security. I also have little doubt that if other platforms become less obscure in the market share graphs and start being perceived as taking more hits because of it, the developers of that software will continue to respond by seeing exploits as "proof of concept" for a flaw that needs fixing or a process that needs to be reworked.

Simply put, the motivations are different. For-profit closed source software has to be good enough to make a sale without pushing development costs beyond desired profit margins. FOSS software has to be good enough to perform its function and stand up to peer review scrutiny without making the developer look like a doofus. If one writes ugly and insecure open source code, they can expect there reputation and respect to go down. If one writes insecure and ugly closed source code, it's OK provided the green pieces of paper flow in. Which is really best for the end user?

vmaatta

"Closed source developers can and do sometimes assume that since their code is hidden they can be more relaxed with respect to security." I've heard someone actually wonder how they're code could be unsecure since it's closed source.. go figure.

kingttx

I hit submit before I finished picking nits. s/heavens'/heaven's - Apostrophe placement or plural/singular agreement. If you meant to say "heavens'", then you should also use "sakes". Enjoy the abuse! :)

kingttx

Since we are picking nits: s/over defensive/overly defensive or s/over defensive/over-defensive - Adverb describes verbs, adjectives, or other adverbs

Neon Samurai

though you completely missed the point of the post being "place blame where it's due". In other words, understand the issue and hold the applicable party responsible rather than simply slamming any platform blindly. I'm done hearing from you about my spelling. If you have issues with specific points in my original post however, I'd be very interested to understand which points you feel differently about and why.

jim.wyse

Quote "a grammar snob who chose to focus my spelling rather than the information content." Unquote. Hardly, it was your bad spelling that diverted the focus away from your over defensive rant about one OS being better than another! Get over it yourself!

Neon Samurai

This is my recreational posting on an informal forum focusing on technology. I post from various keyboards/button pads and various degrees of support for grammar and spell checking. If it was a business document or grammar focused discussion forum then I'd have more respect for a grammar snob who chose to focus my spelling rather than the information content. Casual writing on a technology focused forum under limited time constraints versus formal writing in a business setting or a linguistics focused discussion forum; "learn the difference." You obviously understood the content of the post even with the egregious and invalidating spelling error. Did you have any particular response, addition or issue with the technologically based points I presented?

Tearat

The number of staff employed by the companies. They tend to lower the number of people they employ when it looks like they have spare time. So the ones who have the most problems usually have the most staff. You need to think of that before commenting about who is busy or not busy. You are not the only one to do it in this discussion.

seanferd

Slightly modified for flow: "Troll is the analysis of you." I removed the word "correct". Thinking about it now, I believe it struck a chord with a song title "Theater is the Life of You" (Artist: Minutemen, Album: Double Nickels on the Dime). Anyway, I feel it will be useful in the future. :D

Neon Samurai

His other comments didn't demonstrate any knowledge worth responding to, but I figured I'd give him the benefit of the doubt, as others have come in like trolls but been very intelligent when engaged. Glad you found something of interest in there also.

seanferd

Thought to reply the other day, figured I'd wait instead. Got an excellent quote out of it as well. :D

Neon Samurai

I'm not sure why you seem to feel so threatened by my comment. I simply pointed out that a lazy admin on any platform is an issue. To go over your accusations though:

A. Please provide more details as to why "only Linux admins are busy" and how you are able to sit back all day not doing your job monitoring the rest of the platforms. I'd like to hear more about these busy admins, what you think they are doing, and how that provides evidence that the platform is not well suited to servers. If this point is left as is, it can only be assumed a pathetic attempt at trolling. I see the rest of your comments are equally hostile, unsupported and flat out wrong, so I guess Troll is the correct analysis of you.

B. Please elaborate on this point also. What "suppositions" do you find fault with? Where do you feel I need more technical support for my points? What specific points do you disagree with, and based on what technological experiences?

C. If a server stores its password file where it is easily accessed and unencrypted but the admin uses a strong password, that's a problem. If a server stores its password file where it is hard to access and decrypt but the admin chooses weak passwords, that's a lesser problem. In the first case, the issue is software design, and no amount of administrator good habits is going to compensate for that poor design, since the password file can simply be grabbed and read no matter how strong the passwords are. In the second case, the issue is administrator habits. The security of the system can be improved instantly by having that administrator start using strong passwords. The flaw is not in the software design; it's in the way the human chooses to interact with the system (weak passwords). Now, if you're not able to change the software design, then a flaw in that design is a very big problem, and no amount of good habit is going to fully compensate for that.

Why is this such a threat to you though? Why is it such a problem to say that poor admin habits on any platform can put the platform at risk?
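To put point C in concrete terms, here is a toy sketch (Python, with a made-up four-word dictionary, so only an illustration): even when the storage design is sound (salted, iterated hashing), a weak password choice still falls to a trivial wordlist attack, while a strong one survives the same attack. The reverse case, a plaintext and world-readable password file, needs no attack at all.

```python
import hashlib
import os

def store(password: str) -> tuple[bytes, bytes]:
    """Sound storage design: per-user random salt plus a slow, iterated hash."""
    salt = os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def check(candidate: str, salt: bytes, digest: bytes) -> bool:
    return hashlib.pbkdf2_hmac("sha256", candidate.encode(), salt, 100_000) == digest

wordlist = ["password", "123456", "letmein", "qwerty"]  # tiny, made-up dictionary

salt, digest = store("letmein")  # weak human choice
print(any(check(word, salt, digest) for word in wordlist))  # True: cracked despite good design

salt, digest = store("long-and-genuinely-random-passphrase")  # strong choice
print(any(check(word, salt, digest) for word in wordlist))  # False: the same attack fails
```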

johnmckay

Sit back and think about that response... a) Only Linux admins are busy! The rest of us are sitting back doing nought all day. If it's so great, I wonder what they're busy doing??? b) Lots of supposition on hit rates and success rates, but zilch to back any of it up. But maybe you're too busy to get hard facts just now. c) A lazy (oops, controllable) risk is less of a risk than another??? How d'you work that out when this thread clearly shows the opposite is occurring in this example? There is no excuse for sloppy procedures or lack of housekeeping, anywhere. We expect flaws to be found and we expect remedial measures to be invoked ASAP, regardless of the platform. If admins can't do their job then give the task to someone who can/will. It's an easy fix (and motivates me).

Neon Samurai

Heck, admins overrun with tasks who just can't dedicate the time to one thing are not rare. I actually thought you were going to mean frequency in terms of the hits. Windows is hit all the time and the hits are successful most of the time. Linux/BSD are hit all the time (server market) and the hits do not have nearly as high a success rate. A patched Windows system can still have exploits beyond the control of the admin. A *nix system's successful exploits tend to be configuration errors, so that is a controllable risk. In the case of Red Hat, well... they have to do some house cleaning and get there stuff in order. It should be a reminder to the rest of us admins that it's about time we reviewed our own configurations.