Linux

Choosing the right Linux distribution is key to success

Jack Wallen was reminded of a lesson he learned near the beginning of his Linux career: choosing the right distribution for the job is the key to success. Read on to see what prompted this reminder of a rule that should be fundamental for every Linux administrator.

This past week, I had a frustrating reminder of a lesson I learned long ago...but let slip away out of either arrogance or stubbornness. That lesson is that choosing the right distribution is the first (and most important) key to success when deploying the Linux operating system.

Yes, we all have our favorite distribution, but sometimes that favorite distribution simply will not do the trick and a different flavor must be used.

Let me explain.

On my editorial calendar (for a different publication) I had scheduled an in-depth article on installing the Packetfence Network Access Control system. I had written about it long ago (for yet another publication) and remembered how powerful the software was and how perfect it was for business and enterprise use. The thing I remember most about Packetfence was how incredibly challenging the installation was. I assumed it had matured in that area. It hadn't -- at least not on any distribution other than RHEL or CentOS.

So I plowed on with the intent of getting Packetfence installed on Ubuntu Linux, ignoring all of the warning signs. After a week of working on and off, I managed to get the system up and running...minus the web interface. The biggest issue was Packetfence's handling of non-standard web server naming (apache2 vs. httpd). It seemed impossible to get Packetfence to start the apache2 daemon. No matter what I did, I was greeted with that ever-mocking "The webpage is not available" error. I could work with Packetfence from the command line, which I didn't mind doing...but I knew the vast majority of readers would settle for nothing less than a GUI.

And so I switched gears and focused on installing Packetfence on CentOS. It took roughly an hour (not including the installation of the OS), but it was up and running perfectly -- even the web interface.

Lesson learned. Fundamental rule back in place. But what exactly does this mean? It's quite simple (and obvious): The right tool for the job will save you a lot of work and headaches. This applies to the software you install as well as the distribution of Linux you install it on.

Let's face it, some distributions are better suited for certain jobs. As much as I like Ubuntu on the desktop, it's just not the best server OS. Why? Well, for one, it doesn't follow standards like a server OS should. The example above illustrates this well. Ubuntu opts for using apache2 instead of httpd. Since httpd is the standard web server daemon, certain tools might have trouble working with a non-standard server. If you want a server OS, stick with something like CentOS. Not only does CentOS stick to standards better than Ubuntu, it's more in line with the higher security needs a server OS calls for.
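The naming split described here shows up directly in the service commands of that era. A rough sketch of the difference (commands and paths as commonly found on those systems, not verified against any specific release):

```shell
# Debian/Ubuntu package and manage the Apache daemon as "apache2"
sudo service apache2 restart       # or: sudo /etc/init.d/apache2 restart
apache2ctl configtest              # config lives under /etc/apache2/
sudo update-rc.d apache2 defaults  # enable at boot

# RHEL/CentOS package and manage the same daemon as "httpd"
sudo service httpd restart         # or: sudo /etc/init.d/httpd restart
apachectl configtest               # config lives under /etc/httpd/conf/
sudo chkconfig httpd on            # enable at boot
```

Any tool that hard-codes the httpd name for the init script, binary, or config path will stumble on a Debian-family system, which is consistent with the Packetfence behavior described in the article.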

Sometimes this is a tough choice. Many administrators know one OS or one distribution of Linux. Switching from Fedora or CentOS to Ubuntu is a challenge (likewise Ubuntu to CentOS or Fedora). This highlights another important point: understanding the fundamentals of Linux will help you make that jump from one ship to another (and back again). When I started with Linux, the distributions were much more similar than they are today. I got to know Red Hat Linux quite well. Having that skill allowed me to jump to (then) Mandrake, SUSE, eventually Debian, and then Ubuntu. So to any new Linux administrator I would suggest, at the very least, getting to know disparate distributions like Ubuntu and CentOS. Having a solid understanding of both will give you a head start in both the desktop and the server arena.

Choosing the right distribution can seriously make or break a Linux project. This is especially true if you are working on selling Linux to the powers that be, or just trying to get your friends/family/co-workers to give it a try. Choose the perfect Linux distribution that meets the needs of either the user or the service and you are already halfway to success. Choose the wrong distribution and you have failed from the beginning.

About

Jack Wallen is an award-winning writer for TechRepublic and Linux.com. He’s an avid promoter of open source and the voice of The Android Expert. For more news about Jack Wallen, visit his website getjackd.net.

18 comments
eriksank

All distributions more or less ship the same software packages, so it's not the packages that differentiate the distributions. The real differentiators are:

1. The package manager (e.g. apt-get, yum, emerge, pacman, ...)
2. The choice of which version of each package to distribute (conservative versus bleeding edge, or even beta)
3. Which of several equivalent packages they install by default (e.g. GNOME versus KDE)
4. The amount of integration testing they do on their particular combination of package versions

In my impression, the package managers across the different distributions are more or less equivalent. Fundamentally, apt-get (Debian, Ubuntu) is not better than yum (Red Hat, CentOS); and if one package manager were better than another, the others would quickly implement the features that differentiate it, and we would be back to square one.

So one true differentiator is which version of each package the distribution releases: conservative or bleeding edge. But even there, most distributions offer a choice of releases; Debian, for example, has "stable", "testing", "unstable", and so on. Another differentiator is which of several equivalent packages the distribution installs by default. On the desktop, it is primarily the choice between GNOME, KDE, or maybe one of the *box window managers. On the server, there is less balkanization, so server installs look pretty much the same across distributions.

I think it is the last differentiator that really matters: how much testing went into ironing out the issues in their particular combination of package versions? And that is a matter of manpower. Distributions with more package maintainers and beta testers will have had more opportunity to test their releases. Distributions that create fewer releases and upgrade their packages less often will also tend to be more stable (but with fewer new features).
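The rough equivalence of package managers the comment describes can be shown side by side. A sketch of common operations (the package names are examples only):

```shell
# Debian/Ubuntu (apt-get/dpkg)       # RHEL/CentOS (yum/rpm)
sudo apt-get update                  # sudo yum check-update
sudo apt-get install apache2         # sudo yum install httpd
sudo apt-get remove apache2          # sudo yum remove httpd
sudo apt-get upgrade                 # sudo yum update
apt-cache search dns                 # yum search dns
dpkg -l                              # rpm -qa
```

Each column covers the same lifecycle (refresh metadata, install, remove, upgrade, search, list installed); the differences are largely in syntax and repository format rather than capability.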

rjkirk

Ubuntu inherits apache2 from Debian. Nobody would argue Debian is a "non-standard" server. Debian was around long before CentOS and RHEL. In fact, Debian was founded the same year as Red Hat. This is simply a case of an app that is not cross-distro compatible.

Justin James

If you ever want to install applications and get support from the vendor, you really only have a few choices: RHEL, SLES, and *sometimes* a third choice, depending on the vendor. Over the years, the *only* Linux that you will always be able to get a vendor to support is RHEL. Who knows what the future of SLES is right now? And Ubuntu hasn't had the same impact in the server room as it has on the desktop. So really, if you are talking server-side distros, and you plan to use (or already have) any commercially supported apps, RHEL is your best, and really only, option. Even with CentOS, many vendors will tell you "not supported" when you try to open a ticket. J.Ja

mwclarke1

For enterprise or critical apps that need support, I still use Red Hat. For dev work, non-critical systems, anything supported in-house, or personal server use, I use CentOS. For some desktops doing specific tasks I may use CentOS there as well, and for dedicated appliances that need a minimal system based on a full distro, I use CentOS too. Other specific machines may run another specialized distro as needed. However, I find that many of the desktop distros, Ubuntu and Fedora to name two, try to incorporate too many new features; their release schedules are way too aggressive and not tested thoroughly enough to be used in any serious server implementation, unless just for testing or personal use. Even as a desktop you can have issues with things not working, or things that stop working, due to a new feature, a new driver, or missing drivers for older hardware, and they tend to require newer hardware to run well.

sysop-dr

I use CentOS for development and testing, RedHat Enterprise for production servers. Ubuntu for new Linux desktops and Puppy 5 for installing on desktops over 3 years old.

pgit

I tried ubuntu server... no way. I would never expose that to the internet. With that exception (and excepting ubuntu in general, I just don't like it), any Linux distribution can be set up as a secure, reliable server. They all allow for installing the needed daemons and services. They all have the same basic firewall in iptables. They all allow for start/stop on boot, so unneeded and insecure processes can be avoided.

Some have tools for getting various servers basically installed, up and running, after which minimal manual tweaking can lock them down tight. Mandriva has some of the best, quickest tools for this, and to me the way it's laid out makes it the easiest to get configured to the point it can be safely deployed on the internet. Fedora is closer to a desirable server out of the box, but specific config tools are somewhat lacking. Debian is even better; it seems to respond to setup via command line in a more solid fashion than others.

Point being, with varying degrees of effort just about any distribution can be made to behave like any other. True, some are better suited for a task out of the box, but that is a matter of choices made by the packagers of the distro.

For that, I come back to Mandriva. It seems to me their default 'flavor' is "chameleon." It seems ready for hard-core alterations in any direction the user wants to go. It treats the end user like a trusted power user, exactly the opposite of the ubuntu/Mac "you don't need to know" attitude. It always takes me close to half the time to set up a given server on Mandriva compared to most of the other major distributions. YMMV of course.
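The point that every distribution shares the same iptables firewall can be sketched with a minimal default-deny ruleset. The open ports here (SSH and HTTP) are examples only, not a recommendation for any particular server:

```shell
# Start clean and default-deny inbound and forwarded traffic
sudo iptables -F
sudo iptables -P INPUT DROP
sudo iptables -P FORWARD DROP
sudo iptables -P OUTPUT ACCEPT

# Allow loopback and replies to connections this host initiated
sudo iptables -A INPUT -i lo -j ACCEPT
sudo iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Open only the services this box actually runs
sudo iptables -A INPUT -p tcp --dport 22 -j ACCEPT   # SSH
sudo iptables -A INPUT -p tcp --dport 80 -j ACCEPT   # HTTP
```

The same commands work unchanged on Ubuntu, CentOS, Mandriva, or Debian; only the configuration tooling wrapped around iptables differs between distributions.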

tbmay

...but the CentOS punt is just ducking work. It IS RHEL.

mitch

Once the BAD kinks are worked out of a new Fedora release - which usually takes just a short while - I find Fedora to be quite stable. I use it in production without fear. Well, maybe not the MOST current version....

jlwallen

i wrote, at least, one of them. but things have changed since then (for both Packetfence and Ubuntu). Back then it was a bit challenging, but do-able to get Packetfence running on Ubuntu...even the web interface. Now...not so much.

VizCreations

I am a web developer. I know Ubuntu is not very popular for server-side installations. openSUSE has been pretty popular, but Fedora is definitely the most desirable. I don't know why; you'd have to ask the gurus.

CharlieSpencer

"Fedora is closer to a desirable server out of the box, but specific config tools are somewhat lacking." Excuse my ignorance, but I thought Fedora was more of a community test bed and not intended for actual production use. Wouldn't RHEL be a better choice? I understood it to be the stable, released equivalent.

Justin James

"...but the CentOS punt is just ducking work. It IS RHEL." Oh, I know... but you know how vendors are too... "we didn't certify our application on that OS"... with the unsaid, "that's our story and we're sticking to it!" J.Ja

pgit

I just got done echoing the same sentiment in a post above. I assert most distributions are really 'testing' versions, if you want to be honest about it. Anything "stable" in the Linux world is going to be at least a year old, and probably won't have the most recent packages of any component.

pgit

A lot of distributions are somewhat 'community test bed' releases; it's just that Fedora is honest about it. Debian stable and Slackware are solid exceptions, where the goal is stability rather than cutting edge. A lot of distros are fooling themselves in releasing a system and calling it "stable." I always find slews of people in various forums up against problems that turn out to be bugs, bad packaging, unintended conflicts, lack of hardware support and such. In a way even Windows has been a 'test bed' -- win95 was the most obvious -- and they have gotten better at releasing a stable OS, but there's still some 'testing' left to the end user.

Like VizCreations says, ask the gurus why they like Fedora. All I know is it works, doesn't give me much if any trouble, and so far has proved to be well hardened. Ironically I have less trouble with Fedora than just about any other distribution, excepting again Debian stable and Slack.

That said, I don't recall anyone jumping on the latest Fedora to press the latest and greatest into service. I personally have used a version one or two releases behind the latest in production, once bugs and hacks have been well hammered out by the community.

tbmay

Fedora is a developers distro. RHEL is for sure a better choice in production. Debian Stable is good too. I've run slackware boxes for years without rebooting too. I don't really use it any more though; however, I hope Pat keeps working on it. The first two are my choices, hands down, because of the ease of updates.

eCubeH

We use Fedora as our server OS. It works well and is stable with what we have set up (fairly simple, with httpd, sftp/sshd, mysqld, named, and dhcpd). Of course, we have to be prepared to research and experiment a decent amount in a sandboxed environment. We use Fedora on all our desktops and netbooks, and use many services locally too. We investigated CentOS but prefer the flexibility and currency of Fedora, and we thought it would be easier to connect and maintain. We run one version behind the latest (between FC12 and FC13 at present).
