I love a wildly overstated article title. Don’t you?

The technical genius of Linux

The Linux kernel was essentially created on a mailing list. In the early 1990s, Finnish programmer Linus Torvalds announced his toy OS kernel to the world. People thought it was great fun, and started hacking on it. By the mid-1990s, it was something of an Internet phenomenon, with complete operating systems built around it. The three oldest surviving Linux distributions are Slackware, Debian, and SuSE, in that order. Others have come and gone, and the number of available distributions has increased at a surprising rate — an alarming rate, to some. See the GNU/Linux distro timeline for details.

This proliferation is part of the genius of the Linux development model. The Linux community, on the whole, does not fear the fork (even if specific individuals often do) and it has reaped great rewards as a result. Being the second major operating system family (after the BSD Unix line) to use open source licenses for the whole system also factors into the somewhat accidental brilliance of Linux development.

Its design closely mirrors the Unix philosophy. There are areas where it diverges, of course (anything the GNU project touches, for instance), but at its core the Linux way of putting together an operating system still does a great deal to bring that philosophy into the lives of many people who might never have encountered it otherwise.

Static-analysis studies have shown that, relative to major commercial software systems (let’s not name any names), the quality of code in the Linux project is remarkably high as well: its bug rate tends to be about an order of magnitude lower than in comparable closed source systems.

The Worrisome Trend

I do not believe Ubuntu is at fault for this trend. I saw it picking up speed for a while before most of us had even heard the name Ubuntu uttered, and I have no doubt that its roots go back to the late ’90s at least. Some might argue it goes back to the development of the X Window System, years before Linus started work on his kernel, but that seems a bit extreme.

Ubuntu really looks like the turning point, though. It went from a relative nobody, also-ran Debian derivative to the single most popular Linux distribution (in terms of public awareness, at least) in the world apparently overnight. A recent announcement by Lady Gaga to the effect that she loves Ubuntu has only served to accelerate a runaway train to popularity that left the station several years before that.

The trend is simply this: Abandon the Unix philosophy. Tie everything together in large blobs of interconnected guts and exploding kidneys (to misquote Nat Torkington’s commentary on Perl internals — a subject for another day). Make it as much like MS Windows as possible, in the hopes it will lure a few MS Windows users to the OS, even if it means throwing out some of the characteristics of Linux-based systems that make them worth using instead of MS Windows.

The Four Horsemen

Watching the progression of the Ubuntu project as an idle armchair spectator, while using FreeBSD for pretty much everything over the last few years, I have seen plenty of signs of the odious direction in which Canonical leads the Linux community. Universal sudo access as an su replacement bothers me. So do the intertangled and thoroughly unnecessary dependencies in the software management system, and a focus on a single desktop environment (GNOME, now ironically replaced by Unity on Ubuntu’s “friends list”) so monomaniacal that it actively hinders attempts to customize a lean-running system.

Unfortunately for me, recent events have pushed me back into the Linux world more fully than I have been since I made the switch to FreeBSD as my OS of choice about half a dozen years ago. I chose Debian, which had been my favorite distribution before 2005, and installed it on a new ThinkPad. This is where the troubles began.

The design of Linux-based distributions has become so dependent on the assumption that anyone who wants a GUI will have GNOME — and maybe KDE as well — installed that all the new tools designed to make the system easy to set up are now tied to those desktop environments. The tools we used to use to configure things from the command line on the way to a running GUI system, for those of us who did not want GNOME and KDE tools, have been neglected, deprecated, or simply destroyed along the way. After fighting with the software for a while, trying to get things to work simply, cleanly, and with minimal software installed, I finally gave up and started installing legions of (for my purposes) garbage software from the GNOME desktop environment.

It turns out that dependencies in Debian have started following the Ubuntu model; it is now impossible for me to uninstall a lot of software I will never use without breaking the whole system. If I use APT to remove Evolution, it uninstalls the core of GNOME. If the core of GNOME goes away, the reason I installed it goes away too — namely, the bloated GUI replacements for the simple, easy software I used to be able to use from the command line or via scripts to get things done quickly.

Let’s revisit this:

I have to keep Evolution installed, despite the fact I do not use it, in order to keep GNOME installed, despite the fact I loathe it and run a tiling window manager called i3 on this laptop instead.

It seems to me that the Four Horsemen of the Apocalinux are:

  1. The GNU Project: the poster boy for “embrace, extend, extinguish” in the open source world.
  2. Sudo Everywhere: the poster boy for security configuration that imitates Microsoft and Apple security philosophies.
  3. Ubuntu Package Management: the poster boy for tightly interconnected dependencies for software across the system.
  4. GNOME For Everything: the poster boy for the assumption that everybody wants the most bloated, featureful GUI all the time.

The Fifth Horseman

A Fifth Horseman has reared its equine head, and its name is NetworkManager. This piece of software is The Standard Way To Configure Networks these days. Don’t worry; it will probably be replaced in a couple years, by something even worse.

NetworkManager is the poster boy for a broader and deeper problem, as are the rest of those Horsemen. The real problem is that overly complex, do-it-for-me tools are created, integrated with the system, and required by lots of software designed for Linux-based systems (and their cousins). These things make earlier, cleaner, more stable tools “obsolete”, so those tools get abandoned in favor of the new, which then proceeds to prove its design flaws in practical use; Linux HAL and ALSA are examples. Eventually — only once everybody has bought into the hype surrounding them — the fundamental brokenness of these systems becomes obvious, and someone decides to replace them, creating a churn of incompatible changes, every few years, in the assumptions about how the underlying system works.

There are alternatives to NetworkManager, of course. The most common, and a simpler, cleaner choice, is called Wicd.

On this Debian system, I tried using none of the above, and just using the ifconfig and iwconfig tools, the /etc/init.d/networking service, and the /etc/network/interfaces configuration file like I did in the good ol’ days. I could write a simple script that automated network management, with these tools doing the heavy lifting behind the scenes, so that executing a single command could tell my laptop how to connect to what. Simple and easy.
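As a sketch of what that looked like: the stanza names and SSID/key values below are hypothetical examples, not from my actual setup, but the mechanism is the standard ifupdown one — define several logical interfaces in /etc/network/interfaces, then map the physical wlan0 onto one of them with ifup’s wlan0=logical syntax.

```shell
# A minimal sketch of the old-school approach. "home", "cafe", and the
# SSID/key values are made-up examples.
#
# /etc/network/interfaces carries one stanza per known network:
#
#   iface home inet dhcp
#       wpa-ssid MyHomeNet
#       wpa-psk  not-my-real-key
#
#   iface cafe inet dhcp
#
# ifup can then bring wlan0 up "as" either logical interface.

net_switch() {
    case "$1" in
        home|cafe)
            # NET_RUN exists only so this sketch can be exercised without
            # root or a real wireless interface; leave it empty in real use.
            ${NET_RUN:-} ifdown wlan0 || true   # tear down the current config
            ${NET_RUN:-} ifup "wlan0=$1"        # bring wlan0 up per the chosen stanza
            ;;
        *)
            echo "usage: net_switch home|cafe" >&2
            return 1
            ;;
    esac
}
```

With something like that in place, `net_switch home` is the whole ceremony: ifdown tears down whatever wlan0 was doing, and ifup reads the named stanza, starts wpa_supplicant via the wpasupplicant hooks, and asks dhclient for a lease.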

Unfortunately, there are things that this Debian install simply would not allow me to do this way. Some of the tools I used to use to augment those already mentioned, such as wpa_supplicant and dhclient, were mangled or missing.

After fighting with that for a while, I tried Wicd, just to use something simpler than NetworkManager, and to avoid having the NetworkManager service with its beastly control interfaces in my way when trying to change network connections. Unfortunately, Wicd failed to work with any available DHCP clients (see: mangled dhclient above).

Ultimately, I gave in and tried using NetworkManager. Well, it worked — sorta. I could connect to my WPA encrypted network at home, no problem. Getting it to connect to open wireless networks with no encryption anywhere else, however, was another story. I fought with it, and fought with it, and finally concluded that the only way to get it to do what I needed was to manually kill NetworkManager and everything associated with it, then start it all back up in “the right order”, specifying all the configuration options by hand along the way so that it would not try to do something for me in its endless quest to second-guess what I am trying to accomplish. I had a real bear of a time getting everything specified properly until I discovered a command line interface for NetworkManager in the APT archives called cnetworkmanager.

At that point, to ease network configuration management, I wrote a script to automate everything I needed to do when switching network configurations, so I would not have to spend five minutes screwing around entering commands in “the right order”. There was no reasonable way to just write up a configuration and have NetworkManager actually use it properly, rather than decide I must want something other than what I told it to do; it follows the MS Windows philosophy of nondeterministically applying heuristic[1] decision making processes to solve the problem of how to almost do what the user wants, but not quite.

In summary, the Linux community appears to have decided to fix the problem of having to write a simple script to manage my network configuration by reorganizing the OS around the assumption that if I install the X Window System, I must want to have a gigantic nanny application making decisions for me, even if they’re wrong, which only becomes usable by way of installing yet more software to give me an interface that can be scripted, because in the end I have to write a script to manage my network configuration.

The Good Ol’ Days

Jack Wallen offered his somewhat whimsical take on what Linux-based systems have lost over the years, in 10 things I miss about old school Linux. His list of things to miss is significantly different from mine.

I suppose I can sum up what I miss in one sentence: I miss when my Linux-based systems worked without blood sacrifices. Sure, it’s still better than MS Windows — even marginally better than Mac OS X — but I’m getting pretty antsy to get back to FreeBSD. Maybe I should send some money to the guy working on the graphics driver I need.

  1. heuristic (adjective): based on wild guesswork, and prone to error