Linux

10 common mistakes to avoid when you're installing Linux software

Installing software in Linux is nothing like it used to be, but there are still some pitfalls to watch out for. Jack Wallen explains how to sidestep some of the typical installation problems he's encountered.

Installing software in Linux is nothing like it used to be, but there are still some pitfalls to watch out for. If you follow this little guide, your Linux life will be made simpler and safer.

#1: Installing from source when your system is primarily an .rpm or .deb system

Many new Linux users don't understand that both rpm and apt (or dpkg) keep track of everything installed on the system. However, those systems (rpm, apt, and dpkg) can keep track only of packages they install. So when you find that obscure package that comes only in source form and you compile it yourself, your package management system will not know what to do with it. Instead, create either an .rpm or .deb file from the source and install the package with the package management system, so that the system will be aware of everything you have installed.
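
If you do end up building from source, one commonly used way to get a proper package out of it (a tool readers also recommend in the comments below) is checkinstall, which replaces the usual make install step and hands the result to your package manager. Here is a minimal sketch, assuming checkinstall is available in your distro's repositories; the tarball name is just a placeholder:

    tar xzf someapp-1.0.tar.gz      # hypothetical source tarball
    cd someapp-1.0
    ./configure
    make
    # Instead of "make install", build and install a native package:
    sudo checkinstall -D            # -D builds a .deb; use -R to build an .rpm
    # The resulting package can later be removed with dpkg -r (or rpm -e).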

#2: Neglecting the many graphical front-end package management applications

Most people don't even realize that there are graphical front ends that take a lot of the guesswork out of installing packages in Linux. For yum (the command-line package management system for rpm), you can use Yumex (installed with yum install yumex); for apt, you can use Synaptic or Adept (installed with apt-get install synaptic or apt-get install adept).
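
For quick reference, the install commands mentioned above (run as root or via sudo):

    yum install yumex           # graphical front end for yum
    apt-get install synaptic    # graphical front end for apt (GTK)
    apt-get install adept       # graphical front end for apt (KDE)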

#3: Forgetting to update the list of available packages

When using apt-get or yum, make sure you're updating the list of available packages. Otherwise, your system will not remain updated with the latest releases of installed packages. To update with apt-get, you issue the command apt-get update. To update with yum, issue yum check-update.
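
In other words:

    apt-get update       # refresh the list of available packages (Debian/Ubuntu)
    yum check-update     # list packages with updates available (Fedora/Red Hat)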

#4: Not adding repositories for yum or apt-get

Both yum and apt-get use a listing of repositories that tells them where to locate available packages. But the default repositories (often called "repos") do not include every Linux package known to Linuxkind. So if you run the command to install an application, and yum (or apt-get) can't find the package, most likely you'll have to add a repo to your sources listing. For yum, the sources are in /etc/yum.conf. For apt-get, they are placed in /etc/apt/sources.list. Once you have added a new repo, make sure you run the update so either yum or apt-get is made aware of the new source.
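
For illustration, here is roughly what a new source entry looks like on each side; the repository names and URLs below are placeholders, not real repos:

    # Debian/Ubuntu: add a line to /etc/apt/sources.list, then run apt-get update
    deb http://repo.example.com/debian stable main

    # Fedora/Red Hat: add a section to /etc/yum.conf (or a file under /etc/yum.repos.d/)
    [example-repo]
    name=Example third-party repository
    baseurl=http://repo.example.com/fedora/$releasever/
    enabled=1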

#5: Not taking advantage of installing from a browser

Just as with Windows, when your system sees that you are attempting to download an installable package, you'll be asked whether you would like the package management system to attempt to install the file or just save it to disk. If you choose to install, you will be asked for the root password (so you must have access to that password for this to work at all). One thing I've always liked about this method (be it on a yum-based or dpkg-based system) is that it has almost always been good about locating and adding dependencies.

Naturally, this method works only when you are downloading a file that's applicable to your system. If you attempt to download an rpm file on a Debian-based system, you won't have the option of installing the file.

You can take this one step further and select the Always Do This... check box in the Firefox popup so that every time you download a file associated with your package management system, it will automatically prompt you for your root password and continue to install the package. This streamlines the process quite a bit.

#6: Forgetting the command line

Let's say you've installed a headless server using Ubuntu or Debian (a common setup for Linux servers) and haven't installed any of the graphical interfaces or desktops. To do any maintenance, you have to log in via ssh (because no admin would log in via telnet) and are limited to the command line only. Even so, your ability to keep your system updated or install new applications is not limited. You can still use yum or apt-get to manage your packages.

With a Debian-based system, you have another option: Aptitude. From the command line, issue the command aptitude and you will be greeted with a nice curses-based interface for apt. This system is easy to use and gives you an outstanding option for maintaining a gui-less server without losing functionality. Aptitude lists Security Updates, Upgradeable Packages, New Packages, Not Installed Packages, Obsolete Packages, Virtual Packages, and Tasks. As you scroll through the list, you will not only get the installed vs. the new package release numbers but also a description of the package. After using Aptitude, you will quickly see how simple updating Linux packages can be, even from the command line.
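
A typical maintenance session on a headless box might look something like this; the hostname and user are placeholders:

    ssh admin@server.example.com
    sudo apt-get update && sudo apt-get upgrade   # Debian/Ubuntu
    sudo yum update                               # the Fedora/Red Hat equivalent
    sudo aptitude                                 # curses front end on Debian-based systems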

#7: Blindly unpacking tar files

I can't tell you how many times I have downloaded a source package and, without thinking, untarred it without knowing its contents. Most times this works out fine. But there are a few times when the package creator/maintainer has failed to mention that the contents of the package are not housed in a parent directory. So instead of getting a newly created directory housing the contents of the tar file (which can contain hundreds of files and directories), those files are blown out into whatever directory you unpacked them in.

To avoid this, I always create a temporary directory and move the tar file into it. Then, when I unpack the tar file, it doesn't matter if the contents are contained within their own directory or not. Using this method will save you a LOT of cleanup in those cases where the creator didn't pack everything in its own neat directory.
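
That routine boils down to something like the following (the file name is a placeholder); as a reader points out below, you can also list the archive's contents first with tar's -t option:

    mkdir ~/tmp-unpack
    mv someapp.tar.gz ~/tmp-unpack/ && cd ~/tmp-unpack
    tar tzvf someapp.tar.gz     # list the contents without extracting
    tar xzvf someapp.tar.gz     # extract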

#8: Deleting those make files

When you're installing from source, you'll probably run make clean to get rid of all of those unneeded build files. But if you get rid of the Makefile, uninstalling will be a hassle. If you keep it, you can usually uninstall the program simply by issuing make uninstall from the directory housing the Makefile. A word of warning: Don't dump all your Makefiles into one directory without first renaming them so you know which application each belongs to. When you want to uninstall an application, move its Makefile to another directory, rename it to its original name, and then run the uninstall command. Once you've uninstalled the application, you can delete the Makefile.
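
For example, assuming you kept the built source tree (and its Makefile) for a hypothetical app under ~/build/someapp-1.0:

    cd ~/build/someapp-1.0
    sudo make uninstall    # works only if the project's Makefile provides an uninstall target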

#9: Installing for the wrong architecture

You might notice that many rpm files have i386, i586, i686, ppc, x86_64, and so on in their names. There is a reason for this. Unless the rpm has noarch in its filename, it was created for a specific architecture and optimized for it, so it will run better there. Does that mean you can't install an i386 package on an i686 machine? Of course not. But it won't run as efficiently as a package built for that architecture. You can't, however, install a PPC rpm on an x86 machine: PPC packages are built for PowerPC processors (the Motorola/IBM chips used in older Macs). Nor can you install a 64-bit package on a 32-bit system. You can, however, install a 32-bit package on a 64-bit system (as in the case when you want to get Firefox running with Flash on a 64-bit machine).
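
To check what architecture your machine reports, and what architecture a downloaded package was built for:

    uname -m                                        # prints the machine architecture, e.g. i686 or x86_64
    rpm -qp --queryformat '%{ARCH}\n' someapp.rpm   # architecture of a (hypothetical) .rpm file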

#10: Failing to address problems with kernel updates

It used to be that updating kernels was a task left to the silverback geeks. No more. With the new package management systems, anyone can update a kernel. But there are some gotchas you should know about. One issue is that of space. With every update of a kernel, your old kernel is retained. If you continually update kernels, your system storage can quickly fill up. It's always a good idea to check to see what older kernels you can get rid of. If you're using rpm, issue the command rpm -qa | grep kernel to see what you have installed. You can remove all but the last two installed. It's always best to keep two in case the one you are running gets fubar'd.
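
For example (the kernel version string below is hypothetical):

    rpm -qa | grep kernel         # list every installed kernel package
    rpm -e kernel-2.6.25-14.fc9   # remove one of the older kernels, keeping the newest two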

Another gotcha involves NVIDIA drivers. If you use the livna repositories, you will find yourself locked into the livna kernel releases as well. This isn't always a good idea. Instead, I would do this in two parts: Update your kernel and then download and install the NVIDIA driver associated with your kernel. This will require you to search for the proper rpm file for the NVIDIA driver, but it will keep you from having to use the livna kernel. I was once locked into this system and found myself suffering from interesting kernel/video issues isolated to the livna files. Avoid this. Of course, if you are using an Ubuntu system, you can avoid the NVIDIA trap altogether by using Envy. This handy tool will allow you to install the best NVIDIA driver without having to mess up your favorite kernel.

And although this is a no-brainer, make sure you reboot after a kernel upgrade. It's the one time you will HAVE to reboot your Linux machine. Although your machine will continue to work just fine, it will be running the older kernel and not taking advantage of the new features or security enhancements (or whatever else the newer kernel has to offer).

About

Jack Wallen is an award-winning writer for TechRepublic and Linux.com. He’s an avid promoter of open source and the voice of The Android Expert. For more news about Jack Wallen, visit his website getjackd.net.

Comments
Allen Halsey

With the latest versions of Fedora, I've found receiving the nVidia kernel module from the Livna repository to be a smooth process. Previously, it was advised to wait a day or two before installing new kernels to give the Livna guys a chance to build a new nVidia kernel module. But this wait is rarely necessary nowadays. A nice guide to setting this all up is here: http://www.mjmwired.net/resources/mjm-fedora-f9.html#nvidia

Allen Halsey

Should you create a package instead of installing from source? Yes. But, afterwards, please contact your distro to see if you can maintain the package in your distro's package repository. That way the package will go through QA review and be available to everyone and receive updates from upstream. Packaging is a great way to contribute to your distro.

art

On #2: If you ever use the command-line apt-get or aptitude, you will never go back to a GUI again. Completing an 'aptitude update' takes less time than the Synaptic GUI takes to load, and don't get me started on Adept. On a really fast machine, start it and go to lunch. On a slow machine, go home and deal with it in the morning.

On #5: If you let the install take place automatically from a browser, you won't have the .deb in case you need to reinstall. Better to download the .deb to a tmp folder and either issue 'dpkg -i package-name.deb' or use the GUI deb installer.

Tip 1: Aptitude stores more info on libraries and other dependencies than apt-get. That way, if you decide to remove a package you installed with aptitude, you have greater assurance that you are removing the dependencies as well. In addition to the ncurses interface, aptitude supports almost all of the same command-line arguments that apt-get supports. It's never too late to switch to aptitude.

Tip 2: And I'm surprised you missed this one. Install the package checkinstall for installing from source. Checkinstall replaces 'make install' and creates a package ('checkinstall -R' for RPM and 'checkinstall -D' for Debian packages), then uses the package manager to install the created package. That way your source compilations can be removed or reinstalled with the package manager. If you install from source, you need checkinstall. Period.

ikaufman

Be very careful when adding other repos. You could wind up installing incompatible software, or adding things you really do not want just to satisfy the dependencies the builders used when creating those packages.

Tolga BALCI

Unpacking tar files without checking the contents can really mess up your whole directory. Instead of creating a temporary location and checking out the contents, it's more practical to use tar with the -t parameter: tar -tvf file.tar will give you a listing of the contents without unpacking anything, letting you check them out (and spot the parent directory issue).

yurachi

It is, actually, a security hole: almost the same as Windows demanding Administrator rights regardless of whether they're needed or not. Saving an rpm to your own home directory should not ask you for any password. Running "sudo yum ..." or "sudo rpm ..." will ask you only for your own password, so the root account is not used at all. And sudo logs everything. The only drawback is that you need to have /etc/sudoers set up properly.
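
A minimal sketch of such a sudoers entry, with a placeholder username (always edit the file with visudo):

    # /etc/sudoers: let user "jane" run the package tools as root
    jane    ALL=(root) /usr/bin/yum, /usr/bin/rpm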

TJ111

It's not hard to install software in Linux, if that's what you got out of this article. The point is that it's just as easy, if not easier, than in Windows. It's just that Linux gives you multiple ways to install software, and different distributions use different package managers.

For example, say you want to install Amarok under Ubuntu. You can go to Add/Remove Programs, search for Amarok, click Install, and enter your password. You can go to Synaptic, search for Amarok, check the box next to it, hit Apply, and enter your password. You can open a terminal and type "sudo apt-get install amarok". Or you can do it the Windows way: go to a site like getdeb.net, download amarok.deb, double-click the file, and install it with an installer similar to installing programs in Windows.

For standard users, one method is sufficient. However, IT people on an IT website are often managing different computers with different architectures, different distributions, and so on, so it's good to know all the methods you have available.

PS: This article isn't about whether or not X distribution works well with Y wireless card. It's not Linux's fault that hardware manufacturers won't open up their drivers, leaving volunteer developers to reverse engineer them.

FXEF

Nice article for Linux users. Too bad we are burdened with Windows trolls and their stupid comments.

tnkback

When you are completely ignorant about Linux, there is a search engine at your disposal. When you search for anything pertaining to Linux, WOW, you have an unlimited source of information. A new Linux user cannot sit in front of their monitor and expect Linux to install itself. Wait.. it does install with very little user input. The word of the day is READ!

quark

I do NOT make mistakes!

chaz15

Yep, this confirms Linux is still very much for VERY TECH SAVVY PEOPLE. Yum, Yumex, kernel, Livna repositories, .rpm, .deb, tar... Lost 99/100 people by the first line!!!

art

Install the package checkinstall. Grab the source and unpack it in a ~/tmp directory, cd to that directory, run ./configure, then make. Now, as root or using sudo, instead of 'make install', run checkinstall. checkinstall creates a package and installs it for you.

n.smutz

I assume this is meant for Linux neophytes. Is package creation (RPM, etc.) a straightforward and obvious process?

ToadWiz

Thanks for bringing up a point I've wondered about. I've only been using Ubuntu for two weeks and I don't know how to tell which repos are useful and will play nice together. Is there a list? Does anyone have a recommendation? Know of a website that can help me choose the repos I need? (And PLEASE, Windoze fanboys lay off. I know you think repos make Linux harder to use - and they do. But 24,600 plus pieces of software FREE for the asking came up on my initial install. When you have 24,600 pieces of free software available on your initial OS load, you can tell me why Linux is so terrible.)

wolftalamasca

I have been in the same boat with devices not working, with Windows, Mac, and Linux, on both old distros and new. One thing I found was that sometimes people do not look at the simpler fixes to some problems. One major glitch seems to be network cards, just as an example. One would think Windows could deal with common or generic drivers for the most common network cards. No. It cannot. That leaves many freshly installed machines needing them simply to do updates, or, in some cases, unable to get the drivers at all without another computer, floppies, USB sticks, CD burns, etc. Linux, on the other hand, supports so much hardware on install it's mind-boggling, and it can support many devices using default drivers to limp along until better ones are found. Not Windows, though. It installs an amazing amount of data, yet still hasn't a clue about your hardware and drivers most of the time. And most average users fumbling through a Windows install usually do not have a clue about digging into their control panels and finding missing or unsupported devices. Those are the same people the FUD-generators claim will be lost if they moved to Linux: the ones who need IT help to set up or upgrade their own computers running Windows anyway.

So, back to simple: what do you do when the OS is giving you grief about your drivers? Perhaps bite the bullet, if your time is -so- expensive, and buy a proper card that is supported. Most are. Somehow, you managed to get one of the very few that, regardless of price, are a pain. Granted, this simple solution is not 100% effective; laptops and onboard hardware can rule out buying replacements. But claiming it's easier on Windows is a crock. Many Windows driver installers will overwrite each other (even by the same manufacturer!) or do not supply updates (think HP denying updates to printers for newer OS X releases and Vista to try to bolster new printer sales). Some drivers (again, HP is an example) load up your system with so much junk it bogs down and becomes unstable. Yet, out of the box, I can print to my HP AIO printer from the last several different distros of Linux, while on my XP computer I have to print to it via a shared printer to avoid installing the drivers (300 MB of install for a simple AIO printer, which has turned my XP install upside down every time).

This is IT, folks. We are the magicians. What we do inside the 'mysterious black boxes' is why we get paid, regardless of the reasons. Much like other professions, when the average joe decides to pull out the plumbing tools to DIY, we get called in to stop the flood. All in the name of saving a buck.

trent1

IMO... NOW, it's easier to install Linux than Windows (quite a bit easier than Vista, not necessarily easier, but definitely quicker than XP). That wasn't the case 18 months ago.

RknRlKid

It isn't Linux's problem that drivers don't exist for certain pieces of hardware. (A strange phrase, as if "Linux" was a person who could manufacture something!) Hardware makers have denied drivers for both Vista and Linux. That is a problem with the hardware makers, not the OS's! This problem pops up with every new Windows release, as well as with Linux.

trent1

For someone like me who is checking out Linux again (I didn't much like the last distros I tried), this is the kind of basic but useful info that comes in handy. It's just too bad we are burdened with Linux snobs and their "you are obviously too stupid to use this gift from God" mentality... AND the Windows lemmings, refusing to see anything else as credible. Please, people, give it a rest. The article was very helpful. I'm sure there are others who read it and feel the same.

wdewey@cityofsalem.net

I have spent hours and hours of time reading and fiddling to get a network card to work on Linux. I don't want to spend hours solving a "simple" problem. It only takes an hour or two to format my drive and install windows. If I wanted to spend hours reading I would buy a book and enjoy myself instead of hitting my head against a wall trying to get a critical piece of equipment to work. I really thought this article was well written and gave me answers to questions that I have never had answered. Bill

burntfinger1

When I was retired I didn't make any mistakes either :) since I have reentered the workforce however.....

hkommedal

does in fact do nothing at all. That is how they manage to avoid all mistakes.

brian

It's deeply ingrained in the Linux development culture. New software should be named such that:
* the name length is inversely proportional to how often it's used,
* the name, in a literal sense, should have as little as possible to do with the function of the software,
* but there should always be an obscure connection (e.g., somewhere in Greek mythology or the developer's favorite TV show), anagram, or acronym that the developers thought was clever.
I mean, come on, "Ubuntu Update" would have been really boring. And everybody can see the "APT" in "Synaptic".

ToadWiz

Maybe so, but I think Linux' biggest problem is the gap between starting using Linux and being a pro. Ubuntu has made Linux about as easy to use as Windows, and if simple is what you need, then you probably didn't need this article. But for people wanting to step beyond the basic operator level, it's a chore to dig out what information you need. This list of things to avoid is JUST the kind of thing that I need. No, I had no idea what Livna was, and I'm not sure I'll ever need to know, but at least if I do, I'll remember there was such a thing and have an idea what to look for. My thanks to the author.

jdclyde

couldn't load a windows system either, or install software, or add a printer, or keep their antivirus up to date. What, if anything, is your point? That ANYONE using ANY Operating system might actually have to learn something?

CharlieSpencer

No matter what OS you prefer, learning to use it involves learning some new terminology. Other than "Livna", the words you quote are very basic terms a beginner will have to know before being productive. On the other hand, no one was born knowing the terms "Control Panel", "Add / Remove Programs", "Network Neighborhood", or other Windows-specific terms. There's a learning curve either way. A similar article for Windows would lose 99% of those not already familiar with its terminology.

Jaqui

an article meant to help techs makes linux non usable by people. nice bit of fud there. if it was meant for end users, it wouldn't be on a site for IT professionals.

Jaqui

it's a straightforward process, if you know how your distro altered default configurations. if you are new to linux / packaging, then it isn't as simple as it sounds. that is the "fault" of the distros that change the software default locations / configuration. If they left the GUI config alone, the app config location alone, and used the FHS as defined in the standard then it would be simple to make packages.

CharlieSpencer

This is complicated by some Linux users insisting that all drivers must be open source, a requirement that doesn't exist among Windows users. I confess to not understanding this demand; isn't a closed source, vendor-supplied, proprietary driver better than none at all?

mudpuppy1

I agree. I don't have a dog in this fight. This whole debate reminds me of the Ford vs. Chevy thing. Neither camp is paying me to shill for them, so I get tired of the snobs on both sides. Yes, I mostly use Windows, but that is changing (thanks Vista). I have Ubuntu on a test box at home and I'm thinking of moving it to my main box (just haven't got there yet). I found this article to be very helpful. It filled in some gaps. Moronic comments like those by the poster above you don't help.

ProperName

In response to: " That's the problem I have spent hours and hours of time reading and fiddling to get a network card to work on Linux. I don't want to spend hours solving a "simple" problem. It only takes an hour or two to format my drive and install windows. If I wanted to spend hours reading I would buy a book and enjoy myself instead of hitting my head against a wall trying to get a critical piece of equipment to work" Have you used windows? When it breaks do you fix it yourself? or call someone for support? I too have stared at the BSOD in windows for hour upon hour trying to solve a "simple" problem, wanting to bang my head against the wall for countless hours of wasted time. I have sat in frustration for trying for hours and hours to get an onboard network solution to work within a linux box. Guess what? Both times the solution was equally easy and only after countless hours of "reading" did I find my solution. Could have been easier to re-format and re-install on both occasions, but would not have solved my problem. Issues are OS independent. I have had troubles in Linux and troubles in Windows. Taking time to learn the solutions is how I learn the distinct differences between the two. And become a better user. Windows installation on a good day (even dual core) will run you three (to 6) hours by the time you have re-installed the OS (with Service Packs and updates), apps, chipset, video, sound and network drivers, setup the internet and home network connections, peripherals (hp printers can take an hour or better alone) Seems like the quick and easy is not really so. Thankfully the longest Linux install I have encountered was an hour and a half, with only two prompts from me. What a pleasant change when it reboots and 99% of what I need is ready to use already.

RknRlKid

The whole "wireless card doesn't work" thing is not just a Linux problem. I had the exact same problem with Vista and wireless. The real issue to me is this: USE THE HARDWARE COMPATIBILITY LIST! Its no different than Windows in this respect. Where people get into trouble is that they buy an inexpensive card, then expect it to do miracles. If the card is not on the HCL, then it doesn't matter if its Linux, Mac or Windows, it just won't work. Just because Linux is free, doesn't mean you can use cheap/inexpensive hardware! One of the problems with Linux is that its heralded as an "inexpensive" alternative. Software wise it is less expensive than Windows, yes. Hardware wise though, there may not be much difference. Failing to make that distinction will result in overall failure.

aa8vs

Linux apparently has some good uses, but it is not meant for the faint of heart or the typical PC user, and the geeks know this and enjoy showing superiority 8^P -- "Linux is free, if your time is worthless"

art

brian says: "the name length is inversely proportional to how often it's used." That seems to me to be a good thing. The more often you use it, the less you have to type! Obscure, rarely used stuff should be more descriptive. My favorite editor is called "ne". If I want to edit a file, I just type "ne /path/to/file". What could be easier than that?

ToadWiz

Ok, tell me then why "Microsoft Genuine Advantage" is not called, "Microsoft's Method of Examining your computer with your permission to determine how much software you have pirated"? Inquiring minds want to know! Unix has a reputation for short, non-descriptive names, like "vi" for the editor. But Ubuntu Update is alliterative, which counts as marketing. Are you suggesting that only a for-profit concern is allowed to do marketing or to choose a name that might be remembered?

boxfiddler

but it hasn't anything at all to do with Linux. Or computers. Or maps...

jmgarvin

Personally, I don't care. I do understand wanting FLOSS drivers, but at the same time I live in reality.

RknRlKid

I don't see what the "evil" is about accepting vendor drivers or software. SOMEBODY has to develop this stuff, and who better than the vendor of the hardware? People just want it to WORK. To paraphrase the movie "Tron," I'd love to have religious discussions with you, but we have work to do! The obsession sometimes with open source does border on a religious argument.

mamies

I had the same problem but with an Acer laptop. Unlike you, I didn't do the research, and only then realised that there were NO XP drivers released, according to Acer. I installed Ubuntu thinking that I would use it till I found XP drivers, but ever since then I haven't even bothered to look. I found Ubuntu easy to use, I have "pretty features", I can install Windows applications, and it is very stable. I am not looking back.

ToadWiz

I bought an HP DV9627CL laptop. It came with the 1st drive partitioned into 3 pieces, one of which was hidden. I tried for days to reformat the drive, without success. I'm no novice at this. I've been a PC user since 1990 and make my living in IT. Loaded on the primary partition was Vista and a bunch of malware. (No virus, just crappy software that constantly bugs you to pay for it.) I wanted to pull Vista and the malware off, and put Win2000 on in place of it. I did check with HP's website before purchasing to insure Win2000 drivers existed. The second partition contained a Vista/malware reload and I believe the third partition was part of what was making my reformat and repartition not work. After the purchase, I found that key drivers were not available and HP refused to give me access to them. I extracted the primary hard drive, put it in an external case, and reformatted it on another machine. But what I could find did not give me a well-running machine. (I got the video to work, but couldn't get the network operating, and only once did I manage to get the audio to work.) After I got tired of messing around, I got Ubuntu. I had it fully operational in less than 1.5 hours, and that included the time to write a manual of my efforts. I will probably maintain a Win2000 machine, at least for a while. But I'm sold on Ubuntu. HP and M$ drove me off, and I think they will continue to try to force people to use Vista in a similar manner. If you aren't willing to have your arm twisted by them, Linux is available.

Dumphrey

what do you mean by "until I bought a computer loaded with Vista and malware, and DELIBERATELY hosed so that I couldn't install the OS of my choice" Not trying to start an argument, I am actually curious. Feel free to PM me.

Dumphrey

my 86 year old aunt browses the web and checks her email on Ubuntu, on a Dell computer she bought with XP. All the Dell pop-ups and reminders irritated her so much she was about to give it away. My father installed Ubuntu on it at that point, she has been happy since. My father use to call me daily with questions. Now he calls weekly, and usually I have to google the answer anymore. Its a learning curve, and not really that steep. To "use" Windows as an appliance is easy once you master a few basic tasks, clicking, dragging, icons, and where files and programs reside. The same is true for Linux. Learning to "use Linux" as an appliance is easy, with the same basic tasks as Windows, but the file structure is different. Learning to administer each system takes a bit more learning, but once again, not hard to learn to add/remove programs from a package manager or double click a .exe/setup file. Installing Linux is pretty dern easy these days. no more manual Xorg configs, no more manual partitioning (unless you want to), no more manual choice of hardware. Anymore, just follow the defaults, and boom, Linux. Windows, follow the defaults and boom, Windows. Now you have to install motherboard drivers, network drivers, video drivers, sound card drivers... reboot, reboot, reboot... My XP machine requires 12 reboots during the OS/Driver install process. takes 1 for Linux on the same hardware (and yes everything works, including compiz(which I disable since its just candy(Use classic desktop on XP (another layer for fun)))). I was able to walk my father through his first Ubuntu install, over the phone, in 20 minutes, from empty HD to surfing the web.. 20 minutes. So, before spreading FUD, make sure you will not be seen as ignorant by looking at a Linux distro from the past 2 years. MAYBE even attempt to install a modern distro. The difference between a computer user, and a good computer user is the ability to find information, be it google, a book, a magazine, or a coworker, it doesn't matter what OS is involved, RTFM/L2Goo applies.

ToadWiz

Linux geeks can be a bit superior/snooty to the rest of us, but your response shows your bias. Making your living off people not interested in being geeks is fine, but your attitude is no better than the Linux geek. There is more to the world than lemmings and Linux geeks. I was satisfied to use Windows 2000, until I bought a computer loaded with Vista and malware, and DELIBERATELY hosed so that I couldn't install the OS of my choice. Microshaft lost me as a customer at that point - NO ONE forces me to accept their malware and tells me I can't do anything about it. If I have to become a Linux geek, so be it. I am not a lemming. Between Microsuck and the Linux geeks, I can tell you which I will choose. Micro$oft will want to keep you as a lemming. The Linux geek's superiority may be irritating, but he'll tend to help you as long as you are willing to put forth a minimal amount of effort. So, if you aren't bothered by being treated as a profit opportunity by Micro$oft and system resellers, be happy with your choice. And if you are bothered, try Linux, swallow some pride, and ask for help. It's there.

historyb4

I know many people who use Linux who are not geeks; most people are too scared of something new or, like the poster above, too lazy. OK, maybe stupid too.

Dumphrey

Leave Foreigner out of this, please, I'm begging you! No! No more! I promise to be good, mommy!

ToadWiz

I know a wink when I see one, but just for fun: Revealing of Microsoft's tactics is the leakage of an internal Microsoft memorandum in October of 1998. Referred to as the Halloween Document, its veracity has been confirmed by Microsoft staffers. It contains some interesting statements:
* Open Source Software poses a direct, short-term revenue and platform threat to Microsoft, particularly in server space. Additionally, the intrinsic parallelism and free idea exchange in OSS has benefits that are not replicable with our current licensing model and therefore present a long term developer mindshare threat.
* OSS is long-term credible ... Fear, Uncertainty, and Doubt (FUD) tactics cannot be used to combat it.
* Linux can win as long as services / protocols are commodities.
* De-commoditize protocols & applications: OSS projects have been able to gain a foothold in many server applications because of the wide utility of highly commoditized, simple protocols. By extending these protocols and developing new protocols, we can deny OSS projects entry into the market.
* The ability of the OSS process to collect and harness the collective IQ of thousands of individuals across the Internet is simply amazing. More importantly, OSS evangelization scales with the size of the Internet much faster than our own evangelization efforts appear to scale.
As far as I am concerned, the second bullet point confirms that Micro$oft has used FUD in the past, or they wouldn't be denying it as a means of combatting Linux.

jmgarvin

Ride into the Danger Zone

seanferd

I want you to show me... ;)

dawgit

You ain't fool'n nobody. The government knows right where you are. :0 :p So, you're not lost. :^0 -d

boxfiddler

is a carbon copy of my earlier post re: getting lost at yum. Apparently the lack of emoticon is a problem... ;)

Jaqui

fear uncertainty doubt spewing fud is feeding fear, uncertainty, doubt. edit to add: and is usually done by lying. or, as with Gartner, using antiquated versions of Linux instead of current versions.