Open Source

The state of Linux according to Linus Torvalds

Linux Magazine has published a wide-ranging, two-part interview with Linus Torvalds, asking him questions about everything from his take on continuing development of the kernel to the quality of Linux security to the impact of new hardware, and lots of things in between.

In part one, I found his comments on the desktop to be most interesting:

To me, Linux on the desktop has always been the most interesting goal. The primary reason for that is simply that it's always been what I want (I've never wanted a server OS; I started out writing Linux for my own PC, not to be some file server), but also because all the interesting problems always end up being about desktop uses....

The desktop, in contrast, is all about a wide variety of uses. Huge variety in hardware, huge variety in software, and tons of crazy users doing things that no sane person would ever even think of doing. Except, it turns out, those crazy users may be doing odd things, but they do them for (sometimes) good reasons. So aiming for the desktop always forces you to solve a much more generic problem than any other target would have forced us to look at.

In part two, Torvalds is asked questions about Git commands ("I think we ended up with something like fourteen commands being used. And even that’s more than most end developers even will need."), revision control systems, and kernel problem-reporting.
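For a sense of scale, the everyday subset he is describing is small; something like the following covers most day-to-day work (an illustrative list with a placeholder repository URL, not the specific fourteen from the interview):

# clone, branch, commit, sync: the routine core of Git
git clone git://example.org/project.git   # example URL
cd project
git checkout -b topic     # create and switch to a working branch
git add file.c            # stage a change
git commit -m "fix bug"   # record it
git pull                  # bring in upstream changes
git push                  # publish yours
git log                   # review history
git diff                  # review changes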

Considering Linux's success on the server side and its everlasting struggle to make gains on the desktop, what do you think of Torvalds' comments that he envisioned Linux first and foremost as a desktop solution?

50 comments
Altotus

And we're not looking back. Tops as far as we're concerned.

ericswain

For the recreational computing sector (i.e. internet, email, light document work), I think Linux has a better chance of taking market share from both Microsoft and Apple, as neither company has addressed this new sector head on. With the explosion of netbooks (I fall into this category too, as I own 3), I see Linux and even HP's Unix gaining traction. This goes to show that people are willing to try something different, and at the right price point the OS is almost irrelevant. Give me access to the internet, give me my email, and let me draft up a quick letter in anything resembling Office and we're golden.

When it comes to businesses that run demanding applications, Exchange servers, and business-standard office programs, adoption by enterprises or even SMBs fades, as the majority of the business sector is in almost every way designed to work with Microsoft's operating system. The standardization that has occurred has left little wiggle room for anything other than the above. If Linux wants to play in the sandlot then it's going to have to adapt to the other players, otherwise it's going to get knocked off the second it comes on board. I know, I know, you're going to say "well you can install Wine to run your apps", but an app to install apps that may or may not work is something that needs to be addressed; in the business world MS is the standard. GUI environments that change the markup of what the end user is able to do only add to the confusion of the typical end user in Linux. GNOME or KDE?

Running Linux on a device like a netbook seems like a good fit for now, as most netbook OSes are designed specifically around the hardware they are installed on. This sort of marriage works for end users who don't want to fuss with an operating system and just want to use the computer. Apple has used that marketing strategy to boost sales of their "virus free operating system" (not mentioning any names, but for some reason they sure love cats). The true reason they have fewer viruses is simple: no one wants to waste time writing a virus that no one is really going to get. Viruses, worms, and trojans are written for two reasons: media exposure and financial gain. Until a company like HP, Novell, or even Oracle (with its new acquisition of Sun Microsystems) gets into operating system standardization with Linux, this operating system isn't going to have a platform to compete against Microsoft's hold on business computing.

jlwallen

i realize that i have a different opinion than most but i have always thought Linux on the desktop made just as much sense as it did on the server. i think ultimately it boils down to what you are familiar with. i always liken the Linux/Windows issue to religion. if you grow up catholic you will know Catholicism and much less about the Methodist way of things. sure you might know the basics, but that's about it. same holds true with operating systems. the first operating system i REALLY worked with was on an Apple II (OS 6 maybe???). then i worked with DOS for a while. after that i tried windows 3.1 and didn't like it so i went back to Apple. it was a switch back and forth between Apple and Windows but Windows never really felt comfortable to me. finally Linux came along and, for some reason, it felt second nature to me. it made perfect sense on the desktop. i still feel that way. i get a lot of people scoffing when i firmly say Linux makes more sense on the desktop than windows. i can totally see where Linus is coming from on this. i wish more people saw it this way. ;-)

william.bondy

How many computer users do you have? How many servers? I really don't mind Ma and Pa shops using Linux, but by that second year, or the first year you make money, start budgeting for MS. You don't want your IT guy to run your business. If your Linux guy leaves, what kind of shape will you be in? Make sure they have lots of docs on everything they did. When I have to send in a Linux guy to clean up another Linux guy's mess, it is very expensive and nerve-wracking.

tbmay

An even bigger question I'd have for people is why it's so important for them to see a widescale adoption of Linux on desktops. If I can get the job done and save the client money by deploying open source software, I do so. This is usually in the network device/server roles that come up though. The end users don't know what's running these platforms and don't care. Desktop adoption is a completely different matter. These folks have businesses to run and don't have time to learn a new way of computing. This doesn't even touch on the fact that most of them have special apps they depend on for their businesses that are obviously written for Windows. Wine? Sure I use it on my own workstations but anyone who uses it knows it's not even close to 100% and the ONLY way I'd think about suggesting it for clients is after substantial testing of each and every app they would use. Of course, I don't work for free and this would not be cost effective for the clients. Long story short, I don't see the Windows desktop disappearing in businesses any time soon and I don't have any agenda to change that. If a change starts naturally happening, it would be very good for my business but I'm not holding my breath.

brian

Getting a Linux box to play nice with an Active Directory network is possible, but hairy and unpleasant, and takes a lot of RTFM. And then once connected, you have a hard time getting any benefit. I had a Linux box and a WinXP box both connected to a small Active Directory network (around 300 machines). After the initial connection, I touched the Linux box only to run some bash script that couldn't be written in Batch language on XP. The command-line tools for Active Directory administration and Samba share access on Linux are hairy and unpleasant, while the same tasks in a Windows command prompt are simple and easy, allowing for almost anything an admin would want to do, remotely, with scripted automation, without touching VB. I dearly missed the ability to nest commands with backticks, but that was it, and I could usually use a second batch call to replace that functionality.
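For what it's worth, the backtick nesting mentioned above looks like this in bash (a generic illustration; smbd is just an example process):

# feed one command's output straight into another
kill `pidof smbd`

The batch stand-in described is the for /f capture idiom, e.g. for /f "delims=" %i in ('somecommand') do echo %i, which works but needs a loop to do what one pair of backticks does.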

pgit

I've used Linux on many machines going on ten years now. I've never been locked out of any task, in fact I found I can do a lot more with Linux than with windows. (primarily because to do 'the same' in windows you have to pay for software) BTW check out Mandriva 2009.1, they have done it again. This is the greatest Linux desktop ever, seriously. I have it running hardware that never worked in Linux before. Wireless is a billion times better than in windows. 2009.1 picks up a couple old pcmcia wifi adapters I have right out of the box, just click to connect. I can't get either of these to work at all in XP. (one of my tasks today for a client) I have the alleged drivers, but they just won't recognize. I thank Linus, really can't thank him and like minded developers enough. Hey Steve; "DEVELOPERS!! DEVELOPERS..." ey?

fatedtodie

I recently decided to look into learning more about Linux just because I got bored, and after installing about 7 different flavors and sub-flavors of Linux I can tell you why I stick with Windows. Rather than looking at it religiously, it comes down to what it would take for me to do everything I want to do on my computer. My Zune is not compatible with Linux, so... that is a negative. My games are not compatible with Linux, so... that is a negative. Internet works, so that is a push. Startup time is about the same, another push. Music plays, another push. So far (albeit from limited use) I don't see what I GAIN from going Linux. Also, given that I have several Windows licenses from hardware purchases and computer builds, I don't see the "free" aspect as a gain. I do feel some Linux users treat the OS like a religion, and view Jobs and Gates as their version of the antichrist. I never felt that way.

lastchip

If Ma and Pa shops are already using Linux successfully, I can't imagine why they would want to make a retrograde move back to the treadmill that is Windows. In terms of stability, security and now usability, there are clear advantages to using a Linux platform. Not to mention cost savings that are so important in the present economic turmoil. I accept that for medium and large companies, making the change is a major undertaking, but for small businesses, it makes perfect sense.

The problem is, take away a GUI from 90% of Windows admins and they are lost. You can't blame that on Linux; it is just a symptom of a dumbed-down system. Having the choice between a command line in Linux and endless drop-downs in Group Policy, I know which I'd choose.

As far as I'm concerned, Windows is history! And not a bad history either; after all, it gave computing to the masses, but it's had its day and there are better options now available. Producing an ever more bloated system, requiring ever more powerful hardware, is not a sensible or sustainable route for the future. It's a pointless exercise, when most small businesses are only using a fraction of the computing power available. Microsoft keeps attempting to reinvent the wheel, and so in a sense is the victim of its own success. You only have to look at the resistance to Vista to see that in action. Vista gave no clear or convincing reason to upgrade from XP and frankly, nor will Windows 7 or whatever it turns out to be called. A pretty desktop is not a good reason to spend corporate money!

I think IT pros sometimes forget an operating system is just an interface between the user and the hardware. It makes little difference what that interface is, providing it is easy to use. Computers in business are simply another tool to produce a result. Why use an earthmover, when a shovel will do the same job, and in many cases quicker, with less disruption!

RipVan

...IT guys already DO run the business. And far too many are "test passers" who bring a nice certificate that lines the pocket of Bill Gates, but the person is basically clueless. Another person has to come along and clean the guy's mess. Yes, his "MS mess." And don't use the term "budgeting for MS," but the more appropriate "budget for the MS treadmill." Most of the people in computers I know are very nice people who try hard, but couldn't work in Linux if they tried. They fake their way through Windows as it is. They don't know any better, their customers don't know any better, and they don't necessarily think Windows is the "best" way, they just can't do anything any other way. The true computer junkies I know do both, and while they may be better at one than the other, they recognize that the unwashed masses are hooked on inferior computing, and it pays the bills, so that's just the way it is.

Neon Samurai

Enough market share to convince software and hardware developers to provide the minimal support needed would be enough for me. The market is there but it's not "retail" measurable so you can't convince the specialty hardware and game publishers to support it.

pgit

Yes, of all the interoperability issues, Active Directory is the bugaboo. A major PITA.

Neon Samurai

.. ah.. another excuse to cut a VM and have a look. Maybe it's ready to move me off 2008.1, but I'm still tentative about KDE4. Also, for old hardware, keep your 2008.1 liveCD and installs around. I had a notebook that needed a temporary *nix OS, but 2009.0's KDE4 ate it alive; it was unusable until I grabbed my 2008.1 disk. But that was a less mature KDE, so maybe it's improved. Bwahahahahaha.. it's ISO download night for me

brian

Like Linus said, the variety ends up being a problem that Linux hasn't solved. My experience with Linux has been very positive when it only has to do very basic things or a very limited number of things. Whenever I've tried to use it as an everyday OS, it's ended up blowing itself up and rendering my machine useless. It could of course be brought back, but only after a multi-week research and troubleshooting expedition. Whenever I try to install it on a laptop, it again takes weeks of troubleshooting and research, modifying text config files and the like, usually to get the screen properly recognized, because apparently X is more sensitive to EDIDs than any other OS. The longest I ever kept it stable as a working OS was seven months or so, and then the video drivers suddenly stopped working on a security update. I had spent two weeks trying to install them, so I pretty much gave up. Makes me laugh a bit when people insist it's more stable than Windows.

On the other hand, I maintain a Linux box sitting at home as an MP3 jukebox and it's only blown up and needed a reinstall once in the last 2 years. (Again on an automatic security update...) Other than that, it handles everything an MP3 kiosk should do really well, hosts a couple of servers, and can burn disks for the car or dump a special selection to a thumbdrive really easily.

I think they could make great strides if someone would take the time to clean up the software repositories (I'm on Debian) so that new users have an easier time figuring out which of the 50 apps written for a task is the right one to install. ("Whoops, no, that one doesn't run under KDE. Oh, that one's a console app. Oh, that one hasn't been worked on for 12 years. Oh, there we go, K3B. Why didn't I immediately associate that name with CD burning? How silly of me.") Yeah, I know Ubuntu installs K3B by default now. IMO that's the source of its success; it filters out the crap and spoonfeeds some basic functionality. Extrapolate the above to other things you would do with a computer, and then picture going through the trial and error for everything you do that isn't default. It makes a mess of your system, and gets old quick!

pgit

...on what you use a computer for. I manage systems with mine, do a lot of support work like troubleshooting network issues, do security auditing, etc. I never play games and don't have a 'pod of any kind (though a lot do work with Linux). It really depends on what you want/need to do with a computer. Linux does it all for me; MS doesn't, without shelling out $$$ (or even $$$$). In fact, of the dozens of folks I've migrated to Linux, the only reason any of them keep a Windows instance around is games (though in a couple of cases it's proprietary hardware support). I agree the fanboi 'which is better' argument is a waste of my time. Each system is what it is, does what it does, and that's that. Though I do find Steve Ballmer rather entertaining..

jlwallen

i have to say that i am not one of those users who looks at bill gates in a negative way. how can you when the man is probably the biggest philanthropist on the planet? he really is a good man. as far as your zune is concerned - i can't really comment on that hardware because i've never tried one. i hope they are better than the Apple version of the same device. but at least the apple version (minus the iPhone and iTouch) can be used with Linux. i can't say i can relate to the games issue. when i play games i am either at a table top or a console. i spend so much time at my PC i don't want to add to that by playing games. my PC is for work. my play station is for play. ;-) of course when diablo III comes out - i might have to break down and actually boot my laptop into vista to play it. ;-)

robsku

as it only takes:

/etc/init.d/httpd restart

That will take a fraction of your 10 seconds ;) Anyway, it's not in that kind of ultra-simple task that shell commands really rock - there is more to system administration than just restarting servers. I've read a book about command-line Windows administration, and even with Windows and its dreaded cmd.exe it's way faster to use the cmd line than the mouse - you're just being ignorant; the more complex the job, the more unnecessary time it takes via mouse. Even M$ knows this; that is why WinXP and the Win Server editions, even with just cmd.exe, are fully administrable via the cmd line if you know how. Another proof is M$ making PowerShell - they did not make it just for fun. That is a tool for serious, effective system administration that had to be made because cmd.exe, even if way faster than the mouse, still had a lot of ground to cover to reach the level of efficiency provided by shells like bash on *nix systems. You said you prefer the GUI and that is great... for you.
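Where the shell really pulls ahead is anything repetitive; a rough sketch of the kind of one-liner with no practical GUI equivalent (the hostnames, user and group are made-up examples, and it assumes working ssh access as root):

# add the deploy user to the adm group on each web host in one pass
for h in web01 web02 web03; do ssh "$h" 'usermod -aG adm deploy'; done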

Neon Samurai

In terms of "install only what is needed", the insecurity in GNOME would be having it installed when a GUI desktop is not required. It adds more libraries and layers, increasing the risk that there is a bug on the machine and that the bug is exploitable in some way. With a desktop, the cases where a GUI is not relevant are far fewer, so it comes down to choosing one that fits your needs without opening the system up to exploitation. My personal solution is to boot to a text login and start a GUI manually when required; typing "startx" is preferable to always being dumped into a GUI by default. It's the same with me and Backtrack; why load a GUI if all I'm doing is popping up a kismet or airodump scan? That's generic for any GUI layer though, including the webmin browser interface. Linux/A,M,P/X/Gnome or Linux/A,M,P/Webmin are heavier than Linux/A,M,P. You also have the potential for open ports (X if allowed by the admin, and https 10000 for webmin by default). With Windows, the exploitable ports are not so much related to the GUI directly, but you get other things more nefarious than resource usage; things like getting a browser open on otherwise locked-down, non-browsing public terminals. The original poster may have GNOME-specific bugs in mind though, so hopefully they respond.
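A quick way to check whether those extra layers are actually exposing anything on the network is to look for listeners; an illustrative check using the conventional ports mentioned above (X on 6000, webmin on 10000):

# list listening TCP sockets and pick out X or webmin (run as root for process names)
netstat -tlnp | grep -E ':(6000|10000) '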

Slayer_

You said GUI, not Windows. You said GUIs are insecure. Also, I'm not a fanboy. I pretty much hate everything Microsoft EXCEPT the original MS Photo Editor, which is actually pretty handy; I've missed it since it was removed in Office 2003 for some new BS version that makes even resizing images a chore.

Neon Samurai

It's not as nice as a true link (ln) but I shortcut the heck out of the desktop. There is still clicking afterward but a desktop shortcut can at least replace the first few clicks with a keyboard command. Still not as nice as a "nano /etc/network/interfaces && /etc/init.d/network restart" or "sed 's/something/somethingelse/g' vhostA.conf > vhostB.conf && /etc/init.d/apache reload". Also not as nice as having human readable configs editable by any text editor or specialized utilities. But, there are ways to make Windows more efficient without reducing security.

Neon Samurai

If both hands are on the keyboard, I can crank out commands in detail very quickly. One command line entry to go to a config file, change a setting and restart the service.

Consider a word processor with a mouse. Both hands on the keyboard cranking out a letter to Aunt Emma. Now stop and reach for the mouse to highlight and bold some text. Stop and reach for the mouse to add italics. Drag the scroll bar up to review something at the top. Select the print button; crap, defaults were not correct. Select File -> Print, click on settings and correct the printer.

Now a word processor by keyboard only: type, type, type again, cranking out Aunt Emma's letter. Keyboard command to select the last word; ctrl+b to bold. No break in typing speed, or a noticeably smaller one. Hit end, hold shift, hit home; ctrl+i for italics and we're still moving along. ctrl+home to review from the top. Alt+f, p, and the printer prompt is up as fast as typing "The". Alt+n, arrow down to select the desired printer and off it goes.

There are times when the mouse is the more efficient way to interact with the machine, but those times are greatly overstated by most I've met. For me personally, I click through Windows where it makes more sense to do so. I enjoy Mandriva's GUI draktools and webmin, but Debian's well laid out config files, not locked into some format accessible only to a GUI utility, blow the others away. It's great that Windows has the admin console that can connect to other networked machines as easily as the localhost (what encryption does that use over the network?). I'd just love to see that same admin flexibility from the command line without reason to touch a GUI+mouse (I believe their new admin shell add-on is a big step in that direction also).

lastchip

You've got to get to that GPO first through a number of drop-downs, and that takes longer than 10 seconds. I'm talking about starting from a logged-in status. I'm at the command line and you on your desktop. You've got to go through a series of click, click, click to even get to the GPO. At least, that's my experience with the Windows server set-ups I've seen. Or perhaps you have something different.

lastchip

Read more of Chad Perrin's and Michael Kassner's excellent articles if you want to learn about security. The sheer number of malware and viruses written for Windows says a lot, and the fact that many of those are now initiated via a browser doesn't need much further explanation. It is clear you are a Windows fanboy and so be it. It makes no difference to me. I have my views and you have yours; nothing wrong with that. Just don't expect me to agree.

william.bondy

Well, you're very mistaken, my friend. In 10 seconds I can stop and restart the WWW service with one right click, and that is 10 seconds faster than your Apache, hehehe. GPOs are very easy to use, and MS even puts in a sentence to explain what each GPO means; even a Linux guy could figure that out. How can you say the cmd line is faster than the mouse??? It would be like bringing a knife to a gun fight. You said it: you prefer the cmd line, and that is great... for you

Slayer_

I am guessing you don't know of the server manager Apache comes with, which literally has a "Restart" button that restarts it with a single click. The entire process takes 3 clicks: a double click on the task tray icon, and a single click on the button. And since when does having a GUI increase security risks?

lastchip

"So you'd rather use a tool that requires **** loads of manual reading"

Do you not have to learn (one way or another) to use Group Policy? Group Policy is not always intuitive, in spite of what Microsoft would have you believe.

"and lots of guess work and trial and error, over a tool that clearly explains what everything does, allows you to turn it on and off with a click of the button"

I'm not quite sure where that's coming from, and it rather proves the point I made in my post, that Microsoft admins are lost when you take away a GUI. Many administrative functions in Linux can be implemented with a single command on one line; further, it is often possible to "string" commands together to create a series of required changes. Just for example, I can stop and restart my Apache server to implement changes in around 20 seconds (limited by my typing speed). Try and do that with Windows server; you've got no chance.

"and allows you to quickly load a web browser to figure out options you don't understand rather than having to first start a GUI session and dropping what you're doing..."

There is a difference in philosophy between us here. In my view, a GUI has no place on a server. It simply offers more potential security issues than if it doesn't exist. The old adage of getting rid of everything you don't need on a server has never been more pertinent. If you're referring to a desktop, then there are many graphical tools available now that make Linux administration very similar to Windows.

So in conclusion and in the spirit of clarification: yes, I do prefer the command line every time. It's quicker and more efficient, but like everything else in life, you have to make time to learn something new.
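As a concrete sketch of that stringing (the init script name and path are assumptions that vary by distro):

# test the config first; the restart only runs if the check passes
apachectl configtest && /etc/init.d/apache2 restart

Chaining with && means a broken config never takes the server down, which is exactly the kind of guard a series of clicks doesn't give you.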

Slayer_

So you'd rather use a tool that requires shit loads of manual reading, and lots of guess work and trial and error, over a tool that clearly explains what everything does, allows you to turn it on and off with a click of the button, and allows you to quickly load a web browser to figure out options you don't understand rather than having to first start a GUI session and dropping what you're doing... Just checking if this is really what you meant...

Neon Samurai

For me with Mandriva, the #.0 releases are more of an RC1 kind of thing. Since 2007, I've held out for the #.1 versions, letting the distro build mature a little. The Free and Live torrents just finished. Time to cut a disk and reboot the notebook. The Live version won't replace Backtrack as my go-to run-from-CD system, but I always keep 2008.1, and now 2009.1, liveCDs handy.

pgit

I had three older laptops yesterday that 2009.1 (or SUSE 11, for that matter) just wouldn't boot on. They got a 2008.1 install that went flawlessly. But on newer hardware 2009.1 is utterly incredible. I wouldn't compare it to 2009.0, rather to 2008.1. 2009.0 was garbage and I was contemplating moving to another distro, after being with Mandriva since Mandrake 7.0. But 2009.1 is night and day, and KDE 4 is stable and useful. Not everything I used in KDE 3 has been ported yet, but enough of it has been, and there are alternatives to the rest. You are going to be shocked at how smooth 2009.1 is; it really is the best Linux system I've ever seen.

BTW, I happen to be running 2008.1 on my main machine. My primary desktop doesn't like 2009; the video is unsupported (even on 2008.1 I have to tinker with xorg.conf to get the resolution to stick, an old Intel chipset). But this Dell Vostro laptop (currently 2008.1) runs the 2009.1 One live CD perfectly; stuff that doesn't work in 2008 is working on an unpatched live CD! The doinky Broadcom wireless works out of the box. Took me forever to get it working in 2008.1.

Neon Samurai

With a tv now in the same room, I've not been motivated to muck with the tuner card. I think it's time to have another go though. Thanks for the link, I've added it to my notes.

robsku

...I got this dvb-t tuner for free from a friend who didn't need it anymore and I never even checked what brand it is - just plugged it into a PCI slot, booted up my debian, scanned the channels and it "just worked". Now that you asked, I ran "lspci" to get the brand and it listed the following three controllers for that card:

00:0a.0 Multimedia video controller: Conexant CX23880/1/2/3 PCI Video and Audio Decoder (rev 05)
00:0a.2 Multimedia controller: Conexant CX23880/1/2/3 PCI Video and Audio Decoder [MPEG Port] (rev 05)
00:0a.4 Multimedia controller: Conexant CX23880/1/2/3 PCI Video and Audio Decoder [IR Port] (rev 05)

...apparently these also use a chip from Hauppauge (or something like that), as with "lspci -v" I got more detailed information and each of the three entries also had the following output:

Subsystem: Hauppauge computer works Inc. Device 9002

(the last number was 9002 for the first entry and 909 for the two others). Supposedly this is not exactly a new product but it works fine :) And yes, it's a coax cable feed in. Here's some technical information I just googled: http://www.conexant.com/products/entry.jsp?id=107

Neon Samurai

Offhand, what's your tv tuner if it's a coax cable feed in? I'm mucking with a hauppauge board under another distro.

robsku

...I too use Debian, but my system is totally for all-purpose use... Not only does it do games, work, etc., it also works as my multimedia station, including replacing the television. The longest time I've had it run without a reboot is probably 3 months, and the last time I had to reinstall a Linux system because I trashed it (*I* did, totally my own fault) was with Red Hat 7.1 in 2003. I've never had my debian nor my fedora trash itself after security (or other) updates.

Neon Samurai

I'd honestly like to know the technical limitations of providing a firmware flash and generic driver interface. Especially with flash memory showing higher read speeds than platter drives. Maybe there is a reason it can't be done but if it's a business decision only, they should really re-evaluate it.

Slayer_

Why don't we just ask nVidia why they do this? I think you should ask though as you can explain it better than I can.

Neon Samurai

Update drivers: download, run setup.exe, answer the prompted questions as applicable, reboot. Update firmware: download, run setup.exe, answer the prompted questions as applicable, reboot. So, what is the problem with providing a generic driver interface and managing the secret sauce in firmware? I'm not seeing how a firmware update would be any more complicated than a driver update. Actually, it should be a little smoother, since you don't have as many variables as a driver wrapped in the OS would have.

What makes this better is that a generic driver interface can be supported across multiple platforms, outside the manufacturer's budgets and limited development team. They can provide interface specs freely without fearing some second-place copycat. Actually, you could return to true plug and play, since a generic interface could be supported much more easily across hardware components in addition to the software side. Firmware is then separate from the OS platform, since managing a flasher utility across platforms is much simpler than managing hardware drivers. I download the newest firmware, I flash my router and it's good after a reboot; simple. I download the newest firmware, I flash my N810 from Windows, Linux, BSD or osX; not limited by platform. I download the newest firmware, I flash my GPU board and reboot (maybe the board just restarts itself); it does not add administration complexity while adding full support across platforms.

Your example is based on the current model: firmware binary included in the driver for limited platforms. I'm still not understanding why it would be so hard to remove the driver layer and stuff the firmware onto a flash chip on the board. They can still add into the firmware. At worst, they add another hook to the generic interface to support the software PhysX; existing support remains in place and the platforms add support for the PhysX extension at no cost to Nvidia.

Let's look at a different example. DD-WRT is fully in firmware memory. If your suggestion is that the only way to improve or add to the drivers is by keeping them in the OS, outside of firmware flash memory, how on earth does DD-WRT manage to add functionality? How is it possible that I can flash a base firmware image, then add functions like VPN, VOIP or gaming support by, say, flashing the applicable larger firmware over top? Why would it be so hard for component vendors to drop a 128 or 256 meg flash chip on the board and have at it? Are router and other appliance manufacturers doing some kind of bizarre magic?

Ok, without analogies: why would it be so difficult to make your improvements and additions in firmware, leaving the OS/driver side to add its own support as the generic interface expands? Is it limited flash memory to expand within? Is there some way that firmware would function against the onboard GPU that reduces processing efficiency? What technical reason is there that this could not be so?

mechanicalmen

Next some clown will come on and tell you that all you have to do is bash/dumbass/moron and there you are. But they don't get that productivity is murdered in the process of having to stop and fix the OS. Now send in the clowns.

Slayer_

Oops, made an off-by-one error in the drivers and it costs 10 frames per second. Do we give users the ability to flash their video card BIOS, or just give them new drivers? As flashing can destroy your video card, drivers seem like a better solution. And don't forget it's not just drivers; nVidia's control panel kicks the crap out of ATI's control panel as far as flexibility goes. As long as ATI cannot copy this, nVidia has a competitive advantage. This is just an example. I have been updating my drivers as often as I can, trying to get them to fix bugs; if it was just a generic interface, then there would be no hope unless they kept flashing the BIOS. And then how is that any better? Here is a sample from nVidia's website for the latest driver version.

~~~~~~~~~
New in Release 182 Drivers:
* Boosts performance in several 3D applications. The following are examples of improvements measured with Release 182 WHQL drivers vs. Release 181.22 WHQL drivers (results will vary depending on your GPU, system configuration, and game settings):
  o Up to 8% performance increase in Fallout 3 at high resolution and AA.
  o Up to 10% performance increase in F.E.A.R. 2: Project Origin
  o Up to 9% performance increase in Half-Life 2 at high resolution with AA.
  o Up to 11% performance increase in Left 4 Dead at high resolution with AA.
  o Up to 10% performance increase in Race Driver: GRID at high resolution and AA.
* Includes full support for OpenGL 3.0 on GeForce 8-series, 9-series, and 200-series GPUs.
* Automatically installs the new PhysX System Software version 9.09.0203.
~~~~~~~~~~~~~~~~~~~

So: new PhysX, can't do that via a generic interface. New OpenGL support, again, needs more than a generic driver. Performance increases? Did they improve code efficiency? That would be a bitch if it was part of the firmware.

Neon Samurai

A range of prices and qualities is not the problem. We're also not talking broad product categories here. In terms of GPUs, you're down to ATI and Nvidia, and the change from chip generation to generation is exponential. As it is now, they are already hopscotching each other per generation: Nvidia had the high testing scores, then ATI.. Can you clarify why it would be so hard for either hardware manufacturer to provide the closed bits in a firmware image and a generic driver interface that all platforms could support?

A driver, in general, is the bit of code that enables the OS to speak with hardware. There is no reason for that translation codec to be closed. The company decides that sales of a piece of hardware are too low though it's still widely in use; production and driver development end, and everybody who has the hardware is out in the cold. There is no technical reason why a bit of hardware can't work across multiple platforms, but instead the vendor limits sales by developing for only one.. maybe two if you're lucky. Their area of expertise is the hardware design; they would only gain more budget for further design work by allowing others the information to write the platform support. If the software market was as healthy as other markets then I might feel differently. As it is, I'm a consumer that has to survive buying from a broken, lopsided market that limits my decisions for reasons that are not technological at all.

Slayer_

I don't know if you have any sales background, but I don't believe
~~~~~~~~~~~~
"Now, the usual concern is: but what if our competition gets the information and uses it for their own products. And the response: if your competition is waiting for your products to hit shelves so they can copy them, then they are not real competition. They could take that extra budget used for OS support software (drivers) and focus it on developing more innovative hardware; the company and the end users would win."
~~~~~~~~~~~~~~~
to be true. If this were the case, there would be no cheap store-brand groceries, we would not have cars in China that only cost 4k, we would not have a million different tractors, saws, hammers, etc. that all do the same thing but all cost different prices. One made the original, the rest copied. It's not about who is first anymore, but who is cheapest. And often, being the second person to invent something allows you to produce it cheaper.

Neon Samurai

Mandriva, formerly Mandrake, was one of the first distributions to focus on the desktop and ease of use. It may not work for you, but it's worth a look before deciding. I'd suggest looking at the GNOME and KDE versions of the Mandriva One liveCD. This will give you an idea of the default software selection and how it supports your hardware. If you like it enough, you can easily install it. If you want a more customized install, use the liveCD for testing and the Mandriva Free install DVD for the actual build. If you use Mandriva Free you may want to add the PLF repositories alongside the standard repositories, but that is your choice, as some programs may be patent-questionable. If you really like it, you could always buy Mandriva Powerpack, which gives you all the media codecs and other bits where a license fee is applicable.

My feeling on Debian is that it would make a fantastic primary distribution if the wifi and GPU drivers were covered in the main repositories. Now, I understand the reasoning for not including them. They want stability and source availability, which makes sense: if they can't fix the code, they are limited to waiting for a budget-restricted proprietary dev team to do it. As a server, the only thing that could replace Debian for me is one of the BSDs. If you're not going BSD for your server then it's Debian, unless you're paying for Red Hat or Suse.

Here are some reasons though. It's not the arrogance of demanding a closed source developer release their source. It's that closed source is not appropriate in some cases: drivers. The bit of programming between the OS and the hardware; there is no reason for that to be closed source. Nvidia has patents they don't own; not a problem, put the patented crap in firmware and provide a generic driver interface. A driver enables hardware support. If a driver is broken and the interface specs are not available, you're screwed. Why must hardware be bound to a single branding of software? That's madness. (In Nvidia's defense, I hear that a number of the developers also work on the open source drivers after hours.)

In the case of Adobe, they are victims of their own success. It's rare to find a website that doesn't overuse Flash media; with power comes responsibility. That media format limits 90% of the internet to their file format; provide the reader. Now, they do provide the reader in 32bit format, but I can't understand why it's so hard to recompile that same source against the 64bit libraries. When the rest of the Linux base platforms went to 64bit, that's what it took: recompile the software against 64bit.lib instead of 32bit.lib (liberties taken with file names obviously). Why is there a 64bit Adobe Flash player for Windows and osX yet no recompile for Linux and the BSDs?

In short, if a company can go it alone and provide a better bit of software then fantastic; I'm the end user, so I benefit from that. The issue is where the company does not provide a better solution than can be provided by volunteer developers. Open your driver interface specs and suddenly your product is supported across multiple platforms at no cost to your expense centers. Your market of potential buyers expands without any marketing cost to you. Now, the usual concern is: but what if our competition gets the information and uses it for their own products. And the response: if your competition is waiting for your products to hit shelves so they can copy them, then they are not real competition. They could take that extra budget used for OS support software (drivers) and focus it on developing more innovative hardware; the company and the end users would win.

But as an end user, your concerns are getting things to work for you. Stick with the 32bit Mandriva install and your youtube will work fine. If you install the 64bit Mandriva then use the 32bit browser, and I hear Flash works fine. GPU and wifi drivers shouldn't be an issue. In the case of ATI, the community drivers worked better at the time. In the case of Nvidia, the closed binary drivers cover my needs. Don't be afraid to experiment a bit, but stick within the distro and PLF repositories until you're comfortable with the platform.

Last, distros are not all equal. Like anything, it takes some shopping around to find what covers your needs. If Debian hasn't covered what you want then look at other distributions. If you're open to different platforms then keep exploring; the worst that can happen is an understanding of wider options. If Windows or osX ends up covering your needs, then at least you explored the options and made an informed decision. I know one platform doesn't cover all of my own needs. I was back and forth between Mandriva and Windows for years until I was equally comfortable on both platforms.
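For anyone following the PLF suggestion above, the repositories are added with urpmi.addmedia; a rough sketch (the mirror URL is a placeholder, and easyurpmi.zarb.org will generate the real lines for your version and architecture):

# add PLF free and non-free media (URL below is a placeholder, not a real mirror)
urpmi.addmedia plf-free http://plf.mirror.example/mandriva/free/2009.1/i586
urpmi.addmedia plf-nonfree http://plf.mirror.example/mandriva/non-free/2009.1/i586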

brian

I'll give Mandriva a try. My Debian was the Testing one, because I needed some of the new things it had and didn't want to try and add them myself (start with a "stable" and do too much to it and, well, you might as well be testing...) To some extent I've been hurting myself by not wanting to relearn a new distro after I got used to Debian... I do agree their policies make things harder though I didn't know other distros handled it better. I remember when they messed a bunch of people up by yanking a bunch of network card support, because they were worried that the binary code that needed to be sent to the network cards on boot didn't count as open source. And to this day they continue to turn up their noses at professionally-written free utilities direct from their source (nvidia drivers, adobe reader I think?) because they don't like that the owning corporation doesn't want to give them the source code. (Seems a bit high of an expectation to me and they should be happy the company is writing a Linux variant and handing it to them for free...)

Neon Samurai

Debian is a distribution I've not tried to run my ATI or Nvidia through. The focus on free code keeps the GPU support out of the main repositories, and Mandriva continues to cover my needs against bare metal. Debian's reasons are political, not technical, but they benefit the distribution's overall goals. If it was Debian Stable (was v4, now v5) then you're not getting the latest software and drivers, but you are getting the most stable, plus security updates for those programs. If it was Debian Testing (now v6?) then this is normally recommended for desktops due to newer drivers and software, but with less stability as a result. If it was Debian Unstable (sid, now v7?) then there should be no surprises that it had issues, as that is the development version trying new programs and processes. If Debian included the closed network card and GPU drivers directly, it would be my main distribution in a second.

Ubuntu I can't talk about in too much detail, as I haven't looked at it recently. There is nothing compelling enough to replace my existing preferred desktop distro. Ubuntu actually has poorer hardware support and admin tools than my preferred one, though its popularity and polish may be a little higher. Both are major distributions, but I wouldn't consider them universally indicative of the other major distributions available.

I'm still on Mandriva 2008.1, since it covers my needs, continues to get updates and uses KDE3 by default. Mandriva 2009.1 just finished downloading, so it's time to play with the liveCD now that it has had time to mature further beyond 2009.0. With Mandriva, I found the community ATI driver much better than the ATI proprietary one at that time. During install it asks which you'd like to use (again, any time you go into the X server settings utility), and when choosing the community driver, frame rates and stability were much better. I'm now on an nVidia GPU and the closed binary does all I need of it.

I can see how a kernel update can hose the system though; in your case, because the kernel drivers are not properly allowed to be part of the kernel (where hardware support belongs). Again, my favorite branding manages kernel updates smoothly, so I've not hosed X because of it. The one time I broke my boot loader with a kernel update, it was an easy matter of grabbing the Supergrub liveCD, booting the system and going into Mandriva's boot loader tool; it fixed the issue automatically and life went on happily.

If you're not soured by the experience, I'd suggest the usual suspects: Mandriva, PCLinuxOS, openSuse, maybe even ELive. Debian and Ubuntu are closer to macintosh apples vs macintosh apples dipped in chocolate; you may want to go right outside the Debian family and forks. I can't answer specific details for Ubuntu, but if Mandriva gives you grief, I'm happy to try and answer any questions.

brian

Ubuntu has hosed itself twice, Debian once. Both times it was video driver related I think. I've been sticking to repositories. The only really funky thing I installed was X11VNC on the MP3 kiosk, which worked for a long time before the machine had issues. The MP3 kiosk was Debian, can't remember version, whichever was recent in July 2007. It became unstable at boot after a security update. About seven out of eight boot attempts, randomly, would hard-lock the machine, I believe on load of the ATI drivers. (older card, All-In-Wonder 8500.) After boot it would run stable until the next power outage. Hosed that when I got time and went to Kubuntu 8.10 I believe (when KDE 4 came out.) Running OK now but no longer can put the LCD monitor to sleep. It now blacks the screen but leaves the backlight on at all times. So I'll have to reinstall eventually again. That's been the only stable system. My laptop I gave up on. It took forever to get graphics working because of funky EDID responses from the laptop screen. Once I got it working, it ran great for six or seven months. I was using it for OpenOffice, Inkscape, and some basic screenwriting software. No CD or DVD stuff even. Died on a security update, I think because the kernel got updated and the nvidia drivers had to be reinstalled. I found a workaround where after Xorg had failed to load, I could switch to a console, create a soft link to the driver's actual location in the location Xorg had decided to look for it, and restart gnome. Ran like that for a while but after I was done with the screenwriting software I didn't have much reason to keep loading Linux in that state, and I wasn't up for a reinstall. I forget the circumstances of the other Ubuntu crash. I tried loading it onto a newer laptop but the install CD (which was verified) hard-locked the system on GUI load. Had to send that laptop back anyway, tried it on my current one, and the install CD can only figure out how to do 640x480, so I didn't even try. I recognize the distance Linux has come, but my experience has been bad. Probably due to trying to use it on laptops, and most definitely due to my need to do visual type work. If I were using it to, say, use a text editor to write code, or console around through network devices, it would be fabulously suited for that.
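A sketch of the soft-link workaround described above; the module paths are guesses that vary by driver version and distro, so treat it as illustrative only:

# from a console after X fails: put the driver module where Xorg is looking,
# then restart the display manager (both paths below are assumptions)
ln -s /usr/lib/xorg/modules/extensions/nvidia/libglx.so /usr/lib/xorg/modules/extensions/libglx.so
/etc/init.d/gdm restart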

Neon Samurai

That kind of instability is odd. What distros have eaten your machine, and what have you been doing to them? Have you been staying in the repositories or adding any tar.gz or unsupported package you can find? I'm going on a few years now with a heavily used machine running rock solid. VMs cover my needs for testing and tinkering before my live system gets anything, and that includes my Win systems. I've seen installs eat themselves before too though, so I'm just curious as to why it's so unstable in your case.

john3347

I can't believe I have read this statement.

RipVan

I stayed at a Holiday Inn Express last week. I did hear something there that made sense. Back when Bill's antitrust case was being prepared, everyone laughed and said that he was stupid to have made all that money and to have not bought any politicians with it. That, they said, guaranteed him a courtroom beatdown. His wife told him not to hoard all the money. I think that is when he started to give some to charity. I'm just guessing, we didn't discuss that at the HIE.

Neon Samurai

I don't think he's simply walking about Africa throwing money at problems, but with his resources, even that alone would make a noticeable humanitarian difference. I'll save comparing his work to those who started philanthropy much earlier in their wealth accumulation; better late than never, if it truly helps.

georgebdaggett

I cannot see into his heart, but I don't think Bill Gates is the greatest philanthropist. Your average Joe spends the first 50 years of his life pursuing success. Once he obtains it, he spends the remainder of his life pursuing purpose (or significance). It happens to almost everyone, including Bill Gates. He just happens to have the resources to do it in a bigger way than most. I am not against him in any way. I am going thru the same thing in my life right now. I find myself buying lunch for co-workers that I wouldn't have ten years ago!

pgit

I spit my coffee all over the keyboard when I read this: "...the man is probably the biggest philanthropist on the planet. he really is a good man." If you knew the nature of the federal reserve/IMF bankster scam, and the "tax exempt foundations" that are an integral part, you'd change that tune in a bazillionth of a second. Vaccines are 95% scam, too. The pharmaceutical/chemical industry that grew out of the global eugenics crowd, don't ya know...
