I appreciate hearing your opinion of what happened in the early PC days, but I would also like some kind of reference for the command set. I like to form my own opinion of these things from source material.
Many of the devices you referenced (hard drives) just work with any OS. I have never seen a hard drive, mouse, or keyboard that said "requires Windows XX". Where I have seen this is with modems and some video capture cards. Those devices rely on functionality provided by Windows that would have been more expensive for manufacturers to build into the device itself. To put it plainly, third-party manufacturers chose the cheap route to make money, and that choice caused incompatibilities with other OSes.
You said "automatically recognize the basic level actions of standard components and peripherals". That is basic-level functionality, and, again, most people won't settle for basic-level functionality. The ITU cannot keep up with the pace of technology, so there is no way to have a standard method of exposing brand-new capabilities (Draft-N WAPs, etc.). Those new technologies and capabilities are exactly what differentiate products, so companies either innovate, die, or scrape by on very small margins copying other companies.
"it is this introduced changes from the standards that means software designed to work on Win XP does not just load and work on Win 7 or Win 8" I can give you lots of examples where this is not true; several are listed in this comment area. There are also a number of examples where it is true, but the same can be said for Macs and Linux distributions. Developers make improvements to the kernel or support libraries, and that can break applications that rely on the old behavior. Again, this is true for all OSes, not just Windows.
"old software designed for use on Unix and Linux in the late 1990s runs perfectly well on the very latest versions of Unix and Linux because the command sets are to the industry standards" So you are telling me that every version of every application will run on the most current version of Linux? Many Linux commands have not changed in the RESULT they give you in 20 years, but they have been continuously developed to add functionality (vi, Vim: I lose track of all the different names and versions of that application, because distributions usually install a "vi" alias that points at whatever the current version is). All you have to do is look at a change log to disprove this theory, and this very issue is the basis of package managers such as aptitude. Aptitude helps you RESOLVE dependency issues, i.e. incompatibilities. If every Linux and Unix application just worked, you wouldn't need it. So now I see you falling back to "core" functionality. You could argue the same for Windows: starting in DOS 3 or 4 you could run "dir /p" and it would show you the file listing one page at a time. You can issue that exact same command on Windows 8 and get the exact same RESULT.
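The "vi" alias point is easy to see for yourself: on most distributions, the command name stays stable while the binary behind it keeps changing, because "vi" is just a symlink (or alternatives entry) resolved at run time. Here is a minimal sketch of that mechanism using a throwaway directory and hypothetical file names, not any real distribution's layout:

```shell
#!/bin/sh
# Build a toy version of the distro pattern: a stable command name ("vi")
# that is really a symlink to whatever the current editor binary is.
tmp=$(mktemp -d)

# Stand-in for the actual editor binary (hypothetical name/content).
printf '#!/bin/sh\necho "this is the current vim build"\n' > "$tmp/vim.basic"
chmod +x "$tmp/vim.basic"

# The "vi" alias users actually type; swapping the link target upgrades
# the editor without changing the command anyone invokes.
ln -s "$tmp/vim.basic" "$tmp/vi"

readlink -f "$tmp/vi"   # resolves the alias to the real binary's path
"$tmp/vi"               # same command name, whatever binary is current

rm -rf "$tmp"
```

On a real system you can run `readlink -f "$(command -v vi)"` to see which binary your distribution's "vi" currently points at.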
Every OS changes and evolves, and that causes application issues that companies have to fix by modifying their applications. This is not unique to MS, and claiming they do it just to make money is short-sighted. To be honest, MS has at times gone too far to keep applications running properly. There are cases where they customized how the OS handles a particular application in a non-standard way just to avoid compatibility breakage, such as (if I remember correctly) SimCity. I also know the Samba team complained about Excel making some odd, non-standard calls to network shares that Windows would nonetheless handle properly. MS isn't perfect, and I complain regularly about .NET versions, but saying every incompatibility is just a money-making scheme is short-sighted and wrong. On .NET: when it came out, I remember it being touted as a cross-Windows-version compatibility library, but it soon became tied to OS versions, which threw out the compatibility promise.
There are a lot of Linux applications that have been ported to Windows but don't retain all the functionality of the Linux version. Nmap is one example: Windows limits the number of threads an application can spawn, so Nmap runs slower on Windows than on Linux. The example just illustrates that there are fundamental differences between OSes that are often hard (i.e. expensive) to overcome. It does not always make business sense to port or build an application for another platform. I would like to see MS Office on Linux, but I can't see MS making money off porting it. It would lose functionality (VB scripting, which is an integral part of its capability), and many Linux users would be unwilling to spend money on it when (insert favorite office suite here) works just fine.