Developers can write an application in today's environments without a care about the hardware used and with virtually no regard to the underlying operating system. Time will tell whether this will render the platform wars irrelevant.
At this point, it seems like developers are using native code for low-level items (such as OS code and device drivers), certain high-performance tasks, and applications so large that they are bloated even in native code (office suites come to mind). There is an odd little side effect to this trend: languages and the code written for them are divorced from the hardware.
I think this is a very good thing. With the pace of change within the industry at this point, it is extremely difficult for anyone to know enough about their project and the systems that they are targeting -- let alone how to leverage those systems well. I believe that the demand for programmers outstripped the supply of "chicken-bone-waving wizards" sometime in the late 1980s. Most of us have heard the old legend of the "real programmer" who optimized his code based on the drum memory's rotational position. Those "real programmers" still exist; they are getting paid big bucks to work on "black art" projects such as device drivers, OS-level thread schedulers, and so on. These programmers are not writing typical business applications.
For those of us working on a typical business application, I think you will find that the code never does anything where the CPU, storage system, RAM layout, etc. actually matters. We are working on things like, "get the data from the database and put it on the screen." If we concerned ourselves with hardware-specific items, we would never finish projects. In fact, it is getting to the point where even the OS-specific items are being lost in the shuffle. For example, I think most application developers would rather put configuration items in a local XML file, an INI file, or a database than in the Windows registry.
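To illustrate the point, here is a minimal sketch of OS-agnostic configuration using Python's standard configparser module; the file name (app.ini) and the section and key names are hypothetical examples, not anything from a real application:

```python
# Reading settings from a plain INI file instead of an OS-specific
# store like the Windows registry. The same code runs unchanged on
# Windows, Linux, or macOS.
import configparser

# Hypothetical example settings; in practice this text would live in
# a file such as "app.ini" and be loaded with config.read("app.ini").
INI_TEXT = """
[database]
host = db.example.com
port = 5432
"""

config = configparser.ConfigParser()
config.read_string(INI_TEXT)

host = config.get("database", "host")
port = config.getint("database", "port")  # parsed as an integer
print(host, port)
```

Because the configuration travels with the application as an ordinary file, it can be versioned, diffed, and deployed like any other artifact, which is part of why this approach has displaced registry-style stores for most business applications.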
Developers in RIA situations don't have access to the OS-specific items and, from the applications I've seen, their applications are not suffering much for it. Luckily, OSs converged a while ago on more or less the same UI principles -- windows, buttons, and other widgets are all standardized. And, of course, the RIA framework has the option of translating your code to the OS's native widgets or of implementing its own (like the JVM does). Regardless, it is entirely possible to write an application in today's environments without a care about the hardware used and with virtually no regard to the underlying operating system. It will be interesting to see if this renders the "Platform Wars" irrelevant, or if the few but important native code applications (office suites being a major example) will be able to dictate the platform choice even when it does not matter to a growing number of applications. Only time will tell.
J.Ja

Disclosure of Justin's industry affiliations: Justin James has a working arrangement with Microsoft to write an article for MSDN Magazine. He also has a contract with Spiceworks to write product buying guides.