What will it take to get developers to stop trying to replicate desktop functionality in Web apps and start using presentation layer abstraction systems?
One of the persistent "Holy Grails" in the world of computing is abstracting the presentation layer of applications. The current incarnation of this quest has the HTTP protocol as its common theme. Let's take a look at this history and see what the chances are that the current generation of tools will manage to replace both Web application and native application development.
When HTTP and HTML were developed, no one envisioned that either would be used as a mechanism for providing application functionality. At best, it was acknowledged that someone might use HTTP to download a compiled application or some source code linked from an HTML document. A few years later, the CGI system showed that some read/write functionality was possible with HTML and HTTP, and the first generation of Web apps (rudimentary Web mail, forums, shopping carts, search engines, etc.) came about. For the most part, these Web apps were not created to replace desktop apps or even to act as presentation abstractions -- they were about leveraging the Web's connectivity.
But in the mid 1990s, someone had the bright idea of writing small applications in the Java language and embedding these applets into Web pages. This gave Web developers the ability to put desktop-style functionality into Web pages. Microsoft quickly followed suit with its "me-too technology" of ActiveX. ActiveX was stillborn due to security issues (the only people who wrote ActiveX components and put them in Web pages were hackers looking to take over systems). Java applets gained a little bit of traction but not much. Applets primarily seem to crop up in intranets (so do ActiveX objects for that matter). Around the same time, Macromedia introduced Shockwave and Flash, but back then, people only used them to make goofy animations and cute little games.
Microsoft and Macromedia (since bought by Adobe) are back in the game with Silverlight and AIR, respectively. What is interesting about both of these technologies is that, while they can be embedded into Web pages, they look to break free of the applet paradigm. Instead, each is trying to be a runtime environment in its own right, replacing both desktop applications and Web applications. Both runtimes communicate directly with the application server and the local system at the same time, bypassing the Web browser. "Applications" are delivered as documents to be loaded into the runtime, which is effectively the application at the OS level. Because the "application" targets this runtime, not an operating system, cross-platform issues such as inconsistent widget rendering are not supposed to be a problem.
I actually believe these claims at a technical level (more or less), and yet these platforms will most likely fail to gain significant market share despite the wide publicity. Why? Let's set the technical issues aside for a moment and pretend that Silverlight will be perfectly cross-platform capable (it's not, but it's a lot better than you would expect from Microsoft). Let's further extend this fantasy by imagining that it won't cost developers a cent to get the tools to develop for these platforms, and that applications can be deployed without buying any server software (again, not happening; Adobe and Microsoft like to make money selling pickaxes, not by finding gold nuggets). Take this just-so story back in time about 10 years, and you get... Java applications.
That's right, Java applications. Not applets to be run in the browser, but plain old desktop applications running within the JVM and using Swing/AWT. When was the last time you saw one of these outside of the Oracle installer? It's been a while, hasn't it? It's not as if they are easy to miss, either: they use non-standard GUI widgets, so you would know if you had used one. Overall, except for a few select purposes (such as internally written and deployed applications and small utility applications), they don't really exist in the wild. Java applications have the technical ability to do what Silverlight and AIR claim to do, and they have mature, free development tools available (Eclipse, for example). And most of the cross-platform, performance, and stability kinks have been worked out.
Nevertheless, Java applications are not a "has been" -- they are staunchly a "never was." If Java couldn't make it and still can't make it, I see little reason to think that either Silverlight or AIR will make it either. Unfortunately, the goal is a good one. Developers need to be able to deploy their applications to a variety of targets without a recompile. And Web applications will always take far too much effort to reliably handle common business problems, such as data concurrency, unless HTTP suddenly becomes a stateful protocol that maintains a persistent connection when idle, and HTML suddenly morphs into a great UI specification language.
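The data concurrency problem is worth a concrete illustration. Because each HTTP request stands alone, a Web app cannot hold a record lock open while a user edits a form; the usual workaround is optimistic concurrency, where each record carries a version number that is checked at save time. Here is a minimal sketch of that pattern in Java -- the `OptimisticStore` and `Record` classes are hypothetical, written purely for illustration, not taken from any real framework:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of optimistic concurrency: every record carries a
// version number, and a save only succeeds if the caller's copy is
// still current. (All names here are illustrative, not a real API.)
public class OptimisticStore {
    static class Record {
        String value;
        int version;
        Record(String value, int version) { this.value = value; this.version = version; }
    }

    private final Map<String, Record> table = new HashMap<>();

    public void put(String key, String value) {
        table.put(key, new Record(value, 1));
    }

    // A read returns a copy, including the version the client must
    // echo back on save -- the only "state" carried between requests.
    public Record read(String key) {
        Record r = table.get(key);
        return new Record(r.value, r.version);
    }

    // Save succeeds only if nobody else updated the record in between;
    // otherwise the caller must re-read and merge, just as a Web app
    // must do between stateless HTTP requests.
    public synchronized boolean save(String key, String newValue, int expectedVersion) {
        Record r = table.get(key);
        if (r.version != expectedVersion) {
            return false; // stale edit: someone else saved first
        }
        r.value = newValue;
        r.version++;
        return true;
    }

    public static void main(String[] args) {
        OptimisticStore store = new OptimisticStore();
        store.put("invoice-42", "draft");

        Record alice = store.read("invoice-42"); // both users load version 1
        Record bob = store.read("invoice-42");

        System.out.println(store.save("invoice-42", "approved", alice.version)); // first writer wins
        System.out.println(store.save("invoice-42", "rejected", bob.version));   // stale copy is rejected
    }
}
```

A desktop app talking over a persistent connection could simply hold a lock instead; it is the stateless request/response model that forces this extra bookkeeping onto every Web form that edits shared data.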
Since neither of those pre-conditions will occur in my lifetime, the only legitimate route is something like Java + Swing/AWT, Silverlight, or AIR. None of them seem to be big-time successes, so I am quite curious about what it will take to get developers to stop trying to replicate desktop functionality in Web apps and start using presentation layer abstraction systems instead.
J.Ja

Disclosure of Justin's industry affiliations: Justin James has a working arrangement with Microsoft to write an article for MSDN Magazine.