Justin James, who is now very bullish on Web applications, explains what you need to consider when deciding whether to write a desktop application or a Web application.
One of the biggest shifts in development in the last few years has been the move to Web applications. For a long time, developers resisted this move, and some of their reasons were good. For example, I argued for years that the Web model wasn't great: the UI capabilities weren't there without a ton of work, and the ability to do "real work" was lacking. Other reasons were not so good, and mostly boiled down to a refusal to learn something new.
I have recently become very bullish on Web applications, and I now highly recommend that you consider them over desktop applications in all but a few circumstances. When you are deciding between writing a desktop application or a Web application, here are the factors you should take into consideration.
When do you "have to" write a Web application?
Under certain conditions, you must write a Web application, regardless of what you would prefer to do, either for business or technical reasons. Here are some of those situations:
- Zero footprint installation (Flash and Silverlight work well here too)
- Software-as-a-Service (SaaS) billing model
- Cross-platform compatible (Yes, you can use one of the cross-platform widget libraries too with native code, but Web is much easier.)
- Server-centralized control of logic, resources (data, CPU, et al), etc.
- Public-facing application with highly sensitive data that needs to stay behind a firewall
- Needs to communicate between the client and a centralized server outside of the client's network
Note that some of these situations have exceptions. That's fine, but keep in mind that in these scenarios the alternatives are often no better than Web applications, and in many ways worse.
In large part, the growth in mobile devices was the beginning of the end for desktop applications. It used to be that you could count on targeting Windows and hitting 95% of the potential users. Now, so many people (especially consumers) do a significant share of their computing on mobile devices that it is hard to get complete coverage of the market with a native-only approach. To make matters worse, not only is the mobile market highly fragmented between the different mobile operating systems, but the #1 player in mobile right now (Android) is highly fragmented within itself. If you need full coverage of the market without learning enough native development to cover mobile well, Web is the way to go.
Windows 8 is the straw that broke the camel's back for me. Pre-Windows 8, I would say that you shouldn't write a Web application unless you had to. Despite the boom in mobile, that alone is not enough to tip the scales for many applications; in my opinion, Windows 8 is.
With Windows 8 looming, the time is right to reconsider native applications entirely. Instead of asking, "do I need to write a Web application?" you should be asking, "do I need to write a native application?" Why? Simply put, with the changes coming in Windows 8, if you want to write applications that require heavy input and the other hallmarks of "real work," the Metro UI is not great for it; and if you don't want to use Metro, you are locked out of the ARM devices. To put it another way: unless you are writing an application that works well as a touch application, a Windows 8 native application is a bad idea, and a "legacy" Windows application locks you into the existing Windows market, which will only get smaller over time.
Do I need to write a native application?
That's a good question! Let's look at the common-wisdom answer. Typically, I would say "yes" if any of the following are true:
- Needs to be able to work offline as well as online
- Requires substantial access to the local system's resources, particularly CPU/RAM
- Performance considerations
- Heavy graphics work
- Integration with other systems
- Desktop-quality UI widgets
Up until HTML5, I would have called that a really good list. HTML5, though, addresses these concerns in a way that makes it even harder to justify native applications. Let's look at that list again:
- Needs to be able to work offline as well as online: HTML5's local storage capabilities provide enough client-side storage to cache data offline and sync it when a connection returns.
- Requires substantial access to the local system's resources, particularly CPU/RAM: Web Workers give you a development model for doing heavy computation in the background without hogging the system or blocking the browser.
- Heavy graphics work: The <canvas> tag allows bitmap graphics manipulation.
- Integration with other systems: WebSocket allows full-duplex, persistent connections.
- Desktop-quality UI widgets: Improvements to the <input> element and the addition of the <menu> element allow for a much richer UI experience; jQuery has continued to fill in the gaps.
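To make the offline-storage point concrete, here is a minimal sketch of the kind of queue-and-sync pattern HTML5's local storage enables. The function names and the queue key are hypothetical, and a plain object stands in for the browser's localStorage so the sketch is self-contained:

```javascript
// Stand-in for window.localStorage so this sketch runs outside a browser;
// in a real Web application you would use localStorage directly.
const localStorage = {
  _data: {},
  setItem(key, value) { this._data[key] = String(value); },
  getItem(key) { return key in this._data ? this._data[key] : null; },
  removeItem(key) { delete this._data[key]; },
};

const QUEUE_KEY = "pendingEdits"; // hypothetical storage key

// Record an edit locally; it persists until the server is reachable.
function queueEdit(edit) {
  const pending = JSON.parse(localStorage.getItem(QUEUE_KEY) || "[]");
  pending.push(edit);
  localStorage.setItem(QUEUE_KEY, JSON.stringify(pending));
}

// When connectivity returns, replay the queued edits and clear the queue.
// `sendToServer` is a placeholder for a real XMLHttpRequest/fetch call.
function syncPending(sendToServer) {
  const pending = JSON.parse(localStorage.getItem(QUEUE_KEY) || "[]");
  pending.forEach(sendToServer);
  localStorage.removeItem(QUEUE_KEY);
  return pending.length;
}
```

In a real application you would call queueEdit while `navigator.onLine` is false and trigger syncPending from the browser's `online` event.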
As you can see, HTML5 offers compelling reasons to re-evaluate the decision to go Web or stay native. Indeed, other than certain specialized or niche scenarios (like integrating directly with a "legacy" Windows application), it is very difficult for me to recommend going native for typical consumer or line-of-business applications going forward.