UPDATE (2/28/2006): I’ve posted a follow-up to this article that presents positive ideas on how to change this situation.
Last night I read a great article (http://www.veen.com/jeff/archives/000622.html) from about 16 months ago about how lousy most open source CMS (Content Management System) packages are. While focused on open source, the author noted numerous times in the article and follow-up comments that his complaints also apply to commercial CMS’s.
Sadly, all of his complaints are still true, and they apply not just to open source (or closed source) CMS’s, but to about 90% of the web applications out there.
The simple truth of the matter is that web developers generally stink, not just as programmers, but as user interface engineers.
Over the past year and a half or so, I have spent countless hours installing, trying, and uninstalling literally dozens of open source CMSs, without once finding one that works right, if at all. The best one out there, for my needs, was WebGUI. Too bad it broke the moment I tried to upgrade it, Apache, mod_perl, Perl, or just about any other dependency it had!
Ten years after the Web revolution began in earnest, I still find myself using systems that are not much better than the ones I was using ten years ago. Part of the problem is the continual state of change within the Web development world. Every time a new language, framework, web server, or technique (like AJAX) starts to gain momentum, all development on existing systems seems to halt, and everyone decides to redo everything in the new system. By the time the new systems are about as good as the old ones, another technique or language comes along. By the time the server-side Java and ASP web apps got to be as good as the CGI/Perl they were replacing, .Net and PHP came out. Now that .Net and PHP apps are getting as good as the Java and ASP pages they replaced, .Net 2.0 and AJAX are suddenly the rage.
The fact of the matter is, if all of that time had been spent making things work in CGI/Perl (or whatever system had come first), I might have a chance of finding a quality web application.
AJAX is the current fad. It seems to be predicated on the notion that since Google Maps is so good, and it uses AJAX, AJAX should be used everywhere. Here’s the truth:
- Anything you do with AJAX also has to be written server-side, because otherwise your application will not gracefully degrade on a browser without JavaScript, or with JavaScript turned off (a progressive-enhancement sketch follows this list).
- The tools to write and debug JavaScript, ten years or so after it came out, are atrocious.
- Different Web browsers still do not agree 100% on how to render HTML and CSS, nor do they implement JavaScript identically, forcing you either to stick with HTML/CSS/JavaScript that renders and executes identically (or at least “well enough”) everywhere, or to cut yourself off from a significant portion of your visitors.
- People are using XML for something it simply was not designed to do. XML was designed to be used (in conjunction with XSLT, XSD, and UDDI) so that systems could automatically discover and use each other. Thus, XML is written for the “lowest common denominator”, which makes it extremely wasteful in terms of the resources needed to create it, transmit it, and consume it. People are using XML to pass data back and forth between different parts of their code (server-side to client-side, and vice versa) because it is quick and easy for them to code that way. The fact is, their applications are significantly slower, both server-side and client-side, than they need to be because of this. It is much quicker, if you know the data format, to pass fixed-length or delimited data back and forth, and nearly as easy to write the code. But because programmers are lazy, they would rather save 30–45 minutes of code writing at the expense of creating scalability problems (just compare the file size and parsing time of XML vs. CSV to get an idea of what I mean; a sketch of the comparison follows this list).
- JavaScript interpreters are incredibly slow. I recently worked on a Web application where the customer wanted some custom validation done client-side on a 3,000-record, 3-field data set. The web browser would lock up for a minute on this. Actually sending the data to the server and handling it there was faster by a factor of about 10. I don’t consider that “progress”.
- HTTP is a connectionless, stateless protocol. In other words, it is pretty much useless without a zillion hacks laid on top of it to accomplish what a desktop application can do with minimal, if any, coding. Look at the design of application servers. They need hundreds of thousands of lines of code, and basically all they do is receive GET/POST/etc. data, pass the appropriate information to an interpreter or compiled software, then return the results. In the process, they perform validation, maintain connection state (via cookies or session IDs in the URL, both of which are hacks, when you think about it), and so forth. This is utterly ridiculous. Desktop and server software is using Kerberos and other hardened authentication management systems, while Web applications are sending plain text, occasionally protected with SSL. Is this really the best we can do?
- Web developers are still clueless about interface design. Half of the problem is that a Web developer is frequently forced to work with some sort of graphic designer who was brought up in the print world. Sure, bandwidth is cheap now. But with all of the hoops an application server jumps through to process each request, a significant portion of an application’s response time depends on how fast the Web server can build up and tear down each connection. An application that increases the number of HTTP requests, regardless of how small they are, is an application that won’t scale well. AJAX goes from “make an HTTP transaction with each form submission” to “make an HTTP request with nearly every mouse click”. I don’t call this a “Good Thing”; I call it stupidity. AJAX multiplies, quite significantly, the amount of data going to and from the servers, switches, routers, load balancers, the whole architecture. All in the name of “improving the user experience.” At the end of the day, “user experience” is determined less by what “gadgets” are in the software and more by how well the user can accomplish their goals. A slow application doesn’t “work”, as far as the user is concerned, regardless of its features.
- AJAX “breaks” the user’s expected browsing experience. All of those cute XMLHttpRequest() calls don’t show up in the user’s browser history, which means the “Back” and “Forward” buttons don’t work (a partial workaround is sketched after this list). If you’re providing the user with an interface that looks like a normal Web page, with “Submit” buttons and so forth, breaking the browser’s interface paradigm is a decidedly bad idea.
- And last but not least, JavaScript (like Java applets and Flash, for that matter) does not get local disk access. Its only recourse, if it wants to save work in progress, is to periodically submit that work to the web server.
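To illustrate the graceful-degradation point above, here is a minimal progressive-enhancement sketch. The /search URL, the markup, and the “results” element are all hypothetical, and IE of this era needs an ActiveXObject fallback instead of the bare XMLHttpRequest constructor. The plain form works on its own with JavaScript disabled; the script merely intercepts it when JavaScript is available:

    <!-- A plain form that works with JavaScript disabled: the browser
         does a normal submission and the server renders a full page. -->
    <form id="search" action="/search" method="get">
      <input type="text" name="q">
      <input type="submit" value="Search">
    </form>
    <div id="results"></div>

    <script type="text/javascript">
    // When JavaScript is available, intercept the submission and fetch
    // the results asynchronously instead. The form above still works
    // on its own if this script never runs.
    document.getElementById("search").onsubmit = function () {
      var q = this.elements["q"].value;
      var xhr = new XMLHttpRequest();
      xhr.open("GET", "/search?q=" + encodeURIComponent(q), true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState == 4 && xhr.status == 200) {
          document.getElementById("results").innerHTML = xhr.responseText;
        }
      };
      xhr.send(null);
      return false; // cancel the normal full-page submission
    };
    </script>

Note that the server ends up doing the work twice: it has to render a full page for the no-script case and a fragment for the script case.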
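To put the XML-versus-delimited point in concrete terms, here is a rough sketch with made-up records. The exact ratio depends on the data, but the pattern holds: the XML carries its markup and field names on every single record, and parsing it means invoking a full parser and walking a tree, while the delimited form is a couple of string splits:

    // The same three-field record set, serialized two ways.
    var xmlPayload =
      "<records>" +
        "<record><id>1</id><name>Smith</name><total>42.50</total></record>" +
        "<record><id>2</id><name>Jones</name><total>17.00</total></record>" +
      "</records>";

    var csvPayload =
      "1,Smith,42.50\n" +
      "2,Jones,17.00";

    // The XML version is several times the size of the delimited one,
    // because the markup rides along with every record.
    alert("XML: " + xmlPayload.length + " chars; CSV: " + csvPayload.length + " chars");

    // Parsing the delimited form is two string splits:
    var rows = csvPayload.split("\n");
    for (var i = 0; i < rows.length; i++) {
      var fields = rows[i].split(","); // [id, name, total]
    }

    // Parsing the XML means running a full parser and walking the DOM
    // (DOMParser here; IE of this era uses an ActiveX equivalent):
    var doc = new DOMParser().parseFromString(xmlPayload, "text/xml");
    var records = doc.getElementsByTagName("record");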
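As for the broken Back button, one partial workaround of this era is to record each AJAX view change in the URL fragment and poll for changes. Browser support for this trick varies, and showPage() here is a hypothetical function that fetches and renders a view:

    // Record each AJAX view change in the URL fragment so it lands in
    // the browser history (support for this varies by browser).
    function goToPage(n) {
      window.location.hash = "page" + n;
      showPage(n); // hypothetical: fetches and renders the requested view
    }

    // Poll the fragment so Back/Forward restore the matching view.
    var lastHash = window.location.hash;
    setInterval(function () {
      if (window.location.hash != lastHash) {
        lastHash = window.location.hash;
        var n = parseInt(lastHash.replace("#page", ""), 10);
        if (!isNaN(n)) showPage(n);
      }
    }, 200);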
Has anyone actually tried to make something basic even work right? It sure doesn’t seem like it. TechRepublic’s blog system is a great example of how even basic JavaScript can create a lousy user experience. With every keystroke (and mouse click within the editor), it re-parses the entire blog article. It also refreshes the simple buttons at the top. In other words, with each key I press, it is making 14 (yes, FOURTEEN) [correction: 28!] connections to a web server. That is patently ridiculous. This should be re-written in Flash, or dumbed down so it doesn’t need to do this (a sketch of the timer-based fix follows).
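“Dumbing it down” does not take much. Here is a sketch of the timer trick, where sendPreviewRequest() is a hypothetical stand-in for whatever re-parsing or server calls the editor performs, and the half-second delay is an arbitrary choice:

    // Instead of doing work on every keystroke, wait until the user
    // has paused typing before re-parsing or contacting the server.
    var pendingTimer = null;

    function onEditorKeyUp() {
      if (pendingTimer) clearTimeout(pendingTimer);
      pendingTimer = setTimeout(function () {
        pendingTimer = null;
        sendPreviewRequest(); // hypothetical: one request per pause, not per key
      }, 500); // half a second of silence before doing anything
    }

Wiring it up is one line (editor.onkeyup = onEditorKeyUp;, where editor is the textarea element), and the dozens of connections per keystroke collapse into one per pause.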
In fact, just about the only AJAX applications [addendum: I’m talking about just the AJAX portion of the functionality; I’m not particularly impressed by Google Maps’ results] I have seen worth using are Google Maps and Outlook Web Access. The rest seem to make my life more frustrating than whatever problem they set out to solve.
Google’s success is a great example of just how lousy web applications are. Google Maps took off like a rocket because its competitors, despite having a five- or six-year (or more) head start, had wretched interfaces. MapQuest didn’t seem to have become any more usable since Day 1. Yahoo was making changes, but came out with them after Google Maps did. Same thing for search. Sure, Google’s results were (and still are, though less and less so) better than its competitors’. But its interface is a joy to use [addendum: this is rapidly changing as Google becomes more of a portal]. GMail’s biggest “feature” isn’t even its interface; it is the amount of storage space. All of a sudden, you can use Web-based email with many of the benefits of a traditional POP3 client. Outside of the storage capacity, GMail wasn’t much different from Hotmail or Yahoo Mail.
Google cleans up because it finds markets where the current leaders have a great idea, maybe even great technology, but provide a lousy user experience anyway. The fact that Google can break into an extremely mature market and blow it wide open is proof that Web applications, by and large, stink: even with five or ten years of market domination, the original players still provide a lousy customer experience.
And at the end of the day, even the most basic network-aware desktop application is easier, faster, more secure, blah blah blah, better in every measurable way than the best Web application.
J.Ja