General discussion

  • Creator
    Topic
  • #2180167

    Critical Thinking

    Locked

    by justin james ·

    blog root

All Comments

  • Author
    Replies
    • #3117655

      Intel Macs, what’s the big deal?

      by justin james ·

      In reply to Critical Thinking

      Maybe I’m just being cynical, but I fail to see why Apple’s decision to move to the Intel architecture is such a big deal. Let’s get real here, folks: it’s simply a change in CPUs.

      OK, so maybe there’s more to it. So the Macs may drop a touch in price, and/or get a bit faster. Personally, I’ve been wanting to get a Mac for a while, ever since I put together a server on BSD. I loved BSD’s reliability and speed, especially compared to the Windows 2003 Enterprise server that it replaced. Put simply, BSD rocks, and the idea of using a computer built on BSD, plus a usable GUI (call me crazy, but I think X Windows, at least in every X environment I’ve worked in, is total garbage to deal with, just a fancy way of handling multiple shell sessions) would be really nice. When Apple announced the Mac mini, I was extremely happy, and started saving my pennies. Sure, I could get a pretty decent PC for the same money, but I’d still have the same PC problems. And frankly, the perceived lack of software for Macs doesn’t bother me too much, because my home PC doesn’t do anything that I can’t do on a Mac or a BSD system. So I was already prepared to take the Mac plunge. I’ve even downloaded installers for all of my common software, ready to FTP them over the moment I get it plugged in.

      Am I holding off on getting that Mac mini because I want Intel? Heck no. I’m holding off for the same reason my home PC is an Athlon 1900+ with 256 MB RAM: I have higher financial priorities at the moment. But I’m looking forward to it.

      When Apple announced that they would be switching to Intel chips, everyone made such a big deal about it. I didn’t see the cause for the hoopla then, nor do I see it now. OK, maybe IBM didn’t have the roadmap for the G5 that Apple wanted, and maybe they were having some bad karma with Motorola. But at the end of the day, none of my Mac-using friends ever complained about having a slow Mac; they are all delighted with their machines. The current G5 chips are good enough for now, and moving to Intel is a simple business decision to keep the future as bright as the present.

      So I started asking around, trying to find out what the big deal is. My Mac friends could not care less. As long as they’re using Mac OSX, they don’t care if there are ferrets running inside the box delivering pieces of paper with ones and zeros written on them to each other. They just love the OS. My PC friends are all delighted because they have these grandiose dreams of dual booting.

      Sorry folks, I’ve been down the dual boot route. NT4 & Windows 95. OS/2 Warp and Windows 3.1. Windows 98 & BeOS (yes, I tried BeOS, loved it to death; no applications or drivers, sad to say). Indeed, BeOS was originally designed for the PowerPC, then ported over to x86 when Be couldn’t sell any of their boxes. I did Windows 98 and Windows 2000 for a while too. But at the end of the day, I always despised dual booting. Life is always more miserable when you dual boot. Many advantages of each OS are tied to the file system. NTFS is the cornerstone of NT/XP’s security system. HPFS was integral to OS/2 Warp. And I’m sure that HFS+ plays a large role in OSX’s capabilities. Yes, I’m aware that many of the OSes I’ve listed can read NTFS. But they can’t write to NTFS. No Microsoft OS reads or writes anything other than FAT16/32 and NTFS (actually, NT 4 may have been able to handle HPFS, if memory serves). The point is, you’re going to end up sticking a giant FAT32 partition somewhere in your system for your common data files, plus two more system partitions, one NTFS and one HFS+. And to be honest, I 100% hate that idea. I jump through hoops to have only one volume mounted in my system; I don’t like dealing with drive letters (I know that MacOSX doesn’t use drive letters). It’s a pain in the rear to have to figure out which directory a file goes into, based not just upon its contents, but also upon which directory or volume has space remaining.

      Plus, dual booting is a huge waste of time and interrupts my workflow. People buy faster computer parts because they don’t want to wait thirty seconds to two minutes for an application to start. But with dual booting, this is exactly what you’ll be doing. Need an application that runs on the OS you’re not currently working in? Well, you get to stop EVERYTHING you are doing and reboot. Heaven help you if you miss the boot loader and end up in the wrong OS. Furthermore, isn’t one of the reasons we like our newer operating systems that we have to boot less often? Every new version of Windows certainly advertises this as a selling point. No one enjoys having to drop everything because something (or a crash) requires a reboot.

      OK, now there’s the hypervisor option (Microsoft’s Virtual PC, Xen, VMWare, etc.). Call me silly, call me crazy, but a modern OS sucks up a good amount of RAM and CPU time, even the more efficient ones like the *nixes. Unless all of the OSes you’re running use microkernels, or aren’t doing much of anything, you can count on having to double your RAM and increase your CPU requirement by at least 25% in order to have two OSes running on the same hardware simultaneously. And if you have one OS regularly running CPU-intensive activities while you work in the other, go ahead and increase that CPU power by 50% to 100%. Well, what if you are already running a top-of-the-line CPU, or close to it? I guess you’ll just have to suffer performance well below what a single OS would deliver on the same hardware. Not to mention hard drives. Unless you want to have two hard drives (in a hypervisor situation, where each OS has its own volume plus a shared volume for common data, I’d recommend three drives), each OS is going to be trying to read and write from opposite ends of the disk simultaneously. Hooray. Speed-wise, it’ll be like deliberately making myself use tons of swap file space. And we still haven’t overcome the file system issue. I hope you enjoy having all of your data readable by anyone who gets access to your system, because all of that shared data will be on a FAT32 partition. Not to mention that FAT32 is just about the least efficient file system out there, unless you count FAT16. Not to mention the lack of nifty features like NTFS’s built-in compression and encryption, or HFS+’s metadata support (a big selling point for MacOSX), journaling, hard and symbolic links, etc. Or alternatively, you could just have everything in NTFS, and the MacOSX system wouldn’t be able to write to it. Just what I’ve always wanted, a 250 GB CD-ROM disc.

      My other option (probably the best one, sad to say) is to have the common file system be in a native format for one of the two OSes, and then share it via SMB. Yucky, but at least both OSes will be able to read and write the data, and it keeps some of its native file system benefits. So let’s add up this hypervisor nonsense. You are almost doubling your hardware power to achieve the same results, and the only things you’re sharing are the peripherals and optical drives. At that point, doesn’t it almost make sense (except for power consumption) to get a separate Mac and PC with a KVM switch, from a cost standpoint? Going to the high end of the CPU chain (and the motherboard to support all of that RAM and the fancy CPU) is about as expensive as having the two machines sitting next to each other.
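      Putting that rule of thumb into numbers (a back-of-envelope sketch only, using my rough multipliers from above, not any actual benchmarks):

```python
# Back-of-envelope sizing for two OSes under one hypervisor, using the
# rough rule of thumb above: double the RAM, add 25% CPU for an idle
# second OS, and up to 100% CPU for a busy one. Illustrative only.

def dual_os_requirements(ram_mb, cpu_units, guest_busy=False):
    """Estimate the hardware needed to run two OSes at once."""
    ram_needed = ram_mb * 2                      # double the RAM
    cpu_factor = 2.0 if guest_busy else 1.25     # +100% busy, +25% idle
    return ram_needed, cpu_units * cpu_factor

# A box that is comfortable with 512 MB RAM and one "unit" of CPU:
print(dual_os_requirements(512, 1.0))        # idle guest: (1024, 1.25)
print(dual_os_requirements(512, 1.0, True))  # busy guest: (1024, 2.0)
```

      Even at the cheap end of the range, the hardware bill roughly doubles, which is the whole point of the comparison above.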

      Even with OSX running natively on Intel hardware, the applications are not running natively on Intel hardware. They are being translated on the fly, via “Rosetta”. I’m not a big fan of emulation; even when it isn’t buggy, it is still slow. Quite frankly, I’d rather have a Mac that is 10% slower on the hardware running native apps than a Mac running 10% faster while emulating another chip for 80% of my apps. That’s just common sense.

      Oh yeah, and there’s one more catch: MacOSX x86 will only run on Apple hardware. I’m sure that there will be XP drivers for this hardware soon enough; that’s not a concern. But do you honestly think that Apple will stop charging the “Apple Tax” just because they’ve switched to Intel? Sure, G5s are more expensive than x86 chips on a pound-for-pound basis, but not nearly by the same ratio that a Mac is more expensive than an equivalent machine. Compare the Mac mini to some of the low-priced options from Dell and eMachines/Gateway. The Mac mini costs about 20% more, and typically comes with fewer goodies. So yes, the price of a Mac will come down, but by what? $50? Maybe $100? It still puts a PowerMac or even an iBook out of the price range of mortal men. It makes the Mac mini and the eMac slightly more affordable, and that’s it.

      But all of my eagerly-waiting pals say, “But I won’t use Apple’s hardware; I’m sure someone will release a ‘patch’ to let me run MacOSX x86 on my existing hardware, and someone will have drivers.” Good luck, my friend. First of all, I’m not a big fan of ripping a company off. The profits that Apple makes from their overpriced hardware directly support their continued development of OSX. Deprive Apple of their R&D budget by not buying their hardware, and either the price of OSX goes up (the operating system that charges you for minor version upgrades as it is), or they put less money into developing it. Furthermore, if there is one thing I’m very particular about, it’s stuff like hacking the internals of my operating system and messing with my device drivers. This is the kind of thing that leads to OS instability. I’m not a big fan of OS instability; otherwise I might still be using Windows 98, which would run a heck of a lot faster than XP does for me. This is why I avoid third-party “system tweak tools” like the plague. This is why I don’t let spyware or rootkits on my system. This is why I don’t upgrade my drivers unless I’m actually having a problem, or unless the driver supports something I desperately need to do. This is why I avoid real-time virus scanning. I don’t do these things because an operating system is under enough stress as it is, without some bonehead messing with its internals. Furthermore, is someone who hacks an operating system up to make it run in violation of its license agreement someone I trust to give me an otherwise clean and unmodified OS? I think not. People downloading warez and MP3s through P2P services like Gnutella and BitTorrent are getting killed by viruses, spyware, and the like. Someone who goes through the effort of cracking an installer could just as easily throw something nasty in there for you as well. I would not trust my OS to come from such a source, and neither should you.

      So at the end of the day, where are we? Effectively using Mac OSX on the Intel architecture won’t be any different than it is today. It won’t be too much faster, if you want to use PC apps you should still have a PC sitting right next to it, and you’ll still be paying through the nose for Apple’s hardware. As excited as I am to get onto OSX as soon as my wallet allows, I don’t see how this gets me there any differently or faster.

      As a parting shot, to all of those who were actually surprised that Apple had an x86 version in the works, I simply point you to the “Ask the developers” page on Apple’s site (note the date when this page was put together: 2001). Also take a look at the source code tree. Darwin (the underlying OS) has been available on x86 architecture since Day 1. Sure, the GUI isn’t in there, but the OS itself is half the battle. Microsoft had a PPC version of NT starting with NT 3.51 and kept it through NT 4 (NT also ran on MIPS and Alpha!). The Xbox 360 today runs on a PPC chip; the original Xbox ran what is admittedly a modified version of Windows 2000. Every OS manufacturer out there keeps a separate port tree for CPUs they don’t officially support; it’s common practice, and a smart one too. It leaves their options open in a way that’s a lot cheaper than suddenly finding themselves without a CPU to build on anymore. Plus, it gives them leverage with the hardware folks.

      At the end of the day, no matter how I slice and dice it, I simply fail to see why OSX x86 is such a big deal. Yes, Intel chips are better on power usage, a win for the laptop users out there (BTW, has anyone noticed how battery technology lags so far behind power usage?). Yes, Intel has a better roadmap than IBM has for the G5 line. My heart rate might have gone up for a split second if Apple had announced that they were switching to AMD64 technology, but they aren’t. There is nothing to be worked up over here, and this is certainly not a world-changing event. If anyone can explain in a reasonable way why this is actually worth getting excited about, please let me know, and I’ll gladly concede defeat.

      • #3121676

        Intel Macs, what’s the big deal?

        by sbd ·

        In reply to Intel Macs, what’s the big deal?

        WOW!!!

      • #3127551

        Intel Macs, what’s the big deal?

        by salmonslayer ·

        In reply to Intel Macs, what’s the big deal?

        One word — KVM (okay, so actually it isn’t a word but an acronym)

        I have also gone down the dual-boot road, and ran into the same
        roadblocks.  For a while I had three OSs on my primary workstation
        (Linux, OS/2 and Windows 98).  I rarely ran OS/2 — I had some
        great graphics programs (and still think DeScribe is one of the best
        word processors out there) but it took too much time to shut down,
        restart, boot in another OS and then do the work.  OS/2 was the
        first to go.  I managed to get a second computer relatively cheap
        so now have two systems connected via a KVM switch.  One has
        Windows XP and one has Linux, and both are networked.  The best
        thing is that I can bounce back and forth with only a simple keystroke,
        can access files from either system, and generally use the best of both
        worlds.  This makes life considerably easier than the old days,
        and sometimes I really wonder why I bothered with dual-booting at all.

    • #3119931

      Thin Computing Is Rarely “Thin”

      by justin james ·

      In reply to Critical Thinking

      For some reason, a huge number of people on ZDNet, both in TalkBacks and in articles (Mr. Berlind, are you reading?) confuse a central file server with “thin computing”.

      Mr. Berlind’s example situation (a really bad one, at that) is simply using a web browser as a thick client application to access a central file server. Especially in the AJAX world, clients get thicker, not thinner! For example, I have been doing some writing on this website right here. Their blogging software puts such a heavy demand on my system at home that it was taking up to thirty seconds for text to appear on the screen. Each keystroke turned my cursor into the “arrow + hourglass” cursor. Granted, my computer at home is no screamer (Athlon 1900+, 256 MB RAM), but that is what AJAX does. It requires an extraordinarily thick client to run. If I compare that AJAXed system to, say, using MS Word (a “thick client” piece of software) or SSHing to a BSD box and running vi (a true “thin client” situation), the AJAX system comes out dead last in terms of CPU usage. And after my system does all of the processing for formatting and whatnot, what happens? It stores the data (via HTTP POST) to a central server, which stores that information somewhere, performs a bit of error checking, and then makes a few entries into a database table somewhere. If I compare the CPU usage on my system by the AJAX interface to the CPU usage on the server to file my entry, it sure doesn’t look like a “thin client/thick server” situation to me! It looks a heck of a lot closer to a “thick client/dumb file server” story.

      Some of the comments to this story are already making this mistake. “Oh, I like the idea of having all of my data on a central server.” So do I. This is how everyone except small businesses and home users has been doing it for decades. The fact that most business people stick all of their documents onto their local hard drives is due to a failure of the IT departments to do their jobs properly, for which they should be fired. Why are people in marketing sticking their data on the local drive instead of on the server? Anyone who saves data locally and loses it should be fired too, because if their IT department did the job properly, this would not be possible without some hacking going on. The correct setup (at least on a Windows network, which is what most people are using, even if it’s *nix with Samba on the backend) should be that there is a policy in place which sets “My Documents” to a network drive. This network drive gets backed up every night, blah blah blah. The only things that go onto the local drive should be the operating system, software that needs to be stored locally (or is too large to carry over the network in a timely fashion), and local settings. That’s it. And if someone’s PC blows up, you can just flash a new image onto a drive and they’re up and running again in a few minutes.

      Now, once we get away from standard IT best practices, what is “thin client computing”? We’re already storing our data on a dumb central server, but doing all of the processing locally. Is AJAX “thin computing”? Not really, since the client needs to be a lot thicker than the server. Is installing Office on a central computer but running it locally “thin computing”? Not at all. Yet people (Mr. Berlind, for starters) seem to think that storing a Java JAR file or two on a central server, downloading it “on demand”, and running it locally is thin computing.

      Thin computing does not occur until the vast majority of the processing occurs on the central server. That is it. Even browsing the web is not “thin computing”. Note that a dinky little Linux or BSD server can dole out hundreds or thousands of requests per minute. I challenge you to have your PC render hundreds or thousands of web pages per minute. Indeed, even a Windows 2003 server can process a hundred complex ASP.Net requests in the amount of time it takes one of its clients to render one of those requests on the screen. I don’t call that “thin computing”.

      Citrix is “thin computing”. Windows Terminal Services/Remote Desktop is “thin computing”. WYSE green screens are “thin computing”. X Terminals (if the “client” {remember, X Windows has “client” and “server” backwards} is not local) are “thin computing”. Note what each one of these systems has in common. They are focused on having the display rendered remotely, then transferred bit-by-bit to the client, which merely replicates what is rendered by the server. All the client does is transfer the user’s input directly to the server, which then sends the results of that input to the client, which renders the feedback as a bitmap or text. None of these systems requires any branching logic or computing or number crunching or whatever by the client. That is thin computing.
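      To make that division of labor concrete, here is a toy sketch (purely illustrative; no real terminal protocol like ICA, RDP, or X is anywhere near this simple): the server holds all state, runs all logic, and renders the display, while the client only forwards keystrokes and paints whatever comes back.

```python
# Toy model of a thin client: all logic and rendering on the server,
# the client just relays input and echoes the server's finished
# "screen". Purely illustrative; real protocols are far richer.

def server_handle(keystroke, buffer):
    """Server side: holds state, runs logic, renders the display."""
    buffer.append(keystroke)
    return "> " + "".join(buffer)    # the finished screen, ready to show

def thin_client(keystrokes):
    """Client side: forward each key, display the reply. No local logic."""
    buffer, screen = [], ""
    for key in keystrokes:
        screen = server_handle(key, buffer)   # round trip to the server
    return screen                             # just paint what came back

print(thin_client("ls"))   # prints "> ls", rendered entirely server-side
```

      The client function contains no branching logic about the content at all, which is exactly the property the systems above share.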

      Stop confusing a client/server network with “thin computing”. Just stop. Too many articles and comments that I have seen do this. They talk about the wonders of thin computing, like being able to have all of the data in a central repository to be backed up, or having all of my user settings in a central repository to follow me wherever I go, or whatever. I don’t see anyone describing anything as “thin computing” that doesn’t already exist in the current client/server model. The only thing that seems to be changing is that people are leaving protocols like NFS and SMB for protocols like HTTP. It’s a network protocol, folks. Wake up. It’s pretty irrelevant how the data gets transferred over the wire, or what metadata is attached, or whatever. It does not matter. All that matters is what portion of the processing occurs on the client versus the server, and in all of the situations people are listing, the client is still doing the heavy lifting.

      • #3119877

        Thin Computing Is Rarely

        by Jay Garmon ·

        In reply to Thin Computing Is Rarely “Thin”

        Preach on, brother!

      • #3131441

        Thin Computing Is Rarely

        by jdgeek ·

        In reply to Thin Computing Is Rarely “Thin”

        OK, can we use the term pseudo-thin without raising your ire?  There is certainly a noteworthy difference between a computing environment that relies on standards-compliant web clients as the one managed application versus having a separate client for each task. Maybe thin versus thick is not the best description of that difference.  Pseudo-thin may not be truly thin, but it is at least thinner on the administrative side, and arguably thinner on the client side.  It seems the real difference is in the network intelligence.  In pseudo-thin, not only the data but also the logic comes from the server.  In this way pseudo-thin is like truly thin, even if the interpretation and rendering take more processing horsepower on the client.

        Instead of belittling others, why don’t you suggest a new terminology?  You might go down in history as the guy who first identified the dumb thick client.  Dumb….thick… wait, I’ve got it!  It’s the Anna Nicole client.

      • #3131325

        Thin Computing Is Rarely

        by justin james ·

        In reply to Thin Computing Is Rarely “Thin”

        Sure, we can say “pseudo-thin”. 🙂 You make a very good point: with a web application, all of the logic is stored and maintained on the central server, even if the execution of the logic occurs on the client side. That is a large benefit of a web application (along with the fact that the application can be accessed from a wide variety of clients, although cross-platform compatibility is still a huge issue). You can also do the same thing, however, through a centrally managed “push” system like Microsoft Systems Management Server. To me, there really isn’t too much difference on a logical basis between an application whose installation gets pushed by a central server to a client and run from the client’s local system when needed, and an application that gets pulled by the client when wanted. They both have their advantages and disadvantages. An application push can send an incredibly rich, native application, and only needs to do it once. Once the initial push is over, the client doesn’t need to interact with that server anymore. A pull, on the other hand, forces the application to be relatively lightweight. Imagine trying to pull MS Word or OpenOffice down the pipe every time you want to use it… that would be pretty miserable. Sure, you could cache it, but at that point, the hardware resources needed for a push and a pull are the same, and the two become nearly indistinguishable in terms of functionality, except that with a cached pull model, the break in workflow occurs the first time you try to use the application, whereas a push model interrupts you (or hopefully, works in the background) at random times like bootup.
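        That push-versus-cached-pull equivalence can be sketched in a few lines (a toy model only; SMS and real web deployment are vastly more involved, and the names here are invented purely for illustration):

```python
# Toy model of "pull with a cache": only the first use transfers the
# application over the wire; every later run is local, just as in a
# push model. Names and structure are invented for illustration.

class AppServer:
    def __init__(self):
        self.apps = {"word": "word-binary-v1"}
        self.transfers = 0                 # count trips over the network

    def fetch(self, name):
        self.transfers += 1
        return self.apps[name]

def run_app(server, cache, name):
    if name not in cache:
        cache[name] = server.fetch(name)   # the one expensive pull
    return cache[name]                     # afterwards, same as a push

server, cache = AppServer(), {}
for _ in range(3):
    run_app(server, cache, "word")
print(server.transfers)   # 1 -- the network was hit only on first use
```

        After the first run, the cached pull and the push are doing exactly the same thing; the only difference left is when the user pays for the transfer.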

        Sure, I focused a bit on terminology in the blog, but the underlying assumptions are what I’m attacking. People are tossing around phrases like “thin computing” which have a meaning of their own, when what they really mean is “client/server network with advanced functionality” or (to use your term), “pseudo-thin client”. Improper usage of terminology leads to miscommunication and misunderstanding. If I used the word “red” where most people use the word “pink”, I’m not going to do a very good job working as a sales person for clothing, particularly over the phone. If people misuse the term “thin computing”, they aren’t doing a very good job at communicating, especially if they are paid journalists.

        J.Ja

      • #3130551

        Thin Computing Is Rarely

        by jdgeek ·

        In reply to Thin Computing Is Rarely “Thin”

        I agree with you about terminology; sometimes I just can’t help playing devil’s advocate.

        Another significant advantage to the web services approach is a kind of sandboxing.  While you are correct that push vs. pull is probably not a major difference, there is a major difference in having only one application (i.e. the browser) run code natively.  I believe this significantly decreases your security exposure.

        Also, I assume it is easier to deploy non-standard, custom, or lightly used apps using a web service.  Although I don’t have any experience with SMS, by deploying apps through a web server, you move to a two-phase develop-then-deploy model as opposed to the SMS develop, package, deploy model.  I’m not sure how difficult it is to package an application in SMS, but I suspect that creating a cross-platform package that does not break existing applications is not trivial.

        Anyhow, good work on the thin client blog and thanks for an interesting discussion.

      • #3131680

        Thin Computing Is Rarely

        by stress junkie ·

        In reply to Thin Computing Is Rarely “Thin”

        I agree with J.Ja. There have been many instances when a given term has
        enjoyed a clear definition for many years, then one day people start to
        misuse the term. The next thing you know, the original definition is
        lost. This is the inspiration for the expression “Newbies ruin
        everything,” which I have been saying for a long time.

      • #3044038

        Thin Computing Is Rarely

        by jcagle ·

        In reply to Thin Computing Is Rarely “Thin”

        I have to disagree with one point made in this article.

        The article said that no one should save files to the local drive; everything should go to the server. With a thin client, that may be what you have to do.

        However, I’ve found that working from the network is generally a bad idea. Networks go down sometimes, and they can be slowed down by too many people working from the network.

        At my school, they recommend we work from the hard drive. I’m going to school for graphic design and web development. When you start working with Photoshop and Illustrator, you do not want to work over the network. We work from the hard drive and back up to the network server, USB Flash drive, CD-R, etc.

        Maybe it won’t seem like much of a problem on a smaller network if you’re just handling Word documents or something. But I think in most cases, working from the hard drive and backing up to server, flash drive, CD-R, etc is the best idea.

        Again, this isn’t about a thin computing situation, but about standard IT practices. Trust me, I don’t want to be working in Photoshop with a project due, working over the network, and then the network goes down or slows to a crawl because huge files are being worked on over the network.

    • #3122828

      Technology I’m Thankful For

      by justin james ·

      In reply to Critical Thinking

      Well, here it is, Thanksgiving! And I’m trying to find some technologies that I am thankful to have in my life, since lately technology has been doing its best to make my life unhappy. So here’s what I’ve come up with:

      CD Players: Storage capacity and size aside, there is nothing that an MP3 player can do that a CD player can’t. And CDs are easy. Most importantly, the players are cheap now. CDs have been part of my life for 15 years now, and I’m always happy to have them in my life. And unlike just about everything else tech now, I’ve never had one crash on me.

      IBM ThinkPad 390E: Never heard of this model? Not surprising. It’s a PII 300 MHz system with 160 MB RAM (I’m sure it’s more, but I don’t know how much was allocated to video). Sure, it’s slow. The CD drive doesn’t recognize that a disc is in there. I got a floppy jammed into the drive a few nights ago (trying to find a good floppy to start a BSD install with). With XP on it, it is so slow that it can’t even play the Windows sound files properly. It’s fairly heavy and has few features. It is so outdated that it does not have a built-in Ethernet port. But it has one thing that no other piece of equipment in my life has: durability. The thing is a tank. If I got into a fight, I would rather be armed with the 390E than a knife. And I’m sure it would still work afterwards. It doesn’t crash, either. For the two or three times a year I go on the road, I know that I can count on it to give me just enough connectivity to survive. Most importantly, when I have a major hardware problem, it tides me over until I can resolve the problems with my other machines.

      Microsoft’s .Net Framework: I don’t care that it is nearly as sloppy and inefficient as Java, or that it is not cross-platform. .Net has saved me hundreds of hours of coding time, and Microsoft’s fantastic documentation has saved me at least a few dozen hours in the last year. Compared to working in Java, .Net is a dream. Visual Studio is a great IDE, and its tight coupling with IIS gives me debugging powers on web dev projects that I never found with Java, Perl, or PHP. This saves me even more time and leads to better code.

      Cell Phones: I live and die by the cell phone. I haven’t had a landline in nearly four years, and don’t miss it at all. My cell phone is cheaper than a landline, too. I like not having to take personal calls on a work phone, being able to leave the house without interrupting a call, and the ability to send text messages and email on the road, miserable as the interface may be, in those clutch situations. There are a lot of great things (and important things) that I would have missed without the ability to be reached wherever I may be, since I am so infrequently at home.

      USB: The number of peripherals has skyrocketed since the invention of USB. High speed, easy device connectivity, the ability to add multiple devices without being limited by the number of ports on the machine… USB has introduced us to a whole new world of computing options. Digital cameras, scanners, web cams: none of these cool things would be nearly as widespread as they are today without USB.

      Digital Cameras: OK, so my ex is holding on to mine lately because she’s been wanting to take a lot of pictures. But when she isn’t, I carry mine around everywhere. I used to do the same with a film camera, but then it would sit until I got around to getting new film, and I hated the development costs and wait and everything else. I love digital cameras, and have since I got my first one.

      Inexpensive Broadband: Ever since 2000, I’ve been a cable modem user. It is as cheap as a second phone line plus an ISP account, a billion times faster, and super-reliable. Broadband has made my life infinitely less frustrating, and for that alone, it gets my thanks.

      So that’s my list for now. What’s on your list?

      J.Ja

    • #3127264

      Intel drives Apple sales up in 2006?

      by justin james ·

      In reply to Critical Thinking

      “Apple Sales Mushroom, Thanks To Intel CPUs”

      That is one headline that we will definitely not be seeing in 2006. Standard business practice for making a product sell is to accomplish at least one of the following: a better product for the same price, an equal product at a better price, or superior customer perception of the product, regardless of price. In other words, it either needs to be better, cheaper, or marketed as such.

      I am not going to dispute Apple’s back office reasons for switching to the Intel CPUs. They had a rocky relationship with IBM and Motorola, and the PPC platform was not going where Apple needed it to. Intel offered them a way out, and Apple had conveniently been maintaining an x86 version of OSX the whole time. If Apple had been maintaining a SPARC version of OSX instead of x86, we would be hearing about a Sun/Apple partnership right now. The decision has been made, the code has been written and is being tested. It is a done deal.

      But those who think that this deal will significantly boost sales of Macintosh computers are dead wrong. The Intel architecture simply does not add value, reduce prices, or make the product more marketable. Here is why.

      Increased Value

      This is a simple question to ask: “Does the Intel architecture make a Macintosh any better?” Currently, no, it does not. Yes, the Intel chips are running at a higher clock speed than many of the G4 and G5 CPUs that Apple will be replacing, particularly on the low end. The mini and iBook are running fairly old chip designs. But remember our business rules here: Apple needs to offer a better product at the same price.

      Better Product?

      The switch to Intel CPUs will not make the Macs better at first. Yes, the Intel architecture offers a better roadmap for the future, which means that the Macs a year, two years from now will be better than they would be had Apple stayed with the current PPC chips. But if IBM had devoted as much time to Apple as they have to Sony and Microsoft (for the PS3 and Xbox 360, respectively), then the PPC Mac roadmap would be a five-lane highway compared to Intel’s dirt roads. The x86 architecture also does not offer any new features above and beyond what PPC should have been offering. It isn’t like they are building in ATI Radeon 800s or integrating 200 GB hard drives onto the CPU or something.

      Yes, the Intel chips of today may have a higher clock speed and even perform more FLOPS than the PPCs of today. But without x86 native binaries, and with nearly every piece of software running through the Rosetta translation layer, I am positive that for the first six months minimum, if not as long as two years, software running on the Intel Macs will not be noticeably faster than software on equivalent PPC Macs. I am pretty sure that it will actually run slower for many, if not most, applications for some time.
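      To make the “native binaries” point concrete: whether an application runs natively or through Rosetta comes down to which architectures its Mach-O executable actually contains, and the file’s first four bytes reveal that. Here is a minimal sketch (the helper name is mine, and it covers universal and 32-bit thin binaries only):

```python
# Mach-O magic numbers, per Apple's Mach-O file format.
FAT_MAGIC = b"\xca\xfe\xba\xbe"   # universal ("fat") binary header, big-endian
MAGIC_BE  = b"\xfe\xed\xfa\xce"   # thin 32-bit Mach-O, big-endian (PPC)
MAGIC_LE  = b"\xce\xfa\xed\xfe"   # thin 32-bit Mach-O, little-endian (x86)

def classify_macho(header: bytes) -> str:
    """Classify an executable by its first four bytes."""
    magic = header[:4]
    if magic == FAT_MAGIC:
        return "universal"   # carries code slices for multiple architectures
    if magic == MAGIC_BE:
        return "ppc"         # PPC-only: runs under Rosetta on an Intel Mac
    if magic == MAGIC_LE:
        return "x86"         # x86 native
    return "unknown"
```

      In practice you would inspect a binary with a tool like Apple’s `lipo -info` rather than by hand, but the point stands: until a vendor ships a universal or x86 slice, their application is stuck in translation.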

      I have also written extensively about how the Intel switch will not bring any other tangible “value adds” to Mac users.

      On the “better product” angle, the switch to Intel CPUs simply allows Apple to continue at the rate of improvement that they should have had with PPC, which means that (at best) they will be producing the same numbers as Dell, HP, Gateway, etc. That is hardly an “improvement”. Any speed gains they get will have to come through superior OS design, and rely upon the availability of x86 native binaries. Considering my experiences with finding x64 native binaries and device drivers for my new Windows XP Professional x64 PC, I think that Mac users are going to be quite shocked at how little will be available for them, at the time of launch for sure, and even much further down the road.

      Same Price?

      Paul Murphy, over at ZDNet, recently wrote a very interesting article regarding the Intel CPU switch. To summarize: a low-end Intel CPU costs $240; the G4 that is used in the iBook costs $72. That is a price increase of about $170. Apple’s number one barrier to entry is the cost of their products. Mr. Murphy’s numbers seem sound to me. To be honest, I stopped watching CPU prices around the time I bought a 486 DX4 120 MHz system, so I cannot verify them. But they seem about right.

      No Value Added

      If Apple had gone with AMD (particularly the AMD x64 CPUs) or even SPARC, I would not need to write this article. They would be delivering superior performance at the same, if not a lower, price than they are now. But they went with Intel. They are paying more money for less performance. Which means that they must either offer the same value at a better price, or find a way to increase the product’s marketability.

      Same Product, But Cheaper

      Same Product?

      As discussed earlier, the Intel Macs will be, at best, as good as the current Macs, and the Intel roadmap is no better than what the PPC roadmap should have been. I am going to give both Intel and the Rosetta translation layer the benefit of the doubt, and allow that the Intel Macs will be as good as the current offerings. It isn’t inconceivable, that is for sure. I have no way of knowing without seeing some cold, hard numbers when the final version is released. None of the reviews I have read of the developer kits mentioned performance in any memorable way, so I am guessing that performance was not on either extreme of the scale. So we can assume that (once again) at best, the Intel Macs will be the same product as the current Macs.

      Cheaper Price?

      This section definitely feels like déjà vu. The Intel Macs will not (and cannot!) be any cheaper, at least on the low end, than their PPC counterparts. I have not seen prices for the high-end G5 chips, but it is possible that the big PowerMacs will be $100 – $200 cheaper than they are now. Considering their current prices, that is a welcome price drop, but still hardly enough to suddenly make them a bargain.

      No Pricing Advantage

      Once again, I do not see the Intel CPU architecture giving Apple an advantage that they do not currently have. Since Apple isn’t going to be making a better product, or a cheaper one, that leaves them with only one way to significantly grow Mac sales…

      Marketing

      Without a better product or a better price, Apple must rely upon the arcane art of product marketing, built around the switch to the Intel CPUs, to make sales jump at all, and possibly even to reassure the faithful that this is not a Bad Thing™.

      “Intel Inside” is one of the most pervasive ad campaigns out there. The vast majority of computing users worldwide use a computer with an “Intel Inside” sticker on it. But “Intel Inside” will not bring a single advantage to the Apple marketing strategy. There are a few reasons for this, rooted in the “Intel Inside” campaign itself, as well as in Apple’s current marketing efforts.

      Intel Inside

      What does “Intel Inside” promise to the end user, to make it something worth paying extra money for? Let’s look at the History of “Intel Inside” website to find out. Apparently, “Intel Inside” was designed to present the following ideas to customers:

      • It matters who makes the CPU within your system. Intel wanted the consumer to equate “Intel” with “safety”, “leading technology”, and “reliability”.
      • There is an easy way (the stickers, the little noise in ads, the logos in ads, etc.) to identify which computers use Intel CPUs and which ones don’t.
      • Early “Intel Inside” campaigns stressed “speed, power, and affordability”.
      • Current “Intel Inside” efforts focus upon “technology leadership, quality, and reliability”.

      When one looks at Intel’s overall branding efforts, this is definitely what we see. They stress that Intel CPUs are the gold standard for x86 compatibility, that PCs work best with Intel CPUs (the Centrino campaign takes this message to nearly scandalous lengths, presenting users with the idea that their computer will not work with non-Centrino equipment through dodgy wording), and that Intel is constantly innovating.

      Apple Switch

      To grow market share, Apple needs to cut into the PC market. Yes, having 3% – 5% of a market the size of the PC market is still a very nice slice of revenue, but they want to, and need to, grow it. They have known this for a long time now. Apple counts on existing Mac owners to keep buying new Macs. From what I can tell, with the exception of those forced to go to PCs for financial or business reasons, that holds true. I have never met a Windows user who actually seems delighted to be using Windows. I have never met a Mac user (one who chooses to use a Mac, that is) who fails to provide free advertising for Macs. They definitely have something good going on there. The fact that Apple’s market share hasn’t been shrinking while the computer pie gets bigger says that Apple is signing up new users as fast as the market expands. That is a good thing for them. Due to the nature of the situation, Apple has been throwing their energies (rightfully, in my opinion) at existing PC users. Let’s take a look at the “Apple Switch” marketing campaign.

      • “It just works”
      • “As easy as iPod”
      • “Picture-perfect photos”
      • “It’s a musical instrument”
      • “Home movies in HD”
      • “Online streamlined”
      • “Join the party” (about Apple’s support and community)
      • “It loves road trips” (portability, easy connectivity)
      • “It does Windows” (ease of opening/editing files produced on Windows PCs)
      • “It’s beautiful”

      I do not see any convergence, synergy, or shared edges (how’s that for some buzzwords!) with the “Intel Inside” campaign. In other words, “Intel Inside” does not offer any additional zing to the Apple Switch campaign. Indeed, most of the “Intel Inside” campaign actually works in reverse for Apple. OSX on x86 is untested, there are (and will be for some time) few native binaries, it will not be faster (and quite possibly will be slower), and so forth. Chances are, OSX on x86 will be less reliable than OSX on PPC. OSX succeeded largely because “Classic Mode” worked so well. If people running Mac OS 9 programs had huge problems, no one would have upgraded, or been willing to buy a new Mac with OSX installed. But Classic Mode worked, and worked well, and worked fast, and OSX was adopted.

      “Intel Inside” will certainly be a tough sell to the current Mac market, and it will not help Apple at all in competing against Windows. It reduces their hardware, in the minds of the consumer, to a commodity device, equal to a PC. Will a potential buyer be happy about paying significantly more money for nearly identical hardware, simply because the OS is different? Probably not. In terms of marketing, I think “Intel Inside” is dismal. Sony is not a super-huge PC seller for this reason. Sure, their computers have nice design and are packed with features. But if you are willing to settle for an uglier case, keyboard, mouse, etc., then you can save hundreds of dollars by purchasing a similarly equipped HP, Dell, eMachines, etc. PC. With Intel hardware, Apple positions itself as another Sony, and still has to overcome the problem of consumers perceiving a Mac as being incompatible with what they want to do.

      Conclusion

      “Intel Inside” will not drive Mac sales at all. If Apple sales rise significantly in the near future, it will be because they are able to substantially improve the value of a Mac, either by offering faster/more hardware for the same price, or by dropping the price quite a bit. The mini shows that people are willing to buy a Mac if the price is right. Apple needs to get that price even better. That is how they will grow market share. If the Intel CPUs could do that, I would believe that the switch to Intel will drive sales. Unfortunately, Apple chose Intel, instead of Sun or AMD, to work with, and we are going to be stuck with overpriced, underpowered Macs for the foreseeable future.

      • #3121415

        Intel drives Apple sales up in 2006?

        by guy_sewell ·

        In reply to Intel drives Apple sales up in 2006?

        I believe your reasoning is sound but you are misinterpreting some of the conditions.

        For the PC crowd:

        1. To grow market share Apple needs to attract switchers. The Mac faithful/fanatics are sold. But to a PC person a Macintel has obvious increased value. I can run Windows and my favorite PC software, but also choice Mac stuff. (more value, equivalent price)

        2. The increased value is the operating system. I use PCs and Macs. For a non-IT professional there is NO question productivity increases and headaches decrease on a Mac. (more value, same price)

        For the Mac crowd:

        3. A dual core laptop will show significant increases in processing power as compared to current G4 models. (more value, same price)

        4. Laptop features have been slow (slowed?) to evolve lately on the Mac. We should see significant improvements in weight reduction and battery life, as well as in processing power and AV performance, due to the incorporation of Intel’s new platforms/technologies. This is not a unique Apple-only advantage from the PC world side, but it will be a big boost to the Apple faithful. (more value, same price)

        Every consumer (Mac or PC) will see increased value, and as a premium brand this is how Apple increases sales: not by being cheaper, but better.

        The real challenge is whether developers will still make Mac-specific software if you can run Windows stuff on a Mac (the OS/2 effect). But 2006 is not 1990; much of the consumer software and some (more every day) of the pro stuff is from Apple itself. You can bet it either will be native, or will be soon when Macintels arrive. And if big developers are slow to support it, Apple has shown they will fill the gap and make money doing it (watch out MS and Adobe).

      • #3121402

        Intel drives Apple sales up in 2006?

        by wizkidsah ·

        In reply to Intel drives Apple sales up in 2006?

        I think many people shy away from buying a Mac due to concerns about incompatibilities with Windows, their company, etc.  For $85 or less, you will be able to install Windows XP on the MacTel systems and run your XP applications.  I do think many people will try a Mac if they are comfortable they can still do “Windows” things if they need to.  I build my XP boxes, but I am looking forward to a high-end MacTel box I can just install XP on.  Why bother with building them anymore?  As a PC gamer, I’m excited that it appears the new MacTel systems won’t need slightly modified cards from ATI and NVIDIA anymore.  Forget about the price differentials of the CPUs; if I can buy some of those (relatively) cheap cards built for PCs and stick them in my MacTel box, or not wait 4 months for the Mac version to come out, that’s huge.  It really speaks to being able to use chipsets and other “off-the-shelf” components.  It’s a bear for Apple to develop proprietary chipsets for PowerPC processors, and that component is not cheap.  Now they can use Intel’s chipsets.  Some of the advantages to this switch go beyond the processor. 

        Away from that, your arguments are sound. I don’t see another processor exciting consumers, although I haven’t seen many people who think that it would, beyond what I just described.  People haven’t been running up to me going “Yay, Apple is switching to Intel!”  Most people that buy Macs, or PCs for that matter, couldn’t care less.  Gamers and tech geeks are another story, but those are niche markets.  But all of the fence-sitters I know looking at Macs from afar, when I tell them about XP running on Macs if you install it yourself, truly think their last reservation has been removed.  If rational, many of them should buy Macs next year.  We’ll see.

      • #3121397

        Intel drives Apple sales up in 2006?

        by vuong.pham ·

        In reply to Intel drives Apple sales up in 2006?

        Total garbage.
        Sorry, but the flaws in the points contained in this piece are horrible.

        I disagree with your points about “Classic Mode.” OSX wasn’t adopted
        because of the mere existence of OS virtualization contained in its
        own memory space. For me personally, the real improvements, and thus
        the “reasons” for adopting OSX, were the modern features that OS 9 could
        not provide. It was applications making the transition into the native OSX app
        world that made the difference. Case in point: the transition from 040
        to PowerPC. Where were you? Fat binaries did work very well, and the
        compensating performance of the PowerPC CPU made a large
        difference. Comparing Classic Mode with OSX adoption is
        incorrect. You should be examining the Rosetta software.

        “Chances are OSX on x86 will be less reliable than OSX on PPC.” Where
        did you drag this up? Where are your quantitative analyses? Numbers?
        Stats? Or just pure SWAGing?

        Driving the sales of any computer is a ratio of price/performance and ROI. OSX is a huge factor in that equation.
        Comparing Wintel (Windows + Intel) and Mac-Intel, the initial
        difference is how well the OS takes command of the CPU. How well
        designed are the subsystems (I/O, video, etc.)? Does the OS provide enough
        control of the hardware to squeeze all the performance out of the
        system?  Case in point:

        The initial releases of BeOS, Red Hat, etc. showed that the OS was
        fantastic. But the end user doesn’t sit around all day drawing windows
        or running benchmarks. The ROI is how much productivity can be
        accomplished.

        The Intel transition will be just that: a transition. And if history serves
        as a lesson to be learned, Apple will be paying close attention.

        As for me, am I biased? Not really; I am solutions driven.
        Application developers will determine whether a viable solution will exist
        with the new marriage of hardware and operating system.  Price
        point is only one factor.

        “Overpriced” is a myth when you compare the real component-level cost
        of computer systems. Sure, most Intel boxes sold are run of the mill and
        not so special, hence “commodity,” but compare, for example, the iBook vs. other $999.00
        systems: the value is there, with the performance to match. The same goes for iMac G5 systems.
        As for dual- and quad-core systems, have you recently compared the Wintel
        versions of the dual-core systems, WITH the associated subsystems?
        Compare straight across the board and a cost analysis will show not much
        price differential.

        A $3,000.00 system is a $3,000.00 system.
        -Vuong Pham

      • #3121386

        Intel drives Apple sales up in 2006?

        by oharag1 ·

        In reply to Intel drives Apple sales up in 2006?

        I think you are dead wrong.

        A recent benchmark over at AnandTech shows the upcoming Yonah chip actually beats AMD’s Athlon 64 X2 3800+ chip, and even the 64 X2 4200+ chip in some instances. Understand that the X2s are desktop chips, and Yonah is a laptop chip. Intel is driving forward with 65nm chips quicker than AMD. AMD has to rely on contract manufacturing to make advances in manufacturing processes (i.e. IBM). Also, the chipset coming up for the Yonah will offer higher bus speeds and video speeds than what is currently available on the PowerBooks with PowerPC. This is just the start. I believe the chips coming out of the Intel camp in the next two to three years will be amazing.

        Add the fact that new Macintoshes will actually boot Windows, either by itself or dual-boot, and it is just amazing. I can still run (as can all the millions of PC users) my PC apps, but then use Mac OS X as well. I also believe that with a common chip design (Intel) more people may start to port their PC-only apps to the Mac. I think things look great for Mac users in the future!

        http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2627&p=1

      • #3121292

        Intel drives Apple sales up in 2006?

        by yespapers ·

        In reply to Intel drives Apple sales up in 2006?

        I think one point that’s missing here is perception. As with so much in the tech business, perception counts as much as or more than reality. The perception that Apple products now run Intel chips (and, as anyone will tell you, only “real” computers use them, right?) means that Macs are now officially real computers and can be considered a viable alternative to a Windows machine.

        Another point is that Apple HAD to get off of IBM chips. IBM was just not going to put the money into desktop chips when it could make a bundle on gaming consoles. This meant that as demands for computing power and lower power consumption grew, Apple would be at more and more of a disadvantage.

        And using the Sun SPARC? Are you kidding? Talk about a costly chip with potential supply problems in the quantities Apple would need. I do agree that it is one stellar processor, but it is hardly right for Apple’s markets, not to mention the rewrite of OS X needed to take advantage of it.

        The real downside in all this is losing the current and future G5 chips. In the high-performance computing market, the G5, along with the AMD Opteron, was a real winner over anything Intel currently has. Intel is definitely catching up and will most likely surpass AMD, but who knows? And who’s to say that OS X won’t ever run on an AMD machine? So far the Intel version of OS X has been hacked to run on all kinds of platforms apart from the Apple Mactel box.

        I think the next year or two will be really interesting for Apple. They deserve a lot of credit for what will be their third major product transition: 680x0 to PPC, Mac OS Classic to OS X, and now PPC to Intel.

        You go Apple!

      • #3121213

        Intel drives Apple sales up in 2006?

        by davemori ·

        In reply to Intel drives Apple sales up in 2006?

        Disagree.

        I guess that you have never had to produce and ship an actual product in Silicon Valley.

        A lot of the production constraints on the number of Macs that Apple could make (as well as Mac clones, back in 1997) could be pinned directly on the ultimate limit on the number of PowerPC CPUs that IBM and Motorola could produce in a given time period.

        Market share, in turn, has been limited by the number of Macs that can be produced — which is a direct function of the number of CPUs produced.  In 1998, Intel-AMD-Cyrix were producing about 50 million CPUs.  Motorola and IBM were producing about 5 million PowerPCs (excluding embedded PowerPC processors).  The number has remained at about the same level, while Intel-AMD have boosted production capabilities to exceed 180 million CPUs.

        An Intel solution breaks the production limitations, and gives Apple an open door to using AMD processors if it sees value in doing so. It also can continue to use PowerPC; nothing says that it absolutely has to abandon the PowerPC processor. As for SPARC, all that would do is put Apple at the mercy of Sun Microsystems’ SPARC production run limitations. Sun’s annual SPARC production runs are puny in numbers compared to Intel, AMD, or PowerPC, and even AMD cannot produce as many CPUs as Intel.

        There is very little reason to believe that any application or OS, even the Mac OS, leveraging OpenGL, open standards, open source, Linux, and a Universal Binary, will not work as well on AMD as it would on Intel.

        Even back in 1996, something like 95% of the connectors, chips, and surface-mount devices on a PowerMac logic board or in the chassis (power supplies, etc.) were industry standard, contemporary with the Intel PCs of the time. The other 5% consisted of ROMs and a limited number of Apple-designed ASICs. Apple has long understood the economies of scale and production boosts that come from using industry-standard components.

        There is no payoff in making lots of components on your Bill of Materials yourself, when someone else can do them more cheaply and deliver them in quantities of tens of millions per month.  Supply of CPUs and components has everything to do with how many units you can produce.

        Apple’s purpose is not industry market dominance.  Its purpose is to be profitable to its own shareholders by providing products to its customers that are perceived to create customer value.

        If the CPU switch does not create massive tangible value adds, it still succeeds if the Intel based Mac works equally as well as the PowerPC based Mac and if customer demand is met faster because the product availability is faster and in greater quantities than before.

        Apple continues to defy industry pundits each year for decades by selling off all of its production run of Macs.  If it is suddenly able to increase its supply by even 25% — much less doubling or tripling it — Apple wins.

        There is no reason to believe that the OS and apps won’t be stable under Intel. Linux works exceptionally well under x86. If there is anything unstable, it is Windows, and its instabilities are not the fault of the hardware.

        There were similar arguments out of the industry in the 1990s when the PowerPC was announced. Performance issues in the initial PowerPCs were due more to the lack of a fully native OS and file system than to the compiled apps, and performance was still better than an ’030 or ’040 could deliver in most instances. The fat binary approach used by Apple was brilliant. The universal binaries used now are better. Apple has clearly shown that it still remembers those arguments, and has found some new ways to make porting easier and even more compelling. It has also shown that it clearly remembers the nightmare lessons of Mac OS licensing, when licensees were deliberately deviating from the PowerPC Mac Common Hardware Reference Platform and generating instabilities in the OS that got dumped off on Apple’s tech support, to the tune of millions of dollars per day in support calls for machines not made by Apple.

        While Apple can probably always improve on price, that is not the only way to grow market share. And while increased market share is no doubt important to Apple, it is not Apple’s overriding quest.

        As for the marketing programs at Apple, remember that marketing campaigns are designed by Madison Avenue companies. If they miss the mark for some of us, shame as much on the advertising company that came up with the campaign as on Apple for buying into it. The blame does not reside with Apple alone.

        The industry as a whole has been at a point for several years where CPU speed increases have to be kludged with multiple-core processors and the like, due to limitations on the speed of memory. Moreover, additional hardware speed is becoming irrelevant when it does not appreciably increase the speed of your apps: MS Office, your browser, etc.

        Speed improvements over last year’s models are already less of an issue on Intel. There is no reason to think they are much more of an issue on a Mac.

        Apple has done a decent job of demonstrating the value of dual core on processor bound apps like Final Cut Pro with symmetrical capture-compression, etc.  I have seen similar laudable efforts on AMD for VMWare and other products.


      • #3196962

        Intel drives Apple sales up in 2006?

        by pheck ·

        In reply to Intel drives Apple sales up in 2006?

        Perceptions, perceptions, perceptions.

        That’s what will drive Apple sales. By aligning their CPU technology with the recognised mainstream provider (Intel), they are just following through on the long-term marketing strategy that started with the Switch campaign (probably even before). The average Joe in the street is going to go on image. If the box has an Intel Inside sticker and has been advertised that way, it’s another known quantity. The styling will appeal, the price point won’t be that different, and it will run all the regular apps. It Just Works. The GUI won’t be too much of a challenge after that, if at all. Even Joe Average knows that there are several flavours of GUI out there now.

        I believe that Apple sales will go up, maybe not in leaps and bounds, but in a steady ramping up. We’ve seen the iPod halo and I suspect that there will be an Intel halo.

        The tech transition will be just scenery on the side of the road.

        Paul H

    • #3124862

      Security Minded

      by justin james ·

      In reply to Critical Thinking

      As I was setting up FTP access for our clients, I got upset, once again, at how we handle our policy regarding customer access. One of our customers does this right: each person in our company who needs to access their network has an individual username and password. My boss, and most of our customers, just want a single username/password for anyone in their organization to use. Already, this gives me a stomach ache. The idea that someone who does, or used to, work for our customer would have access to our network, with me having no way to block them by IP address, or be able to tie a login attempt to a particular individual is a frightening thought to me.

      To make it worse, a number of the higher-ups got upset at the passwords that were being assigned. “They won’t be able to remember these passwords!” was the main complaint. Well, Windows 2003 won’t let me assign easier passwords unless I change the default policy, and I’m not going to do that. Our customers have been accessing our systems with username/password combinations like “companyname/companyname” for far too long, as far as I am concerned. We process sensitive data, such as sales figures. As it is, I am considered an “inside trader” as far as blackout periods and whatnot are concerned, because I have access to all sorts of raw data, and that data in turn becomes finished reports. It is extremely important that we safeguard this data to the best of our abilities.
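      For the curious, the Windows default policy mentioned above boils down to a rule along the lines of “long enough, does not contain the account name, and draws from at least three of four character classes.” Here is a minimal sketch of that kind of check in Python; this is an illustration of the rule, not anything Windows actually runs, and the six-character minimum and exact class list are assumptions based on the documented defaults:

```python
import re

def meets_complexity(password, username=""):
    """Rough sketch of a Windows-style complexity check."""
    if len(password) < 6:
        return False
    # The account name must not appear inside the password.
    if username and username.lower() in password.lower():
        return False
    # Count how many character classes the password draws from.
    classes = [
        re.search(r"[A-Z]", password),       # uppercase
        re.search(r"[a-z]", password),       # lowercase
        re.search(r"[0-9]", password),       # digits
        re.search(r"[^A-Za-z0-9]", password) # punctuation/symbols
    ]
    return sum(1 for c in classes if c) >= 3

print(meets_complexity("companyname", "companyname"))  # False
print(meets_complexity("Tr1cky!pass"))                 # True
```

Which is exactly why “companyname/companyname” never survives the default policy.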

      My stance is, if our customers cannot be bothered to learn a complex password to protect this data, they need to get out of the industry. NOW.

      This entire situation got me thinking back to some of the other security faux pas that I have witnessed during my time in IT. Here are a few of my personal favorites:

      • A major US bank outsourced its network management. We peons at the Third Party Vendor had a habit of writing down router usernames/passwords. Even the HQ routers used the same usernames/passwords. Sure, it was all VPNed with IP address filtering, so no one outside of the “green zone” could access their routers. At least not over VPN. Sadly, all of their routers had either ISDN or dialup failover interfaces. The ones with dialup interfaces could also be accessed by dialing in, for troubleshooting purposes. There was no differentiating what you could do based upon which interface you came in on, so in theory, anyone with one of those phone numbers could dial in to one router, telnet to an HQ router, then access the entire network from there. It would be trivial to shut down the entire bank’s network of ATM machines and branches for an hour or two with a well-written script or program that can dial a modem, with no accountability other than the phone number from which the initial call was placed. Oh yeah, did I mention that they never once changed their passwords, and that the passwords, as well as all troubleshooting information, are freely available within the TPV’s intranet to anyone who wants to see them?
      • At one company I worked for, we had a massive Solaris server running HP OpenView. Sadly, HP OpenView is probably the worst engineered piece of software ever to be sold outside of the $10 “Instant Website Maker!” section at Best Buy. It is a testament to Solaris’ abilities that the server kept running, because HPOV was leaking memory like a torpedoed ship leaks water. Here’s where we had a nice little hole in our security: because HPOV was such a steaming pile of garbage, every user had the root password, so they could kill and restart HPOV. You might as well just make everyone a root user, at that point.
      • Growing up and learning COBOL on an ancient system, what a fun time. Too bad we were all umasked so that all of our work was chmod’ed to 777. It’s not hard to cheat or destroy someone else’s project (as happened to me: someone changed one of my PIC statements, and it took me two months to find out why my final project never quite seemed to work right) when hacking is a matter of “cd ~username”.
      • File permissions are a favorite security faux pas of mine. A company I used to work for (let’s just call them “a Fortune 500 company whose former CEO now runs one of the top three computer makers” and leave it at that) used network storage space for all sorts of important documents. A lot of these documents, frequently pertaining to things such as the status of sales to customers, layoffs, offshoring, outsourcing, contractor conversions, employee pay rates, and so forth, were typically created with read permissions for the “Everyone” group, because some nitwit sys admin had turned on inheritable permissions (fair enough) but set the top-level permissions too loosely.
      • Microsoft Indexing Service = “Intranet That Can Find The Documents I Am Not Supposed To Find”. Just do a search within many corporate intranets for phrases and words such as “layoff”, “consolidation”, “India”, “sexual harassment”, and so forth, and you find all sorts of embarrassing things that were never explicitly linked to. What a great thing it is when “out of the box” defaults, mixed with ignorance on the part of a sys admin or a user, can result in anyone within the organization finding what should be restricted to a few top officials. It’s even funnier when the mistake occurs not on the corporate intranet, but on the company’s public website.
      • Security through (barely) obscurity. This one always gets a chuckle out of me, when I see someone attempt to hide “top secret” information through such crafty ruses as HTML comments to hide the text, turning off the right mouse button via JavaScript, “burying” important information behind a piece of Flash, and other easily defeated methods. It’s especially funny when you find it unintentionally, like when you view the source of a page to see how they got a nice piece of design to work, and find a database password in there. The folks who write their code like this are typically pretty shoddy on their backends as well; you can usually hit them with a SQL injection attack because their super-nifty search system is simply doing something like sSQLStatement = "SELECT * FROM MY_DATABASE WHERE ID_CODE LIKE '%" + request.item("search_query") + "%'" or something along those lines. These same people often have really shoddy exception handling too, and let their database errors get sent to the client.
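      That last bullet deserves a concrete illustration. Here is a minimal sketch of the difference between the concatenated query quoted above and a parameterized one, using Python’s built-in sqlite3 module rather than the original’s ASP-style code (any DB-API driver works the same way; the table and column names are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_database (id_code TEXT)")
conn.execute("INSERT INTO my_database VALUES ('ABC123')")

# Vulnerable: user input spliced directly into the SQL text,
# just like the sSQLStatement example above. The attacker now
# controls part of the WHERE clause.
search_query = "ABC' OR '1'='1"   # a classic injection payload
unsafe = ("SELECT * FROM my_database WHERE id_code LIKE '%"
          + search_query + "%'")

# Safe: the driver binds the value as data, never as SQL text,
# so the payload can only ever be matched literally.
rows = conn.execute(
    "SELECT * FROM my_database WHERE id_code LIKE ?",
    ("%ABC%",),
).fetchall()
print(rows)  # [('ABC123',)]
```

The parameterized form is also exactly what a code review should be checking for, as discussed further down.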

      This is just a very short list of some of the most common/pathetic security flaws I have personally witnessed. All it takes is for someone to be as curious as me, with less boredom-related motivations and more malicious motivations to exploit most of them. About half of these I found while simply looking for the information I needed to do my job, and finding myself in a “forbidden” directory, or seeing something else on the screen that caught my eye.

      Many of these are common mistakes on the part of end users that were enabled by poor systems administration. The vast majority of your end users have no idea that creating a directory for their supervisor in the common network area is going to expose that directory by default to everyone who wants to see it. Sys admins need to make clearly labelled “management only” directories, with the appropriate subdirectories for individual teams. Processes need to be clearly defined for systems operators at the low level for things such as department moves, new hires, employee termination, and so forth that ensure that users have only just as much access as they need. Group policies need to be put in place to disable USB ports from being used to hook up keychain drives, disable file transfers outside the corporate network over instant messaging, and so forth.
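      The “exposed by default” problem is easy to demonstrate on the Unix side of the house, where that old umask story came from. A quick Python sketch, assuming a POSIX system (the filenames are arbitrary, and Python’s open() creates files with mode 0666 masked by the process umask):

```python
import os
import stat
import tempfile

os.chdir(tempfile.mkdtemp())  # work in a scratch directory

os.umask(0o000)               # permissive: nothing is masked off
open("loose.txt", "w").close()
loose = stat.S_IMODE(os.stat("loose.txt").st_mode)
print(oct(loose))             # 0o666: group and world can read and write

os.umask(0o077)               # restrictive: only the owner keeps access
open("tight.txt", "w").close()
tight = stat.S_IMODE(os.stat("tight.txt").st_mode)
print(oct(tight))             # 0o600
```

Two characters of difference in the default, and every file a user creates goes from world-writable to owner-only. That is why the defaults matter more than any after-the-fact cleanup.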

      Some of these are problems with programmers who simply don’t know better, or are too lazy to do better. These problems are trickier to find. There needs to be a rigorous code-review process in place, checking code for things like SQL statements without parameterization and whatnot.

      For both groups of people, sys admins and programmers, there needs to be a combination of education and discipline in place for slip ups. All it takes is for one “wrong person” to get a hold of a document to bring your company’s stock price tumbling, or have the SEC investigating, or any number of other problems. Why risk your company’s well-being?

      Unfortunately, systems administrators are often self-taught or trained in a haphazard manner. Too many people in IT have a certification with no experience to back it up. Programmers get cranked out of CS programs now with a lot of “this is how we do it today” knowledge, but little understanding of “why we do it like this”. All it takes is for one of these people with paper qualifications but no true understanding to have to work with a new technology or a different language to be in the dangerous land of ignorance. And that will sink your business. Beware, and conduct regular inspections to ensure that standards are met.

      J.Ja

    • #3078142

      Making peace with SaaS

      by justin james ·

      In reply to Critical Thinking

      I have been a long-time opponent of SaaS (Software as a Service) in general. Indeed, I think most of the “new ideas” in the IT industry are half-baked at best. But SaaS and thin clients (they go hand-in-hand) are nearly total bunk. Phil Wainewright (ZD Net blogger) and I have been trading ideas back and forth lately regarding this topic. We both come at it from different perspectives, and there is a lot more agreement than one would think just by reading what we have written.

      Mr. Wainewright believes that SaaS is quite possibly the way we will do most of our IT in the future. I believe that SaaS is doomed to fail, except in certain small niche markets. He and I agree on two things. The first is that SaaS vendors need to do business in a better and more ethical manner than traditional IT vendors. The second is that SaaS vendors, so far, are not doing so.

      But I also believe that SaaS penetration will be limited for other reasons, not the least of which are technical.

      SaaS will not make large gains where the data goes both ways; it will primarily be a read-only operation. The reason for this is that smart companies with IT budgets of their own will not trust a third party vendor (TPV) with their data. It is not just a matter of whether or not that data becomes unavailable momentarily. What if there is a slip-up on the TPV’s end that exposes one company’s data to another company? That is potentially a major catastrophe. In addition, there is the issue of importing/exporting the data should you choose to leave that vendor. Even if they allow you to export your data in some common and open format that the new vendor is able to import, how many SaaS vendors out there are going to provide you with a bulk exporting system? How will your system interface with that? A SaaS vendor is set up to allow small, discrete data transactions, not massive rivers of data. For these reasons, companies will tend to use SaaS services primarily where the data transfer is read-only.

      The Web is a miserable way of working on anything other than data which is easily put into a Web form. In other words, if the application requires anything other than textual input from the user, input that is easily displayed with standard Web widgets, then it probably won’t fly very far. In addition, binary data streams traveling back and forth over a network are really not a fun way of getting your job done. Imagine trying to use a photo-editing (or worse, video-editing) application over a network. In a typical corporate environment, bandwidth is at a premium. Any cost savings generated by a move to a SaaS vendor will quickly be chewed up by bandwidth costs. Is a 3D Studio Max license really cheaper than a dedicated leased line per employee?

      SaaS will do well for applications that only a few users access, or that most employees use rarely but regularly. If more than a small number of users within an organization use a particular application, the economics of scale come into play for the customer. If an application is so important that a large portion of their employees are using it every day, the application becomes part of that company’s daily bread and butter, and as such it makes sense to bring it in house.

      SaaS will appeal mostly to small and medium sized businesses without a dedicated IT staff. For those companies where the “IT department” means someone in the office who knows to reset the cable modem when no one can get to the Internet, SaaS makes a lot of sense. For example, a small business that needs ODBC connectivity for small databases would be much better served by a company running Oracle on their end who gives them an EZ Installer CD that makes the right ODBC connections, and a web-based interface for creating new usernames and tables, than by having an Oracle server sitting in their back room. It also makes sense from a financial standpoint. For a business big enough that they would need someone on hand who is either a DBA, or knows enough to fake it, SaaS’ing their database work simply does not make sense.

      Companies with locations in technological hinterlands will be well served by SaaS. Imagine a call center or a factory or whatever in the middle of nowhere. They do not need a highly educated population, and they save a lot of money by putting their facility in a backwater (lower cost of living, lower prevailing wages, employees have nowhere else to work, less unionization, etc.). The company can choose to either become their own SaaS provider, with the applications hosted at a home office where they will be able to hire qualified personnel, or they may pay a TPV to provide SaaS, particularly if it is an application which only the remote location uses. In all honesty, it is downright difficult to find competent, qualified, and experienced IT workers in the boondocks.

      SaaS will not be used for mission critical applications. The name of the game for mission critical applications is to reduce potential points of failure while providing redundancy wherever feasible. Email is considered mission critical, and therefore very few companies bigger than about twenty or thirty employees outsource their email servers. Once your website becomes a major part of your business, you either bring it in house, or at least put a server in a co-location facility. It is one thing to have the nifty 3D map on your “how to get to our office” webpage be offline. It is another thing for the entire website to be offline. The data link between the SaaS vendor and yourself is a giant point of potentially dangerous failure. The last situation you ever want to be in is for a downed telephone line five miles away to put your entire company out of business for a day. It does not matter what promises the vendor or your carrier or whoever makes to you: stuff happens that you cannot prevent. You can only work to minimize that possibility. A SaaS situation multiplies your possibilities of disaster by a fairly large amount. Would you fly into an airport if you knew that the air traffic controllers were sitting five hundred miles away and communicating with the airport over VoIP? Neither would I.

      I think that SaaS will be best delivered in the form of appliances. When someone signs up for SaaS services, they already accept that it will be a black boxed operation. The customer has no idea what happens on that server; there could be a few trillion lightbulbs switching on and off in their data center instead of hard drives for all you know. Since the customer is already accepting a black box service, why not sell them a purpose built appliance? A totally sealed, rack mount box (or blade system, for bigger enterprises) that just plugs into the network, picks up a DHCP address, registers itself in DNS, and is ready to go? Even better, it does not need to be a web-based application. Because it is residing within the network, large amounts of data transfer, such as an application installation are not a problem. It could very easily have an installer run on the clients to install software from itself. It could run out to the vendor’s system periodically to fetch updates (and push them out to clients) and report usage information in the case of a per-usage billing situation. Alternatively, it could act as a web server or some other thin client server. It could even be running Citrix or Terminal Services. It doesn’t matter. The point is, if SaaS vendors sold an appliance that delivered the service to the customer, instead of having them need to interact with a vendor’s network in real time or nearly real time, the vast majority of the technical and business problems with SaaS will be overcome.

      Tell me what you think.

    • #3077367

      How search engines are hurting quality content sites

      by justin james ·

      In reply to Critical Thinking

      Jakob Nielsen has posted what I believe to be an extremely important article, which discusses the effect that search engines have on revenue for commercial websites. Mitch Ratcliffe at ZD Net also has a good blog post up about the need to pay people for their contributions to websites.

      How are these two ideas connected?

      People do very little without motivation. Money is a great motivator. What Mr. Nielsen’s article points out is that it is getting increasingly difficult to make a profit on the Internet. Quite some time ago, search engines replaced DNS as the way people find sites. Now, getting found in search engines requires feeding an increasingly large amount of money into the gaping maw of per-click search engine advertising. When Yahoo! first started their paid review system, it was acceptable: you paid once, and that was that; users could find your site. Now, you need to pony up cash each time someone comes to your site, unless you are lucky enough to be on the first page of the organic search results.

      With more and more websites seeing visitors come directly into one page, and not leaving that page, they need to monetize their website on every single page, and make enough money for every single page to pay for that expensive per-click advertising. If a user does not find what they were looking for on the page that the search engine sent them to, they go right back to the search engine. Your site’s “stickiness” is no longer important.

      Unless you are selling a product with a great profit margin, you are in big trouble. Content websites do not have a great profit margin on a per-visitor basis. Take a news-related website. Let’s say they make five cents per page view, on average, from advertising. They can’t very well be buying their hits for a dollar each, can they? In other words, producing content online is increasingly less profitable, thanks to search engines.

      The end result, I believe, is that many if not most professional websites will go under. It is already happening to many newspaper websites. Their print sales are being ruined by online news, and their online sites are being forced to charge for subscriptions or lose money. Users are increasingly going to blogs, wikis, and other amateur websites for their news. I will admit, I have always been prejudiced against blogs, wikis, and other “community created” websites when it comes to objective (or “as objective as possible”) information. Why? Because most of them are not making money and have no editorial control. Blogs are primarily done as vanity projects, for someone to put their thinly disguised opinions up as “news”, with links to like-minded blogs as “proof”, and to solicit comments which stroke their egos. Wikis are great examples of groupthink, with a bunch of like-minded people democratizing Truth. When this replaces professional, relatively objective websites, we are in trouble.

      There is a way out, and that is for content websites to find ways to generate traffic that work around the search engines. RSS and traditional email newsletters are one way; you need only get the user to your site once through a search engine to get them to subscribe to your RSS feed or email newsletter. Search engine optimization is another path, as it allows you to get traffic through organic search engine results rather than paying for placement. As Mr. Nielsen points out, increasing website usability is another solution: multiply the amount of money you make per visitor enough to offset or exceed the increased cost of getting the visitor, and you’re in good shape. There are lots of ways. Confederations of content sites are another idea; it is probably much easier to get someone to pay a subscription fee (even a higher one) for access to a group of websites (or better yet, information from a number of websites aggregated into one site) than it is to convince them to pay a fee to a number of different sites. For example, I would be much more willing to pay, say, $100 a year for access to The New Yorker, The Economist, The New York Times, The Washington Post, and Encyclopedia Britannica than I would be to give each one of these sites $10 per year. Indeed, I would love to see content websites bundled up like cable TV packages.

      In any event, this is not some Chicken Little, “sky is falling” scenario. This is actually happening right now. Try buying ad space on Google; prices are going up much faster than the profit margins of any product I am aware of. The entrepreneurs who try breaking into business online are going to find that their marketing costs are a lot higher today than they were five years ago. Doing business online is not as cheap as it used to be, and the inexpensive nature of online business was a major driver behind the Internet’s explosive growth to begin with. The Internet reminds me more and more of the California Gold Rush, where the people who made the real money were the ones selling picks, shovels, provisions, etc. to the prospectors. Search engine advertising is now more critical than ever, and sadly, it is now a recurring, variable cost directly tied to the number of customers you have. Imagine if a store in the mall had to pay the mall a fee for each person who walked in the door, instead of a flat monthly rent. That is where we are headed, and it totally changes the game. Or worse, imagine if every time someone resolved your IP via DNS you had to pay a fee. Because more and more, that is what search engine advertising is looking like.

      Tell me what you think.

      • #3258134

        How search engines are hurting quality content sites

        by librarygeek ·

        In reply to How search engines are hurting quality content sites

        Hi, My work is focused upon the organization of information, search & retrieval — so this topic is right up my alley! To be clear — search & retrieval looks at how people look for (search) information and then read, view, and/or use it. Search does *not* just focus upon search engines.

        You said:

        >People do very little without motivation. Money is a great motivator.

        There are, however, other means of motivation. Observe the open source software community for one of the best examples of reputation and recognition as motivators. The thing that big media doesn’t seem to “get” is that they need to motivate readers to read them. Why should a reader bother with that particular site? Print is having a very difficult time grasping and adjusting to the new paradigm. They are not used to engaging in conversation. Yet the fabulous aspect of the web is the conversations via links, text, and reuse. However, a business *is* motivated by money.

        >  Now, you need to pony up cash each time someone comes to your site, unless you are lucky enough to be in the first page of the organic search results.

        Here is the key! You need to provide *quality* content. You need to have content that is engaging, current, and provides added value that I cannot get elsewhere. Have you noticed how often you can visit newspaper sites and see articles that are virtually identical? Too often, they are pulling them from newswires with little or no additional investigation, follow-up, or new angles of their own. Businesses that provide content as their business should not need to hire a search engine optimizer for help. Nor should they need to purchase search results. In addition to usability, they need to open their content to search engines. Many already follow your recommendation of locking users in via subscription. The problem is that they often lock out search engines — thus knocking themselves off of the search pages. When you lock down content, trying to get people to pay for it, you also lock out potential readers. I don’t pass along a link where the recipient has to subscribe. Providing a link to an article is much like passing along a clipped article. But it’s even better, since the new viewer might start reading other areas of your site! Locking down content reflects a business caught in a print paradigm — those businesses are dying. It is a fact of life in a capitalist society that one must adapt to survive. Every media innovation has eliminated those who could not adapt and enriched those who learned to use them (think of radio, TV, VCRs).

        ~Library Geek

    • #3133083

      The Lone Wolf IT Guy

      by justin james ·

      In reply to Critical Thinking

      I just read a pretty good article on CodeProject (http://www.codeproject.com/gen/work/standaloneprogrammer.asp) about how to be a successful programmer when you’re the only programmer at a company. The suggestions in the article are all good. I am in that situation as well. Not only am I the only experienced programmer in my company (there are other people there who write code, but on a very limited basis, and nothing very in-depth), but I am also the systems administrator.

      All in all, it is a pretty daunting task. If the servers blow up while I am facing a deadline to write an application… well, get the coffee brewing because it’s going to be a long night. Our customers have the luxury of having dedicated IT people – here are the DBAs, over there are the programmers (sub-divided into Web dev folks, desktop application developers, specialized Excel/Access people, etc.), the sys admins are hidden in the data room, and so forth.

      In some ways, I envy these companies. What I would not give to not have to keep flipping between Windows 2003 Enterprise Edition troubleshooting, FreeBSD troubleshooting, database optimization (let’s not forget, I get to run MySQL, Microsoft SQL Server, and Oracle, to add to the confusion), and programming in a hundred different languages – half of which seem to be VB variants to keep me on my toes at all times.

      The confusion can be pretty funny sometimes, especially when I am multitasking. I recently told a customer to try “telnet’ing to port 443 to check for connectivity” when I meant to tell her to “comment out the if/then block”, because I was troubleshooting an SSL problem on my server while helping her troubleshoot our code over the phone. Another classic is when people ask for advice on a piece of code, and I give them the right answer… in the wrong language. Too many times, I have crafted a great Perl-ish regex to elegantly solve someone’s problem in one statement, only to remember that they are using VBA (or worse, SQL).
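      Incidentally, that “telnet to port 443” check boils down to nothing more than a plain TCP connect. A minimal sketch in Python, handy when telnet isn’t installed on the customer’s machine (the host and port are just placeholders for whatever you are testing):

```python
import socket

def port_is_open(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds, False otherwise.

    This answers the same question as "telnet host 443": is anything
    listening and reachable on that port? It says nothing about whether
    SSL itself is configured correctly on the service behind it.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

      Note that this only verifies reachability; for the SSL layer itself you would still need to complete a handshake.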

      The situation has its rewards, however. I get to build experience along parallel lines, for instance. I can honestly say that in one year at this job, I have “1 year Oracle, MySQL, and MSSQL DBA experience, 1 year VBA with Word, Excel and Access, 1 year Windows 2003 and FreeBSD systems administration, 1 year VB.Net, 1 year ASP.Net, 1 year blah blah blah…” If I were one of those specialized IT people, I would need to work for 20 years to get one year of experience in so many technologies. Of course, I came into the job with plenty of experience in a lot of different things, otherwise I would not be qualified, but still, it’s great to get a wide variety of experiences all at once.

      On that note, the work is rarely boring. I don’t get mentally stagnant, and there is always something to do. If I am not working on a project, there is always some systems administration that needs to get done. If I don’t have any internal projects to finish, my help is always welcome on someone else’s project. Do I get bored? Sure I do. But I get bored a lot less often than I did when I was a pure programmer, or a pure systems administrator, or a pure whatever.

      To all of the other lone wolves out there, my hat goes off to you.

      • #3253831

        The Lone Wolf IT Guy

        by a.lesenfants ·

        In reply to The Lone Wolf IT Guy

        Hello there,

        Just to tell you I’m in the same situation as you are. I’m only 25 years old and, though I don’t have that much experience, I had the chance to be offered the position of IT manager here. I jumped on it and took the challenge even though I wasn’t sure I could beat it!

        So here I am troubleshooting users, maintaining the network and our ERP, sourcing and buying new equipment, as well as developing and deploying applications… and all by myself, with no one more experienced to help or guide me when I face a problem. My only buddy is, in fact, the Internet and its various blogs, forums, and sites where you can hope to find the right information or some hints to help you through.

        After a little more than a year in the business, I come to the exact same conclusion as you. It’s sometimes hard to face problems, especially in my situation, but with this kind of job you get the opportunity to develop your skills in so many IT domains, and so fast, that it is a real chance. It’s so great to have a job like this one!!

      • #3091188

        The Lone Wolf IT Guy

        by wilrogjr ·

        In reply to The Lone Wolf IT Guy

        Another lone wolf here – you have to wear many hats and you are always busy. The weird part is peers at larger organizations not believing all of the things you do, have access to or just plain have experience in.

      • #3133692

        The Lone Wolf IT Guy

        by apotheon ·

        In reply to The Lone Wolf IT Guy

        Been there, done that — for most of my IT career. Okay, so basically for all of it. I’m sort of an IT renaissance man by necessity.

    • #3080725

      Email servers are a commodity. Email contents are not.

      by justin james ·

      In reply to Critical Thinking

      Note: this was originally posted as a comment (http://www.zdnet.com/5208-10532-0.html?forumID=1&threadID=17796&messageID=350047&start=-1) to David Berlind’s article Yes. You should outsource your e-mail

      Mr. Berlind is absolutely correct in quite a large number of his statements, and he does indeed provide a compelling argument for outsourcing email. But he has made some mistakes, which is where we differ on this topic.

      The number one problem here is that Mr. Berlind’s original blog post is titled “Google to provide email hosting?” and is 100% about outsourcing your email to Google. I put forth the question: “Now, let’s look at the premise: assuming I would outsource my email, why would I outsource it to Google, of all companies?”

      Mr. Berlind has not addressed this at all. Not in the slightest. He provides a good (but flawed) argument in favor of outsourcing. He does not even touch the idea that Google should be the one to do it. Mr. Ou adds a quick little list of reasons why, even if outsourcing email is the right choice for your company, Google is not the one to do it (http://www.zdnet.com/5208-10532-0.html?forumID=1&threadID=17781&messageID=349545&start=-1). Admittedly, he did not go into nearly as much length or detail as I would have, but his comments really don’t need much explaining (except for the user interface bit; “GMail is pretty decent as far as web mail goes, but garbage compared to a desktop app” is a good way of putting it).

      Now, onto the topic of the current blog post by Mr. Berlind: “Yes. You should outsource your e-mail”.

      I say, “No. You should NOT outsource your e-mail”.

      “So, one question I have for Mr. James is, of all the stuff being outsourced today, what of it isn’t mission critical?”

      That all depends on the business, but I have not encountered a business that did not consider email to be mission critical since about 1998, if not earlier. Furthermore, the fact that companies *are* outsourcing portions of their business process does not mean that they *should*. I can think of a number of analogies, but the principle is best summed up by David Hume: “One cannot derive an ‘ought’ from an ‘is’”. If everyone in New York jumped off the Empire State Building, that certainly does not mean that I should as well.

      “But aside from the handful of hand-built customized competitive advantage-driving systems that integrate messaging and email into their functionality, are any of us really that deluded to believe that insourcing something as basic as email can make us more competitive than the next company (setting aside those companies with real security concerns that can prove their insourced system is more secure than the outsourced one).”

      This is a correct statement on the technological level, but an incorrect statement on the business level. On the business level, it is not the email system itself that matters; if carrier pigeons were ferrying letters printed on Gutenberg presses at the speed of light, businesses would use it. What matters on the business level is the contents of the email; that is the mission-critical part of email. Email is the lifeblood of companies, having largely replaced phones, couriers, postal systems, fax machines, and so forth. What is contained in an email is often of an incredibly sensitive or important nature. Furthermore, archived emails are frequently a knowledge repository. There is a reason why my personal email archives reach back to 2000 (and would go back to 1996, if I had not had a lapse in judgment in 2000). My major disappointment with email is that the tools for mining that data are still rather primitive.
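      For what it’s worth, even a primitive miner is only a few lines with Python’s standard mailbox module. A bare-bones sketch (the mbox path is a placeholder; a real tool would build an index instead of re-scanning the archive on every query):

```python
import mailbox

def search_archive(mbox_path, term):
    """Return subjects of messages whose subject or body mention term.

    A naive linear scan over an mbox archive; multipart messages are
    skipped for brevity, which a real mining tool would have to handle.
    """
    term = term.lower()
    hits = []
    for msg in mailbox.mbox(mbox_path):
        subject = msg.get("Subject", "")
        body = msg.get_payload() if not msg.is_multipart() else ""
        if term in subject.lower() or term in str(body).lower():
            hits.append(subject)
    return hits
```

      Even this trivial grep-style search is more than most mail clients of the day offered over a multi-year archive.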

      Any outsourcing situation has to achieve at least one of two goals: better value or lower cost. There are two ways to handle outsourced email. One is the way most small businesses do it: have an external host, pick up the mail via POP3, and store it internally. The other is to have an external IMAP or Exchange server offsite and leave the data there. In the first situation, all you have outsourced are two TCP/IP transactions, one for SMTP and one for POP3. There is no added value here, and there is no reduced cost. If this is all your needs are, get the cheapest server you can find, load Linux or BSD on it, and load qmail. For $500 in hardware, plus the recurring fees for DNS and domain name registration, you are providing the same level of service to your company that the external provider is. Heck, your existing Windows server (nearly every business larger than 10 employees has one now) comes with an SMTP and POP3 server on it; use those, if you don’t feel like getting a second server. So outsourcing this level of email service adds zero value and costs a lot more. The second option also adds no value (again, you can put Exchange onto a server yourself) and costs more. Look at the numbers you quote from Centerbeam: $45 per user per month. In a company of 10 people, that is $5,400 per year. That is more expensive than a server with Windows 2003 and Exchange, plus a lot of data AND a backup solution! Go the open source route (didn’t you guys just blog about Scalix a day or two ago?) and you have enough money left over to buy every employee an Xbox 360 as a bonus. Gee, that doesn’t seem like such a value at all, does it?
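      The back-of-the-envelope math is worth making explicit. A quick sketch: the $45/user/month figure is Centerbeam’s quoted rate; the in-house numbers are my own rough assumptions (a $500 box amortized over three years, admin time ignored on the argument that a mail server needs near-zero upkeep):

```python
def outsourced_annual_cost(users, per_user_monthly=45.0):
    """Annual cost of hosted email at a per-user monthly rate."""
    return users * per_user_monthly * 12

def inhouse_annual_cost(hardware=500.0, software=0.0, lifespan_years=3):
    """Very rough annual cost of a self-run mail server, amortized.

    software=0 models the *Nix+qmail route; plug in a Windows/Exchange
    license cost to model that route instead.
    """
    return (hardware + software) / lifespan_years

# The 10-person shop from the post:
print(outsourced_annual_cost(10))   # 5400.0 per year outsourced
print(inhouse_annual_cost())        # roughly $167/year for a $500 qmail box
```

      Plug in 1,000 users and the outsourced figure is $540,000 a year, which is where the $45,000-per-month comparison later in this post comes from.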

      “What we offer to do is the hard work for people that they can’t afford to do themselves.”

      I know this is you quoting someone else, so I am now arguing with them, not you. As I show above, anyone who isn’t working on the US Government’s budget can see the math problems here. How much does it cost to run your own SMTP/POP3 server? It takes what, a few hours to properly set up a server using either *Nix with qmail or Scalix, or Windows with the built-in servers or Exchange? The Windows route is especially useful, because all of your account management is handled via Active Directory, so that is one less system to learn. Is an outsourced server, regardless of what it is (POP3 or IMAP/Exchange), going to integrate with your in-house identity management system? I think not. What, you’re going to set up a PPTP connection to their system and a trust delegation between their AD and yours, just so you don’t have to manage separate usernames and passwords? Or would you prefer some awful webadmin system to go in and change stuff?

      “That covers desktop management (anti virus, backup and restore everyday, 24/7 800# dial up helpdesk, server management, email management, VPN services, etc.).”

      The in-house solution, except for 24×7 support, is still cheaper. Sorry.

      “All a banker wants is more bankers and salespeople on staff. They don’t want a Microsoft Certified Exchange Engineer on staff who is only available for one shift a day.   Even if you do run an Exchange Server with three shifts of engineers 7 days a week, they’ll be advising you on best practices such as backup and restore. They’ll say you need a Storage Area Network (SAN) and need to send tapes to Iron Mountain everyday.”

      This guy makes me forget just about every bad example I have ever given in a ZDNet TalkBack. Bankers have all of these things anyways. Bankers run 24×7 databases with millions if not billions of entries, where even a moment’s worth of downtime can cost millions of dollars. This organization is going to be unable to support an additional few servers for email? But let’s pretend he didn’t say “banker”. Let’s pretend he said “small business owner”. If having hordes of MCSEs on staff is a problem for him, he may want to check out *Nix+qmail. I personally cannot vouch for Scalix (never used it, relatively new), but *Nix+qmail is a time-tested, battle-hardened system. It requires zero maintenance. None. Heck, Exchange, when properly configured, doesn’t need any maintenance anyways. And at the end of the day, what good is his elite commando team of MCSEs going to do for a business owner if that business does not have someone on their end who can actually understand what they are saying and how to work with them? The only time his MCSE army is a decided advantage is when their server starts behaving erratically and the problem is definitely on their end. If their software does something like that, where you need an MCSE to troubleshoot something that was working fine, then maybe that software isn’t very good.

      “[T]he point is that a leveraged model (where an outsourcing outfit spreads the infrastucture costs across more users than you can) is not only going to save you a lot of money, but headaces (sic) too.”

      Economies of scale is an idea I can buy into. But if they are leveraging economies of scale so well, why do they need to charge $45/user/month? Earthlink charges me $6/month for a POP3-only account. Are Centerbeam’s economies of scale really so bad that they need to charge nearly 8 times as much for Exchange services? Maybe I need to reconsider the Exchange servers at my company and put my new BSD server to work on email duty, because the idea that Exchange is inherently 8 times more expensive than basic POP3 (and with economies of scale working against our 5-person company, it should be a few dozen times more expensive for us!) is total hogwash. Economies of scale distribute the cost of a Windows & Exchange license down to something like 25 cents per user. So they’re just ripping you off. Sorry, I don’t like to be ripped off, and neither does my boss.

      And what headaches are they really solving for me? Managing and maintaining an email server? It seems to me like they are giving me new problems, not taking away any existing ones. Let’s make a headache list:

      In-house:
      – Hardware failure
      – Network failure (immediate Internet connection and LAN only)
      – User maintenance
      – Initial installation and configuration
      – Backup/restore
      – Patching
      – Security (95% of this is a subset of install/config)

      Outsourced:
      -/+ Hardware failure (shouldn’t be a problem if they are doing their job right)
      – Network failure (their network AND immediate local Internet connection and LAN only, we’ve doubled our headaches)
      – User maintenance (compounded by it not being part of my local authentication scheme)

      So really, all I am giving up is responsibility for the hardware, backups, and maintenance. As I have already stated, email servers require nearly zero maintenance. I am patching my internal systems anyways; this adds one more item to the list. Backups, again, I should be doing anyways; what’s one more server to dump to tape/SAN/NAS? And for this I would be paying $45/user/month, which is much more expensive than a single Windows server in a 10-person environment. And if I have a large company, I can apply economies of scale to myself. An Exchange server can handle 1,000 users without a problem. Can my IT budget handle $45,000 per month for email alone? That’s the cost of adding 6 MCSEs to my staff on a full-time basis! And then I could have two of them monitoring my servers 24×7. Hmm, maybe I had better stop discussing Centerbeam’s business model before his investors pull out now. Or better yet, maybe I should call those investors and ask them if they’d be interested in this bridge I have to sell; it connects Brooklyn to Manhattan…

      “Raise your hand if you’ve used GMail, Yahoo Mail, AOL’s mail or HotMail because you needed to send mail but couldn’t get access to your corporate email system (for whatever reasons).”

      I’ll agree with you on this one. It’s happened to all of us. On the other hand, this is an apples-to-oranges comparison. If I had the money to pay for Centerbeam’s services, I would have the money for redundant email servers, in which case the only thing that would take me down would be a disaster (natural disaster, virus/worm, fire, etc.) or a complete network outage, in which case my users would not be able to reach GMail, HotMail, etc. or Centerbeam’s servers either. So once again, this flounders on the cost issue. And also remember, I’m comparing Centerbeam’s Exchange servers to in-house Exchange servers. If you compare plain old outsourced POP3 to an in-house *Nix server, the numbers are even more in favor of the in-house solution.

      “Email systems, as it turns out, aren’t that easy to run 24/7.”

      I have been doing it for years. The only thing that ever goes wrong are things that take down an entire server, or the whole network. Again, the price of in-house vs. outsourced makes this point hard to argue.

      “Lastly, for a commodity system like email, what leverage do you have over your certified email engineer to keep the email systems up and running 24/7?  His or her job? Oh, that’s what you want.  You’d rather spend time hiring and firing email engineers than making money for your company? Service Level Agreements (SLAs) are a lot easier to negotiate and enforce with service providers than they are during an employee’s annual review.”

      Ah yes, my most favorite topic in the world! I guess I *do* have to rehash this topic. OK, time to brew a fresh pot of coffee!

      First, some links to the extensive library of my thoughts on this subject:

      * http://www.zdnet.com/5208-10532-0.html?forumID=1&threadID=16070&messageID=318442&start=-1

      ^^^^^^^^^ Number one most important post about the subject

      * http://www.zdnet.com/5208-11406-0.html?forumID=1&threadID=16244&messageID=321399&start=-1

      * http://www.zdnet.com/5208-11406-0.html?forumID=1&threadID=16278&messageID=322734

      * http://www.zdnet.com/5208-11406-0.html?forumID=1&threadID=16324&messageID=324436&start=-1

      * http://www.zdnet.com/5208-10532-0.html?forumID=1&threadID=16070&messageID=318442&start=-1

      Wow! There’s a lot of real-world, real-life, in-the-trenches experience in those links!

      Now, to be fair, I don’t think that outsourcing is always bad; indeed, I have presented a compelling business case for it under certain circumstances: http://techrepublic.com.com/5254-6257-0.html?forumID=99&threadID=184332&messageID=1921068&id=2926438

      My direct response to Mr. Berlind’s statement: have you ever worked someplace, had a hard time with a customer, and had the boss pull you aside and say, “look, I know you’re right and the customer is wrong, but we have to swallow our pride and give them what they want”? I have. That’s the way companies work, as long as it is profitable for them. As soon as giving the customer what they want is no longer profitable, they say “no”. If a company cannot deliver on an SLA (no measurability, no proof of failure, little enforceability, blah blah blah TPVs stink blah blah blah, just some self-deprecation there at this late hour), you are still tied to them for the length of the contract. And what are you going to do? Spank the CEO for being naughty? It isn’t like the underpaid, underexperienced, fresh-from-working-at-McDonalds-but-know-how-to-set-up-a-CounterStrike-server kids who fill Third Party Vendors are going to be held responsible if a customer is lost. I am a big fan of “The Buck Stops Here”. TPVs always manage to find a reason why it isn’t their fault, the SLA wasn’t truly violated, etc. For a TPV, “The Buck Stops Here” really means “Your Money Ends Up In Our Bank Account”.

      Employee annual reviews are a lot easier to manage than SLAs. I can directly measure and manage my employees’ success. SLAs are notoriously difficult to manage. I have seen cases where a customer spent nearly as much time and money simply managing the SLA as they would have spent managing the service themselves. That is ridiculous. If you think SLAs are easy to manage and enforce, try an experiment: call your cable company to schedule a service call. You will get a 4-hour window during which you must be home (heaven forbid you’re in the bathroom when they come by; “The Cable Man Knocks Once” would be a good film), and chances are they will be late anyways. If you’re lucky, they’ll give you some excuse about it. I remember working for a TPV and being instructed by managers to “invent” weather conditions that caused SLA misses, since poor weather was an SLA escape clause. How much money will you spend just hiring lawyers to 1) write the SLA, 2) help you get out of the contract when the SLA keeps being broken, and 3) sue to recoup the costs to your business when the SLA is blown? If I have a bad employee who makes a serious goof, I can take them to task, or even fire them if need be and replace them with a more competent person. If an SLA is blown, there is no recourse.

      Finally, there is the issue of commoditization itself. Outside of pricing, declining quality is the largest result of commoditization. Look at cars. The only reason cars improved one bit after 1972 is that foreign competition started selling better cars at a better price in the 80’s. Before that, American cars had become commoditized to the point where they were all equally junk. Now, American cars are often significantly better than their foreign counterparts, because they were forced out of commodity status. Consumer electronics is another example. Even though cell phones are cheaper now than five years ago, I spend more on them because their quality stinks. My year-old cell phone has worse battery life than my friend’s 5-year-old analog phone. The worst thing that can happen, outside of a destructive monopoly, is commoditization. It freezes the desire to improve quality and replaces it with ruthless cost cutting to match the price cutting. When you cannot compete on features or quality because everyone is the same, then no one cares about them. Again, cell phones. At this point, consumers expect poor service, because that is the price we paid to save money. There are no “premium” or “luxury” carriers out there (well, Verizon is a bit pricey, and they do seem to have slightly better coverage, from my experience); in general, cell phones stink. Why? Because with today’s price slashing, no one can afford to innovate.

      J.Ja

      • #3101473

        Email servers are a commodity. Email contents are not.

        by sparkin ·

        In reply to Email servers are a commodity. Email contents are not.

        Nicely thought out and balanced rebuttal.

      • #3100782

        Email servers are a commodity. Email contents are not.

        by joanne lowery ·

        In reply to Email servers are a commodity. Email contents are not.

        For outsourcing there has to be a break-even point beyond which you would rather provide the service in-house than outsource it. At $45.00 per month, outsourcing might work for up to 10, 20, or 30 employees. At some point, though, the price starts to exceed the cost of in-house service. You would need to add up the cost of hardware, software, support, disaster recovery, maintenance, and user management.

        You would also need to compare the quality of service from the outsourcer. Do you still get groupware, public folders, scheduling, contact sharing, etc.?

        As for tech support, outsourcing that makes a lot of sense.  My company provides outsource support for a number of SMB clients. With IPSec and PPTP remote connections we can support most sites without even needing to leave our office.

        I believe outsourcing might be a good idea, but the cost justification needs to be real and not ideological.

         

    • #3101542

      Well, forget you too

      by justin james ·

      In reply to Critical Thinking

      I just spent an hour writing a blog post. Only to have my connection drop as I hit “Submit”. When the connection came back up, “Refresh” would not resubmit my entry. “Back” restored all of the form field entries EXCEPT the article itself.

      This is why hosted solutions stink. This is why on-demand stinks. This is why thin clients stink. This is why AJAX stinks. Because a desktop application would have been performing an automatic save to disk every few minutes, and none of these web applications get write-to-local-disk privileges, because that would be a gaping security hole. It has been well over ten years since I used a desktop application that could lose more than a few minutes’ worth of work in the event of a system failure, network disconnection, etc. It was five minutes ago that an online “application” did this to me. What a complete and utter bummer.

      I have to remember to write these blogs in a plain text editor and then copy/paste into this form to make this thing work right. It’s bad enough that every character I type, every single keystroke, causes JavaScript (save me from JavaScript, please; the interpreters are slower than my aunt’s driving) to re-evaluate the entire article and try to WYSIWYG it, which means that the more I write, the longer the delay between me hitting the key and the key affecting what is on my screen. All of this for a total of 7 dinky buttons that do bold, italic, link, unordered list, ordered list, image insert, and “switch to code mode”, plus a drop-down that lets me choose a block type for the current paragraph. This is stupid. How about if, instead, the system had a little timer, and waited until I either hit a command key for the function, clicked the function button, or stopped typing for a few seconds before re-evaluating? Would that be too much? Is it necessary for it to re-download 14, that’s right, FOURTEEN images every time I type a letter? Thankfully I’m not on dialup, and as it is I feel like I’m trying to type via telnet on a 2400 baud connection. Heck, old BBS’s on a 2400 baud modem would do a screen refresh faster than this junk.
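      The “little timer” idea is what is now commonly called debouncing: reset a short timer on every keystroke, and only re-evaluate once typing pauses. A minimal sketch of the concept in Python (in the browser it would be the same idea, implemented in JavaScript with setTimeout/clearTimeout):

```python
import threading

class Debouncer:
    """Run callback only after `delay` seconds pass with no new trigger() calls.

    Each trigger() cancels any pending run and re-arms the timer, so a burst
    of keystrokes results in exactly one re-evaluation at the end of the burst.
    """

    def __init__(self, delay, callback):
        self.delay = delay
        self.callback = callback
        self._timer = None
        self._lock = threading.Lock()

    def trigger(self):
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()  # typing continued; postpone the work
            self._timer = threading.Timer(self.delay, self.callback)
            self._timer.start()
```

      Wire the expensive WYSIWYG re-evaluation through something like this, and the editor does its work once per pause instead of once per keystroke.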

      AJAX, thin clients, etc. are like going back to 1989 without the cool ANSI/ASCII art by ICE, The Jargon File, 256-color GIFs of women in bikinis, the DOOM 1 shareware installer, The Bastard Operator From Hell, Legend of the Red Dragon, 2600 magazine, Phrack, music in MOD format, and all of the other fond memories of my youth.

      J.Ja

      • #3088176

        Well, forget you too

        by apotheon ·

        In reply to Well, forget you too

        That’s not really the fault of AJAX. That’s the fault of poorly conceived AJAX. Granted, I’ve only seen a grand total of about three significant implementations of AJAX that weren’t poorly conceived, and the other couple hundred or so all ranged from mediocre-bad to downright heinous, but good AJAX is possible. It’s real. I’ve seen it. I swear.

      • #3089544

        Well, forget you too

        by superdisco ·

        In reply to Well, forget you too

        Well you really answered your own comment with:

        “I have to remember to write these blogs in a plain text editor then copy/paste into this form to make this thing work right.”

        Personally, I don’t trust ANY web form to do the right thing, and therefore write everything in a text editor first, unless it is a six-line comment like this. Alternatively, just before I hit submit, in case I have timed out the session etc., if I have written straight into a form I quickly grab it with Ctrl+A and Ctrl+C, then hit that button.

        That way, it’s a frustration-free experience! 🙂

        superdisco (aka Karen)

      • #3085699

        Well, forget you too

        by somebozo ·

        In reply to Well, forget you too

        Well, Google’s GMail does auto-save while you are composing an email, and it does it transparently, without any interaction or disturbance to the end user. Therefore it’s not the online applications that stink… it’s the lack of thought from the developers who write them.

        Usually I put my long posts in Notepad and then paste them into the webpage text field when done…

        Notepad is a desktop application, though, and it does not autosave, so you may still lose data if, say, the power went off or the system hung…

      • #3263049

        Well, forget you too

        by wayne m. ·

        In reply to Well, forget you too

        I Share Your Frustration 

        I would diagnose the problem as being session timeout more than anything. This has always been a major problem with every web-based application that I have seen. Neither the server nor the web client is aware of user activity. I have seen numerous kludges implemented just so that a user can spend a reasonable amount of time entering a thought.

        I also remember the telnet days, when getting a 9600 baud modem meant life was good.  VT100-encoded screens displayed and updated faster than HTML does, even over a LAN connection.  If we advance any further, I think I’ll revert to faxes.

        Anyway, I tend to avoid posting anything in these blogs unless I really, really want to enter it.  I have also given up when my first attempt fails, usually indicated by a long period of no action.  The only trick I have found is to copy the text box (a Windows trick) before I submit, kill the browser when the submit fails, log back in, and paste and post.  Thanks for letting me vent as well.

    • #3271579

      The sorry state of web development

      by justin james ·

      In reply to Critical Thinking

      UPDATE (2/28/2006): I’ve posted a follow up to this article that presents positive ideas on how to change this situation.

      Last night I read a great article (http://www.veen.com/jeff/archives/000622.html) from about 16 months ago about how lousy most open source CMS (Content Management System) packages were. While focused upon open source, the author mentioned numerous times in the article and follow-up comments that his complaints also apply to commercial CMSes.

      Sadly, all of his complaints are still true, and they apply not just to open source (or closed source) CMSes, but to about 90% of the web applications out there.

      The simple truth of the matter is, web developers generally stink, not just as programmers, but as user interface engineers.

      Over the past year-and-a-half or so, I have spent countless hours installing, trying, and uninstalling literally dozens of various open source CMS systems, without once finding something that works right, if at all. The best one out there, for my needs, was WebGUI. Too bad it broke the moment I tried to upgrade it, Apache, mod_perl, perl, or just about any other dependency it had!

      Ten years after the Web revolution began in earnest, I still find myself using systems that are not much better than the systems I was using ten years ago. Part of the problem is the continual state of change within the Web development world. Every time a new language or framework or web server or technique (like AJAX) or whatever starts to gain momentum, all development on existing systems seems to halt, and everyone decides to do everything in the new system. By the time the new systems are about as good as the old ones, another technique, language, or whatever seems to come out. By the time the server-side Java and ASP web apps got to be as good as the CGI/Perl they were replacing, .Net and PHP came out. Now that .Net and PHP apps are getting as good as the Java and ASP pages they replaced, .Net 2.0 and AJAX are suddenly the rage.

      The fact of the matter is, if all of that time had been spent making things work in CGI/Perl (or whatever system had come first), I might have a chance of finding a quality web application.

      AJAX is the current fad. It seems to be predicated on the notion that since Google Maps is so good, and it uses AJAX, AJAX should be used everywhere. Here’s the truth:

      • Anything you do with AJAX also has to be written server-side, because otherwise your application will not gracefully degrade in a browser without JavaScript, or with JavaScript turned off.
      • The tools to write and debug JavaScript, ten years or so after it came out, are atrocious.
      • Different Web browsers still do not agree 100% on how to render HTML and CSS, nor do they implement JavaScript identically, forcing you to either stick with HTML/CSS/JavaScript that renders and executes identically (or at least “good enough”) everywhere, or cut yourself off from a significant portion of visitors.
      • People are using XML for something it simply was not designed to do. XML was designed to be used in such a way (in conjunction with XSLT, XSD, and UDDI) that systems could automatically discover and use each other. Thus, XML is written for the “lowest common denominator”, which makes it extremely wasteful in terms of the resources needed to create it, transmit it, and consume it. People are using XML to pass data back and forth between different parts of their code (server-side to client-side, and vice versa) because it is quick and easy for them to code that way. The fact is, their applications are significantly slower, both server-side and client-side, than they need to be because of this. It is much quicker, if you know the data format, to pass fixed-length or delimited data back and forth, and nearly as easy to write the code. But because programmers are lazy, they would rather save 30 – 45 minutes of code writing at the expense of creating scalability problems (just compare the file size and parsing time of XML vs. CSV to get an idea of what I mean).
      • JavaScript interpreters are incredibly slow. I recently worked on a Web application where the customer wanted some custom validation done client-side on a 3,000-record, 3-field data set. The web browser would lock up for a minute on this. Actually sending the data to the server and handling it there was faster by a factor of about 10. I don’t consider that “progress”.
      • HTTP is a connectionless, stateless protocol. In other words, it is pretty much useless without a zillion hacks laid on top of it to accomplish what a desktop application can do with minimal, if any, coding. Look at the design of application servers. They need hundreds of thousands of lines of code, and basically all they do is receive GET/POST/etc. data, pass the appropriate information to an interpreter or compiled software, then return the results. In the process, they perform validation, maintain connection state (via either cookies or session IDs in the URL, both of which are hacks, when you think about it), and so forth. This is utterly ridiculous. Desktop and server software is using Kerberos and other hardened authentication management systems, while Web applications are sending plain text, occasionally protected with SSL. Is this really the best we can do?
      • Web developers are still clueless about interface design. Half of the problem is that a Web developer is frequently forced to work with some sort of graphics designer who was brought up in the print world. Sure, bandwidth is cheap now. But with all of the hoops that an application server jumps through to process each request, a significant portion of the response time of an application depends upon how fast the Web server can build up and tear down each connection. An application that increases the number of HTTP requests, regardless of how small they are, is an application that won’t scale well. AJAX goes from “make an HTTP transaction with each form submission” to “make an HTTP request with nearly every mouse click”. I don’t call this a “Good Thing”. I call it stupidity. AJAX multiplies, quite significantly, the amount of data going to and from the servers, switches, routers, load balancers, the whole architecture. All in the name of “improving the user experience.” At the end of the day, “user experience” is determined less by what “gadgets” are in the software, and more by how well the user can accomplish their goals. A slow application doesn’t “work”, as far as the user is concerned, regardless of the features it has.
      • AJAX “breaks” the user’s expected browsing experience. All of those cute XmlHttpRequest() statements don’t load up in the user’s web history. That means that the “Back” and “Forward” buttons don’t work. If you’re providing the user with an interface that looks like a normal Web page, with “Submit” buttons and so forth, breaking the browser’s interface paradigm is a decidedly bad idea.
      • And last but not least, JavaScript (as well as Java applets and Flash, for that matter) does not get local disk access. Its only recourse, if it wants to save the progress of your work, is to periodically submit the work-in-progress to the web server.
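
      The XML-versus-delimited-data overhead is easy to demonstrate: serialize the same records both ways and compare the payload sizes. The sketch below uses an invented 3-field record layout purely for illustration; it is not data from any real application, and a real comparison would also time the parsing, as the post suggests.

      ```javascript
      // Compare the serialized size of the same record set as XML vs. CSV.
      // The field names and record layout are made up for this example.
      const records = [];
      for (let i = 0; i < 1000; i++) {
        records.push({ id: i, name: "customer" + i, balance: (i * 1.5).toFixed(2) });
      }

      // XML: every value is wrapped in repeated, named tags.
      const xml =
        "<records>" +
        records
          .map(
            (r) =>
              `<record><id>${r.id}</id><name>${r.name}</name><balance>${r.balance}</balance></record>`
          )
          .join("") +
        "</records>";

      // CSV: one header line, then one delimited line per record.
      const csv =
        "id,name,balance\n" +
        records.map((r) => `${r.id},${r.name},${r.balance}`).join("\n");

      console.log("XML bytes:", xml.length, "CSV bytes:", csv.length);
      ```

      Most of the XML byte count is tag names repeated once per field per record, which is exactly the “lowest common denominator” cost the post describes: self-description you pay for on every single row.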

      Has anyone actually tried to make something basic even work right? It sure doesn’t seem like it. TechRepublic’s blog system is a great example of how even basic JavaScript can create a lousy user experience. With every keystroke (and mouse click within the editor), it re-parses the entire blog article. It also refreshes the simple buttons at the top. In other words, with each key I press, it is making 14 (yes, FOURTEEN) [correction: 28!] connections to a web server. That is patently ridiculous. This should be re-written in Flash, or dumbed down so it doesn’t need to do this.
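
      One standard way to avoid that per-keystroke request storm is to coalesce keystrokes with a debounce: each key pushes a deadline back, and the expensive work (the HTTP request, the re-parse) fires only after the typing pauses. The sketch below illustrates the pattern only; it is not how TechRepublic’s editor actually works, the timer is simulated with explicit timestamps so the logic is easy to follow, and all names are invented.

      ```javascript
      // Debounce sketch: a burst of keystrokes produces one request, not one per key.
      // Timestamps are passed in explicitly so this runs without real timers.
      function makeDebouncer(delayMs) {
        let pendingAt = null;
        let fired = 0;
        return {
          keystroke(nowMs) {
            pendingAt = nowMs + delayMs; // each key pushes the deadline back
          },
          tick(nowMs) {
            if (pendingAt !== null && nowMs >= pendingAt) {
              pendingAt = null;
              fired += 1; // this is where the single HTTP request would go
            }
          },
          requestsSent() {
            return fired;
          },
        };
      }

      // 20 keystrokes 50 ms apart, then a pause: one request instead of 20.
      const d = makeDebouncer(300);
      let t = 0;
      for (let i = 0; i < 20; i++) {
        d.keystroke(t);
        d.tick(t);
        t += 50;
      }
      d.tick(t + 300); // pause long enough for the debounce to fire
      console.log("requests sent:", d.requestsSent());
      ```

      In a real page the same idea would be wired up with `setTimeout`/`clearTimeout` on the keyup event; the scalability point stands either way: the server sees one request per pause in typing, not 14 or 28 per keystroke.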

      In fact, just about the only AJAX applications [addendum: I’m talking about just the AJAX portion of the functionality; I’m not particularly impressed by Google Maps’ results] I have seen worth using are Google Maps and Outlook Web Access. The rest seem to make my life more frustrating than whatever problem they set out to solve.

      Google’s success is a great example of just how lousy web applications are. Google Maps took off like a rocket because its competitors, despite having a five- or six-year (or more) head start, had wretched interfaces. MapQuest didn’t seem to have become any more usable since Day 1. Yahoo was making changes, but came out with them after Google Maps did. Same thing for search. Sure, Google’s results were (and still are, but less and less so) better than their competitors’. But their interface is a joy to use [addendum: this is rapidly changing as Google becomes more of a portal]. GMail’s biggest “feature” isn’t even its interface; it is the amount of storage space. All of a sudden, you can use Web-based email with many of the benefits of a traditional POP3 client. Outside of the storage capacity, GMail wasn’t much different from Hotmail or Yahoo Mail.

      Google cleans up because they find a market where the current market leaders have a great idea, maybe even great technology, but provide a lousy user experience anyway. The fact that Google can break into an extremely mature market and blow it wide open is proof that Web applications, by and large, stink. Because even with five or ten years of market domination, the original players still provide a lousy customer experience.

      And at the end of the day, even the most basic network-aware desktop application is easier, faster, more secure, blah blah blah, better in every measurable way than the best Web application.

      J.Ja

      • #3272555

        The sorry state of web development

        by zging ·

        In reply to The sorry state of web development

        Good argument, with valid points (especially AJAX).

        BTW, Flash can save on the local computer (in its own little space).

        How about finishing your article with some ideas/direction that developers can take to improve on the points you’ve made? It’s easy to write a critique, but not a solution!

        Also, regarding “web developers generally stink, not just as programmers, but as user interface engineers”: I think a great many developers will strongly argue this, especially as web development is changing what “programmer” means. Also, how good is your average ‘programmer’ at interface design? I’d guess that your average ‘developer’ has a lot more idea of what they’re doing with interfaces than programmers do!

      • #3272554

        The sorry state of web development

        by jaqui ·

        In reply to The sorry state of web development

        TechRepublic’s blog system is a great example of
        how even basic JavaScript can create a lousy user experience. With
        every keystroke (and mouse click within the editor), it re-parses the
        entire blog article. It also refreshes the simple buttons at the top.
        In other words, with each key I press, it is making 14 (yes, FOURTEEN) [correction: 28!]
        connections to a web server. That is patently rediculous. This should
        be re-written in Flash
        , or dumbed-down so it doesn’t need to do this.

        Actually, that’s a bad idea; Flash should never be used for anything other than advertisements.
        Site functionality should be written for the lowest common denominator, with no client-side scripting. After all, the browser may not support the feature you are trying to use, so it doesn’t work for those people.
        [ I refuse to install or enable any client-side scripting. I don’t get the flicker in the TR blog you talk about. ]

        The way I look at it, if it requires client-side scripting, including JavaScript, then the website contains nothing I need to see or get.

      • #3088177

        The sorry state of web development

        by apotheon ·

        In reply to The sorry state of web development

        I’m not quite the client-side antiscripting zealot that Jaqui is, but it’s true that one of the major problems with web applications is that a lot of the time they don’t degrade gracefully. Here’s a hint for degrading gracefully: If, as a web developer, your application won’t at least run in Firefox with no extensions or plugins, and with Javascript and stylesheets turned off, then you’ve failed. Sadly, there are thousands of websites out there that fail that test, whether it’s because of Flash, Javascript, ActiveX, or nothing more than really gnarly standards-noncompliant CSS.

        Clean and simple design first, then bells and whistles if they actually enhance site functionality somehow: that’s what’s important in web development. AJAX can actually provide a great deal of enhancement to the interface, and it’s not always a bad idea, but there needs to be consideration for those who can’t or won’t use Javascript as well.

        Again, the first rule of web development should be quite obvious. Make sure it degrades gracefully.

      • #3088035

        The sorry state of web development

        by mindilator9 ·

        In reply to The sorry state of web development

        I’m gonna have to disagree with Jaqui. To say that Flash should only be used for advertisements is extremely myopic, to say the least. Just because you have no vision for Flash, or are not skilled enough to use it for other things besides banner ads (Flash is like a big GIF animator to you, isn’t it), does not mean its use should be restricted to your limited knowledge.

        The examples for AJAX given by the author nowhere alluded to its inherent solution to the ActiveX debacle. Sure, there’s probably a better way to do that too, but if you don’t know what it is, I would just as soon stay silent on the issue.
        And finally, what is the point in comparing desktop based apps to web based apps? You start your tirade on bad web developers and their CMSs and end it with “And at the end of the day, even the most basic network aware dekstop (sic)application is easier, faster, more secure, blah blah blah, better in every measurable way than the best Web application.” How does that statement even qualify? I’m gonna skip “Why do I care?” and go straight to “Well, duh.” It kind of has to be. First, it runs on the local machine’s processor, not the server processor too, nor the transfer friction that goes with it. That’s a neat little metaphor meaning all the attributes of a web connection that slow down the information’s travels. Obviously desktop apps don’t have to wait for their requests to come back over hundreds to thousands of miles and various other weak links in the chain. If your point is that open source developers suck because their apps won’t run as well as desktop apps, then you have no point at all. Oh and btw, keep your eye out for the trend where Microsuck and everyone else does away with desktop apps and makes you subscribe to crap like Orifice (Office for the humor impaired) online.

        “Has anyone actually tried to make something basic even work right?” No, I’m sure that every web developer in the world made it their personal goal to write absolute crap and pass it off as gold. Not a single one of us wants to do anything quality in our work. But hey, being a developer yourself, you knew that. Dincha. Unfortunately I took away nothing of value from your post because the whole thing has the miasma of your negativity surrounding any statement that comes close to coherence.

        Good luck on your CMS search.

      • #3087904

        The sorry state of web development

        by zetacon4 ·

        In reply to The sorry state of web development

        I read this blog with interest, due in no small part to my long history of creating business applications hosted by the best browser technology of the time. It’s been a painful journey, but today, I can report a lot of good news to all the nay-sayers out there. I do not confuse a browser-hosted application with a standard public web page. The two have almost nothing in common.

        As a web developer and interface designer of many years, I can attest to how difficult it can be to do a good job in this area of design. The latest capabilities of browsers like Firefox and Safari help the programmer design and implement some very pleasant interfaces. I am still amazed at the phobia against JavaScript for simple client-side interface enhancements. There seems to be a great deal of misinformation floating around about this tool. If you attempt to make your web page behave as nicely without JavaScript as with it, using only CSS properties and actions, you will end up standing on your head, and scratching it a lot too! And still you won’t have the simple, efficient interface you can implement with just a small amount of JavaScript coding.

        The one thing I would love to see is a browser that is completely as sophisticated as a desktop application. I think there is a need for loading of encrypted scripting from the server and running it as tokenized code within the browser. This will allow client scripting to run as fast as native coded desktop applications for the most part. The programmer could feel a bit more secure with his coding too.

        The second thing needed to allow us “programmers” to build mature, user-friendly interfaced applications within the browser is a truly universal and standardized DOM API. No exceptions, nothing left out. This API would be portable to any browser or other program needing a truly universal web-smart engine for networked data processing. Everybody’s browser would behave exactly the same because they used this engine, rather than re-inventing the interface all over again.

        And, finally, the one big complaint our blogger buddy mentioned was the look and feel of the graphic design, and how easy and simple it should be to use an application or web page! (Remember, they ARE two distinct things.) The issue will never be what slick development environment you are using, or what flavor of script or other programming tool you happen to be using to create your masterpiece. The issue will remain how all these tools, methods, and environments work together to render a pleasant and truly useful human work tool. It’s a very complicated subject. We will probably never cease discussing, abusing, redesigning, and improving on it. It will remain one of developers’ and programmers’ favorite topics.

      • #3088511

        The sorry state of web development

        by jaqui ·

        In reply to The sorry state of web development

        A web page that includes client-side scripting is:
        1) theft of computer processing capacity.

        2) unauthorised access to electronic resources.

        If users had to specifically enable JavaScript, then they would be choosing to allow it.
        Because the default setting is to have it turned on, the end user isn’t asked, and that does not mean implied consent.

        A web-based application is completely different, and would benefit from fancy bells and whistles.

        But if a web page requires bells and whistles, then it is extremely poorly designed, and not designed for what the web is for: access for all to information.

        I’m seriously considering developing a browser that will show exactly how many websites are both stealing from people and not coded for security:
        no plugins, no JavaScript, no ActiveX, no VBScript, and if it’s not using SSL, it displays the page with a red wash over everything.
        It would be a great tool for developing secure web pages and apps.

      • #3088438

        The sorry state of web development

        by roho ·

        In reply to The sorry state of web development

        I have to say there is some truth somewhere in this article, but it is overshadowed by a lot of frustration.
        Over the last few years a lot has been improved about web sites and web applications. This is still relatively fresh territory, and progress is maybe slower than one would hope, but progress is there. I agree that there are still a great many sites that provide bad interfaces and show that they are based on hyped but not understood technology.

        On JavaScript: it is currently enjoying its second life. It came about in two different DOM flavors, was abused for a lot of things (nice clocks floating around your cursor!), and was considered dangerous for a long time. Then it was decided that you should use it as little as possible, better not at all.
        Then the “unobtrusive JavaScript” idea was born. Since then, JavaScript has slowly been coming back into the mainstream. With JavaScript you can get a better user experience. Client-side validation can help a lot when filling out a long form. But developers must always implement server-side validation as well.
        AJAX can also add to the user experience, but it is still very new and not yet of age. It is still a big hype. Reason enough to hold off for a while and let the community develop good practices for using it. There is much hype and experimentation around, and many of these experiments are fun and cool, but not really usable.

        I will just leave the other subjects like XML alone.

        My opinion is that the Internet and the Web are slowly becoming better and better. But they have their growing pains. Overall, however, I think things are improving.

        The statement that

        web developers generally stink, not just as programmers, but as user interface engineers.

        is way, way out of proportion. My opinion is that slowly, but surely, things are improving, and there will always be some inconsistencies (browsers, DOM, CSS, etc.) that will make building good stuff a little bit more of a challenge. And I do like a challenge, however frustrating these can be at times.

      • #3088328

        The sorry state of web development

        by wayne m. ·

        In reply to The sorry state of web development

        Similar Conclusions

        I largely agree with J. Ja and feel the current state of web technology is ill suited for application support; despite the promises of easy distribution and maintenance for web-based software, most end users need to be dragged kicking and screaming from their client-server versions (and rightfully so).  I would suggest the following changes are necessary before we can establish full-fledged web-based applications: return to a session-based model, define a rich set of user interface primitives, combine presentation and business logic, and adopt a client-multiserver model.

        Session-Based Model

        As J. Ja alluded to in his writing about HTTP, the sessionless model has failed, and it is time to return to a session-based model along the lines of Telnet.  In a distributed application, the vast majority of the processing power and short-term storage exists at the client, yet the sessionless model tries to push processing back to the server and transfer information back and forth repeatedly.  The user thinks he has an application session, and the developers play tricks to simulate a session; everyone gets the need for a session except the HTTP standard.  The result is increased software complexity, increased processing requirements for the centralized resources, and the lack of a graceful degradation path (if you would like more on this last item, just ask!).

        Rich User Interface Primitives

        The reason that things like JavaScript, ActiveX, AJAX, and Flash came into being is that the HTML primitives are far too basic to provide a full-featured user interface.  Instead, developers are forced to write custom augmentation to perform common, repeated tasks.  Why can’t we have a text box that has a label, formatting controls, spell checking, grammar checking, table formatting, etc. built in?  Why can’t we have a number box that has a label, only accepts numeric characters, and restricts entry to specified bounds?  Why can’t we have a date box that has a label, calls a calendar, and can bounds-check a date against the current date?  Why can’t we have a list box that matches against partially typed data (who was the idiot who decided to cycle based on retyping the first character)?

        Combine Presentation and Business Logic

        To provide a good user experience, applications need to provide timely feedback to the user.  We cannot have a batch-mode concept of submitting a page to be processed; rather, we need to apply business rules and indicate the results to the user as the data is entered and becomes available.  Although this seems to disagree with one of J. Ja’s recommendations, I would argue that this is the root cause of several of his issues.  Because of the attempt to separate presentation logic from business logic, presentation languages do not support the constructs to implement business logic, creating the need for language hybrids such as JavaScript and ASP script.  Because of the use of different languages, we have created tiers of developers with different skills.  Rather than logic being placed in the most optimal place in the architecture, it is placed based on the development language skills of the particular writer.  No wonder the code is disorganized.  It is this separation of presentation and business logic that prevents us from developing the rich user interfaces I described above.

        Client-Multiserver Model

        The next major advance in application development will be the service-based model.  The idea of breaking a single screen into several areas that are independently updated needs to be extended to have several areas that are independently updated by different servers.  The client needs to be the central point to consolidate and distribute information.  To do this through a central server fails to take advantage of the distributed processing power of the client machines.  This also leads to large-scale code reuse, allowing major functions to be implemented once and pulled together where needed.  With a single-server model, common functions are typically rewritten for each application’s server.  This model also allows us to start to create multiple custom, role-based applications instead of one-size-fits-all general application interfaces.

        Summary

        The web-based model was largely an attempt to provide a friendlier version of FTP: a way to provide a verbose directory listing and download files without manipulating directory structures.  It does not provide an adequate framework for interactive applications, and many of the inherent capabilities of Telnet and 3270 interfaces have been lost.  It is time to create a technology that provides the benefits of a web-based application in a manner that is actually usable.

      • #3088283

        The sorry state of web development

        by staticonthewire ·

        In reply to The sorry state of web development

        I think there are two main causes for the problems you describe with web-centric software.

        The first is that web-centric apps have a much lower bar for programmer
        entry than standalone or LAN-based apps ever did, and as a consequence
        you get a crowd of n00bs building apps. And of course, they’re making
        all the traditional and time-honored n00b errors. So on one hand, you’ve got amateurs
        with powerful tools. On the other hand, you have the complexity of the
        problem space. Face it – it’s harder to build a web-centric app than
        any single-client or state-aware LAN app.

        Your specific focus is on CMS apps, and boy, that’s a doozy. I’ve
        written more than one in-house CMS app myself, and this is a truly
        messy place to be. Just defining the problem space proved intractable –
        one character thought “content” was limited to the text and pretty
        pictures that would appear in a web page, the next included
        advertisements. One person included javascript snippets as data,
        someone else had a special category for code. One wanted cross-browser
        generation that would handle Netscape 2.0 and up, another was an IE
        fascist…. it was neverending.

        Most CMS systems deal with this by NOT dealing with this. They hand out
        a system that can do it however the user wants, they add a few
        templates that can slice the system a few different ways, and leave it
        up to the client. Which is probably the right way to go right now,
        given the singular lack of maturity for this market. But of course it
        makes for a fairly intractable and very fragile end product…

        I get why you’re p.o.’d, and I’ve been in the business long enough to
        have a vague grasp as to why these problems exist, but to be
        honest, I really don’t see what you expect. You position the discussion
        on the bleeding edge of a nascent technology and then complain that
        everything is unstable and in flux. What do you expect, given the real
        estate you’ve staked out for yourself?

        Software grows through evolution
        and versioning, and in every fresh arena to which software solutions
        are applied, you find that there is at first a short period dominated
        by a single form, followed by a “wild west” period, during which an
        insane proliferation of (often laughably unfit) forms takes place.
        That’s where cutting edge web-centric apps are, right now. It’ll all
        settle down in a bit, and the cutting edge of software will perform the
        usual random walk to some currently unexplored area of human endeavor;
        at that point, people will start discussions on the topic of how stodgy
        and unexciting web development has become…

        I thought Veen’s discussion was thought provoking in some ways, but
        also to some degree he was failing to take the landscape into
        consideration. But of course he admitted that he was being deliberately
        provocative and inflammatory, as he wished to spark a discussion. The
        most interesting thing I found in his page was the series of links to
        various CMS systems… none of which satisfy me.

        I am currently tasked with developing yet another in-house CMS system; I can
        build or buy, it’s up to me. I have to be able to manage 16 different
        text document types, six audio types, four video types, and the usual
        slew of image types, with data residing directly in MySQL and remotely
        across a corporate network, I have six levels of user access, an ad hoc
        “project” editor that lets users create any sort of assembly they want
        of the data they have available in the systems, object versioning, and
        get this, they want boolean ops! I have to be able to apply fuzzy set
        logic to multiple projects and come up with a coherent content
        generation schema. Wish me luck…

      • #3087817

        The sorry state of web development

        by tony hopkinson ·

        In reply to The sorry state of web development

        A nice rant.

        First of all, the Web was not designed for applications, and in fact it still isn’t.

        The standards vacuum allowed a whole pack of vested interests to leverage their market goals into the technology.

        I completely agree about the development/debugging difficulties, but that’s really down to the lack of standards.

        Client-side scripting, as implemented, is a security nightmare. I don’t care how much it improves the experience. I’m completely uninterested in having my resources controlled by any third party. They’re MINE.

        I completely disagree about XML. OK, some sort of compression would be nice, but a standardised interface between applications is what we are screaming for, so whatever mechanism is used must have metadata or it’s a complete waste of time.

        HTTP is stateless, and should remain so. All the difficulties we are having are a result of attempting solutions that require stateful communications in such an environment. So why did we get here? Someone saw an opportunity not to have to micro-design application-specific front ends! Yet another attempt at the non-technical type’s holy grail of all-things-to-all-men, super-duper, don’t-need-any-skills, cheap-arse software development.

        Went wrong again, didn’t it: instead of needing no skills, you need ten or more.

        Instead of doing all things well, it did no things well.

        Instead of meeting all men’s needs, it met no man’s needs.

        Is this situation new? Well, no.

        Sorting it out is very simple: use the right tools for the job, or design an environment where the right tools to do the job can be created.

        You can hammer in a nail with a wrench, but you’re likely to smash your thumb, bend the nail, break the wrench, and end up with two pieces of timber that still aren’t joined together. And that’s when you’re careful.

        Regards Tony

      • #3087792

        The sorry state of web development

        by billt174 ·

        In reply to The sorry state of web development

        Let’s see: we want one application that will run correctly and look the same across how many systems and in how many browsers? I’m sorry, but what is the complaint? We’re lucky there is even half the compatibility that there is. With the competing businesses and the level of cooperation needed to make this happen, we are lucky to have what we have.

        With desktop apps there is usually one technology and one small set of designers and users.

        Web developers? Some folks seem to think that they are the grand designers of everything that goes on a site. What planet do they live on? On my planet I have managers, clients, and graphics people who all know more than me and make sure I know it. After I redo the design of a site for the fourth or fifth time, it’s a little difficult to find time to make the code work correctly. We have groups having meetings about what new technology will make us more productive. No one asks the question: is technology the problem, or is too much of it the real problem? I’ve been programming for over 12 years, and the technology and expectations have come at a faster pace each year; it just doesn’t make a lot of sense to spend time and money on perfecting what’s here and now, since it will probably be gone in a year or two. It’s business, just business.

      • #3087778

        The sorry state of web development

        by jlatouf ·

        In reply to The sorry state of web development

        There is a creative solution to every difficulty. As has been mentioned in some of these comments, the Internet itself has evolved in ways never envisioned by its original architects. It shouldn’t be surprising that creative individuals have overcome the obstacles involving “state preservation” and efficiency (“full page refreshes” and interactive content).

        Yes, AJAX does require a commitment to excellence and the willingness to persevere with a new paradigm in an industry that is lucrative enough using the “old methodologies”.

        Isn’t this exactly the type of recipe for market capitalization and success that levels the playing field and allows a “diamond in the rough” to join the big kids on the block?

        This manure pile of “pragmatic difficulties” allows the few and the brave to become pathfinders who will forever change the efficiency of the Net.

        The term AJAX may be a little over a year old; however, this author has been committed to, and excelling at, this paradigm for more than five years. I have developed and refined a technology base which I call “Portable Interface” (PI) technology, which employs many of the same methodologies as AJAX.

        As cited in this blog, there have been numerous obstacles; however, the end result is well worth the effort and perseverance.

        In addition to the traditional websites built for clients, PI has empowered “Global Online Graphical User Interfaces” which are user-friendly, information-friendly and network-friendly.

        From my perspective web development is only in a sorry state if creative individuals become content with the past and hesitate to move into the future.

      • #3087763

        The sorry state of web development

        by codebubba ·

        In reply to The sorry state of web development

        J.Ja,

        You do make some good points here. Web development is a mess in a lot of ways.  I still, myself, prefer doing client/server development.  The “web” as it is presently designed (requiring a browser environment for presentation) is, IMHO, a kludge.  I personally think that it would have made more sense to use the Internet as the underlying data transport layer but to implement the presentation outside the confines of a “browser”.  Still … that’s the model that we all, apparently, decided upon so at the moment I guess we need to make the best of it.

        The .Net model I’m seeing using WinForms while still providing linkage through the Web seems like it might be a good intermediate step in the right direction.  However, this particular technology is not yet mature … it will be interesting to see what the whole thing looks like in another 10 or 20 years (by the time I retire).

        And you’re right … what’s up with this blog page and the refresh of everything every time you type a character?  Yuck!

        -CB

    • #3088611

      What To Do About The Sorry State Of Web Development

      by justin james ·

      In reply to Critical Thinking

      What To Do About The Sorry State Of Web Development

      A commenter on the previous article (The Sorry State Of Web Development) made a good point: I put out a lot of negativity without offering anything constructive in return. Well, I’m going to rectify that mistake.

      Here is what I think needs to be done to improve the Web, as far as programming goes. I admit, much of it is rather unrealistic considering how much inertia the current way of doing things already has. But just as Microsoft (eventually) threw off the anchor of the 640 KB barrier for legacy code, we need to throw off the albatrosses around the neck of Web development.

      HTTP

      HTTP is fine, but there needs to be a helper (or replacement) protocol. When HTTP was designed, the idea that anything but a connectionless, stateless protocol would be needed was not in mind. Too many people are layering stateful systems that need to maintain concurrency or two-way conversations on top of HTTP. This is madness. These applications (AJAX applications in particular) would be much better served by something along the lines of telnet, which is designed to maintain a single, authenticated connection over the course of a two-way conversation.
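      To illustrate the difference, here is a minimal Python sketch (toy protocol, invented port number) of the kind of persistent two-way session described above: the per-session state lives in memory for the life of the connection, instead of being reconstructed on every request as with stateless HTTP.

```python
import socket
import threading
import time

def stateful_server(port):
    # Toy session server: one TCP connection carries the whole conversation,
    # so session state (here, a message counter) persists between requests.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    count = 0
    while True:
        data = conn.recv(1024)
        if not data:
            break
        count += 1  # state survives from one request to the next
        conn.sendall(f"msg {count}: {data.decode()}".encode())
    conn.close()
    srv.close()

threading.Thread(target=stateful_server, args=(9099,), daemon=True).start()
time.sleep(0.2)  # give the server a moment to start listening

cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", 9099))
cli.sendall(b"hello")
r1 = cli.recv(1024).decode()
cli.sendall(b"again")
r2 = cli.recv(1024).decode()
cli.close()
print(r1)  # the server remembers this was the first message...
print(r2)  # ...and that this is the second, over the same connection
```

      An HTTP equivalent would need a cookie or session token on every request just to recover the counter; here the connection itself is the session.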

      HTML

      HTML is a decent standard, but unfortunately, its implementation is rarely standard. Yeah, I know Firefox is great at it, but its penetration still “isn’t there” yet. More importantly, while being extremely standards compliant, it is still just as tolerant of non-standard code as Internet Explorer is. If Internet Explorer and Firefox started simply rejecting non-standard HTML, there is no way a web developer could put out this junk code, because their customer or boss would not even be able to look at it. Why am I so big on HTML compliance? Because the less compliant HTML is, the more difficult it is to write systems that consume it. Innovation is difficult when, instead of being able to rely upon a standard, you need to take into account a thousand potential permutations of that standard. This is my major beef with RSS: it allows all sorts of shenanigans on the content producer’s end of things, to make it “easy” for the code writers, which makes it extraordinarily difficult to consume in a reliable way.
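      To see what strict rejection buys, here is a small Python sketch using a standard XML parser, which applies exactly the zero-tolerance rule proposed above: it refuses the mismatched tags a browser would silently repair. (The markup snippets are invented examples.)

```python
import xml.etree.ElementTree as ET

strict = "<p>Hello <b>world</b></p>"   # well-formed markup
sloppy = "<p>Hello <b>world</p>"       # unclosed <b>; browsers render it anyway

root = ET.fromstring(strict)  # parses cleanly, no error

try:
    ET.fromstring(sloppy)
    rejected = False
except ET.ParseError:
    rejected = True  # a strict parser refuses the tag soup outright
print(rejected)
```

      A consumer built on top of such a parser never has to guess where the author meant to close the tag, which is the whole point.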

      When developers are allowed to write code that adheres to no standard, or a very loose one, the content loses all meaning. An RSS feed (or HTML feed) that is poorly formed has no context, and therefore no meaning. All the client software can do is parse it like HTML and hope for the best.

      JavaScript

      This dog has got to go. ActiveX components and Java applets were a good idea, but they were predicated on clunky browser plug-ins, slow virtual machines, and technological issues which made them (ActiveX, at least) inherently insecure. The problems with JavaScript are many, ranging from the interpreters themselves (often incompatible interpretations, poorly optimized, slow) to the language itself (poorly typed, pseudo-object oriented, lacking standard libraries) to the tools to create it (poor debugging, primarily). JavaScript needs to be replaced by a better language; since the list of quality interpreted languages is pretty slim, I will be forced to recommend Perl, if for nothing else than its maturity on both the interpreter end of things and the tools end. Sadly, Perl code can quickly devolve into nightmare code, thanks to those implicit variables. They make code writing a snap, but debugging is a headache at best, when $_ and @_ mean something different on each and every line of code, based on what the previous line was. Properly written Perl code is no harder to read and fix than JavaScript. And Perl already has a fantastic code base out there.

      Additionally, the replacement for JavaScript needs to be properly event-driven if it is ever to work well in a web page. Having a zillion HTML tags running around with “onMouseOver()” baked into the tag itself is much more difficult to fix (as well as completely smashing the separation of logic and presentation, which I hold to be the best way of writing code) than having TagId_onMouseOver() in the script block.

      The client-side scripting also needs the ability to open a direct data connection to the server. Why does an AJAX application need to format a request in HTTP POST format, send it to an application server which does a ton of work to interpret the request, pass it to an interpreter or compiled code, which then opens a database connection, transforms the results into XML, and then passes it back over the sloppy HTTP protocol? Wouldn’t it be infinitely better for the client to simply get a direct read-only connection to the database via ODBC, named pipes, TCP/IP, or something similar? If we’re going to use the web as a form of distributed processing, with the code managed centrally on the server, this makes a lot more sense than the way we’re doing things now.
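      The read-only direct connection idea can at least be sketched with today's tools. The following Python example uses SQLite purely as a stand-in for a real server database (the table and data are invented) and shows the safety property such a connection would need: the client can query directly, but any attempt to write is refused at the connection level.

```python
import os
import sqlite3
import tempfile

# Stand-in for the server-side store: a throwaway database file.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
rw = sqlite3.connect(path)
rw.execute("CREATE TABLE products (name TEXT, price REAL)")
rw.executemany("INSERT INTO products VALUES (?, ?)",
               [("widget", 9.99), ("gadget", 19.99)])
rw.commit()
rw.close()

# The "client" opens the same store in read-only mode: direct queries work...
ro = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
names = [row[0] for row in
         ro.execute("SELECT name FROM products ORDER BY name")]
print(names)

# ...but writes are rejected by the connection itself, not by app-server code.
try:
    ro.execute("INSERT INTO products VALUES ('bogus', 1.0)")
    writable = True
except sqlite3.OperationalError:
    writable = False
print(writable)
ro.close()
```

      A production system would still need authentication and row-level access control on top of this, which is a big part of why the middle tier exists; the sketch only shows the read-only half of the bargain.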

      XML

      XML needs to be dropped, except in appropriate situations (where two systems from different sources that were not designed to work together need to interoperate, or for tree data structures, for example). Build into our client-side scripting native methods for data transfer which make use of compression and of delimited and fixed-width formats for “rectangular” data sets (XML is good for tree structures, and wasteful for rectangular data), preferably with the format automatically negotiated between the client and the server, and we’re talking massive increases in client-side speed and server-side scalability. This would only add a few hours of development time on the server side, and would pay dividends for everyone involved.
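      The overhead claim is easy to demonstrate. Below, the same invented 100-row rectangular data set is serialized once as XML and once in a delimited format; the XML version repeats every field name twice per row and comes out several times larger.

```python
import csv
import io

rows = [(str(i), f"item{i}", "9.99") for i in range(100)]

# XML: every row repeats the <id>, <name>, and <price> tags.
xml_doc = "<rows>" + "".join(
    f"<row><id>{i}</id><name>{n}</name><price>{p}</price></row>"
    for i, n, p in rows
) + "</rows>"

# Delimited: field names appear once, in a single header line.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["id", "name", "price"])
writer.writerows(rows)
csv_doc = buf.getvalue()

ratio = len(xml_doc) / len(csv_doc)
print(f"XML: {len(xml_doc)} chars, CSV: {len(csv_doc)} chars, ratio {ratio:.1f}x")
```

      Gzip narrows the gap because repeated tags compress well, but the client still pays to parse them, which is the speed argument made above.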

      Application Servers

      The current crop of application servers stink, plain and simple. CGI/Perl is downright painful to program in. Any of the “pre-processing” languages like ASP/ASP.Net, JSP, PHP, etc. mix code and presentation in difficult-to-write and difficult-to-debug ways. Java and .Net (as well as Perl, and the Perl-esque PHP) are perfectly acceptable languages on the backend, but the way they incorporate themselves into the client-to-server-to-client roundtrip is currently unacceptable. There is way too much overhead. Event-driven programming is nearly impossible. Ideally, software could be written with as much of the processing as possible done on the client, with the server only being accessed for data retrieval and updates.

      The application server would also be able to record extremely granular information about the user’s session, for usability purposes (what path did the user follow through the site? Are users using the drop-down menu or the static links to navigate? Are users doing a lot of paging through long data sets? And so on). Furthermore, the application server needs to have SNMP communications built right into it. You can throw off all the errors you want to a log, but it would be a lot better if, when a particular function kept failing, someone was notified immediately. Any exception that occurs more than, say, 10% of the time needs to be immediately flagged, and maybe even cause an automatic rollback (see below) to a previous version so that the users can keep working while the development team fixes the problem.
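      The flag-at-10% idea reduces to a small rolling monitor. Everything in this sketch (the class name, the window size, what happens when the alarm trips) is invented for illustration; a real application server would wire the alarm to an SNMP trap or a rollback hook.

```python
from collections import deque

class ErrorRateMonitor:
    """Track the outcomes of the last `window` calls and raise an
    alarm when the failure rate exceeds `threshold`."""
    def __init__(self, window=50, threshold=0.10):
        self.recent = deque(maxlen=window)
        self.threshold = threshold

    def record(self, ok):
        """Record one call outcome; return True if the alarm should fire."""
        self.recent.append(0 if ok else 1)
        return sum(self.recent) / len(self.recent) > self.threshold

monitor = ErrorRateMonitor(window=20, threshold=0.10)
flagged = False
# 15 successes followed by 5 failures: 25% failures within the window.
for ok in [True] * 15 + [False] * 5:
    if monitor.record(ok):
        flagged = True  # here: send the SNMP trap, or trigger the rollback
print(flagged)
```

      The sliding window matters: a single failure on a busy function should not trip the alarm, but a sustained failure rate should, and quickly.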

      Presentation Layer

      The presentation layer needs to be much more flexible. AJAX is headed in the right direction with the idea of only updating a small portion of the page with each user input. Let’s have HTML where the page itself gets downloaded once, with all of the attendant overall layout, images, etc., and have only the critical areas update when needed. ASP.Net 2.0 implements this completely server-side with the “Master Page” system; unfortunately, it’s only a server-side hack (and miserable to work with as well, as the “Master Page” is unable to communicate with the internal controls without doing a .FindControl). Updates to the page still cause postbacks. I would like to see the presentation layer have much of the smart parts of AJAX built in; this is predicated on JavaScript interpreters (or better yet, their replacements) getting significantly faster and better at processing the page model. Try iterating through a few thousand HTML elements in JavaScript, and you’ll see what I mean.

      The presentation layer needs to do a lot of what Flash does, and make it native. Vector graphics processing, for example. It also needs a sandboxed, local storage mechanism where data can be cached (for example, the values of drop down boxes, or "quick saves" of works in progress). This sandbox has to be understood by the OS to never have anything executable or trusted within it, for security, and only the web browser (and a few select system utilities) should be allowed to read/write to it.

      Tableless CSS design (or something similar) needs to become the norm. This way, client-side code can determine which layout system to use based upon the intended display system (standard computer, mobile device, printer, file, etc.). In other words, the client should be getting two different items: the content itself, and a template or guide for displaying it based upon how it is intended to be used. Heck, this could wipe out RSS as a separate standard; just have the consuming software display the content however it likes, based upon the application’s needs. This will also greatly assist search engines in accurately understanding your website. The difference (to a search engine) between static and dynamic content needs to be eradicated.
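      The content-plus-template split described above can be sketched in a few lines: one piece of content, several presentation templates, and the consumer picks whichever fits its display. The templates, field names, and content are all invented for illustration.

```python
from string import Template

# One piece of content, independent of any presentation.
content = {"title": "Quarterly Report", "body": "Sales rose 4 percent."}

# Several templates; the consuming client chooses one based on the
# target display (screen, feed reader, printer, ...).
templates = {
    "screen": Template("<h1>${title}</h1><div>${body}</div>"),
    "feed": Template("${title}: ${body}"),
}

screen_view = templates["screen"].substitute(content)
feed_view = templates["feed"].substitute(content)
print(screen_view)  # marked-up rendering for a browser
print(feed_view)    # plain rendering for a feed consumer
```

      Nothing about the content changes between renderings, which is exactly why a search engine (or an RSS reader) could consume the raw content and ignore the template entirely.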

      URLs need to be cleaned up so that bookmarks and search results return the same thing to everyone. It is way too frustrating to get a link from someone that gives you a “session timeout” error or a “you need to login first” message; it significantly impacts the website’s usability. I actually like the way Ruby on Rails handles this end of things. It works well, from what I can see.
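      At its core, the clean-URL idea is a stable mapping from path to resource, with no session token baked into the address. A minimal dispatcher sketch follows; the routes and handler names are invented, and Rails does this with far more machinery.

```python
import re

# Route table: every bookmarkable path resolves to the same resource
# for every visitor, with no session state embedded in the URL.
ROUTES = [
    (re.compile(r"^/articles/(\d+)$"), "show_article"),
    (re.compile(r"^/articles$"), "list_articles"),
]

def resolve(path):
    """Map a request path to a (handler, captured-args) pair."""
    for pattern, handler in ROUTES:
        match = pattern.match(path)
        if match:
            return handler, match.groups()
    return "not_found", ()

print(resolve("/articles/42"))  # same answer for every visitor, forever
print(resolve("/articles"))
print(resolve("/login;jsessionid=abc123"))  # session-tied URLs don't resolve
```

      Because the path alone identifies the resource, a bookmark or a search-engine result can never dead-end in a "session timeout" page.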

      Development Tools

      The development tools need to work better with the application servers and design tools. The graphic designers need to see how feasible their vision will be to implement in code. The graphic designers will also be able to see how their ideas and designs impact the way the site handles them; if they can see, up front, how the banner they want at the top may look great on their monitor but not on a wider or narrower display, things will get better. All too often, I see a design that simply does not work well at a different resolution than the one it was aimed at (particularly a fixed-width page that wastes half the screen when your resolution is higher than 800x600).

      Hopefully, these tools will also be able to make design recommendations based upon usability engineering. It would be even sweeter if you could pick a "school" of design thought (for example, the "Jakob Nielsen engine" would always get on your case for small fonts or grey-on-black text).

      These design tools would be completely integrated with the development process, so as the designer updates the layout, the coder sees the updates. Right now, the way things are being done, with a graphic designer doing things in Illustrator or Photoshop, slicing it up, and passing it to a developer who attempts to transform it into HTML that resembles what the designer did, is just ridiculous. The tools need to come together, and be at one with each other. Even the current “integrated tools” like Dreamweaver are total junk. It is sad that after ten years of “progress”, most web development is still being done in Notepad, vi, emacs, and so forth. That is a gross indictment of the quality of the tools out there.

      Publishing

      The development tools need a better connection to the application server. FTP, NFS, SMB, etc. just do not cut it. The application server needs things like version control baked in. Currently, when a system that seems to work well in the test lab develops problems when pushed to production, rolling back is a nightmare. It does not have to be this way. Windows lets me roll back with a system restore, or uninstall a hot-fix/patch. The Web deployment process needs to work the same way. It can even use FTP or whatever as the way you connect to it, if the server invisibly re-interprets that upload and puts it into the system. Heck, it can display “files” (actually the output of the dynamic system) and let you upload and download them, all invisibly, the same way a document management system does. This system would, of course, automatically add the updated content to the search index, site map, etc. In an ideal world, the publishing system could examine existing code and recode it to the new system. For example, it would see that 90% of the HTML code is the same for every static page (the layout) with only the text in a certain part changing, and it would take those text portions, put them in the database as content, and strip away the layout. This would rock my world.
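      Version-controlled deployment with instant rollback reduces, at its core, to never discarding the previous version on publish. A toy sketch (the class and method names are invented; a real system would version at file or changeset granularity):

```python
class DeployStore:
    """Toy publishing store: every publish keeps the prior versions,
    so a bad push can be reverted instantly instead of being a nightmare."""
    def __init__(self):
        self.versions = []

    def publish(self, content):
        self.versions.append(content)

    def current(self):
        return self.versions[-1]

    def rollback(self):
        # Drop the newest version, as long as something older exists to serve.
        if len(self.versions) > 1:
            self.versions.pop()
        return self.current()

site = DeployStore()
site.publish("v1: stable layout")
site.publish("v2: broken in production")
live_before = site.current()   # v2 is live
live_after = site.rollback()   # one call and v1 is serving again
print(live_before)
print(live_after)
```

      Pair this with the error-rate flagging described in the Application Servers section and the rollback could even be automatic: the alarm fires, the previous version goes live, and the development team debugs at leisure.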

      Conclusion

      What does all of this add up to? It adds up to a complete revolution on the Web in terms of how we do things. It takes the best ideas from AJAX, Ruby on Rails, the .Net Framework, content management systems, WebDAV, version control, document management, groupware, and IDEs and adds them all into one glorious package. A lot of the groundwork is almost there, and can be laid on top of the existing technology, albeit in a hackish and kludged way. There is no reason, for example, that SNMP monitoring could not be built into the application server, or version control, or document management. The system that I describe would almost entirely eliminate CMSs as a piece of add-on functionality. The design/develop/test/deploy/evaluate cycle would be slashed by a significant amount of time. And the users would suffer much less punishment.

      So why can’t we do this, aside from entrenched ideas and existing investment in existing systems? I have no idea.

      J.Ja

      • #3089538

        What To Do About The Sorry State Of Web Development

        by dfirefire ·

        In reply to What To Do About The Sorry State Of Web Development

        It is easy to look at what the Web has grown into now, throw the mess away, and start building it from the ground up again in a clean way. But what is clean? Not everyone shares the same idea.
        On top of that is the fact that the Internet is a growing and evolving thing, where many smart guys look for ways to get their ideas working within the possibilities offered by (also evolving) browser and server developers. Who has the right to abolish a technology that offers many developers a means to earn their living? Let’s say, JavaScript developers? It would be like changing the real world by stating: OK, bicycles are too slow and not capable of long distances, let’s replace them… what consequences would that have?
        The Net is very much like the real world. It’s a marketplace, where the smartest or richest vendor gets to lead the way, but sometimes a very good idea can make it.
        It would already be a lot easier if all browser developers stuck to the W3C guidelines, but I can assure you: even if this “brand new perfect way of the internet” should become reality, you will see that someone like Micro$oft will have a different implementation. And off we go again!
        I strongly disagree that XML should go. Instead, everything should evolve (or already has) in this direction. Developers should use only XHTML from now on, and evolve to pure XML/XSLT. That way many sorts of data can be sent, and the DTD specs can tell how to process them. As for the server-side technologies: there’s no unified taste, so let’s leave it up to the developers themselves. I am a strong Java advocate, for instance; why should I drop JSP for the .NET approach?
        And the bottom line: who will develop this new internet, get astronomically rich with it, and in the end steer the whole world down its path? Looks like I am getting a Déjà Vu!

      • #3089519

        What To Do About The Sorry State Of Web Development

        by andy ·

        In reply to What To Do About The Sorry State Of Web Development

        Mmmm – client scripting in Perl.

        If someone asked me what would be the one best way to ensure that most web pages were a complete disaster, I would probably say “have people do web scripting in Perl”.

        I agree that well-written Perl is perfectly acceptable, but in my experience that constitutes about 1% of what I’ve seen.

        Given that the majority of web script is written by people who aspire to be Visual Basic programmers “when they learn a bit more about computing”, Perl would be a total and utter disaster.

        Why not leave JavaScript as is for the masses, but add a standardised method of using today’s best languages (i.e. Java, and C# if it’s capable of being integrated into Firefox et al.) in the client?

        Why should I not be able to write client code in Java to make XML requests and process DOM objects in the browser?  Seems simple to me.

        Andy

      • #3087811

        What To Do About The Sorry State Of Web Development

        by kovachevg ·

        In reply to What To Do About The Sorry State Of Web Development

        My comments are grouped by section. I will use short quotes from the original for better reference.

        JavaScript

        “This dog has got to go” – No, this dog needs to be standardized, and Microsoft should abandon JScript. JavaScript is well established in the existing browsers, and replacing it would force a rewrite of so many web pages. Do you really contemplate so much effort and cost?!!! Any CFO will stop you immediately. The CEOs may fire you immediately. Sorry, business logic. Purely technical people don’t run companies, and they shouldn’t be allowed to, because they are only fascinated by how cool something is and rarely, if ever, pay attention to how much it is going to cost and how long it would take to transform an existing business model.

        I completely agree with you about the onMouseOver() – it should be where it belongs, in the script section.

        Perl is not an OO language; JavaScript is. So the purposes of the two languages are very different. There is object-oriented Perl, but I am not so familiar with it and thus cannot give an authoritative opinion on this one.

        “Wouldn’t it be infinitely better for the client to get a direct read-only connection to the DB via ODBC, named pipes, TCP/IP, or something similar?” – you are talking like a pure technologist. Remember there is something called an application framework. Multi-tier application architectures are designed for scalability. You cannot effectively scale with your approach – it presents a paradigm that makes things hard to manage when there are thousands of connections. On the back end you have a cluster with a hundred servers, and you have things like load balancing and transparent failover. I hope you see the shortcomings of your suggestion in this context. It would sacrifice scalability and redundancy – both necessary for e-commerce. The users want 24/7/365(6) availability. Sorry, business rules make life complicated.

        XML

        “XML needs to be dropped, except in appropriate situations.” It is precisely the appropriate situations that drive its existence. For example, a pharmacist can describe a prescription in XML – a prescription has a drug name, an authorizing signature, a date, and so on. A pharmacist does not understand HTML, and a custom description represents an appropriate situation from his standpoint. By now you probably see what I am getting at – appropriateness is highly subjective. XML allows you to define “unexpected” layouts for documents. They can then be translated to HTML through CSS and other formatting, but also to PDF, TIFF, etc.

        Application Servers

        “Event driven programming is nearly impossible. Ideally, software can be written with as much of the processing done on the client, with the server only being accessed for data retrieval.” Now you are talking FAT CLIENT. A Web browser is normally a thin client. Processing is left to the server – a powerful machine that can handle complex tasks. Have a look at the disadvantages of fat clients; Wikipedia should do it. In general they are good as programming tools, like the .NET tools that allow you to drop objects (VB programming as well). But for simple web presentation they are inadequate.

        “Any exception that occurs more than, say, 10% of the time needs to be immediately flagged, and maybe even cause an automatic rollback.” No, no, no, you cannot do that on a production system, especially in e-commerce applications. You potentially take out competitive advantages of the business model. If two companies offer you similar products, competition is based on the value provided by additional features. Say you buy books from Amazon or Barnes & Noble. Prices are the same. If one of the sites introduces a feature that lets you shave 5 minutes off the average purchase time, where would you go? … I thought so. Now imagine you roll this feature, and therefore this advantage, back – you will stampede the user, because he will think the feature is still available and will have to relearn the old ways. And for users who never used the old ways it will be a pain, and your company will get a thumbs-down because the quality of the user experience dropped – see CRM for more details. But again, you think like a technologist, not a business analyst. The majority of users are not technically savvy. I used to think like you. Then I took a Human Factors course, and my views on what technology should be will never “roll back”.

        Presentation Layer

        Updating a small portion of the page is a great idea – it reduces net traffic and speeds up response time as perceived by the users!!! Thumbs up.

        “Try iterating through a few thousand HTML elements in JavaScript.” You are not supposed to have thousands of HTML objects residing on a single HTML page. If you do, then you did not design it right, and you overwhelmed the users with content that should have been spread across several pages. It goes back to design – more is not more; just enough is more. I can’t remember the name of the famous designer who said that, but he is a renowned expert. Such massive programming should be done on the server side, where you have the processing power and the complexity control to do it effectively … with Java or other objects, NOT HTML objects.

        “It also needs a sandboxed, local storage mechanism where data can be cached … This sandbox has to be understood by the OS to never have anything executable or trusted within it …” Here you imply browser-OS integration. A browser is just another application, and even if it were integrated with the OS to that extent, I could install my own, recompiled version of the browser and do Black Magic. So, a noble idea, but with formidable practical implications. How many browser vendors can you persuade to do that? Open source will never play well with Microsoft. Again, Human Factors are at play.

        Development Tools

        “The graphic designers need to see how possible their vision will be to implement in code.” That would be wonderful, but they cannot do it. Graphic designers are people of art – they see things in colors, lines, shapes, paint, etc. They don’t understand HTML, JavaScript, and so on, or putting things in variables, and even if they did, they would be limited to one browser, and cross-compatibility would kill the whole thing. Sorry, we have got to bite that bullet as technologists.

        Later on you talk about tool integration – Photoshop and programming tools. Integration of disparate programs is the hardest of all. It comes back to cost.

        Publishing

        “The application server needs things like version control baked in.” Any skilled system admin will tell you that running development tools on a production system is not a good idea – complexity is one reason, security is another, but most of all, only the necessary applications should run in a production environment. This reduces the probability of a process going haywire. You can run an application server with version control built in in a testing environment; I couldn’t agree more. But that’s not what you had in mind, right? And you have application integration again – the toughest of all.

        “In an ideal world, the publishing system could examine existing code and recode it to the new system.” Those are the dreams of programmers: MACHINES THAT CAN THINK!!! Real AIs are decades away, my friend. Current technology does not allow machines to evolve naturally. The hardware is not compatible with the goal. You should probably consider some paradigms from biology, but with non-living materials the effect you seek will be hard to achieve.

        Conclusion

        “It takes the best ideas from AJAX, Ruby on Rails, the .NET Framework, content management systems, WbDav, version control, document anagement … and adds them into one glorious package. A lot of the groundwork is almost there, and can be layed on top of the existing technology, albeit in a hackish and kludged way”. So it is going to be a glorious package, in which things will be layed on top of the existing technology in a hacking and kludged way. What so glorious about the pakage then??? I’ve never seen a better oxymoron. The undertaking you are talking about is massive and will take years to dvelop. Some of the technologies are proprietary and you will not the vendors’ permission to use them and mix them as you see fit just because it will be easier for developers afterward. For example, Microsoft will never allow you to mingle .Net with J2EE. The players on the market protect they interests. It will take you years to negotiate the basic agreements – Human Factors, sorry.

        Some friendly advice: please take a good Human Factors course. If you embrace the basic concepts there, you will definitely see the world of programming with new and better eyes.

        I enjoyed writing these comments, and I thank you for your opinion.

      • #3087750

        What To Do About The Sorry State Of Web Development

        by davidbmoses ·

        In reply to What To Do About The Sorry State Of Web Development

        This is the second time in two days I have been directed to Jakob Nielsen’s site http://www.useit.com/

        Am I the only one who finds his site on usability extremely hard to use? My eyes flicked back and forth for a long time when I first opened that page. I have a hard time focusing on the content with colored backgrounds, and I find the page offers no proper focus for leading your eye through the content. I’m sure after you use that site a few times, you get used to it.

      • #3087569

        What To Do About The Sorry State Of Web Development

        by bluemoonsailor ·

        In reply to What To Do About The Sorry State Of Web Development

        Critical:

        1. Inclined to judge severely and find fault.
        2. Characterized by careful, exact evaluation and judgment: a critical reading.

        You’ve got definition number 1 down pretty well. Definition number 2, however, seems to elude you. “So why can’t we do this, aside from entrenched ideas and existing investment in existing systems? I have no idea.” Hmmmm… How about “Why can’t I climb Everest in one step, aside from it’s so tall? I have no idea”.

      • #3090537

        What To Do About The Sorry State Of Web Development

        by joeaaa22 ·

        In reply to What To Do About The Sorry State Of Web Development

        to: davidbmoses

        i agree with you about his site being less than user friendly when it
        comes to reading the content of Jakob Nielsen’s site.  for me it’s
        not so much the colored backgrounds as the fact that everything is
        thrown together in two big lumps of words.  there’s nothing to
        differentiate anything in the two columns.  at least he could have
        done it as an outline format or something.  even an unordered
        bullet list would help.

        and sorry about the lack of caps.  my keyboard seems to be unhappy with me at the moment.

      • #3090231

        What To Do About The Sorry State Of Web Development

        by Anonymous ·

        In reply to What To Do About The Sorry State Of Web Development

        J.Ja, I have three letters for you: lol
        The comments others submitted made several good points which I won’t restate.
        Sorry, but I could not let this page go without laughing at it… at least you were not completely wrong, just in most cases.
        No hard feelings.

      • #3084852

        What To Do About The Sorry State Of Web Development

        by kreynol3 ·

        In reply to What To Do About The Sorry State Of Web Development

        How can you say “…as well as completely smashing the separation of logic and presentation which I hold to be the best way of writing code” in one paragraph and in the very next paragraph say “The client-side scripting also needs the ability to open a direct data connection to the server.” ??  Do you know anything about database connections?  There is a cost for each usually.  You have a limited number of these.  You seriously want every browser that can be opened to have a connection to the database?  LOL!  Good luck with that.  Ever heard of a connection pool?  Look it up, you’ll do your employer a huge favor and maybe not look like a fool in your next design meeting.
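The connection pool mentioned above is easy to sketch. The following is a hypothetical illustration in Python (the `ConnectionPool` class and the `create_conn` factory are invented for the example and not tied to any real database driver): it shows why a small fixed pool scales where one-connection-per-browser does not.

```python
import queue

class ConnectionPool:
    """A minimal connection pool: a fixed number of connections
    shared among many callers, instead of one connection per client."""

    def __init__(self, create_conn, size=5):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(create_conn())

    def acquire(self, timeout=None):
        # Blocks until a connection is free, rather than opening a new one.
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        # Returns the connection for the next caller to reuse.
        self._pool.put(conn)

# Usage: thousands of browsers can share a handful of connections.
pool = ConnectionPool(create_conn=lambda: object(), size=3)
conn = pool.acquire()
# ... run a query ...
pool.release(conn)
```

The point of the pattern is the `size` cap: the database only ever sees a bounded number of open connections, no matter how many clients show up.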

        I was reading your post to this point and then realized I wish I had never started reading it.  I hope I can purge what I read from my mind or at least apply some kind of mental flag to it so I do not accidentally use it without remembering where it came from.

      • #3084789

        What To Do About The Sorry State Of Web Development

        by mgtucker ·

        In reply to What To Do About The Sorry State Of Web Development

        I “hear” this conversation like people standing around the general store discussing the best way to shoe a horse.  Shoeing horses became irrelevant when motorized coaches were mass-produced. 

        If someone comes up with a “killer” application that millions of users want, those users won’t care if the application is written in Fortran.  People did not buy the first PCs or Apple computers because they had an Intel or Motorola chip.  They bought their first computer to run a spreadsheet or type a letter (the “killer” apps).

        The first 20 years of any new idea or technology is called the “golden years” of that new paradigm.  Automobiles, telephones, aircraft, radio, television.  Do the math for the personal computer and you see the “golden years” have passed.  The “Year of the LAN” was “foretold” again and again until the mid-’90s and the [effective] “public birth” of the internet.  (Yes, I know the history of the internet.  I’m talking about when the public “woke up” to the internet.  That is also when the public woke up to email.)

        That means to me that web stuff will only carry its novelty another 10 years (give or take a few months).  Then, even this “web development” discussion will be (for all intents and purposes) moot.

      • #3086150

        What To Do About The Sorry State Of Web Development

        by deedeedubya ·

        In reply to What To Do About The Sorry State Of Web Development

        I’m sorry to hear that you don’t know quite what you’re talking about.

        For example, “Having a zillion HTML tags running around with
        “onMouseOver()” baked into the tag itself is much more difficult to fix
        (as well as completely smashing the separation of logic and
        presentation which I hold to be the best way of writing code) than
        having TagId_onMouseOver() in the <script> block (or better yet,
        in an external file; JavaScript is capable of doing this, but it is
        rarely used).”
        Yes, it is rarely used, but that’s not JavaScript’s fault, is it? That would be the idiot implementers of JavaScript. Don’t blame JavaScript for it. Really, JavaScript is actually a wonderful language and has many useful and properly implemented features. I don’t claim it to be perfect, of course. People see JavaScript just like they do Flash. The idiots that implement both are typically idiots with no sense of style or structure. They make flashing doo-dads and nonsense instead of making proper code and systems. This is the fault of the idiot, not the languages.

        Your negativity isn’t much less than before. Why must you feel that everything needs to be tossed by the wayside? Do you believe we should just have a nice, big flood and start over? I’ve been told we’ve tried that one before. I’m sure someone has Noah’s email address if you want it.

        If you think you’re so smart, why don’t you write up some proper recommendations for us to implement. When we see your wonderful work, we will bow down to your superiority. Until then, how about you say something that might be useful.

        (By the way, this silly comment posting thing is difficult to use in Firefox)

      • #3085370

        What To Do About The Sorry State Of Web Development

        by kreynol3 ·

        In reply to What To Do About The Sorry State Of Web Development

        A 20 year lifespan, huh?  The Internet is gonna be dead in 10 years, huh?  Wow, how do you get from point A to point B?  I still use a car on a road.  The car “idea” has been around for what, maybe 100 years?  What about the road, because that is a better analogy for the Internet (infrastructure)…  let’s see… the Roman Empire…  I really don’t know the age of the idea of roads… I’d guess 2,600 years – maybe a lot more, and maybe even some other, even more ancient civilization…  If anyone is a betting man, I’ll take a bet with you that the Internet will be here for at least as long as television has been (which is also older than 20 years – and the Internet will be the end of television, by the way).  Just a guess, but the Internet will outlive me, and hopefully I have at least 50 years left.

      • #3085362

        What To Do About The Sorry State Of Web Development

        by duckboxxer ·

        In reply to What To Do About The Sorry State Of Web Development

        No offense, but do you like anything?  Unfortunately you sound like a very unhappy person in need of a new career.

      • #3085145

        What To Do About The Sorry State Of Web Development

        by dna708133 ·

        In reply to What To Do About The Sorry State Of Web Development

        Hi all,

        All I can say is JavaScript and the HTML interface never realised their full potential…

        http://www.online-media-services.com

        dna

    • #3089390

      Bad Excel! Bad! BAD!

      by justin james ·

      In reply to Critical Thinking

      So here I am, checking out the Excel spreadsheet that my customer is using to compare it to what I am currently working on. To see if they made any changes to the ODBC queries, I popped it open in a hex editor. Guess what? Not only were the queries there (as I expected), the connection string, INCLUDING THE PASSWORD was there, in plain text! Unbloody real.

      J.Ja

      • #3089060

        Bad Excel! Bad! BAD!

        by somebozo ·

        In reply to Bad Excel! Bad! BAD!

        well, didn’t they teach u excel was only good for school assignments… not enterprise apps…

    • #3085647

      Hey guys, the floppy drive is dead

      by justin james ·

      In reply to Critical Thinking

      [addendum 3/5/2006] Just hand me the “Putz of the Year” award. There was a patch out there for Partition Magic which removed the error I kept getting; it was blaming its inability to copy on a bad file system, when in reality it itself was the problem. Also hand out a “Putz of the Year” award (dated 2001) to the person(s) at PowerQuest who allowed that bug to escape their QA process, and who took quite a number of months (a year?) to fix the problem in their software.

      I’ve been struggling with a bad hard drive for a few days now. Five, to be more exact. Not the “struggling” where I don’t know what to do or how to replace it, but struggling with various utilities to get it to copy the data over to the new drive that came on Wednesday (today is Sunday). So far, I’ve tried the Windows XP Install CD (XCopy so graciously will not copy most of the files, even in Safe Mode), Partition Magic (keeps finding errors with the drive that CHKDSK isn’t finding/fixing), Maxtor MaxBlast, and Western Digital’s Data Lifeguard. What does this have to do with floppies?

      All of these tools, except for the Windows Install CD and Maxtor MaxBlast, let me create a “Rescue Diskette” of some sort, but don’t have the option to burn this diskette to CD! Computers have been coming without floppy drives standard for quite a number of years, but the utilities haven’t caught up. If it were a matter of simply hooking up a floppy drive, there would not be a problem; I have a spare floppy drive on the shelf just for this type of situation. But the sad fact is, the motherboard on the current PC simply does not have a connector for a floppy cable! I bet 95% (or more) of people who purchased a computer from an OEM (HP/Compaq, IBM, Dell, Gateway/eMachines) during the last three or more years are in the same boat that I am.

      Without being able to boot something with its own boot loader, this operation just does not seem to be working. For whatever reason, upon booting with the new drive, certain things are not working right: my Windows toolbar is lacking the address bar, and more importantly, Office 2003 just isn’t working right. It requests a reinstall, but the installer/uninstaller says that it is an invalid patch file. I tried the fix from Microsoft, and it worked, but with those two really obvious errors in the disk copy, I really did not feel like taking my chances and sending the original drive back.

      It is just a bit frustrating that system utility makers (especially Microsoft, who helped push boot-from-CD for Windows installations, as well as pushed OEMs to ditch floppies) don’t take into consideration how many people simply lack a floppy drive. I’m sitting here, knowing that all I need to do is a standard Windows backup, create an Emergency Repair Disk, then run the Automated System Recovery, and I can’t.

      Have I ever mentioned how amazed I get at the lack of consideration on the part of developers for their users?

      J.Ja

      • #3085639

        Hey guys, the floppy drive is dead

        by sbrooks ·

        In reply to Hey guys, the floppy drive is dead

        Had the exact same problem a while back; then I tried an external USB floppy disk drive, booted Ghost fine, and copied across to the new drive with no problems. It’s likely that any new mobo without FDD connectors will support boot from USB, and so should boot from the external USB drive. It is also quite easy to create a bootable CD using an FDD image file in most CD burning programs; I have used this method to create a bootable CD from a Ghost floppy that also contained the OS image I wanted to install. No matter what the problem, there is almost always a way around it!

        Steve 

      • #3085029

        Hey guys, the floppy drive is dead

        by georgeou ·

        In reply to Hey guys, the floppy drive is dead

        I’m with the previous poster.  Bootable CDs are the only way to go.  In fact, they boot 10 times faster than floppies.  I keep plenty of IMG floppy images, and I’ve trained plenty of helpdesk staff to use them with bootable CDs that can mount network drives or USB mass storage devices so they can offload their files.

         

        This is also exactly why I put all my data in a partition separate from the C drive.  Microsoft should have separated user data into a different partition by default starting with Windows 2000!  Then I keep a perfect PQDI or Ghost image of my C drive, and I can just reflash it with Ghost in 10 minutes from a bootable DVD, and I’m back up and running with all my data perfectly intact in the other partition, even though I re-imaged my C drive.

    • #3268283

      Message To Programmers: Try Using The Junk You Produce

      by justin james ·

      In reply to Critical Thinking

      Yes, this is another insensitive, inflammatory “Programmers Stink” post.

      Why?

      Because THEY. MAKE. MY. LIFE. MISERABLE.

      Today’s problem?

      Visual Studio 2005.

      [addendum: 3/16/2006] A few commenters have made it clear that I didn’t explain the problem very well. I am not working with SQL Server; that has excellent management tools. I am working with SQL Server 2005 Express. It is the same database engine, but it works off of files and is designed to be installed onto a client along with your software, so that you don’t have to worry about the customer having a full database server. SQL Server 2005 Express has absolutely ZERO management tools outside of Visual Studio that I have found. No other tool that I am aware of “knows” how to access a SQL Server 2005 Express database, unless I were to set up an ODBC connection to it (I haven’t tried it, but it may work). The point is, I followed the directions in the help files, and they were incredibly difficult to use (this is the first time in the nearly 20 years that I have used computers that I have had a problem with COPY/PASTE), and the software did not make it easy at all. Especially frustrating was the “You have an error!” box that popped up for each and every error without explaining what the error was or providing a way of stopping the process causing the error without killing the entire process. I want to see you click “OK” 24,000+ times and tell me that you’re working with smart software.

      I’m writing a small application and I want to use the SQL Server 2005 Express database engine. I have the data elsewhere, in a FoxPro table, that I want the application to use. I exported the data to CSV. Visual Studio won’t import that to the SQL Server 2005 database. FoxPro didn’t work either. Excel didn’t work. Text file didn’t work. In fact, there is no “import” feature at all, unless I want to write Transact-SQL code. At the end of the day, Visual Studio won’t copy a table from one data source to another. I can’t do a “SELECT INTO” between data sources.

      What do the help files say? Open the data and do a copy/paste. So I tried that. For hours, it was just trying to do one row at a time, and paste the contents into one cell. Eventually I figured out how to get it copied right. Why do I need to figure out how to “copy correctly”? Am I missing something?

      The piece of garbage decided to have an error on one column. Instead of telling me what the error is, it decided to be vague. To make it worse, it pops up an error on every single row that I’m pasting. All 24,000+ of them. It was faster and better to trash the whole Visual Studio session. Your “error handling” should NEVER make the user’s life worse. The joke is, Access does this well: it does the import, then creates a second table with the errors. Simple. Elegant. Quick. Useful. Non-annoying.
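The Access behavior praised here – import everything you can, then collect the failures in one place instead of interrupting the user once per bad row – is simple to express. A minimal sketch in Python (the `import_rows` helper, the `must_be_numeric` rule, and the sample rows are all hypothetical, standing in for any import tool):

```python
def import_rows(rows, validate):
    """Import every row that passes validation; collect the rest in an
    'errors' pile instead of raising a dialog box per bad row."""
    imported, errors = [], []
    for row in rows:
        try:
            validate(row)
            imported.append(row)
        except ValueError as exc:
            # One errors "table" reviewed at the end, not 24,000 popups.
            errors.append((row, str(exc)))
    return imported, errors

def must_be_numeric(row):
    # Example validation rule: the qty field must be all digits.
    if not row["qty"].isdigit():
        raise ValueError("qty is not numeric")

rows = [{"qty": "12"}, {"qty": "oops"}, {"qty": "7"}]
good, bad = import_rows(rows, must_be_numeric)
# good holds the 2 clean rows; bad holds the 1 failure with its reason
```

The design choice is the whole point: errors are data to be reported in bulk afterward, not events that block the import one row at a time.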

      To all of you lazy programmers, stupid programmers, and other jerks who never actually thought about how your users would use your lousy software: I HATE YOU. I want to find where you live and cause you as much aggravation as your products cause me. For every hour of my life that a computer has saved, bad programming and poor user interface design wastes two. I want to find the moron who wrote this, go to their house, and waste 8 hours of their life, maybe force them to watch Bloodsucking Freaks 4 times. EIGHT HOURS OF MY LIFE HAVE BEEN LOST TO WHAT SHOULD BE A 5 MINUTE TASK. I am eternally grateful that I get paid salary and not per-project, I’d be eating out of Dumpsters at this rate…

      And the sad part is, this problem has caused me significantly less pain than Google’s Customer Dis-Service Department has recently.

      I swear, I want my next job to be 75% done with paper and pen…

      J.Ja

      • #3266182

        Message To Programmers: Try Using The Junk You Produce

        by apotheon ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        I know it’s not much consolation when you have to use specific proprietary applications for work, but you might try using more open source software. For the most part, open source software is developed by people who want to use it, not by people who are just being paid to produce something for others.

      • #3074898

        Message To Programmers: Try Using The Junk You Produce

        by dogknees ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        You seem to be abusing yourself! Visual Studio is a tool used by programmers to write programs. If you’re using it, you are a programmer. If it’s so bad, why aren’t you writing a better one?

        >>Why do I need to figure out how to “copy correctly”? Am I missing something?

        You have to figure out how to copy correctly because, like everything in life, if you don’t do it correctly it won’t work – pretty obvious, really. This idea that somehow the system should figure out what you’re trying to do, and how, is ridiculous.

        If it gives you an error, that means you’re trying to do something incorrectly. If I try to tell my spreadsheet that 1+1=3, I expect and want it to complain, and the more insistent I become, the louder and nastier the complaint should be. It should never “guess” that I meant to type 2 but was tired today. It should do precisely what you tell it to. If you tell it to do the wrong thing, that’s what it should do, and the consequences are yours.

        If I’m dumb enough to try pasting null data into a column that won’t accept nulls, it should refuse, for every line. I made the mistake; I should be subject to some consequences. How else does one learn not to do it?

        I’m confused though, if Access is the tool for the job, why are you trying to use something else?

         

      • #3074813

        Message To Programmers: Try Using The Junk You Produce

        by tom.bunko ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        Yeah, why can’t the software do what I want it to do, rather than what I tell it to do….

      • #3074768

        Message To Programmers: Try Using The Junk You Produce

        by dancoo ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        IMHO Visual Studio is for programming, not database manipulation.

        SQL Server has various tools built into it that can easily load data from places like CSV files. I used to always use the command-line tool BCP, but there are SQL statement equivalents. I forget what they are, but try looking up “Load” in the SQL Server help. There is absolutely no need to do cut and paste. And BCP gives fairly helpful error messages, too.

      • #3074717

        Message To Programmers: Try Using The Junk You Produce

        by mindilator9 ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        LMFAO an even better question is why don’t you use php5 or python or something like that where YOU CAN ACTUALLY PROGRAM YOUR OWN ERROR MESSAGES! OMGZ! and use it fast and learn it easy. as was mentioned earlier, why are you using a widget builder to do simple database manipulation. god, no wonder you’re frustrated. open source is the way to go, as was also mentioned, because yes we do use each other’s stuff, and when we want to change it, it’s usually pretty easy. assuming you’re familiar with the language. all of your problems sound like an 8 year old who’s frustrated that his legos won’t assemble themselves into a masterpiece sculpture. you know that kid, he’s always trying to force different size or shape pieces to fit, sometimes breaking off a peg if it helps. take a lesson from the Demotivation posters, “When you earnestly believe you can compensate for a lack of skill by doubling your efforts, there’s no end to what you can’t do.” there’s a silver lining though, these mistakes you’re making that are pissing you off so much are actually teaching you what you need to know to be a good programmer. if you can see that. just stop blaming others for not building an IDE that reads your mind. i recently got an mvc framework dropped in my lap and the comments are somewhat lacking. i’m still struggling to find a way to subclass his controller in a way that plays nice with the rest of his subs, but while i hate that he didn’t make his code clear i don’t feel like he’s incompetent. the system works so obviously he’s not. i’m also relatively new to the mvc pattern having only played with a handful of test cases, so i account for my learning curve. i could easily moan that his lack of comments wasted two days of figuring out what implements what, but that’s just the breaks. the way i see it, whining about it is a bigger waste of time. you really want a job that’s 75% pen and paper? what’s the other 25%, using an abacus? or serving food?

        not trying to come down too hard on ya, but come on man your problems are your own.

      • #3074687

        Message To Programmers: Try Using The Junk You Produce

        by bluemoonsailor ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        1) Create an Access database

        2) Link to external tables, point it at the SQL 2005 mdf file.

      • #3074612

        Message To Programmers: Try Using The Junk You Produce

        by justin james ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        I’ve put my reply to these comments up (http://techrepublic.com.com/5254-6257-0.html?forumID=99&threadID=184332&messageID=1979179&id=2926438) in the form of a new post. Many of these comments started a lot of thought in my mind, thanks a lot!

        J.Ja

      • #3077168

        Message To Programmers: Try Using The Junk You Produce

        by vaspersthegrate ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        Blame the “Don’t Worry Be Crappy” school of Guy Kawasaki for it. Or the “Screw Jakob Nielsen” design anarchist/narcissists.

        All it takes is 5 typical users in a well designed User Observation Test to get rid of most bugs.

        I feel your pain. And anger. Cuz I yam Vaspers the Grate. 

      • #3076989

        Message To Programmers: Try Using The Junk You Produce

        by laconvis ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        With users like this, it is no wonder that programmers refuse to communicate.  There is a partnership between good programmers and good users.

        In today’s environment, the idea of a job that does not use Information Technology is hilarious.  I do hope that you find such a job – maybe trash collector, sewer cleaner, or zoo cage maintenance.  They all have a future without any technology.

      • #3075160

        Message To Programmers: Try Using The Junk You Produce

        by lobster-man ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        You get no sympathy from me.  I have been programming for thirty years and have always stayed current.  When you look at how easy things are to program today compared to Autocoder, assembler languages, Fortran, and COBOL, you really appreciate how much Microsoft has helped the industry, and specifically the programmers in that industry.  Today I am using Visual Studio 2005 and have also used both SQL Server 2005 and the SQL Server 2005 Express tools.  Like any programming tools, they are difficult to learn, but once you know them they are productive.  The suggestion of using Access was a good one; it is a simple database that has been working well for years.  Or why not develop using the full SQL Server product and then downsize to Express?

        Be thankful that you are in a profession that has been around many years and will continue to be around well into the future.  Don’t get frustrated if something takes a little while to learn, because it will pay off later.

        CharlieC

      • #3075075

        Message To Programmers: Try Using The Junk You Produce

        by kbremer1 ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        Unfortunately, with some free software you have to go that extra mile to get the tools/help that you need.  You should have downloaded the “SQL Server 2005 Express Books Online” and done a search on Import.  Then you would have learned about using the BCP command-line application, which can import and export data from and to files.

        I would also suggest that you download the “SQL Server Management Studio Express 2005 CTP” to help you with creating ad-hoc queries and managing your databases.

        Ken

      • #3264980

        Message To Programmers: Try Using The Junk You Produce

        by narg ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        I personally do a little programming, and I know a lot of high-end programmers.  Trust me, I’ve asked the same question of them, and of myself as well.  It just doesn’t work like you’d think.

        Programmers know their stuff inside and out.  They know where the buttons to do this and that are.  They know why the program acts like it does.  So, in the end, they basically become complacent about their work.  I don’t mean that in a bad way, but they learn to work around what their work will and will not do.  Place that same product in the hands of an average user and, lo and behold, the difference in interaction can be eye-opening to a truly concerned programmer.  (Personally, I think Vista is heading in this direction; they are over-killing the “usability” by taking away the interface tools we are used to, but that’s another discussion.)

        It’s virtually impossible to anticipate the interactions of 6 billion people in a program’s abilities and shortcomings.  I’ve tried to interject as much user interaction and consideration into my work as I can, and it’s impossible to please all of them.  True, some programs seem to have attempted no usability considerations at all, but most do.  It’s a daunting task, to say the least.

        My personal favorite Microsoft response is “it works as designed”.  You know they banged their heads on how to make a program respond to certain input, then gave up, thinking “that’s good enough”.  Makes you really wonder sometimes, though.

      • #3106043

        Message To Programmers: Try Using The Junk You Produce

        by bjasper1 ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        This person has likely never worked with a SQL Server of any kind before. Pasting data into a table is easier than creating the table to begin with.

        But even the most experienced database programmer would try a small sampling of data prior to trying to paste 27,000 rows.

        I find it interesting that people are touting the benefits of open source for someone who does not know what he is doing to begin with.

        Looking at the poster’s profile, he insinuates he is a programmer. I know if his resume ever crosses my desk, I will not be hiring him.

      • #3271261

        Message To Programmers: Try Using The Junk You Produce

        by daferndoc ·

        In reply to Message To Programmers: Try Using The Junk You Produce

        Hear, Hear!!!  I have been saying the same thing for 25+ years.

        Does anyone else believe, as I do, that Microsoft programmers do all their development work on Macs?  If they used Windows, it might work.

         

        FLS

    • #3074614

      Users never do the wrong thing, you wrote the wrong code

      by justin james ·

      In reply to Critical Thinking

      This is a response to a comment on a previous article: “Yeah, why can’t the software do what I want it to do, rather than what I tell it to do….”

      <sigh>

      This is one of those cases where I get mad that computer communication doesn’t carry vocal inflection. If you’re being serious, then you truly understand what my gripe is. If you’re being sarcastic, it’s exactly that attitude that is the problem.

      The user should not have to adapt to the software; the software should adapt to the user. The user pays to use software in order to gain productivity. The software designer is not paying the user. If your software is not easy to use, the customer can always go get someone else’s software. If the user experiences any pain while using your software, whether through your poor design or their own ignorance or incompetence, someone else will be more than happy to write software that helps that user and take their money away from you.

      I recommend that anyone who writes software that interacts with humans try using Perl for a few months. The whole language has “do what you mean, not what you say” as an explicit design goal. As a result, it is one of the most pleasurable languages to work in that I have ever used. The language itself virtually never gets in your way. It is an incredible world of difference. .NET languages, Java, and other languages I have worked with (recently) have only been able to keep up with Perl in terms of rapid development because they have such gigantic libraries. As languages in and of themselves, they are light-years behind.

      Why am I mentioning all of this here?

      Because of the concept of “gee, well you’re the one who clicked on the wrong thing.” Wrong. Unless the user makes a mistake, such as a typo or a mis-click, the user has always done precisely what they thought would accomplish their goal. ALWAYS. To believe otherwise is designer hubris, an attitude that needs to be eradicated from your thinking if you ever want to be a great (or even good, let alone employed) programmer, engineer, or any other designer.

      Here’s an example: propositions on ballots. If you have ever lived in a state that does these, you know exactly what I am talking about. You are presented with one to five paragraphs of dense legal text, and in the end, asked to vote “yes” or “no”. Here in South Carolina, a heavily demanded proposition barely passed. Exit polls indicated that it had won overwhelmingly. What happened? A significant portion of voters simply did not understand which of the two choices they were presented with would actually vote for the proposition. I know that I re-read it three or four times, and still was not sure if “Yes” would produce the end result that I knew I wanted. Post-election polls that asked people what choice they made, compared against the result they wanted, showed this. The ballot was poorly designed.

      Software is all too often designed by people who have this designer’s hubris. If your software reacted badly to a user’s input, it is because you failed to perform proper error handling. More importantly, you failed to comprehend your user’s needs. Web forms are notorious for this: instead of anticipating that a user may put dashes or spaces (or no delimiters at all) into a credit card or phone number entry field, stripping out the non-numeric characters, and then processing the card, they limit the user’s input. The user doesn’t find out until too late that the field is limited to 16 digits; then they need to go back, remove the dashes or spaces, and finish entering the credit card number. Frustrating! Just make the field bigger! More importantly, those dashes and spaces help the user ensure that the numbers were put in correctly. It takes but one line of code to strip the junk characters out and reformat that credit card number (in Perl; other languages need a zillion lines to make and use a regex object…). Either way, five minutes of programming will make your users happier and provide you with more accurate data input.
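A sketch of that one-line cleanup, translated into Python for illustration (the function name and the 16-digit assumption are mine, not any particular site’s code):

```python
import re

def normalize_card_number(raw):
    """Accept '4111-1111 1111 1111', '4111.1111.1111.1111', or plain
    '4111111111111111' by stripping everything that is not a digit."""
    digits = re.sub(r"\D", "", raw)  # the one-line cleanup
    if len(digits) != 16:
        raise ValueError("Please enter a 16-digit card number")
    # Reformat into groups of four so the user can visually verify it.
    return " ".join(digits[i:i + 4] for i in range(0, 16, 4))
```

Whatever delimiters the user types, the processor receives clean digits, and nobody is bounced back to an error page to delete their dashes.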

      Instead, we get websites that kick you back with an error message (which, to make it worse, is often hard to find on the screen). That’s designer hubris in action.

      Let’s review some of the comments the previous article received to see some of this designer hubris in action:

      “I know it’s not much consolation when you have to use specific proprietary applications for work, but you might try using more open source software. For the most part, open source software is developed by people who want to use it, not by people who are just being paid to produce something for others.” – apotheon

      Sadly, much (if not most) open source software has even worse user interface design than proprietary/commercial code. All too often, the answer to even a simple item is in the manual, not the interface. A user should never have to read the manual for anything but the most complex tasks. How to accomplish simple, day-to-day tasks should be intuitive and easy. Too often, I have found using open source software to be like using a website where many of the pages aren’t linked to, but need their URLs typed in manually.

      “You have to figure out how to copy correctly, because like everything in life, if you don’t do it correctly it won’t work, pretty obvious really. This idea that somehow the system should figure out what you’re trying to do and how is ridiculous.” – gbently

      Wrong answer. When someone has table data in the clipboard, and is pasting into a table object, it is fairly obvious that the user is trying to populate the table with data, not an individual cell. If the program is in doubt, it should simply ask “what are you trying to accomplish, how can I help that to occur?”

      “IMHO Visual Studio is for programming, not database manipulation.” – dancoo

      I agree 100%. To load the data into MySQL, I used the LOAD DATA statement. If I wanted it to go into SQL Server 2000, I would have (probably) used the Import Wizard. If I wanted it in Oracle, I would have used sqlldr.exe. But I wanted it in SQL Server 2005 Express, which has no management tools that I could find. I followed the directions in the help system. They said to use copy/paste. The database tools in Visual Studio aren’t so hot. But they are still better than nothing, much of the time, and in the case of SQL Server 2005 Express, they seem to be the only thing going.

      “LMFAO an even better question is why don’t you use php5 or python or something like that where YOU CAN ACTUALLY PROGRAM YOUR OWN ERROR MESSAGES! OMGZ! and use it fast and learn it easy.” – mindilator

      Wow, this one had me laughing. I am not quite sure how my using Visual Studio compares to using PHP or Python. Last I checked, I was trying to load data into a database; what language I was writing my software in is 100% irrelevant. Even if it were relevant, this poster makes a really stupid assumption: that I had much of a choice in my language. This person is obviously trying to make me look stupid, saying that I am too stupid to use PHP or Python, and that I’m a fool for using Visual Studio. What if I told this user that I was using IronPython for Visual Studio, or Perl.Net? Would that make them happy? Probably not. It also shows this person’s ignorance. Visual Studio does a lot more than write web applications. In this case, I am writing a desktop application. This is why I was using SQL Server 2005 Express in the first place. PHP and Python aren’t too much help when writing a Windows desktop application!

      “just stop blaming others for not building an IDE that reads your mind.” – mindilator

      This is actually one of the things that a good IDE does. A good IDE, for example, when debugging, doesn’t just say that an exception occurred; it shows me the error. That’s a form of mind reading: it knows that I want to see the error that occurred. More to the point, I was simply following the instructions. My initial thought was to find either an import wizard or an import utility. The documentation did not mention either of those. Most importantly, a well designed system does exactly what I need it to do. If I need to treat my computer like a newborn puppy, I might as well give up now.

      “i could easily moan that his lack of comments wasted two days of figuring out what implements what, but that’s just the breaks.” – mindilator

      Not only could you, you should be concerned about his lack of comments. You wasted two days of your time working with his system. If he had spent a few hours writing some basic documentation, you would have saved a lot of time. Furthermore, what if he had changed the function of a parameter, but not the name? Without that documentation, you would need to inspect his code line by line to see what each parameter is used for. Additionally, the process of writing comments serves as a code review. Many times, it is during the comment-writing phase when it becomes obvious how to streamline architecture, add in additional functionality, and so forth. Good and great programmers write comments. Bad and average programmers don’t. I’m not saying that writing comments is enjoyable; it really isn’t. But the lack of comments is disturbing, particularly if he was aware that this was code that other people would be working with.

      “you really want a job that’s 75% pen and paper? what’s the other 25%, using an abacus?” – mindilator

      I’m actually glad you mentioned the abacus. Many abacus users are significantly faster at performing math than many computer users. What does that tell you? Yes, I want a pen and paper job. When I am programming, I spend about 50% of my time not writing code, if not more. I spend a good portion of time performing planning on paper. I spend time talking to my customer or end user determining their needs. I spend time sketching potential screenshots and showing them to others for critique and input. By the time I’m actually sitting down and writing code, it is a fairly quick and trivial process. It doesn’t take much time to write the code itself, and debugging is typically fairly quick as well, assuming I have access to a quality debugger. Things like breakpoints and watches (vs. “print ‘Executing line 678, the value of sFileName is ‘ . sFileName;”) cut debugging time by about 75%, and often help you find problems before they even occur, because you get to watch the logic you wrote in action.

      The point is, the user does not punch themselves in the head. A user who “clicked the wrong button” and wiped out their data is the victim of a bad interface design, not an idiot. Users don’t say to themselves, “gee, I want to wipe my hard drive”; they do these things because the design stunk. It is my belief that user interface design stinks because many programmers simply do not pay close attention to the needs of their users, do not test with real-life users, and have a bad attitude towards users. It’s time the designers started paying better attention. Look at Google. Google went from a nobody to a market giant by simply helping the user accomplish their goals (searching the Web for data) better than anyone else. They did this by providing a significantly better user interface, and by taking the time to write a better search algorithm. They identified the user’s need, and wrote software that met that need while simultaneously working the way the customer needed it to, instead of making the user work the way the software did.

      J.Ja

      • #3077167

        Users never do the wrong thing, you wrote the wrong code

        by vaspersthegrate ·

        In reply to Users never do the wrong thing, you wrote the wrong code

        Why don’t software providers do real User Observation Tests with typical customers? That is how most usability bugs would be caught.

      • #3077064

        Users never do the wrong thing, you wrote the wrong code

        by dougb ·

        In reply to Users never do the wrong thing, you wrote the wrong code

        J.Ja,

        I agree; 150%. One of my goals as a developer is to anticipate what a user might do accidentally and program allowances for that. If you want to enter today’s date in an application, you can enter 3.17, 3-17-2006, or 03/17/06. If you enter something my program does not understand, I do not pop up a message box that says “Error somewhere on the form. Try to find it.” It states that the date is not recognized and suggests the proper format, sets the focus to the date field, and selects the text.
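That forgiving date handling might look something like this in Python (an illustration with invented names, not DB’s actual code):

```python
from datetime import datetime

def parse_user_date(text, current_year=2006):
    """Try each format a user is likely to type before rejecting the input."""
    formats = ["%m.%d", "%m-%d-%Y", "%m/%d/%y", "%m/%d/%Y", "%m-%d-%y"]
    for fmt in formats:
        try:
            parsed = datetime.strptime(text.strip(), fmt)
        except ValueError:
            continue
        if "%y" not in fmt.lower():
            # Formats without a year default to 1900; assume the current year.
            parsed = parsed.replace(year=current_year)
        return parsed.date()
    # Only now complain, saying exactly what is wrong and how to fix it.
    raise ValueError(f"Date not recognized: {text!r}. Try MM/DD/YYYY.")
```

The error path is where the UI work DB describes comes in: name the offending field, suggest the format, and put the cursor back where the fix belongs.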

        Too few developers out there go to that level of detail. They want to get it cranked out, get their money, then move on to the next project. From what you have described about SQL Server Express, I would say you have every right to be frustrated. A database engine that cannot import/export data is about as useful as a canteen in the desert with a 2″ hole in the bottom of it.

        DB

      • #3075257

        Users never do the wrong thing, you wrote the wrong code

        by metalpro2005 ·

        In reply to Users never do the wrong thing, you wrote the wrong code

        I agree to a great extent with the comments laid out here. However, there are more dynamics in software development which shift the focus from ‘UI quality’ (or process support) to ‘producing code’. Clients tend to see software development as a technical issue. So clients put the emphasis on producing code (i.e. giving the user MORE options to get lost in) rather than doing the right things to support end-users.

        This does not mean the programmer is off the hook here, but my experience as a programmer is that even with the best intentions (trying to convince management otherwise) there simply is not enough time (commitment in every sense of the word) to do the ‘right thing’, especially in a web environment, which is stateless and platform-independent by nature.

      • #3075178

        Users never do the wrong thing, you wrote the wrong code

        by lee ·

        In reply to Users never do the wrong thing, you wrote the wrong code

        My favorite definition of a software error comes from “Software Reliability” by Glenford J. Myers.  I am showing my age now, but this book was written in 1976; this topic is not a new one by a long shot.

        “A software error is present when the software does not do what the user reasonably expects it to do.  A software failure is an occurrence of a software error.”

        Of course, I have heard developers react to that with “How can I possibly know what the user reasonably expects?”  Well, isn’t that half of the task of being a developer, finding out what the users expect?

        Lee

      • #3075045

        Users never do the wrong thing, you wrote the wrong code

        by apotheon ·

        In reply to Users never do the wrong thing, you wrote the wrong code

        re: open source software

        There’s a big difference between a v0.3.1 piece of open source
        software that is still only being developed by the original author and
        two of his buddies and a v1.0 or greater piece of open source software
        that has thousands of users, out of whom hundreds are contributing code
        to make it more “intuitive” and easy to use. I suspect you’re
        misjudging open source software by choosing your examples poorly.

        I’ve got news for you that you may find shocking: the examples
        you’re holding up as perfect specimens to demonstrate the differences
        between DWIM and simply doing what the hubristic designers expect
        people to bend over and take are great examples of how a mature OSS
        project kicks the butt of closed source proprietary projects almost
        every time. .NET and Java are closed source solutions. Perl is an open
        source project.

        Languages similar to Perl, such as Ruby and Python, are also open
        source projects. They also implement great regex capabilities (though
        Perl is still the best at regexen). The Rails web development
        framework, which uses Ruby, is about the most unsurprising (in terms of
        “when I do this, it should respond the way I expect”) development
        framework I’ve ever seen, though it’s unfortunately very surprising to
        come across something that works so well and easily for web application
        development. The Ruby language and community themselves are very
        focused around the “Principle of Least Surprise”, where it’s assumed
        that good design means things should work in a way that makes sense.

        All of this sort of thing comes about because of the open source
        development process, and the presence of a development community
        rather than a proprietary development shop run by a vendor. When
        your developers and users are one and the same, mature software means
        software that works exactly as you think it should.

        By the way, you seem to be a pretty big fan of Perl. Have you
        checked out the perlmonks.org community? You might find something there
        you like.

      • #3074996

        Users never do the wrong thing, you wrote the wrong code

        by kevin.cline9 ·

        In reply to Users never do the wrong thing, you wrote the wrong code

        “Good and great programmers write comments”

        Good programmers write code that is understandable without external comments.  See Martin Fowler’s Refactoring for some specific techniques of comment elimination.

      • #3076391

        Users never do the wrong thing, you wrote the wrong code

        by dogknees ·

        In reply to Users never do the wrong thing, you wrote the wrong code

        Surprisingly, I agree almost entirely with what you’re saying here.

        However, there is a fundamental divide between user level applications and development tools. In development, you need to know exactly what your code and the rest of the system are doing. I’m at a loss to see how you can develop software if you don’t have precisely defined syntax and semantics that completely define the environment.

        That was my main point in the post to which you responded. You’re using a developer’s tool, and developers’ tools must act in a precisely defined way to be useful. It’s not at all helpful for them to fix things up in the background.

        I’d still argue though that there is a limit to what the application should accept. At a certain point, it may become obvious that the user has no idea what they are trying to accomplish. Is it appropriate to keep guessing, or is it better to stop them, tell them to go read the manual, and try again?

        My example about attempting to force a spreadsheet to get a total of 3 when it adds two values of 1 is also relevant. The software shouldn’t allow the user to make blatant errors of logic. If I try to turn a GIF into a MIDI file, the system should complain; it is nonsense to attempt this.
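That kind of up-front sanity check is cheap to write. A blunt Python sketch (the conversion table and the names are hypothetical):

```python
import os

# Conversions that make sense; anything not listed is rejected up front.
SENSIBLE_CONVERSIONS = {
    ".gif": {".png", ".jpg", ".bmp"},
    ".wav": {".mid", ".mp3"},
}

def check_conversion(src_path, dst_path):
    """Complain immediately about a nonsensical conversion, instead of
    failing mysteriously halfway through the attempt."""
    src_ext = os.path.splitext(src_path)[1].lower()
    dst_ext = os.path.splitext(dst_path)[1].lower()
    if dst_ext not in SENSIBLE_CONVERSIONS.get(src_ext, set()):
        raise ValueError(f"Cannot convert {src_ext or 'unknown'} to {dst_ext}")
```

The point is that the refusal happens at the moment of the request, with a message that names both formats, not after the user has waited on a doomed operation.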

        I guess there’s another side to this. I like being told I’m wrong; it teaches me something, which is a good thing. If I’m not doing something in the right way, I want to be told there’s a better way rather than have the PC let me continue doing something dumb. I’m perpetually surprised when others react badly to correction. But that’s just me.

        Another issue. My definition of well written software is software that conforms precisely to its specification in every detail. It’s obviously the task of user-interface designers, analysts, system architects, and customers to define the system. All the programmer does is implement the specification as written. If there’s a problem, it’s not with the programmer, but with the designer/analyst, and the management that decides what resources are to be expended on the development.

        Regards

    • #3076990

      The user/developer disconnect

      by justin james ·

      In reply to Critical Thinking

      I’ve been blogging a bit lately about poor user interfaces and the developers who create them. Sometimes, the problem with the end result of software is caused by the initial project specifications. Here are some of the major pitfalls that can occur in any project.

      The initial project specifications are not correct

      All too often, the user has an idea of what they want in mind at the beginning of the project, and that vision is completely different from what they want by the end of the project. A project that takes a long time to develop is sure not to be finished before the business rules or needs change. Another problem that I have seen time and time again is that until the user gets their hands on a working version of the product, they do not find out whether their initial perceived needs are their actual needs. At the beginning of a project, the user may only want a few options, but once they see the power that the software is giving them, they want a lot more options. Or, to go the other way, they ask for a thousand options or ways of doing slightly different tasks, but find out that in reality there are only a few features they actually use, and the rest are waste.

      I combat this by providing sketches and mock screenshots throughout the development process. I try to involve the user as much as possible, because I have found that it is always better to spend an hour giving them a prototype (even if none of the widgets work) that lets them get an idea of how things will work, than it is to spend weeks or months writing code that they don’t want.

      Users seem to think that low-output projects are low work

      This is one of my favorite user misconceptions. They seem to think that the amount of output directly reflects the length and difficulty of the development process. Something happened today that is an excellent example of this. I am currently working on a system that ties maps to the user’s sales data and sales representative data. One of the people in my company sent me some information to think about using “as a demonstration.” He seems to think that it should be easy or trivial for me to create results based on this data, because the data set is significantly smaller than what we normally work with. Wrong! It is no more difficult to write a program to handle 1 million rows of database data than to handle 100 rows. It is almost always the number of columns and the complexity of the data transformations that make a project more or less difficult. I frequently have customers ask me to run an ad hoc report that will only generate a few “bottom line” numbers. They think that because they are asking for only one row of data, this is a trivial task. It is just as much work to produce the “bottom line” numbers as a full report, because the full report needs to be built to generate its bottom row anyways.

      The only way to beat this one is through educating your users. But there is a way to be nice about it, and give them a value add as a bonus. For example, in the bottom line number scenario, I go back to the user and say to them, “listen, to generate those bottom line numbers, I need to do a full report anyways. Would you prefer to get the full report while I am at it? It will take only a little bit longer to be able to provide a full report to you, and then you will be able to have the individual numbers as well.” This way I have impressed upon them that while the amount of output they are asking for is low, the amount of work is the same as a high-output request. In addition, I am giving them the option of having information that they might not have asked for because they thought it would be too much additional work to produce. The users are very happy with this approach, and after a certain point they come to you and say “would it be much harder to do XYZ versus ABC?”

      The project specifications do not match the users’ goals

      This one is always a problem, particularly when the people performing the work are separated by one or more degrees from the users. In fact, like a game of “Telephone”, the more layers between the developers and the users, the worse the problem gets. For example, I recently had a user who needed to get names and addresses from an Excel spreadsheet, insert the information into some text to generate a form letter, and be able to print those letters at Kinko’s. This was their actual goal. My project specifications were “take this Excel spreadsheet, insert these values into the text already entered into these cells, then save the individual sheets to PDF format, and make sure they look good.” Between trying to take flowing text in Excel and making it “look good”, and producing PDF output, I could tell that someone had given me project specs that were not properly aligned with the user’s needs. They had given me project specs that matched what they thought would be the best way of performing the task, not the task that the user actually wanted to be done. I dodged this by asking to know what the user had actually asked for. Once I found that out, the project was much easier – it took a few hours as opposed to a few days.

      Whenever you are given project specs, you need to get access to the user. In some companies, you may need to demand it. Many project managers get the attitude that you, the developer, needing to speak to the user reflects poorly on them as a project manager. That may be so, or it may not. But at the end of the day, you are going to be writing the code, not the project manager. It is in everybody’s best interest for your project specs to accomplish the user’s goals, not to fulfill a PM’s vision of how you should accomplish what they perceive the user’s goals to be. A PM who refuses to allow users to interact with the developers is a bad PM. Currently, my customers have my direct number. They take their requests directly to me, and they love the “self serve” aspect of it. Does my boss get involved? Sure he does, he needs to make sure that one customer’s needs don’t interfere with my time on other projects, that they are aware of what is within scope and out of scope of the contract, and so forth. But at the end of the day, the user is getting exactly what they need, they are in control, and they are delighted to spend the money because we are doing for them what their own staff cannot or will not do.

      At the end of the day, the smaller the gap between the user and the developer, the better the product will be in terms of meeting the user’s needs and expectations. Good communication between users and developers, while sometimes difficult, tends to save significant amounts of time and frustration. It is always better to spend an hour talking to the user, even if they are some empty-headed sales manager, to learn their real needs, than to spend three months designing and developing a project, only for the user to never actually use it, or reject it outright, because it doesn’t actually do what they need, even if it does what they asked for.

      More to come on this topic over the next few weeks…

      J.Ja

      • #3076963

        The user/developer disconnect

        by wayne m. ·

        In reply to The user/developer disconnect

        Look At XP & Agile Development 

        I would recommend looking at Agile development methods in general and Extreme Programming in particular.  These approaches address many of the issues described above.

        Agile methods focus on eliminating degrees of separation between developer and user, and on providing quick turnaround of functioning (not just shell) software.  Miscommunication is always possible, but it can be reduced by eliminating middlemen, and quick turnaround ensures that the miscommunication that does occur is identified and addressed as quickly as possible.

        It is quite a mind shift for many to realize that adding ever more, and ever more thorough, levels of review of intermediary products exacerbates the problem instead of improving the situation.  Look at XP; I think you will find that it addresses the issues described above and more.

      • #3076389

        The user/developer disconnect

        by dogknees ·

        In reply to The user/developer disconnect

        >>At the end of the day, the smaller the gap between the user and the developer, the better the product will be in terms of meeting the user’s needs and expectations.

        Agreed and agreed.

        There are two sides, though. Doesn’t the user bear some responsibility for taking the time and making the effort to learn about the options they have? It’s ultimately their responsibility to ask for what they really need. It’s not mine to try to lead them along some path of learning to think logically and thoroughly about their needs. If they don’t have the skills, then they need to go and learn them; I’m not a teacher.

        I’d expect the builder of my house to be familiar with the latest technologies available to him. I similarly expect a knowledge worker to make themselves familiar with the latest tools available to them, which in large part is software.

        Regards

      • #3076166

        The user/developer disconnect

        by wayne m. ·

        In reply to The user/developer disconnect

        Disagree – Developer Is Responsible

        Remember, the user already knows how to do his job and has been doing it.  The software is developed to make it easier for him to do his job.  As the technical expert, it is up to the developer to propose how the software can help the user.  The analogy of the house builder started out correctly, but the conclusion reversed roles.  Just as the house builder needs to provide expertise in the technologies available, the software developer must also provide this knowledge.

        Remember, software exists only to automate manual tasks.  If the automation provided by the software is not obvious, the user will not know to look for it and will continue to use the manual approach.  Always remember that the user-developer relationship is not peer-to-peer, rather the developer is providing support to the user and the developer cannot expect the user to meet him halfway.

      • #3076935

        The user/developer disconnect

        by fbuchan ·

        In reply to The user/developer disconnect

        As a developer for more than 2 decades, I have found that there are two basic situations that work poorly for everyone involved.

        The first is letting the client abdicate their responsibility. The developer (or some representative like a project manager) must focus the client, shape their expectations, and engender a transfer of their knowledge. Just as we don’t expect clients to be savvy about development options and techniques, they cannot expect their developers to be aware of their job needs. Requirements analysis is the foundation for communicating needs. It is the client’s responsibility to make an effort to convey those needs concisely and effectively, with help, of course. The method of requirements analysis is almost irrelevant as long as it results in good communication, but we can never forget that the problem with most software is that clients got exactly what they asked for in many cases, rather than what they needed. Heck, when I buy a meal at a restaurant I order what I want, and modify it as needed even then with comments like, “light on the salt.” Given the price of software, clients have to expect to provide valid and timely input.

        The second fatal approach to any software development is to expect developers to design a workable interface in a vacuum. As soon as I hear the words, “You know what we need,” and hear a developer reply, “Yeah, I do,” I shake my head sadly. Developers have a different focus than end-users, and cannot be expected to become cross-domain experts to the degree where they can magically envision some form of ultimate user interface. The only way they can even come close is if they shadow the user community for days, weeks or months. If software is modelling a process of any degree of complexity, the interface will either be complex and poorly done, or overly simplified, unless the usage experts (the end users) actually participate. The clients already understand what will be an assist, and the developer must be made to understand, not allowed a pretence of understanding.

        As for the comparison to the building of a house, consider that it is usually problematic to have a general contractor do your plumbing. It is far better to think of developers as medical providers. Some are GP types, and some are surgeons. You wouldn’t hire an eye, ear and nose doctor to do brain surgery; and while a brain surgeon can tell you that you have a cold, you’d be a fool to have him open your skull to do so. Expertise requires focus, and focus can be counterproductive to end-use simplicity. The best example of the gap between the developer and the user is seen in some excellent open source products like Firefox. Powerful and incredibly solid, yes; too damn hard to use for a majority of average users, apparently. Or, closer to home for developers, consider the latest Visual Studio product: alongside vast improvements in the user interface are pockets of aggravation caused by the fact that developers cannot even reliably create interfaces for developers.

        Finally, to add one pitfall to the list generated so far: “lack of value proposition is a serious challenge in developer/user relationships.” Users simply don’t know how to equate the cost of an undertaking with its value, and frequently cannot grasp how making their latest proprietary application look and feel like MS Office is a million dollar prospect. While I have been lucky to never walk into a development where the value proposition connection was totally lost, I have watched a few go off the rails while developers struggled to deliver impossible expectations for pennies. And I did once actually have to recompile an entire application to change the label beside a field on a screen called “Inspection” so that it read “Inspection Description” rather than “Description.” (And, yes, they had approved the field labels previously, but thought description was confusing.) Until clients understand the cost versus value of their requests, I fear that the design process will never truly mature.

      • #3075912

        The user/developer disconnect

        by dogknees ·

        In reply to The user/developer disconnect

        The example I gave regarding builders etc. is in regard to the client. In my environment, my clients are information workers. In the same way I expect a builder to know his tools, I expect them to know theirs. As the primary tools of the knowledge worker are software, I don’t think it’s unreasonable to expect a basic level of understanding, and for them to spend some of their time and effort getting themselves to this level. I see it as part of any professional’s job.

        I generally agree with the comments. But I still say there is a degree of responsibility that must remain with the client. To ask for what you want is the fundamental one. Despite the advances in science in the last few decades, mind reading is still not commonly available. If the client can’t spell out what they want, they haven’t given it sufficient thought or effort. A very wise person once said, “If you can’t measure it, you don’t understand it.” I’d paraphrase that as “if you can’t define your requirements, you don’t know what they are.”

        If you don’t really know what you want, how are you going to know when you get it?

        One of the respondents made the comment that software only exists to make tasks easier. Disagree vehemently. There are whole ranges of applications that have only become possible due to cheap fast computers. They weren’t done at all before. Simulation, various kinds of graphics, deep analysis of historic data, etc.

        Regards

      • #3265560

        The user/developer disconnect

        by sghalsasi ·

        In reply to The user/developer disconnect

        When I think of the developer-user communication gap and project failure, it is because of both of them. But I think, as the user is the origin of the requirements, rigorous brainstorming must be done to make the user communicate effectively. It should be seen as a separate responsibility of the organisation to equip their employees to state the requirements when the decision to go for a new system is finalised. The developer alone cannot be held responsible for requirements or project failure.

      • #3265496

        The user/developer disconnect

        by al_lee ·

        In reply to The user/developer disconnect

        Interesting to read some of the comments within the thread of the
        blogs. Everyone seems to be in agreement that proper requirements and
        immediate access to the user are the keys to successfully generating
        requirements and hence completing the work accurately and as expected.
        Let’s level-set the discussion with the over-simplified statement above.

        I’ve also read about processes and analogies regarding the proper
        delegation of responsibilities (i.e. doctors and builders) to properly
        execute on projects; yet I’ve not heard from the development community
        one comment on properly delegating the tasks to the appropriate
        skill sets: introducing and utilizing other development resources
        (a multi-disciplinary approach) to be responsible for defining,
        understanding and generating the users’ requirements.

        I’ve practiced XP/Agile/MSF/IPD/Crystal/UCD/PMP/et al. I’m a PM with
        an analyst background, and I’ve never started a successful project
        without the appropriate resources and skill sets. What I’m talking
        about is a multi-disciplinary approach. I delegate the requirements
        realization not only to a lead developer but also to a User Experience
        Designer. This individual has a usability background, is a visual
        designer and an information architect. You’re worried about taking what
        the user says as what the real requirements are? These are the people
        with the skills to do the proper ‘shadowing’ and interviewing
        techniques/observations/questions that will uncover the true tasks and
        activities and translate them into accurate requirements that are
        represented within the interfaces. They will even perform quick
        usability studies to validate the designs prior to development’s
        implementation. This provides enough time for the development group to
        absorb the business requirements, translate them into technical
        requirements, and execute on low-level system designs, proofs of concept
        and anything else required to make the technology work best.

        Maybe I’ve been lucky in that I’ve had very little difficulty in
        selling this to a client or organization; everyone from stakeholders to
        the project team can understand and accept the approach. But I’ve found
        this to be one of the most effective methods for interfacing with
        users. Developers should not be attempting this alone; they will
        already have too much on their minds to effectively provide this
        service. You would never expect the developers to write end-user
        documentation, so why would you expect them to go it alone to realize
        the requirements?

        The industry has to realize that in order for projects to succeed we
        need the appropriate skills represented to provide the level of
        competence and expertise to succeed. In order to do this we need to
        educate ourselves and to introduce other possibilities that broaden
        more than just the ‘development’ world, with methods that work and
        that don’t work. Hopefully this helps someone.

    • #3074397

      Programming “How” versus programming “Why”

      by justin james ·

      In reply to Critical Thinking

      I want to send my thanks out to all of the readers and commentators who have been spending so much of their time to post comments, debate, criticism, and more on my most recent string of blog posts. You guys are the greatest, and I think it is really awesome that you are putting as much, if not more, time and effort into discussing this blog as I spend writing it. My recent post regarding the relationship between users and developers has started a great thread of comments. One of the comments, from gbentley, provoked quite a bit of thought in my mind:

      “One of the respondents made the comment that software only exists to make tasks easier. Disagree vehemently. There are whole ranges of applications that have only become possible due to cheap fast computers. They weren’t done at all before. Simulation, various kinds of graphics, deep analysis of historic data, etc.” – gbentley

      This quote contains two commonly accepted ideas. The first is that “software only exists to make tasks easier.” The second idea is “[t]here are whole ranges of applications that have only become possible due to cheap fast computers.”

      Are these two ideas actually in opposition? Are either of these ideas even true?

      “Software only exists to make tasks easier.”

      I will call this “the assistant argument”. This one is so close to being a truth with no questions required of it. But it has a really big problem: the word “only”. Without “only”, this is a “duh” statement; of course software exists to make tasks easier. But to claim that software’s only purpose is to simplify tasks, without enabling new tasks? That’s a pretty bold claim. But it is easy to see why people hold this idea. Email replicates the postal system of letters and packages, spreadsheets replicate ledger books, word processors replicate typewriters (which themselves merely replicate paper and pen), and so forth. Indeed, with the exception of software that is directly related to computers, there are very few pieces of software which do not replicate an existing task that can be done without a computer (the qualifier is in reference to things like development tools or backup software; without computers there would be no need to write software and nothing to back up).

      “There are whole ranges of applications that have only become possible due to cheap fast computers.”

      I call this idea “the enabler argument”. This claim seems to be much more reasonable than the first one. It does not deny that software simplifies many tasks, but simply states that software and computers allow work to be done that could never be done before. This claim has two possible variations. The literal variation boils down to “there are certain things that simply cannot happen without a computer.” This is pretty hard to prove. Even the examples given by gbentley are all things that someone with amazing patience and/or dexterity, or a huge team of people, could do with nothing more than paper and pencil. From there, we have the pragmatic variation: “there are certain tasks that, while able to be performed without a computer, are unrealistic to expect to be done without one.” gbentley’s examples, I believe, fall firmly within this argument, and quite nicely at that.

      It is actually impossible to evaluate either of these claims without an understanding of what a computer is.

      For example, I could state, “a ledger book is a form of computer, albeit a mechanical one; it is a mechanical database where all C/R/U/D (Create/Read/Update/Delete) is performed by the user with a pencil, and all calculations must be performed by the user, but it is a database all the same.” With that statement, we have essentially said that even a checkbook is a computer, and more or less, even the most basic accounting is impossible without either a computer or superhuman memory and math skills.
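      To make the ledger-as-database claim concrete, here is a minimal sketch in Python (purely illustrative; the class and method names are mine, not from any real product) of a checkbook register as a C/R/U/D store whose only “processor” is the person holding the pencil:

```python
# A checkbook register modeled as a minimal C/R/U/D "database".
# Everything the ledger book leaves to the user -- writing entries,
# crossing them out, summing the column -- becomes a method here.

class Ledger:
    def __init__(self):
        self.entries = []  # each entry: {"payee": str, "amount": float}

    def create(self, payee, amount):
        self.entries.append({"payee": payee, "amount": amount})

    def read(self, payee):
        return [e for e in self.entries if e["payee"] == payee]

    def update(self, payee, amount):
        for e in self.read(payee):
            e["amount"] = amount

    def delete(self, payee):
        self.entries = [e for e in self.entries if e["payee"] != payee]

    def balance(self):
        # The calculation the ledger's "user" performs with a pencil.
        return sum(e["amount"] for e in self.entries)

book = Ledger()
book.create("Deposit", 500.00)
book.create("Rent", -350.00)
print(book.balance())  # 150.0
```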

      Or one could interpret “computer” to mean any electronic device using a binary storage and logic system to represent arbitrary data. This would be more appropriate; anything that we accept as a “computer” is a binary device. Or is it? Scientists are working very hard to develop computers that make use of quantum states, where the answer set is expanded from “on” and “off” to include “maybe”. Other scientists are working on computers incorporating organic matter, which requires an analog interface. Even a hard drive is hardly a “binary” device. It records magnetic states. Magnetism isn’t an “on” or “off” property, it is a relative property. A hard drive is really reading the relative strength of tiny magnetic fields. If a field has more than a certain amount of strength, it is considered “on”. So it is hard to say that the use of binary storage or processing is what qualifies a device as a “computer”. We could also try to make a derivative argument, which is that a computer needs to have transistors, but again, there are established computing devices without transistors, with more on the way.

      Luckily for us, the enabler argument makes it clear what kind of “computer” it means: “fast and cheap”. That rules out anything older than, say, a four-year-old PC. A checkbook, abacus, etc. should be unable to perform any task faster than a piece of modern computer hardware, given the right software. And therein lies a major problem. The software.

      This takes us back to where the whole argument began in the first place: the relationship between software users and software developers. We can safely ignore the folks who think that the two groups are on the verge of suddenly overlapping. I’ve been told countless times that “Application XYZ is going to let users create their own code without a developer needed, with a simple drag ’n drop interface (or ‘plain English queries’ or ‘natural language queries’ or ‘GUI interface’ or ‘WYSIWYG interface’, or whatever)!” Luckily for me and my career, this magic application has never been written, nor do I see it on the near horizon.

      For one thing, it would require an entirely new way of writing and maintaining computer code. I have been thinking about this quite a bit lately, but I am unable to share my thoughts at the moment for a variety of reasons. Hopefully I will be able to share them in the near future.

      Another problem is the way developers go about thinking about how users perform their tasks. This also relates to the problem of user interfaces. Developers ask users certain key questions when writing code to help them translate the users’ business needs into Boolean logic and procedural code (call it what you will, but even event-driven and OOP code is actually procedural code, once the event is fired or method executed). Developers ask questions like:

      If this value is higher than the threshold you have laid out, should I reduce it to its maximum allowed value, or produce an error message?

      Is this data going to be in a list or a spreadsheet format (when a developer says “spreadsheet” to a user, they really mean database, but users understand the idea of spreadsheets better than databases)?

      Under what circumstances do I need to highlight this data in the output stream?

      These are questions that need to be asked to code business logic, and there is nothing inherently wrong with them. But where this model of specification creation fails is in not understanding what the users’ actual needs are. We are finding out how they do their job, but not why. A great example of this is the Microsoft Office Macro Recorder. Sure, the end result may be a bunch of VBA statements, so you could claim that the MOMR turns ordinary users into developers. But look at the code it produces; it creates code that replicates every stray mouse click, key pressed, etc. It replicates how the user does their job. This is why the MOMR code is so useless for all but the most trivial tasks. It does not have any understanding of the user’s intentions. If the user selected a particular column and then made it bold, the macro simply says “select column C and then make it bold”. It isn’t even clever enough to reduce that to “make column C bold” and eliminate the selection aspect of it! Running that macro while the user has another column selected loses the user’s prior selection. A developer will re-write the code appropriately. But even then, the software has no idea why the user made column C bold. This code will always make column C bold every time it is run, even if column C was only the correct column to make bold at the time the macro was recorded. If that decision was based upon the contents of the column, or maybe the nearby columns, the macro does not accomplish the user’s goals in the least.
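      The gap between the recorded “how” and the intended “why” can be sketched in a few lines of Python (a toy model of a worksheet, not actual Macro Recorder output; all names here are hypothetical):

```python
# A worksheet modeled as column letter -> list of sales figures;
# bold columns are tracked in a set.

sheet = {"B": [90, 80, 85], "C": [40, 30, 35], "D": [70, 75, 72]}
bold = set()

def recorded_macro():
    # What a macro recorder captures: the literal "how".
    # "Select column C and make it bold" -- always column C, forever.
    bold.add("C")

def intent_driven(threshold=50):
    # What the developer writes after asking "why": bold any column
    # whose average falls below the threshold, whichever column that is.
    for col, values in sheet.items():
        if sum(values) / len(values) < threshold:
            bold.add(col)

intent_driven()
print(sorted(bold))  # ['C'] today -- but it would adapt if the data changed
```

Both functions produce the same result on today’s data; only the second one still does the right thing when a different column starts underperforming.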

      This is where the developer fits in. The developer sits down with the user and asks questions like “under what circumstances do we make column C bold?” Even better, the developer will ask “under what circumstances do we make any particular column bold?” The best question to ask, though, is “why are we making columns bold?” That is when we transition from programming to meet “how” and start programming to meet “why”. There is an amazing thing that happens at this point in the conversation: the developer may learn that the user’s “why” is not best served by the “how” that was requested in the project specifications. The user may say, for example, “well, we want the column bold, because any bold column indicates poor sales numbers.” The developer could turn around and say, “it sounds like what you need out of this is a method to identify underperforming accounts; would it be better if we created a second tab on the spreadsheet that contained only the underperforming accounts, along with the relevant information to help you see why they are not doing well?” All of a sudden, our job just became a lot more difficult, but at the end of the day, we have a much happier customer.

      It is at the point when software ceases to be written to merely replicate the how and instead fulfills the why that we can actually answer the questions that started this article.

      Up until relatively recently, software creation was merely a how-oriented task. You could throw as many new languages and pieces of hardware as you wanted at the situation, but that is where we were. Only recently has software started becoming a why-oriented task. When software is only written to meet the “how”, we end up with software that only makes existing tasks easier. When we write software to accomplish the “why”, we are writing software that allows things to be done that could not have been done without it (meeting either the literal or the pragmatic variation). The user’s why is not to slightly alter the color code of some pixels. That’s the how. The user’s why is to remove the red eye from a digital photograph.

      When we begin to address the why, we write great software. Bad red eye removal software will turn any small red dot in an image into a black dot, after the picture has been taken. Good red eye removal software will hunt only for red dots in areas that are potentially eyes and turn them into black dots. Great software will first find a face-like shape, then search for eye-like shapes, and then turn the red dots into black dots. The best red eye removal software doesn’t even need to exist on a computer; it would be in the camera! The camera would alter the flash (length, brightness, maybe have a few different flashes that use different spectrums) to use as little flash as possible, and maybe even do something to prevent the red eye from ever making it into the image file, or even reaching the lens.
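      The difference between the bad and good tiers described above can be shown with a toy sketch in Python (a made-up grid “image” where “R” marks a red pixel; real red eye removal obviously works on pixel color data, and the eye detector itself is out of scope here):

```python
# Toy "image": a 2D grid; "R" is a red pixel, "." is anything else.
# eye_regions is what a hypothetical face/eye detector flagged.

image = [
    list("..R.."),
    list(".R.R."),
    list("....."),
]
eye_regions = {(1, 1), (1, 3)}

def naive_removal(img):
    # Bad software: blacken ("K") every red dot, eyes or not.
    return [["K" if px == "R" else px for px in row] for row in img]

def targeted_removal(img, eyes):
    # Better software: blacken red dots only inside detected eye regions,
    # leaving stray red pixels (a red shirt, a tail light) untouched.
    return [
        ["K" if (px == "R" and (r, c) in eyes) else px
         for c, px in enumerate(row)]
        for r, row in enumerate(img)
    ]

fixed = targeted_removal(image, eye_regions)
# The stray red pixel in the top row survives; the two eye pixels are fixed.
```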

      That’s just one (probably not great) example of what I mean. And this is where we are seeing the user/developer gap. The developers are simply too far away from the users, in so many aspects, that they have a very hard time understanding the why of the software. I am certainly not claiming that the developer needs to know how to do the user’s job (it certainly helps though!) or that we find a way to make tools to allow users to generate software (impossible to do well at this point). I am saying that developers need to work to determine the user’s why as well as the how, and figure out how to code to the why as opposed to the how. Some developers (including myself) are lucky enough to be writing software that will only have a handful of users. We are able to have these conversations with our users. Imagine trying to write a word processor that meets the potential why of every potential user! Some users use word processors as ad hoc databases, like grocery lists. Others use them to replicate typewriters. Others don’t even create content; they use a word processor to review and edit content. And so on and so on. From what I have read about Office 12, I think Microsoft has the right basic idea: make the interface present only the options that are contextually relevant to what the user is trying to accomplish. But it still isn’t anywhere near where it needs to be. It is still working with the how.

      File systems are another great example of a system that meets how but not why. When computers were fairly new, the directory tree made sense. The information contained within the directory names and file names themselves was a form of metadata. “Documents/Timecards/July 2005.xls” is obviously a spreadsheet from July 2005 that is a timecard document. This met the how, which was the filing cabinet metaphor; that was how these things were done for 100 or so years prior to the computer. The why was “identifying and finding my data”, which is what the directory tree structure does not accomplish very well. In reality, the user would be much better served by a system that would assign certain properties such as “Timecard”, “Document”, “Spreadsheet”, “pertinent to July 2005”, etc. to a file object. All of a sudden, we can do away with the “Timecard” directory, and have a “Timecard” group that contains all documents marked as timecards! That is a very powerful change indeed.
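      The property-based alternative described above can be sketched in a few lines of Python (file names and tags are invented for illustration; a real system would store the properties as file metadata):

```python
# Files carry sets of properties instead of living in one directory.
files = {
    "July 2005.xls": {"Timecard", "Document", "Spreadsheet", "July 2005"},
    "Aug 2005.xls": {"Timecard", "Document", "Spreadsheet", "Aug 2005"},
    "Budget.xls": {"Finance", "Document", "Spreadsheet"},
}

def group(tag):
    # The "Timecard group": every file carrying the property, regardless
    # of where a directory tree would have filed it.
    return sorted(name for name, tags in files.items() if tag in tags)

print(group("Timecard"))     # ['Aug 2005.xls', 'July 2005.xls']
print(group("Spreadsheet"))  # all three documents
```

The same file can belong to the “Timecard” group, the “Spreadsheet” group and the “July 2005” group at once, which a single directory path cannot express.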

      Again, all too often our design and development process gets sidetracked by how and not why. How many times have we as users said something like “I think a drop down box for that information would be great”? We think a drop down box is best because that’s how we saw it done somewhere else for something similar, not knowing that there may be a much better interface widget that we simply were not aware of. As developers, how many times have you heard a customer dictate architecture details to you to the point where you wonder if it would be easier to just teach them a programming language and let them do it themselves? This is because the user/developer conversation is focused on how. As a developer, in the most metaphoric sense possible, it is my job to translate your why into a software how. That is it: writing software that accomplishes what you need to do better than how you are doing it now. And that is the crux of the problem. If the user’s current methods are so great, why are they asking me to write software for them? When we begin to address the why instead of the how, we are no longer replicating what may be an inefficient process (or taking what is a great process when performed manually, but inefficient as software), but helping to create a whole new process appropriate for the medium.

      A large part of the problem, from my viewpoint, is that programming languages are still how-oriented. There is no why in writing software. Some languages are a bit better than others. Perl, for example, with its “Do What You Mean, Not What You Wrote” attitude, is a better one. C is a very bad one. C will be more than happy for you to alter a pointer itself instead of the bits it points to, when anyone reading the code would understand that you wanted to alter the data itself. This is at the heart of my complaint with AJAX. People are using AJAX to replicate rich client functionality with tools that were simply never designed to do that. Of course it will be sloppy. I won’t go into the technical details at this time; I have been over them a number of times already. But AJAX and many other Web applications are replicating what is frequently poor rich client how to begin with, and compounding it by inventing a why that does not exist. The user does not use a word processor because they need to be able to type from any computer with an Internet connection. Thus, a Web-based word processor does not address any actual why, but creates a bad one, then replicates a horrendous how in the process, using bad tools to boot.

      I have a lot more thoughts on this topic, and what changes need to occur for developers to become empowered to address the why while not sweating the details of how. Stay tuned.

      • #3265630

        Programming

        by fred mars ·

        In reply to Programming “How” versus programming “Why”

        Wonderful thoughts!  Great article!

        The main problem I see with developers moving from “how” to “why” is that many people are not willing to invest the time, patience, and effort to ask “why,” even if it will give them increased benefit in the end.  It’s a similar issue with the idea of documenting everything as you go along in the project, rather than rushing to put out incomplete and incorrect documents at the end.  This is a fundamental change in the mindset of the people who are involved, and many people are unwilling to allow such change in themselves.  This is not only a “developer” problem, but potentially impacts almost all facets of our lives and relationships.  Dealing with the “why” before getting into the “how” can provide benefits across the board, but is a tough sell to most people.

      • #3265559

        Programming

        by dogknees ·

        In reply to Programming “How” versus programming “Why”

        Good article.

        Pretty much agree with what you’ve said. It is an enormous battle to get people to think outside their immediate experience. But I agree we need to do so to go forward. This has to come from both sides; it isn’t possible for it to all be done on the developer’s side without any investment (of time and intellectual effort) from the client. I battle this every day. I’m in an environment where the line “You know what we need” is often all the spec I get for a new development.

        This is very much what I’ve been railing against in my recent posts. There seems to be no acknowledgement from business that this is a shared problem. It’s not something the IT industry can fix without that input and support from our clients.

        I’m a bit wary of the idea that some new language is going to help significantly in this. Remember, all programming languages are capable of exactly the same results; Turing proved this quite a while back. All different languages do is provide different models of the same thing. Some models may be easier to understand for some people, but that doesn’t change the intrinsic limits of what can or can’t be done in that language.

        I agree also with your comments on web-based apps. They should be limited to those situations where you NEED access from any web-enabled device. I also think the whole structure of HTML/… needs to be revised to enable full-function interfaces to be created as easily as they are in a standard development scenario. We’ve spent a lot of time learning the tricks of the Windows interface; let’s leverage it, not throw it out. I like draggable/sizeable objects that respond instantly. That is the base standard for any interface for me. By all means go to a web interface, as long as it is at least as fast and responsive.

        Regards

      • #3265517

        Programming

        by staticonthewire ·

        In reply to Programming “How” versus programming “Why”

        The perfect client workstation consists of a system unit and a keyboard with JUST ONE KEY.

        All the user has to do is tap that one key all day – the software is SO SOPHISTICATED that it knows what the user means at ALL TIMES…

      • #3265397

        Programming

        by apotheon ·

        In reply to Programming “How” versus programming “Why”

        I’ve linked to this weblog entry from off-site, in a weblog entry titled users as programmers. If you read it, I’m sure you’ll quickly grasp why I linked it, but in short, I was inspired in part by this entry to write about programmers who develop applications for their own use, as opposed to those who do so only to satisfy project specs handed down from on high.

        In any case, I thought you might want to be made aware of the referral.

      • #3265690

        Programming

        by rayjeff ·

        In reply to Programming “How” versus programming “Why”

        This is a very interesting article. I guess I have to say that I’ve been lucky to have started my developing career by doing more of the “why”, or understanding more of the “why” along with the “how”. The first development project I worked on was a great experience because the users of the application I designed are computer-literate only as far as being able to turn a computer on and maybe emailing. But using computer applications… NO. So, given the task of designing a custom application for that kind of environment, having the users along the way, every step of the way, helped a lot.

        There were times when I would walk back and forth from office to office to make sure I understood what the requirements were in order for me to translate them into the application.

        After reading the article, I have a question. Should a programmer/developer try to anticipate as much of a user’s needs as they can? A systems analyst I worked with once told me that, and I tried to rationalize it. Maybe I haven’t been a developer long enough to see why that statement has merit.

      • #3106069

        Programming

        by tillworks ·

        In reply to Programming “How” versus programming “Why”

        Couldn’t it be argued that rational vs. agile development represent approaches to programming how vs. why, respectively? If so, it seems that agile development is the best approach to producing better software. The problem I’ve found with implementing agile development is that it’s difficult to give a client a firm quote for a project because it’s a figure-it-out-as-you-go-along approach. So we revert to the rational approach by creating a detailed, comprehensive spec before any development begins. That spec is inevitably focused on the “how”. And I think the software delivered frequently does not offer the end-user productivity gains it could have.

    • #3262907

      Grateful that I don’t “mashup”

      by justin james ·

      In reply to Critical Thinking

      So (no surprise here), Google is ramping up to start monetizing their map system.

      There are a lot of companies and people pushing this “Web 2.0” idea. Never mind the fact that none of them seem to be able to define it (is it AJAX? is it cross-site use of APIs? is it simply a deep version of hot linking? does it involve lots of words that sound vaguely like English? does it involve another stock bubble?). Phil Wainewright over at ZDNet goes even further; he talks about “Web 3.0” (when companies learn to monetize the undefined “Web 2.0”). If he’s right, then Google is leading the charge to Web 3.0.

      People like Dave Berlind push the idea that the Internet rocks, because anyone can start a new business using a “mashup”. Yes, it is true that you can put together a very interesting website, maybe even a complete application as a “mashup”. You might even be able to monetize it. But to think that you are going to build a system for free out of other people’s data is insane. Imagine if each time you booted up a RedHat Linux system, your computer took up a bunch of RedHat’s bandwidth. Eventually, they are going to need you to pay for that bandwidth, whether it be by direct payment, or through advertising dollars.

      This is the problem with “mashups”. People are using a free service to try to generate revenue. How long do you think that service will be provided to you and your customers at no cost and with no ads? And when the ads come, do you want the ads that seem to appear on your site to be under the editorial control of a different company? Let’s say that you are running a religious website, and you have a map showing the location for a sermon about “Removing Lust from your Heart”. Google graciously puts the locations of the nearest adult novelty shops within a 50 mile radius of your church on the map. Just what your visitors wanted, I’m sure.

      Call me silly, call me crazy, but I would laugh if I were a venture capitalist or investment banker and heard a pitch that relied upon a third party keeping their service free and ad-free. I would say, “look pal, you’re being set up, the moment a critical mass of companies rely upon that third party, the third party will tell you ‘give us money or get stuck with ads’, you’re doing business with unknown costs, and you think I’m going to give you money?” And anyone who starts a business involving actual money on a “mashup” is a fool, or naive, or worse. Thanks, but I’d rather pay up front, and know what my costs are for the data, and be able to use it my own way.

      Don’t get me wrong, “mashups” can be neat, they can be cool, but I would never in a million years consider using one as a business or as a tool for my business. Personal site? Sure. Non commercial/non profit site? Sure. Business? No way.

      J.Ja

      • #3165843

        Grateful that I don’t

        by rayarosh ·

        In reply to Grateful that I don’t “mashup”

        You’re right: if my business model was based entirely on using Google’s free Maps API, that would be foolish, but you need to ask yourself what the alternative is.  If you have a localized mapping system, OK, maybe you could buy the maps and the satellite images and host them yourself at an extremely high bandwidth cost, not counting the high cost typically associated with any sort of quality, up-to-date digital map.

        Now let’s say you want some sort of nationwide mapping system; the server space, cost of maps, and bandwidth that would be required would be unbelievable, all for a yet-undeveloped application that may not even gain traction with consumers.

        What Google Maps mashups allow you to do is beta test your app with real maps on real servers and see what happens. You can see if people like your mashup that shows bars within 1 mile of hospitals, or houses outside police coverage areas.

        If it works then you can build your own infrastructure, or since Google really is just in it for the money, work out some kind of licensing or revenue sharing idea.


    • #3087404

      Web-based apps: WHY?

      by justin james ·

      In reply to Critical Thinking

      Lately, the idea of “How” software works as opposed to “Why” users use software has been on my mind (http://techrepublic.com.com/5254-6257-0.html?forumID=99&threadID=184332&messageID=1983228&id=2926438). Anyone who thinks that Web-based apps should take over the world is being naive at best. Web-based apps virtually never address the “why” that users use computers. How is having an application Web-based going to increase the user’s satisfaction or their productivity in the slightest? I can think of dozens of reasons why Web-based apps are more frustrating, less easy to use, less functional, and less productive than a desktop app. I cannot think of any pain points that a Web-based application relieves for 99% of the applications out there.

      Why would I even want to bother with a central, Web-based repository for my documents like this? The free 64 MB USB thumb drive that one of my customers gave me stores enough for 99.9% of users to put every document they will ever need on, and nearly every computer on the planet can open and edit an Office document, whether it be Microsoft Office, Open Office, etc.

      In addition, what third party vendor do I trust with a sensitive business document, to the point where I’m going to permanently leave my data on their servers? Umm… none. If I have a VPN on my network at work (and nearly every corporate network does now), the problem is solved right there. At the worst, I can call someone and have them email me the document, or I can VPN into my network and open a Remote Desktop session to my desktop PC.

      I have never heard a user state their needs as “I need to do XYZ through a web browser.” I have heard “I need to be able to do XYZ from any computer I might find myself sitting at”, but if I wrote a well written application that did not need to use an installer, and is small enough to fit on a USB thumb drive, I have accomplished that exact same goal without having to even involve a network! [Addendum 4/3/2006: sMoRTy71 points out (correctly) that USB drives aren’t perfect solutions. This is an example of ways to not have to write a full Web-based app; another alternative would be to offer this same software (no installer needed binary) as a download. The point is, there is zero reason to write a Web-based version of an existing desktop application.]

      These vendors of Web-based applications are not listening to their users, or even trying to put themselves in the shoes of their users. They are listening to the self-congratulatory babblings of venture capitalists, Wall Street investment bankers, techno-nerds for whom technology exists for the sake of technology, and so on. Technology is a lever, nothing more, nothing less. If that leverage is not being applied to solve the problems the user has, it is a waste.

      Every obstacle that you put in the user’s path hurts their productivity. Making an application depend not just upon a local application, but upon a constant Internet connection, the uptime of a third party’s webserver, the performance of said server, and the end user having a PC that meets the requirements (including web browser configuration, a HUGE “if”) just throws more obstacles in the user’s path while giving them a second-rate application that will ALWAYS be slower than native code looking at data on the LAN. This is “progress”?

      This is the Object Oriented Programming Model gone to a hideous extreme. Let’s just abstract everything, stick half the parts somewhere else, stop caring what is actually happening on the backend, and call it a day. I categorically reject this mode of thought. It does not help the user one bit. It just creates a nightmare for me (the developer), the user, and the people trying to keep the network locked down. Everyone a Web-based app touches is made unhappy.

      Think “mashups” are so hot? The real question is “if someone made this data store available to me on my local network, would my users be best served by having a ‘mashup’, or something running locally?” The answer is almost always “a desktop application built to order”. Software development isn’t about “cool”. It isn’t about “nifty”. It isn’t about what language you are using, or whether or not you are coding to spec, or whether or not you followed “eXtreme Programming” or “Agile Programming” or “Ten Thousand Monkeys with Ten Thousand Compilers Programming”; it’s about meeting the needs of the users. To think anything else is a bad case of “developer hubris”, and even worse, it is a waste of your employer’s money and your users’ time.

      Web-based applications, except on rare occasions, do not address the user’s needs; they stroke the egos of those involved with the development process. Google’s ego is swollen to the size of a small country. They think that because their fanboy base is so big, and because so many tech writers adore them (c’mon ZDNet, how many “Google Watch” type blogs do you need? Where is the “MySQL Watch” or “Oracle Watch” or “IBM Watch”?), they are actually any good. The truth is, their software is in perpetual beta and exists merely to sucker people in to develop a critical mass of users, so Google can then plaster ads all over it.

      Do you really want Google’s servers indexing your critical business documents and injecting their ads throughout them? When you give a map to your church’s special sermon about “Removing Lust from your Life”, do you want ads for every adult novelty shop appearing on that map (http://techrepublic.com.com/5254-6257-0.html?forumID=99&threadID=184332&messageID=1987268&id=2926438)? When you send an email to your therapist about tomorrow’s appointment, do you want your browser showing giant ads to help you find a website for treating whatever it is that you’re in therapy for?

      I have yet to see functionality in a Web-based application that could not (and usually does) exist in a desktop application. Everyone is going ga-ga over “mashups” involving Google Maps. I hate to share a little secret, but Microsoft has offered this functionality through MapPoint for years to desktop users as a simple-to-program ActiveX/COM object. The only thing that using Google Maps over MapPoint gets me is that it is free. Yes, that may be a big deal for many, many people, but Google does not give anything away for “free”. What they mean by “free” is “you pay us with your users’ attention and gestures”. Do you want your users leaving your application or being distracted by Google’s ads? Neither do I. I am honestly surprised that Steve Gillmor (who understands “attention” and “gestures” better than anyone else) would support a mechanism where the attention and gestures go to a third party in a process that actually subtracts value from your application.

      Most users are fed up and frustrated with their existing desktop applications, which is why people are looking for a solution. The solution isn’t to start moving everything to the Web by simply replicating the existing, crummy apps with AJAX, but for programmers to change their mindset when writing code. Web-based, desktop, thin client: it’s all the same lousy software if the developers aren’t addressing their users’ needs.

      J.Ja


      • #3087304

        Web-based apps: WHY?

        by smorty71 ·

        In reply to Web-based apps: WHY?

        “I have never heard a user state their needs as “I
        need to do XYZ through a web browser.” I have heard “I need to be able
        to do XYZ from any computer I might find myself sitting at”, but if I
        wrote a well written application that did not need to use an installer,
        and is small enough to fit on a USB thumb drive, I have accomplished
        that exact same goal without having to even involve a network!”

        I don’t think you’ll ever hear a user state their needs as “I need to do XYZ through a USB thumb drive” either. Come on, are you really suggesting that writing install-free apps to fit on a thumb drive is a better alternative to having a web app (for those users who need to access the app from anywhere)? Where is all of the data for the app going to be? Are you really going to put company data on a thumb drive which could easily be lost?

        You also present all of these scenarios for computers that might not have the specs to run a web-app, but never mention that users *could* find themselves on computers without USB ports.

        If you don’t like web apps and you don’t like Google, just say so (actually, after this post, you really don’t need to say so). But saying that local apps running on USB drives (with company data on them) is a better/safer way to solve user needs is just silly.

      • #3087268

        Web-based apps: WHY?

        by wayne m. ·

        In reply to Web-based apps: WHY?

        Availability and Maintenance

        Although I agree that web-based applications are sorely lacking in a lot of operational areas, client-server and client-only applications are difficult to make available and maintain.  This is the driving force behind web-based applications.

        Applications with client resident software require a software installation.  This limits the availability of the software.  One can either choose to issue every employee a license and CD-ROM of each program that he might ever need to access, distribute updates as they become available, and have each employee install those updates and be responsible for resolving conflicts on his machine; or one can have a web-based application that is compatible with the top one or two browsers and have the application available to any employee at any time.  The CD-ROM version also runs into problems when one is at a client site and cannot load software onto the available machines.

        There is definitely a need to make software widely available without license concerns, a need to minimize incompatibility due to multiple versions, and a need to minimize the dependency on the hardware platforms.  For one large, geographically dispersed customer, software updates for one custom application were limited to one or two a year because of the expense.  One update cycle took almost 2 years because an underlying third-party component required a field-wide memory upgrade.

        Contrast that situation with a recent web-based development effort.  We made a development web-site available to our user group to evaluate monthly versions and provide feedback.  We were able to incorporate user recommended changes in a timely manner that would not have been feasible if we had to constantly distribute the software and have the users install it.  As a further example, would you even consider using TechRepublic if you were required to install the software on your client machine?  Would you faithfully install updates?

        There is value to the corporation and to the end user to making software widely available with limited restrictions based on the location and specifications of the machines it will be run on.  Unfortunately, the web browser is insufficient to provide the base capabilities needed to run remote software.  AJAX is merely the latest patch being laid on top of a poor foundation.


      • #3106509

        Web-based apps: WHY?

        by aaron a baker ·

        In reply to Web-based apps: WHY?

        Damn, that was a “good” one; excuse my language.  🙂

        But you’ve hit so many nails on the head I don’t know where to start.

        My biggest beef was the ridiculous amount of space (250 MB) in Hotmail. I wouldn’t trust Microsoft with even the slightest of secrets, and now I’m supposed to use them as my e-mail database? Through their contracts, you know, the ones you are “forced to sign,” they promise to keep your information private, and then without telling anybody they went out and “sold” that very same info, so that today we get tons of junk mail in our Hotmail inboxes and MSN Messenger. You can’t go to Hotmail and download and save any of your mail on your own computer; it must be saved at Microsoft. I don’t like to be bullied, hence my Outlook Express.

        Oh, they have “filters”. BIG DEAL! They should never have sold the information in the first place. They abrogated their contract with us. This is only one sample of what using the net as a database can end up looking like.

        As for the info thief/hog, “GOOGLE”, well, I won’t go there except to say that I am absolutely amazed at how many people were taken in by this scavenger. It amazes me that people don’t see them for what they really are and are too busy jumping on the eye candy to even look up. Someday the piper will come, and then these people will know.

        Yahoo is the same thing; as a matter of fact, they don’t even let you use your own browser.

        So you see, I couldn’t agree with you more. There is no way on this planet that Web-based apps are anywhere near the quality, security, and dependability of those that we have on our own systems.

        I am a private man, and so I’ll just continue to avoid the aforementioned like the scurrilous plague that they are, and I shall laugh when the complaints about Google and Yahoo start pouring in.

        The only reason this hasn’t started so far is that people don’t know………Yet !! 

        I close in saying thank you

        For writing an excellent article and pointing out what people, especially we in the know, “should have known”.

        Web Based Apps are not good at all and I for one will not ever use them.

        Warmest Regards 

        Aaron   😉

      • #3106438

        Web-based apps: WHY?

        by wayne m. ·

        In reply to Web-based apps: WHY?

        Do Not Confuse Web-Based Applications with Application Service Provider Model

        Although the vast majority of Application Service Providers (ASPs) use Web-Based Applications, the majority of Web-Based applications are not provided by ASPs.  Most Web-Based applications are used and maintained by the same corporate entity.


      • #3106406

        Web-based apps: WHY?

        by mark miller ·

        In reply to Web-based apps: WHY?

        I agree up to a point, from the user’s perspective. Web-based apps. are not as responsive as thick client apps. I’ve used OE and a web mail client, and I can manage my e-mail so much faster with OE (or any thick mail client for that matter). It’s my understanding that the impetus for web apps. was that administering thick client apps. was getting too expensive. So it wasn’t necessarily an end-user decision, but an IT management decision.

        From my perspective the main advantage web apps. have is that IT managers can control who has access to their application. It’s more difficult to do this with a thick client, since an end user literally has a copy of it. A web app. just essentially provides access to a service. If an employee leaves the company, it’s a simple matter to deny access to that person. With a thick client, it’s not so easy. Even though it’s illegal, I’ve heard stories about employees taking thick client apps. with them when they leave and then using them for a competitor or to start their own businesses. In a thick client project I’ve worked on, they implemented a time clock system, such that the app. gets a key from a server, which allows access to the thick client for a limited time. Once it expires, the app. no longer operates, until the key is renewed. This isn’t the best solution, but better than no limit on application use. With a web application this would be a simple matter. Just take the user off the central database of valid users, and the user is denied access then and there. I and the company I work for have discussed converting their app. to the web for this reason, but they said they need employees to be able to run the app. without an internet connection. So there you have it. A client/server app. that required the user to sign in every time they used it would work just as well, IMO.
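        The time-clock scheme described above can be sketched in a few lines. This is a hypothetical illustration only; the names (issue_key, key_is_valid) and the one-week lifetime are invented for the example, not taken from any real product:

```python
import time

# Sketch of a time-limited access key: the client fetches a key with an
# expiry timestamp from the server, and the app refuses to run once the
# key lapses until it is renewed.

KEY_LIFETIME = 7 * 24 * 3600  # one week, an assumed renewal interval

def issue_key(now=None):
    """Server side: hand out a key that expires KEY_LIFETIME from now."""
    now = time.time() if now is None else now
    return {"issued": now, "expires": now + KEY_LIFETIME}

def key_is_valid(key, now=None):
    """Client side: allow the app to run only while the key is unexpired."""
    now = time.time() if now is None else now
    return now < key["expires"]

key = issue_key(now=0)
assert key_is_valid(key, now=KEY_LIFETIME - 1)      # still inside the window
assert not key_is_valid(key, now=KEY_LIFETIME + 1)  # expired, renewal needed
```

        The web version of this, as noted, is even simpler: drop the user from the central database and access ends immediately.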

        There are technologies available now from both Sun (JRE, Swing) and Microsoft (.Net Windows Forms) that make it possible to deploy thick client apps. with the advantages of web deployment. Personally I hope this catches on. As an application developer I don’t particularly like web apps. They have their good points, but overall I think they’re more of a headache, more expensive to develop. I think web apps. do well when an application is data-driven, such that controls on the screen can be added dynamically, depending on the situation. Web technology makes it relatively easy to do this, since formatting is done like a word processor would do it, and you don’t have to worry about everything being aligned since the HTML renderer does a lot of that work. That’s the only good point that I see with it from a developer’s perspective. Other than that, state management is a pain.

        Thick client apps. have the advantage that so long as the user is running the app., the app. is continuously running. You can save values to a variable and be done with it, and be assured that you can access those values later, anytime you want. With the web, you either have to save these values to session state, to a database table, or serialize it between page refreshes, and then get them back out again the next time the app. does a postback. It’s better than it was 6 or 7 years ago, but it’s still clunky. The other thing that I never get over is the fact that unless you’re using a technology like Flash or AJAX, every time the user posts back a page, the ENTIRE page is being refreshed! Even if the user has just entered text into 3 textboxes and then hit “submit”, the entire page of content goes back to the server, and then back to the browser. How inefficient! Thick client apps. never need to do this. They can just send/receive the necessary data.

        Just my 2 cents.

      • #3106384

        Web-based apps: WHY?

        by dave lathrop ·

        In reply to Web-based apps: WHY?

        I agree that not all applications should be web applications. They have their pros (easy deployment, data kept on the server, etc.), but also a lot of cons.

        However, I find the greatest problem is people who don’t design good applications! Regardless of whether this is a web app, desktop (VB, C#, C++, Java, COBOL, or whatever), client/server or scripted in a host (MS Access or Excel). Too many times, I see CRUD screens (excuse me “forms”, they just look like 3270 screens) for each table. This rarely matches the user’s task, so they are left to figure out what sequence of screens and CRUD operations will achieve their current goal. A well designed and constructed application with any implementation technology that supports the users’ work and usage patterns will always beat a poorly designed and/or constructed application that makes their life more difficult.

      • #3106232

        Web-based apps: WHY?

        by kenfong ·

        In reply to Web-based apps: WHY?

        The web is not perfect, but it does solve issues like deployment, upgrades, access-anywhere, cross-platform support, etc. Companies can subscribe to only what they need, and they don’t need to keep a team of people just to maintain a piece of proprietary software. Why spend money on big machines, storage, backup, and people when the core of your business is selling bananas? Leave those things to people who are good at them, and let them bear the cost of an n-tier infrastructure and server clusters. Let users call them at 4 AM, instead of asking you where the USB port is.

        Thick client software does a better job in certain areas such as scientific projects, graphics work, software development, etc. It’d be cool, though, if some of these tasks could be done over the web, so I wouldn’t have to worry that applying service pack #58 will screw up my video editing software, or be forced to install dx9.1b and dnet2.0 because a little backup tool requires them.

        The web sucks in areas like responsiveness and interface, but thin-client computing has become the de facto direction.

      • #3106187

        Web-based apps: WHY?

        by driverjoe ·

        In reply to Web-based apps: WHY?

        I agree that we don’t need or even want web apps, but it’s not about what is wanted or needed! It’s about subscriptions! Why have you pay once for a program when they can have everybody pay monthly for something nobody needs? Now that’s innovative.

    • #3106745

      Micro distributions are NOT micro kernels

      by justin james ·

      In reply to Critical Thinking

      All too often, I read a tech writer who seems to think that a Linux (or some other OS) that takes up a small amount of disk space is a “micro kernel.” This is patently absurd. These are micro distributions. Most Linux and BSD based operating systems already have an extremely small kernel, small enough to fit on a floppy disk along with enough configuration files and utilities to get a system up and running and perform a restore.

      If you compare these “micro kernel” OSes like Damn Small Linux and PuppyLinux to a full blown Linux distribution like RedHat or SuSE, what you will see is that they really are no different. In fact, they are usually just stripped down versions of those “fat” OSes! All that they have done is remove every piece of code that the distribution makers view as “unnecessary”.

      Indeed, you should be able to take the rc.d and other configuration files from one of these “micro distributions” and put them onto RHEL or SuSE or Mandrake or whatever, and have that OS run in an identical memory and CPU footprint as the “micro distribution.”

      At the end of the day, a “fat” OS is only “fat” by virtue of how much disk space it consumes. Most of these “fat” OSes are simply loaded down with fifteen different open source versions of the same type of application. This is where we see RHEL or SuSE having more lines of code than Windows. Windows ships with one web browser, one media player, one basic text editor, etc., whereas a *Nix will ship with five different applications for each task and let you choose which one you prefer.

      I have tried working with these “micro distributions” and, to be honest, it is a pretty miserable experience. I have tried Damn Small Linux as well as PuppyLinux. It was extremely frustrating not to be able to install any software that was not available as a binary package through the system’s installer, simply because there was no compiler available. *Nix without a compiler is like a car without a steering wheel. PuppyLinux did have a Perl package, but it was missing most (if not all) of the standard Perl libraries, rendering it useless to most programs that need them (and without a make command and/or CPAN, it is very difficult to add new packages). PuppyLinux did not have cron (the standard *Nix task scheduler), so it was difficult to see it being used as the basis for any serious server application.

      Anyone who thinks that these “micro distributions” are going to make a huge difference has not actually tried doing things with them that are trivial tasks on a “fat” *Nix. As stated before, the only difference is how much disk space they take up. If an extra gigabyte of disk space is going to make that much of a difference to you, you are in trouble anyway. And if you think that these “micro distributions” are going to help you reduce system resource requirements, you are wrong there too. Sure, many of them are compiled with a 586 or even a 486 target in mind, but at the end of the day, all *Nix distributions are pulling their kernel from the same place (within families, of course).

      J.Ja

      • #3286131

        Micro distributions are NOT micro kernels

        by jmgarvin ·

        In reply to Micro distributions are NOT micro kernels

        I have to disagree. These micro distros are great for thin clients or for that older desktop that just doesn’t have the power to push “full” modern distros.

        I’ve actually set up DSL on a thumb drive and had it boot from there… this created a portable thin client that could plug in anywhere. Had I actually thought about it, I could have modified the distro to authenticate me back to the domain so that I could do my admin tasks, or have it authenticate the users onto a different domain so they could do their user tasks.

        These micro distros have their place…You typically don’t want a compiler on a desktop machine, so this makes sense.

    • #3106337

      Improving the code writing process

      by justin james ·

      In reply to Critical Thinking

      Writing code is a goal oriented process. Unfortunately, the tools that developers have do not assist them in attaining their goals. The tools are getting better (as someone who has had to write COBOL in vi, I can attest to that), but they still do not understand just how programmers operate. The development tools themselves are still how oriented, not why oriented. Let us take a look at how the code writing process hinders rather than helps developers.

      The documentation is predicated on the user knowing what they are looking for.

      This is only improving, because the IDEs have glued ToolTips, AutoComplete, etc. into the editors. Coding now is a process of naming your variable, pressing the period, and then scrolling through the list of methods and properties to find what sounds like it does what you want.

      But try starting off from a state in which you do not know what objects you need. In other words, try something you have never done before. You are in deep trouble. Language and API documentation is still dominated by how and not why. It assumes you know what object class (or variable type, or function, or whatever) you need, and then shows you what your options are. This is required information, but it is not very helpful, especially if you are not familiar with that language’s terminology (or that library’s terminology). It is so easy to not find what you are looking for, if the language has standardized .ToString() for everything, but what you are working with has .ToText() instead. More to the point, there needs to be more documentation like the Perl FAQs: goal oriented documentation.

      The Perl FAQs are perfect. There are hundreds of “I am trying to do XYZ” items in there, and code that shows you exactly how to do it. The documentation asks the user, “what is your why and how can I help you accomplish that?” I use the Perl FAQs more than the actual reference most of the time; I already know the language syntax, but there are a lot of whys that I have not tried to do. Indeed, the Perl documentation contains so much usable code in a goal oriented layout that it is possible to write 75% of a program out of them. Just try that with Whatever Language In a Nutshell. I have only seen one programming book laid out in a “let us accomplish something” format as opposed to a “here is how we work with strings, here is how we work with numbers” format.

      The tools are too focused on writing code.

      I know that this is counter-intuitive. IDEs are all about code, right? Well, not really. Writing code is the how. The true why is “creating great software.” Writing code is simply a means to that end. The reality is that too many pieces of software simply stink, not because the internal logic is no good, but because the programmer left things like error handling, input validation, etc. out of sheer laziness or ignorance. An IDE that lets you try an implicit conversion when you have strict mode on in a strongly typed language is doing you no favors, especially if that block of code is somewhere that only gets accessed once in a blue moon. A language or IDE that makes input validation “too much hassle to bother with” is not doing anyone any favors.

      Here is a great example: too many web applications rely upon a combination of JavaScript and the maximum length specification in a form object to do their validation. Unfortunately, not everyone has JavaScript turned on, and many people use some type of auto complete software to fill out a form. And someone can always link to your application backend without replicating your interface. So no matter how much input validation you do on the client side (not saying you should skip it; users typically prefer getting the error before the form is actually submitted to a server), you still need to do it on the backend. Sadly, the concept of tying the input validation logic on the server side to the input validation on the client side is still pretty rare (ASP.Net with its Validator controls is good, but not great). So you end up with code that either is a hassle for the end user (no JavaScript validation) or vulnerable to all sorts of nasty things to occur (no client side validation), or you are forced to write all of your validation code twice, in two different languages.
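      One way out of writing the validation twice is to declare each field’s rules once and drive both sides from the same table: the server enforces the rules, and the client-side hints are generated from them. Here is a minimal, hypothetical sketch in Python; the field names and rule keys are invented for illustration, not any real framework’s API:

```python
# One rule table drives both the rendered form hints and the server check.
RULES = {
    "age": {"type": int, "min": 18, "max": 120, "maxlength": 3},
    "zip": {"type": str, "maxlength": 5},
}

def html_attrs(field):
    """Emit client-side hints (e.g. maxlength) from the same rule table."""
    return 'name="%s" maxlength="%d"' % (field, RULES[field]["maxlength"])

def validate(field, raw):
    """Server-side check driven by the same table; never trust the client.
    Returns the converted value, or None if the input is rejected."""
    rule = RULES[field]
    if len(raw) > rule["maxlength"]:
        return None
    try:
        value = rule["type"](raw)
    except ValueError:
        return None
    if "min" in rule and value < rule["min"]:
        return None
    if "max" in rule and value > rule["max"]:
        return None
    return value

assert validate("age", "42") == 42
assert validate("age", "7") is None    # under the minimum
assert validate("age", "abc") is None  # not an integer
```

      The point is that the client-side and server-side checks can never drift apart, because there is only one definition of what is valid.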

      This is all a byproduct of the sheer amount of effort that is needed to write this code. It is not brain work, it is drudge work. A well written program with a large amount of user interaction but little complex logic behind it, in a language with large libraries, can be 25% input validation. Let’s be real, most applications are of the form “get data from a data source, display it to the user, allow the user some C/R/U/D functionality, and confirm to the user that the procedure was a success or failure.” That is all most programs are. A significant portion of security breaches are caused by failure to validate input. For example, Perl has a known buffer overrun problem with using sprintf. “Everyone knows” that you need to validate user input before passing it to sprintf, to ensure that it will not cause a problem. And either through laziness or ignorance (note how I put “everyone knows” in quotation marks), this does not happen, so you get a web app that can execute arbitrary code. The WME exploit, zlib problems, et al. all boil down to a failure to validate input.
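      The fix is the same in every one of those cases: whitelist what user input may look like before it touches anything dangerous, whether that is sprintf, a shell, or a query. A rough sketch in Python; the pattern and names are illustrative, not prescriptive:

```python
import re

# Whitelist validation at the boundary: reject anything that does not
# match the narrow shape we expect, before it reaches formatting code.
SAFE_NAME = re.compile(r"^[A-Za-z0-9_-]{1,32}$")

def greet(user_supplied_name):
    """Only ever format input that has passed the whitelist."""
    if not SAFE_NAME.match(user_supplied_name):
        raise ValueError("rejected untrusted input")
    return "Hello, %s!" % user_supplied_name

assert greet("alice") == "Hello, alice!"
try:
    greet("%n%n%n")  # format-string garbage never reaches the formatter
except ValueError:
    pass
```

      It is a few lines of drudge work per input, which is exactly why it gets skipped, and exactly why the tools should be doing it for us.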

      Imagine if the IDE (or the language itself), instead of being aimed at providing you with fancy indentation and color coding and whatnot, actually did this on its own? Perl does this to an extent with variable tainting; it will not let you pass a variable that came from the user to certain functions until you touch it first with other functions. More languages need a mechanism like this. But it is not enough. The idea that user input is always clean needs to be erased from the language and the tools, and replaced with a system that encourages good coding practice through compiler warnings, or even better, by handling it for you. Imagine if your language saw you taking the contents of a text input and converting it to an integer, and had the good sense to automatically check it at the moment of input to ensure that it would convert cleanly? That would be a lot better than the status quo: trying the conversion, catching an exception, and throwing an error back. This lets the programmer focus on the why, in this case, getting numeric input from the user.
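      Until a language does this automatically, the next best thing is a small helper at the boundary that converts and validates in one step, so the rest of the program only ever sees a real integer. A hypothetical sketch in Python (the helper name is invented for the example):

```python
# Convert-and-check at the moment of input: the try/except lives in one
# place, and callers deal only with a clean integer or an explicit default.
def read_int(raw, default=None):
    """Return int(raw) if raw converts cleanly, else the default."""
    try:
        return int(raw.strip())
    except (ValueError, AttributeError):
        return default

assert read_int("  42 ") == 42
assert read_int("forty-two") is None
assert read_int("forty-two", default=0) == 0
```

      The interesting part is what is absent: no caller ever writes the conversion, the exception handling, or the error path again.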

      Program logic is a tree, but source code is linear

      This is a problem that I did not even see until very recently. Very few programs are written procedurally at this point. The event driven programming model has taken over, and for good reason. Unfortunately, our entire source code creation process is derived from the procedural days. Look at some source code. What you see is that you have a bunch of units of code, all equal to each other. Even when working with object oriented languages, the tools themselves treat the code writing process as a linear, procedural system. You write an object; within that object, all methods are equal within the code. Navigating the code is tricky at best.

      Even with an IDE that collapses regions, functions, properties, etc., when the code is expanded, it is still a plain text file. The way we have to write overloads is ridiculous. The whole process itself is still stuck in the procedural world, but we are writing for event driven logic. The tools simply do not understand the idea that some blocks of code are inherently derivatives of, or reliant upon, other blocks of code. Too much code serves the purpose of metadata to the rest of the code (such as comments, error handling, function parameters, and more). It does not have to be like this, but it will require a major shift in thinking, both by the people who create the tools, and the people who use them.

      Code writing is too separate from the rest of the process

      Right now, the tools for completing a software project are loosely integrated at best. Even with the major tool suites, the tools within the suite are not all best of breed, and the better products just do not integrate well into the suite. For example, it would be pretty painful to write a VB.Net Windows desktop application in anything but Visual Studio. Even a simple ASP.Net application would be a hassle to work with outside of Visual Studio. Sadly, Visual Studio’s graphics tools are crude at best. Its database tools are not so hot either, especially for database servers that do not come from Microsoft. Adobe/Macromedia makes excellent graphics editors, but Photoshop, Illustrator, Freehand, Flash, and so on have zero awareness of Visual Studio, and vice versa. The graphics person has to do his work and then pass it to the programmer and the GUI person so they can see how it fits.

      Microsoft is trying to address this problem with the upcoming Expression system, but I am not holding my breath. I will believe it when I see it. This creates a problem where the graphics artist does not realize that their vision cannot be implemented within the code. The systems architects have a hard time seeing that their detailed database layout is nearly impossible to turn into a usable interface. The project manager does not get an idea of just what is needed to make the workflow go smoother. And so on and so on.

      It is great that the tool makers have brought testing and version control into the process. This helps tremendously. But these tools are still not perfect, and could use a lot of improvement, particularly version control. At this point, version control is still a glorified document check-in/check-out system with a hook into a difference engine. It has no awareness of the program itself, and it is still very difficult for multiple people to simultaneously work on the same section of code. Even then, as one person makes changes that affect others, the version control system is not doing much to help the team out. I worked at a place that used CVS; the system was so complicated that we barely used it. For what little it did, it was not worth the effort. Version control, even in a single developer environment, is a major pain point. I have some ideas on how to improve this, but this is not the time to discuss them.

      The situation is not as bleak as I paint it

      I know, I make it look like it is a wonder that programs get written at all. It is not quite so bad as that. But I think that it is time that the tools that we use to create software evolve to meet the why of the code writing process, as opposed to making the how easier. There are a lot of great things about the current tools, and I would not go back to vi and command line debuggers for any amount of money. But I also think that the tools that we have need to make a clean break from their past, and help us out in ways that they simply are not doing at this point in time.

      J.Ja

      • #3106330

        Improving the code writing process

        by jmgarvin ·

        In reply to Improving the code writing process

        I’d like to point you to a book from a really great guy named Allen
        Stavely.  It’s called Toward Zero Defect Programming.  This
        will really open your eyes and let you see that programming CAN work in
        almost any environment; it just takes a slightly different take on things.
        http://www.awprofessional.com/bookstore/product.asp?isbn=0201385953&rl=1

        On a side note: I don’t think programming in this method is any more
        expensive than not using it.  However, I firmly believe that the
        long-term costs are FAR greater if you don’t use this process.

      • #3106117

        Improving the code writing process

        by wayne m. ·

        In reply to Improving the code writing process

        Why to How Will Remain a Manual Process

        I guess I do not see any technology replacing humans in understanding the “why” of an operation.  There is simply too much visual, and to a lesser degree, audible information to process.  People are by far the most efficient means of processing diverse information.  Unfortunately, many of the processes put into place serve to isolate the developer from the sources of information.

        As to some other points, I would put the amount of custom code developed for data validation at a much higher level than 25%, and find that in many systems it is repeated in three places: client, server, and database.  I say repeated, not duplicated, as it is rare that precisely the same validation is performed in all three places and this problem is augmented because each place uses a different coding language.

        I will quibble slightly on the comment concerning procedural development.  Code is packaged, more or less, in an object organization, but I think most development is done (and should be done) procedurally.  An event is just a different means of launching a procedure (interrupt driven versus polled).  The lack of a standard mechanism to disable events or define critical regions has led to its own share of problems.  Back to my original point though, newer agile development methodologies are returning the focus to procedural development, away from the object development of the late 1980s.

        I feel the definition of the process of writing code is in its infancy.  To date, code development has been treated as a mysterious black box, and the focus has been on adding more and more restrictive controls before and after the actual code development.  Changing this will require developers, however, to actually work with the users to assimilate the “why” of the workflow.  Developers will not be able to sit in a dark room with specifications handed to them.

        Back to the original point.  People are simply the best available means for processing and consolidating information.  Understanding the why will remain a manual process, at least through the end of my working career.

         

      • #3285370

        Improving the code writing process

        by tillworks ·

        In reply to Improving the code writing process

        What about the O’Reilly “Cookbook” series? These stay near the top of my stack because they take the “why” approach. I’ve learned a great deal of “how” by searching for “why” solutions in these references.

      • #3103718

        Improving the code writing process

        by dogknees ·

        In reply to Improving the code writing process

        This won’t be a popular comment, but mine rarely are!

        One of the issues with all this is that people don’t seem to realise that writing software is a fundamentally difficult thing to do. Anything more than the trivial requires significant intelligence and effort to achieve a successful result. This is unlikely to change anytime soon. Like many other things in life, it cannot be made simple enough for the average joe to understand.

        Learning a new system is no harder now than it was 20 years ago. You still pretty much need to read all the documentation cover to cover two to three times before even starting to use the product. You just aren’t going to learn it in any easier way. Learning part of a system is pointless; you need the whole picture to make rational design decisions.

        This probably sounds pretty arrogant, but I don’t believe it is. The simple truth is that there are things in life that only some people are capable of doing at all, let alone doing well. Saying this is not arrogance, it’s reality. What percentage of people are capable of a 2 metre high-jump? To try and dumb everything down to the point where every person can do it is pointless and bound to fail. The world is just not that simple.

        So, some positive ideas. The industry needs to recognize the difficulty of the task and reward it accordingly. If it’s harder to write good code than manage the company, then you pay the coders more than the managers. Attract smart people in the usual way, by paying them decently. Stop treating coding as an entry level task that most will move on from to management. Make it the goal.

        As regards tools, the manufacturers need to start applying some of the effort they apply to user software to developers’ tools. Simple example: I do a lot of development in Excel/VBA. One thing that often happens is that you name a range and then use that name in your code. If you then rename the range for some reason, the code is broken. Why isn’t the system smart enough to work out that I’ve used that name and fix my code? It’s certainly possible, but does M$ do it? Never!

        That’s one simple low level example where the system could assist. It won’t make development a no-brainer, but it will alleviate the load and help us produce more stable software.

        One area where some progress is being made is in UML and related systems/products. If you work within the “system”, you can automate a significant part of the coding process and bring the problem definition closer to the customers view of the world. They can give you more useful information if they understand the specs and the process  better.

        I still think the quest for a silver bullet that makes development an automated process is going to be a long one, but progress is being made. I guess the hardest thing for many of us is keeping up with this progress, or more accurately, even getting to know it’s happening. Reminds me of something about alligators and draining swamps! It’s very easy, when your head’s down, to miss what’s going on around you.

        Personally I believe that we will develop true machine intelligence sometime in the next 20-40 years. Then maybe we’ll have automated developers. The job won’t have gotten easier, but the “tools” will be MUCH smarter.

        Best Wishes to All for Chocolate Day.

         

      • #3103706

        Improving the code writing process

        by tony hopkinson ·

        In reply to Improving the code writing process

        I certainly agree that there is far too much emphasis on how as opposed to why. Proprietary systems and their certifications have exacerbated this problem, in my opinion. Equally, education seems to have become more how-oriented; I’ve worked with more than a few recent graduates who have not been taught why we are where we are.

        They don’t know why structured methodologies were invented, and they don’t see the progression to OO. They don’t see the relationship between the procedural (linear) and event-based models. The basics, which I learnt because that’s all there was, are missed out or at best superficially covered. Worse still, and indicated somewhat by your post, is the idea that they are somehow no longer important.

        The ability to write code is a particular way of thinking. Some people can do it, some can’t. Same as some are mathematically gifted, others geometrically. I can see how IDEs could improve, but any attempt to control how a developer develops is another constraint in the process. So a more ‘intelligent’ IDE could help in terms of focus and the imposition of standards, but as a developer it is necessary to step back from the detail; sometimes the standards preclude what you wish to do for perfectly valid reasons.

        A piece of software that ‘writes’ code writes it in the ways its designers think you should. Were they correct? Will they always be correct? Who knows. I guarantee that if we went down this route with yet another phase of dumbing down the discipline, we would suffer even more bad design. If you don’t have the talent that is the ability to program, you will never be any better than the designer of the tool that stands in for the missing talent.

        Think of it this way: if you were to sit down and write a program to play chess, how much better would the program play the game than you do yourself? Certainly it would never miss fool’s mate, but it would never come up with a new gambit either.

        Chess is much simpler than programming.

         

      • #3287259

        Improving the code writing process

        by jefromcanada ·

        In reply to Improving the code writing process

        I agree with most of what you’ve said.  But I see the problem not as “how vs. why”, but rather as a “simple” problem of abstraction.  Regardless of how many libraries, functions, and standard procedures there are, programmers by their nature wish to reinvent things so they have their own stamp.

        Whether it’s reinventing the UI or the processes, there is an emphasis on the “nitty gritty” that you refer to as the “how”.  There are a few tools that generate standardized code based on requirements documentation.  Most also allow you to tweak the generated code.  The more you tweak, the more you drift back into the “how”.

        My current tool of choice is from the TenFold company.  Their tool removes all the “nitty gritty” programming chores from the implementation process, focusing instead on requirements building and rule-setting.  In an innovative twist on things, their product “renders” an application, just as a spreadsheet program “renders” a spreadsheet.  By building all the housekeeping tasks (like screen creation, menu generation, data validation, security implementation, data access, etc.) into the rendering engine, designers get to concentrate on the “why” (as you put it), and let the rendering engine create a completely standardized, secure, and fully documented running system.

        Caution:  The TenFold company is very small, and currently very cash-poor.  I don’t know how long they can continue as a viable entity.  Therefore, my comments refer to the quality and innovation of their product, and do not constitute a recommendation with respect to the company.

    • #3285920

      Anticipation and program design

      by justin james ·

      In reply to Critical Thinking

      A recent TechRepublic discussion focused upon the idea of anticipation in program design. I answered that the designer should not try to anticipate the limit of the user’s needs, and that the software should try to anticipate the user’s actions. What exactly does this mean, and how does it relate to my recent theme regarding how and why?

      Regular readers will be aware that I advocate the idea of the software developer viewing a project as a method of fulfilling the user’s ultimate goals, not necessarily writing the software that the user originally had in mind when they requested the software. Look at the common spreadsheet. It started as a means of replicating what accountants, bookkeepers, and other number crunchers were doing with ledger books. Now it gets used not just for that purpose, but as a quick, easy to use, lightweight database. If the developers of spreadsheet software had designed it so that all it could do was perform basic numerical computations on columns of numbers, it would not be very useful at all. It would have fulfilled the original design request (replicate with software what was done with ledger books), but would not have met the true why. The real why was “we need to be able to put data into a Cartesian coordinate system and perform operations upon that data.” As users push spreadsheets beyond their intended purposes, the software developers add in functionality to address the new uses, which allows even further innovation by the users.

      This is one reason why I am such a big fan of interpreted programming languages. Interpreted languages allow the developer and the user to quickly expand functionality beyond the original specifications in ways that were never imagined. A piece of software that exposes its functionality to a macro or scripting language (always an interpreted language) is always more useful than a program that is compiled with whatever functionality the developer put in and is unable to do anything else without talking to the developer. This is not to say that all software should be written in interpreted languages, of course, but that they should support the use of an interpreted language within the software itself. Providing the user with a simple macro language within your application directly addresses the idea of not anticipating the limits of your users’ needs.
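      As a rough illustration of exposing application functionality to an embedded scripting language (a Python sketch with invented names; any real macro system would be more elaborate), the host application hands the user’s script only the functions it chooses to expose:

```python
class App:
    """Toy host application that lets user macros extend it (illustrative)."""

    def __init__(self):
        self.records = []

    def add_record(self, record):
        self.records.append(record)

    def run_macro(self, source):
        # Expose only selected functions to the user's script; withholding
        # __builtins__ keeps the macro from reaching into the rest of Python.
        env = {"add_record": self.add_record, "records": self.records}
        exec(source, {"__builtins__": {}}, env)

app = App()
# A user-written macro doing something the developer never anticipated:
app.run_macro("add_record('hello')\nadd_record('world')")
```

Anything the host exposes this way becomes extensible by the user without a recompile, which is exactly the property being argued for above.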

      One reason that HTML and HTTP are abused as application platforms is because the interpreted nature of many of the application servers (Perl, PHP, ASP, JSP), as well as HTML and JavaScript itself, leads to instant results. It takes a small amount of time to crank out a pretty interface, only a little longer to get some sort of dynamic functionality going on, and everyone is happy until the drudge work of actually writing a quality application sets in. This is partly why VB got such a bad rap for so long; someone with little to no experience could spend a day making an interface through drag/drop, and then power it with totally garbage code on the backend. Before VB, Delphi, etc., you had to spend quite some time and know the Windows API fairly well just to get a basic window on the screen.

      Especially when working with an interpreted language, it is extremely easy to separate the business logic from the presentation logic, even in a desktop application. I discussed a potential project today with my boss. As we analyzed the user’s why (one user program), one thing that jumped out was that the user would want to have us make wide scale changes to the business logic as their needs changed, and that providing them with an entirely new installation with each change was going to be unrealistic to do. The solution that we are going to propose? The application itself will be written in VB.Net, but all it will do is pull data from the database and expose core functionality to a Perl interpreter that will eval() the contents of an encrypted file. The end result? Business logic can be edited and altered without requiring a full recompile/installation, and users with minimal programming skills should be able to make minor changes themselves. That is handing power to the user. They do not want to call us for every minor change, and we do not want to support them for every minor change.
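      The shape of that proposal can be sketched as follows (in Python rather than the VB.Net-plus-Perl combination described above; the rule source and function name are hypothetical): the compiled application stays fixed, while the business rules live in an externally supplied, editable piece of interpreted code.

```python
def load_rules(source):
    """Evaluate externally supplied rule code and return its entry point.

    In the deployment described above, `source` would be read from an
    (encrypted) file shipped separately from the compiled application.
    """
    namespace = {}
    exec(source, namespace)
    return namespace["compute_discount"]

# Version 1 of the customer's business logic, editable without a recompile:
rules_v1 = "def compute_discount(total):\n    return total * 0.25\n"
compute_discount = load_rules(rules_v1)
```

When the rules change, only the rule file is replaced; the host application, and the installation on the user’s machine, never need to be touched.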

      Especially when developing software that will be used by a wide and diverse set of users, it is vital that the developer not attempt to anticipate the limit of the users’ needs. Indeed, it is equally important when software is aimed at a very small, specialized set of users. The “large audience” software such as graphics editors, office suites, Web browsers, email applications, etc. can be used by so many diverse sets of people, that it is impossible for the developer to conceive of every possible way to use them. For the “small audience” program, the users tend to be extremely specialized and have their own particular way of working; what may be perfect for one user will be absolutely worthless to another user.

      On the other hand, the application itself should respond to what the user is doing, and anticipate their needs. This is, in many ways, a matter of interface design. A piece of software that responds to a user’s attention and gestures the moment it recognizes a unique pattern is more useful than one that does not. Photoshop, for example, shows not just a thumbnail preview of what the changes will look like as you adjust the values of the tool you are using, but also shows the picture itself changing. This saves a lot of time; instead of adjusting the values, clicking “OK,” then having to undo the change and try again, you know if the results are going to be what you want before you even click “OK.” It would be great if office suite software did the same: mouse over the “Bold” button? Make the selected text bold while the mouse is over the button, and un-bold it when the mouse leaves. That would let the user see if bold is really what they want before they commit to it.

      Prefetching data is another great way that software can anticipate users’ needs and deliver extra value in the process. If the user is paging through a large data set, go ahead and start loading the next page’s data in the background (resources and bandwidth permitting, of course; no one likes an application that makes a computer slow when you are not doing anything). The user will see instant results when they click to the next page, instead of waiting for the results. This is one reason why Web-based applications have much less potential than desktop applications; their mechanisms for caching stink, even when using AJAX methods.
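      The prefetching idea can be sketched in a few lines (Python, with invented class and method names; a real application would also bound the cache and throttle the background work): while the user reads page N, a background thread quietly fetches page N+1.

```python
import threading

class PagedView:
    """Illustrative sketch of background prefetch for a paged data set."""

    def __init__(self, fetch):
        self.fetch = fetch          # function: page number -> page data
        self.cache = {}
        self.lock = threading.Lock()

    def _prefetch(self, page):
        data = self.fetch(page)
        with self.lock:
            self.cache[page] = data

    def get_page(self, page):
        with self.lock:
            data = self.cache.pop(page, None)
        if data is None:
            data = self.fetch(page)  # cache miss: fetch in the foreground
        # Start loading the next page in the background so the next
        # click is (usually) instant.
        threading.Thread(target=self._prefetch, args=(page + 1,), daemon=True).start()
        return data

view = PagedView(lambda p: f"page-{p}")
first = view.get_page(1)  # fetched now; page 2 starts loading in the background
```

Note that correctness never depends on the prefetch finishing: a miss simply falls back to a foreground fetch, so the anticipation is pure upside.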

      Even the programming tools we use fail to anticipate. Developers’ tools are unique; they are written by the very people who are the intended audience. Yet they fall very short in terms of anticipation. Version control is especially bad at this; it does not let you know that another person changed code that yours relies upon in a way that breaks your code until you refresh your copy. It also does not notify that person that they are about to break your code. This is a situation which causes a lot of problems.

      I love the idea of code that anticipates the user’s needs. This is one direction that I see Steve Gillmor headed in with the Gesture Bank. If developers can access an aggregated source of user’s attention and gestures, they can write software that reacts as soon as the user begins to take action, not when they finalize the action. Really smart developers will have the software develop that knowledge on the fly on an individual basis. And that will let the user focus on why they are using your software, and not how they are using your software.

      J.Ja

      • #3286261

        Anticipation and program design

        by wayne m. ·

        In reply to Anticipation and program design

        Anticipating Needs or Understanding Current Workflow?

         I believe I am in agreement with your intent, but I would like to suggest a change in terminology away from “anticipating needs.”

        When I see the term “anticipating needs”, I usually envision a developer adding a new feature just because the developer thinks it might eventually be useful.  I do not think this is the intent of the discussion.

        I see the current discussion reflecting the need of the developer to understand the current workflow of the user and anticipate the next steps.  This is not about identifying new needs but is about understanding existing needs and work sequences.  If one understands the workflow, then it becomes possible to prepare for a following step while a current step is in progress.  This is basically the application-level version of instruction caching on a microprocessor.

        I am a strong advocate of the developer understanding the users’ current operating environment.  I do not believe in having the developer predict future functional needs.  I trust this is in agreement with the article intent and would recommend we avoid the term “anticipating needs” as that may carry a different connotation with some readers.

    • #3285679

      Ironies abound…

      by justin james ·

      In reply to Critical Thinking

      I am currently attending a “webinar” presented by MySQL and Spike Source (both are companies making money on open source). The “webinar” is about using open source Content Management Systems. The joke is that the “webinar” is being presented over Microsoft Office Live Meeting. I guess they are not yet ready to put their money where their mouth is…

      J.Ja

      • #3105765

        Ironies abound…

        by mschultz ·

        In reply to Ironies abound…

        I think they are; most CMSs are built on a MySQL database. To me, it doesn’t matter how they decided to do a webinar, open source solution or not. It doesn’t strike me as ironic at all.

    • #3264448

      If you want Windows, buy a PC, not a Mac

      by justin james ·

      In reply to Critical Thinking

      As I have said in the past, the dual boot story on Macs is simply not very compelling.

      There is tons of buzz out there about Boot Camp. A lot of people are saying that this is the best thing for Apple since OSX. Others are saying that this is the worst thing for Apple since the Newton. Personally, I think this is a non-story.

      Dual booting simply is not very useful! Both Mac OSX and Windows XP tie an extraordinary amount of their functionality to the file system. To dual boot and have your data be usable on both platforms would mean putting your data on a monster FAT32 partition, giving up the advantages of NTFS for the Windows XP installation and HFS+ for the Mac OSX installation.

      Furthermore, dual booting is a huge pain in the butt. Do you really want to interrupt your workflow, shut everything down, reboot, wait until that special moment to hold a button (to tell it to boot to the alternate OS), and re-login, just to start one application? Neither do I.

      The one thing I see as being a Good Thing with Boot Camp would be to have a small XP partition used for gaming. This would finally allow someone to own a Mac and actually play a game. Not only that, but when starting a game, it is common practice to shut down every possible application in order to provide the game with the maximum possible system resources. Not many people multitask with a game either. The idea of giving Mac OSX the ability to play Windows XP games is a good one, as this is a major reason why many people will not go with a Mac.

      On the other hand, do you really want to spend $100 on a Windows XP license just to play games? It is one thing to spend $100 on a piece of hardware to make your gaming experience better. It is another thing entirely to spend $100 to gain the ability to play games on a hardware platform that pound for pound is already significantly more expensive than a Windows PC to begin with. I can put together a decent gaming PC for around $800. That is only a tad bit more expensive than the cost of a Mac mini. And a Mac mini, even with the new Intel infrastructure, is hardly a gaming machine. Its graphics capabilities are not outstanding, its sound capabilities are not outstanding, you only get 1 GB of RAM for that price, and so on and so on. If you are talking about taking the Intel version of a PowerMac and playing a game on it, fine. But for the price of a PowerMac, you could be putting together The Ultimate Gaming PC (assuming you are not being stupid and convincing yourself that you need 500 watts of power supply to drive your computer). And even then, you would probably be better served by buying a Mac mini, a decent gaming PC, and a KVM switch.

      Unless someone has a lot of money to burn in the quest to play Windows XP games on a Mac, there really is no reason to be dual booting into Windows XP from Mac OSX anyways. When the Mac mini first appeared, I investigated the possibility of switching to the Mac platform for my day-to-day computing. What I discovered was that every single application I used my Windows XP computer for either had a Macintosh version, or an equivalent that is just as good. The only reason why I have not made the switch yet is for financial reasons. My at-home computer usage simply is not very complex or dependent upon a PC-only application. Indeed, with Mac OSX being able to run FreeBSD software quite easily, there is a large pool of free, open source software that often does the same thing as PC software, and often just as well. Outside of games, business environments, and software development for Windows XP, I just cannot find any reason why anyone needs to boot into Windows XP versus Mac OSX. And even then, the drawbacks are miserable. In fact, in a business environment, you effectively would not be able to use Mac OSX at all. So the Boot Camp software really does not help Apple penetrate businesses, and only a few people will be able to productively make use of dual booting.

      J.Ja

      • #3264446

        If you want Windows, buy a PC, not a Mac

        by steven warren ·

        In reply to If you want Windows, buy a PC, not a Mac

        Have you tried it yet? I have several friends who said that when they tried to put their XP key in during installation, it didn’t work.

        -ssw

      • #3264415

        If you want Windows, buy a PC, not a Mac

        by justin james ·

        In reply to If you want Windows, buy a PC, not a Mac

        No, I have not tried it. I do not have access to a Mac, and even if I did, I wouldn’t try it. I have gone the dual boot route a number of times in the past, and it was always an unpleasant experience for the reasons outlined above. At this stage, for me to try dual booting a Mac would be like sticking my hand on a hot burner on the stove and not expecting to be burned, just because every time the stove burned me it was a different burner.

        J.Ja

      • #3264366

        If you want Windows, buy a PC, not a Mac

        by georgeou ·

        In reply to If you want Windows, buy a PC, not a Mac

        Dual booting is the jack of all trades and master of none. Simultaneous booting with hardware virtualization is the jack of all trades and master of all.

        No one wants to dual boot, but everyone will want simultaneous boot IF it’s packaged to be friendly. The ability to flip instantly between OS X and Windows Vista which are both installed on top of XenSource is extremely compelling. For maximum performance, it would be even better if they allowed you to prioritize the OS with Focus or even pause one OS in favor of another. If Apple goes as far as simultaneous boot, it will have a VERY compelling argument and a huge differentiator over any other hardware manufacturer as the jack of all trades and master of all trades.

      • #3286428

        If you want Windows, buy a PC, not a Mac

        by somsubs ·

        In reply to If you want Windows, buy a PC, not a Mac

        It would be useful for those who only need to run one or two applications that are not native to the main operating system. Mac and Windows are both capable of running most of the things you need.

      • #3103813

        If you want Windows, buy a PC, not a Mac

        by apotheon ·

        In reply to If you want Windows, buy a PC, not a Mac

        I'd like to add a big fat "ditto" to what George Ou said. Simultaneous operation would be a good thing to have available. It's not the right answer most of the time, but it's usually a far better answer than a dual-boot setup.

        There are instances where you’re better off with a dual boot system, but they’re pretty rare. For my purposes, I’d usually want either two separate machines or Linux running Wine for Windows application compatibility most of the time, but I could see a virtualization environment being occasionally useful, especially if I can use that to cut down the number of machines I need to run closed source proprietary OSes.

        Just this last weekend I shoehorned a Linux install into a brand-new Thinkpad to make a dual boot system for someone. It would have been easier to set up a virtualization environment from scratch, but alas, I needed to keep the current Windows XP install on the thing, so I got to resize the NTFS partition and install Linux on what was left instead.

    • #3103767

      MSN adCenter Review

      by justin james ·

      In reply to Critical Thinking

      MSN adCenter: WOW. I just acted on an invitation to MSN adCenter on behalf of one of my customers. We decided to try to enter the beta, since the Overture/Yahoo! ads are randomly not showing on MSN Search when MSN tests their new system, typically late at night. Since the site sells consumer goods, it is just as important that ads are showing at 9 PM as it is to have them up at 9 AM.

      After the initial login, I was brought to an interface that just completely blew me away. Google AdWords and Overture/Yahoo! both have cluttered interfaces; Google's is especially poor. The MSN adCenter interface is clean, fast, easy to use, and well marked. I just cannot say enough good things about it. I have not reviewed all of its features, but it looks like it has everything that Google AdWords and Overture/Yahoo! have, while being easier to navigate.

      Additionally, my customer in the MSN adCenter program is in love with the pricing structure, more so than with Google AdWords or Overture/Yahoo! They are currently spending $150/month with Google AdWords, and they hit their daily budget cap by about noon every day. We are not currently able to perform log analysis (the logs are not available through their current web host), but we are pretty sure that the clicks from Google AdWords are pretty useless; indeed, we suspect that many of them are actually click fraud, but without the logs there is no way to tell. They have the opposite problem with Overture/Yahoo! The bids there are so cheap that they only incur about $7/month in charges, but Overture/Yahoo! has a $20/month minimum spend, so they are being charged for clicks they never get. Part of the problem is that this website is a Top 10 listing in the organic results of both search engines anyways, for nearly every keyword they can think of. I know, it sounds crazy that being in the top ten search results (even for generic terms!) can be a "problem", but as far as Overture/Yahoo! is concerned, it is! With MSN adCenter, there is a $5 signup fee, the minimum bid is only five cents, and you get charged at first in $50 increments (or every 30 days, whichever comes first); the $50 increment slowly goes up as your time in the program increases. There is no minimum spend, so you only pay for clicks you get. My customer is delighted.
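      The minimum-spend arithmetic is worth spelling out. A quick sketch (the dollar figures are the ones quoted above; the billing rules are simplified down to just the minimum-spend clause):

```python
def monthly_charge(actual_click_cost: float, minimum_spend: float) -> float:
    """Charge under a minimum-spend rule: you pay for the clicks you
    actually got, or the minimum, whichever is greater."""
    return max(actual_click_cost, minimum_spend)

# Overture/Yahoo!: ~$7/month of real clicks against a $20/month minimum.
charge = monthly_charge(7.00, 20.00)
wasted = charge - 7.00
print(f"Charged ${charge:.2f}; ${wasted:.2f} of that bought no clicks at all")
```

      With no minimum spend (as with adCenter's pay-per-click model), the wasted portion is zero by construction.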

      My only complaint so far with MSN adCenter is the system for importing keyword information. It took me about fifteen minutes to find a link to the spreadsheet, and when I found it, it did not match the examples or the otherwise well-written instructions.

      The help system is very well done as well. It feels like a Microsoft Office application: the browser window shrinks so that the Help window fills the remaining space, with tight interaction between the two windows.

      I was also surprised that MSN assigned us a representative to help walk us through the process and evaluate our needs. The representative actually calls us by phone; it is nice to hear a voice instead of sending emails and waiting for two days, as you need to do with Google or Overture/Yahoo!

      Overall, with the exception of the speed bump caused by the import spreadsheet, I am extremely impressed with MSN adCenter, and I look forward to exploring it in depth as my customers' needs require.

      J.Ja

    • #3105513

      Is Apache inherently more secure than IIS?

      by justin james ·

      In reply to Critical Thinking

      Richard Stiennon at ZDNet argues that Apache is inherently less vulnerable to attacks than IIS because it makes fewer system calls over the course of serving an HTML page, and is therefore less vulnerable to things like buffer overflow attacks. The argument, while it has some prima facie appeal, is specious. Let us examine in depth the truth of what he says:

      Both images are a complete map of the system calls that occur when a web server serves up a single page of html with a single picture.

      It is odd, but I cannot remember the last time a Web server was exploited through its basic static HTML serving functionality. Why? Because there is nothing to attack! Serving static HTML pages simply does not leave room for a buffer overflow, because the server is not running any arbitrary code; all it is doing is mapping the URI request to a local file and streaming the file to the client with the appropriate HTTP headers at the top. That is it. How are you going to attack that, except by attacking the method the server uses to process the headers, or maybe getting it to serve a file it should not?
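      To make that concrete, here is roughly what static serving boils down to, sketched in Python (purely illustrative; this is not how Apache or IIS actually implements it). Note that the one genuine risk named above, getting the server to hand over a file it should not, shows up as the docroot check:

```python
from pathlib import Path

def serve_static(uri: str, docroot: Path) -> bytes:
    """Map a URI onto a document root and return a raw HTTP response.

    No arbitrary code runs here: just a path lookup and a file read.
    The real attack surface is the mapping itself (e.g. '../' path
    traversal), hence the check against the document root.
    """
    docroot = docroot.resolve()
    target = (docroot / uri.lstrip("/")).resolve()

    # Refuse anything that escapes the document root.
    if docroot != target and docroot not in target.parents:
        return b"HTTP/1.1 403 Forbidden\r\n\r\n"
    if not target.is_file():
        return b"HTTP/1.1 404 Not Found\r\n\r\n"

    body = target.read_bytes()
    head = f"HTTP/1.1 200 OK\r\nContent-Length: {len(body)}\r\n\r\n"
    return head.encode() + body
```

      Everything past the path check is a straight file read; there is no user-supplied data being executed anywhere.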

      The more system calls, the greater potential for vulnerability, the more effort needed to create secure applications.

      I can agree with this. Except there is one little problem: Apache cannot be compared to IIS! Take a close look at what Apache does out of the box: it serves static web pages. CGI is disabled by default. Even if CGI were enabled, any vulnerabilities at that point are not in Apache, but in whatever is fulfilling the CGI request. IIS, on the other hand, has all sorts of functionality built into it, such as running ASP scripts, .Net applications, and so on, that Apache cannot do without the aid of third-party (or non-default) extensions. What does the system call tree look like for the entire LAMP stack compared to the Windows/IIS/ASP.Net/SQL Server stack? I bet they look much more similar. Sorry, pal, but you are making an apples-to-oranges comparison when you compare IIS's system calls to Apache's.

      Furthermore, how often does the Web server itself get attacked? Not nearly as often as the applications running on it. Poor programming habits (such as not properly validating data, misusing routines like printf() on input that was not validated, and so on) are the cause of Web application vulnerabilities. There are not many Web server vulnerabilities out there now, nor have there ever been.
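      The printf() point generalizes beyond C. Here is a sketch of the same class of mistake in Python (every name here — Config, SECRET, the render functions — is made up for illustration): letting unvalidated input act as the format template hands the formatter to the attacker.

```python
# All names here are hypothetical, purely to show the pattern.
class Config:
    SECRET = "hunter2"  # a value the developer never meant to expose

    def __init__(self):
        self.debug = False

def render_unsafe(template: str, cfg: Config) -> str:
    # The printf(user_input) mistake, Python-style: untrusted
    # input is used as the format string itself.
    return template.format(cfg)

def render_safe(cfg: Config) -> str:
    # Fixed template; user data is only ever substituted as a value.
    return "debug is {0.debug}".format(cfg)

# A malicious "template" walks from the object to data it was
# never meant to see:
payload = "{0.__class__.SECRET}"
print(render_unsafe(payload, Config()))  # prints: hunter2
print(render_safe(Config()))             # prints: debug is False
```

      The fix is the same in every language: the format string is code, so it must come from the programmer, never from input.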

      Poor systems administration is another source of common attacks. I don't care what OS you are running: when you have your Web server running as root or Administrator because that is easier than properly setting up permissions, you have a problem. A Perl script running as root outside of a chroot jail is much more of a problem than even the naughtiest ASP.Net application running on IIS as a restricted user. Period.

      Ignorance and laziness are the root cause of the vast majority of security breaches, not the server's OS or application stack. PERIOD. No OS or Web server in the world will protect you if a programmer sticks the input from a Web form straight into the WHERE clause of a SELECT statement, leaving it open to SQL injection. No amount of anti-virus or anti-whatever will help you if you have a sysadmin who lets the user upload a file to an area outside the acceptable one and then execute that file while the Web server runs as root. No firewall will save you if the programmer uses a function with a known vulnerability on data that has not been scrubbed.
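      The WHERE-clause example is easy to demonstrate. A minimal sketch using Python's built-in sqlite3 module (the table, rows, and function names are made up for illustration):

```python
import sqlite3

# Hypothetical table, purely for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('root', 1)")

def find_user_unsafe(name):
    # Form input stuck straight into the WHERE clause:
    # the classic SQL injection hole.
    sql = f"SELECT name FROM users WHERE name = '{name}'"
    return conn.execute(sql).fetchall()

def find_user_safe(name):
    # Parameterized query: the driver treats the input purely
    # as data, no matter what characters it contains.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

# A "name" that rewrites the WHERE clause to match every row:
payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # every row comes back
print(find_user_safe(payload))    # no rows: []
```

      The unsafe version turns the query into `WHERE name = '' OR '1'='1'`, which matches everything; the parameterized version never lets the input become SQL.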

      Those are the facts. Mr. Stiennon, I suggest that you learn the facts. You may not be a journalist, just a blogger (by which I assume you mean "I write subjectively, not objectively," which equates to "this is my opinion, not fact"), but you still have a responsibility as a representative (employed or not) of a well-regarded publication.

      J.Ja

      • #3287212

        Is Apache inherently more secure than IIS?

        by jaqui ·

        In reply to Is Apache inherently more secure than IIS?

        IIS has ASP support built right in?

        Really?

        That's why my neighbor, who hates open source products, gave up in disgust at getting an ASP page served from IIS and went with Apache.

        IIS was incredibly difficult to get ASP page support working in.
        Getting PHP or Perl support into Apache is simple in comparison.
        [This according to a person who adores Microsoft and won't use open source software if it's possible to avoid it.]

      • #3287159

        Is Apache inherently more secure than IIS?

        by justin james ·

        In reply to Is Apache inherently more secure than IIS?

        Jaqui –

        I cannot vouch for your friend’s experience, but yes, IIS does have ASP (and ASP.Net) support built right in, and the last I checked it was a non-removable part of IIS (unfortunately).

        I do agree, Apache is very easy to get up and running, as is adding extensions like PHP, Perl, etc. to it. But that ignores the basic premise of the blog, which is that IIS does more from the get-go. That is an objective statement, not a subjective one. IIS is slowly headed towards the level of modularity that Apache has, and Apache's modularity is awesome. But I stand by the contents of that part of the blog: when you add enough extensions to Apache to give it abilities equivalent to IIS's base functionality, it will make just as many system calls and be just as complex and prone to programmer error. Whether or not someone can figure out how to use IIS's base functionality isn't the point. I have had my own share of struggles with IIS, and know that it is not always a very pleasant system; I have found myself wishing I could just run vi /usr/local/etc/iis/iisd.conf or something to work with it…

        J.Ja

      • #3287092

        Is Apache inherently more secure than IIS?

        by merwin ·

        In reply to Is Apache inherently more secure than IIS?

        Strange – last time I checked, a buffer overflow is possible anywhere
        that a memory area is copied from one place to another, especially in
        the places where it intersects the stack.

        Kim

      • #3287049

        Is Apache inherently more secure than IIS?

        by phirephanatik ·

        In reply to Is Apache inherently more secure than IIS?

        Wow, what an ignorant post. Contradictions all over the place, and obviously someone who just doesn't understand the premise of the original argument.

        Regarding the calling of a static html page and an image, that's what you might call a "baseline reading". If you're testing the web server itself and the OS, you're not going to load anything special like php. That introduces all sorts of other variables into the picture, such as how php (or asp) is handled. This is comparing the webservers, NOT the things that can be plugged into them or put on top of them. And despite this blog's naive opinions, there are plenty of ways to attack a web server. I see it in my logs all the time. They go after IIS 'cause it's a very low bar, obviously. There are all sorts of hacks that you can do to just the simple web server, and that's all this was testing. An HTML doc and an image is a fair baseline measurement of how the web server goes down into the OS and back again.

        Your argument about what Apache comes with as opposed to IIS is a poor one as well. There's not much in the way of apples to oranges in this comparison. The request was for the same html file and the same image file. If IIS is leaving all these holes exposed when they aren't even being used, then that's yet another flaw in IIS. Long ago, we (except for Microsoft) learned that in the world of security, you turn it off by default and turn it on when you need it. I guess the author regularly runs a machine that answers on all ports without a firewall, based on the poor understanding displayed here. Would you mind posting your home IP address? I think you might have some friends that want to find you. As stated above (and in the original article, which you don't seem to understand), this is comparing the WEB SERVERS, not the middleware apps. If IIS is firing off its CGI handler to fulfill a simple HTTP GET request, then there's something seriously wrong with IIS (not that we didn't know that already).

        The web server itself is attacked far more often than this author would like to think. Can we at least get people writing here that are in the business? Sheesh. That's why anyone that's serious about this doesn't run IIS. It's been flawed since day 1 and has not become any better. If there were no flaws in the web servers, then I guess Apache would be complete. Done. Never has to be touched again. Ditto IIS. Since there are no web server vulnerabilities, it never has to be fixed again (assuming it had been written correctly the first time). However, in the real world and in real life, this is wrong, just like most of the poster's comments. Patches for web servers are a way of life, just like for any other piece of software. There don't have to be a lot of vulnerabilities in a web server; there just has to be one. That's all it takes. And the point is that with this spaghetti mess called IIS on Windows, finding and fixing any single vulnerability is obviously going to be a huge job, and it will be very hard to tell what other ramifications a fix will have when things are so poorly organized.

        The next two arguments about poor system administration are irrelevant to the topic at hand, as nowhere in the original article was any comment made as to how well the systems were maintained, etc. These are typical windows user responses to try to make themselves feel comfortable about how crummy windows is. Looks like the author more or less copied page 1 out of any security textbook. If you really want to get into it, though, the Apache way is very much better because the patches come much faster. We're starting to see, more and more often, people other than Microsoft writing patches to fix Windows since MS can't fix it fast enough. You'd think that a company with virtually unlimited resources could fix critical vulnerabilities in a reasonable amount of time, but they can't… and now we have a very much clearer idea of why.

        The author of this needs to learn where the facts really are (Microsoft is not where you "Get the Facts", hello). Stiennon is right and you are wrong. An organized code base helps you find and fix errors promptly, and also helps you avoid spaghetti errors in the first place. That's the whole point of this, and it could be applied to anything, be it web app development (hello php, asp), software development (ie, firefox), OS development, or what we're seeing here. This is common sense, nothing new. Further, the author of the original article that this author tries so hard to lamely bash isn't even making any new assertions. Anyone that's been out there for more than a week knows that you don't run IIS webservers. Ever. You're just asking to be hacked. This is what we call "old news". All the original author did was show us why IIS is in such horrible shape.

        I can't believe J.Ja wasted all this time writing such a load of garbage. None of it has anything to do with anything. Please, guy… think before you write. I doubt you will be, but you should be embarrassed to have signed your name to this.

      • #3104887

        Is Apache inherently more secure than IIS?

        by a. kem ·

        In reply to Is Apache inherently more secure than IIS?

        It might be that Stiennon is also trying to put fear into our hearts and boost sales of Webroot software – especially since they don’t have a workable Linux solution 🙂 Enable DEP and lots of these problems introduced by the traditional code injection into data space are thwarted by NX anyway, but that’s another topic altogether.

        You’re right on target though. If you make an apples-apples comparison, you’ll end up with vastly different (and comparable) results in the respective call trees. Applications are where the likely vulnerabilities are now, not so much the OS or services. Laziness and/or ignorance is the source of most problems.

        Furthermore, simpler isn't always better. Would you trade your trans-oceanic flight on a 777 for a flight on a Super Constellation? Heck, the 777's in-seat entertainment system is more complex than the entire Super Constellation. Just because something is easier to understand or appears simpler doesn't mean that it is more reliable (although there is a certain intuitive appeal to the notion that simpler is better).

        Good discussion. There should be more of it.

        A. Kem

    • #3104986

      No data is safe, not even data you already validated

      by justin james ·

      In reply to Critical Thinking

      I was subjected to an interesting new type of malicious code a few days ago, and I wanted to share it.

      A friend of mine asked me to help him with a little bit of PHP coding. He understands a bit about programming, but has never done anything complex and does not program very often. He sent me the script he had written, and I started tinkering with it; the more I played with it, the more it turned into a ground-up rewrite of his code.

      The script itself had fairly simple logic: parse the Apache access log, find other pages that have referred visitors to this particular page, and create a links page with the number of referrals. While doing some debugging, though, something odd happened. I was dumping the output of the raw log to the browser, and all of a sudden I was redirected to another web site!

      Digging through the log file, I found the culprit: a site spider had put a chunk of JavaScript within <SCRIPT> tags in its User-Agent header to perform a redirect. Obviously, someone had figured out that many people use Web-based log analysis tools, which will show you the user agents. I am grateful that the site I was redirected to did not contain any malicious code of its own.

      What made this attack extremely interesting to me is that it did not attack a particular piece of software, nor did it care what OS I used or anything else. All it needed was for someone to run code that did not validate data. Indeed, it is a very common developer misperception to assume that data, once it is in a database, is clean and does not need to be validated on its way out of the database. This is the real lesson learned here. I can put all of the input validation I want into my program. But if someone else's software also accesses the same database and does not properly validate data, I might as well not be doing validation at all, if I assume that the data is valid when I use it.
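      A sketch of the fix (the original script was PHP; this Python version just shows the principle, and the payload string is a made-up example): treat every field as hostile on the way out, even when it came from your own log or database.

```python
import html

# A hypothetical log field carrying the kind of payload described above.
user_agent = '<script>window.location="http://evil.example/";</script>'

def render_referrer_row(referrer: str, count: int) -> str:
    # Escape at render time, regardless of any validation done when
    # the data was logged or stored; another writer to the same log
    # or database may not have validated anything.
    return f"<li>{html.escape(referrer)}: {count} visits</li>"

row = render_referrer_row(user_agent, 3)
print(row)  # '<' and '>' in the payload are now &lt; and &gt;: inert text
```

      Escaping on output means the payload is displayed as text rather than handed to the browser to execute, no matter how it got into the data store.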

      This is another example of how ignorance or laziness on the programmer's behalf can become a major catastrophe. Imagine if a piece of software had been written before JavaScript was introduced and were still in use: the programmer would not even have known that there was this kind of attack to prevent.

      This is yet another reason why I am down on Web applications; it is the only system I can think of in which input from one user is presented to another user in a way that the second user's computer will parse, interpret, and maybe even execute, outside of the control of the developer. In thin client and desktop application computing, the programmer has total and complete control over the presentation layer and what occurs there. In Web applications, the presentation layer is a complete no-man's land. There is no telling what will happen there. Data that is good today may become dangerous tomorrow if some new technology gets added to the browser and creates a browser issue. One example would be allowing users to post videos online: if there is a buffer overflow problem in a user's media player of choice, then you (the programmer) are giving malicious users a tool to attack other users. Web services are just as bad, particularly when using an AJAX method that takes your software out of the loop. In those situations, you do not even control the third-party website. It could be riddled with problems, and you would not even know it until users contact you asking why your software infected their computers with malware or crashed them completely.

      At the end of the day, I was able to complete the script. Naturally, I made sure to strip any and all HTML, JavaScript, etc. from the input as it was being read. But it was a great reminder to me that no matter how many external parsers, validators, etc. that a piece of data goes through, they may not be providing the validation that my application requires. Input that is healthy, acceptable, and possibly even desirable for one program is not necessarily so for another program.

      J.Ja

      • #3103869

        No data is safe, not even data you already validated

        by wayne m. ·

        In reply to No data is safe, not even data you already validated

        I’m not sure that I agree with some of the specific suggestions, though I strongly believe that any system needs to define a data validation and error handling policy.

        Validation of data coming out of a database, and having different data validation rules on different systems, provides sources of error and manual rework. Any time data is rejected, human intervention is required to reconcile the data. This implies that the appropriate place for data validation is at the user interface, to allow data correction to occur with the person most knowledgeable about the data being entered. Data that is rejected later in processing can only be put into a queue for handling by operations personnel who may lack the knowledge needed to reconcile it.

        The same logic leads to the conclusion that applications that share data must accept the same data validation rules.  Data delayed or lost during transfer between systems leads to duplicate data entry and loss of data synchronization between systems.  Applying differing rules ensures manual effort is required to share data. 

        Data validation rules must be consistent and the best way to maintain that is to have a single point for creation, update, and delete of data items; only read capability is shared among systems.  This allows data validation to be implemented and maintained in one location.

        The key is to establish a system-wide data validation and error-handling process.  Having each individual component maintain its own private rules is a recipe for lost data.

      • #3104195

        No data is safe, not even data you already validated

        by tony hopkinson ·

        In reply to No data is safe, not even data you already validated

        If you do not have control over all input, you have got to validate on output. Not just for web-based tools.

        Simple CD collections database with Access: you shouldn't write the application assuming that, because you carefully validated the input, it's going to be correct on output. You don't have control, the user does, and they can get at the raw data without going through any of your careful work.

        Belt, braces, a bit of rope, and a regular visual check that your trousers aren't round your ankles is a requirement.

        Murphy's law is a universal constant; design with it in mind.

    • #3104022

      Both Internet Explorer and Firefox Need To Be Held to a Higher Standard

      by justin james ·

      In reply to Critical Thinking

      David Berlind over at ZDNet recently responded to George Ou’s piece on the possibility of media bias regarding coverage of Internet Explorer and Firefox.

      Let us apply my patented Logic Analysis System on the concept of holding companies to different standards for the same type of product, using some of Mr. Berlind’s arguments:

      “Internet Explorer can be held to a higher standard than Firefox because it has been on the market longer. In addition, Microsoft has plenty of money and can hire the best engineers in unlimited quantities. Microsoft has been making bold claims about IE whereas the Firefox team does not. Therefore, when Internet Explorer has defects in terms of stability and security it is much more of a problem than when Firefox does.”

      Let’s first parse out all product references and replace them with variables:

      “{Product A} can be held to a higher standard than {Product B} because it has been on the market longer. In addition, {Company A} has plenty of money and can hire the best engineers in unlimited quantities. {Company A} has been making bold claims about {Product A} whereas {Company B} does not. Therefore, when {Product A} has defects in terms of {Defect X} and {Defect Y} it is much more of a problem than when {Product B} does.”

      Now, let’s try filling in some new values:

      Product A = “Ford Explorer”

      Product B = “Kia Sportage”

      Company A = “Ford”

      Company B = “Kia”

      Defect X = “engine fires”

      Defect Y = “brake malfunction”

      “Ford Explorer can be held to a higher standard than Kia Sportage because it has been on the market longer. In addition, Ford has plenty of money and can hire the best engineers in unlimited quantities. Ford has been making bold claims about Ford Explorer whereas Kia does not. Therefore, when Ford Explorer has defects in terms of engine fires and brake malfunction it is much more of a problem than when Kia Sportage does.”

      All of a sudden, this type of statement doesn’t sound so great, does it?

      What I find even more amazing is that the Web 2.0 cheerleaders seem to closely overlap with the Firefox groupies (as well as the Google bootlickers, for that matter). They want to replace most userland applications with Web-based applications, yet they seem to have no problem with the web browser being filled with problems. Is it fine to use a buggy, error-prone product as long as it is GPLed? Especially when you want the browser to replace most userland applications? Get real.

      Some argue that since you pay to use Internet Explorer (indirectly though an OS license) and Firefox is 100% free, that Internet Explorer can be held to a higher standard. This is an excellent point, but not completely correct. If you had a choice between paying for IE and not paying for IE (such as purchasing it separately, or as an add-on to Windows) then this might be a legitimate argument. Similarly, if the inclusion of Internet Explorer in Windows played a part in your choice in operating systems, then it could be said that you are paying for Internet Explorer. But in all honesty, if you chose Windows for reasons that have nothing to do with Internet Explorer, you did not really pay for it; you got it as gravy.

      A lot of people also emphasize that Firefox is a new product, and that where it is at this stage of development (better than Internet Explorer on some things, more features than Internet Explorer, not as good on others) is amazing considering its age. This is a completely bogus statement! Firefox actually comes from an older code tree than Internet Explorer. How is that? Firefox is the Mozilla Web browser at heart, sans the Mozilla suite (it originally started as a lightweight browser, but it is now just as heavy as Mozilla). And where did Mozilla come from? It came from Netscape, when Netscape open-sourced Navigator ages ago. Netscape was on the market before Internet Explorer. So to give Firefox bonus points for its youth is simply ignoring history.

      The real truth is that Internet Explorer and Firefox need to be held to the same standard, and that standard is not the one Firefox is being held to; it is the one Internet Explorer is being held to, if not a higher one. It is downright shameful that Internet Explorer still has as many bugs and security holes as it does. I do not know if ActiveX is disabled by default, but it should be. Internet Explorer, to be frankly honest, is a dog: it is not standards compliant, its PNG rendering is messy at best, ActiveX is still filled with security holes, and so on. If Internet Explorer needs to meet this standard, then so should Firefox. I do not consider either one of them to be a particularly great product. I rarely use Firefox, except to test cross-platform compatibility, so I cannot truly judge it as a browser; but from all of the reports I read regarding its stability, I think I would prefer to avoid it for the time being. When it comes to web browsers, my personal choice is fewer features, possibly less security (I do not think we will be able to really judge Firefox's security for a while longer), and more stability. I rarely go to Web sites that I am not familiar with, I have locked down Internet Explorer fairly tightly, and I most definitely do not go to Web sites of a questionable nature. But I frequently have a web browser open, and cannot afford to have it crashing on me repeatedly.

      There is no extra credit for an engineer who designs a bridge that only partially collapses. There is no curve for an automaker whose faulty product kills only 25% of the people who own it. There are no bonuses for programmers who write code that allows security breaches. Period. It does not matter who you are. So why give one product a free pass (or reduced-fare admission) and not another? Especially when both browsers need a lot more work before they are ready for a world of online applications replacing desktop applications?

      J.Ja

      • #3104323

        Both Internet Explorer and Firefox Need To Be Held to a Higher Standard

        by tommy higbee ·

        In reply to Both Internet Explorer and Firefox Need To Be Held to a Higher Standard

        Reasons why Internet Explorer SHOULD be held to a higher standard

        (Ignoring both the question of a) whether it IS held to a higher standard and b) how well it meets the standard)

        1) It’s part of the operating system, even if you don’t use it.  You can’t uninstall it, you can’t get rid of it. If there’s a vulnerability in IE, you’re stuck with it, and any patches must be applied EVEN IF THE PC NEVER BROWSES THE INTERNET (for example, servers)  Nothing quite like having to shut down and restart 5 separate servers that are part of an application because the latest Windows update for IE came out….

        2) It’s part of the operating system, and any flaws affect the entire OS.  If Firefox crashes, you can just restart it.  IE  crashing is much more likely to require a reboot.

        3) ActiveX:  The single biggest security hole on every Windows PC, the single biggest vector for spyware.  ActiveX controls have every bit as much right to your PC and its hardware as the OS.  (Do we detect a theme here?)  Security for ActiveX controls focuses on only one thing: preventing them from being installed unless you're sure you want them.  There is no "sandbox", unlike Java.

        4)  Regardless of whether or not you bought the PC FOR IE, you still paid for IE.  It’s only reasonable to expect more from a product you paid for.

        BTW, while you COULD take the attitude that you didn’t buy Windows for IE, so it’s just gravy, you could just as easily say that you were forced to buy IE whether you wanted it or not (it’s undeniably part of the cost of producing the OS you DID buy).  Which attitude is more right?  If you use Firefox and avoid IE, you might resent having to pay for IE.  In that case, as unwanted software, it SHOULD be held to a higher standard.  Hey, there’s reason number five!

      • #3104269

        Both Internet Explorer and Firefox Need To Be Held to a Higher Standard

        by conceptual ·

        In reply to Both Internet Explorer and Firefox Need To Be Held to a Higher Standard

        Amen 

      • #3104197

        Both Internet Explorer and Firefox Need To Be Held to a Higher Standard

        by jhilton ·

        In reply to Both Internet Explorer and Firefox Need To Be Held to a Higher Standard

        In the article you state that you rarely ever use Firefox because you’ve read reports of instability. If you were to actually use it on a regular basis, you would discover those claims are completely false. I have been using Firefox on many different machines in many different environments since its betas, and the only thing that has been able to crash it is Adobe Reader (surprise, surprise). If a product is rapidly stealing market share, it must be doing something better.

    • #3286916

      MapPoint is nearly uselessly crippled as an ActiveX control

      by justin james ·

      In reply to Critical Thinking

      I have spent the last three weeks of my life, more or less, trying to do what should be a trivial task: get a map out of Microsoft MapPoint 2004 (the desktop app, not the Web service) and save it as an independent image. There are all sorts of problems with this though!

      It seems that someone at Microsoft decided to cripple the MapPoint ActiveX control, for reasons beyond my understanding. Functionality that exists within the MapPoint software and is accessible via VBA simply cannot be used from outside of VBA, such as saving the map as an HTML file. The suggested solution? Fire up an entire instance of MapPoint and programmatically command it to do a “Save As”. For whatever reason, what is a 1 second operation through the MapPoint application becomes a 2 to 20 minute (you read that right, 20 minutes for some maps!) operation when run from VB.Net. During this time, MapPoint likes to glom 100MB of RAM and 100% CPU.

      Every single method that I have seen suggested simply does not work. Even telling it to copy to the clipboard fails after about 1300 iterations (I am looping through a large number of maps). The only thing anyone can tell me about this error is to keep retrying it within a loop and eventually it will work. Code that fails at random is not code I think I want to be using.
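
      For what it is worth, the “keep retrying in a loop” advice boils down to a wrapper like the one below. This is a generic sketch in Python rather than VB.Net, with a hypothetical stand-in for the flaky clipboard call (none of this is real MapPoint code), and note that it merely masks the random failure rather than fixing it:

```python
import time

def retry(operation, max_attempts=5, delay=0.1):
    """Call operation(); on failure, wait and try again, up to max_attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: re-raise the last failure
            time.sleep(delay)  # give the misbehaving component a moment to recover

# Hypothetical stand-in for the flaky clipboard call: fails twice, then succeeds.
calls = {"count": 0}

def flaky_copy_to_clipboard():
    calls["count"] += 1
    if calls["count"] < 3:
        raise RuntimeError("clipboard busy")
    return "ok"

result = retry(flaky_copy_to_clipboard)
print(result)  # → ok
```

      To be clear, this is exactly the kind of band-aid I am complaining about: wrapping the call in retries hides the error instead of explaining it.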

      What bothers me is the decision behind this. I just cannot fathom why Microsoft included only basic functionality in the ActiveX control, especially when that functionality exists within MapPoint itself. Since I need to have MapPoint installed to use the ActiveX control, it is not as though they would lose sales by providing full functionality. This reminds me a lot of the undocumented Windows APIs that Microsoft used to keep other companies from writing software as capable as its own. If it were not for the fact that MapPoint costs a fraction of what my other options do, I would be more than happy to dump it.

      This is actually my first time using an ActiveX control of an Office product. Up until now, all of my Office programming has been with VBA within the application itself. Has anyone else had these kinds of problems? If so, I may need to reconsider a few things in how I was hoping to do some future projects.

      J.Ja

    • #3287516

      Geeks and Communication Skills

      by justin james ·

      In reply to Critical Thinking

      It is a commonly held belief that geeks do not need to be able to communicate outside of Nerdland. In fact, it is an outright expectation. Programmers who get nervous around pretty girls, systems administrators who cannot give a presentation to more than two people at a time, and DBAs who stutter unless they are discussing Dungeons and Dragons are what many people envision when they think of IT professionals. Sad to say, many IT professionals buy into this idea, and sometimes even actively encourage it!

      I am not going to pretend to be surprised by this. Up until the age of sixteen or so, reaching Level 4 as a bard seemed more important than reaching first base with a woman. Weird Al Yankovic was “romantic” in my mind, and a “nice wardrobe” meant a closet full of shirts from hardware and software vendors, preferably ones with multiple years’ worth of pizza stains on them to prove my “authenticity”. I thought that if people did not understand me, it was because they were stupid, not that I was unable to communicate with them.

      Thankfully, I changed. Mostly. I still think Weird Al is funny on occasion, and the ratty shirts are still there (though they now tend to be Metallica and Mr. Bungle shirts from my post-ubergeek years). The biggest change was that my communication skills improved significantly. I took classes in high school such as AFJROTC and Mock Trial that taught me how to speak to an audience, with or without notes. My classes in college (I will merely admit that I double majored in “cannot-get-a-job-ology” which is code for “the liberal arts”) involved few tests, but endless amounts of paper writing. What few tests there were tended to be essay questions. In other words, I was learning a lot about communication skills.

      What does this have to do with the IT industry? Plenty. If you want to know why your manager seems to be a “grinning idiot” with no clue what your job is, instead of someone with technical skills, take a look at what that manager brings to the table. That manager is very likely to have an MBA or maybe an MIS degree. Their external learning is probably in “risk management” or Six Sigma, not the Cisco or Red Hat certification you just earned. The manager’s job is to interface between “the suits” and the IT people. The manager does not actually need to know how to do your job if you communicate your needs to him properly. What the manager does need to know is how your job relates to the business.

      It has been my experience, since I started blogging about IT issues on TechRepublic, that the majority of the time when I receive heavy criticism, it is because I failed to write clearly and properly communicate my message. Sure, there have been instances where someone climbed all over me for using one bad example or analogy in a 3,000 word post, or where someone was obviously unable to comprehend the topic at hand. But by and large, when I receive negative feedback, it is my own fault for not writing clearly.

      At my current position, my manager does not understand much programming (he knows some VBA), systems administration, database administration, networking, computer repair, or any of the other tasks I do. He knows how to run the company, deal with customers, and so on. He really does not need to know the gritty details of what the project is hung up on; he just needs to know how long the delay will be. He does not care what brand of motherboard I buy or what CPU I select; he needs to know the price and business justification for the expenditure.

      Many of the IT people that I have worked with simply do not understand this. They fill a proposal with technical details, and expect the person reading it to understand the benefit of the proposal from the technical information. In other cases, they write an email that is littered with typos and spelling mistakes. These types of mistakes do not help the recipient to understand why they should approve your request or give your project more resources, or otherwise help you with whatever goal it is that you are trying to accomplish. Tailor your message for the audience. If the recipient is a technical person, make it technical. If they are a non-technical person, use language that a non-technical person can understand. As I often do for programs that I have written, I pass it through the “Mom test.” In other words, I ask my mother to review it. She is about as non-technical as it gets. If my mother can understand what I have written to the point where she can make an educated business decision, then it is a good communication.

      Many of the IT people out there seem to think that this is degrading. These are the same types of IT people who make web sites that only display in one particular web browser, or require you to go find some funky external library, or insist that you recompile the application yourself without providing any documentation. These are the IT people who may be excellent at their jobs, but are hated by everyone their job touches. You do not need to go this route. No one will criticize you or complain if you learn to effectively communicate with non-technical people. In fact, they will appreciate you even more. My experience has been that improved communication skills lead to better opportunities in life and in my career. If a manager is evaluating two candidates for a promotion, they are more likely to pick someone with less technical skill who communicates well than a more technical person who does not communicate well. Why? Because the person with good communication skills is able to show that they know what they are talking about, while the person without those skills simply cannot be understood.

      If you feel that your communication skills may be lacking, there are things you can do to improve them. One suggestion is to read more books and magazines. If you already read books and magazines, escalate the difficulty level of your readings or try reading about topics that you are not familiar with. I have found that crossword puzzles are great tools to expand your vocabulary. Try your hand at writing something, whether it be short fiction, how-to articles, or poetry. If you can, try to go to new places or talk to different people; sometimes we find ourselves in cliques with a shared mindset that makes it difficult to learn how to communicate outside of that group. There are lots of different ways to improve communication skills, but at the end of the day, they all amount to “increase the frequency of your communications, the diversity of the mediums, and the people that you communicate with.”

      J.Ja

      • #3287413

        Geeks and Communication Skills

        by georgeou ·

        In reply to Geeks and Communication Skills

        Nice job!  This should be mandatory reading for all IT people!

      • #3104428

        Geeks and Communication Skills

        by Jay Garmon ·

        In reply to Geeks and Communication Skills

        I second George’s motion. (And since I have a small hand in promotions around here, I can actually do something about it.) Your “Secret Origin” is remarkably similar to my own, though I stuck with the communications angle rather than the path of technical competency. I guess that’s why I write trivia questions and you design software (lousy pay grade differential). Every job I’ve had, every major success I’ve enjoyed, every great D&D character I’ve played, each was made possible by two basic skills: critical analysis, and my ability to communicate. There is no professional (or, I dare say, human being) who could not benefit significantly from cultivating both. Who knows, it might just get you a gig writing a column enjoyed by dozens while posing for your author photo in a NextGen uniform.

      • #3104395

        Geeks and Communication Skills

        by sarathi ·

        In reply to Geeks and Communication Skills

        Eloquently put!

        I do what you are suggesting to improve communication, though I haven’t tried crossword puzzles. I read a lot of books, mostly fiction, but written in different languages and translated into English. These books always have some less frequently used words, and you need to keep a dictionary handy while reading them!

        You hit the nail on the head when you talked about moving with the same set of people. I have first-hand experience of my communication level going down the drain whenever I become lazy, stop reading magazines and writing, and start spending a lot of time with poorly communicating peers! 🙂

      • #3104364

        Geeks and Communication Skills

        by wayne m. ·

        In reply to Geeks and Communication Skills

        Absolute Agreement

        Working with software developers, I am in absolute agreement.  Good technical skills help an individual do a good job, but good communication skills help everyone do a good job.  Some of my personal recommendations for improved communications follow.

        1) Adopt the “Say it three times” pattern.  Use an introduction, body, and conclusion both verbally and in written communications.

        2) Purchase and read Strunk and White’s “The Elements of Style.”  This is a classic, it is short, and, even though it is a grammar book, it is an enjoyable read.

        3) Get some public speaking training.  There are dedicated training courses and college and community college courses; the low-cost option is Toastmasters.  Toastmasters clubs can be found at http://toastmasters.org/ ; look for the “Find a Club” link.

        4) Write some papers meant to sway opinion, typically either purchase justifications or proposal and new business work.  I’ve been surprised at the number of technical people who, when asked to write a justification for something they want purchased, just go away and sulk.  Hey, I want to get you the tools you need, but I need some help to do it.

        None of the above recommendations is very costly or time-consuming.  The entire set is no more difficult than obtaining a new technical certification or learning a new programming language.  The advantage of improved communication is that it opens a door to a wide range of interesting new career paths.  Try some of these ideas out and take your peers along for the ride.

         

      • #3285241

        Geeks and Communication Skills

        by tony hopkinson ·

        In reply to Geeks and Communication Skills

        I agree with what you are saying, though I personally don’t see the problem as anywhere near as widespread as some would have us believe.

        One tip I suggest: it’s not enough to simply remain non-technical. What you need is a meaningful analogy. I explained the programming concepts of scope, coupling, and cohesiveness to electrical engineers in terms of circuit design. I knew enough about that not to give them an illogical comparison.

        This is the same argument as talking to business people in business terms, i.e., communicating in terms they understand.

        The most disheartening aspect, though, is when you do this and they ignore you. Present them with a long-term future and ongoing cost vs a smaller short-term bodge, and they go for the latter every time. Even after you explain that this does not make the long-term cost go away; in fact, it increases it.

        So even though we’ve learnt to talk in their language, they still aren’t listening. Push us on this front and we have no argument except technical aspects, and so we confirm our stereotype.

        Now maybe I’m not business-aware enough to realise why short-term gains are preferable to long-term success. Maybe one of these business types should explain it; then we can all row the boat in the same direction.

         

      • #3285186

        Geeks and Communication Skills

        by justin james ·

        In reply to Geeks and Communication Skills

        Tony –

        Your comments are always insightful and great to read, even when we disagree (although in this case we do agree). I do know the answer to the short sightedness issues, and I will be blogging about it shortly. You are right, this is a major problem, particularly with IT projects that tend to have a large initial investment. Stay tuned, and thanks for the great idea for my next post.

        J.Ja

      • #3271304

        Geeks and Communication Skills

        by vaspersthegrate ·

        In reply to Geeks and Communication Skills

        I began as an English/Creative Writing major in college, spent many years as an ad writer and direct marketing strategist, then expanded my skills by launching out into internet marketing, blogology, web usability, and ecommerce. So my primary expertise lies in communication and the analysis of text and design.

        This background enables me to say, from my point of view, that it’s not always easy to describe technical issues and products to non-technical people, but I greatly enjoy the challenge and the reward. To see the light of understanding click on suddenly within a client’s eyes, or to see in a blog comment the vivid comprehension of a reader, is a joyful and fulfilling event.

        Technical documentation requires rigorous thought, careful observation, and detailed progression from step A to step B to step C, without ever assuming, “they’ll automatically do this” or “I’m sure they are already at this step (a few steps into the total process)” or “they won’t need me to mention this obviously mandatory activity”.

        To start at the real world Square One, as users behave without coaching, FAQ, tool tips, help desks, or site search reliances, is an exacting art.

        Highly technical persons should read “dumbed down” popular tech books, and extremely simplified online sources, like HTML Goodies, to at least get a feel for how a patient, super easy explanation can be presented. I like to use an esoteric specialist term in my writings, immediately followed by a parenthetical definition or a synonym string.

        In addition, it’s good for technical personnel (all of whom must necessarily explain things at some point, to someone) to read such authors as Hemingway, Kafka, Twain, Dickens, Proust, Joyce, Eliot, Steinbeck, Faulkner, and Poe to learn how to write clearly and with great impact. Poets may also be studied or reviewed to gain techniques for adding interesting, colorful, inventive expressions to a description, when such elaboration, analogy, and emotion could prevent the text from being dull, dry, and forgettable.

      • #3148686

        Geeks and Communication Skills

        by charlie.lacaze ·

        In reply to Geeks and Communication Skills

        If you want to get your point across, you should try teaching as a profession. Try teaching non-nerds some simple software tasks and you’ll understand the level of nerd expertise they possess. Rule #1: understand that just because upper-level management doesn’t speak nerd doesn’t mean they aren’t proficient at what they do, and you may be bidding for dollars they would rather use in their realm of expertise. Rule #2: never assume that everyone understands the simple things. I’ve had to teach folks how to use a mouse before I could teach them basic software skills, but I would trust them with my health care. (You guessed it… doctors and nurses)

      • #3148647

        Geeks and Communication Skills

        by carter_k ·

        In reply to Geeks and Communication Skills

        If you can afford, or get your company to pay for, continuing education, I recommend a program like that at Mercer University: a fully-online master’s degree program in Technical Communication. See http://www.mercer.edu/mstco/ for details. I heartily agree that interpersonal communication is crucial to success at work, whether that means getting promoted or simply being understood by those you work with. It’s not enough to just be smart or a technical hot-shot. You need to be able to communicate the brilliant ideas you have.

      • #3148570

        Geeks and Communication Skills

        by mcphaim ·

        In reply to Geeks and Communication Skills

        People interested in improving their communication skills might want to consider joining a Toastmasters International club.

      • #3148518

        Geeks and Communication Skills

        by mitchlr ·

        In reply to Geeks and Communication Skills

        Having been employed in IT for more than a decade, I have more than once found myself reduced to incredulity, not only owing to the seeming infacility of some of my colleagues with the English language, but also with their apparent blithe unawareness that communicating with others outside the tribal confines of the geek community may be a desirable goal. 

        One could wish that l33t hackers might breed with the English majors up in the communications department, who are as infacile with technology as the geeks are with language, and hope that hybrid vigor would produce progeny with the ability to make servers tap dance and to write a clear elucidation of how and why it was done and how it benefits the company.  It is more likely, however, that such a pairing would reinforce the negative attributes rather than the positive, with a resulting individual who could neither speak, read, write, spell, nor remember his password.

        TBG58

      • #3148499

        Geeks and Communication Skills

        by dirtclod ·

        In reply to Geeks and Communication Skills

        Great Article!

        I question the integrity of individuals who supposedly grasp complex technological issues, yet claim an “inability” to master their native language. I believe it’s more accurately described as “Selective Application Laziness,” and quite possibly moderate narcissism; either way, it’s by choice and it’s a cop-out. If someone can’t master simple communication, I wouldn’t hire them to walk my dog, much less run my servers. The guy can’t SPELL, but you’re trusting him to CODE or work in a BIOS environment?! You mean to tell me that you cannot learn basic sentence structure but have the ability to design networks? I’m sorry, but I’ve seen the output of the south end of a bull before.

        I once worked with scientific researchers who (unlike me) had 3 degrees in things like Quantum Physics, Nuclear Physics, etc. The brightest ones, who had the Nobel Prizes, could EXPLAIN IN PLAIN ENGLISH what they were doing! The PRETENDERS couldn’t; they invented arcane slang and needless complexity in order to JUSTIFY THEIR EXISTENCE. Scientists who communicate get GRANTS. IT people who want the boss to GRANT funding for projects might find the time to learn our language. If not, that’s where I question these alleged “savants’” true intelligence. It’s that old axiom: “If you can’t dazzle em with brilliance, baffle em with b.s.”

        Now as far as being uneasy around attractive women, speaking in front of groups, etc.? I have great empathy for these people, many of whom have been cruelly embarrassed, rejected, or mistreated by some of these narcissistic pretenders; there’s usually a good, concrete reason. Social skills are not learned in a classroom; they’re gained by “hard knocks” and learned on the fly. At any rate, there’s a great degree of LUCK involved in whether or not someone masters advanced social skills.

        Any time you communicate, someone will not receive the information as you intended. That’s no reason to give up; if you can’t make a mistake, you can’t make anything. If companies continue to hire mumbo-jumbo masters who intentionally invent complexity as some form of mental masturbation, then those companies will be: up an unsanitary tributary with no feasible means of transportation!

      • #3148497

        Geeks and Communication Skills

        by duckboxxer ·

        In reply to Geeks and Communication Skills

        Great article.  I went to college at a small school, actually the one Carter mentioned above.  At that time, Computer Science was classified under liberal arts.  I had to take all the English and history classes that the philosophy kids did.  Turns out that was a good thing: I can communicate decently with the non-technical people (i.e., management and customers) I have to deal with on a daily basis.  Also, if one wants to move up the career food chain, communication is a key factor.

      • #3149154

        Geeks and Communication Skills

        by robbi_ia ·

        In reply to Geeks and Communication Skills

        Very well written!

        I have more suggestions for learning to communicate.  Join a professional organization, and then get involved in the organization.  Volunteer to speak at meetings, or volunteer at the job to teach staff trainings.  And as J.Ja has already suggested, read, read, read!

      • #3149071

        Geeks and Communication Skills

        by james b. ·

        In reply to Geeks and Communication Skills

        I completely agree. I was a bit lazier and only got one BA, but I think it was much more helpful than any BS I could have gotten. I am currently the only IT guy at a satellite office for my company. I have to manage desktop users all day long, and occasionally network and phone system issues. I got my job with a strong tech background but absolutely no direct experience. I got it because all of the other applicants had just graduated from one of those schools that guarantee an MCSA, and they had no social skills. I am still a complete geek on the inside, but at work I keep it simple and easy to grasp for my users. I learned the specific tech skills I needed on the job. I think that is what the hiring managers here understood: you can teach someone tech skills pretty easily if they have the aptitude to learn them. If you had the aptitude for social grace, you would already have learned it. They realize they can’t teach that to you.

      • #3150361

        Geeks and Communication Skills

        by oneamazingwriter ·

        In reply to Geeks and Communication Skills

        Fantastic post. I felt bad that I hadn’t read it until now, that is, until I read the many comments. Now I’m glad I was late to arrive. The original post and comments will bring me back to read this again. Great stuff, J.Ja

      • #3157245

        Geeks and Communication Skills

        by santhanag ·

        In reply to Geeks and Communication Skills

        Check out this introduction article on technical communication:
        http://www.articleworld.org/Technical_communication
        Contents:
        1. Professions
        2. Formats
        3. Tools
        4. Resources

    • #3104307

      Why the Oracle Application Stack should not happen

      by justin james ·

      In reply to Critical Thinking

      Apparently, Oracle is seriously considering putting together its own full application stack. There seems to be a lot of debate about this, both positive and negative. Personally, I think I will have to side with the naysayers on this one.

      First of all, the last thing the world needs is more confusion amongst the Linux distributions. I understand that Oracle is looking to purchase an existing distribution, not start its own. But do open source projects really fare well after being purchased by a large corporation? SuSE seems to be a bit unhealthy after being bought by Novell. Granted, Novell has “Fido’s magic touch,” where everything they touch turns to dog doo. WordPerfect. Corel. Their own products. And so on. Much of the open source community seems to be “personality driven.” The departure of one or two key contributors can cripple a project, not just because they were cornerstones of development, but because when they leave, so do many other people. Forking is another common problem after an open source project gets purchased. This is actually the main reason why I use BSD instead of Linux; I feel that there is too much churn in the Linux community. If I were a Linux user, my biggest fear would be “what if Linus Torvalds gets hit by a bus?” I can imagine the power scramble if that were to happen, and it scares me. So the idea of Oracle purchasing or starting a Linux distribution worries me, particularly if they were to purchase one.

      Another issue with the idea of Oracle having a stack is that I am not sure if Oracle is a company I would want to have to deal with. Their website is a frightening place indeed; finding useful information in a usable format can frequently take hours. Just try to find the SELECT syntax on their website, I dare you. I do not think that Oracle understands how to interface with customers at even that basic a level, which does not make me want my OS coming from them.

      Another arena where Oracle falls woefully short already is in their management tools. Everything about installing, configuring, and maintaining Oracle’s database products is pure misery. Everything is done wrong, as far as I am concerned. They have visual tools such as Oracle Enterprise Manager that simply do not work right. For example, if you put the cursor into a field and start typing, the first character is usually dropped. The interfaces on their visual tools stink as well. Oracle Enterprise Manager has “features” on the menu that, when selected, tell you to use “Oracle Enterprise Manager Console.” Isn’t that the tool I am currently in? Even the GUI version of SQL Plus is a dog; it has a maximum line length, forcing me to wrap lines by hand, and does not even do me the courtesy of putting up a vertical bar showing me where to wrap them. Oracle products do not install correctly; the DLLs needed for ASP and ASP.Net connectivity have the file permissions set incorrectly, a problem that has persisted through a number of major revisions. Even trying to get clients talking to Oracle is a pain. Oracle needs multiple hundreds of megabytes worth of garbage to have a desktop talking to an Oracle database, whereas MySQL and Microsoft SQL Server just need a tiny ODBC (or JDBC, or whatever the right method is for your purpose) driver. Overall, the last thing I want is for my OS to be delivered by a company with this mentality.

      Oracle is also very, very bad about delivering patches, and they do not seem to have a handle on security. In terms of timely patch releases, they make Microsoft look like perfection. Oracle has a bad habit of outright ignoring critical security flaws for months or years at a time, even after being told about them. Their patch cycle seems to be quarterly; meanwhile, Microsoft gets criticized for monthly patches. Oracle also does not seem to understand automatic patching. Again, these are traits I simply do not want in the source of my OS.

      As it stands now, a good portion of Oracle’s stack is not even their own software. Oracle Application Server appears to be a hacked up version of Apache. They do not have any languages of their own (or even re-packaged/re-branded) outside of PL/SQL, which you will not be writing applications in. Right now, the only part of the LAMP stack that Oracle can play a role in is the M. You can have a LAOP stack if you want. Oracle is considering grabbing the L. They are still missing the A and P. Indeed, when one looks at what Oracle does well (a high performance, scalable database server), I would much prefer that Oracle go after the A and not the L! Web servers are more closely related to databases, in terms of how they get written. You simply do not try to start from the middle of the stack and work your way out like Oracle is considering. You need to start from one end or the other and go across. Red Hat had the L; they bought JBoss to get the A. They still need M and P. Oracle is trying to start at M and then go for the L. This just does not work. This is not a stack, this is patchwork insanity.

      The idea behind a stack is that you have a group of tools that sit upon each other and play nicely with each other. Patchwork stacks just don’t cut it. Components within a stack are often not best of breed by themselves, but the combination works great. Look at LAMP: the P is not so great (Perl is poor for web development, PHP is wretched in general, Python just isn’t very popular), and MySQL is not quite top-tier yet (although it is still great). But LAMP works great. Oracle does not know how to make their software play nice with other software. The idea of them trying to build a stack is laughable, at best.

      J.Ja

      • #3285313

        Why the Oracle Application Stack should not happen

        by georgeou ·

        In reply to Why the Oracle Application Stack should not happen

        “Oracle is also very, very bad about delivering patches, and they do not seem to have a handle on security. In terms of timely patch releases, they make Microsoft look like perfection. Oracle has a bad habit of outright ignoring critical security flaws for months or years at a time, even after being told about them. Their patch cycle seems to be quarterly; meanwhile, Microsoft gets criticized for monthly patches.”

        Where’s Apotheon on this :)?

      • #3150170

        Why the Oracle Application Stack should not happen

        by ms_lover_hater ·

        In reply to Why the Oracle Application Stack should not happen

        J.Lo,

        Oh, Jeez, sorry that another company wants to get you out of that comfy Microsoft world… I don’t have time to respond to all of your lies (do you work for M$???), but this one truly shows your ignorance:

        “Their website is a frightening place indeed; finding useful information in a usable format can frequently take hours.”

        What, you still on 56K? You don’t use web browsers? 5 minutes tops to find anything there, sorry that Oracle wants to give you best of breed, whereas Microsoft has been inbred for quite a while now, and it shows. Instead of blathering on for paragraphs about how bad Oracle is, spend some time learning how to navigate a web site. Say hi to Billy Boy Gates the next time you see him.

      • #3150050

        Why the Oracle Application Stack should not happen

        by ljhunt ·

        In reply to Why the Oracle Application Stack should not happen

        Are you kidding — can’t you smell the burn MS will need to go through to become ‘COMPETITIVE’?

        Oracle has the power, money, and know-how to bring Linux to the marketplace, in direct competition with any MS product, and all users/admins will benefit. Either MS will adapt and overcome or go extinct (Museum of Antiquities — MS Wing). That simple.

        As far as tech, ‘any’ sysadmin willing to properly configure a Linux server will find it is superior to any Windows server performing the same task with the same hardware, up and running from cold metal in half the time, and able to dodge the once-a-week MS security patch for some items over 2 years old (yes, years) that were only recently addressed.

        As far as stack comparisons to LAMP: realize that MS has a fear of being run out of the market on a raised floor, and if Oracle doesn’t address it soon their business will be threatened. MySQL has grown by leaps and bounds and by version 6 may be more than just a serious threat to Oracle’s and MS’s business. One year of integration with Apache and PHP (or Perl) on a tuned, stable version of Linux is Oracle’s attempt to stave off premature extinction. Open source is here, and with several other advances in PC hardware and cost reduction, the UNIX server farms of the past will be reborn in Linux (PC based) in Beowulf configurations. Remember, no matter how good we are, the bean counters run the business, and cheaper rules the business, with faster taking a close second.

      • #3162384

        Why the Oracle Application Stack should not happen

        by havacigar ·

        In reply to Why the Oracle Application Stack should not happen

        I have just two things to say, OCS and OCFO.

        They should have named them crap and worse crap.

        I rest my case.

    • #3285244

      eWeek’s Interview with James Gosling

      by justin james ·

      In reply to Critical Thinking

      eWeek recently published a very interesting interview with James Gosling, the father of Java. Mr. Gosling’s candor and honesty are to be admired, and it seems that he and I are on the same wavelength regarding many topics:

      AJAX

      “Creating [AJAX components] is extremely hard. Not because programming JavaScript is hard, but because all these flavors of JavaScript are ever so slightly different. You have to build your components so that they’re adaptable to all the different browsers that you care about. And you have to figure out how to test them. None of the browsers has decent debugging hooks.”

      I have been saying this for some time as well. The discrepancies between various browsers’ implementations of JavaScript make it difficult at best to write a fully cross-platform AJAX application of any complexity. Furthermore, the browsers themselves simply do not offer an environment to debug in; developers are stuck with write() and alert() to show the values of variables throughout the code execution. It is like working with BASIC code circa 1986.
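      To make the complaint concrete, here is a hypothetical sketch (not code from the interview) of the kind of logging shim developers cobbled together in that era: use a real console where one exists, and otherwise degrade to document.write() or alert(), the only “debuggers” many browsers offered.

      ```javascript
      // Hypothetical fallback logger for browsers without debugging hooks.
      function debugLog(msg) {
        if (typeof console !== "undefined" && console.log) {
          console.log(msg);                                 // Firebug / modern consoles
        } else if (typeof document !== "undefined") {
          document.write("<pre>DEBUG: " + msg + "</pre>");  // crude: rewrites the page
        } else {
          alert(msg);                                       // last resort: one modal per value
        }
      }

      debugLog("x = 42");
      ```

      Even with a shim like this, each browser still had to be tested by hand, which is exactly the cross-platform QA problem Gosling describes next.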

      “There’s no ability to do cross-platform QA; you’ve just got to do them one by one. Right now it looks pretty hopeless to make AJAX development easier.”

      This is so sad, yet so true. I simply fail to see how anyone can realistically push AJAX as a platform for applications with the same level of functionality as desktop applications under these conditions.

      Regarding Sun’s business mistakes

      “There are so many to choose from. And sometimes it’s hard to say what’s a blunder and what’s just the case of the world being weird.”

      All I can say to this is “WOW!” Can you imagine Bill Gates or Steve Ballmer or Michael Dell or Steve Jobs or Larry Ellison saying something like this? Neither can I. Granted, Gosling is an engineer, not a business person. But it is this type of attitude that has hampered Sun so badly over the years. The fact is, with business sense like this, Sun’s very existence is testimony to the quality of its products. Sun has indeed made more blunders than just about any major tech company out there, except for maybe Novell and Borland. Like Novell and Borland used to be, Sun is run by engineers. Their products are amazingly good most of the time, but they often simply have no good fit into the realities of the business world, and the rest of the company just does not know how to get paid for those products. Solaris is regarded by many, if not most, knowledgeable people as the best UNIX out there, and certainly better than Linux. Yet Sun cannot manage to give it away! It is because Sun waited way too long to try to go open source with it. First they attempted to embrace Linux, then they open sourced Solaris. Sun changes its motto every year, it seems, which just shows how confused and directionless they are.

      Overall, I like Sun. I think Solaris is a good UNIX, from what I know and have seen of it. Java, while being a dog in reality, is an innovative idea and did a lot to break Web development out of the stagnation of CGI. If the VMs were not so wretched, I would see it as a great competitor to .Net on the desktop. It is just a real shame that no one at Sun understands business.

      J.Ja

    • #3148976

      IT Projects Up Against “Penny Wise, Pound Foolish”?

      by justin james ·

      In reply to Critical Thinking

      In response to a previous article (Geeks and Communication Skills), commenter Tony Hopkinson wrote:

      “Now maybe I’m not business aware enough to realise why short term gains are preferable to long term success, may be one of these business types should explain it, then we can all row the boat in the same direction.”

      DirtClod also points out the connection between IT projects and research grants:

      “Scientists who communicate get GRANTS.  IT people who want the boss to GRANT funding for projects might find the time to learn our language.”

      The two ideas are not unrelated. Tony’s complaint is a common frustration in IT. Even the best proposal in the world that shows great ROI numbers, productivity gains, reduced downtime, and all of the other things that a good IT proposal should show can be turned down. One would think that “the suits” of all people would jump on a chance to increase profits after overcoming a substantial upfront cost. After all, this is why companies build factories, invest in training, outsource workers, and so forth.

      Unfortunately, increased profit is not the actual goal of many companies, particularly publicly traded companies. The actual goal of these companies is “to increase shareholder value.” “Shareholder value” directly translates to “stock price.” Look at the compensation plans for C-level executives (CEO, COO, CIO, etc.). Their bonuses are tied more directly to stock price than to profit, market share, revenue growth, or any other direct financial metric. In addition, a significant portion of C-level compensation is in the form of stock options. Additionally, the management of a publicly traded company has what is called “a fiduciary responsibility to the shareholders.” That means that they are held accountable to the shareholders, not to the employees or customers. To put it in a more obvious way, “it is in management’s best interests to increase the stock price of a company at the expense of any other metric.”

      The end result is that a company acts in whatever way will be best for the stock price, which is not always best for the company. To make the situation worse, a C-level executive typically does not stay with a company for more than a few years. They have little incentive to worry about the long term health (or even the long term stock price) of a company. Shareholders now rarely hang onto a stock for the long term; they are looking to “buy low and sell high,” which is hardly a recipe for having shareholders that care about the viability of a company’s business model or the sense of their business practices.

      It is true that things like profit and loss, revenue growth, market share, and so on play a role in the stock price of a company. This information gets released to the public once a financial quarter. IT projects, sadly, tend to be capital expensive. The inefficiencies that IT projects resolve are not line items on any spreadsheet, though. Let me give an example. At one company I worked for, the computer that I was assigned was a Pentium 1 with 128 MB of RAM, running Windows 95. This computer was expected to be running the following applications throughout the shift: Microsoft Excel, Microsoft Outlook, Microsoft Word, Internet Explorer (two windows minimum, one of which contained a Java applet), a custom built Java application (using a non-standard Java VM, so I would have two Java VMs open at any given time), McAfee Anti-Virus, and a telnet client that was extremely heavy (it had all types of scripting, macros, etc. built in). Microsoft Word would be open about half of the time, as well as a few other applications. By the way, this was in the year 2003. To the best of my knowledge, that PC is still in use. This setup was so unstable, I had to start the applications in a particular order or face system lockup. Everyone had to come onto the shift 15 minutes early as well. So the company was paying 15 minutes of overtime per day, which comes to 65 hours of overtime a year. Even at our pittance of a wage, this inefficiency would have paid for replacement PCs within a year.
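      The arithmetic above can be checked on the back of an envelope. The 15 minutes per day and the 65 hours per year come from the text; the hourly wage, the overtime multiplier, and the replacement PC price are illustrative assumptions of mine, not figures from the story.

      ```javascript
      // Back-of-the-envelope check of the overtime claim above.
      const minutesEarlyPerDay = 15;
      const workDaysPerYear = 260;                   // ~52 weeks * 5 days
      const overtimeHours = (minutesEarlyPerDay * workDaysPerYear) / 60;  // 65 hours

      const hourlyWage = 9;                          // assumed "pittance of a wage"
      const overtimeMultiplier = 1.5;                // assumed time-and-a-half
      const annualOvertimeCost = overtimeHours * hourlyWage * overtimeMultiplier;

      const replacementPcCost = 600;                 // assumed commodity desktop, circa 2003
      console.log(overtimeHours);                    // 65
      console.log(annualOvertimeCost > replacementPcCost);  // true: the PC pays for itself within a year
      ```

      Under even these modest assumptions the hidden overtime exceeds the cost of a new PC in the first year, which is exactly the point: the waste is real, it just never appears as a line item.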

      Why were those PCs never replaced? For the same reason that many IT projects never occur: the short term budgetary impact would have affected the stock price more than the efficiency gains. Inefficiencies simply do not show up in reporting figures. There is no line item in a quarterly report that says, “excessive staffing due to a process not being computerized,” or “additional payroll because computers are slow,” or “missed SLAs due to poorly trained users.” In comparison, “three month project to automate a process,” “replacement and upgrade of existing PCs,” and “training classes for users,” are all line items that Wall Street (or London, or Tokyo, or wherever) sees. It is sad, but it is true. This is why a great project can get shot down. The initial upfront cost is just too high and the ROI just does not come fast enough. I am not going to pretend to know what numbers “the suits” have in mind when evaluating an IT project, but off the cuff it seems to me that you need to deliver a 25% savings within a quarter of project completion, and 150% within one year of project completion on a one quarter long project to get approval.
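      The off-the-cuff thresholds above can be written out as a rule of thumb. To be clear, these numbers are the guess stated in the text, not a formula any finance department actually publishes; the function name and example figures are mine.

      ```javascript
      // Sketch of the off-the-cuff approval heuristic described above:
      // ~25% of project cost recovered in the first quarter after completion,
      // and ~150% recovered within the first year.
      function likelyToGetApproval(projectCost, savingsPerQuarter) {
        const firstQuarterReturn = savingsPerQuarter / projectCost;
        const firstYearReturn = (savingsPerQuarter * 4) / projectCost;
        return firstQuarterReturn >= 0.25 && firstYearReturn >= 1.5;
      }

      // A $50,000 project saving $20,000 per quarter clears both bars (40% and 160%).
      console.log(likelyToGetApproval(50000, 20000));  // true
      ```

      Most worthwhile infrastructure projects fail a bar this high, which is why so many sound proposals die in the budget meeting.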

      This mentality hits IT everywhere you look. Projects get rolled out as “betas” that never seem to get finished because of it. Programs get written in “quick and dirty” languages like .Net that are just not appropriate for certain types of projects. In reality, if you want to write a highly scalable Web application, writing it in C++ in a CGI environment is your best bet. But no one will ever get approval for that project, because good C++ coders are expensive, and the project will take forever because you would need to re-write much of what JSP, PHP, and ASP already handle.

      The only way to get the funding you need for your project is to do exactly what DirtClod says to do: learn management’s language. If you cannot show in words that management understands, in a format they understand (“Gentlemen, start your copies of PowerPoint!”), that your project will more than pay for itself quickly enough to be palatable to the stockholders, it simply will not fly.

      Do not think that this does not affect private companies, either. It does. Private companies with a small amount of ownership (a few partners) have a tendency for the owners to treat every dollar in the company as their own, which is understandable. The $10,000 you want to spend on servers is viewed by the boss as half a year’s tuition for his child’s college. Many, if not most privately held businesses think ahead to selling out or going public. A history of cost management helps them get the most return on their initial investment of time and capital. No matter how you cut it, IT projects that cannot be shown to have a big, quick ROI kicker are just not going to be approved, regardless of how well you write the proposal. I will discuss these kinds of “low hanging fruit” projects soon, stay tuned.

      J.Ja

      • #3150446

        IT Projects Up Against

        by wayne m. ·

        In reply to IT Projects Up Against “Penny Wise, Pound Foolish”?

        Unfortunately, the focus on short-term thinking is neither new nor unique to IT.  Dr. W. Edwards Deming (“Out of the Crisis”, “The New Economics”) was writing about those concerns 25 years ago.  In fact, some of the phrasing in the blog above sounds very similar to Dr. Deming.  I would highly recommend reading his books to anyone (I believe “Out of the Crisis” is still in print).

        One warning.  Dr. Deming was a very insightful man, but not a very captivating writer.  Nonetheless, he is one of the few writers that I have reread multiple times and found value in each rereading.


      • #3150403

        IT Projects Up Against

        by justin james ·

        In reply to IT Projects Up Against “Penny Wise, Pound Foolish”?

        Wayne –

        Thanks for the heads up on those books. I will try to get a hold of them; I love reading about economics and business (thinking about getting a Masters in it). I have actually heard of The New Economics but not the other one. They sound more interesting than my current reading list anyways. 🙂

        J.Ja

      • #3150070

        IT Projects Up Against

        by tony hopkinson ·

        In reply to IT Projects Up Against “Penny Wise, Pound Foolish”?

        I’m a tech head, a programmer, a rational mind. I’m well aware that there are reasons for the decision, I’m even clever enough to deduce a few possibilities despite my appalling lack of business skills.

        All I want is a clue, along with the NO !

        Not a lot to ask, otherwise few possibilities exist.

        I’m crap at communicating and ‘you’ don’t want me to improve.

        I’m good at communicating but ‘you’ don’t agree with my numbers.

        I’m good at communicating and ‘you’ are bloody horrible at it.

        I’m going to be sacked next week, so there is no point in starting.

        The firm is going bust next week, so there’s no point in starting.

        ‘You’ are going to pass the idea to your nephew, the IT guru who starts next week.

        ‘You’ were unable to repay the unofficial loan from company accounts.

        ‘You’ spent the entire budget on gee whizzery.

        ‘You’ are leaving next week, so you aren’t signing off on that big a cost in case your boss takes it out of your golden handshake.

        A manager who doesn’t communicate is less use than a tech who can’t or won’t. If the perception is that you don’t communicate, then you don’t communicate.


        Low hanging fruit, my favourite !

        Eat them all real quick, get too fat to climb for more, exhaust yourself building a ladder, call in a mate for a share of the viands, watch him chop the bloody tree down.

        Sore point J, been there done that, have the compost.


    • #3150274

      Podcast Only? No Thanks!

      by justin james ·

      In reply to Critical Thinking

      [5/2/2006] Edited to clarify for those who may have missed the original discussion. I am not against podcasts. I am against information being distributed as a podcast but not available in any other format. There is nothing inherently wrong with podcasts as long as that information is available in a text format in addition to the podcast, or if the podcast provides a unique value proposition.

      This blog is written in response to the comment thread for Don’t be lazy: Communicate with your end users.

      I am with Palmetto on this one. I refuse to use podcasts (along with RSS and many other “Web 2.0” hocus pocus). I can read approximately 10 times faster than anyone (save the guy who did the Micro Machines ads) can talk. I can read the script for a 90 minute movie in about 10 minutes, typically. I can read a 1,000 page book in about 10 hours. Listening is the least efficient form of knowledge transfer for me. Reading is the most efficient. Like many others, I can read an article at my desk, even if on the phone or waiting for another process to finish. If I am interrupted, I just remember where I was, re-read the preceding paragraph or sentence, and continue, as opposed to having to shuttle back and forth to jog my memory.

      Furthermore, this is the Internet, not the radio. Text is the lowest common denominator. Satisfy the LCD first, and then worry about the high-end users or special-needs users. This is Reason #4,562 why I am against AJAX; it fails on the LCD test. If your information cannot be used within Lynx, throw it out, it is worthless. Even someone who uses a screen reader (such as the blind or vision impaired) is equally well served by a text document as they are by a podcast. A deaf person cannot use a podcast at all, whereas they can read text. Do you mean to not have your message be accessed by the deaf? If I owned a store that a handicapped person could not enter for whatever reason, I would be in deep trouble as per Federal law. Your online business should operate the same way. No excuses. Call me silly, call me crazy, but I do not turn away business because I wanted a fancy widget that deprived 5% of my potential customers of the chance to give me their money.

      The only time a podcast (or any other audio-only presentation of information) adds any value is when the voice itself provides information that cannot be easily or adequately transcribed (vocal inflection, information about music with sample clips, sarcastic remarks, etc.). I do not have an iPod or similar device, and if I did I might not have a way of hooking it up to my car stereo, where I listen to most of my audio stuff. What do you propose I do? Burn the podcast to CD to listen in my car? I will not do that, and neither will anyone else most likely.

      In your article itself, you state:

      “In this 5-minute podcast, I explain why there is no substitute for good communication and offer a little advice for using three common communication methods: e-mail, voice mail, and face-to-face contacts.”

      Offering a podcast without a transcript (or at least a summary of the salient points) is pretty lazy. I have a problem with my hands, wrists, and eyes. Even at my age, typing, especially at the end of the day, is extremely painful to me physically. Yet I do it anyways, because that is the best way to communicate on the Web, and I communicate via the Web. I would love to simply be a lecturer or a radio or TV personality, but I am not. I put up with the pain in order to provide the best possible service to my readers.

      Remember, we (the audience) are the customer. If we are unable to use or consume your product, no matter how good it may be, you will not be able to sell it. Period. This is why I spend so many bytes in my blogs discussing basic usability. Poor user interfaces equal low market share, regardless of how good the product is (look at Linux in the desktop market). Of course, great user interfaces do not result in great market share (Macs in the desktop market), but a poor interface will always lose to a better interface, unless the product is so good that its value proposition after the hit from usability still makes it substantially more valuable than the more easily used product. It is that simple.

      J.Ja

      • #3150127

        Podcast Only? No Thanks!

        by Bill Detwiler ·

        In reply to Podcast Only? No Thanks!

        TechRepublic is about content choice (not exclusion)

        During my almost 6 years with TechRepublic I’ve learned that our members (our customers) want content delivered in multiple formats. Some members prefer to read articles online, others want to download our PDF documents, and many prefer our newsletters. Taking a more active role, many members create content by participating in our Discussion forums and Technical Q&A. TechRepublic has always and continues to provide choices.

        We offer online articles on Windows hacks, policy and form downloads, interactive discussions on IT management and all matters of geekology. When possible and appropriate, we even offer the same content in multiple formats–check out the download Security through visibility: Revealing the secrets of open source security and the article version. My podcast 10+ tools every support tech should have in their repair kit is also available as a PDF download. We know our members pick and choose the content formats they find most helpful–we encourage that choice.

        As the Internet evolves and content formats expand, TechRepublic will strive to offer content in those new formats. Text is still the dominant Internet format, but high-speed connections are increasing the prevalence and user desire for new formats, such as RSS feeds, podcasts, photo galleries, and video. In a July 2005 News.com article, the Diffusion Group predicted that the U.S. podcast audience will climb to 56 million by 2010 and that three-quarters of all people who own portable digital music players will listen to podcasts. As an online media company, CNET and TechRepublic cannot and should not ignore this trend. Does this mean we will abandon our text-based content? Of course not. We will continue to offer content formats that our members tell us they want.

        As someone who wants to produce great content that our members find helpful, I welcome content suggestions and constructive criticism. I may not always agree, but I will always listen and act when appropriate. I take offense, however, to J.Ja’s statement that by offering an additional content format I am making TechRepublic more difficult to use for our members with disabilities. Now that we offer streaming and downloadable podcasts, those with impaired vision can choose to listen to our content or read it with a screen reader. Those with hearing impairments can still access our online and downloadable text-based content. Again, we often offer content in multiple formats and allow the customer to choose.

      • #3148182

        Podcast Only? No Thanks!

        by mwaser ·

        In reply to Podcast Only? No Thanks!

        Transcripts of podcasts would make TechRepublic much more valuable to me.  I don’t know why you don’t do it ALL the time.

      • #3148177

        Podcast Only? No Thanks!

        by duckboxxer ·

        In reply to Podcast Only? No Thanks!

        You sound as though you are completely against podcasts altogether.  I truly hope not.  I have actually very recently gotten hooked on these.  I download them (via NewsGator and FeedStation) and pop them on my PDA to listen to.  I’m a news junkie and generally nerdy; hence I’m usually listening to some news commentary or other academic podcast rather than some random joe’s blog.  What I also do is listen to things that will help me in my job, like development or management podcasts from key industry players (PMI, eWeek, Helms & Peters, etc.).  Being a multitasker, this allows me to help out my skill set and still be working.

        Having been at an advertising agency, I have to agree with Bill.  A company has to provide what their customers (audience) want.  And if their competitor provides this service, then you will lose those customers.  True, a company has to weigh the potential ROI for adding a new service like podcasting.  Yes, for ADA compliance you ought to provide various forms of information, but honestly this provides more choices for customers.  And currently sites like TechRepublic are already doing that, from articles to white papers to newsletters.  They realize that different customers need information in different formats, from printable to mobile (and I classify a podcast as mobile).  Just because you don’t like information in this format doesn’t mean others don’t.  🙂

      • #3163495

        Podcast Only? No Thanks!

        by justin james ·

        In reply to Podcast Only? No Thanks!

        I am not against podcasts. I am against information being distributed as a podcast but not available in any other format. There is nothing inherently wrong with podcasts as long as that information is available in a text format in addition to the podcast, or if the podcast provides a unique value proposition. I did not make this as clear as I should have, as I was originally writing this as a comment to a thread, and I apologize for any confusion or miscommunication.

        J.Ja

      • #3161939

        Podcast Only? No Thanks!

        by andy goss ·

        In reply to Podcast Only? No Thanks!

        I have to agree with Justin James. I am not going to listen to a five minute podcast just to find out if I want to listen to it. I can eyeball the text of it in seconds to discover if I need to read it, I can backtrack and cast forward at will, I can Google on bits I want background on, I can cut and paste information I may need again. I can ponder on a phrase that is in front of me for as long as I like. Can I do any of that with a podcast? With text I do not have to cope with regional accents, poor diction, or the irritating verbal mannerisms that few people commit to writing but allow into their speech.

        I am sure the podcast has a place in the world, I just can’t think of one offhand, unless as a form of light entertainment. Like “push” technology, the portal concept, internet fridges, and talking doors (“Glad to be of service!”), the podcast is an overrated gimmick, unless someone finds a real purpose for it. Then I may use it. Until then, like the pointless video clips on news sites, I shall ignore them.

        Andy Goss

      • #3161581

        Podcast Only? No Thanks!

        by aaron a baker ·

        In reply to Podcast Only? No Thanks!

        Same Here;

        I don’t use podcasts, RSS feeds (now there’s something totally useless), or all the “other” forms of communication that we seem to be killing ourselves trying to achieve. I for one like to “read” a story; if I find it interesting, my preferred method for download is PDF.

        I can understand that a podcast may serve a purpose, but I question whether we are really just playing with new toys here. It seems that the minute something new comes out, i.e. podcasts, you’re left with the impression that all things before it are now ancient history. I’m sorry, but I beg to differ.

        I much prefer reading the article; I find it far more relaxing than watching someone trying to “sell me” an idea on a podcast. Naturally, they get all excited and beside themselves, and for the most part get caught up in the heat of the moment. All this does is annoy me. It’s nowhere near as relaxing as reading the articles and downloading what you like. So by all means, continue with the podcasts and the “blind of vision” routines, etc.; just make sure that the equivalent is available in the normal way. As for RSS feeds, well, I won’t go there. But I will say this: one of my favorite parts is the one where, in the RSS feed menu, there is an area that asks if you want to view this “in your browser.” In your browser, no less, right back where I started from. I laughed out loud and shut her down. Then wrote a blog about it. Check it out. 😉

        Regards

        Aaron

    • #3148309

      Open Source Does Not Make Better Code. Better Programmers Make Better Code.

      by justin james ·

      In reply to Critical Thinking

      Every now and then, I will see a “dispelling the myths of open source” type of article, blog, discussion, or whatever come my way, and it always seems to come around to the “more eyeballs means fewer defects” idea. For whatever reason, many open source proponents seem to believe that there is this rear guard of closed source folks spreading FUD about open source (even Microsoft has toned down its rhetoric lately). I think that less than 10% of the knowledgeable people out there actually claim that closed source is inherently more secure than open source. It definitely seems that most people (at least among those who voice an opinion) believe that open source software is inherently more secure than closed source software.

      In reality, what matters much more than “open” or “closed” source, is who is writing and reviewing the code, why they are doing it, and how long they are doing it. If I compare OSS project “Project A” to closed source project “Project B”, and “Project A” is being written by five 15 year olds who just wrote “Hello World” last week for the first time, and “Project B” is being written by twenty crusty old timers, and “Project A” has a three person “community” and “Project B” has zero community inspecting the source, I still guarantee that “Project B” will blow “Project A” out of the water. Open source, closed source, it really does not matter.

      Another thing that I find fallacious about this argument is the continual assumption that “open source” means “free.” The two are not the same idea, not by a long shot. Nor are they mutually exclusive. Historically speaking, UNIX is “open source,” but hardly “free.” Indeed, the original 386BSD project grew out of the desire for a free UNIX. One of the reasons why so much System V source code has ended up in various UNIXs over the years is precisely because System V was open source, and SCO’s lawsuits exist because System V was not free! On the other hand, there is plenty of free software that is not open source. Just go to any shareware repository and find a piece of freeware that is pre-compiled and does not include source code.

      What really matters is who is writing and reviewing the code, and money tends to attract the continued writing and review of code much better than whatever it is that actually motivates FOSS coders. Sure, some FOSS projects (Apache, Linux, BSD, MySQL, etc.) attract top talent, but just taking a look at SourceForge shows that the vast majority of open source projects go nowhere. To decide that FOSS is the best possible method of development and quality control based on Windows vs. Linux or Oracle vs. MySQL or IIS vs. Apache or PHP vs. Java, or whatever, is silly. That is like saying that “a Dodge will always be faster than a Chevy” based upon a comparison between the Viper and the Corvette.

      One of the reasons why these projects are able to attract such a large pool of developers and testers has less to do with the fact that they are “open source” and more to do with the fact that they are free. Only a minute percentage of Linux users ever touch their source code, let alone look at it (or even care about it). They are attracted by its phenomenal price/performance ratio. The same can be said for any FOSS project. Thanks to the widespread usage of various package managers, it is fairly uncommon for most mainstream Linux users to even compile from source, let alone modify compiler flags or make changes to source code. If these packages were closed source but still free, most of their users would still use them and be testing them.

      The vast majority of lines of code are written under the radar of most people and do not get any attention. Try comparing a small sample of each type of software. Take a few dozen random items from SourceForge that are in at least a “release quality” state and compare them to a few dozen freeware applications. Then evaluate the difference between closed source and open source. I really cannot tell you what the results will be, but I do know that many of the open source pieces of code that I have used, outside of the “big stuff” (various UNIXs, Apache, MySQL, PostgreSQL), are not so great. For a project that lacks glamour, it is hard to attract someone to spend a large amount of time seriously working on it. It is that simple.

      Mind you, I am not against open source or free software whatsoever! I use it all of the time in my day-to-day life, especially FreeBSD, Apache, MySQL, and Perl. But I also use a lot of paid and/or closed source software as well, like Windows, IIS, Oracle, Microsoft Office, and so on. Some of my favorite pieces of software are simple shareware applications: NoteTab Pro and ThumbsPlus immediately come to mind. It is very rare that I have ever wanted or needed to “look under the hood” of a piece of software. Tomcat/Jakarta required me to do so to find out why it was not behaving as documented. Indeed, most of the time that I have had to look at source code, it was to compensate for poor documentation, not to actually make any change or satisfy a curious itch. I was grateful to be able to inspect the source code, but I would have preferred better documentation instead.

      Many of the arguments that I hear in favor of open source compare Windows to Linux, or Internet Explorer to Firefox. Windows vs. Linux just shows that the Linux coders are better, smarter, and better organized than Microsoft’s Windows coders. Microsoft has had something like ten years to get Internet Explorer right, and they still have not managed to get it nailed down. This is not news. The fact is, closed source shops consistently crank out many products better than Microsoft’s too. One can just as easily compare OS/2 Warp or BeOS or Mac OS X to the version of Windows from the relevant timeframe, and Windows still falls short on many (if not most) benchmarks like security, stability, usability, and so on. I note that it is rare for someone to compare MySQL or PostgreSQL to Oracle or Microsoft SQL Server in an “open source versus closed source” debate. Why do we never hear “.Net versus Mono”? Even comparing Apache to IIS is difficult, because IIS is a significantly more ambitious piece of software than Apache.

      Open source, in and of itself, does not produce better code. Better coders and better testing produce better code. It is that simple. When a closed source shop has the better coders and better testing, it writes the better software. When an open source project has better coders and better testing, it writes better software. To think that just because a piece of code can be modified or inspected by anyone and everyone means that the best coders and testers will be modifying and testing that code is just not correct.

      J.Ja

      • #3148234

        Open Source Does Not Make Better Code. Better Programmers Make Better Code.

        by georgeou ·

        In reply to Open Source Does Not Make Better Code. Better Programmers Make Better Code.

        I think Vernon Schryver, the creator of DCC (Distributed Checksum Clearinghouse), said it best in this post.  It permanently dispels the myth of “many eyes.”

        In addition to this post, there was an open source “many eyes” project that called for volunteers to do the dirty work of auditing open source code.  It didn’t take long to die, because there were almost no volunteers.  The truth of the matter is, no one will do the dirty work of code auditing unless there’s a salary involved.  Most people (the .01% who can even write any kind of source code) just want to do the “fun” stuff.  Code auditing is time consuming and boring.

      • #3148184

        Open Source Does Not Make Better Code. Better Programmers Make Better Code.

        by tony hopkinson ·

        In reply to Open Source Does Not Make Better Code. Better Programmers Make Better Code.

        I got to the end of your post, and my first response was “and your point is?”

        Well, I nearly agree with you. Certainly I’d expect three experienced developers to come up with a better product than five fifteen year olds. In fact, I’d expect them to come up with a better application than five top-class honours graduates.

        Obviously if no one feels there is a use for an open source project there isn’t going to be a decent enough community to take advantage of the benefits of peer review.

        A successful open source project (nothing to do with free as in money) will always outperform a successful closed one in terms of security, stability, and code quality, for a very simple reason: it has to be written so that many developers of varying standards of skill can understand it enough to contribute. You can create cases where closed outperforms open, weight the odds in favour of whatever you like, but the biggest indicator of code quality, by whatever measure, is readability.

        The latter is not a requirement in closed source shops; unfortunately, it’s not a requirement in academia either. Equally, those who do contribute to open source projects have a vested interest in the code being the best it can be, because they want to use it, not the most profitable it can be because they want to sell it. There is a difference, and it’s a massive one.

        I’m not allowed to do the best work I can, I’m not allowed to do things properly, I’m not allowed to choose a technical positive over a cost negative. That’s business, and it makes perfect sense to me, so why are you comparing apples and oranges in the first place?


      • #3163390

        Open Source Does Not Make Better Code. Better Programmers Make Better Code.

        by mindilator9 ·

        In reply to Open Source Does Not Make Better Code. Better Programmers Make Better Code.

        I see what you’re trying to say, J, but it seems to me that having competent developers is requisite to the issue. You can switch the players on both sides and get inverted results, meaning you’ve proven nothing. It matters not even what field we’re talking about: take any industry, grab 5 inexperienced kids vs 2 masters of their craft, and you get what you would expect. Let’s make the question relevant by starting off with the assumption that we’re dealing with expert programmers who know better than to make the same basic mistakes we see constantly on the web every day. Of course there are inept programmers on OSS projects, just as there are in closed source, so that’s moot. Let’s just look at the 2 classifications of software technology assuming the best programmers on both sides of the fence are what we’re using to gauge the quality by. Then and only then should we compare the best attributes of OSS and closed source, because now we’ve got an even playing field and the experiment’s results aren’t already tilted to one side. Another way to look at it is to compare a race between that Viper and Corvette, but instead of putting a 15 year old in the driver seat of the Viper, make them both pro drivers. Between a Viper driven by a 15 year old and a Corvette driven by Jeff Gordon, my money’s on the Corvette…no brainer. If you want to know the real comparison, you’d have to race Jeff against himself or another equal. Let’s race OSS and closed source, using professional programmers as the natural assumption for implementation, and see who wins. Human competence is an issue all its own, outside of how well two coding archetypes compete, and is in no way limited to the IT field. I’ve got 2 excellent programmers and they each use a different type, OSS and closed (M$). What they deliver me is, for me, the real test. If one of them wasn’t excellent, I would not deign to assume one type’s performance over the other.
        That leaves me with one last remark…we’ve got to clarify this question even further and differentiate Microsoft from other closed source definitions. No other closed source organization comes close to the magnitude and scale of Microsoft, the products they produce, and the huge tangled ball of yarn called their code base. Microsoft aside, what OSS vs closed source projects have features which outperform the alternative? We can compare OSS with Microsoft, and OSS with other closed source, but taking all closed source together only convolutes the experiment, because it is safe to assume that few if any other closed source organizations have anywhere near the scale of problems Microsoft does. (Maybe cuz they’re not trying to take over the world??) Let’s just stay reasonable about this. After all, reasoning is the trait that propels most of us in this field.

      • #3163612

        Open Source Does Not Make Better Code. Better Programmers Make Better Code.

        by charliespencer ·

        In reply to Open Source Does Not Make Better Code. Better Programmers Make Better Code.

        “… the biggest indicator of code quality by whatever measure is readability.”

        Tony, I thought it was the performance of the compiled code. If it’s readable but slow, how is it high quality?

        Also, why does readability across a range of programming skills mean superiority? “Fun with Dick and Jane” is readable, but it isn’t superior literature.

      • #3163548

        Open Source Does Not Make Better Code. Better Programmers Make Better Code.

        by wilrogjr ·

        In reply to Open Source Does Not Make Better Code. Better Programmers Make Better Code.