Software Development

Programming industry forecast for 2008: Get ready for a bumpy ride

Find out what Justin James says will severely impact how we program and how we develop applications.

In the blog posts I wrote in 2006, I was primarily trying to bring some reality and common sense to the hype flying around many new ideas (it was a lost cause); discuss usability theory; and expose that mainstream programming is pretty mediocre. In 2007, my main themes were education, multithreading, and functional programming. This year, I will examine my prediction of a decline (not a demise but a decline) in traditional client/server desktop computing.

News stories of interest

Two seemingly unrelated news items that we should pay attention to are: Bill Gates is retiring this year, and phone manufacturers have finally settled on an across-the-board power/connectivity connector. To put it another way: The man who drove the systems that 95% of us use is leaving the scene, and our cell phones, PDAs, and other devices can now easily connect to each other (and to our existing peripherals) with a $5 cord, without the need for a PC.

History lessons

What does all of this have to do with programming? Everything. We are in the middle of intense change within the programming industry, even though most of us can't see or participate in it. We will just be stuck cleaning up the aftermath five to fifteen years down the road. The programming industry prides itself on rapid change and instant obsolescence, but the reality is that it's changing very, very slowly. There are still a ton of important computing systems that are rooted in designs from the 1950s and 1960s.

I have been learning a lot about the mainframe and supercomputer eras in an effort to try to mine that history for clues about our future. I believe that many of the standard features of mainframes are going to migrate to the current server hardware, typically in OSs adapted to take cluster configurations and treat them as a single addressable system. The result is that programmers are increasingly writing code that targets systems that behave like the Windows or *Nix servers of yesterday and today, but have the reliability and abstraction of multiple processors that mainframes have enjoyed for years. Look at the list of mainframe-like features that are sprouting up all over the place in microcomputer-oriented server rooms:

  • Virtualization of resources
  • Zero down time for hardware failure/replacement/upgrades
  • In-place OS and application upgrades
  • Bytecode interpreters and runtimes that abstract the hardware to the point that the programmers do not know or care what is underneath
Note: All of these trends are already well underway.

Ride this programming wave

We are seeing the server room getting better and better, the desktop platform stagnating, and the mobile device platform improving. How much longer will it be before "the suits" ask themselves, "Do employees really need expensive PCs and phones if I have been using my BlackBerry as my sole computing and telephony device for a year?" The answer all too often is, "No, they do not."

As applications shift to the Web platform (or even just a server-based platform, so long as the client can be mostly lightweight), the only thing keeping these cell phones, UMPCs, game consoles, and smartphones from entirely replacing the desktop PC (as well as the desktop phone) is the input and output situation. Give these machines a USB port, and the problem is resolved with one connection to a docking station. I would love to replace my desktop PC with a UMPC and a docking station, but the prices need to come down some, and my personal needs would have to be a bit less development oriented for that to be reasonable. But for an office worker with a $700 Dell and a $300 Avaya phone, it makes too much sense to the IT department to replace these with a $400 smartphone.

This will severely impact how we program and how we develop applications. I believe that the programmers who ride this wave will be the leaders of the next generation. Another odd knock-on effect of this change is that Linux will suddenly figure much more prominently in client-side development, and technologies such as Web browsers, Adobe AIR, Java, and Silverlight (a lot is still unknown with Silverlight) that allow a "write once, instant client anywhere with zero configuration needed" model will be reconsidered. The failure of Java applets in the 1990s to become a worthwhile technology (they were not fully cross-platform compatible, and they were slow resource hogs) is a shame in retrospect. We could have skipped the whole mess of trying to squeeze all of our work through HTTP/HTML/JavaScript like we are attempting now with AJAX. Developers who learn to write client code for mobile devices will be far ahead of the game down the road.

J.Ja

About

Justin James is the Lead Architect for Conigent.

50 comments
alexey

I would agree in general, but there is another trend that I would like to outline. You mention interoperability at the hardware layer and the software layer separately; that is, the devices can talk to each other and programs execute on multiple platforms. Let us take a step forward and imagine a piece of software that is hardware-independent. That is, you have an Internet application that executes equally well on a PC, on a cellphone, etc. Bomjpacket is an open-source mobile browser with JavaScript support whose goal is exactly this. It can execute any interactive Internet application on a limited $100 cellphone: http://research.alexeysmirnov.name/bp

Justin James

To be honest, it is an ambitious project, but WAP has been a universal failure. No one wants to write the code twice. Instead, what you're seeing is a single HTML page, with a device-specific CSS sheet, like one for mobile, one for print, one for standard display, and so on. It makes a lot more sense from the developer's viewpoint. J.Ja

martinrleon

Although I agree that mobile computing will likely increase significantly, who will enter all the data that "the suits" can browse so easily on their Blackberries, where will it be stored, how will it be managed? Even with a fully capable browser embedded in a mobile device (a la iPhone), there's a lot of application development that needs to be done to make a complete business system. I worry that the suits may not realize that 1) they are benefiting from the labor and data input of many people who couldn't necessarily do their parts as cogs in the machine with just a Blackberry and a USB keyboard and 2) that there is a significant computing infrastructure required for them to be able to benefit from interacting with such a system so easily and 3) systems will continue to need to interact with other systems to benefit from external services that are more cost effective than home-grown or in-house systems. It will be interesting to see what develops.

Justin James

These are excellent questions that *will* be discussed in the upcoming weeks. Please stay tuned. :) J.Ja

fasternfaster

Absolutely far-sighted! Bravo. I believe this story is right in line for those of us online.

joness59

Smartphones or BlackBerrys may be good for texting, email, or surfing the web, but many things that users do on the desktop or laptop at work and home are very difficult to do on these other devices. I've tried maintaining a simple spreadsheet on my Palm and it is a royal pain. Tools such as word processing (which includes much more than just typing text), presentations, spreadsheets, and financial software (Quicken, QuickBooks, etc.) would frustrate a user trying to accomplish these tasks on a smartphone and would be much, much more time intensive. Technology may have changed, but the way people think, view & interact hasn't. Smartphones and similar devices certainly have their place, but they aren't the end all, be all of the future. IMHO

Justin James

USB ports on devices = docking stations. J.Ja

Still_Rockin

I agree with most of what Justin says here; my only observation would be that there are always going to be "large," complex software applications that will need to be operated by certain classes of users (think accounting applications and whether accounting staff would be able to do their GL/AP/AR jobs using only a smartphone). And on the development side, in order to develop those types of large/complex software applications, developers will always need beefy machines to run the analysis/design/IDE-type apps used to produce the business apps. At least for quite a while, anyway. Of course, I'll probably be proven wrong when the whitecoats come up with molecular memory that can store thousands of terabytes on a chip the size of a fingernail and quantum CPUs that run at a gazillion terahertz :)

Tony Hopkinson

I have to rotate my 22" monitor through 90 degrees and have it on its highest resolution to make one page readable without scrolling both ways. It would have to be a f'ing big PDA. There's some nice stuff you can do on mobile devices, and there will be a market for it. However, it won't be in the real meat-and-potatoes client/server market, in my opinion. I do agree the desktop is/has been stagnant, but naff attempts to try and shoehorn it into a cell phone are not going to take off.

t.anthony.ash

Most guys want the smallest wallet they can find, but they want it to contain everything they even remotely might need. Given that paradigm, let's consider the computer. We want security, availability, speed, integrity, flexibility and power, but we want it to take up as little space as possible. This was the justification for moving from a "desktop" to a "tower" configuration. Space and cost were the driving factors in moving from a full tower to a small form factor (non-upgradeable) tower. It seems to me that the logical extension of this would have to be, as Justin said, plugging a hand-held device into a docking station that permitted a similar UI to our current configuration. In other words, the only difference the user sees between the current set-up and the envisioned set-up is that the computer always remains in the physical possession of the user. I have thought that the real solution was to have everything, including the OS, loaded on a USB flash drive, docking into a truly diskless workstation. The downside of this is that, away from a dock, there is no usability. Given Justin's suggestion of a UMPC, even down to the size of a Blackberry, you add the benefit of having access to an interface, albeit imperfect, no matter your location. What do we really want? Everything in the world, in a tiny package, but shown in a readable size with context, and a usable input mechanism, now.

Justin James

It's funny, but it costs more to service/support a tower that can be easily opened, but requires backup/restore, data transfer, etc., than it does to throw out a BlackBerry or similar device. Sure, the tower can let you stuff 6 drives in there, but if everyone is supposed to be putting things on the network drives, who needs a desktop drive bigger than 40 GB for OS & apps? Sure, some users like developers, graphics people, and so on will always need the horsepower and peripheral support of a full PC... but users like us are a small minority of users overall. J.Ja

amcookjr

I think Justin makes some good points. 1) With a universal and ubiquitous interface port for mobile devices, the limitations of the HMI of these machines are removed. If I need a big screen, I plug one in. (Literally or figuratively.) If I need a keyboard, I plug one in. In other words, the device becomes scalable and extensible as needed by the application. This extensibility even applies to computing power. 2) The traditional client/server model will continue toward extinction. It's an extremely expensive model in terms of hardware costs, software costs, and operational/maintenance costs. The old mainframe/terminal model is more cost effective in many environments. Thin client/no client is today's version. 3) As computing power requirements move from the user to a central resource, what is left with the user are connectivity, security, local data storage, framework, HMI (presentation, input, and control), and personality. These are all currently available in a mobile platform, but with limited performance. Standardization and the progression of Moore's Law could significantly improve the performance in the near future. 4) Virtualization, abstraction, and other technologies are making server-side apps easier, more reliable, and much more cost effective. Web technologies are making web-based computing real (but with a lot of room left for improvement). Together, these are moving application processing away from the end user. Yes, we are early in this process, but we are moving in that direction. It will take time for the hardware, software, and networks to adapt. But cost and efficiency will rule.

Tony Hopkinson

Thin client? Now where have I heard that before? Plug and Play... You must be new in IT. I worked on mainframe systems for a long time; the more you wanted to do, the bigger and smarter your terminal got. The thing was larger and cost more than a PC. Heard it all before, seen it all before; it didn't work then and it won't work now. The wrong thing is driving it: cost reduction. They'll just spend a load of money on PR, come out with some flaky, unmaintainable crap that costs way more than they said, and then it will fail again. 3G, anyone? They won't wait, they never do.

dtec

I'm in favor of everyone thinking they need PCs. As soon as the masses stop buying PCs, the prices will fall in the short term, but in the long term it will become quite expensive for those of us developers who actually need the PCs.

Absolutely

More broadly, unencrypted communications are problematic. Slapping a PGP key onto e-mails is sub-optimal, I agree. But, it would guarantee positive identification of the messages that are most important to me, on my home computer. For those that procmail sorts into the "other" directory -- such as TR peer messages -- I'm willing to take my chances that naive Bayesian algorithms will sometimes throw out messages that aren't SPAM. If a casual acquaintance from an Internet forum has something important to say, it can be said publicly.

Justin James

"If websites published their "send from" domains, that would not be a problem." Again, we are used to (I know that you are *hardly* the only one doing this!) spending gobs of time trying to workaround SMTP's lack of sender authentication. It feels normal because we've been doing it so long. It's like people who grow up in pretty bad situations, they assume that certain ways of living are normal, and they are not. SMTP's failures have accustomed us to some pretty abnormal behaviors. My post on this should go up in a few days, it will be more in depth than what I've gone into here, we can go back at it then. ;) J.Ja

Tony Hopkinson

But I'm afraid I'll have to leave it to the boffins; I've yet another CRUD application to build and five to maintain, so my hours are accounted for.

Justin James

Yeah, I forgot how many resources we burn on solving artificial problems such as those based in our choice of protocols, instead of solving the problem at hand... J.Ja

Justin James

Tony - That's exactly right... with our current brute-force techniques, it probably would take 20 Crays to match your dog. Those are exactly the kinds of issues that I am talking about. Brute force is an intellectually lazy way of approaching problems: "I don't feel like routing around this wall, so I'll just ram my car into it until it falls over." It works with the easy problems, but puts us in bad habits for the hard problems. I've been seeing some encouraging research coming from a few different quarters. I have a feeling that the technique used in linguistics called "Latent Semantic Analysis" will have some pretty useful applications both within and outside of linguistics. One thing I like about that approach (in a nutshell: a vector search algorithm with a "weighting" derived from a "corpus" of works on the same topic) is that it is easily and readily computed in parallel, lending it quite well to being spread around, unlike some algorithms which rely on the previous state to generate the current state (like pseudo-random number generators) and therefore are traditionally confined to a single process/thread, as they must be computed as iterations of a sequence. I ramble. J.Ja
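(For readers who haven't run into it, here is a rough Python sketch of the LSA idea Justin describes: documents become vectors with a corpus-derived weighting, a truncated SVD captures the "latent" topics, and scoring a query against each document is an independent calculation, which is why the technique spreads so easily across threads or machines. The tiny corpus, the log weighting, and the number of dimensions below are made up purely for illustration; this is not Justin's code.)

import numpy as np

corpus = [
    "the cat sat on the mat",
    "dogs and cats make good pets",
    "stock prices fell on the exchange",
    "the exchange reported record trading volume",
]

# Term-document matrix with a simple log weighting (a stand-in for TF-IDF).
vocab = sorted({word for doc in corpus for word in doc.split()})
counts = np.array([[doc.split().count(w) for doc in corpus] for w in vocab], float)
weighted = np.log1p(counts)

# Truncated SVD: keep k latent dimensions ("topics").
k = 2
U, S, Vt = np.linalg.svd(weighted, full_matrices=False)
doc_vectors = (S[:k, None] * Vt[:k, :]).T  # one k-dimensional vector per document

def score(query):
    """Cosine similarity of the query against every document. Each document's
    score is independent of the others, so this loop parallelizes trivially."""
    q_counts = np.array([query.split().count(w) for w in vocab], float)
    q_vec = U[:, :k].T @ np.log1p(q_counts)  # fold the query into the latent space
    sims = doc_vectors @ q_vec
    norms = np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vec)
    return sims / np.where(norms == 0, 1.0, norms)

print(score("cat on the mat"))  # similarity of the query to each document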

Tony Hopkinson

We use substrings. We don't use pattern recognition in anything like the way it works biologically. Ever seen someone across the street, shouted "Hey Fred," and had a complete stranger ignore you? How do you recognise someone you haven't seen in years, even without the red toupee? :D You can't code it if you don't understand it. Pattern recognition is something we code with a brute-force approach. You'd need about twenty Crays in parallel to compete with my dog on that basis, after you'd turned his nose and ears off.

Tony Hopkinson

'Nuff said. Organise me a jackpot on the lottery and I'll f'ing employ you. Nothing to do with "can't," everything to do with "not allowed to." Pattern recognition is something I've thought about on and off for a while now; unfortunately, all the processing power required to do it as we understand it (not at all, in my opinion) is currently devoted to adding state to HTTP, checking to see if a file has a virus, and this really nice 3D concertina control. Get real, JJ.

Absolutely

If websites published their "send from" domains, that would not be a problem. But, too many businesses publish their pages on the web as "business-name.tld" while sending mail from "business-name@host.tld," [b]and don't publish the difference[/b]. I still say there are deficiencies built into business-as-usual practices, equal-to-or-greater-than deficiencies in the programming paradigms, which cost end users' valuable money. I'm not saying the SMTP protocol, [i]sans[/i] encryption, is the greatest thing since packet transmission of data. It is, however, not the entire problem, either.

Justin James

"Your particular line of work is atypical, and those problems are inapplicable to most users, who nevertheless are offered greylisting software on the shelf instead of easy-to-use default deny policies & whitelists. Why do you suppose that is?" I really can't say, but I do know this: everything you have described in this thread are after-the-fact attempts to remedy the shortcomings of the SMTP protocol. If stuff like Sender ID or SPF records were built into the protocol (not neccesarily those approaches, of course), then the spam issue would probably decline to a fraction of what it is now. "Speaking for the majority of computer users now, if they are a legitimate contact, they will tell me their e-mail address personally, if they desire the convenience of contacting me by email. And I'll manage my own PGP keys, thank you very much. Get me?" So what about automated systems? Let's say, for example, you purchase a product from a Web site. They send an email to you notifying you of your order status, but their site never told you in advance what email address they would be sending from. So now what? You don't get their emails. You can't exchange PGP keys with Amazon.com. And finally, the fact that I get dozens of spam emails from addresses like "order@amazon.com" just further goes to show that the whitelist approach is flawed. At the end of the day, you are investing a lot of time into maintaining a workaround for a system that does not meet your needs. Even if you could show that you have crafted a perfect layer of software, rules, etc. on top of email to eliminate say, 99% of spam with only a 5% rate of false positives (which would be really good, BTW), how would that discount *in the slightest* the fact that the need to build such a system is a direct result of the nature of SMTP? In other words, you are not addressing my point; you are simply validating my point by describing a workaround to the problem that I am discussing. :) You simply chose to add multiple layers on top of SMTP to allow it to still be usable and useful to you, while many other people are simply sidestepping it. Either way, the fact remains. SMTP is a standard that does not meet the current needs in the current environment. J.Ja

Justin James

Lucky for me, I am rarely on those types of projects. Sadly, much, if not most, of the industry is about exactly those types of projects. In a nutshell, while our ability to perform advanced computations has grown beyond our wildest dreams, the business end of things (the people who write our paychecks) typically cannot see past data processing and reporting. Look at all of the fancy terms that the industry has come up with to disguise the fact that, at the rawest level, most applications being written could be done in COBOL: things like "business intelligence" (reporting), "enterprise [whatever] management" (batch processing), and so on.

I think you completely missed my point about the "rush" factor. I meant nothing like what you are talking about. I was talking about customers demanding timelines based on their needs, not on the capabilities of the people writing the code.

Speech recognition is a perfectly valid item to bring up! Sure, the speech recognition in Windows may be pretty darned good, but even at 99% accuracy, it is still far less accurate than a decent typist. A month or two ago, I was on with one of those speech recognition call tree systems; it could not recognize me saying "customer service," despite the fact that no other options had "customer" or "service" in them. I ended up managing to confuse the poor thing to the point where it gave up and transferred me to customer service. I hardly call that a good system. Right now, speech recognition systems approach the topic in a manner very similar to chess programs; they do not perform any recognition at all, but really follow possibility trees to a certain level. As the processors get faster, they can search deeper and wider in the same amount of time. That's why, as processor speed doubles, speech recognition accuracy only goes up by a factor of an inverse logarithm (say, for every 2x CPU power, accuracy goes up 0.1%). These systems hit well over 90% accuracy *decades* ago, because the first few levels of those trees are more than adequate 90% of the time. Likewise, chess programs in the '80s could beat, say, 90% of human opponents, and the last 20+ years of development have brought slower and slower progress, despite the doubling of transistor density every 18 or so months (Moore's Law).

This is, in fact, why learning to perform parallel processing on modern x86/x64 hardware is of such crucial importance to anyone doing "real work" on the microcomputing platform: Moore's Law *no longer benefits single-thread performance*. Why? Because if those cores get much faster with current materials, they burst into flames. So the only route for Moore's Law to follow is to go wider, not deeper, which means more cores at a relatively static speed. End result? Without multithreading, computational code will no longer double in execution speed every 18 months, even though the transistor count is doubling every 18 months. The upshot here is, if you look at the scarcity of programmers (relatively speaking) who seem to care about multithreading, or even be aware of it, it is fairly obvious that the bulk of developers are nowhere near hitting the computational wall. Why? Because that wall is being hit by their *database server*, not their *application server*, because they are writing forms to act as a friendly front end to a DB. And once the wall gets hit in the DB server, that means the problem affects the few thousand developers out there working on Oracle, SQL Server, DB2, and MySQL, primarily in C/C++, not the *millions* of programmers writing code that runs against those servers in Java, VB.Net, and C#.

And what makes you think that the human brain is not a specialized pattern recognition system? Much of the research I have read in the field of cognitive science lends credibility to that viewpoint, particularly research into the concept of expertise, which shows that *regardless of the field*, it takes approximately the same number of exposures to something to build the body of experience that allows a person to become an "expert." At the very least, pattern recognition can *emulate* the end product of the human brain's thought to the point of usefulness. And this cuts *directly to the heart* of the matter. Are we, as an industry, satisfied with simply programming calculators and spreadsheets of ever increasing capacity and speed? Or do we want our work to act as a lever for human creativity? If your answer is the latter (and from your responses, I suspect it is), then we need to stop using the wrong tools for the job. That means getting these abacus applications back to the mainframes (or a mainframe-like environment) where they rightfully belong, and off of the "Personal Computer" platform (aka the microcomputer). It means getting rid of the expensive and wasteful PC for those who really don't need it, and replacing it with easily managed clients, such as the mobile thin client I discussed in the original post. And it means learning to write applications for the remaining PCs that actually utilize their power to the utmost. The fact that the computer *still* cannot reliably separate spam from human-generated text is ample evidence that our ability to leverage the computer is still in its infancy. J.Ja
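(Justin's point that Moore's Law now goes "wider, not deeper" is easy to see in practice: a CPU-bound job only benefits from extra cores if the programmer explicitly splits it up. The Python sketch below uses a made-up workload purely for illustration; it uses processes rather than threads because, in Python, only separate processes can occupy multiple cores for pure computation.)

import time
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    """A stand-in for real computation: the sum of squares below n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 8

    start = time.perf_counter()
    serial = [busy_work(n) for n in chunks]  # one core, no matter how many exist
    print("serial:  ", time.perf_counter() - start)

    start = time.perf_counter()
    with ProcessPoolExecutor() as pool:      # one worker per core by default
        parallel = list(pool.map(busy_work, chunks))
    print("parallel:", time.perf_counter() - start)

    assert serial == parallel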

Absolutely

Justin: [i]I may add, it's not even a fix for the spam topic. I get emails from strangers all of the time, and it is legitimate email too. But because of my writing.[/i] Your particular line of work is atypical, and those problems are inapplicable to most users, who nevertheless are offered greylisting software on the shelf instead of easy-to-use default deny policies & whitelists. Why do you suppose that is? Justin: [i]But what do you do about the person who gets a new email address? Reject their mail?[/i] H-E-double-hockey-sticks, yes. Speaking for the majority of computer users now, if they are a legitimate contact, they will tell me their e-mail address personally, if they desire the convenience of contacting me by email. And I'll manage my own PGP keys, thank you very much. Get me?

Justin James

Yes, that would be a possible fix for SMTP. The problem is, you are talking about correcting a problem with SMTP outside of SMTP. The problem is that the SMTP protocol allows anyone to send email to anyone on behalf of anyone, with no trust, authentication, or verification built into the system. This is actually next week's topic, so I am getting a bit ahead of myself. I may add, it's not even a fix for the spam topic: I get emails from strangers all of the time, and it is legitimate email too, because of my writing. But what do you do about the person who gets a new email address? Reject their mail? Blacklists have never been a workable email system, and the maintenance of the list is a nightmare. In a nutshell, every single "solution" we try to paste on top of SMTP is just that... on top of SMTP. Why? Because the underlying protocol, while it met the needs of the time when it was developed, is no longer adequate for our current needs. But it is a standard, and changing an established, adopted standard involves an "all or nothing" approach to upgrading the software that connects to it. The end result (again, giving more of a "sneak preview" of next week than I probably should) is that a huge percentage of people no longer use email *when given the choice.* They use in-house messaging systems (like MySpace's "caged garden" messaging system), IM, TXT, phone calls... anything but email! It has gotten to the point where I actually call some people to verify that my email got through without a) getting marked as spam by accident or b) being confused for spam by a human scanning through their emails. That's just sad. And the fact that SMTP is a standard that does not meet the needs of the current computing environment is at the heart of it. J.Ja

Absolutely

Start with a reject-by-default policy and accept mail only from known senders. If you want to be allowed to send me email, you have to ask permission from somebody who's already on my list, just like being invited to my party. C'mon, even rocket science isn't [i]really[/i] "rocket science."
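(The default-deny whitelist policy described above fits in a few lines of Python; the sketch below is a toy illustration with hypothetical addresses, not a real mail filter, and it sidesteps the hard part Justin raises elsewhere in the thread: how legitimate strangers ever get onto the list.)

# Toy sketch of a default-deny whitelist: accept mail only from known senders
# and reject everything else. The addresses are hypothetical.
WHITELIST = {"alice@example.com", "bob@example.org"}

def accept(sender):
    """Default deny: a message is accepted only if its sender is whitelisted."""
    return sender.strip().lower() in WHITELIST

for sender in ("alice@example.com", "order@amazon.com"):
    print(sender, "->", "accept" if accept(sender) else "reject")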

aaron

I don't know what applications you write, but if they are "nothing more than forms in Access," then you need to find a different job! You are confusing the "rush" factor with versioning. If every developer on the planet sat on their hands and waited for every possible problem to show up so they could write a solution for it into the first version, we wouldn't be communicating right now. Technology as we know it would cease to exist. There wouldn't be some sort of utopian existence of technology in its superior form either. We would be sitting around a fire wrapped in animal skins to keep warm in the winter. And speech recognition? Come on. I can't believe you brought this up the way you did. If I bring up Word, turn on my mic and speech recognition, I can create a complete letter without ever touching my keyboard. What you are dreaming of is artificial intelligence that can guess what you want based on what you tell the computer. This is not just simply *pattern recognition*; this is trying to implement the human brain in a computer program, which is also why it hasn't happened even with the fastest CPUs in the world. Don't be silly, be a little realistic here.

Justin James

Silly Tony! You and your long history in this industry! I am sure that no matter which way this thing goes, it will be done wrong. There is too much momentum for "now now now" and not enough for "right." Like Web applications would make sense... if they didn't involve HTTP and HTML. But that's the way things went. SMTP-based email would have been great... if it had the safeguards baked in to protect it from being 95% spam. Now, email is practically worthless, and IT departments spend a fortune trying to fight spam, viruses, and phishing via email. Thin clients would/could be fine, if they were more than simply a mobile Web browser, and a bit more like a mobile X Terminal, or maybe more like a portable PC with the option of docking somewhere. Heck, client/server would be fine, if we treated the server as more than a dumb file store or DB backend, and actually did some real processing on the client, instead of wimpy data validation. At the end of the day, we as an industry keep finding new combinations of mobility and where the processing responsibility occurs to do the same bloody thing: Microsoft Access, more or less. WHY WHY WHY? I just don't understand it anymore. 40 freaking years, and the "applications" most of us write are about equal to the forms in Access, maybe with a "real" database behind them. They don't do any real processing or computing, just some input validation and adding, and then a percentage and/or average at the bottom. 40-plus years and we are still screwing this up. Meanwhile, we still can't get speech recognition to the point where it is trustworthy. All of the CPU power in the world. It obviously is not a CPU issue, since I keep getting promised that the next generation of CPUs will make it work. We can't program something that can say with any certainty, "there is a triangle in the middle of that image," or even something as simple as, "given (2 + 1) * (7 + 2), I can compute 2 + 1 in one thread and 7 + 2 in a second thread without fear, and without being told to do so." In a nutshell, *pattern recognition* as a whole is the Holy Grail of computing, and we keep blaming everything but ourselves for the failure to get there. J.Ja
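(Justin's (2 + 1) * (7 + 2) example still has to be spelled out by hand today; nothing in mainstream toolchains will prove the two halves independent and fan them out on its own. A tiny hand-written Python version, purely to show the shape of what he wishes the runtime could infer automatically:)

from concurrent.futures import ThreadPoolExecutor

def add(a, b):
    return a + b

# The two sub-expressions share no data, so a human can safely evaluate them
# concurrently and multiply the results afterwards.
with ThreadPoolExecutor(max_workers=2) as pool:
    left = pool.submit(add, 2, 1)   # one thread evaluates 2 + 1
    right = pool.submit(add, 7, 2)  # another thread evaluates 7 + 2
    print(left.result() * right.result())  # 3 * 9 = 27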

steinalexsilver

Keep in mind that the computing power of the devices you speak of is insufficient for most technical tasks. The future you mention will not happen for the average desk worker.

Justin James

The average desk worker's CPU sits at 5% or less, except to open apps. Many apps, even traditional enterprise apps, are moving to the Web, dramatically reducing the client-side system resources. At the same time, the mobile devices have shown themselves more than powerful enough to run copies of Office and other productivity apps on them. If the average desk worker was using Photoshop or doing heavy number crunching in Excel or Access, I would agree with you (even then, they can be moved to a server based model). But since the average desk worker does so little, I have to disagree. J.Ja

Tig2

I don't know, Justin. I think that some form of the traditional keyboard would still be a requirement as would a reasonably sized screen. I wouldn't want to write my blog on my PDA, even though I COULD. The ultralight mobile tablets and such might be okay. But the ones I have seen don't seem to me like they would be the whole answer. If you are thinking of a SaaS SOA type of environment, why not an NCD? They are designed to provide a base minimum- connect to the network and provide a way for the end user to get to their applications. But they provide a monitor and keyboard. Having said all that, I agree with your theory. I know people whose only interaction with their work computer is to read email and surf the net. But a lot of those folks are senior level. There should be a standard that provides for a guideline here. If keyboarding is 40% of my job, I get a real keyboard to do it on and a real monitor to read it on. Or something like that.

Justin James

I think a docking station could be had for under $100. What does it need to be? Let's look:

  • Display adapter
  • Ethernet adapter
  • 4-port USB hub
  • WiFi adapter (optional)
  • 12 V power input (a cord with a transformer "brick," to reduce the cost of replacing failed power units)

That's dirt cheap. It also eliminates the airtime charges (a valid concern). The peripherals (display, mouse, keyboard) are the same as you'd have for a desktop PC, so that part of the equation cancels out. Take the $200 Avaya monstrosity phone off of the desk, and the scales are tipped well in favor of the mobile device platform. And yes, 2 docking stations (home and work) + AJAX-ish (or RDP/Terminal Services/X Window) = zero sync, zero downtime for failure, zero maintenance... in other words, massive savings for the IT department. Believe me, *this is not a future I want to see*. I prefer working on computational systems, and this kind of future precludes that work from being a mainstream possibility. But business is about money, not "letting Justin James write the code he enjoys writing," and as a result, I think this is where smart companies are going to be headed soon, and not-smart companies will eventually follow. J.Ja

SnoopDougEDoug

For the vast majority of desk jockeys, the cost of a mobile device PLUS the cost of a docking station is much greater than the cost of an adequate desktop PC. Add in connectivity and airtime charges and you have a much costlier proposition. Now if you are talking about mobile information workers--sales, onsite tech support, etc.--then the idea of a smart mobile device with two docking stations (home and work) is palatable. In this case it makes more sense to look at AJAX-ish apps to obviate the need to synch data from the device to corporate DBs. doug

Justin James

I really regret not making it more clear that my expectation is docking stations for folks. :) Input/output is why most workers need desktops today, so they have monitors, mice, and keyboards. That's why I think the USB connector is such a big deal. It means that it is a trivial OS or firmware change to get these things a) talking to each other without a PC in the middle, and b) connecting to docking stations easily and seamlessly. NCDs are a joke. They cost something like 80% of what a desktop PC does, while delivering none of the functionality. That's why I feel that the mobile device route is the one that will be followed. It is cheaper than the NCDs or the desktop PCs, even accounting for a docking station. Plus, you eliminate the phone from the desk, AND you instantly make the worker mobile AND get disaster recovery/business continuity benefits. It just makes way too much sense, now that these devices have more horsepower than some of the PCs that were for sale when Windows XP hit the market... J.Ja

simon

I think Justin has a good point here. I can easily imagine that the two screens, keyboard and mouse I'm currently in front of could be plugged into a usb hub, and then in turn to my handheld PC of the future (e.g. a Blackberry size device, with the computing power we currently have in the average laptop). Not sure it'll happen this year, but I reckon it will happen. When you decide to leave the office, unplug one usb (or whichever bus we're using by then) and off you go!

Absolutely

Although mobile devices do not have the computing power of today's desktops, they do have the computing power of the desktops that were more than adequate to perform the same tasks -- with less eye candy -- which are the primary work of many desktops. High-performance video and computation have always been in leagues of their own.

Justin James

One thing I didn't really make too clear in the post: this is where I think the industry is headed. It is not necessarily where I think it *should* be headed. Personally, I think there is a lot of good to it, but my fear is that we'll either transport the resource- (and bandwidth-) hungry client/server model to mobile devices, or keep relying upon the "RPC over HTTP" mess of AJAX, instead of really doing the right thing, which would be to bring some sort of remote windowing system (like X, but not X) to these devices. J.Ja

Vladas Saulis

I agree with Justin that most client-side computing will move to PDA/mobile devices. But I also predict that on the other side of the peer we will mostly have our personal [web] server, which can be a huge and powerful multi-core computer. The typical scenario would be the following: I'll have a personal PDA, have one or more at work (for work), and some additional ones for my peers (family, work colleagues, ...). So programming will split into two main streams: PDA client [Web] apps, and super-server OS and database programming. These super-servers will interconnect among themselves, just as we connect now from our client desktops. Eventually this connectivity will move almost entirely to inter-super-server connections. In fact, I am already using such a scheme in some sense...

Justin James

What you're suggesting is a local "concentrator" of data: something to do the downloading of items in advance, apply some algorithms to digest the information, weed out what the users really do not need, and then present it to their mobile devices as a Web item, correct? For example, instead of downloading my email to the PDA and doing spam/virus filtering there, download it to the server, do the spam/virus filtering there, and then access it from the PDA. Is that what you are talking about? If so, there is definitely some sense to it. I think we are seeing it already, with corporate Intranet applications consuming third-party data sources, repackaging them, adding some business-specific logic, and then giving it to users as a Web app. J.Ja

jspurbeck

Data entry, scanning a grid of data for analysis, database management (table updates), and many other IT tasks are hindered by the extremely small footprint of most mobile devices. An older monitor, keyboard, & mouse, hooked up to a web connection, using a web browser/OS that supports W3C standards, and applications supporting those standards and built to take advantage of that "rich" environment, is what "most" IT users will need. I consider this the most lightweight configuration necessary to send most IT users home to work (telecommute). Use the mobile devices for voice communication and whatever "mobile" needs can be delivered. But I do not believe that "mobile handsets" are the future means of "work" for even the younger generations. They may multitask better than us, but let's remember what they are multitasking about these days: music, video, phone calls to friends, MySpace, etc. Hardly real work there, my friends.

gsikora01

Our client base manages the international forwarding process, a task that includes the entry of up to several thousand data elements over a 1-3 month span, for thousands of records per year. 95% of the users are central-office based. Their role includes just-in-time follow-up on multitudes of critical events during transit, data entry of details, production of record-specific notices to vendors and clients, management reports, and other similar tasks. Does anyone seriously see these users effectively working in any environment other than a traditional client/server application?

aaron

While you are right that most people will still need a PC to do "real" work, I think the workers this is going to be true for are the true mobile workers: the sales reps, the service technicians, and even delivery personnel. Most of these people do not need rich client experiences; they just need the hard data. The customer's contact info or the delivery information is what most of these people will need. Carrying around a tablet PC just to get to this info is not ideal, but a BlackBerry or similar handheld is. You don't need a lot of screen real estate to satisfy the needs of these workers... Data entry clerks will never use a cell phone to input customer data. Programmers will never use their BlackBerry to develop those mobile device applications. You can see this exact trend in Japan, which is where it usually all happens first. PC sales have plummeted in favor of all-in-one mobile devices that are web enabled. But this is also mostly in the consumer market, and the consumer market does not equal the IT or corporate market... I don't think we all need to start converting every one of our applications to a mobile device, but we should be aware of the needs of true mobile workers.

mjallyn

In reading the replies to your post, I think readers are missing the gist of what you're trying to say here. Most of us reading these posts represent a small slice of a VERY technical community. Our requirements are very different from the average PC user's. We need powerful client PCs for compiling etc., and big displays for running dev environments, email, project planning software, and test environments all at once. The fact is, the other 99.9% of PC users in the world need a box that allows them to browse the web, listen to music and read/write emails. By docking a 333MHz smart phone to a 12" 800x600 LCD display and a keyboard/mouse combo, all of their needs would be met. Any processing-intensive operations (rebalancing a stock portfolio?) could be handled by remote servers, as they are today. Look around at all the pre-teens, teens and twenty-somethings. They're all holding a cell phone in one hand, an iPod in the other, and chatting w/friends - all while driving down the highway at 70 mph. A PC isn't in the future for these people, and they will represent the majority of our nation's population in 10 short years. I'm with you Justin - it isn't a pretty future, but it's coming. But I think the good news is the PC will still have a place in the technical community for many years to come.

Justin James

I agree with you... a lot of people are confusing their personal usage with the average worker's. At my last job, I worked in the middle of a cube farm, watching people all day long who only used two applications: Outlook for email, and Internet Explorer. For what they were doing with email, they could have been using OWA, and reduced themselves to 1 open desktop application. It really struck me as silly to have an entire desktop PC, plus the support, cost of OS, power usage, etc.... all to run Windows XP, only for Internet Explorer. As much as I dislike Web applications (regular readers know I am not a big fan of them), it's where things have already shifted. The desktop PC does not make sense any longer. We have the physical infrastructure of client/server, but the computing model of sophisticated green screens. I do "real work," in terms of development, occasional graphics, etc. I still need a real PC. But most people don't budge that CPU meter over 5%, unless an app is opening, or something is wrong. J.Ja

carbondog

Even if desk workers' functionality needs allow use of cheaper (hardware costs only) smart phones, don't most smart phones end up requiring significant monthly outlay for data plan + voice plan? It would seem to make the cost savings argument not quite so persuasive...

Justin James

Many devices now are starting to ship (or will ship soon) with WiFi in them. In fact, some devices can do automatic rollover from VoIP on WiFi to cell networks (and back) already, which is a huge cost saver. At my last job, there were probably 20 or 30 people who ran around the building all day on their cell phones; having that tech would have saved them thousands a month on the phone bill. The UMPCs also have WiFi, and many of them come with wired Ethernet ports as well. And, of course, if your device can be docked via USB, there is no really good reason why the docking station wouldn't have network connectivity. Sorry I didn't make this more clear in the original; thanks for bringing it up! J.Ja

simon

Carriers are already starting to introduce more flexible and cost-effective data plans. The advent of a fully functional and usable browser (e.g. Safari on the iPhone and various other popular phones/PDAs) has forced their hand early. O2, Virgin and 3 all now offer an unlimited data package, either with the call plan or as an extra. I know business plans are typically more expensive, but they will track consumer plans/price fluctuations eventually. Processing power is the other potential barrier, but Moore's law already seems to struggle to keep pace with the current rate of increase in computing power. I don't see this being a real obstacle in the future.

amcookjr

You are trying to evaluate a future implementation in current terms. The data+voice model will disappear, to be replaced by what I call the EIJD model (everything is just data). Example: VOIP. In that environment, a single processor makes sense.
