
Reflections on 2009: Parallel programming, mobile and Web development, cloud computing

Justin James discusses TechRepublic's Programming and Development blog content from 2009 and shares his observations about hot industry topics.


In 2009, we really shifted the content mix in TechRepublic's Programming and Development category. In past years, there was a lot more analysis and opinion on my part, and a mix of tutorials and hands-on content from other authors. This year, most of the analysis and opinion was repurposed from our sister site ZDNet, and I focused a lot more on articles that teach an idea or a topic. As part of this effort, I started writing the Code concepts series, which provides a high-level introduction to a topic, and the Hands-on programming series, which provides real-world coding examples of a common programming task. I also try to run a reader response article each month, although lately, I have not had enough questions from readers to keep up that pace. So please, send in your questions!

In this meta post, I look back at the topics that we discussed throughout the past 12 months, examine readers' reactions to those pieces, and try to be as objective as possible about where I was right and where I was wrong.

Product overviews and book reviews

I've been writing fewer columns along the lines of "My first impressions of product XYZ" based on a product demo or a conversation around a product and publishing more reviews based on actual product usage. Because these reviews take a substantial amount of time and effort, I have been trying to limit their frequency. I've also been trying my hand at book reviews. I intend to keep them up as well, but they are based on me having the time to read books, which I have been a bit short on lately.

To keep you guys in the loop regarding product news and announcements, we've started the programming news pieces; originally, these ran on a weekly basis, but we have recently started to experiment with an every-other-week schedule.

Parallel programming

One thing that really disappointed me is the utter lack of interest in parallel programming. I'm not upset that so few developers share my love of parallel programming, but rather that they don't seem to understand that it's the only answer for a slew of applications. The lack of interest is a red flag to me that, as an industry, we are not working on those kinds of applications. Why are we not solving the kinds of challenges that require vast computing horsepower? This is a very real concern.
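To make the idea concrete, here is a minimal sketch (mine, not from the original post) of the kind of CPU-bound work that only gets faster by using more cores. It splits a prime-counting job across worker processes with Python's standard concurrent.futures module; the task itself is purely illustrative:

```python
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def parallel_count(limit, workers=4):
    """Split [0, limit) into chunks and count primes across processes."""
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], limit)  # absorb any remainder in the last chunk
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    print(parallel_count(10_000))  # same answer as a serial loop: 1229
```

The point is that the serial and parallel versions compute the same answer, but only the parallel one can put a multi-core machine to work; applications that need "vast computing horsepower" have to be decomposed this way.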

This is a topic that I will be taking up in the very near future, but let me just say that every true industry insider that I've talked to is in 100% agreement that the vast majority of the IT industry is the emperor's clothes and that we are the weavers ripping off our clients.

Mobile development

I am also a bit disappointed in the direction that mobile development has been taking. Windows Mobile is, hands down, the most easily programmed of the major platforms on the market (BlackBerry, iPhone, and WinMo).

I've seen numerous demos of all three systems, and there is not enough money on the planet to drag me into BlackBerry development. I saw someone using not one, but two IDEs on the same codebase, due to the lousy debugging tools in one and the poor authoring tools in the other. That's insane.

iPhone development looks only marginally less awful; the tools seem primitive, the test cycle is horrid (to put it mildly), and then you still have to deal with Apple's Byzantine App Store rules.

WinMo development uses the well-known, high-quality Visual Studio environment, with a good test cycle, well-documented languages and APIs, etc.

So, which platform is sinking like a rock? WinMo, naturally. In addition, the iPhone does not integrate well into enterprise networks, and BlackBerry devices are expensive to integrate, while WinMo is cheap and easy. This limits the enterprise app market on the BlackBerry and the iPhone, while the best platform for enterprise mobile development is rapidly shrinking. Hopefully, the Droid platform will gain enough market share and enterprise connectivity to become a legitimate option for mobile enterprise development.

Web development

The Web development space is rocking and rolling. Looking back at some statements I made two to four years ago, I see that they were on target at the time, but they no longer hold.

AJAX

AJAX development, for example, is now a viable option. (Read what I said about AJAX in 2006: AJAX: The Right Goal, but Often the Wrong Tool.) It's not that the underlying technologies suddenly improved -- the libraries that mask the underlying technologies improved.

Silverlight

Out of the new RIA technologies, it looks like Silverlight is making a huge impact, but maybe it's just that Silverlight developers are particularly vocal. I still don't see much Silverlight on sites that are not Microsoft owned or sites run by Silverlight developers, outside of some streaming media applets on a number of sites. All the same, Silverlight installations are pretty high already, and in a year or two, Silverlight's installed user base will be really close to Flash's, if its current pace keeps up. At a technical level, I believe that Silverlight has surpassed Flash, based on some of the things I have seen implemented in it (such as Photosynth); also, Silverlight continues to add new features, sometimes at a pace that I think is too quick to absorb.

ASP.NET MVC, Ruby on Rails, and Java

In addition to Silverlight picking up steam, ASP.NET MVC has made a big splash, and I feel that Ruby on Rails continues to gain mindshare. While I think some of the "wow, this is magic!" is gone from Ruby on Rails, and folks are seeing it a little more objectively, people are finding a lot to like about it. While I don't see Rails as grabbing huge market share from Java or .NET any time soon, I do see it eating into PHP a bit, since both user bases have a lot in common. Speaking of Java, I have been very impressed by the way the JVM has transformed from a single-language runtime into a hotbed of language innovation, with JRuby, Scala, and Groovy all making serious inroads.

Cloud

While 2009 certainly was not "the year of the cloud" as some might have you believe, it was definitely the year that "the cloud" became a viable target for development. Amazon Web Services has silently become a real force in the cloud, and Azure, Engine Yard, and other entries into the cloud space have provided a lot of choice. While the very real issues of trust, security, and latency are still out there, I feel that the comfort level is quickly rising. I think that development shops are learning that certain applications lend themselves quite well to the cloud paradigm and others do not; more and more shops are seriously considering cloud computing when it makes sense.

The economy

The big issue on far too many people's radar is the economy. At the beginning of the year, an awful lot of folks I knew lost their jobs. Some of them bounced back quickly, some not so quickly. My general take on things is that the bloodletting has ceased, but that it will take quite some time for things to truly get back to "normal" (whatever that is). I truly hope that all of the folks out there who have been affected by the bad times are able to get back on their feet without too much damage.

And that's it for me. I can't wait to see you guys in 2010!

J.Ja

Disclosure of Justin's industry affiliations: Justin James has a contract with Spiceworks to write product buying guides; he has a contract with OpenAmplify, which is owned by Hapax, to write a series of blogs, tutorials, and articles; and he has a contract with OutSystems to write articles, sample code, etc.

---------------------------------------------------------------------------------------


About

Justin James is the Lead Architect for Conigent.

5 comments
Mark Miller

[i]every true industry insider that I've talked to is in 100% agreement that the vast majority of the IT industry is the emperor's clothes and that we are the weavers ripping off our clients.[/i]

Interesting comments. I kind of had a sense of this working in IT service firms. I've worked at more than one that had a nasty habit of promising its clients the Moon. This is just my recollection, but with rare exception I don't think the clients got ripped off. We just wasted their time (well, time is money; what I meant was we didn't take the money and run). We ended up eating whatever the loss was. I didn't like it one bit. I heard stories from many other developers who had similar experiences.

I wondered why IT companies were behaving this way. Didn't they know that by not delivering the goods as promised they were cutting their own throats? Anytime you have an "emperor's new clothes" situation, it means that one group of people really doesn't have anything new to offer, because they lack the skill to deliver it, but they still have exclusive access to a body of knowledge that's perceived as valuable. The reason they can get away with it is that the people with the money/power are too ignorant to notice the lack of substance. It takes me back to a quote from Alan Kay, from the late 1980s I think it was:

[i]"A twentieth century problem is that technology has become too 'easy'. When it was hard to do anything whether good or bad, enough time was taken so that the result was usually good. Now we can make things almost trivially, especially in software, but most of the designs are trivial as well. This is inverse vandalism: the making of things because you can. Couple this to even less sophisticated buyers and you have generated an exploitation marketplace similar to that set up for teenagers."[/i]

All it's going to take to burst this bubble is a competitor that offers a superior system, or uses it to take others down. Then even the ignorant might sit up and take notice that the way things have been done doesn't work anymore, and will be more receptive to a new solution.

Re: mobile development

So is it that there's no demand for enterprise mobile development? I was kind of curious about that. Perhaps it's being killed by cloud computing. Maybe the major apps are moving to that, so anything of business importance can be accessed via a browser, and it doesn't matter where that browser is, whether it's a laptop, desktop, or mobile device. So the apps being downloaded by phone users are more along the lines of helpers and novelties.

It seems like the last platform market that Microsoft was really able to capture was PDAs. Once smartphones started to take over, that's when the BlackBerry and iPhone replaced Microsoft's dominance. BlackBerry took over the business phone market primarily, I hear, because of its unique ability to deliver e-mail to the phone in real time. That seems to have eaten Microsoft's lunch. I think ease of use was the iPhone's claim to fame, that and its cachet. It solved the UI problem that seemed to elude Microsoft.

I take it from past posts you've written about this that a lot of the money going into mobile apps is going for things like digital media, games, and apps that leverage GPS info for maps. As usual, it's the platform features, or a platform's exclusive access to in-demand add-ons, that drive demand, not the quality of the tools or workmanship that support the system software, or even the quality of the system software itself. Microsoft rose to its position that way. It just failed to do that with smartphones, probably because what it missed about them was that they were not simply a new PC platform to most buyers. They were information access devices. And so the important thing was the network infrastructure supporting the device, not the device itself. Google's new Chrome OS is modeled along the same lines.

Like I've said before, the new medium is "digital", not computing. Computing is now just the magic substrate that supports "digital". Hence the reason Alan Kay has said, "the computer revolution hasn't happened yet". What might give a hint to what he was really talking about would be to rephrase that as, "the [i]computing[/i] revolution hasn't happened yet".

Justin James

I spent a long time writing a reply, and the system ate it (partially my fault). :( To summarize:

It's not that there is no mobile enterprise development demand; it's that of the three major platforms, the one that's easiest to develop for AND integrates easily into networks is the one with the least customer demand. BlackBerries integrate OK into corporate email systems (the software is miserable) and support some enterprise functions, but developing for them is miserable. iPhones are a pain to develop for too, and do not integrate into corporate networks at all. Who's winning the mobile war, though? It makes zero sense.

On the other bit, I wasn't referring to the service industry, although that is a problem too. I meant more that we keep rewriting the same applications in new systems because that is easier and safer than trying to actually innovate and do something different. There would be a lot fewer development jobs if that's the way it worked! J.Ja

Mark Miller

[i]It's not that there is no mobile enterprise development demand, it's that of the three major platforms, the one that's easiest to develop for AND integrates easily into networks is the one with the least customer demand.[/i]

Some would argue that this is the way it's been for decades. You and I have just been used to that paradigm, but now we have higher expectations of design, just like our forebears did.

[i]I more meant the fact that we keep rewriting the same applications in new systems because that is easier and safer than trying to actually innovate and do something different. There would be a lot less development jobs if that's the way it worked![/i]

From my experience this is something that businesses have done to themselves. It's not something that the IT industry has done to them. You give a business a choice of innovating the technology or sticking to what they've always done, and they'll do the latter, because it lowers the capital outlay. They want to keep development costs to a minimum. They also want to stick with their work habits. Retraining is another capital outlay. In some cases non-developer jobs depend on sticking to the old habits. What I've always noticed is that they don't care as much about the maintenance stuff, because that's counted as operating costs, and for whatever reason increasing efficiency there is not a high priority. As long as expectations are kept low on that front, no one will see that they can lower their operating costs in the long run by investing more in the capital outlay.

There are a couple of reasonable rationales for this. Some of this is just what I've read. One is that the effectiveness of developers varies, and it's impossible to predict how effective any amorphous team will be from project to project. So one business may by chance hire a stellar team to help them innovate. Another may try the same process and utterly fail. Since it's unlikely that metrics were kept and shared, so that the process could be studied to see where improvement is needed, people look at this and say, "It's just a toss-up." It's a complete crapshoot. No one can even put odds on its success or failure.

The other factor is that since IT is dependent on the technology that's produced by major manufacturers, like Microsoft, IBM, Oracle, etc., there may not be an economic payoff to innovating yourself, because it's assumed that most developers and technology workers have the same skills to some degree. They understand vendor/consortium technology, not your company's custom-made product. So each new employee you hire will have to learn the custom technology--both developers and users. Secondly, whatever innovation you expend resources on (if it gets completed successfully--and that's a big if), some vendor may come out with the same thing sometime down the road for a lower cost, and with fewer bugs in it.

We have a tiered framework operating here, where the innovators go work for software/service vendors who figure out what's in demand and can handle the risks of innovation. The users of IT are the consumers of these products. They're trained to use equipment and software frameworks that are industry standard. All that's required to use these products is technical training--learning languages, frameworks, and equipment operation skills, which schools and bookstores provide (for those who are self-taught)--and organization skills, which management schools purport to train in. In this scheme critical thinking and experimentation (beyond problem solving) are not required. Problem solving focuses the mind on "Here is my problem. How do I solve it?" not "This doesn't exist yet. How do I create it?" In the case of most technology, if you're dealing with skilled developers, the question isn't so much "Why isn't my code working?" as "How do I get this technology to do what I want?" So developers apply their creativity towards their vendor-trained skills, not towards greenfield development. In this scheme developers are assemblers, who put components together or adapt them to interoperate. And the managers of this process are the foremen or "wranglers". Design is the province of the architects, who lay out the general scheme of a system but don't get their hands too dirty. A scientific approach toward design is not required, nor is it contemplated. At least that's the thinking behind it.

The way out of this, in my view, is to rejuvenate a real science of computing, to establish a real engineering discipline in software, and to scale this out in our educational systems (I know, easier said than done). Once that's in motion, the first step in trying to change this dynamic is to reduce the complexity of software/system construction with better architectural ideas (this may require different operating system designs and/or different language designs). The confluence of a more sophisticated way of seeing computing and more sophisticated architectural systems that reduce complexity will in turn lower the risk of innovation. Getting there is the huge boulder that innovators in computing will have to push up the mountain. In order for this to really work, though, CS is going to have to be brought out of its silo so that others can learn what it has to teach. Business managers will need to understand something about computing as a discipline, and the value of science and modern engineering, in order to recognize good talent and good technologies. Standard MBA training is not going to cut it.

Justin James

What were your thoughts on 2009, as far as software development goes? J.Ja

Jake_ Robert

Good topic. I have gathered a lot of useful information about cloud computing and its technologies from the Cloud Computing Conference 2009, the world's largest and first annual virtual conference on cloud computing events and innovations. I found the information about the conference at Cloudslam09.com.