
Development trends to watch in 2010

Justin James talks about what he thinks will be important development trends in 2010. He covers .NET 4, Visual Studio 2010, cloud computing, and more.

 

Welcome to 2010! I took a look back at 2009 at the end of the year, and now I am summarizing my thoughts and ideas about what topics will be important in the software development industry in 2010.

.NET 4 and Visual Studio 2010

One of the big items for 2010 is the release of .NET 4 and Visual Studio 2010. I anticipate that this will be much more revolutionary than the release of .NET 3.X and Visual Studio 2008 for the following reasons:

  • Visual Studio 2010 will fully and properly support all of .NET 4, unlike Visual Studio 2008, which had lackluster support for many of .NET 3.X's features.
  • .NET 4 finally brings C# and VB.NET into close feature parity; the new C# features will make interacting with Office much easier, and VB.NET will be able to play well with lambdas, both of which are really important capabilities (see the sketch after this list).
  • F# will be a full-fledged member of the .NET ecosystem, bringing functional programming to the masses.
  • ASP.NET MVC will now be an "out of the box" experience, as will the Web Platform Installer.
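
Here is a minimal sketch of one of the C# 4 features behind the Office story: optional parameters and named arguments. The SaveDocument method and its parameters are hypothetical stand-ins (this is not the actual Office interop API); the point is that a caller no longer has to spell out every argument, which is what used to make Office COM calls such a wall of placeholders.

    // Hypothetical example of C# 4 optional parameters and named arguments.
    using System;

    class InteropSketch
    {
        // Optional parameters carry default values in the signature.
        static void SaveDocument(string path,
                                 string format = "docx",
                                 bool readOnly = false,
                                 bool addToRecent = true)
        {
            Console.WriteLine("Saving {0} as {1} (readOnly={2}, addToRecent={3})",
                              path, format, readOnly, addToRecent);
        }

        static void Main()
        {
            // Named arguments let the caller skip straight to the one value it cares about.
            SaveDocument(@"C:\reports\q4.docx", readOnly: true);
        }
    }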

Cloud computing

I have my doubts about cloud computing, and so do a lot of Programming and Development readers. We all know the laundry list of concerns (most of which can never be fully alleviated): security, privacy, network latency when integrating with in-house systems, "is the vendor doing what it says it's doing?" and so on. All the same, some cloud computing vendors (e.g., Amazon Web Services) have built a solid reputation, and developers are seeing that, in many scenarios, cloud computing makes sense despite potential or real risks.

I expect to see more developers use cloud computing in 2010. While you might not need to start using it yet, get familiar with cloud computing, so when your boss asks you about it, you look like a genius.

Ruby, Scala, Groovy, Python, and other programming languages

In 2009, a whole host of alternative languages established themselves as more than niche players. A lot of companies are seeing not just the value proposition of many of these languages, but also that the risks are not nearly as bad as they were perceived to be a year or two ago. While it may still be hard to find employment as a full-time, W2 employee working in many of these programming languages, there is lots of room for consultants to make a living, or for developers to use them on a few projects. This is great news, and I want to see these languages have more of an impact going forward.

The JVM's renaissance continues

A few years ago, the only language running on the JVM was Java. If you wanted to use multiple languages on the same runtime, .NET was where it was at, and your options were C# and VB.NET. Now, the JVM has established itself as the premier multi-language runtime, with JRuby, Scala, Groovy, and Jython all looking like seriously useful systems. Meanwhile, IronPython feels half ignored, and IronRuby still can't get out of the gate on the .NET platform.

The economy

The economy is still the big bull in a very fragile china shop. From what I feel (pure gut instinct, folks), I don't think massive layoffs are still happening for IT workers, and I have been seeing a trickle of hiring in certain types of jobs. I think that highly skilled superstar programmers can find work if necessary, but they may need to relocate or not get as much of a pay hike as they would like. My instinctual reaction is also that entry-level and intermediate developers are still very vulnerable, as many companies perceive them to be "a dime a dozen," and their jobs are potential targets for offshoring.

If I were an entry-level or intermediate developer, my plans for 2010 would be:

  • Learn some advanced and/or cutting-edge development techniques: parallel programming, game development, component design, etc.
  • Merge development skills with industry-specific skills to add value. For example, don't just implement the algorithms the business analysts hand to you -- learn to develop the actual algorithms.

Buzzword alerts

Something I see time and time again is that an idea works well in certain shops that have the right attitude and people, becomes popular as a result, and then fails as other companies try to implement it. The buzzwords that succeed in the long run are the ones that have enough perceived risks to slow adoption. Slow adoption rates mean that people have time to explore the possibilities and learn to mitigate risk. Super-hyped ideas don't get that maturation period; instead, people rush headlong to implement them and then give up when they don't get the promised results.

Last year, SOA (and its buzzword predecessor, SaaS) lost a lot of its shine as big companies without a lot of IT dexterity tried to implement SOA and turned those projects into the usual enterprise IT boondoggles for the usual reasons (wrong people, lack of passion, too much red tape, "too many cooks," and so on). I highly doubt you will see many new SOA implementations this year.

Mark my words that Agile is about to go down this same route. As more folks hear about the benefits of Agile, and as more "gurus" write books, more pointy-haired bosses become willing to try it. We all know what happens when some Seattle or Portland hipster cool concept gets implemented by pointy-haired bosses: total failure. It's like watching your grandfather try to skateboard.

J.Ja

Disclosure of Justin's industry affiliations: Justin James has a contract with Spiceworks to write product buying guides; he has a contract with OpenAmplify, which is owned by Hapax, to write a series of blogs, tutorials, and articles; and he has a contract with OutSystems to write articles, sample code, etc.


About

Justin James is the Lead Architect for Conigent.

24 comments
jean-simon.s.larochelle

I actually started playing with JavaFX over the holiday. Now that things have settled down (version 1.2), it is starting to get quite interesting. I'm using NetBeans 6.8 on Linux and having a lot of fun.

mattohare

I see a lot of potential with both. You're right, Justin, that there is a long way to go, but that doesn't mean we won't do it. The cloud vendors and consumers will work together to establish adequate security and privacy. We may see some third parties that provide consumers with the tools to encrypt what goes into the cloud. If I knew they couldn't be read, I'd give my deepest secrets to FOX or the Belfast Telegraph for storage if the price was right.

Rails is another story. The Controller and View parts of the MVC can be so natural that simple site development takes little effort. The underlying language, Ruby, can be just as natural for coding. ActiveRecord is a good start for minimal database needs, but it is not strong in scaling and integrity. I think we'll need a new object model from the ground up. While the Ruby language is natural, it could do better with Unicode and multi-threading.

You didn't mention multi-threading, but I think that is something that will come in with cloud computing more. It comes with the idea of letting a user keep working while something they started completes in the background. For example, an Instant Messenger application doesn't need to stop while one window is busy (sending a file or even a text message). The application simply acknowledges the completion when it happens.
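
A minimal sketch of that background-completion idea, in C# since that is this column's usual territory: queue the slow piece of work on the thread pool so the rest of the application keeps going, then report when it finishes. The "file transfer" here is simulated with a sleep; names are illustrative.

    // Illustrative only: simulated "send a file while the user keeps chatting".
    using System;
    using System.Threading;

    class BackgroundSendSketch
    {
        // Matches the WaitCallback delegate signature used by the thread pool.
        static void SendFile(object fileName)
        {
            Thread.Sleep(2000); // stand-in for a slow file transfer
            Console.WriteLine("Finished sending {0}", fileName);
        }

        static void Main()
        {
            // Hand the transfer to a background thread instead of blocking.
            ThreadPool.QueueUserWorkItem(SendFile, "holiday-photos.zip");

            // Meanwhile the rest of the "application" stays responsive.
            for (int i = 0; i < 6; i++)
            {
                Console.WriteLine("Still chatting in another window...");
                Thread.Sleep(500);
            }
        }
    }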

Tony Hopkinson

To get the cloud to be efficient, modularisation and parallelisation are a must. Just trying to run our standard monolithic CRUD apps, if the relational database space is affordable, is not going to cut it. I see it going more towards a lot of loose coupling and mainframe-style batch processing myself, rather than multi-threading. Same sort of issues, but on a 'process' per core basis. Certainly on Azure, moving away from relational to object-style databases (document/property bag) is going to be far more cost effective.

mattohare

I bet I'll love it once I make the transition. That was the way it went when I moved from all-procedural to procedural within object-oriented programming. I think there are times when I do seem to think towards the object style.

Tony Hopkinson

Lose his temper, that one.... I can see why they call it orphaned; abandoned or neglected would be more accurate, though. The parents weren't killed in an accident, they were smoking crack and used the kid as a pipe holder.

mattohare

All of my databases have some relational structure. And I let the database engine do the work. I can show you some permanent scars from having urgent orphan data and the hard-disk equivalent of a memory leak. My opinion of what I've seen of others that 'do it in code' is that they don't want the user to ever see an exception thrown. You know what? They're going to happen. And they may happen because you've lost a parent (that will never come back) or run out of storage space. The referential integrity exception will probably come at a better time (during the delivery/shakedown period) than the other two exceptions (during period-end financials, Christmas dinner, a Six Nations match).
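
A minimal sketch of "letting the database engine do the work": with a FOREIGN KEY from Orders to Customers in place (the schema and connection string here are hypothetical), an orphan insert fails on the spot with a SqlException instead of quietly leaving bad data behind.

    // Illustrative only: the engine, not application code, rejects the orphan row.
    using System;
    using System.Data.SqlClient;

    class IntegritySketch
    {
        static void Main()
        {
            using (var conn = new SqlConnection(@"Server=.;Database=Shop;Integrated Security=true"))
            {
                conn.Open();
                var cmd = new SqlCommand(
                    "INSERT INTO Orders (OrderId, CustomerId) VALUES (@id, @cust)", conn);
                cmd.Parameters.AddWithValue("@id", 1001);
                cmd.Parameters.AddWithValue("@cust", 99999); // no such customer

                try
                {
                    cmd.ExecuteNonQuery();
                }
                catch (SqlException ex)
                {
                    // The referential integrity violation surfaces immediately.
                    Console.WriteLine("Caught at insert time: " + ex.Message);
                }
            }
        }
    }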

Tony Hopkinson

I've seen one or two the other way, though. It's the same with any tech: go too mad with it, and you end up with something that can only be described as insane.

Justin James

... I meant when *not* doing a lot of JOINs. Most apps don't work that way, even though the DB diagrams do. :) Most apps just need to pull a few rows and maybe JOIN them to one or two other tables. After all, if the app really needed it all like that at once, imagine what the UI would look like to make sense of it. J.Ja
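
A minimal sketch of that access pattern (hypothetical schema and connection string): pull a handful of rows for one customer and JOIN to a single related table, rather than dragging the whole diagram along.

    // Illustrative only: a narrow query with one JOIN, which is all most screens need.
    using System;
    using System.Data.SqlClient;

    class NarrowQuerySketch
    {
        static void Main()
        {
            using (var conn = new SqlConnection(@"Server=.;Database=Shop;Integrated Security=true"))
            {
                conn.Open();
                var cmd = new SqlCommand(
                    @"SELECT o.OrderId, c.Name
                        FROM Orders o
                        JOIN Customers c ON c.CustomerId = o.CustomerId
                       WHERE c.CustomerId = @cust", conn);
                cmd.Parameters.AddWithValue("@cust", 42);

                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
                    }
                }
            }
        }
    }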

Tony Hopkinson

your design is more fit for storage and possibly data integrity than for practical use. There are well-documented ways round that, but they increase storage demand, possibly bandwidth requirements, and reduce data integrity. I can live with that; it's people not recognising the consequences of the shift that get me.

The largest single advantage of putting relations on your data isn't fast searching, nice diagrams and such. It's our code falling over when we make a dumbass mistake like having an order without a customer....

We know they are going to want to link from one document to a related one. We know they'll want to know the linked document has changed, to cascade any common information into this document, a new version of it, or optionally pick and choose. We can do that, but it's more code, more metadata, more complexity, tighter coupling, less cohesion. And that costs a lot now, and even more later. The question has never been can we, it's should we. My question to a BA when given a requirement along these lines is: would you like it to make you a cup of tea as well? A bit more business required in the analysis, methinks.

Justin James

When writing a framework, it makes the back-end code much, much easier if you treat the DB like a flat file. And, in the vast majority of use cases (where you are JOIN'ing massive numbers of rows, or doing highly specialized queries), despite the inefficiencies, it won't be noticeably slower. I know, that's not the "do things as well as possible all the time" mindset that guys like you and me work on, but it does fit in with the current "forget CS theory, let's get it done NOW" mindset that has moved from being an unpleasant necessity in corporate development to being embraced by many developers. J.Ja

mattohare

I see a lot of development talk that thinks of the database layer as nothing better than a collection of flat files or separate tables. Relationships happen 'in code' instead of in a database engine. The surprisingly bad form I found was in Rails. I was so impressed with the object-oriented layout of the presentation, I expected the database layer to be just as sound. As you saw in another post here, I found the database layer to be no better than secondary-school beginners' hacking. And now we're talking about a new generation of database philosophy? That aside, if we are to go object-oriented in databases, I still think there's room for the table-column relational philosophy with it. One should be able to define what should be in sub-nodes of a particular node.

Tony Hopkinson

The approach makes some things easy, others unpalatable. The switch, though, is far from trivial code-wise, and something like reporting, particularly ad hoc, needs a serious think-through. Basically it's going back to a hierarchical database at best, sometimes more like a file-based system. Getting management to appreciate what it will mean to the cost of the feature set is difficult at best, and as usual they want everything yesterday with no disruption. I can see more than a few who are believing the "it will be cheaper" Gartner bollocks, just bunging a basic monolithic single-threaded CRUD app into the cloud and then blaming us crappy developers when it doesn't work, mainly because the application's design either was, or is now, wholly unsuitable for modularised distributed processing.

Tony Hopkinson

Agile is not and never was new. Done properly, it is just more, erm, agile :p than classic waterfall. Maybe it's where I gained my experience, but the only time I've used it is in class. MoSCoW and spiral were far more common, and they've both simply been rebranded as agile, after the total f'ing disaster that was RAD.

Mark Miller

I liked the last part of your post, because this has happened too often in this industry. They take a good idea, screw it up, and after a lot of failures think that the original idea wasn't any good to begin with. Blind reasoning leading to more blind reasoning. I would've thought that agile would've been discredited by now by this same process.

I heard from a friend who worked at a software company a few years ago, a place that I thought would've been using professional methodologies (it shall remain unnamed). He told me this funny story about how they had formed a new Java team of about 5 people. They said they were going to use agile, but when it came down to it, they were just working ad hoc. There was no discipline to it at all. The team couldn't even decide what they were working on or what the goal was. Everyone was just off doing their own thing. It was chaos, worse than any environment I've worked in, by the description. The team and the project ended up falling apart, due to professional mismanagement. It sounded like a comedy of errors.

Justin James

... is a great example of how these things get mangled. When I was in college, I suddenly found myself a bit heavier than I ever expected to be; I guess my metabolism shut itself down a bit as I got older than 18. :) I had heard about the Atkins diet, but instead of actually learning about it in depth, I took hearsay information from people who didn't understand it as gospel. In a nutshell, I thought I was allowed to eat anything and everything, as long as it wasn't a carbohydrate. What happened? I was eating (literally) PLATES of bacon *and* 18 - 24 hard boiled eggs for breakfast, salami and other fatty lunch meats 1 - 2 lbs. at a time for lunch, and who knows how much meat and cheese for dinner. After a month, my entire body was ready to shut down.

That's what Agile is like for a lot of companies. They don't actually learn about it, they *hear* about it, and then try to invent something like Agile based on what they heard. Up until a year or 18 months ago, I had only *heard* Agile described as, "dump the project managers, have the programmers talk directly to customers, and release every 2 weeks," more or less. Needless to say, I was horrified. As you say, "... just working ad hoc. There was no discipline to it at all." Then you have stuff like Scrum; if you look at Scrum closely, it's really Waterfall in a really tight loop. Needless to say, Scrum doesn't impress me much.

It turns out, what Agile is *supposed* to be is a reaction to the problems with Waterfall. It encourages flexibility in plans, and architecting for change, so that as customer feedback comes in, the development roadmap can be changed without needing to rewrite the core. That makes a heck of a lot of sense. But if you look at what most people hear Agile to be, it's total nonsense. And it's hard to distill Agile into the kind of methodology that can be taught and put into books or have certifications. It's a mindset that requires the right combination of people and policies, and there is no way it can be legislated via management into existence. J.Ja

Mark Miller

was with the Waterfall model. When Waterfall was originally conceptualized, it had feedback loops. So there was always the possibility of backtracking from coding, to design, to requirements, but almost every company out there that used it didn't include them. I think it mostly has to do with budgeting.

When I worked in IT services, we usually had fixed-bid contracts. This always meant we had to provide an estimate up front before we started coding. This would determine our budget, and then there was a deadline. Whenever I'd estimate, I'd always just look at how long it would take me to design, code, and unit test my modules, straight through. Then I'd wonder why my estimates were off. A manager I had finally tipped me off to the fact that we had had, and were going to be having, team meetings during each project, and that I (we, the whole team, actually) needed to include the time for those meetings in our estimate. That was a light bulb moment. :) We never included time for evaluating the design as a group, or updating the requirements documentation (going through feedback steps). Of course the requirements and design changed, but those were recorded in meeting notes and e-mails--addenda, if you will. :|

We were exposed to the ideas in different estimating strategies when I took SE in college, but never in great detail. We didn't get a chance to practice them. I later learned from a friend that they covered that stuff in graduate school. I don't know if they still do, but master's programs typically had a project management course where they actually educated students about making reliable time estimates. It was kind of out of step with the times, if you ask me. It wasn't PMs that had to make estimates. It's been the developers who've had to do that.

mattohare

They were strictly small-time, between-friends stuff. We wanted to stay friends after the projects.

Justin James

... then you are a pretty lucky person. :) I've worked with a few customers like that, but they are very, very rare. J.Ja

mattohare

Mind, this only works with a client that respects the vendor and the work they do.

Justin James

The IT profession has a fee structure that ensures that at least one party gets burned most of the time. Flat fees per project kill the vendor due to changes; per-hour billing kills the customer due to changes. :) J.Ja

Tony Hopkinson

Works quite well based on our workload. In many ways it's formalised how we worked anyway, but with a nice slant towards management prioritising based on a 'plan', as opposed to reactively or, worse still, on some sort of marketing whim. Personally I'm a big fan of choosing a lifecycle that gets the work done, foolishly naive I know.....

mattohare

GRAND SLAM! :D How about that! I see it happening for another year. You think Wilkinson's going to be up to scratch this year?

Tony Hopkinson

Eye gouging, stray kicks, forcefully standing on someone, pubic twists, fish hook, a stray boot... All useful metaphors in the cut and thrust of a rugby game... :D

mattohare

I'm looking too closely at the origin of the metaphor (real rugby matches, especially in the Six Nations) and just can't make the connection. Maybe if they called it Charlie or Liam, I'd get my mind around it.