
How entry-level developers are being squeezed out of the job market, and what they can do about it

Entry-level developers can overcome the increasingly steep barriers to entry in the field, but only if they are willing to invest the necessary time to get some hands-on experience using the tools that most shops use.

By all measures I have seen, the job market and demand for developers is strong. And every prediction out there, both formal and "best guess," is that demand for developers will only get stronger as the future depends more and more upon software. You would think that this would be a great time to come onto the job market, but unfortunately it isn't so. It is increasingly difficult to get work as an entry-level developer, and without a major shift in how most software is developed, it will only get harder.

The culprits here are primarily .NET and Java, though the increasingly complex nature of Web development in general isn't guiltless either. The languages themselves are getting more complex over time. And yes, it is possible to say, "well, no one is forcing entry-level developers to know all of the dark corners of C# to be a success." But what happens when the senior architect sets up the whole thing so that LINQ is the primary driver of the architecture?

Look, I think LINQ is awesome, but the fact is, I had to read a fairly long, in-depth book to learn it. And the same applies to XAML/Silverlight/WPF/WinRT/WinForms, Entity Framework (or NHibernate), SQL, ASP.NET MVC, and a whole host of other items that are the bedrock of a modern .NET application. There is a pile of knowledge you need to acquire, and you need to know how to apply it in a project, just to be considered a "competent" .NET developer. Meanwhile, people are graduating from college with 36 credit hours of total "computer science" (and increasingly, "information systems" or similar degrees). That is barely enough to teach them the fundamental principles of programming, let alone the tools and information they need to survive in the .NET environment.
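
To make that concrete, here is a minimal, self-contained sketch (mine, not from any particular project) of the kind of LINQ code a junior developer meets early on; the Order type and the sample data are invented purely for illustration:

    // Hypothetical example: summarize spending per customer with LINQ.
    using System;
    using System.Collections.Generic;
    using System.Linq;

    class Order
    {
        public string Customer { get; set; }
        public decimal Total { get; set; }
    }

    class Program
    {
        static void Main()
        {
            var orders = new List<Order>
            {
                new Order { Customer = "Alice", Total = 120m },
                new Order { Customer = "Bob",   Total = 45m },
                new Order { Customer = "Alice", Total = 80m }
            };

            // Group, aggregate, and sort in a single query expression.
            var topCustomers =
                from o in orders
                group o by o.Customer into g
                orderby g.Sum(x => x.Total) descending
                select new { Customer = g.Key, Spent = g.Sum(x => x.Total) };

            // Deferred execution: the query runs only when it is enumerated,
            // exactly the sort of "dark corner" that surprises newcomers.
            foreach (var c in topCustomers)
                Console.WriteLine("{0}: {1}", c.Customer, c.Spent);
        }
    }

Nothing in it is exotic, yet it quietly assumes comfort with generics, lambdas, extension methods, anonymous types, and deferred execution.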

I look at the baseline knowledge needed to make it in a modern .NET or Java shop, and I realize that if I had been faced with that reality upon graduating from college, there is a solid chance that I would have washed out of software development.

And that is exactly what I am seeing with recent college grads. Those who were taught in a classical computer science environment know the principles but seem to be sorely lacking in real-world languages and systems. Those who went to a more "modern" school that taught the "latest and greatest" simply have not been armed with the proper fundamentals. To put it another way: C# and Java (and the .NET and Java ecosystems) are really poor environments in which to learn programming skills, while the systems that are good teachers are not the ones most shops use. And what use do most shops have for people who not only lack experience in general, but also need to be taught the language and frameworks?

None of this makes it a good time to be an entry-level developer.

So that's the bad news. Is there any good news? I struggle to see a silver lining in this cloud, but I do feel that entry-level developers can do things to make it easier to get hired.

The most important thing (and I hate to sound like a broken record here) is that they must get real-world experience. End of story. Unless schools suddenly start devoting the full 120 credit hours for graduation to programming, ignoring general education and minors, or students combine undergraduate work in a solid, theory-oriented program with an internship or a Master's program rooted in real-world development, it is a guarantee that a recent graduate has gaps in their basic toolset that will take years to fill before they can be effective in a typical programming environment.

Now, there are places where you can become effective with far less learning. A Ruby on Rails project, for example, carries significantly less overhead before you become productive. The Agile Platform tool that I have been working with is the same way. As a result, I feel much more comfortable hiring someone with less real-world experience (or without knowledge of the full stack) for these systems, simply because the "full stack" is short enough to learn in a reasonable amount of time.

But for the entry-level developer, openings for these kinds of positions are rare. The average person will be looking at breaking into Java or .NET development, and those jobs are going to need a working understanding of a large portion of their stacks. I can't speak to the specifics of the Java stack, but for the .NET stack, I would want to see that an entry-level developer had put together a project (as an intern, on their own time, for an open-source project, or as a volunteer at a charity or non-profit) using the following technologies (a small sketch of such a project follows the list):

  • C#
  • ASP.NET MVC or a XAML-based system (Silverlight, WP7, WinRT, WPF)
  • HTML5, JavaScript, CSS (for someone with ASP.NET MVC experience)
  • Entity Framework or NHibernate (or similar)
  • For bonus points, SQL and LINQ would be great too.
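
To give a feel for what such a project might look like, here is a hedged sketch of a single ASP.NET MVC controller backed by Entity Framework code-first and queried with LINQ; the Post, BlogContext, and PostsController names are hypothetical, invented for illustration:

    using System.Linq;
    using System.Data.Entity;
    using System.Web.Mvc;

    public class Post
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public bool Published { get; set; }
    }

    // Entity Framework code-first context: one table, no extra configuration.
    public class BlogContext : DbContext
    {
        public DbSet<Post> Posts { get; set; }
    }

    public class PostsController : Controller
    {
        // GET /Posts: pull published posts with LINQ and pass them to a view.
        public ActionResult Index()
        {
            using (var db = new BlogContext())
            {
                var posts = db.Posts
                              .Where(p => p.Published)
                              .OrderBy(p => p.Title)
                              .ToList(); // materialize before disposing the context
                return View(posts);
            }
        }
    }

Even a toy like this exercises C#, MVC routing, an ORM, and (behind the scenes) the SQL that Entity Framework generates, which is exactly the point about how much baseline knowledge a "simple" .NET project assumes.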

In truth, simply seeing one or two small projects leveraging these technologies would be great, and a senior in college should be able to either build a small project on their own or find a volunteer or internship situation that offers the opportunity to do so.

So, entry-level developers can overcome the increasingly steep barriers to entry in the field, but only if they are willing to invest the necessary time to get some hands-on experience using the tools that most shops use.

About

Justin James is the Lead Architect for Conigent.

Comments
jp273934

I am a recent university grad in software engineering. I will tell you what the problem is: employers do not want to hire you without experience in a business environment, regardless of what personal projects you have done. I built a website as my portfolio. I built a mobile app for my wife to use on her Android device. I built services in the .NET Framework during the internship I was doing in my last semester of college. Yet I have lost count of how many times I have been turned down for an interview because I don't have enough real-world experience. Everyone wants you to have a bachelor's degree, but no one wants to hire you unless you have real-world experience in a business setting.

junk

This problem occurs for those of us who have been in the business for a long while, as well as for the entry-level folks. I've just completed a 12-year run at the same company developing embedded industrial controls software/firmware. Stability and familiarity are far more important there than having the latest, flashiest version of Windows, so we stuck with the same set of tools and ideologies the whole time. Now I've begun looking for my next gig, and I find that my skill set may not be what companies are looking for, despite 30 years of experience. The result is the same as for the newbies: no job offers! I have a leg up in that I have brought many projects to market over the years with good success and know the overall development process. But the specific tools and ideologies have taken a huge and very fast shift, and it will take me months of self-study to appear competent to potential employers. It's a nasty side effect of our industry!

greetings

I concur with CodeCurmudgeon. Even during the dot-com era, most positions generally required at least two years of experience. Internships and some part-time work experience are pretty essential, because it's impossible for most colleges to keep up with the rapid changes taking place in the field. The silver lining is the increased specialization within development. There really is no such thing as a generic developer or programmer: it is a niche field and has been since the Internet came of age, and the specialization is only escalating because of cloud services and the explosion of APIs. At any given point, one can find 70 or 80 different job titles/functions or more, and growing. Just as entry-level staff don't necessarily need to know any HTML or CSS to do .NET or Java development, they likewise don't necessarily need to master either of those languages if they can master an API or a CMS. At any given time, one can search Monster or Indeed and see openings for Facebook Developer, Drupal Developer, Wordpress Developer, iPhone Developer, or whatever development need an organization has, and that's an indication of how specialized the field has become.

CodeCurmudgeon

By my own recollection, entry-level jobs have always been scarce: even back in the days of COBOL and FORTRAN, employers wanted 3-5 years of experience with exactly the technology they use. For that matter, they still want that sort of experience with exactly the technology they use, never mind that you may have decades of experience with other technology.

SaadHusain

Pairing an intern with someone experienced, and mixing experts and novices on programming teams, helps disseminate expertise throughout the organization. Walkthroughs where the developer shows the rest of the staff how the code works also help produce quality code and spread knowledge. Remember that developers want to work at places where they can learn.

jkameleon

> By all measures that I have seen, the job market and demand for developers is doing well. [...] It is increasingly difficult to get work as an entry-level developer.

The job market isn't doing that well if it's difficult to get work. If there were indeed strong demand, as you claim, employers would gladly be offering young people an opportunity to gain experience. But they aren't.

> Look, I think LINQ is awesome, but the fact is, I had to read a fairly long, in-depth book to learn it.

IMHE, the "just start using it, and learn as you go" approach is the best here. .NET is very well designed, there are no hidden caveats, and what you don't know (usually) doesn't hurt.

> And that is exactly what I am seeing with recent college grads. [...]

The fact is that you need three things in order to survive in software development: basic principles (mathematics and such), real-world experience, and problem domain knowledge. School can and should give you the basics, and enough general education to understand the problem domain. As far as real-world languages and systems are concerned, it's every man for himself. They change a lot, and keeping one's skills current is an individual responsibility.

HypnoToad72

I am in college right now for programming and development, and the tools being used are sometimes (but not always) behind the times. Whether this is due to a lack of desire on the colleges' part to keep updated, or to not finding enough instructors to teach the languages, and with college costs being far higher than entry-level salaries justify, one would say that workers wanting to go into this field are being squeezed from both sides. It is true that technology changes often, but teaching out-of-date methodologies does not help students. Given the costs of college, volunteering and unpaid internships can only go so far as well. Meanwhile, college costs go up, but instructors' wages don't, and neither the equipment at colleges nor the technologies being taught are improving. Indeed, as more classes become web-based, students will spend more on their own equipment and broadband. Not to sound pessimistic, of course, and granted this response can't begin to cover every nuance, but students in particular shouldn't have to be held responsible for others' failings. We're all in this together, and if companies also want people with higher degrees along with the skills, there is a disconnect.

Tony Hopkinson

I'd put the lack of entry-level positions down to offshoring and the economic downturn. It's been bleak for a good while; even if it picked up, it's going to take a while for academia to catch up, not to mention for development to filter back down as a career choice worth gearing your education toward. The education being provided is not a pretty picture at all, not even close to being geared to the work environment: either XXX cookie cutters, or people very familiar with big-O notation...

Tony Hopkinson

But I see your situation as more self-inflicted than the one facing entry-level developers. You've had twelve years to perceive and address it; I'm at a loss to explain how you could have been caught so unprepared. Trying to get out into the wider industry is going to be difficult: as far as the recruiters are concerned, you might as well have spent the last twelve years doing flower arranging and then career-switched to IT. Rule one of our game is keeping yourself up to date, and employers who stay at the forefront technologically are a bit rare.

Tony Hopkinson

Cheap glorified clerks. Either students are taught to code in a language (not to program), or they are taught how to program but not how to develop.
