Software development projects are tricky enough without adding bad technology choices to the mix. Over the years, I have learned some painful lessons in picking software. You can benefit from my mistakes by learning these 10 lessons without having to experience the pain firsthand.
1: Don't expect a tool to be a silver bullet
One of the biggest mistakes people make when picking software to improve the development process is thinking that problems magically go away when you wave technology at them. This simply is not true. I have seen countless ALM systems littered with items from the first month or two after installation, and then nothing at all. Were those systems bad? No. The people simply failed to use them. If your team isn't going to take the time to learn the tool and really make it part of the process, do not even bother.
2: Beware the integration
Something we frequently see on the systems side of things is that the effort of integration can turn even a free software package into a budget black hole. This holds true for software development tools too. The "cost" of integration is more than the resources to tie it into existing systems; it is the interruption in the development process, the time to unlearn old habits, and more.
3: Don't get fooled by "total packages"
Most of the time, when I have been sold on a "total package" to solve a problem, it turned out to be a bust. Even when the packages worked, they were too cumbersome to get much done in. A great example of this is Microsoft SQL Server. While it is a very good database server, the various add-ons that come with the suite (particularly SQL Server Reporting Services) range from "not bad" to "ugh," and I have learned that it is better to pair the base package with add-ons from elsewhere.
4: Do a pilot program
A commonly quoted statistic says that 70% of IT projects are deemed failures, and based on my experience, that number is just about right. So why stake your entire development process on a project that has a good chance of not succeeding? This is why I've learned to do pilot programs. Choose a smaller, newer, or less risky project to use the new tool or technique in, and honestly evaluate the results. Without the pressure of "the whole thing falls apart if this doesn't work" or "we'll sink the entire team if we picked poorly," it is a lot easier to avoid slipping into the sunk cost fallacy and trying to force a bad product to work.
5: Be ready to throw the pilot out
Speaking of the sunk cost fallacy: resist the temptation to turn your pilot program into the production system. There are some good reasons not to do this:
- The pilot program may not have been integrated or configured in the best possible way.
- The pilot program may reveal that the package is not right for your organization.
- The pilot program should not represent such a deep commitment of resources that it is a "point of no return."
- Committing to what is supposed to be a "test" and turning it into "production" is a bad policy in general.
Keep your pilot program isolated. Remember, you are potentially going to be stuck with your decision for a long time. It's better to have one small project try the new tools, fall short of a stunning success, and back out than to commit the whole team to those tools and then have to turn around. I've learned this lesson the hard way by watching "teething pain" decisions from the pilot program get baked into the system long term.
6: Consider the team's background
You can take the greatest tool in the world and give it to the wrong team and problems will come up. For example, I think Mercurial is a fantastic version control system, but folks from a more Waterfall-like background often struggle to see the benefit. At the same time, while those same people feel comfortable with Team Foundation Server, I do not feel that the full Team Foundation Server package is a good choice for people coming from an Agile background.
7: Evaluate the learning curve
Some tools are simply easier to learn than others. A tool that's easy to learn and use is almost always more effective than a powerful package that people struggle to master and incorporate into their work. For instance, I've been on countless projects managed by Microsoft Project or similar tools, and they almost always become a mess when people stop updating the project plan because the tools are such a hassle. But I recently switched to Trello, which has a low barrier to entry and use, and the result has been a smooth-flowing project. Is Trello as full featured as Project? No way. But a tool people use is always better than one they don't.
8: Make sure it addresses a real problem
It is easy to get wrapped up in the excitement of trying out a new technology, but it is important to determine whether it addresses a real problem you have. A few years ago, I spent a lot of time checking out F#. And honestly, it was a really neat thing to look at. It appealed to the part of me that is fascinated by algorithms and how to program them. But it just did not solve any problems I faced, because I was not doing algorithm work at the time. Could I have replaced C# with F#? Yes. But it would have been a real mess for little gain. I had the same experience with parallel processing: even though various languages and systems have made it much easier to write multithreaded code, few applications actually justify the techniques.
9: Check out how "open" something really is
The word "open" becomes more and more meaningless every year, thanks to the marketers and spin doctors trying to hawk their wares. I've seen many systems billed as "open" when they are anything but. The good news is that this is an easily verified claim. Is the source code available, or do you need to pay extra to see it? What is the licensing like? Does the system use common standards? Are those standards free to use and build on? "Open" does not necessarily have to mean open source (though that helps) or "licensed under a copyfree or copyleft license" (those help too). But it does need to mean that you can work with the system the way you want to.
10: Vet the vendor
As customers, we are so used to being used and abused by our vendors that we forget it does not need to be this way! While every relationship is unique by definition, I've learned to pay attention to a number of red flags during the selection process. This list is not exhaustive, but it will help you spot a vendor that will be hard to work with:
- Does the vendor participate in its own user forums/communities?
- Does the vendor suppress dissent on those forums and communities? Or does the vendor acknowledge it, embrace it as criticism, and try to fix things?
- Does the vendor have easily understood and customer-friendly licensing and pricing terms?
- Is the vendor clearly trying to lure you in with a low purchase price and nail you with a big maintenance contract?
- How long is the contract? Do you feel like you need a lawyer to understand it? Does it have unusual clauses in it?
- Are tons of people complaining about their customer service or policies?
- Does the vendor offer an "exit option" for your data?
- Does the vendor adhere to industry standards or use proprietary standards and formats?
- Can you call people directly or is every request shuttled through a CRM system before you eventually get help?
- Can you get pricing from the vendor's Web site, or do you need to fill out a form and wait for a salesperson to call you back?
Justin James is the Lead Architect for Conigent.