The world of software development is changing. Development tools and frameworks promise to improve the speed of development, and new agile techniques are changing how we approach the process itself. Whether your organization has adopted new tools, new methodologies, or is just watching what's going on in the market, you've got to account for the compression of the construction phase of the software development process.
Silver bullets
Although there are no silver bullets in software development (see Fred Brooks' seminal work The Mythical Man-Month for more on silver bullets), we continue to make incremental improvements to the software development process. You might think of it as introducing a little bit of silver into lead bullets. Many of the improvements we do have are improvements to the construction part of the process.
Figure A: The Mythical Man-Month
New tools like Visual Studio 2005 have editor technologies that improve development speed by making it easier to recall the properties and methods of the objects you're working with. The compilation process has become largely interactive: most syntax errors are identified and flagged while you're editing, so they can be fixed without running a compilation cycle.
In addition, more of the basic functions are available in the form of both built-in foundational libraries and third-party open or shared source libraries. Most of the broad application objects have already been written, and increasingly these libraries are bundled into the core foundational libraries. This means that when you create a feature point within an application, there's a growing likelihood that the code is already written; all you have to do is connect it to the rest of your application.
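To make that concrete, here's a minimal sketch of the kind of glue code involved: pulling item titles out of an RSS feed using nothing but the XML parser that ships with the platform. The example is in Java purely for illustration, and the feed URL and class name are placeholders; the same pattern applies with the .NET Framework classes the article has in mind. The point is that the interesting work is already written, and the feature point is mostly connections.

```java
import java.net.URL;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class FeedTitles {
    public static void main(String[] args) throws Exception {
        // Placeholder URL -- substitute any RSS 2.0 feed.
        URL feedUrl = new URL("https://example.com/feed.rss");

        // The parsing work is done by the platform's built-in XML library;
        // our "construction" effort is reduced to wiring it together.
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(feedUrl.openStream());

        // In RSS 2.0 each entry is an <item> element containing a <title>.
        NodeList items = doc.getElementsByTagName("item");
        for (int i = 0; i < items.getLength(); i++) {
            Element item = (Element) items.item(i);
            String title =
                item.getElementsByTagName("title").item(0).getTextContent();
            System.out.println(title);
        }
    }
}
```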
Even when functionality isn't contained in the core language or operating system, there are foundational layers that can be placed on top of the core libraries to provide richer functionality. For instance, the Microsoft Enterprise Library, while not a part of the core .NET Framework, reduces the amount of code necessary for those interested in using it. It's a collection of free documentation and code designed to make enterprise projects go faster.
Third-party open and shared source projects are filling in the niche areas where the standard libraries aren't playing. Whether it's a library for reading and writing RSS feeds, a web-based text editor, or a complete web management system, there is a plethora of individual solutions that can be integrated into your applications.
The benefit of all this is reduced construction time. The cost, however, is additional design or exploration time at the beginning of the process to identify which of these foundational components make sense for the project and which third-party projects are valuable to parts of the solution.
When Fred Brooks wrote his essay "No Silver Bullet – Essence and Accidents of Software Engineering," one of his observations was that software development is essentially complex and there is little or no way to reduce that essential complexity. As a result, we're left with making incremental improvements. Steve McConnell, in his book "Rapid Development," commented on the idea of silver bullets and made the case that even if a tool could reduce one part of the software development process, its effect is unlikely to be felt in every area of the project. In other words, a 25 percent reduction in the construction phase of a project does not create a 25 percent reduction in the overall project's effort. If construction accounts for, say, 35 percent of the total effort, a 25 percent cut there trims less than nine percent from the project as a whole.
Software quality and testing
On January 21, 2006, Symantec's anti-virus software was capable of detecting 72,011 threats. That's a staggering number, and dozens of new threats appear each day. Not all of them rely on software vulnerabilities, but a substantial proportion of the newer threats are directly related to bugs.
Much-publicized software failures, including a few NASA missions and a baggage system at a major airport, have created a greater awareness, at least in the IT industry, of the impact of software defects.
A May 2002 report by the National Institute of Standards and Technology (NIST), titled "The Economic Impacts of Inadequate Infrastructure for Software Testing," estimates the cost of inadequate software quality at tens of billions of dollars each year, just under one percent of the nation's gross domestic product. That, too, is a staggering number.
Clearly, software quality has a long road ahead as we try to make software more reliable. It's no wonder, then, that greater emphasis is being placed on testing in software development projects. Formalized testing is getting a boost both in the total elapsed time it occupies in the project and in the resources dedicated to it.
Agile development methodologies are beginning to drive the idea of test-driven development into the consciousness of developers and development managers. Test-driven development, in a nutshell, means writing your test cases first: you decide what the module or object you're working on should do, and then you write tests that verify the functionality works as you envisioned it, even before creating the object itself.
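Here is a minimal sketch of what "test first" looks like in practice, using JUnit; the PriceCalculator class and its discount rule are made up for illustration, and Java is used only as an example language (the same approach applies with NUnit in the .NET world the article describes). The tests are written against behavior that doesn't exist yet; only then is the class written, with just enough code to make them pass.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Written before PriceCalculator exists: the tests act as the specification.
public class PriceCalculatorTest {

    @Test
    public void ordersOverOneHundredGetTenPercentDiscount() {
        PriceCalculator calc = new PriceCalculator();
        // 150.00 with a 10 percent discount should come back as 135.00.
        assertEquals(135.00, calc.totalFor(150.00), 0.001);
    }

    @Test
    public void smallOrdersAreNotDiscounted() {
        PriceCalculator calc = new PriceCalculator();
        assertEquals(80.00, calc.totalFor(80.00), 0.001);
    }
}

// The production class, written only after the tests above, with just
// enough code to make them pass.
class PriceCalculator {
    double totalFor(double amount) {
        return amount > 100.00 ? amount * 0.90 : amount;
    }
}
```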
The net effect on a software development project is that less of the formal construction phase is spent actually developing the code and more is spent developing testing constructs. The formalized testing at the back end further reduces the overall construction phase of the project.
Advocates of test-driven development argue, somewhat successfully, that the additional time spent on testing reduces the overall time necessary to construct the software, because fewer bugs or unimplemented features remain undiscovered until late in the process, when bugs are known to be more costly to fix.
What does this all mean?
These changes mean that organizations need to change the way they plan projects. Instead of using historical guidelines for the amount of time a project will spend in construction, they should plan for smaller construction numbers while increasing the time allotted to design and testing. If construction has historically taken 35 percent of overall project time, you may be able to squeeze that down to 25 percent while increasing the design and testing phases by five percentage points each.
While I've expressed this as a zero-sum game, that is likely to be true only while new technologies are being integrated into the team's development processes. Once the team is comfortable with the new foundational classes and third-party solutions, the additional design time will fade away, resulting in an overall reduction.
Techniques like test-driven development can reduce overall construction time, particularly on projects where quality is a priority. Initially, however, they will be accompanied by a learning curve that will likely more than eat up any savings.
In the end, software development is getting better by improving the accidental (non-essential) aspects of the process. That doesn't radically improve the development speed of any single project, but it does have a positive long-term impact, if you're willing to start investing in tools and techniques today.