Balancing the level of complexity in your systems and networks comes down to
managing the cost of complexity itself against the benefits that sometimes come
with choosing the component that adds it. For instance, fifteen years ago I was running a LAN.
When I was managing it, we had a Token Ring environment because of the IBM
AS/400. We were faced with a need to put in a small fiber run to the plant
across the street. At the time, doing that with Token Ring was extremely
expensive. As a result, I added Ethernet (10Base-T) to our physical network.

We saved several thousand dollars, but it required that I
train technical staff not only on how to make a Type A/Type 1 cable for Token
Ring using a shielded twisted pair, but also on crimping RJ-45 connectors and
how to get the pair ordering right for Ethernet. It helped that the IT world
was going to Ethernet, and we would have to make the transition one day anyway.

It is decisions like that one, and the hundreds that you
make each year, that add complexity to the environment. It could be something
as minor as buying a different network switch than you normally buy because
your model isn’t in stock, and you need the new switch in a hurry.

For each decision, you have to evaluate whether the
complexity being added to the environment is justifiable given the cost
savings.

Understanding the need for balance

The natural tendency is for systems and networks to get more
and more complex up to the point where the system architect can no longer adequately
manage them. At that point a pushback begins; in other words, resistance to
complexity starts to appear.

Sometimes the constraint on complexity comes too early. The
additional complexity of the solution prevents us from implementing it even
when it is in the best interests of the organization. However, the opposite is
sometimes true. We allow things to get too complex and end up creating
unnecessary challenges for managing the environment. Learning to create the
appropriate amount of complexity in an environment is essential to reducing
costs.

Not complex enough

The bias towards limiting complexity keeps you from
exploring other valuable options. Let’s face it, the pristine network or
architecture just hasn’t been created yet. Elegant solutions to problems are
always a goal, but if you can’t point out at least one scar in the plan, then it’s
likely that the plan isn’t real or hasn’t been tested yet.

However, there are those who hold tightly to the simplified environments
that they know and love: A Novell shop shuns Microsoft Windows servers and waits
until it has absolutely no choice but to implement one because the business
finally insists upon it. Another shop uses only Microsoft Windows, even in its
graphics department, even though moving the graphics department to Macintosh
would reduce problems and rework.

These environments are holding on to a simplicity that is
too pure. Instead of the deciding factor being the best solution for the organization,
the question becomes what is the best solution for me and my familiar environment.
This is just one way that we shun complexity in favor of what we know and
understand.

Too complex

On the other end of the spectrum are the managers who try new
things without a care for how much complexity they add to the environment. Their
point of view is that if they know and understand the technology, then everyone
should. They typically have different kinds of hardware and software scattered
throughout their environment.

Their routers and switches are whatever was cheap the day
they bought them. They have added Linux to the mix of server operating systems
to handle some trivial task like serving HTML pages to the Web. The added complexity
of a new router, switch, or even an operating system doesn’t scare managers whose
environments end up too complex for anyone but them to run.

This condition is one part ego and one part job security. The
ego part says, “I’m the only one who can do it.” The security part asks,
“How could they replace me? No one would be able to understand this the
way I do.” Of course, both are unhealthy responses to large systems. A
more appropriate statement is “Anyone can do it.” The key to large
systems is sustainability, and that requires that more than one person can
understand the system.

The appropriate response is to evaluate the additional
training and support costs, and decide from there whether the additional
complexity makes sense. Once you’ve made that evaluation and decided to accept
the complexity, you need to mitigate the risks and costs associated with that
decision.

Controlling the cost of complexity

The simple answer to constraining training costs is to find
something similar to existing pieces of the system. This can mean a different
model of the same brand, or a model of a brand that is familiar to most of the
people working on the system. Ultimately the goal is to minimize the amount of
difference from one part of the solution to the other. For instance, if you
typically buy NetGear switches and routers, try to add or replace with NetGear
switches and routers when possible. If that’s not possible, look for a solution
such as LinkSys, which has high penetration in the consumer market. Most IT
professionals are bound to have run into these solutions before.

The next step in controlling costs is to document how the
tool is used and what it is expected to do in the environment. The more clearly
you define the component’s role, the less confusion there will be about how it’s
being used. Documentation doesn’t have to be fancy, and shouldn’t be long. It
should be the information someone needs to support and troubleshoot. The basic
architecture of every solution should be documented.

Finally, good documentation can be a great help, particularly in saving support
costs when it’s unclear which pieces of the environment may be
involved. Solid documentation
should include product names, versions, firmware, serial numbers, support
accounts, etc. It can be essential in reducing the amount of time necessary to
resolve problems when they arise.
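
If it helps to picture what that “not fancy, not long” documentation might look
like, here is a minimal sketch of one way to capture those fields in a
structured record. It assumes a hypothetical Python inventory script; every
field name and value shown is an illustrative placeholder, not a required
format.

    from dataclasses import dataclass

    @dataclass
    class ComponentRecord:
        """One documented piece of the environment; field names are illustrative."""
        role: str               # what the component is expected to do in the environment
        product_name: str
        model: str
        firmware_version: str
        serial_number: str
        support_account: str
        notes: str = ""

    # Hypothetical example entry: every value here is a placeholder.
    edge_switch = ComponentRecord(
        role="Ethernet switch feeding the fiber run to the plant across the street",
        product_name="NetGear switch",
        model="(model number)",
        firmware_version="(firmware version)",
        serial_number="(serial number)",
        support_account="(vendor support account or contract number)",
        notes="Replaces part of the old Token Ring segment.",
    )

    print(edge_switch)

Whether you keep this in a script, a spreadsheet, or a one-page document
matters far less than keeping the same small set of fields for every component.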

Key learning

Managing complexity in the environment is key to maintaining low costs. It
requires an honest evaluation of not only the acquisition cost but also the
costs associated with the long-term impact of the solution.
