Leadership

An open enterprise may not be such a good idea

Most recent technological and management innovations focus on "openness." Patrick Gray talks about why this is not always such a good idea.

Most recent technological and management innovations focus on "openness," from open-source applications, where anyone can enhance or use a piece of software, to open protocols and communications standards that allow diverse technologies, companies, and countries to interoperate. Employees are routinely pushing for more openness as well, demanding everything from "Bring Your Own Device" initiatives to access to social networks and cloud-based software. However, an increasingly risky world is calling some of this push for openness into question.

It's a dark world out there

Fear and uncertainty are classic tools to grab attention and sell products, and they are usually overwrought. Even the most wonderful eras have no shortage of people predicting doomsday scenarios, yet there are certainly several clouds on the horizon of late. The global economy continues to misfire, from developed nations to once high-flying emerging markets, and crises abound, from U.S. markets to an increasingly unstable European Union.

IT security specialists warn us of increasingly sophisticated malicious software, and cyber warfare, once a topic for science fiction buffs, now seems to be a legitimate form of combat. It's enough to make even the most ardent optimist glance furtively over his or her shoulder, and it has pessimists in many quarters literally heading for the hills.

In IT circles, calls for a "disconnected enterprise" are gaining traction. Rather than seeking greater openness and interoperability, some analysts and CIOs are advocating disconnecting sensitive data and systems from external networks, standardizing on a limited number of proprietary internal tools, and, in extreme cases, actively preparing for attacks on their IT infrastructure from traditional hackers and hostile governments.

While this may be a bleak picture, most organizations have gone through similar "dark periods" in the past and survived unscathed. This will likely not be our last recession or the final period in history marred by conflict.

Assess your risk

What the purveyors of doomsday scenarios avoid discussing is that there's a massive cost associated with disaster preparation, and a corresponding sliding scale of risk mitigation. Could your company fall victim to a devastating act of cyber-terrorism? Certainly, but for 90% of the world's companies, fully insuring against such a fate is unaffordable, both monetarily and in the time and distraction it diverts from higher-value activities.

Similarly, no one can predict and plan for every potential disaster, and the time and cost required to attempt to do so would be overly burdensome in itself.
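
One way to make that sliding scale concrete is the classic annualized-loss-expectancy calculation from security risk management: compare the expected yearly loss from an event against the yearly cost of preventing it. Here is a minimal sketch in Python; every figure in it is a hypothetical placeholder, not a benchmark.

    # Compare the expected annual loss from a threat against the
    # annual cost of mitigating it. All figures are hypothetical.

    def annualized_loss(single_loss_expectancy, annual_rate_of_occurrence):
        """ALE = SLE * ARO: expected loss per year from one threat."""
        return single_loss_expectancy * annual_rate_of_occurrence

    # A devastating cyber attack: huge impact, but rare for most firms.
    ale_attack = annualized_loss(5_000_000, 0.01)  # $50,000/year expected

    # Full "disconnected enterprise" preparation: staff, tooling, and
    # the lost productivity of working around an air gap.
    mitigation_cost = 750_000  # $/year

    if mitigation_cost > ale_attack:
        print("Mitigation costs more than the risk it removes.")

The precise numbers matter far less than the comparison itself; for most companies, the cost of full disaster preparation will dwarf the expected loss it avoids.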

Finding themes

Rather than attempting to predict the discrete events that might affect your company, look for emerging themes that will shape your business and your IT operations. Instead of trying to assess the nuances of a Greek default versus a Spanish default, consider how to embed flexibility into your IT systems so they can rapidly accommodate regional currency changes, or requirements to divest operations in a region or country.
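
To illustrate what embedding that kind of flexibility can look like, one approach is to keep regional details such as currency in configuration data rather than in code, so a currency change or divestiture becomes a data edit instead of a rewrite. A minimal sketch, with all regions and values hypothetical:

    # Keep region-specific details in data, not code, so a currency
    # change or divestiture touches configuration only. All values
    # here are hypothetical placeholders.

    REGIONS = {
        "greece": {"currency": "EUR", "active": True},
        "spain":  {"currency": "EUR", "active": True},
    }

    def set_currency(region, new_currency):
        """Swap a region's currency, e.g. if it leaves a currency union."""
        REGIONS[region]["currency"] = new_currency

    def divest(region):
        """Flag a region's operations as divested, with no code changes."""
        REGIONS[region]["active"] = False

    # A regional currency change becomes one line, not a system rewrite:
    set_currency("greece", "GRD")

The design choice is the point: systems that hard-code regional assumptions turn a geopolitical event into an engineering project, while systems that treat them as data turn it into a configuration change.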

Instead of lying awake wondering which country's cyber army might place your organization on its target list, consider which information is truly valuable to your company, and fine-tune the tradeoff between sharing that information internally and having it fall into the wrong hands. Just as it would be silly for most companies to plan for a battalion of some invading army showing up on the front lawn, it's silly to focus on "going to war," but it is potentially prudent to consider which assets need the most protection, and how you'd evacuate them in the face of danger.
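
A lightweight way to run that exercise is a simple tiered classification that maps each information asset to a handling rule. The sketch below is illustrative only; the assets and tiers are hypothetical, not a standard:

    # Map each information asset to a protection tier, trading internal
    # sharing against exposure. Assets and tiers here are hypothetical.

    TIERS = {
        "public":       "share freely, no special handling",
        "internal":     "share inside the company, standard controls",
        "crown_jewels": "segregate; strict need-to-know sharing only",
    }

    ASSETS = {
        "marketing collateral": "public",
        "sales pipeline":       "internal",
        "product source code":  "crown_jewels",
        "M&A negotiations":     "crown_jewels",
    }

    for asset, tier in ASSETS.items():
        print(f"{asset}: {TIERS[tier]}")

Most assets will land in the lower tiers, which is exactly the point: protection effort concentrates on the short list at the top rather than being spread thinly across everything.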

While I personally don't think it's time to hand out steel helmets, unplug your company from the internet, and move into a bunker in the mountains, every company should assess the risks it's exposed to and build mitigation strategies into its IT infrastructure. Avoid getting caught up in the doom and gloom; instead, look for low-cost countermeasures that might mitigate anything from a national default to a nefarious hacker, and move on with life.

About

Patrick Gray works for a global Fortune 500 consulting and IT services company and is the author of Breakthrough IT: Supercharging Organizational Value through Technology as well as the companion e-book The Breakthrough CIO's Companion. He has spent ...

5 comments
afuller

The first point of your article is "In IT circles, calls for a 'disconnected enterprise' are gaining traction." Everything before it does not address the implicit issue of why there are calls for a "disconnected enterprise." Surely it is not the history of open-source software. Why so much vacuousness? Don't pretend to give a "history" as introduction, as if you are fully conversant with the issue(s). Make your point directly and humbly ... it might be more interesting.

earlehartshorn

I've worked with some business systems that were segregated from the rest of the network for security reasons, and was glad to see the day when they were upgraded to be considered secure enough to be connected to the general network via a secure gateway. It's a matter of weighing the risks and benefits. One must be careful to do a proper evaluation and continue to do ongoing evaluations.

As someone once said, we must be careful to walk the middle of the road. It's easy to fall into the ditch of being paranoid and totally disconnected on one side, and easy to fall into the ditch of leaving one's systems wide open on the other. It takes a careful and continuous balancing act to walk the middle path of interconnected but secure systems.

If it is necessary to be productive, people will connect the secure and insecure networks together somehow, by sneaker-net if nothing else. The trick is to balance productivity against the need for security. I doubt there is a formula to determine the balance. And I am certain that there are systems that should never be connected to an open network, but you won't find me working on those systems by any choice of my own.

dl_wraith

I'm a fan of openness - I like the innovations it can bring: the fact that so many people can tinker with a technology or product and find new ways to enhance or use it. It can drive innovation - but there is a drawback. Openness can bring about a large quantity of poor-quality products. With an open platform, innovators tinker and good stuff is made, but for every bright spark there are ten 25-watt bulbs: people who can produce functional stuff that's OK, but doesn't really exceed or advance the tech. This means that openness can lead to mediocre advances or even poor-quality products. With an open platform there's often no [organised] quality control. This can leave users swamped with inferior options and can sully the reputation of the open platform, meaning that users miss the genuine leaps forward that such an open model can bring when it works well.

Let me give you a non-IT example of this at work: in 2000 Wizards of the Coast released the 3rd version of a popular role playing game. With this new game system they also launched an open licence allowing pretty much anyone to write compatible products for their game system. The result was a lot of products - most of them considered mediocre by the gaming community and only a few of them real gems. The stand-out products were difficult to identify, and the general consensus is that many great products were lost in the flood of average products that the open licence enabled. If you want an IT-related example of this I have two words: Google Play (and I'm an avid fan of Android as a platform!)

What would be great is if the innovation that an open platform enables could somehow be effectively combined with a quality control mechanism, so that only products that improve upon the original platform were allowed to be released. Unfortunately I can't see how such an aim could be effectively achieved, as there is a high chance that any such mechanism would itself be flawed.

Deadly Ernest

From the late 1980s through to the early 2000s I spent a lot of time working in or with government departments and agencies, and one of the most common IT criteria was to keep the critical operational and classified systems segregated from the general administration and management systems; later, when the departments were connected to the Internet via a high-security gateway, only the admin systems were allowed to be connected. This meant a minimum of two networks with no physical links other than staff who stood up from one computer and walked to where a computer on the other system was situated.

This same segregation of critical systems also occurred in many private enterprise organisations I interacted with or later worked with. One organisation did a lot of manufacturing; its two main factories had all of the computer-controlled machinery connected to a network to enhance the production process, but they were segregated from the administration system, as the admin system had connectivity to the outside world via an Internet gateway. This was the standard way of setting things up for just about every government and major business I had an interaction with during the 1990s and the first five years of the 2000s. I find it frightening that segregation of critical and key systems from the general admin system is not still a basic standard of operations for any business.

From the non-IT standpoint, an openness of administration and management is not only good, it's just about mandatory now, but it does not need to include, and I doubt it ever included, an openness about classified information or activities. The issues some organisations have had in the past have been in how and what they decide is classified.

bboyd

All it takes for evil to thrive is for good to do nothing... You should invest in security not just for yourself but for the benefit of others. One less computer participating in a Distributed Denial of Service attack is not a bad thing. Your title, and a general negative view of "open" as insecure, is not predictive of real security performance. In fact, open-source projects are very commonly the source of a solution for a closed-source security issue. Just ask Microsoft how its proprietary update systems are doing this week...