
Tech Tip: Internet security: Where do we go from here?

By Jonathan Yarden

When it comes to information security, current methods haven't changed much in the past few years. The weakest link in network security remains the ever-present human factor, and the best network security equipment is still only as good as its configuration and software.

While this doesn't mean the old methods are failing, it does mean that keeping firmware and software updated, tracking security warnings, and staying informed about Internet security issues have become more than a full-time job. Reliable security doesn't come cheap, so many companies simply "wing it."

Many companies still don't have a solid understanding of what Internet and computer security entails. Most are only beginning to recognize the complexity of securing their own internal processes and systems, let alone the traffic that comes in from the Internet. The problem with current Internet security methods is that many solutions amount to a single layer of perimeter protection.

Most small to midsize companies use a single Internet connection. A simple DoS attack on the perimeter router for such a company effectively knocks it offline. How this company responds to this event depends almost entirely on the ability of its staff—not the quality of its Internet security equipment.

Internet security is still essentially reactive: there's no current method that can guarantee security, and there never will be. Companies can only do the best they can with the threats they know about. They can minimize risk, but they can never eliminate it, which isn't a popular thing to say to upper-level management when trying to justify Internet security costs.

In general, a benchmark of good Internet security is the speed at which a company responds to a problem by retrofitting security fixes into existing systems and processes: closing open SMTP relays, installing antivirus software on previously compromised systems, and upgrading exploited operating systems.
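To illustrate the first of those retrofits, here is a minimal Python sketch of an open-relay check. The host and addresses are hypothetical placeholders; the test issues MAIL FROM and RCPT TO for two outside domains and stops before DATA, so no message is actually relayed.

```python
import smtplib

# Hypothetical host and addresses for illustration; substitute a server you administer.
MAIL_HOST = "mail.example.com"
OUTSIDE_SENDER = "sender@outside-domain-a.example"
OUTSIDE_RECIPIENT = "recipient@outside-domain-b.example"

def looks_like_open_relay(host: str) -> bool:
    """Return True if the server accepts mail from one outside domain to another,
    the classic symptom of an open SMTP relay."""
    try:
        with smtplib.SMTP(host, 25, timeout=10) as smtp:
            smtp.ehlo()
            code, _ = smtp.mail(OUTSIDE_SENDER)
            if code != 250:
                return False
            code, _ = smtp.rcpt(OUTSIDE_RECIPIENT)
            return code in (250, 251)   # recipient accepted: relaying is likely allowed
    except (OSError, smtplib.SMTPException):
        return False

if __name__ == "__main__":
    print("possible open relay" if looks_like_open_relay(MAIL_HOST)
          else "relay refused or unreachable")
```

If the server accepts the outside recipient, it will probably relay mail for anyone and should be reconfigured immediately.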

A few companies proactively manage their security, but even with the best defenses, unauthorized users (often from within the company itself) still compromise and misuse systems. The status quo of Internet and computer security is basically firewalls and content scanning, whether that content is e-mail, files, or Web pages.

As for internal security, anybody with a packet sniffer can cause more damage inside a company than any outside intruder, but few companies use interior firewalls between networks and apply layered security methods. The focus is still on the perimeter of the network. But that doesn't do much good once a worm or virus gets inside a network.
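To see how little it takes to capture traffic on a flat internal network, consider the following minimal sketch of a passive sniffer, assuming a Linux host and root privileges. It simply reads raw Ethernet frames off the wire, which is all an insider needs to start harvesting cleartext traffic on an unsegmented network.

```python
import socket

# Minimal passive sniffer sketch (Linux only, requires root privileges).
# ETH_P_ALL (0x0003) asks the kernel for every frame seen on the host's interfaces.
ETH_P_ALL = 0x0003

sniffer = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(ETH_P_ALL))
for _ in range(10):                          # capture a handful of frames and stop
    frame, _address = sniffer.recvfrom(65535)
    print(f"captured {len(frame)} bytes")
sniffer.close()
```

Interior firewalls and segmented networks limit how much of the traffic a single compromised host can observe this way.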

Generally, an upstream Internet service provider doesn't determine the "validity" of Internet traffic to and from a customer. A dramatic change in Internet traffic doesn't generally trigger a response from the ISP unless it affects the ISP itself or the customer notifies the ISP of a problem. Again, it's a reactive response, which is generally the status quo of Internet security.

Some ISPs proactively protect their customers, but the protection isn't always welcome; some customers see it as invasive. Realistically, it's better for the ISP to take charge of securing the network perimeter than to defer that responsibility to the customer. Since both the ISP and the customer benefit from stronger perimeter security, this is a logical next step in the evolution of Internet security.

ISPs typically accomplish this with simple TCP and UDP port filtering, denying access to specific services to or from the Internet. Port blocking is already common among large DSL and cable modem providers, since hackers frequently use these networks to attack other systems. Unfortunately, the practice isn't common enough; most DDoS attacks and junk e-mail originate from compromised computers on broadband networks.
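The effect of such filtering is easy to verify from the customer side. Below is a minimal Python sketch that probes a handful of commonly blocked TCP ports; the target name is a hypothetical placeholder, and you should only probe hosts you control or have permission to test.

```python
import socket

# Hypothetical target for illustration; point this at a host you control.
TARGET = "test-host.example.com"

# Ports that broadband providers commonly filter because worms and spam bots abuse them.
PORTS = {25: "SMTP", 135: "MS RPC", 139: "NetBIOS", 445: "SMB", 1433: "MS SQL"}

def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port completes within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port, service in PORTS.items():
        state = "reachable" if reachable(TARGET, port) else "filtered or closed"
        print(f"{service:8} (tcp/{port}): {state}")
```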

In general, it's best to implement a combination of service restrictions and active monitoring. In my opinion, a mix of ISP-managed firewalls and proactive monitoring of Internet traffic, both to and from the customer, is the direction in which Internet security is moving. Let's hope it gets there sooner rather than later.
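As a rough sketch of what the monitoring half might look like, the following Python fragment samples the receive-byte counter from /proc/net/dev (Linux only; the interface name and thresholds are assumptions) and flags any interval whose traffic rate jumps well above the recent baseline.

```python
import time
from collections import deque

def read_rx_bytes(interface: str = "eth0") -> int:
    """Read the cumulative receive-byte counter for an interface from /proc/net/dev."""
    with open("/proc/net/dev") as stats:
        for line in stats:
            name, _, counters = line.partition(":")
            if name.strip() == interface:
                return int(counters.split()[0])   # first field is received bytes
    raise ValueError(f"interface {interface!r} not found")

def monitor(interval: float = 5.0, window: int = 12, threshold: float = 3.0) -> None:
    """Flag any sampling interval whose rate exceeds `threshold` times the rolling average."""
    history = deque(maxlen=window)                # recent per-interval rates, bytes/second
    last = read_rx_bytes()
    while True:
        time.sleep(interval)
        current = read_rx_bytes()
        rate = (current - last) / interval
        last = current
        if history and rate > threshold * (sum(history) / len(history)):
            print(f"ALERT: {rate:,.0f} B/s is more than {threshold}x the recent baseline")
        history.append(rate)

if __name__ == "__main__":
    monitor()
```

An ISP would apply the same idea per customer at the aggregation router rather than on a single host, but the principle of comparing current traffic against a rolling baseline is the same.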

Jonathan Yarden is the senior UNIX system administrator, network security manager, and senior software architect for a regional ISP.
