
Talking Shop: Active management is the key to mitigating harassment liability

Makes the case that management must enforce Internet and e-mail policies rather than rely on filtering software to do the job


Many companies trusted their staff with unfettered access to the Internet, only to be hit with settlement payouts to employees who received sexually explicit e-mails or were otherwise harassed via the company’s computer equipment.

Now, IT departments may be making a similar mistake by trusting content filtering software and other technologies to mitigate the threat of hostile workplace liability rather than focusing on the real solution: working with HR and management to actively enforce your company’s policies on Internet and e-mail use.

“There’s no question that IT departments can monitor employees' Internet usage; it’s just a question of will,” said Rodney Glover, a Washington, DC-based attorney and published author on Internet use policies and employee privacy issues. “Having a policy in place and then failing to enforce it is one of the worst things you can do, in terms of a plaintiff’s claims.”

Knowing the threat is just half the battle
Most companies are taking some action on the liability threat posed by employee misuse of company equipment, such as viewing or circulating objectionable materials or harassing other employees, Glover said. His assertion is backed up by reams of industry research, including a 2001 American Management Association survey that reports 77 percent of major U.S. firms now monitor employee communications, including phone calls and e-mails. Similarly, The Privacy Foundation estimates that 14 million people, or 35 percent of the U.S. online workforce, are monitored every time they use e-mail or the Internet—a figure that has doubled since 1997.

With this emphasis on preemptive technologies, why has the number of electronic harassment suits continued to rise? The risk, experts say, is the possibility of your system admins or, even worse, your management team undercutting your efforts by failing to follow up on the company’s stated policies with comprehensive, swift, and decisive action.

“Because our appliance gives the customer all the levers, they can turn our good solution into a lousy solution in a hurry,” said James Punderson, the CEO of Networks & More, which produces the ISBoss content filtering appliance.

Like most content filtering systems, ISBoss comes preloaded with a list of known “bad” URLs and filters for objectionable terms, including sexually explicit language and racial slurs. But lax admins can set the thresholds for these filters so low that virtually any type of material can make its way onto your client systems, said Punderson, whose product was created for and marketed to educational organizations.
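To see why those settings matter, consider a stripped-down sketch of how this class of filter typically works. This is a generic illustration in Python, not ISBoss code; the blocklist entries, term weights, and cutoff values are hypothetical, chosen only to show how a permissive configuration lets nearly everything through.

```python
# Generic sketch of a URL-list-plus-keyword content filter (not ISBoss code).
# Blocklist entries, term weights, and thresholds below are hypothetical.

BLOCKED_HOSTS = {"badsite.example.com", "another-bad.example.net"}           # preloaded "bad" list
TERM_WEIGHTS = {"explicit-term": 5, "racial-slur": 5, "suggestive-term": 2}  # objectionable terms

def should_block(url: str, page_text: str, threshold: int) -> bool:
    """Block if the host is on the bad list or the page's
    objectionable-term score reaches the configured threshold."""
    host = url.split("/")[2] if "//" in url else url.split("/")[0]
    if host in BLOCKED_HOSTS:
        return True
    score = sum(weight for term, weight in TERM_WEIGHTS.items()
                if term in page_text.lower())
    return score >= threshold

# With a sane setting, a page laced with flagged terms is blocked:
print(should_block("http://example.com/x", "explicit-term and suggestive-term", threshold=5))    # True
# A lax admin can tune the same filter so it passes virtually anything:
print(should_block("http://example.com/x", "explicit-term and suggestive-term", threshold=999))  # False
```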

Similarly, admins can simply fail to report violations of company Internet use policy to the appropriate managers. But perhaps the most damning breakdown of a content monitoring chain happens when the HR department or other managers fail to do anything about notices of misconduct.

"If you've got someone pulling down Playboy and circulating it—it's not all that uncommon, believe it or not—if you have a policy and you know it was going on, and yet you did nothing about it…that's very strong evidence to bring before a jury,” said Glover, whose employment law practice has gravitated toward issues of electronic privacy over the last few years.

More often than not, electronic harassment policies fall prey to the all-too-common hobgoblin of office politics rather than to an IT breakdown, Glover said. "If a CEO or the company's top salesman goes out and pulls down something pornographic, the company may look at this differently than if some clerical staffer does the same thing," he said. Such favoritism is actually among the most damaging missteps in the eyes of the courts, he added.

Implementing your plan
Clearly, filtering technology should be just a part of your company’s protection against liability for misuse of its equipment. So how do you roll out or modify a comprehensive policy on Internet and e-mail use? Glover and Punderson offered these insights.

Don’t try to go it alone
Your first stop in evaluating the need for content filtering should be your HR department, since it is ultimately responsible for enforcing the company’s policies. Don’t be afraid to advocate the need for content filtering and the capabilities you think IT can bring to the table, but don’t try to assume full ownership of how employees use the company’s equipment.

Glover said that implementing a content filtering system is a huge step, one that can create new risk even as it seeks to close other windows of liability. Chief among those risks: a monitoring program sets a very clear expectation of the company's standards, and if the company fails to meet those expectations, it will most likely have to settle a harassment claim. So remember: a policy and a commitment to enforce it come first; technology to back it up comes second.

Publicize the policy to everyone affected
U.S. courts have held that companies can monitor the use of their own equipment as they see fit, so long as they publicize the fact that they plan to do so. So it’s imperative that HR and IT send at least a couple of notices to staff members that their e-mail and Web surfing will be monitored; add such a disclaimer to the new employee orientation kit as well. If you fail to do so, your company can be open to claims that it has violated employees’ privacy.

Glover also warned that international courts have shown greater deference to employees' privacy, which means that you may not be entitled to read that e-mail, after all. If your company has overseas offices, you may well need to implement distinct monitoring policies for those locations, based on prevailing law.

Focus on the protection offered by a content filtering policy
When rolling out your policy, stress the message that you’re only looking for the kinds of communication that no one would condone—well, at least not publicly. “Harassment usually comes from employees, and employees are the targets of harassment, so by implementing a policy like this, you are really protecting employees,” Glover said.

Be reasonable about how vigorously you limit access
Some companies reacted to the emerging threat of electronic harassment liability by simply shutting down employees’ access to the Internet. But that approach really doesn’t fly. “If I’m a developer and I’m told I can’t surf the Internet to research my job, then I’m going to go find another job,” said Glover, who added that being able to conduct some personal business online, such as ordering pharmaceuticals or banking, may actually improve workplace productivity.

Most of Glover’s clients filter for the obvious criteria: common obscenities, racial pejoratives, and sexual references. Since the 9-11 terrorist attacks, a growing number of companies also are looking for threats of violence, he added.

Another reason not to go overboard in your efforts to guard against misuse of your technology is that a technically savvy employee who is dead-set on getting around your preventive measures will probably be able to do it. Many filtering solutions offer a fine degree of control over the content your users can view and distribute, but no system is perfect, Punderson warned. For example, ISBoss can scan the full text of an HTML document for sexually explicit terms, but it can’t look inside an image file called into that page. “The name of the file may be truck.gif, but it’s not a picture of a truck,” he said.
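Punderson’s point about images is easy to see in a toy example. The Python sketch below is a generic illustration, not ISBoss code, and the term list is hypothetical: a text scan can inspect every word on a page, but for an embedded image it sees only the file name.

```python
# Generic sketch of a text-only HTML scan (not ISBoss code).
from html.parser import HTMLParser

FLAGGED_TERMS = {"explicit-term", "racial-slur"}  # hypothetical term list

class PageScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flagged = False
        self.image_files = []

    def handle_data(self, data):
        # The scanner can read every word of visible text on the page...
        if any(term in data.lower() for term in FLAGGED_TERMS):
            self.flagged = True

    def handle_starttag(self, tag, attrs):
        # ...but for an <img> tag, all it sees is the file name.
        if tag == "img":
            self.image_files.extend(value for name, value in attrs if name == "src")

page = '<html><body>Quarterly delivery schedule. <img src="truck.gif"></body></html>'
scanner = PageScanner()
scanner.feed(page)
print(scanner.flagged)      # False -- the visible text is innocuous
print(scanner.image_files)  # ['truck.gif'] -- the picture could be anything, despite the name
```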

Several Web sites are actually devoted to beating content filters, so the rare hacker will always be a problem you’ll just have to deal with on a case-by-case basis. But Punderson said typical corporate users won’t be able to bypass the solution you employ.

Zero in on active misuse of equipment by employees
One use for a content filter is to scan incoming mail, including the flood of pornographic and otherwise objectionable spam that invariably hits any e-mail system. Glover said that while companies should certainly block objectionable spam, the real risk of harassment lawsuits comes from the active misuse of company equipment by employees, not from passive sources like unwelcome e-mails. In other words, you should be much more worried about an employee who uses a corporate system to sign up for a pornographic mailing list or to circulate such messages via your own clients and servers.

Don’t forget sites you don’t control
A 2000 case in which Continental Airlines was found liable for postings employees made on a bulletin-board site sparked a wave of media coverage about the emerging risk of electronic harassment suits. The site in question was maintained by the airline’s ISP, but other companies—including TechRepublic’s parent company, CNET—have shut off network access to third-party forum sites where they believe employees are engaging in libelous or harassing behavior.

Glover said that while he doesn’t consider this kind of network misuse the most pressing risk most companies face, he would advise clients to block access to such Web sites if they have good reason to believe their own equipment is being used to post malicious materials. Blocking access to URLs is a common feature of any content filtering system and can also be accomplished via your network firewall.

Keep HR in the loop constantly
Many filtering systems can be set to send e-mail notices when users access, or try to access, inappropriate content, and Glover suggested that HR be on that distribution list. In many cases, no action will be necessary, but again, the purpose of a content filtering system and policy is to prove that your company is actively trying to suppress any kind of harassing conduct.
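If your filtering product can’t route those notices itself, a small script wrapped around its logs can. The sketch below is one hypothetical way to do it in Python; the addresses and mail server are placeholders, not settings from any particular product.

```python
# Hypothetical notification script; addresses and SMTP host are placeholders.
import smtplib
from email.message import EmailMessage

def notify_violation(user: str, url: str, timestamp: str) -> None:
    """E-mail IT security and HR when the filter logs a blocked request."""
    msg = EmailMessage()
    msg["Subject"] = f"Content policy alert: {user}"
    msg["From"] = "filter-alerts@example.com"
    # Keeping HR on the recipient list is the point Glover makes above.
    msg["To"] = "it-security@example.com, hr@example.com"
    msg.set_content(
        f"User {user} attempted to reach a blocked resource.\n"
        f"URL: {url}\nTime: {timestamp}\n"
        "Logged per the company's Internet and e-mail use policy."
    )
    with smtplib.SMTP("mail.example.com") as server:
        server.send_message(msg)

# Example call (commented out; it would try to connect to the placeholder host):
# notify_violation("jdoe", "http://blocked.example.net/page", "2024-01-15 09:42")
```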

Don’t overreact in enforcing the policy, but don’t be overly lenient, either
An occasional ping that a user has tried to access a blocked Web site shouldn’t set an inquisition in motion, Glover said, since there is just so much garbage floating around on the Web. Typos happen, after all.

"If you've got somebody generating hits once or twice a month, then it may be no big deal," Glover said. "If a person is generating one or two hits a week or day, then you have to apply rule of reason….If you let this stuff go on for a month, that's probably too long."

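One way to apply that rule of reason consistently is to tally each user’s hits from the filter’s logs over a rolling window and escalate only persistent patterns. The Python sketch below is a hypothetical illustration; the window and per-week threshold are judgment calls for HR and management, not figures Glover prescribed.

```python
# Hypothetical triage of filter logs: flag only users with a persistent
# pattern of blocked-access hits, not the occasional stray click or typo.
from collections import Counter
from datetime import datetime, timedelta

def users_to_review(hits, now=None, window_days=30, weekly_threshold=2):
    """hits: iterable of (username, datetime) blocked-access events.
    Returns users averaging at least `weekly_threshold` hits per week
    over the last `window_days` days."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    counts = Counter(user for user, when in hits if when >= cutoff)
    weeks = window_days / 7
    return [user for user, count in counts.items() if count / weeks >= weekly_threshold]

# Example: one stray hit is ignored; a steady pattern gets flagged for HR.
now = datetime(2024, 1, 31)
log = [("jdoe", datetime(2024, 1, 3))] + \
      [("asmith", datetime(2024, 1, day)) for day in range(2, 30, 3)]
print(users_to_review(log, now=now))  # ['asmith']
```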
Keep feedback channels open
The ISBoss filtering appliance presents users with a Web form they can use to request that admins open access to a site that has been blocked for some reason, Punderson said. The software also lets users request that a site be added to the “bad” list if a user finds some objectionable material there. Regardless of the tool you employ, keeping such lines of communication open is vital as you tune your content filtering appliance to meet your company’s needs.

Respond to any complaints immediately
Harassment of any kind is serious business. If you or your admins receive complaints of objectionable use of the company’s equipment, notify HR immediately.

The policies and practices above should create a solid case that your company is doing everything in its power to prevent its equipment from being used to harass its employees, which is the acid test for any hostile workplace or harassment claim. Just remember that no technological solution can replace responsive and active management of a serious threat like harassment liability.

About Ken Hardin

Ken Hardin is a freelance writer and business analyst with more than two decades in technology media and product development. Before founding his own consultancy, Clarity Answers LLC, Ken was a member of the start-up team and an executive with TechRe...
