Security

FTP remains a security breach in the making

Many IT administrators still rely on FTP to move files around on enterprise networks, download patches and share data. However, FTP poses some major security challenges and can leave networks open to intrusion.

Surprisingly, FTP is becoming popular again, especially with administrators looking to move away from hosted file-sharing services, which have become targets for hackers and online criminals, as well as a compliance challenge for those looking to adhere to Sarbanes-Oxley, HIPAA and GLBA requirements.

After all, FTP proves to be a great way to move files between systems, users and networks: it is fast, simple to deploy, simple to use, controlled by IT and, most importantly, inexpensive. So, what's not to like? Simply put, FTP can quickly become a security breach in the making. Its very benefits, simplicity and ubiquity, are the Achilles' heel of the aging protocol, and they have conspired to make FTP a target for incursions.

The shortcomings of FTP stem from both the design of the protocol and evolving business requirements. Exponential growth in file transfers caused by increasing business automation, rising awareness of corporate vulnerability to data leaks, and the need to maintain meticulous audit trails to fulfill regulatory mandates have come together to expose the security deficiencies of FTP. Those deficiencies include:

Inadequate Authentication: The Ponemon Institute estimates the average cost of a data breach at $7.2 million. That figure is worth keeping in mind when using FTP, which lacks built-in strong authentication and non-repudiation functionality. As a result, messages can be sent to or received by unauthorized users, and parties can deny that a message was ever sent or received.

Policy Enforcement: FTP also lacks the ability to filter content to enforce corporate information security policies, as well as checkpoint and restart functions that ensure message delivery.
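FTP's REST (restart) command does let a client roll its own crude checkpointing, but everything beyond resuming at a byte offset is left to the script author. A minimal Python sketch using the standard library's ftplib, where the host, credentials and paths are hypothetical placeholders and integrity checking is still the caller's problem:

```python
import ftplib
import os

def resume_offset(path):
    """Byte offset to restart from: size of the partial local file, or 0."""
    return os.path.getsize(path) if os.path.exists(path) else 0

def resume_download(host, user, password, remote, local):
    """Restart an interrupted download with FTP's REST command.

    Everything FTP leaves out remains the script's job: there is no
    delivery confirmation, no checksum, no audit trail.
    """
    offset = resume_offset(local)
    with ftplib.FTP(host, timeout=30) as ftp:
        ftp.login(user, password)
        with open(local, "ab") as fh:
            ftp.retrbinary(f"RETR {remote}", fh.write, rest=offset)
```

Note that resuming by file size alone assumes the partial file is intact; a managed transfer product would verify that, which is precisely the delivery control plain FTP lacks.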

Delivery Controls: FTP lacks file versioning, auditing and other controls that can prevent data duplication or data loss, as well as other security and delivery controls.

Improper Administration: Many organizations deploy multiple FTP servers, resulting in a patchwork of FTP servers for different operating systems, applications, company locations, departments or even users. That means several paths for incursion have been opened in the enterprise, with little or no administrative control.

Lack of Automation: FTP has no process management framework to automate operations such as scheduling across multiple FTP servers. The need to write scripts to schedule file transfers, apply event-based routing triggers, or route files through a workflow adds to the burden and expense of script maintenance, which includes updating passwords and IP addresses for each customer or partner with whom files are exchanged. That lack of automation means that password changes, authentication requirements and the like can slip through the cracks, leaving credentials exposed.
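The per-partner scripts described above tend to look something like the following Python sketch, in which the partner names, hosts and credentials are invented placeholders. Every password rotation or IP change means hand-editing code like this on every server that carries a copy:

```python
import ftplib

# Hypothetical per-partner details; in real deployments these end up
# hard-coded in dozens of scripts and edited by hand whenever a partner
# rotates a password or changes an IP address.
PARTNERS = {
    "acme":   {"host": "ftp.acme.example",   "user": "acme_xfer",   "password": "s3cret"},
    "globex": {"host": "ftp.globex.example", "user": "globex_xfer", "password": "changeme"},
}

def upload(partner, local_path, remote_name, retries=3):
    """Upload one file to a partner's FTP server, retrying on failure."""
    cfg = PARTNERS[partner]  # raises KeyError for unknown partners
    last_error = None
    for _ in range(retries):
        try:
            with ftplib.FTP(cfg["host"], timeout=30) as ftp:
                ftp.login(cfg["user"], cfg["password"])
                with open(local_path, "rb") as fh:
                    ftp.storbinary(f"STOR {remote_name}", fh)
            return True
        except ftplib.all_errors as exc:
            last_error = exc
    raise last_error
```

Even this small sketch embeds plaintext passwords and has no audit log, which illustrates why script-based FTP automation becomes a maintenance and security liability at scale.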

Lack of Scale: FTP suffers from an inability to effectively support high volumes of concurrent file transfers or simultaneous transfers to multiple recipients. Without the ability to queue transfers, users start to receive error messages, which in turn drive up help desk calls and can lead to FTP sprawl, where a new server is stood up to relieve the load and is then forgotten once the need for it expires.
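Because the protocol offers no server-side queueing, clients are left to throttle themselves. One common workaround is a fixed-size worker pool on the client side, sketched here in Python with a stand-in transfer function in place of a real FTP upload and an assumed connection limit:

```python
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT = 4  # assumed safe connection limit for the server

def transfer(job):
    # Stand-in for a real FTP upload; a real version would open a
    # connection, send the file and close. Returns a status string.
    return f"done:{job}"

def run_queued(jobs, workers=MAX_CONCURRENT):
    """Run transfers through a bounded pool so that at most `workers`
    connections are open at once; results come back in job order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(transfer, jobs))
```

This only helps when every client cooperates; it does nothing about other departments' scripts hammering the same server, which is how the sprawl described above begins.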

While FTP seems to hold many promises as a replacement for file-sharing services, it still carries significant IT administration costs, data security risks, and transparency limitations that compromise a business's ability to operate efficiently, safely, and in conformance with regulations. So before abandoning a paid service, consider the implications any alternative has for regulatory requirements, and adopt something that protects data while enforcing regulations and company policies.

About

Frank J. Ohlhorst is an award-winning technology journalist, author, professional speaker and IT business consultant. He has worked in editorial at CRN, eWeek and Channel Insider, and is the author of Big Data Analytics. His certifications include MC...

6 comments
pgit

All true. The best I've come up with is tunneling the whole mess through ssh, using keys rather than passwords. (I always disable sshv1)


I've also used a "knock" like approach in a couple instances (including knock :)  ) with acceptable results.


In all my deployments over the years the ftp server is not running and available 24/7, the user starts the server with scripts (or manually, if savvy enough) in each case.


As mentioned in the article this makes for a lot more work, with attendant worry, on my part.

I'm lucky in that I've not run into a client that needs to be able to shovel files to anyone/everyone on a moment's notice. All the ftp deployments I've ever serviced were completely 'in house' operations.

The trend I personally face is that ftp use is on its way out. I've not deployed a new ftp system in about 3 years now, and the existing ones are slowly being abandoned and replaced with a 'cloud solution.' I don't mind this particular loss of business. It results in less of the more headache-inducing work, in part because ditching ftp is an automatic increase in overall security.

 

straightp

SFTP should be used for data being transferred over public networks.  Public Key authentication is pretty solid to secure it, and easy to manage.


If FTP must be used, try to utilize a VPN.  Then you can use the unencrypted protocol over an encrypted tunnel.  As for Delivery Controls, look at OpenVMS. 

seth2011

TeamViewer is being used as an alternative for performing simple FTP operations

kokeefe1

True, plain FTP can be a risk.  To gain more security, compliance, control, and management around FTP you should look into SmartFile (www.smartfile.com).  The ability to go between secure FTP (SFTP, FTPS, or FTPES) and secure web access (with auditing, user management, permission control and more) gives networks and admins a perfect hybrid solution.
