High-performance file copies in high-I/O environments

By gumly
Hi All,

I am trying to solve a throughput problem on our network. We process huge volumes of data daily. Our average file size is around 750MB, and we have 70+ PCs that read data from our Fibre Channel DAS through file servers onto their local drives (around 450TB of FC SAN in total, all pulled through the file servers). The workstations then process the files and copy the massaged files back to the SAN.

Each process can take 3-5 minutes, and we handle around 60K files in an average session, which may take 2 weeks to run - it's satellite imagery, in case you are wondering.
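
To put rough numbers on it: 60,000 files at ~750MB each is about 45TB per session. Averaged over two weeks that works out to roughly 37MB/s of reads, plus the same again for the write-back - call it 75MB/s sustained against the ~110MB/s a Gig-E link realistically delivers, and that is before any burstiness in the workload.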

We use Robocopy to copy the data to/from the SAN, but we are also looking at what other technology may be out there that would bypass the OS overhead. I have seen some references to LAN pumps and the like, which permit unbuffered reads and writes by creating a sink, but I can't find any commercial products that implement or use this concept. Any ideas on this, or on another utility that would copy large chunks of data faster than Robocopy? We use Perl extensively on the back end but haven't found any utilities on CPAN that appear to implement this either.
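
In case it helps frame the question: from what I've read, the "sink"/unbuffered idea boils down to opening files with the Win32 FILE_FLAG_NO_BUFFERING flag so the data bypasses the system cache entirely. Here is a rough, untested C sketch of what I believe that looks like - the function name, the 4MB chunk size, and the 512-byte sector assumption are all mine, so treat it as an illustration rather than a working tool:

    #include <windows.h>
    #include <stdio.h>

    #define CHUNK (4 * 1024 * 1024)  /* 4MB per I/O; must be a multiple of the sector size */

    /* Copy src to dst, bypassing the system cache.  Assumes 512-byte sectors;
       a real tool would query GetDiskFreeSpace for the true sector size. */
    int copy_unbuffered(const char *src, const char *dst)
    {
        HANDLE in  = CreateFileA(src, GENERIC_READ, FILE_SHARE_READ, NULL,
                                 OPEN_EXISTING,
                                 FILE_FLAG_NO_BUFFERING | FILE_FLAG_SEQUENTIAL_SCAN, NULL);
        HANDLE out = CreateFileA(dst, GENERIC_WRITE, 0, NULL, CREATE_ALWAYS,
                                 FILE_FLAG_NO_BUFFERING | FILE_FLAG_SEQUENTIAL_SCAN, NULL);
        if (in == INVALID_HANDLE_VALUE || out == INVALID_HANDLE_VALUE)
            return -1;

        /* NO_BUFFERING requires a sector-aligned buffer; VirtualAlloc returns
           page-aligned memory, which satisfies any common sector size. */
        BYTE *buf = VirtualAlloc(NULL, CHUNK, MEM_COMMIT, PAGE_READWRITE);
        if (buf == NULL)
            return -1;

        LARGE_INTEGER size;
        GetFileSizeEx(in, &size);  /* remember the true length for the final truncate */

        DWORD got, put;
        while (ReadFile(in, buf, CHUNK, &got, NULL) && got > 0) {
            /* The last read may stop short of a sector boundary; round the
               write length up, then trim the excess below with SetEndOfFile. */
            DWORD len = (got + 511) & ~511u;
            if (!WriteFile(out, buf, len, &put, NULL))
                break;
        }

        /* Cut the destination back to the real source length. */
        SetFilePointerEx(out, size, NULL, FILE_BEGIN);
        SetEndOfFile(out);

        VirtualFree(buf, 0, MEM_RELEASE);
        CloseHandle(in);
        CloseHandle(out);
        return 0;
    }

    int main(int argc, char **argv)
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s <src> <dst>\n", argv[0]);
            return 1;
        }
        return copy_unbuffered(argv[1], argv[2]) == 0 ? 0 : 2;
    }

I've also seen mention that newer Robocopy releases include a /J switch for unbuffered I/O, so that may be worth checking too - but if anyone knows a packaged tool built on this idea, I'd love a pointer.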

We are running on Gig-E and routinely saturate the bandwidth, so this is the major bottleneck in our architecture.

Thanks in advance,

