General discussion

  • Creator
  • #2191154

    Duplicate files.


    by tonythetiger ·

    Is this a big deal to most of us or is it just me?

    Users here are constantly copying a file from a shared directory to their personal directory, or making extra copies of a file, etc. and it’s taking a good chunk of my day to keep things cleaned up.

    I’ve tried teaching them how to use shortcuts, but they never stick with it for very long. How do you monitor and deal with this?

    Related are the users who insist on sending a 500k attachment to everyone in the organization instead of sending a link to a file on a shared directory. Of course, 300 of them detach the file… to their personal drive… where it will sit, never to be looked at again.

All Comments

  • Author
    • #3073323

      When you find the solution, let us all know.

      by charliespencer ·

      In reply to Duplicate files.

      End users. Can’t live with them, don’t have a job without them.

      This used to be a major issue back when floppies would only hold 1.44 megabytes and the biggest hard drive was only 10 or 20 meg. End users had to know more about file management or they would run out of space. Now with 40 and 80 gig drives as standard equipment, 512 meg thumb drives, and “unlimited” (as far as end users care) network storage, end users don’t have reasons to manage drive space.

      I can’t speak for the non-Windows NOSs, but there are standard tools to manage this. Exchange will allow you to set maximum message sizes to limit the attachments. You can set max sizes on the personal storage space with group policies. You should also have upper management buy-in before implementing these. Otherwise you’ll just tick off The Powers That Be and possibly wind up on the street.

    • #3073282

      Because we are magicians

      by antuck ·

      In reply to Duplicate files.

      Seems the end users think we can magically fix their data loss just by pressing a couple of keys on the keyboard.

      I had my cousin call me last week from college. When he turned on his Compaq computer, it came up to the recovery screen. Of course he doesn’t know why, or what was on the screen, but he pressed a key, and after rebooting all of his data was gone. I was somehow supposed to magically bring his data back over the phone. Of course there were no backups, and it was homework he needed.

      After talking to him some more, he kept mentioning a drive mapped to the campus network. I asked him what was supposed to be on the mapped drive: his school work for this class. I asked him why he did not save everything on this mapped drive. You can probably guess the answer… “I don’t know. I wanted to have it saved to my hard drive so I knew where it was at.” I shook my head and explained the importance of this mapped drive. Although I’ll bet after this he still will not save everything to the network drive, and will call me again to perform my magical duties.

      • #3072857

        another doozy

        by tonythetiger ·

        In reply to Because we are magicians

        Check out this filename:

        J:\Traffic\Ross\US 35\US 35-Larrick’s Lane Interchange Justification Study\Signal Warrants\US 35 WB off ramp (prop)-Egypt Pk & PV Rd signal warrant analysis\COUNT-PV Rd WB West of Warden’s(used to project traffic on East leg of new intersection).pdf

        It is 249 characters long. Where it sits now, it’s not a problem: J: is just one directory off the root of a server drive. But something weird happens if the user moves this file to his W: drive (which is four directories from the root on another drive, making the total drive:\path\filename some 280 characters). He can see it, edit it, copy it, move it, etc. from his workstation, but if you go to the server console, you cannot find this file! Nor does it get backed up! No error messages either. I told them: 190 characters max.
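        One way to catch paths like this before they silently drop out of the backup is to walk the tree and flag anything over a threshold. A minimal Python sketch (the 190-character cutoff mirrors my rule of thumb above and is just an assumption, not a hard system limit; the classic hard limit on Windows is the 260-character MAX_PATH):

        ```python
        import os

        # 190 is the conservative house rule suggested above, well under
        # the classic Windows MAX_PATH limit of 260 characters.
        MAX_LEN = 190

        def find_long_paths(root, limit=MAX_LEN):
            """Yield (length, path) for every file whose full path exceeds limit."""
            for dirpath, _dirnames, filenames in os.walk(root):
                for name in filenames:
                    full = os.path.join(dirpath, name)
                    if len(full) > limit:
                        yield len(full), full

        # Usage (drive letter is illustrative):
        #   for length, path in sorted(find_long_paths(r"J:\Traffic"), reverse=True):
        #       print(length, path)
        ```

        Running it against the share before a move would have flagged that 249-character file as a problem waiting to happen.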

        I’ve made a pretty good dent in the duplicates, but it’s been literally weeks of work. We went from over 450,000 duplicate files to just under 200,000. These ranged from mistakes in creating the directory structure that were never cleaned up, to one case where a user was flat-out paranoid and copied everything… twice. Mostly small files though, as the removal only cleared about 60 gig.

        As to what I’m using: an old PC Magazine utility, Dupeless. But it takes forever to run; I started another run at about 8:15 a.m. and it’s still running at 2:12 p.m. 🙁
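        For what it’s worth, the basic idea behind duplicate finders like Dupeless can be sketched in a few lines of Python (this is a generic size-then-hash approach, not what Dupeless actually does internally): group files by size first, and only hash the same-size candidates, so most files never get read in full.

        ```python
        import hashlib
        import os
        from collections import defaultdict

        def find_duplicates(root):
            """Return {content_hash: [paths]} for files with identical contents.

            Files are bucketed by size first; only same-size candidates are
            hashed, which avoids reading most of the tree."""
            by_size = defaultdict(list)
            for dirpath, _dirnames, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        by_size[os.path.getsize(path)].append(path)
                    except OSError:
                        continue  # skip unreadable entries

            by_hash = defaultdict(list)
            for paths in by_size.values():
                if len(paths) < 2:
                    continue  # a unique size can't have a duplicate
                for path in paths:
                    digest = hashlib.sha256()
                    with open(path, "rb") as f:
                        for chunk in iter(lambda: f.read(1 << 20), b""):
                            digest.update(chunk)
                    by_hash[digest.hexdigest()].append(path)

            return {h: p for h, p in by_hash.items() if len(p) > 1}
        ```

        On a share full of small files, the size prefilter does most of the work; the slow part is still hashing the genuine candidates, which is why any tool doing this over hundreds of thousands of files takes hours.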
