Folders with Tens of Thousands of Files
A 32-bit Windows Visual Basic application creates and stores large numbers of graphics files (containing signatures captured with signature pads) in a single folder over time — say 20,000–100,000 files in 3 months. It is a non-database application that uses flat files.
The application will have to create new signature files, access them and display them. There will be multiple users on 3-10 workstations creating new signature files and accessing these files concurrently. Each file must have a unique name, and will never be overwritten. These files will all be stored in the same folder that already contains well over 500 files.
Each quarter, the process starts over: a fresh folder begins nearly empty and again accumulates 20,000–100,000 new files. The previous quarter's signatures are moved to a history folder, where they must still be accessible. All files must be retained for 7+ years.
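For illustration only, here is a minimal sketch (in Python, since the VB application's code is not shown) of the layout described above: collision-free unique filenames inside a per-quarter folder, with older quarters naturally becoming the history. The folder-naming scheme, the `.bmp` extension, and all function names are assumptions, not the application's actual design:

```python
import os
import uuid
from datetime import date

def quarter_folder(root: str, d: date) -> str:
    """Return (and create if needed) a per-quarter folder such as root/2024Q1."""
    q = (d.month - 1) // 3 + 1
    path = os.path.join(root, f"{d.year}Q{q}")
    os.makedirs(path, exist_ok=True)
    return path

def new_signature_path(root: str, d: date) -> str:
    """Build a unique filename that is never reused, so no file is overwritten."""
    return os.path.join(quarter_folder(root, d), f"sig_{uuid.uuid4().hex}.bmp")

path = new_signature_path("signatures", date(2024, 2, 15))
# path is something like signatures/2024Q1/sig_3f9c...bmp
```

A scheme like this also sidesteps cross-workstation name clashes, since a random UUID (rather than a shared counter) guarantees uniqueness without coordination between the 3-10 concurrent users.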
The main question concerns potential throughput issues:
When running the application, will it take longer to create new files as the number of files in the folder increases, or will creation time stay roughly constant regardless of how many files the folder already holds?
Will it take longer to look files up and display them as the number of files increases and the history grows?
Please explain why, and suggest ways to prevent throughput problems, if any.