Web Development

General discussion


Folders with Tens of Thousands of Files

By bagadonuts
A 32-bit Windows Visual Basic application creates and stores large numbers of graphics files (containing signatures captured with signature pads) in a single folder over time, on the order of 20,000 - 100,000 files in 3 months. The application does not use a database; everything is stored in flat files.

The application has to create new signature files, then access and display them. Multiple users on 3-10 workstations will be creating and accessing these files concurrently. Each file has a unique name and is never overwritten. All of them go into a single folder that already contains well over 500 files.

Each quarter the process starts over, creating another 20,000 - 100,000 new files. The previous quarter's signatures are moved to a history folder, where they continue to be stored and accessed. All files must be retained for 7+ years.

The main question relates to concern about throughput issues:

Will it take longer for the application to create new files as the number of files in the folder increases, or will it take the same time no matter how many files are already there?

Will it take longer to look files up and display them as the number of files increases and the history grows?

Please explain why, and offer any suggestions for preventing throughput problems.


All Comments

by dryflies In reply to Folders with Tens of Thou ...

Use a database to store the files and avoid this problem entirely. I can't say for certain, but I think a folder with 100K files is going to have issues.

by bagadonuts In reply to

Poster rated this answer. Thanks for the response. This answer restates my concern, but I was looking for a more detailed/specific answer. Thanks.

by reitzen In reply to Folders with Tens of Thou ...

Creating the files will not take much time at all for your application. The Windows FSO will take care of where to place the file.

Finding the file using the exact path/name will not take much time either. If you have to search for the file using wild cards or display a list of files, well, just go get a cup of java.

We have a file server (document imaging) that has over 2.5 million images. Scanned docs are imaged and written to the appropriate folder at a rate of 30 - 50 pages a minute. Retrieving a file and displaying it takes virtually no time.

Use fully qualified path/names when writing/reading and you should have no problems.
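If directory size does ever become a concern, a common mitigation is to bucket files into subfolders derived from the file name, so no single folder grows without bound. The original application is VB, but here is a minimal Python sketch of the idea; the function names and the two-level, two-character bucket layout are illustrative assumptions, not part of the poster's app:

```python
import hashlib
import os

def bucket_path(root, filename, depth=2, width=2):
    """Derive a stable subfolder path from the file name so files
    spread evenly across many small directories instead of one huge
    one (e.g. root/ab/cd/sig_000123.bmp)."""
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    parts = [digest[i * width:(i + 1) * width] for i in range(depth)]
    return os.path.join(root, *parts, filename)

def store_signature(root, filename, data):
    """Write a signature file into its bucket, creating folders as needed."""
    path = bucket_path(root, filename)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return path
```

Because the bucket is computed from the name alone, lookups call the same function and cost the same no matter how many files exist in total, and no single directory ever holds more than a fraction of the files.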

by Dr Dij In reply to Folders with Tens of Thou ...

Even if the files themselves are fine, Windows Explorer can fail to display a folder that large. We had this problem: we had to exit to DOS, do a dir > file, and read that file back to delete/move the files.
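That "list to a file, then act on the list" workaround can also be scripted rather than done by hand. A short Python sketch (folder and file names here are hypothetical) that mirrors it: `os.scandir` streams directory entries one at a time instead of building the full listing up front the way Explorer does, and the second pass works from the saved listing:

```python
import os
import shutil

def snapshot_listing(src, listing_path):
    """Equivalent of 'dir > file': write one file name per line,
    streaming entries so a 100K-file folder is never held in memory."""
    with os.scandir(src) as entries, open(listing_path, "w") as out:
        for entry in entries:
            if entry.is_file():
                out.write(entry.name + "\n")

def move_listed(src, listing_path, dest):
    """Read the listing back and move each named file into dest."""
    os.makedirs(dest, exist_ok=True)
    moved = 0
    with open(listing_path) as listing:
        for name in listing:
            name = name.strip()
            src_path = os.path.join(src, name)
            if name and os.path.exists(src_path):
                shutil.move(src_path, os.path.join(dest, name))
                moved += 1
    return moved
```

Splitting the job into snapshot-then-move also avoids modifying the directory while it is being enumerated, which can cause entries to be skipped.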

by ljohnson In reply to Folders with Tens of Thou ...

I agree with No. 1. A redesign to incorporate a database would be the most efficient and logical way to go. The database approach is cleaner, faster, easier, and can be better managed for future growth. Just my humble opinion.
