General discussion

Locked

VB6 inefficiency using INPUT

By grbarker
I have a VB6 program that reads fixed-length records in Binary mode using the Input() function with a length argument. The program then performs some logic and writes out the input records, plus occasional additional generated records, using the Print # statement.

On a test with a small file of 600,000 records, throughput is over 5,000 records per second with simulated reading + logic and no output. In another test, throughput is over 4,000 records per second with simulated reading + logic + all output. I found my performance problem in a third test, where throughput drops to 238 records per second with actual reading + logic and no output. There is definitely a performance issue with the Input() function in VB6.

I have gained some efficiency by performing I/O in 32K chunks and refreshing my progress window less frequently, but I still need a different method of reading my input file. The file contains binary data as well as commas, and it has no length identifiers, so I think FileSystemObject methods and any other TextStream or line-based methods are out. The programs that later consume the output file do not support database operations, so that option is out too.
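For reference, the slow pattern described above might look something like this (the record length, file name, and loop details are illustrative assumptions, not the poster's actual code):

```vb
' Slow pattern: one Input() call per fixed-length record on a Binary-mode file.
' RECLEN and the file name are illustrative assumptions.
Const RECLEN As Long = 100

Dim rec As String
Dim f As Integer

f = FreeFile
Open "C:\data\input.dat" For Binary Access Read As #f

Do While Not EOF(f)
    rec = Input(RECLEN, #f)   ' one Input() call per record -- the bottleneck
    ' ... record logic here ...
Loop

Close #f
```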

Short of writing a high-performance read subroutine in C or COBOL, are there any methods of reading a variable number of bytes (around 32K) from a disk file within VB?

This conversation is currently closed to new comments.

1 total post (Page 1 of 1)  

All Comments


This works better

by grbarker In reply to VB6 inefficiency using INPUT

Never mind, folks. I found that by opening my input file For Binary, with no length and no record structure associated, and sizing my receiving variable to the number of bytes to be read, I could use the Get statement (file number, starting byte position, receiving variable) to read the file. Performance was boosted from 238 records per second to 3,000 records per second for combined read, array logic, writing, and reporting operations.

Another performance consideration: I read and write as many records as will fit in 32K, reducing physical I/O operations by deblocking input buffers and blocking output buffers in memory with the Mid$ function. Reading and writing I/O in multiple-record blocks yielded a 40% performance gain. I hope that info helps someone out there.
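A minimal sketch of the Get + Mid$ approach described above (the record length, file name, and buffer sizing are illustrative assumptions):

```vb
' Faster pattern: read ~32K at a time with Get, then deblock fixed-length
' records in memory with Mid$. RECLEN and the file name are assumptions.
Const RECLEN As Long = 100
Const RECSPERBLOCK As Long = 32768 \ RECLEN   ' whole records per ~32K block

Dim buf As String
Dim rec As String
Dim f As Integer
Dim pos As Long       ' byte position in the file (1-based)
Dim i As Long

f = FreeFile
Open "C:\data\input.dat" For Binary Access Read As #f

pos = 1
Do While pos <= LOF(f)
    ' Size the receiving string first; in Binary mode, Get reads
    ' exactly Len(buf) bytes into a fixed-size String.
    buf = String$(RECSPERBLOCK * RECLEN, vbNullChar)
    If pos + Len(buf) - 1 > LOF(f) Then
        buf = String$(LOF(f) - pos + 1, vbNullChar)   ' short final block
    End If
    Get #f, pos, buf
    pos = pos + Len(buf)

    ' Deblock: peel fixed-length records out of the in-memory buffer.
    For i = 1 To Len(buf) \ RECLEN
        rec = Mid$(buf, (i - 1) * RECLEN + 1, RECLEN)
        ' ... record logic, then block records into an output buffer ...
    Next i
Loop

Close #f
```

The key design point is that disk I/O happens once per ~32K block rather than once per record; Mid$ then does the per-record work entirely in memory, which is where the reported 40% blocking/deblocking gain comes from.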
