
General discussion


Access too slow over WAN

By jfejsa ·
Our organisation has a large system comprising 8 large linked Access XP databases (Human Resources, Grants, Projects, Publications, Staff Development and Staff Appraisals, to name a few). The whole system works great; that is, it worked great until recently. In the last two years we merged with two other units (one last year and another this year), which made our current organisation reasonably large, both in staff numbers and in area. We are now spread over an area of about 132,845 square kilometres, and we have found that Access databases do not travel well across the WAN; as a matter of fact, they are unacceptably slow.

We are currently looking at converting our entire Access database system to a system that's most suitable for wide area distribution.

Most front-ends in the current system are linked to at least three to eight back-end databases in the system (using one or more tables from each linked back-end database). For example, the Human Resources database, a critical database, is linked to every other database in the system.

DAO is used throughout, with a large number of action and select queries with multiple parameters (most using dynamic IN and LIKE), and most queries are created at runtime depending on user selection, so we may find it difficult to convert them to SQL views. The same applies to most reports; about 90% are generated in code. Quite a lot of our output is generated using Automation (outputting to Word/Excel through code).

We need to develop a system that will be acceptably fast over the network and/or over the Internet. What do you suggest?

1) What back-end do you suggest and why (e.g., SQL Server, Lotus Domino, etc.)?
2) What front-end do you suggest and why (e.g., Visual .NET, Access ADP, HTML using ColdFusion server, Crystal Reports for reports, etc.)?
3) Are there any tools that will successfully convert Access MDB files with code and a large number of tables linked from a number of databases?
4) Are there any pitfalls to look out for (when developing, converting, etc.)?
5) What about existing data (most databases in the system use Autonumber as the primary key; how will that come across to the converted system)?
6) Any other issues/questions/tips?

It looks like a huge job and I would love to hear suggestions from anyone who went through the same process. As a matter of fact, I would appreciate suggestions from all the professionals in the forum. Any hints/tips are sought and appreciated.

Thank you



All Comments


For starters...

by illilli In reply to Access too slow over WAN

So that you won't feel lonely, I'll get the discussion started...

1. You seem to already be in a Microsoft environment, so I recommend you stay with it. Use SQL Server.

2. Wow, here we have a LOT of choices. We usually do one of two things: a) keep our front-ends in Access, or b) create web-based front-ends in ASP or .NET. One reason for keeping the Access front-end is that you won't have to re-code a lot of stuff. We have used several tricks to optimize the Access/SQL Server relationship. Also, there are some cute things you can do with Access that are difficult (or "different" might be a better word) to recreate with other tools. My co-worker swears by the ColdFusion/Crystal Reports front-end angle, by the way. I guess it comes down to what you have a skill base in.

3. We recently did a conversion and used Microsoft Data Transformation Services coupled with some Access scripting and exporting.

4-5. Many. We changed our schema to combine several de-normalized tables. We had to combine them with a new autonumber, but keep our old relationships. It was tricky, but you just have to work through it carefully. Another problem arises if you use cross-tab queries; you'll need to write stored procedures to simulate those in SQL.

6. Even if you decide to use another front end, you could do the project in two stages where the first stage is to migrate the back-end to SQL and the second stage is to re-write the front end using another solution.

I don't have all the details for doing this type of conversion, but I know we have done it successfully several times. I hope that helps.
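To make the cross-tab point concrete: an Access crosstab (TRANSFORM ... PIVOT) can usually be simulated with conditional aggregation, which T-SQL also supports (T-SQL additionally has a PIVOT operator). A minimal sketch, using SQLite from Python purely to show the pattern; the table and data are made up:

```python
import sqlite3

# Hypothetical sample data standing in for an Access crosstab source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE grants (dept TEXT, yr INTEGER, amount REAL)")
conn.executemany("INSERT INTO grants VALUES (?, ?, ?)", [
    ("HR", 2003, 100.0), ("HR", 2004, 150.0),
    ("Projects", 2003, 200.0), ("Projects", 2004, 250.0),
])

# TRANSFORM Sum(amount) ... PIVOT yr in Access becomes conditional
# aggregation: one SUM(CASE ...) column per pivoted value.
rows = conn.execute("""
    SELECT dept,
           SUM(CASE WHEN yr = 2003 THEN amount ELSE 0 END) AS y2003,
           SUM(CASE WHEN yr = 2004 THEN amount ELSE 0 END) AS y2004
    FROM grants
    GROUP BY dept
    ORDER BY dept
""").fetchall()
print(rows)  # [('HR', 100.0, 150.0), ('Projects', 200.0, 250.0)]
```

A stored procedure wrapping a query like this is one way to replace each crosstab on the server side.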


Thanks for your feedback

by jfejsa In reply to For starters...

Thanks for your feedback, illilli; I appreciate your input. I will just have to look at all the options and pick the best one to pursue.

Firstly, my whinge: it's typical, just when everything gels and works great the government steps in and comes up with different requirements. If the government keeps up with its standards we'll be de-merging within five years.

Secondly, for anyone else reading, I don't think I made myself clear on a few points in my original post.

1) Number of users and distance.
Our unit only recently merged with units spread over a large geographical area. Even now, we use the entire system in the main office only, both for data entry and reporting. Most reports can be converted to Word doc/RTF or PDF files and emailed to other branches as required. Management now wants all the other branches (spread over 132,845 square kilometres) to have access to our system, so a couple of weeks ago we decided to test the system by asking a person from one of the other branches to run one of our small databases from their location. During the test, Access over the WAN was unacceptably slow. There was only one user trying to use the system, from Tamworth, which is about 280 km from the Newcastle main office. So the number of users wasn't a factor (only one user), the distance wasn't as great as it could be further north, and the tested database wasn't the largest or the most complex database in the system.

2) Potential loss of data
We are looking at developing a system that will work within the constraints imposed on us by WAN speed and that will work with our existing data. If we can't work out how to use our existing data we could potentially lose 8 years of data, or have to employ an admin to spend months re-entering the current data into the new system.

NOTE: Our WAN is reasonably fast; the problem seems to be with the Access database engine. Other systems we have in place (e.g., Oracle) seem to be reasonably fast across the WAN.

3) Loss of tested applications
Most programming has been fully tested and optimised. We really don't want to lose it if possible, or at least not lose all of it.

4) A number of queries and tables are dynamically created as needed, e.g., to filter records for reports. A number of our applications will have to be rewritten if we lose that functionality. Could SQL views be dynamically created in code as required, i.e., similar to QueryDefs or TableDefs in Access?

5) If we decide to use the Internet as our delivery mechanism, how easy/hard is it to use Office Automation in VBS/ColdFusion/JavaScript? A number of our reports are generated as Word documents in code, depending on various parameters selected by the user (could be up to 60 different parameters). Rather than having parameter queries with up to 60 criteria variations, we usually create queries in code as required. How hard would it be to duplicate this using the other options?
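On point 4: in a client-server setup, the usual replacement for runtime QueryDefs is building a parameterised SQL string in code (views can likewise be created from code by executing a CREATE VIEW statement, though the details vary by server). A rough sketch of dynamic IN/LIKE filter construction, using SQLite from Python purely to illustrate; the table and filter names are invented:

```python
import sqlite3

# Hypothetical staff table standing in for one of the linked back-ends.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staff (id INTEGER PRIMARY KEY, name TEXT, branch TEXT)")
conn.executemany("INSERT INTO staff (name, branch) VALUES (?, ?)", [
    ("Alice", "Newcastle"), ("Bob", "Tamworth"), ("Carol", "Newcastle"),
])

def build_filter(filters):
    """Build a parameterised WHERE clause from user selections at runtime,
    the client-server analogue of creating a QueryDef on the fly."""
    clauses, params = [], []
    if "branches" in filters:                      # dynamic IN
        marks = ", ".join("?" for _ in filters["branches"])
        clauses.append(f"branch IN ({marks})")
        params.extend(filters["branches"])
    if "name_like" in filters:                     # dynamic LIKE
        clauses.append("name LIKE ?")
        params.append(filters["name_like"])
    where = " AND ".join(clauses) or "1=1"
    return f"SELECT name FROM staff WHERE {where} ORDER BY name", params

sql, params = build_filter({"branches": ["Newcastle"], "name_like": "A%"})
result = [r[0] for r in conn.execute(sql, params)]
print(result)  # ['Alice']
```

Only the column values go in as parameters; the shape of the query is still decided in code, much as it is now with QueryDefs.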

There are so many questions, but at this stage I just need to get some ideas about which way to point our investigation.

Thanks for all your suggestions so far.


List of Helpful Links

by illilli In reply to Thanks for your feedback

Since I am not an expert on anything really, I rely on the wisdom of others. I was curious about the problem with autonumbers converting over to SQL and ran a Google query for "sql server 2000 autonumber".

I think you might find some of the links particularly good reading.


Where do I start

by Tony Hopkinson In reply to Access too slow over WAN

You may as well stick with MS, so SQL Server. Anything else would multiply the problems you are going to encounter enormously.
Step one is to get your data across, and DTS is the tool for that.
The DTS tool you get with the SQL Server client will cope with the migration admirably, or you can script it.
Autonumbers (IDENTITY in MSSQL) will be a bit of fun. How much depends on whether, and how much, you are using them to relate records.

Multiple databases will not cause you problems as such; you can keep them separate or merge them. Obviously, for a merge, you need to identify any common data and keep your DTS packages/scripts from trampling other data sources.

Connectivity is unlikely to be a problem.

If all accesses are from members of a domain, use the Windows-only security model in MSSQL. You can also use mixed or even SQL Server-only authentication, but avoid those if you can. If you have many accounts that are not part of a domain you control, it will be easier to use mixed than to attempt to administer that.

All your front-ends should work with MSSQL as they are, with a bit of twiddling. The connection string will have to change, but that's about it.

How many more problems you get depends on how your system(s) are written.

We have just done a port from Paradox to MS SQL. The largest problems were not data or connectivity, but optimisation, transactions and locking. Desktop databases don't tend to bother with this sort of thing; MSSQL/ADO will by default. How big a problem you get will depend on the functionality and the current designs.

If you are allowed, use stored procedures where you can. Transaction isolation will have a big impact on a multi-user desktop application. If you have a lot of data-crunching logic client side, particularly SELECT * FROM table and then Locate/find records, you will get a serious performance drop. In one case in the recent project it took 4 seconds to open a table like this in ADO/MSSQL, milliseconds in Paradox. Switching to SELECT fld1, fld3 FROM table WHERE fld1 = ? is also milliseconds, but that will have an enormous impact on your code base.
Lots of luck; get a pilot project going as quickly as you can. When you get to specifics, if you get stuck I may be able to point you in some possible directions.
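To illustrate the two access patterns above (SQLite from Python stands in for ADO/MSSQL here; the table and data are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (fld1 INTEGER, fld2 TEXT, fld3 TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?, ?)",
                 [(i, "pad" * 10, f"row{i}") for i in range(1000)])

# Pattern 1: drag the whole table to the client, then Locate/find.
# Over a WAN, every row crosses the wire before the filter ever runs.
all_rows = conn.execute("SELECT * FROM t").fetchall()
hit_client = [r[2] for r in all_rows if r[0] == 42]

# Pattern 2: let the server filter and project; only the matching
# row and the needed column cross the wire.
hit_server = [r[0] for r in
              conn.execute("SELECT fld3 FROM t WHERE fld1 = ?", (42,))]

print(hit_client == hit_server == ["row42"])  # True
```

Same answer either way; the difference is how much data moves, which is exactly what the WAN punishes.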

As a PS: if you haven't already, get some tests set up; it's amazing how a simple change to your app can wipe it out.
Look at specifying and writing a noddy for data verification: it connects to the old and new databases, compares structure and content, and reports any funnies. It comes into its own for testing.
Start both databases 'equal', run a function in the old and new systems, then compare again. A bit of work on this front can save you a bucketload of time debugging code that doesn't crash but does fail.
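A minimal sketch of such a verification noddy, assuming both databases can be read from one script (SQLite from Python here; a real version would connect to Jet and MSSQL, and the table is made up):

```python
import sqlite3

def snapshot(conn):
    """Capture the structure and contents of every user table."""
    names = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
    snap = {}
    for t in names:
        cols = [c[1] for c in conn.execute(f"PRAGMA table_info({t})")]
        rows = conn.execute(f"SELECT * FROM {t} ORDER BY 1").fetchall()
        snap[t] = (cols, rows)
    return snap

def compare(old, new):
    """Report any structural or content 'funnies' between two databases."""
    a, b = snapshot(old), snapshot(new)
    diffs = []
    for t in sorted(set(a) | set(b)):
        if t not in a:
            diffs.append(f"{t}: only in new")
        elif t not in b:
            diffs.append(f"{t}: only in old")
        elif a[t][0] != b[t][0]:
            diffs.append(f"{t}: columns differ")
        elif a[t][1] != b[t][1]:
            diffs.append(f"{t}: rows differ")
    return diffs

old, new = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for c in (old, new):
    c.execute("CREATE TABLE staff (id INTEGER PRIMARY KEY, name TEXT)")
    c.execute("INSERT INTO staff VALUES (1, 'Alice')")
same = compare(old, new)            # both databases start 'equal'
new.execute("UPDATE staff SET name = 'Bob'")
changed = compare(old, new)         # a change that doesn't crash but does differ
print(same, changed)  # [] ['staff: rows differ']
```

Run it after each function in the old and new systems, as described above, and silent divergence shows up immediately.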


More Suggestions

by teebes2004 In reply to Where do I start

Regarding the use of MSSQL DTS and your autonumber/identity fields: there is a check-box to turn on "identity insert" so that you should not lose your links when converting data.
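To show why identity insert matters: explicitly inserting the old key values preserves the links that child tables rely on. A sketch using SQLite from Python, where an INTEGER PRIMARY KEY accepts explicit values much as SET IDENTITY_INSERT ... ON allows in T-SQL; the tables and data are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staff (staff_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE appraisal (appr_id INTEGER PRIMARY KEY,
                            staff_id INTEGER REFERENCES staff(staff_id));
""")

# Old Access autonumber values, including gaps from deleted records.
old_staff = [(7, "Alice"), (12, "Bob")]
old_appraisals = [(1, 7), (2, 12)]

# Inserting the key values explicitly (the equivalent of ticking
# "enable identity insert" in DTS) preserves the numbers that the
# child table's foreign keys refer to.
conn.executemany("INSERT INTO staff (staff_id, name) VALUES (?, ?)", old_staff)
conn.executemany("INSERT INTO appraisal VALUES (?, ?)", old_appraisals)

# The join still lines up because the original key values survived.
rows = conn.execute("""
    SELECT a.appr_id, s.name FROM appraisal a
    JOIN staff s ON s.staff_id = a.staff_id ORDER BY a.appr_id
""").fetchall()
print(rows)  # [(1, 'Alice'), (2, 'Bob')]
```

Let the server generate new identities instead, and every child-table link would have to be renumbered to match.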

I agree with the previous comments about using stored procedures whenever possible. The easiest way to do this is to always use queries to manipulate your data and load each of these queries into a stored procedure. Then execute the stored procedure, along with any input parameters, using the Execute method of a Connection object.

A few years ago I had a similar problem using Access 97 for both the front-end and back-end of an app. On a hunch, I made a cut-down of the back-end database to fit onto a floppy disk. I noted the time it took to copy the DB from the floppy. I then opened several instances of the app linked to the floppy drive. Each instance took the same time to load as copying the database from the floppy drive. The implication is that each of your users is dragging their own copy of the database across the network.

I hope this helps. Good luck on your project.


That's the chap

by Tony Hopkinson In reply to More Suggestions

Use an ADO table-type recordset or SELECT * FROM table and the lot will be copied into memory. Of course, if the MDB file is local and it is a desktop application, i.e., single-user, it's not a problem. Try to scale a solution like that, though, and you end up in this situation.
It gets even worse if you have gone big-time into RAD and data-aware controls, optimising to use global database components throughout your forms. You only need a year of 'quick' extra features with no refactoring and you end up with a tangled mess that's a right pain to unravel.



by abhurke In reply to Access too slow over WAN

Dear Sir,
Can you tell me how you implemented MS Access over a WAN?
Please give me some idea, as I want to implement such a project using VB 6.0 as the front-end and MS Access as the back-end over a WAN.
My email id:-
