
Will connecting MS Access kill my SQL Server?


r_nichol
I am working as a DBA and have been told that some customers who use Access need to connect to some tables in a SQL Server database (some of these tables containing 1 million+ rows).

My concern is: does a query written in Access use the SQL Server engine, or does it just do a full table scan, pull all the data, and then discard what it does not need? Will this connection put a load on my server and possibly screw up my database? Note - they only need READ ONLY access.

Really appreciate any help on this.
    Tony Hopkinson

    Queries are passed through to SQL Server.
    BUT
    You will have no control over what queries they run,
    so numpty queries like Cartesian products of your eight biggest tables are a possibility.

    Seeing as this sort of thing is usually for ad hoc 'historical' analysis, I prefer to create a copy of the DB and make it readonly if possible. Then it's simply a question of how often you synchronise.

    If it's your live database, one misjudged query will bring your system to its knees, and the last place you'll look for the cause is a graduate marketing type with Access.

    alaniane

    You can create views of the tables and give them access to the views. You can make a view readonly, and you can also restrict the columns and even the rows that are accessed to only what the customer needs. If they don't need live access to your data, then you may just want to send the data to them in XML format and let them import it into Access. That way they won't be able to mess up your database. Of course, I would encrypt the file if it contains sensitive data.

    If you do connect them directly to your database, be aware of what permissions you give them, because Access can update and delete records from SQL Server.
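
    A minimal sketch of the view-plus-permissions approach. All table, column, and role names here are made up for illustration:

    ```sql
    -- Hypothetical example: expose only the columns and rows the customer needs.
    CREATE VIEW dbo.CustomerOrders_ReadOnly
    AS
    SELECT OrderID, OrderDate, Amount
    FROM dbo.Orders
    WHERE Region = 'West';
    GO

    -- Give the Access users a role with SELECT on the view and nothing else.
    GRANT SELECT ON dbo.CustomerOrders_ReadOnly TO AccessReaders;
    DENY INSERT, UPDATE, DELETE ON SCHEMA::dbo TO AccessReaders;
    ```

    Since the role never gets INSERT/UPDATE/DELETE, even a linked table in Access cannot modify anything, whatever the users try.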

    Dhimant Trivedi

    First of all, no matter what kind of connection it is, there will be load on the server. What you can do to minimize it is to separate out the load. My suggestions would be:

    1. You can create a separate database for that particular task and then synchronize the data. This way your live data will not be affected by outside transactions. Remember, synchronization itself can put some load on your server too, but it should not be more than when somebody requests the data from outside and the data is returned.

    2. You can create just the necessary tables in that database, and since you've said the tables are going to be READ ONLY, you can design the indexes so that every time data is requested the chances are higher that the server will use the indexes rather than table scans. Remember, too many or poorly designed indexes can hurt you as well.
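
    As a sketch of point 2, assuming a hypothetical reporting table that the Access queries mostly filter by date:

    ```sql
    -- Hypothetical reporting copy: index the columns the queries filter on.
    CREATE INDEX IX_Sales_OrderDate
        ON dbo.Sales (OrderDate)
        INCLUDE (CustomerID, Amount);  -- covering index: no key lookups needed
    ```

    On a read-only copy you can be generous with indexes, since there are no writes for the extra indexes to slow down.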

    Hope this will help.
    Dhimant Trivedi

    mcintyak

    If you use pass-through queries and not linked tables, then SQL Server will do all the work. Don't run Access queries on linked tables unless you want the lights to dim.

    christian_mercure

    ODBC is not as bad as people think. I have used Access queries on million-record tables with acceptable (if not blazing) performance. However, it is important to write the query in such a way that ODBC will be able to analyze it and have SQL Server do most of the work.

    Queries doing SUM(), COUNT() etc. usually run fast. A query with many joins that pulls lots of rows will evidently be slower. SQL Server is quite capable of returning 100,000 rows in a short time, but it's your Access front-end that will have a hard time receiving all that data and posting it on a datasheet or multi-row form. Another killer is using a VBA function within the query. If ODBC is not able to optimize the query, it will have SQL Server do as much of the work as possible and then pass all the received rows to Access for further processing. If you pass a field value to a VBA function, it will mostly be quite slow if ODBC gets back 1 million rows.

    If the users are good enough with pure SQL, then go with pass-through queries (much faster).

    On the other hand, nothing prevents a user from opening the attached table and pressing CTRL+End to jump to the end of the data (causing Access to pump down 2 million records, not a good thing to do).

    To prevent long-running queries, make sure the server is configured to use the query governor, with the limit set at 300 or 600. SQL Server will refuse to start any query whose estimated cost exceeds that value.
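
    A sketch of enabling the query governor server-wide; note the limit is in estimated cost units (roughly seconds on a reference machine), so it blocks expensive queries before they start rather than killing them mid-run:

    ```sql
    -- Enable advanced options, then cap estimated query cost at 300.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'query governor cost limit', 300;
    RECONFIGURE;
    ```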
