How to improve a stored procedure / SQL statement across multiple sessions

By rodz
Hello, I am new to SQL Server. I've used Idera diagnostic software to evaluate the performance of my stored procedures and SQL statements, and I've found that as I increase the number of users connected to my server, some of my stored procedures run for more than 5000 ms, whereas in a normal situation with only one user they usually take under 20 ms. Even a simple SELECT statement can take 10000+ ms when the server is busy. Any idea why this happens? Or any suggestions on how I can improve it (if there's a way)? Note that I am using two databases: the first is used for testing (keeping the server busy), while the other is the one I profile for performance evaluation.


All Answers


This might help you on your issue...

Please post back if you have any more problems or questions.


Too many factors.

by Tony Hopkinson In reply to How to improve a storedpr ...

Locking would be my guess, but that's all it is. Can you set your database to read only, or perhaps use read uncommitted? That will tell you something.
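A quick way to test the locking theory would be something like this (a sketch only; the database and table names here are placeholders, and dirty reads should only be used as a diagnostic in a test environment):

```sql
-- Option 1: make the whole test database read only,
-- which removes lock contention entirely.
ALTER DATABASE MyTestDB SET READ_ONLY WITH ROLLBACK IMMEDIATE;

-- Option 2: allow dirty reads for a single session,
-- so SELECTs don't wait on other sessions' locks.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT * FROM dbo.SomeTable;   -- may return uncommitted data

-- Or per-table, with the NOLOCK hint:
SELECT * FROM dbo.SomeTable WITH (NOLOCK);
```

If the durations drop dramatically under load after either change, blocking is almost certainly the culprit.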

When you say you increase the number of users, are their connections persistent? That will eat up resources.

Where's the threshold? Run your tests with 100, 200 users, etc., and graph the results.

This is a useful site, and well named too.


Performance issue

by rodz In reply to Too many factors.

It seems some of my procedures are consuming too many resources. Can you advise me on what I should do to resolve this issue?

I've set the minimum duration filter to 5000 ms.

Here is my sample output trace, in case you want to see it:


Well it's definitely struggling

by Tony Hopkinson In reply to Performance issue

If I was looking at this directly, I'd be less concerned initially with the SPs and would start by concentrating on the SELECTs that are taking a long time, especially if I knew it was my SPs that were firing them off, or that they needed data from the same source.

Some things that stood out were: selecting from views with a bunch of strings in the WHERE clause.

I'd run them manually from Visual Studio with showplan on to see what impact building the view before querying it was having. Going to the underlying tables might give the optimiser a better chance, or, for the regular ones, creating a more specific view might help. You can also now build indexes on views, though that is a bit too smoke-and-mirrors for me.
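For example (a sketch; the view, table, and column names are illustrative, not from the trace):

```sql
-- See the estimated plan without executing the query:
SET SHOWPLAN_XML ON;
GO
SELECT Separator FROM dbo.vSomeView WHERE CurrencyCode = 'USD';
GO
SET SHOWPLAN_XML OFF;
GO

-- Or execute and capture actual I/O and timing figures:
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- An indexed view: SCHEMABINDING is required, and the unique
-- clustered index is what materialises the view's result set.
CREATE VIEW dbo.vCurrencyLookup
WITH SCHEMABINDING
AS
    SELECT CurrencyCode, Separator
    FROM dbo.CurrencySettings;
GO
CREATE UNIQUE CLUSTERED INDEX IX_vCurrencyLookup
    ON dbo.vCurrencyLookup (CurrencyCode);
```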

There are a lot of parameter-style lookups in there, like looking up separators for currency. Presumably this sort of information doesn't change too often.

A nice trick is to add another filegroup to the database, move the table into it, rebuild the indexes, and then make the filegroup read only. If there's a rare update to it, it's just a matter of setting read only off, doing your stuff, and putting it back on again.
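In T-SQL, that trick looks roughly like this (a sketch only; the database, filegroup, file path, and table names are all placeholders):

```sql
-- Create the new filegroup and add a data file to it:
ALTER DATABASE MyDb ADD FILEGROUP LookupFG;
ALTER DATABASE MyDb ADD FILE
    (NAME = LookupData, FILENAME = 'D:\Data\LookupData.ndf')
    TO FILEGROUP LookupFG;

-- Move the table by rebuilding its clustered index on the new filegroup:
CREATE UNIQUE CLUSTERED INDEX PK_CurrencySettings
    ON dbo.CurrencySettings (CurrencyCode)
    WITH (DROP_EXISTING = ON)
    ON LookupFG;

-- Mark the filegroup read only; reads from it no longer take locks:
ALTER DATABASE MyDb MODIFY FILEGROUP LookupFG READ_ONLY;

-- For the rare update, flip it back temporarily:
ALTER DATABASE MyDb MODIFY FILEGROUP LookupFG READ_WRITE;
-- ...apply the changes...
ALTER DATABASE MyDb MODIFY FILEGROUP LookupFG READ_ONLY;
```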

Can't really tell from the log, but I do wonder if things are getting in each other's way: the duration is long but the CPU time is short, which suggests to me waiting for locks to be freed. Read-only filegroups, or even shifting fixed stuff into a read-only database, remove any locking requirement.
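One way to confirm the long-duration/short-CPU theory is to look at the server's wait statistics while the test load is running (a sketch using the dynamic management views available from SQL Server 2005 onwards):

```sql
-- Sessions currently blocked, and which session is blocking them:
SELECT session_id, blocking_session_id, wait_type, wait_time, wait_resource
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;

-- Cumulative wait statistics since the last restart; a large share of
-- LCK_M_* wait time points to lock contention:
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type LIKE 'LCK_M_%'
ORDER BY wait_time_ms DESC;
```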

Hope I've been some use. Looks like you have a fair job on your hands, but this could be down to just a few, or even just one, table that gets used a lot.

Attack the biggest one or the most used one, get an improvement, then analyse again; that's the way to go. Avoid anything that looks like rewriting the entire thing from the ground up, though, until extreme desperation kicks in anyway.
