If you are developing a database application with SQL Server, you must take time to learn how SQL Server works (especially SQL-DMO and DTS). Many techniques and solutions are available, and there is plenty of discussion about best practices. These practices look good on paper, but how to implement or design them is often unclear. The hard work of getting the database server designed and implemented correctly will pay off in the future, so let's look at a few best practices for your SQL Server environment to set you on the right track.

Built for speed
Using SQL Server views is a good idea, but there are some serious drawbacks. For the most part, views are slower than inline SQL statements, because SQL Server must retrieve the view definition and compile it for every call. If you have the Enterprise Edition of SQL Server 2000, you can add indexes to greatly improve view performance; however, the more common Standard Edition doesn't support indexed views. If performance is not an issue, though, views are still a good way to aggregate data and control permissions.
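As a sketch, here is what an indexed view looks like on the Enterprise Edition; the table, view, and column names are illustrative, not from a real schema:

```sql
-- A view must be created WITH SCHEMABINDING before it can be indexed.
CREATE VIEW dbo.vw_OrderTotals
WITH SCHEMABINDING
AS
SELECT   CustomerID,
         COUNT_BIG(*)    AS OrderCount,   -- COUNT_BIG is required in indexed views
         SUM(OrderTotal) AS TotalSpent
FROM     dbo.Orders
GROUP BY CustomerID
GO

-- The unique clustered index is what materializes the view's result set.
CREATE UNIQUE CLUSTERED INDEX IX_vw_OrderTotals
    ON dbo.vw_OrderTotals (CustomerID)
GO
```

Once the clustered index exists, SQL Server stores the view's results like a table instead of recompiling and re-executing the definition on every call.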

If database tables are constantly updated, use the NOLOCK hint when querying them. The hint lets a query read without taking or waiting on shared locks, at the cost of possibly reading uncommitted (dirty) data. You may not see a difference if the tables aren't being updated, but if they are, you'll see a noticeable improvement in both your update commands and your queries.
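A minimal example of the hint in use (table and column names are illustrative):

```sql
-- Read from a heavily updated table without blocking on, or taking,
-- shared locks. NOLOCK permits dirty reads, so use it only where
-- slightly stale or uncommitted data is acceptable (reports, lookups).
SELECT OrderID, Status
FROM   dbo.Orders WITH (NOLOCK)
WHERE  CustomerID = 42
```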

Stored procedures are the recommended way of querying SQL Server and usually offer very good performance. But as with any programming language, care must be taken so the code isn't sloppy or designed so poorly that performance and maintainability are forsaken. Writing a stored procedure that builds its statements dynamically and runs them through sp_executesql makes little sense and should be avoided whenever possible. If a stored procedure calls sp_executesql, rethink the design to eliminate the call and save SQL Server the time of compiling the statement on every execution. For example, using conditions to execute different static statements instead of building a statement dynamically is often a better option.
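The conditional approach can be sketched like this; the procedure, table, and parameter names are made up for illustration:

```sql
-- Branch to static statements instead of concatenating a string and
-- passing it to sp_executesql. Each static statement is compiled and
-- cached with the procedure's plan.
CREATE PROCEDURE dbo.usp_GetOrders
    @CustomerID int = NULL
AS
IF @CustomerID IS NULL
    SELECT OrderID, OrderDate
    FROM   dbo.Orders
ELSE
    SELECT OrderID, OrderDate
    FROM   dbo.Orders
    WHERE  CustomerID = @CustomerID
GO
```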

Another issue always associated with stored procedures is the use of server-side cursors. Opening a server-side cursor is similar to opening an ADO record set to perform row-based manipulation of the data. In theory (and in practice), almost everything a cursor does row by row can be done with set-based processing.

Set-based processing means using advanced SELECT statements and Transact-SQL (Microsoft's SQL dialect) to perform data manipulation. A very simple example is running UPDATE MyTable SET a = a + 1 instead of looping through the table, adding 1 to the value, and updating each row individually. A more complex example would be using the CASE expression or other Transact-SQL (T-SQL) functions. Transact-SQL contains a lot of functionality that many programmers don't take the time to learn. Not only can Transact-SQL make your program logic cleaner, it will speed up your application considerably.
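Both examples above can be sketched as single statements; the table and column names are illustrative:

```sql
-- Simple case: increment a column across every row in one statement,
-- replacing a cursor loop entirely.
UPDATE dbo.Inventory
SET    QtyOnHand = QtyOnHand + 1

-- More complex case: apply per-row logic with a CASE expression,
-- still as one set-based statement.
UPDATE dbo.Inventory
SET    ReorderFlag = CASE WHEN QtyOnHand < ReorderLevel THEN 1
                          ELSE 0
                     END
```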

Developing with scripts
Until this year, my database projects revolved around physically changing the database and then creating backups and backup scripts. Earlier in the year, I decided to make most database modifications using SQL scripts that can be saved in SourceSafe. This gives me an audit trail and a history of what was worked on within a project and who did it. Also, if the changes need to be applied to different servers, I don't have to remember which changes were made; at any time, I can check the scripts out of SourceSafe and run them against the database.
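A change script written this way should be safe to re-run against any server. A minimal sketch, with illustrative object names:

```sql
-- Checked into SourceSafe; guards against running twice by checking
-- the system catalog before applying the change.
IF NOT EXISTS (SELECT * FROM dbo.syscolumns
               WHERE  id = OBJECT_ID('dbo.Customers')
                 AND  name = 'FaxNumber')
    ALTER TABLE dbo.Customers ADD FaxNumber varchar(20) NULL
GO
```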

Backing up
Make sure the database is backed up nightly or weekly. For nonproduction databases, set the recovery model to Simple and use the maintenance wizard to back up each night. This keeps the transaction log small and conserves hard drive space. Many times, a client has called me in a panic because they receive a Log-File Full error and can't understand why their hard drive is full, or because they can't restore the database to another machine with a smaller hard drive. In every case, it is because the database was set to the Bulk-Logged recovery model and the transaction log had never been backed up or truncated.

If the hard drive is full: back up the database; then, in the Shrink Database dialog, select the Files… option, choose the transaction log (.ldf) file, and shrink it. I've shrunk a 5-GB log file down to 22 MB and then created a maintenance plan to back up nightly and truncate the log.
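The same recovery can be scripted in T-SQL on SQL Server 2000; the database name and logical log file name below are illustrative:

```sql
-- Free the inactive space inside the log without backing it up
-- (SQL Server 2000 syntax; breaks the log backup chain, so take a
-- full backup afterward).
BACKUP LOG Northwind WITH TRUNCATE_ONLY

-- Shrink the physical .ldf file to roughly 25 MB. The first argument
-- is the logical file name shown by sp_helpfile.
DBCC SHRINKFILE (Northwind_Log, 25)
```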

For production databases, the same principle applies, but use the Full recovery model and create a separate maintenance plan that backs up and truncates the log every couple of hours. I've been fortunate to have needed to conduct a restore only twice in the last three years.
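The scheduled jobs boil down to two statements; database name and backup paths here are illustrative:

```sql
-- Nightly full backup.
BACKUP DATABASE Northwind
    TO DISK = 'E:\Backup\Northwind_full.bak'

-- Log backup every couple of hours; each one also truncates the
-- inactive portion of the log, keeping the .ldf file in check.
BACKUP LOG Northwind
    TO DISK = 'E:\Backup\Northwind_log.trn'
```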

Authentication security
The best practice for security is to use Windows Authentication. It takes more time to maintain, but as more people and developers use the server, handing out a single shared login becomes dangerous: anyone can log in, and you won't know who did what. If you use mixed SQL/Windows security, guard the sa password and make sure it isn't blank. Even on development servers, a blank sa password is bad practice.

Other tips and recommendations
Keep the log files and data files on different drives. The drive holding the log file shouldn't be RAID 5; use a mirrored (RAID 1) drive instead.

Increase the size of the tempdb database if you're conducting many imports and exports; SQL Server uses tempdb to hold temporary data and tables. If you let tempdb grow automatically, set the growth increment to at least 10 percent. If the server slows down at any time, consider enlarging tempdb.
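Pre-sizing tempdb and setting percentage growth can be scripted; tempdev is SQL Server's default logical name for the data file, and the sizes below are illustrative:

```sql
-- Give tempdb room up front and grow it by 10 percent when needed,
-- instead of many small autogrow events during heavy import/export work.
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 500MB, FILEGROWTH = 10%)
```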

For very common tables that don't change often (such as a product line/category table), pin the table in memory (see DBCC PINTABLE in Books Online). And do use Books Online; I'm always impressed by how much is actually there.
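On SQL Server 2000, pinning is done with DBCC PINTABLE; the lookup table name below is illustrative:

```sql
-- Pin a small, stable lookup table into the buffer cache so its pages
-- are not aged out. Use sparingly: pinned pages reduce memory available
-- to everything else.
DECLARE @dbid int, @objid int
SELECT  @dbid  = DB_ID(),
        @objid = OBJECT_ID('dbo.ProductCategory')
DBCC PINTABLE (@dbid, @objid)
```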

Run Update Statistics and reorganize indexes regularly so SQL Server's optimizer has accurate information for choosing query plans.
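Both maintenance steps are one-liners per table; the table name is illustrative:

```sql
-- Refresh distribution statistics so the optimizer's cost estimates
-- reflect the current data.
UPDATE STATISTICS dbo.Orders

-- Rebuild all indexes on the table, removing fragmentation
-- (SQL Server 2000 syntax).
DBCC DBREINDEX ('dbo.Orders')
```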

These tips should help out in the long run
Applying the practices I’ve described to a basic, out-of-the-box database should eliminate the potential for some serious problems. And although utilizing these SQL Server best practices takes more time at the beginning of a project or business venture, in the end, you have much better control over the environment and project.