Tuesday, July 31, 2007

Thinking Outside the Box - Help Writing Faster Queries - Part 3 of 3

Hello and welcome to our blog!

This is part three of our series to help you write faster queries.

In part one you learned about the power of indexes, when an index should be used and how to create an index.

In part two you learned about the performance hits associated with using function calls in your Where clauses.

Now, in part three, let’s think outside the box a bit. I know I said I was going to give tips on how to make your query faster… and I am. However, this may be a bit off the beaten path. But doesn’t that make for a great tip? Something you haven’t already heard a hundred times?


Help Writing Faster SQL Queries
Part 3 : Thinking Outside the Box!


Here’s the backdrop: The app is in the can, working great, and the rollout begins. Within a few days, you start hearing murmurs that parts of the app are slow. These murmurings soon become help tickets. These help tickets soon become meetings with the boss and the users. These meetings often turn into alcoholism... You get the point.

This unfortunate situation arises all too frequently, and there is a relatively simple solution to it.

You see, everything always performs well in your development environment because you have relatively few records in your development database! Sure, if I only had 30 records in my main table, my application would work wonderfully too! Once it gets into the real world, and a few hundred thousand or a few million records are input or imported, the need for performance tuning becomes critically apparent.

The solution is simple: create the same number of records in your development environment as you’d expect to have in the first year of operation. If you have importable data (say, from legacy systems or third party feeds) then pump it in. If you do not, there are many open source utilities for creating mock data that closely conforms to your actual data. If you are the creative type, you may write your own code to populate the tables with "dummy data". Regardless of how you fill the tables, the point is that now, as you're first writing your queries, you'll have a really good idea whether you need to concentrate on a particular query to avoid future headaches and counseling bills.
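If you go the "write your own code" route, here is a minimal T-SQL sketch, assuming the Person_Data table used elsewhere in this series (with an identity key and Name_First, Name_Last and Next_Contact_Date columns). Adjust the column list and the row count to match your own schema and expected volume:

-- Load Person_Data with 100,000 rows of dummy data
Declare @i Int
Set @i = 0
While @i < 100000
Begin
    Insert Into Person_Data (Name_First, Name_Last, Next_Contact_Date)
    Values ('First' + Cast(@i As VarChar(10)),
            'Last' + Cast(@i As VarChar(10)),
            DateAdd(day, @i % 365, GetDate()))
    Set @i = @i + 1
End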

Another thing you may wish to do is model the effect of concurrent users in your database system. One easy way to do this is to create a script that simply performs searches on a table (or across multiple tables joined together). In addition, you can create another script to insert records in a table and a third script to update records.

Schedule these scripts to run every minute throughout the day and night. Doing this will result in 60 "dummy records" being inserted each hour (1,440 records added each day). Of course, you should tweak the interval based on the number of transactions you expect your system to handle "in the real world". You may want to add one record per hour, or you may want to add 100 records every minute.
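As a rough sketch of what those scripts might look like (the names and values below are placeholders for illustration, and you would schedule them with SQL Server Agent or whatever scheduler you already use):

-- Insert script: adds one dummy contact per run
Insert Into Person_Data (Name_First, Name_Last, Next_Contact_Date)
Values ('Load', 'Test', GetDate())

-- Search script: simulates a user lookup
Select Person_Data_ID, Name_First, Name_Last
From Person_Data
Where Name_Last = 'Test'

-- Update script: touches the dummy records
Update Person_Data
Set Next_Contact_Date = DateAdd(day, 7, GetDate())
Where Name_Last = 'Test'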

Now, when developing your database storage and retrieval systems, you will not be "alone in the box". These automated processes will simulate some of the load you can expect to deal with in your production environment when users are searching for information, entering information and updating information. This can go a long way toward helping you identify areas vulnerable to contention.

This concludes our series providing help writing faster SQL queries.

However, this certainly is not the end of the information we intend to share!

Please bookmark this blog now so you can easily come back tomorrow and learn more great SQL programming tips and tricks.

The SQL Coach Team

Sunday, July 29, 2007

Function Calls are Evil Incarnate - Help Writing Faster SQL Queries - Part 2 of 3

Hello again and welcome to our blog.

This is part two of our three part series to help you write faster queries.

In part one you learned about the power of indexes, when an index should be used and how to create an index.

Let's get on to part two, shall we?


Help Writing Faster SQL Queries
Part 2 : Function Calls are Evil Incarnate!


The following queries look extremely innocent. However, they are all application killers once the table grows to more than a couple thousand rows:

Select Person_Data_ID, Name_First, Name_Last
From Person_Data
Where Next_Contact_Date > GetDate()

Select Count(*)
From Person_Data
Where Upper(Name_Last) = 'SMITH'

Select Avg(Collar_Size)
From Person_Data
Where dbo.MyFunc(Person_Data_ID) > dbo.MyOtherFunc(22)

For each of these statements, a table scan will be used to fetch the data.

A table scan is simply an operation where the database looks at each and every record in the table to see which ones meet the criteria in the Where clause.

We would much rather the database use an index and fetch only the 20 records it needs than read 200,000 records in a table scan to locate those same 20 records. However, the use of functions (GetDate(), Upper(), MyFunc(), MyOtherFunc(), etc.) keeps it from doing that.

Why do functions keep the database from using indexes? It’s simply because the database does not know the result of the function.

Each time GetDate() is called, a different answer could come back. Each time MyFunc() is called, it could come back with a different result, since the function could potentially be based on the input value, the time of day, or data being modified elsewhere in the database. The only way the database can know the answer to the function call for certain is to make the call and see what comes back. Since it doesn’t know the answer ahead of time, it cannot use the index to go to the corresponding entry.

So, whenever possible, avoid the use of functions in your Where clauses. Great solutions to these problems are, in order:

Select Person_Data_ID, Name_First, Name_Last
From Person_Data
Where Next_Contact_Date > GetDate()

Becomes

Declare @DT DateTime
Set @DT = GetDate()
Select Person_Data_ID, Name_First, Name_Last
From Person_Data
Where Next_Contact_Date > @DT

Voila, the database optimizer gets the value of GetDate() ahead of time. Since the value of @DT will not change while the query is running, it can use an appropriate index.

Select Count(*)
From Person_Data
Where Upper(Name_Last) = 'SMITH'

Becomes

Select Count(*)
From Person_Data
Where Name_Last = 'SMITH'

“Whoa?” I hear you cry. All you did was take the Upper() function out. What about case sensitivity?!? Actually, I’m cheating a bit here. With Microsoft SQL Server, the default index collations are case insensitive. In plain terms, that means the searches are case insensitive by default. With Oracle and other database engines, you can choose to make your indexes case insensitive, so why not? For a simple name search, it’s a good idea. And, it avoids the dreaded Table Scan.
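If you want to confirm what collation you are actually working with before dropping Upper(), here is a quick way to check in SQL Server (a collation name containing "CI" is case insensitive):

-- Server-wide default collation
Select ServerProperty('Collation')

-- Column-level collations appear in the output of sp_help
Exec sp_help 'Person_Data'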

The third example is a bit tougher. Essentially, if your function is extremely complex, you may have no choice but to use it and let the user know to expect poor performance (a "please wait" screen goes a long way here, guys). However, if the function merely wraps a simple calculation (e.g., Unit_Price * Qty), then simply do the math in the statement and avoid the function.
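To make that concrete, here is a hypothetical before-and-after. The dbo.LineTotal function and the Order_Line table are made-up names for illustration only:

-- Before: the math is hidden inside a function call
Select Order_Line_ID
From Order_Line
Where dbo.LineTotal(Unit_Price, Qty) > 1000

-- After: do the math in the statement itself
Select Order_Line_ID
From Order_Line
Where Unit_Price * Qty > 1000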

Please note: many developers have it beaten into them to write reusable code so changes can be made in one central place. The fact of the matter is that a SQL statement is not the same as a C program!! You should adjust accordingly. One of those adjustments is to give up reusable code in this context in favor of better performance.

We hope you have enjoyed the information presented in Part 2 of our series. Were you paying attention? Let's do a quick test and find out.

Is the following statement true or false?

You should always strive to use function calls in your Where clause to maximize code reuse and eliminate redundant code from your database.

We really hope you said the answer to the above question is false!

Part three will be ready in a day or so. We recommend that you bookmark this blog now so you can easily come back tomorrow and learn more great SQL Server tuning practices.

The SQL Coach Team

Friday, July 27, 2007

SQL Server Performance Tuning Video Lesson #1 Now Available!

Just letting you know the first video is now available.

This video shows you how to add an index to a table.

Watch as I add the index IX_PersonData_NameLastNameFirst on the Person_Data table!! When you see how easy this is you will wonder why on earth you have not done this before.

You can watch this video on YouTube.

Just click on the following link: Learn How to Add an Index to a Table!

I did not realize the video would be shown so small at YouTube. Don't worry though, lesson learned.

Future videos will be zoomed in closer to the action. If enough people request it I will rebuild this video so you can better see what is happening.


The SQL Coach Team

Thursday, July 26, 2007

The Power of Database Indexes - Help Writing Faster Queries - Part 1 of 3

Good evening! Well, it's evening here in Missouri at the time I am writing this. ;)

Before I go any further, let me say we will be making some videos to show you how to do this stuff. So, if you are one of those people who learn better and faster by watching rather than reading, check back soon. The video lesson for Part 1 will be available within a day or two at most. I actually plan on recording this first video tomorrow after work.

Okay, now are you ready to pick up some quick help writing faster SQL queries? I hope so, because we are now just moments away from Part 1 of this little series.

Let's go!


Help Writing Faster SQL Queries
Part 1 : The Power of Database Indexes

Many developers ignore indexes altogether. Many do not understand them.

An index is merely a Table of Contents to the data in the database table. While developing an application, print out the schema of the database, including indexes, so you know what they are while writing your queries.

If an appropriate index is missing, have someone experienced with indexes create it (whether that be you, the developer on the other side of the wall, or a DBA).
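By the way, if you'd rather query the server for the existing indexes than work from a printout, SQL Server ships with a handy system procedure you can run in a query window:

-- List the indexes (and the columns they cover) on a table
Exec sp_helpindex 'TABLE_NAME_HERE'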

Here’s the impact. Say you are writing a contact management system. You would probably have a table with first name, last name, etc. For simplicity, we’ll keep this all one big table and obvious field names.

Want an example? Okay, you’d probably write a query that looks something like this:

Select Person_ID, Name_First, Name_Last
From Person_Data
Where Name_Last = @NameLast
And Name_First = @NameFirst

With 100 rows (i.e., records), this query will return instantly. However, as your database grows to several thousand rows, this query will slow down, eventually hitting the seconds range. If you merely add an index with the columns Name_Last and Name_First, you will have a basically instant return time for the life of the application.


I bet indexes don't seem so complicated now, do they?

Just remember to look for the columns (Name_Last and Name_First in our example above) that appear in the Where clause. These are the columns that need an index.


Creating the index

There are different ways to create the actual index. If you are more comfortable with GUIs, then use Enterprise Manager or the appropriate tool for your database.

If you are comfortable working in the query window you can use the command format to build your index.

create index INDEX_NAME_HERE on TABLE_NAME_HERE (column_name, column_name, ...)

This is how you create the index needed for the query above:

create index IX_PersonData_NameLastNameFirst on Person_Data (Name_Last, Name_First)

Hope you have enjoyed this first tip and now have a better understanding of indexes.

If you have any questions please post them on this blog!

Just use the link immediately below this post (the one that says comments).


Happy indexing!

The SQL Coach Team

Wednesday, July 25, 2007

New Three Part Series on Writing Faster Queries Starts Soon!

Be sure to bookmark this page and come back tomorrow when we begin our Three-Part Series on Writing Faster SQL Queries.

We'll cover three things you can do immediately to get your queries out of the database faster.

Whether you are writing batch processes, a database-driven website or a database-centric desktop application you can put these techniques into use.

Each will be explained in simple ordinary language and any "Average Joe or Jane" programmer can put these into practice.

How cool is that? =)


Cya tomorrow!

The SQL Coach Team

Monday, July 23, 2007

SQL Coach Welcomes You...

Hello and welcome to the SQL Coach blog.

This blog has been created by two professional SQL Developers. Our mission is to create a fantastic online resource filled with advanced SQL Server information.

Here is a list describing only a portion of the things you will learn in the coming weeks:

  • Discover a very special, often overlooked, Temporary Table that can be used to greatly accelerate your SQL queries.

  • Determine which columns and tables to index, when to add indexes... and when not to.

  • What is the "Intra-query parallelism caused your server command to deadlock" error... and how you can protect yourself against it!

  • One of the most common causes of "latch wait" ...and how to greatly reduce latch wait.

  • How a simple trick of rearranging the order of criteria in a where clause can dramatically accelerate the speed of a select statement.

  • ... and much more!

Thanks for stopping by the SQL Coach Blog. We look forward to you returning soon.

This blog was just launched today so please be patient and we will fill this blog with good solid hard-to-find SQL Server information as soon as possible.

While you are waiting for the information to arrive... please help us out by spreading the word. Tell all your co-workers and friends about our new blog, the SQL Coach!

Best regards,

B. Vegiard
G. Benjamin