Parsing and compiling the query, and possibly reading from disk. That is why it runs quickly the second time, and why it takes time to compile again when the parameters change.
Data cache, aka the buffer pool. Generally speaking, you can't go wrong with more RAM for SQL Server.
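As a rough illustration (just a sketch, assuming SQL Server 2005 or later where these DMVs exist), you can see how much RAM the plan cache and the data cache are currently using:

```sql
-- How many compiled plans are cached and roughly how much memory they take
SELECT COUNT(*)                          AS cached_plans,
       SUM(size_in_bytes) / 1024 / 1024  AS plan_cache_mb
FROM   sys.dm_exec_cached_plans;

-- How much of the buffer pool (data cache) each database is using
SELECT DB_NAME(database_id)  AS database_name,
       COUNT(*) * 8 / 1024   AS buffer_pool_mb   -- data pages are 8 KB each
FROM   sys.dm_os_buffer_descriptors
GROUP  BY database_id
ORDER  BY buffer_pool_mb DESC;
```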
Query 1 and 2 are the same, query 3 is different. I would suggest 2 indexes to start with (spelled out as DDL below): String1, String2, DateTime2, DateTime1 INCLUDE DateTime3, and String1, String2, DateTime3 INCLUDE DateTime2, DateTime1. Other thoughts regarding datatypes: the smaller the better, of course. Do you need nvarchar? Will varchar be OK?
Ditto datetime vs smalldatetime, or (if SQL Server 2008) datetime2. Edit: Disk reads only happen once into memory (simply put), so more RAM will help. However, I suspect the 20 seconds is compile + statistics etc., not reading from disk.
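Spelled out as DDL, those two suggested indexes would look something like this (dbo.BigTable is a placeholder for your actual table name; only the column lists come from the suggestion above):

```sql
CREATE NONCLUSTERED INDEX IX_BigTable_Str1_Str2_DT2_DT1
    ON dbo.BigTable (String1, String2, DateTime2, DateTime1)
    INCLUDE (DateTime3);

CREATE NONCLUSTERED INDEX IX_BigTable_Str1_Str2_DT3
    ON dbo.BigTable (String1, String2, DateTime3)
    INCLUDE (DateTime2, DateTime1);
```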
Thanks gbn. I am selecting around 2k rows; is reading from disk what's taking the time? Are there any suggestions as to how to make it faster?
– Chi Chan Nov 2 '10 at 17:59 One more question: are there any methods to reduce the compile and statistics time? Is moving those into a stored procedure going to help? – Chi Chan Nov 2 '10 at 18:13 @Chi Chan: It can do.
You are more likely to get plan re-use than with hard coded parameters – gbn Nov 2 '10 at 18:15 This is a good answer. I would like to add: You will not get plan re-use in plain ol' SQL Server if the query text is different, even by a comment. SELECT ... WHERE String1 = 'Blah' won't use the cached plan for SELECT ... WHERE String1 = 'Foo'.
You'd need SELECT ... WHERE String1 = @String1. – Matt Nov 2 '10 at 22:31 Also, after enough inserts/deletes/etc., SQL Server will decide your statistics aren't accurate and will recompile them on your next query. So, if you are always doing a SELECT after a huge INSERT, you may always have to pay the compile penalty, unfortunately. – Matt Nov 2 '10 at 22:32
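To make that concrete, here is a sketch of the parameterised form using sp_executesql (table and column names are placeholders based on the discussion above); because the query text stays identical between calls, the cached plan can be reused:

```sql
EXEC sp_executesql
     N'SELECT * FROM dbo.BigTable WHERE String1 = @String1',  -- same text every call
     N'@String1 nvarchar(50)',                                -- parameter definition
     @String1 = N'Blah';                                      -- only the value changes
```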
I'd recommend familiarizing yourself with SQL Server's Database Tuning Advisor (DTA) and Profiler. Here is a good article: zimbio.com/SQL/articles/655/How+Tune+Dat... DTA won't always give you perfect recommendations, but it's usually a good start. If you use Profiler, you can monitor your database for a day and then pass that trace to the DTA.
There are also other things to take into account, such as how often specific queries will be run and how often the tables will have records inserted or updated. Keep in mind that having indexes will slow your inserts and updates. Another important thing is to make sure you are starting from the same baseline with every query you run; I suspect you are seeing the results of 1 and 2 in your question because you have cached data.
Here is a good link that goes over how to test SQL scripts.
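On a test box (never on production), a common way to reset to a cold baseline between runs is:

```sql
CHECKPOINT;             -- flush dirty pages so clean buffers can be dropped
DBCC DROPCLEANBUFFERS;  -- empty the data cache (buffer pool)
DBCC FREEPROCCACHE;     -- empty the plan cache, forcing recompiles
```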
The problem with query 1 is that it has two independent range conditions. There was a very similar question just recently; I explained that particular issue there: stackoverflow.com/questions/4059920/how-... Regarding query 2: if you use the following index: String1, String2, Date2, Date1, it can serve query 2 quite well, and it doesn't change query 1 (unless the condition on Date1 is much more selective than the one on Date2).
Query 3 might need an additional index: String1, String2, Date3. However, I don't like two indexes with the same prefix; I would probably turn String1 and String2 around, just in case some query has String2 only (see the sketch below). Want a better understanding of all this indexing stuff?
Have a look at my free eBook "Use The Index, Luke".
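For reference, a rough sketch of the two indexes suggested in this answer, with the String1/String2 prefix turned around as described (the table name is again a placeholder):

```sql
CREATE NONCLUSTERED INDEX IX_BigTable_Str2_Str1_Date2_Date1
    ON dbo.BigTable (String2, String1, Date2, Date1);

CREATE NONCLUSTERED INDEX IX_BigTable_Str2_Str1_Date3
    ON dbo.BigTable (String2, String1, Date3);
```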
I think you are doing just fine, but I would also look at the data selectivity of your query. How many records will be returned by your typical query out of 100,000,000? If your result set is small (3~5% of the entire table), indexing is a good solution.
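For example, the roughly 2k rows mentioned in the comments above would be about 0.002% of a 100,000,000-row table, far below that threshold, so a suitable index should pay off.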
What is SQL Server doing with my RAM? Do I need more RAM because I have a big table? -> The DBMS moves data (block by block) from physical storage into your RAM to execute your query.
Your query also needs to be parsed, and that consumes some memory as well.
More RAM will help, depending on the operating system SQL Server is installed on, the version of SQL Server, and how much RAM you already have; they don't all have the same limits on RAM. For future reference, use Analyze Query in Database Tuning Advisor in SQL Server Management Studio (SSMS).
You may not need it now, but it could help when the tables get more complicated. Do you have a primary key on this table? Just curious, you never mentioned it.
The Id is the primary key; it is, however, a random int assigned using hi-lo with NHibernate. It's safe to say that I will never use the key to select/update/delete anything.
– Chi Chan Nov 2 '10 at 18:09.