Data Reader
About the fastest access you will get to SQL is with the SqlDataReader.

Profile it
It's worth actually profiling where your performance issue is. Usually, the place you think the issue is turns out to be wrong once you've profiled it.
For example, it could be:
- the time the query takes to run
- the time the data takes to copy across the network/process boundary
- the time .NET takes to load the data into memory
- the time your code takes to do something with it
Profiling each of these in isolation will give you a better idea of where your bottleneck is. For profiling your code, there is a great article from Microsoft.

Cache it
The thing to look at to improve performance is to work out whether you need to load all that data every time.
Can the list (or part of it) be cached? Take a look at the new System.Runtime.Caching namespace.
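A minimal sketch of that caching idea, assuming the list can tolerate being a few minutes stale; the cache key, the five-minute expiry, and the LoadListFromDatabase helper are illustrative, not from the question:

```csharp
using System;
using System.Runtime.Caching;

static class ListCache
{
    // Hypothetical loader standing in for the existing SqlDataReader code.
    static string[] LoadListFromDatabase() { /* ... */ return new string[0]; }

    public static string[] GetList()
    {
        var cache = MemoryCache.Default;
        var items = cache.Get("ItemList") as string[];
        if (items == null)
        {
            items = LoadListFromDatabase();   // the expensive call
            cache.Set("ItemList", items, new CacheItemPolicy
            {
                AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5)
            });
        }
        return items;   // served from memory until the entry expires
    }
}
```

MemoryCache.Default is process-wide, so every caller within the same app domain shares the one loaded copy.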
Rewrite as T-SQL
If you are doing purely data operations (as your question suggests), you could rewrite the code that uses the data as T-SQL and run it natively on SQL Server. This has the potential to be much faster, as you will be working with the data directly rather than shifting it about. If your code has a lot of necessary procedural logic, you could try mixing T-SQL with CLR integration, giving you the benefits of both worlds. How far you can take this very much comes down to the complexity (or more procedural nature) of your logic.
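As a minimal illustration of the idea (table and column names are made up, and an open SqlConnection `conn` is assumed): rather than streaming every row into .NET and counting matches there, let T-SQL do the set-based work so only one value crosses the process boundary:

```csharp
// Set-based work stays inside SQL Server; only the final count travels to .NET.
const string sql = "SELECT COUNT(*) FROM dbo.Items WHERE Name LIKE @pattern";
using (var cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.AddWithValue("@pattern", "A%");
    int matches = (int)cmd.ExecuteScalar();
}
```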
If all else fails
If all areas are optimal (or as near as makes no difference) and your design is without fault, I wouldn't even get into micro-optimisation; I'd just throw hardware at it. What hardware?
Try the Reliability and Performance Monitor to find out where the bottleneck is. The most likely candidates for the problem you describe are the HDD or RAM.
I have tested something: SqlDataReader is obviously faster than DataSet ;) Yes, loading time is hurting performance the most. – watbywbarif Sep 16 '10 at 12:06 And I am not sending to a client; the .dll is used on the same machine as the server for some internal usage.
– watbywbarif Sep 16 '10 at 12:12 Edited to match your updated question. – badbod99 Sep 16 '10 at 16:44 +1 for "rewrite as T-SQL". The ideal query is one that only retrieves absolutely necessary data.
If you're retrieving 100k rows to the client app and then processing there, perhaps you should reconsider your logic. – BradC Sep 16 '10 at 17:09 A well-deserved +1. – Steven Sep 18 '10 at 18:18
If SqlDataReader isn't fast enough, perhaps you should store your stuff somewhere else, such as an (in-memory) cache.
– tomfanning Sep 16 '10 at 11:28 +1 - a caching layer or in memory database is the way to go. – Winston Smith Sep 16 '10 at 11:31.
No. It is actually not only the fastest way, it is the ONLY way. All other mechanisms internally use a DataReader anyway.
I suspect that SqlDataReader is about as good as you're going to get.
Ha! Would either of the downvoters care to elaborate on what's wrong with this answer? – LukeH Sep 16 '10 at 13:40.
SqlDataReader is the fastest way. Make sure you use the get-by-ordinal methods rather than get-by-column-name, e.g. GetString(1). It is also worthwhile to experiment with MinPoolSize in the connection string so that there are always some connections in the pool.
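For illustration (the table and column names are made up, and an open SqlConnection `conn` is assumed), the name lookup can be hoisted out of the loop with GetOrdinal, leaving only typed ordinal access per row:

```csharp
using (var cmd = new SqlCommand("SELECT Name FROM dbo.Items", conn))
using (SqlDataReader reader = cmd.ExecuteReader())
{
    int nameOrdinal = reader.GetOrdinal("Name"); // resolve the name once
    while (reader.Read())
    {
        // Typed ordinal access skips the per-row name-to-index lookup
        // that reader["Name"] would perform.
        string name = reader.GetString(nameOrdinal);
    }
}
```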
– watbywbarif Sep 16 '10 at 12:13 In .NET, DB connections are returned to a connection pool after being closed, and the underlying SQL Server connection is eventually closed after a period of inactivity. This generates the login and logout events.
In certain scenarios (infrequent web service calls) it may be beneficial to always have some ready connections in the pool to handle the first request quickly, rather than having to open a new connection to the SQL server. – Pratik Sep 16 '10 at 21:28
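For reference, the pool floor is set in the connection string; the server, database, and the value 5 here are placeholders:

```csharp
// "Min Pool Size" keeps at least 5 connections open even when idle,
// so the first request after a quiet period does not pay the login cost.
const string connectionString =
    "Data Source=myServer;Initial Catalog=myDb;" +
    "Integrated Security=true;Min Pool Size=5";
using (var conn = new SqlConnection(connectionString))
{
    conn.Open(); // served from the warm pool when one is available
}
```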
SqlDataReader has an optimization for reading a single row (the System.Data.CommandBehavior.SingleRow argument of ExecuteReader), so maybe it can improve the speed a bit. I see several advantages: the single-row improvement; no need to access an array on each iteration (reader[0]); and cloning an array (reader) to another one may be faster than looping through elements and adding each one to a new array.
On the other hand, it has a disadvantage to force SQL database to do more work.
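A sketch of that single-row path (identifiers are hypothetical, and an open SqlConnection `conn` is assumed; CommandBehavior lives in System.Data):

```csharp
using (var cmd = new SqlCommand(
    "SELECT Name FROM dbo.Items WHERE Id = @id", conn))
{
    cmd.Parameters.AddWithValue("@id", 42);
    // SingleRow tells the provider to expect, and optimize for, one row.
    using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SingleRow))
    {
        if (reader.Read())
        {
            string name = reader.GetString(0);
        }
    }
}
```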
It sounds strange, but as this .dll is used on the server, it seems that I get data faster by SqlDataReader than by building one row in SQL. – watbywbarif Sep 16 '10 at 12:10
The SqlDataReader will be the fastest way. Optimize the use of it by using the appropriate GetXxx method, which takes an ordinal as a parameter. If it is not fast enough, see if you can tweak your query.
Put a covering index on the column(s) that you want to retrieve. By doing so, SQL Server only has to read the index and does not have to go to the table directly to retrieve all the info that is required.
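A sketch of the DDL, with made-up names; for a single-column SELECT, a nonclustered index on that column alone already covers the query:

```csharp
// One-off DDL; normally run from SSMS rather than application code.
// With this index in place, "SELECT Name FROM dbo.Items" can be
// answered by scanning the (narrower) index instead of the base table.
const string ddl =
    "CREATE NONCLUSTERED INDEX IX_Items_Name ON dbo.Items (Name)";
using (var cmd = new SqlCommand(ddl, conn))
{
    cmd.ExecuteNonQuery();
}
```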
The query is only a one-column select; there is no place for optimization there, only to redesign the database ;( – watbywbarif Sep 16 '10 at 12:08 @watbywbarif an index will still help even on a single column select – msarchet Sep 16 '10 at 13:27 I checked, it is already indexed. – watbywbarif Sep 17 '10 at 8:59 Have you created an index that ONLY contains the single column you are selecting on? – Ian Ringrose Sep 17 '10 at 10:25 Yes, I have had this from the start – watbywbarif Sep 30 '10 at 12:55
You have 4 sets of overheads: disk access, .NET code (CPU), SQL Server code (CPU), and the time to switch between managed and unmanaged code (CPU). Firstly, is select * where column = 'junk' fast enough for you? If not, the only solution is to make the disk faster. (You can get data from SQL Server faster than it can read it.) You may be able to define a SQL Server function in C# and then run the function over the column; sorry, I don't know how to do it. This may be faster than a data reader.
If you have more than one CPU, and you know a value in the middle of the table, you could try using more than one thread. You may be able to write some T-SQL that combines all the strings into a single string using a separator you know is safe, then split the string up again in C#.
This will reduce the number of round trips between managed and unmanaged code.
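A sketch of that combine-then-split idea using the classic FOR XML PATH concatenation (pre-2017 SQL Server); the table name, column name, and ',' separator are assumptions, and the separator must never occur in the data. Note that FOR XML entitizes &, < and >, so this only suits plain values:

```csharp
// One large value crosses the boundary instead of 100k individual rows.
const string sql =
    "SELECT STUFF((SELECT ',' + Name FROM dbo.Items " +
    "FOR XML PATH('')), 1, 1, '')";
using (var cmd = new SqlCommand(sql, conn))
{
    string combined = (string)cmd.ExecuteScalar(); // one round trip
    string[] values = combined.Split(',');         // rebuilt client-side
}
```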
– watbywbarif Sep 17 '10 at 8:56 Each thread could use its own SqlDataReader, provided you have something you can use in the WHERE clause to partition the data between the threads. – Ian Ringrose Sep 17 '10 at 10:24 Nice idea, maybe it can help. +1 – watbywbarif Sep 30 '10 at 13:00
Some surface-level things to consider that may affect speed (besides a data reader):
Database query optimization: OrderBy is expensive, Distinct is expensive, RowCount is expensive, GroupBy is expensive, etc. Sometimes you can't live without these things, but if you can handle some of them in your C# code instead, it may be faster.
Database table indexing (for starters, are the fields in your WHERE clause indexed?)
Database table data types (are you using the smallest possible, given the data?)
Why are you converting the datareader to an array?
E.g., would it serve just as well to create an adapter/DataTable that you then would not need to convert to an array? Have you looked into Entity Framework? (It might be slower... but if you're out of options, it might be worthwhile to look into just to make sure.) Just random thoughts. Not sure what might help in your situation.
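As a small example of moving one of those expensive operations client-side (the loader here is hypothetical): dropping ORDER BY from the query and sorting the already-loaded array in C# shifts that CPU cost off the database server, which may or may not be a net win, so it is worth measuring both ways:

```csharp
string[] names = LoadNamesWithoutOrderBy(); // hypothetical unordered loader
Array.Sort(names, StringComparer.Ordinal);  // sort in the client instead
```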
"Provides a way of reading a forward-only stream of rows from a SQL Server database" is the description of SqlDataReader from MSDN. The data structure behind SqlDataReader only allows reading forward; it's optimized for reading data in one direction. In my opinion, I would rather use SqlDataReader than DataSet for simple data reading.
If responsiveness is an issue when loading a great deal of data, look at using the asynchronous methods, e.g. BeginExecuteReader. I use this all the time for populating large GUI elements in the background while the app continues to be responsive. You haven't said exactly how large this data is, or why you are loading it all into an array.
Often, for large amounts of data, you may want to leave it in the database or let the database do the heavy lifting. But we'd need to know what kind of processing you are doing that needs it all in an array at one time.
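A sketch of the callback pattern (identifiers are illustrative, and on .NET 2.0-4.0 the connection string also needs "Asynchronous Processing=true"):

```csharp
var cmd = new SqlCommand("SELECT Name FROM dbo.Items", conn);
cmd.BeginExecuteReader(ar =>
{
    var command = (SqlCommand)ar.AsyncState;
    using (SqlDataReader reader = command.EndExecuteReader(ar))
    {
        while (reader.Read())
        {
            // Marshal results back to the UI thread (e.g. Control.Invoke)
            // rather than touching GUI elements from this callback.
        }
    }
}, cmd);
```

The query runs in the background while the UI thread stays free to repaint and respond to input.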
Responsiveness is not problem. – watbywbarif Sep 17 '10 at 8:58.