and the IO stats were:

--Table 'ExternalLinks'. Scan count 2, logical reads 4804, physical reads 0, read-ahead reads 0.

The query was doing 2 scans and 4804 logical reads. When I tried to execute the query in Query Analyzer without sp_executesql:

SELECT Max(Id) FROM ExternalLinks WHERE ExtHandle='AT51988635-052'

I got:

--Table 'ExternalLinks'. Scan count 1, logical reads 3, physical reads 0, read-ahead reads 0.

After investigating, I figured out that the client was using sp_executesql and passing the parameter as Nvarchar as opposed to Varchar (the column's data type is varchar). This forced the query engine to convert the data before comparing. The difference may not be noticeable on a small table, but it makes a big impact on large tables. The execution plan for the query that used sp_executesql is shown below; it uses a clustered index scan rather than a seek, because the engine has to convert every value in the column before it can compare them.
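To illustrate the mismatch, here is a sketch of the kind of call the client was issuing. The column name and parameter length are assumptions for illustration; the key point is the nvarchar declaration against a varchar column:

```sql
-- Hypothetical reconstruction of the client's call: the @p0 parameter is
-- declared as nvarchar, but the ExtHandle column is varchar.
EXEC sp_executesql
    N'SELECT MAX(Id) FROM ExternalLinks WHERE ExtHandle = @p0',
    N'@p0 nvarchar(50)',
    @p0 = N'AT51988635-052';
-- varchar has lower data type precedence than nvarchar, so SQL Server
-- converts every ExtHandle value to nvarchar before comparing, which
-- defeats the index and produces the clustered index scan above.
```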
The Argument column of the plan shows the CONVERT function applied to the column.
The second query (without sp_executesql) uses an index seek, which is much more efficient.
Just to clarify, this is not a shortcoming of sp_executesql; it is rather the client that decided to use Nvarchar for the parameter instead of varchar. I get the same results if I use the correct data type with sp_executesql.
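For comparison, the same call with the parameter declared to match the column's type (length assumed for illustration) gets the cheap plan:

```sql
-- Same query, but @p0 now matches the column's varchar type,
-- so no implicit conversion is needed and an index seek is used.
EXEC sp_executesql
    N'SELECT MAX(Id) FROM ExternalLinks WHERE ExtHandle = @p0',
    N'@p0 varchar(50)',
    @p0 = 'AT51988635-052';
```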
** If you are using an NHibernate mapping file, you can use type="AnsiString" to force NHibernate to use the varchar data type.
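In an hbm.xml mapping file that would look something like the following (property name and length are illustrative assumptions):

```xml
<!-- Illustrative NHibernate mapping fragment: type="AnsiString" tells
     NHibernate to send the parameter as varchar rather than the default
     nvarchar, so the predicate matches the column's type. -->
<property name="ExtHandle" type="AnsiString" length="50" />
```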
1 comment:
I had a query that would take 1 second in mgmt studio but 8 minutes from code using sql parameters. This article pinpointed the problem for me - I had varchars in my table definition, but by default the sql parameters gave strings the type nvarchar. Manually setting the data type in the sql parameters did the trick. Thanks so much!
Philip