I’ve seen the implicit conversion issue in SSMS while looking over query execution plans and I found this blog about the topic: Are VARCHAR or CHAR Hurting Your Performance Due To Implicit Conversions?
The developer’s mistake the author points out is using the convenient method AddWithValue without explicitly specifying the data type:
```csharp
// Use AddWithValue to assign Demographics.
// SQL Server will implicitly convert strings into XML.
command.Parameters.AddWithValue("@demographics", demoXml);
```
The better (explicit) way:
```csharp
command.Parameters.Add("@ID", SqlDbType.Int);
command.Parameters["@ID"].Value = customerID;
```
Looking into the .NET namespaces for data access:
- System.Data.Common
  - The provider-agnostic version does not have AddWithValue.
- System.Data.SqlClient
  - AddWithValue is overloaded; however, it does not offer an easy-to-use overload taking parameter name, value, and data type.
The better explicit way (using one line), or an extension method; either approach gives us the convenience of one line:

```vb
' one line
command.Parameters.Add("@ID", SqlDbType.Int).Value = customerID

' extension method
<Extension()>
Public Function AddWithValue(ByVal sqlParameterCollection As SqlParameterCollection,
                             ByVal parameterName As String,
                             ByVal value As Object,
                             ByVal sqlDbType As SqlDbType) As SqlParameter
    sqlParameterCollection.Add(parameterName, sqlDbType)
    sqlParameterCollection(parameterName).Value = value
    Return sqlParameterCollection(parameterName)
End Function

' extension method usage
command.Parameters.AddWithValue("@ID", customerID, SqlDbType.Int)
```
Here is a real example that exhibits this same behavior:
```sql
SELECT *
FROM tbl_Folders
WHERE (RATABLEID = @RATABLEID)
  AND (LOC_CODE IN (SELECT Loc_Code
                    FROM vw_Browse
                    WHERE InactiveFiles = 0))
```
This SQL can be tuned, but that is not the focus of this blog:
- `SELECT *` is bad.
- IN is used where EXISTS should be.
- A JOIN would be better than the nested sub-query.
Looking at the query plan, I see the yellow bang:
```
Table 'tbl_Folders'. Scan count 1, logical reads 1149, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

SQL Server Execution Times:
   CPU time = 31 ms, elapsed time = 90 ms.
```
The issue is that the column is varchar(12), not nvarchar(12); looking at the code that executed the SQL statement, we are not explicitly setting the data type:
```vb
sqlCmd.Parameters.AddWithValue("@" & item.Field.DestinationColumnName, item.FormattedValue)
```
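One way to avoid the conversion is to pass the destination type explicitly. A minimal sketch, assuming the import code knows the destination column's type and size (SqlDbType.VarChar and length 12 here match the tbl_Folders column; in a generic import loop you would look these up per column):

```vb
' Specify the type and size explicitly so the parameter is sent as
' varchar(12) instead of an inferred nvarchar, avoiding the
' column-side implicit conversion.
sqlCmd.Parameters.Add("@" & item.Field.DestinationColumnName, SqlDbType.VarChar, 12).Value = item.FormattedValue
```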
How big of an issue is this?
When I ran the SQL with the parameter explicitly typed as varchar, we cut down on the logical reads and CPU time:

```
Table 'tbl_Folders'. Scan count 1, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

SQL Server Execution Times:
   CPU time = 16 ms, elapsed time = 149 ms.
```
And no more yellow bang:
As the author of the blog pointed out, the root cause is an index SCAN instead of a SEEK; here is the comparison:
With one execution, it's probably not a big deal; however, this query runs over 100,000 times during an import process.
Why does this occur?
From Microsoft: When an operator combines expressions of different data types, the data type with the lower precedence is first converted to the data type with the higher precedence. If the conversion isn’t a supported implicit conversion, an error is returned. For an operator combining operand expressions having the same data type, the result of the operation has that data type.
varchar is low on the totem pole, and nvarchar is right above it and takes precedence.
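On the client side, this is exactly what AddWithValue sets up: with no type specified, a .NET String is inferred as nvarchar, which outranks the varchar column. A small sketch (the parameter name and value are illustrative):

```vb
' AddWithValue infers the SqlDbType from the .NET value; a String maps
' to NVarChar, so SQL Server converts the varchar column on every row
' it compares.
Dim p As SqlParameter = sqlCmd.Parameters.AddWithValue("@LOC_CODE", "ABC")
' p.SqlDbType is now SqlDbType.NVarChar
```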
- Explicitly set the data type in the .NET SqlClient when adding parameters to the Command object to avoid these implicit conversions.
Related and additional information:
- Data type precedence (Transact-SQL) – 2019
- How expensive are column-side Implicit Conversions?
- Implicit conversion in SQL Server
- Implicit Conversion Is A SARGability Problem
- Intermittent conversion issues