SQLTeam.com | Weblogs | Forums

T-SQL 2012 validation of field values


Due to the business rules, this is the environment I have at present:
Table tblMain has fields as follows:
Column1, Column2, Column3, ..., Column10
Each column has a datatype of varchar(1000)
tblMain is populated from the source on a regular basis...
There is another table tblDataTypes which holds the datatypes, etc. of the columns in tblMain
for example the summary of tblDataTypes is as follows:
Column1 --> varchar, Length --> 6, AllowNull --> False
Column2 --> int, AllowNull --> True
Column3 --> date, AllowNull --> False

The goal is to use tblDataTypes to check the values of all the columns in tblMain and to see if the values are of the correct datatypes, etc.

What is the best way to do this check please?
Thank you
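For concreteness, a minimal sketch of the setup described above (the post only gives a summary, so the tblDataTypes column names here are assumptions):

```sql
CREATE TABLE tblMain (
    Column1 varchar(1000),
    Column2 varchar(1000),
    Column3 varchar(1000)
    -- ... through Column10, all varchar(1000)
);

-- One row per tblMain column, describing the type it should conform to.
CREATE TABLE tblDataTypes (
    ColumnName sysname NOT NULL,   -- e.g. 'Column1'
    DataType   sysname NOT NULL,   -- e.g. 'varchar', 'int', 'date'
    MaxLength  int     NULL,       -- only meaningful for varchar
    AllowNull  bit     NOT NULL    -- 0 = False, 1 = True
);
```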


While I can't figure out why you would want to do this (what are the business rules in place?), the easiest approach is probably to build a CREATE TABLE statement using the data in tblDataTypes, execute that statement with sp_executesql, then INSERT the values from tblMain into the new table. You can use a temp table for this. SQL Server will do the type checking for you.
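A sketch of that approach, assuming tblDataTypes has columns ColumnName, DataType, MaxLength, and AllowNull (those names are my assumption; adjust to the real layout). Note that the CREATE TABLE and the INSERT must run in the same dynamic batch, because a #temp table created inside sp_executesql is dropped when that batch ends:

```sql
DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- Build the column list: name, type, optional length, nullability.
SELECT @cols = STUFF((
    SELECT N', ' + QUOTENAME(ColumnName) + N' ' + DataType
         + CASE WHEN DataType = 'varchar'
                THEN N'(' + CAST(MaxLength AS nvarchar(10)) + N')'
                ELSE N'' END
         + CASE WHEN AllowNull = 0 THEN N' NOT NULL' ELSE N' NULL' END
    FROM tblDataTypes
    ORDER BY ColumnName
    FOR XML PATH('')), 1, 2, N'');

SET @sql = N'CREATE TABLE #Validate (' + @cols + N'); '
         + N'INSERT INTO #Validate SELECT * FROM tblMain;';

-- Any conversion, length, or NULL violation raises an error here.
EXEC sp_executesql @sql;
```

The SELECT * relies on the tblDataTypes rows being in the same order as the tblMain columns; with an explicit column list that dependency goes away.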


You could also use TRY_CONVERT to attempt to convert the column data into the defined data type.
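For example, using the types from the original post. TRY_CONVERT (available since SQL Server 2012) returns NULL when the conversion fails, so a non-NULL source value that comes back NULL is invalid:

```sql
-- Returns the rows that violate the rules in tblDataTypes.
SELECT *
FROM tblMain
WHERE Column1 IS NULL OR LEN(Column1) > 6          -- varchar(6), NOT NULL
   OR (Column2 IS NOT NULL
       AND TRY_CONVERT(int, Column2) IS NULL)      -- int, NULL allowed
   OR Column3 IS NULL
   OR TRY_CONVERT(date, Column3) IS NULL;          -- date, NOT NULL
```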


Use datatype "sql_variant" for Column1, Column2, etc.

Then you can use the function:
SQL_VARIANT_PROPERTY ( <column_name> , 'BaseType' ) to verify the underlying data's type.

Naturally for char/varchar cols, you would use DATALENGTH() to verify the max length.
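If the columns were switched to sql_variant, a validation query along these lines would work (again using the example types from the post; SQL_VARIANT_PROPERTY itself returns sql_variant, hence the CAST before comparing):

```sql
SELECT *
FROM tblMain
WHERE CAST(SQL_VARIANT_PROPERTY(Column2, 'BaseType') AS sysname) <> N'int'
   OR (CAST(SQL_VARIANT_PROPERTY(Column1, 'BaseType') AS sysname)
           IN (N'char', N'varchar')
       AND DATALENGTH(Column1) > 6);   -- bytes = characters for varchar
```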


SQL_Variant is a great idea, but it won't work if everything you're importing is character-based... which I think is the likely case.



What is the actual source of the data, and how is it being loaded into the tblMain table?


Excellent point. Upon further consideration, I think what you really need here is an INSTEAD OF INSERT trigger to do the data edits prior to the INSERT. You should NOT use dynamic code in the trigger (it's far too costly for performance); instead, dynamically generate a static trigger based on the current control-table values.
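A sketch of what the generated (static) trigger might look like, shortened to the three example columns. tblMain_Rejected is a hypothetical audit table for rows that fail the checks:

```sql
CREATE TRIGGER trg_tblMain_Validate ON tblMain
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Rows that pass every check go into the real table.
    INSERT INTO tblMain (Column1, Column2, Column3)
    SELECT Column1, Column2, Column3
    FROM inserted
    WHERE Column1 IS NOT NULL AND LEN(Column1) <= 6
      AND (Column2 IS NULL OR TRY_CONVERT(int, Column2) IS NOT NULL)
      AND Column3 IS NOT NULL AND TRY_CONVERT(date, Column3) IS NOT NULL;

    -- Everything else is audited instead of inserted.
    INSERT INTO tblMain_Rejected (Column1, Column2, Column3)
    SELECT Column1, Column2, Column3
    FROM inserted
    WHERE NOT (Column1 IS NOT NULL AND LEN(Column1) <= 6
          AND (Column2 IS NULL OR TRY_CONVERT(int, Column2) IS NOT NULL)
          AND Column3 IS NOT NULL AND TRY_CONVERT(date, Column3) IS NOT NULL);
END;
```

Regenerate the trigger whenever tblDataTypes changes; the trigger body itself stays static, so there is no per-row dynamic SQL cost.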


I still think my earlier suggestion is easier. Let SQL do the validation.


Only if the code that uses that table will still work after you change the underlying column definitions. I'm assuming it wouldn't; otherwise the table would already have been dynamically created with specific data types?!


My point was build a dynamic table but only for the purposes of validation. It would be rebuilt each time you needed to validate new rows, so if tblDataTypes changes, the next validation run would use the new types.

Maybe I'm understanding the OP incorrectly, but I thought that validation was the only goal.


Perhaps I misunderstand as well. I thought the data needed to be allowed into the db even if it didn't match the type, but that an audit of what data didn't match was needed. Obviously, if the column has a real SQL data type, a row with data of an incorrect type can't be inserted at all, rather than being available for a post-insert audit.


Apparently, the OP has left the building. :sleeping: