Recently at work I have been working with database tables that were denormalized to “enhance” their performance. As part of the denormalization process, table columns were converted from the int data type to varchar. This change led me to wonder whether there really is a performance gain from using varchar over int.
After doing some research on the subject, it turns out that int columns win on both performance and data integrity:
- varchar takes longer to compare
Because the varchar type allows character data, comparing two varchar values takes longer than comparing two int values. Worse, the comparison follows string semantics rather than numeric ones, so it can return a different answer entirely.
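As a quick sketch of the semantic difference (the column aliases are just illustrative), the same two values compare differently as strings than as numbers:

```sql
-- As integers, 9 < 10 is true.
SELECT CASE WHEN 9 < 10 THEN 'true' ELSE 'false' END AS int_compare;
-- returns 'true'

-- As strings, comparison goes character by character: '9' > '1',
-- so '9' sorts after '10' and the comparison is false.
SELECT CASE WHEN '9' < '10' THEN 'true' ELSE 'false' END AS varchar_compare;
-- returns 'false'
```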
- varchar removes the type check

Perhaps the greatest reason not to switch to varchar is the loss of type checking. If you expect the column’s value to be an integer, an int column verifies on insert that the value really is an integer. Make the column a varchar and that check disappears: the column is open to invalid character data that can cause any SQL script that performs conversions to fail.
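Here is a minimal sketch of that failure mode, using a hypothetical Orders table that stores the same value both ways:

```sql
-- Hypothetical table: the same quantity stored as int and as varchar.
CREATE TABLE Orders (
    QuantityInt     int,
    QuantityVarchar varchar(10)
);

-- The int column rejects non-numeric data at insert time...
INSERT INTO Orders (QuantityInt) VALUES ('abc');      -- fails: conversion error

-- ...but the varchar column accepts it without complaint,
INSERT INTO Orders (QuantityVarchar) VALUES ('abc');

-- and any later script that converts the column blows up.
SELECT CAST(QuantityVarchar AS int) FROM Orders;      -- fails: conversion error
```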
To wrap things up, let's keep it simple: if you need to store an integer value in a table, use the int data type.