SQLTeam.com | Weblogs | Forums

Best practice: converting INT to VARCHAR


What are your opinions on converting INT (and indeed anything else like GUIDs) to VARCHAR?

For INTs I see the following used

	SELECT	']'+CAST(MyINT as varchar(20))+'[' AS [Cast],
		']'+CONVERT(varchar(20),MyINT)+'[' AS [Convert()],
		']'+STR(MyINT,20)+'[' AS [Str()],
		']'+LTRIM(STR(MyINT,20))+'[' AS [LTrim(Str())]
	FROM
	(
		SELECT	2147483647 AS MyINT UNION ALL	-- Max INT
		SELECT	9223372036854775807		-- Max BigINT
	) AS T

I use VARCHAR(20) as a round number that I can remember, and big enough for any of these, but I typically see CONVERT(varchar(5), MyINT), which to my mind is potentially hiding a future failure for when MyINT ticks over from 99,999 to 100,000.

I also see CONVERT(varchar, MyINT) or CAST(MyINT as varchar), which I dislike because the default size for varchar can be either 30 (in CAST/CONVERT) or 1 (in a DECLARE) :frowning: For me the default of varchar(30) is unnecessarily wide - e.g. if the CONVERT is in a SELECT and the receiving APP decides on a column width based on the declared size of the column, rather than the widest value it contains. CONVERT(varchar(20), MyINT) may be far too wide for many numbers - I don't think I have ever hit the maximum value for a BigInt! - but it does at least fail safe.
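The two defaults are easy to demonstrate - a quick sketch (the variable name is just for illustration):

	DECLARE	@v varchar = 'abcdef'			-- DECLARE default is varchar(1) ...
	SELECT	@v AS [DeclareDefault],			-- ... so this is silently truncated to 'a'
		CAST(123456 AS varchar) AS [CastDefault]	-- CAST/CONVERT default is varchar(30)
	SELECT	CONVERT(varchar(5), 123456)		-- too narrow: raises error 8115 rather than truncating

That last one is the "fail safe" behaviour - an INT-to-varchar conversion that doesn't fit raises an arithmetic overflow error instead of quietly losing digits.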

I see LTrim(Str(MyINT)) used but I've never used that style. It is safe up to the max value for INT, but not for BigInt (without adding a length parameter). It just seems odd to me to generate a left-side-padded string which is then left-trimmed (and there might also be a "cost" to the implicit conversion of INT to FLOAT before STR() converts it to a string?)
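A quick sketch of the two GotChas in STR() (I haven't verified the exact output here, so treat it as an assumption to test):

	SELECT	STR(2147483647, 20)		AS [MaxIntOK],	-- fine: INT is well within FLOAT precision
		STR(9223372036854775807)	AS [DefaultLen],	-- default length is 10, so Max BigInt comes back as '**********'
		LTRIM(STR(9223372036854775807, 20)) AS [MaxBigInt]	-- goes via FLOAT, so the trailing digits may be rounded

FLOAT only carries about 15 significant digits, so anything above 2^53 risks losing the low-order digits on the way through STR().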

A GUID only needs 36 characters (or perhaps allow 38 for the surrounding "{...}" which sometimes comes into my APP from the ADO layer), but I convert using 40 because it's a round number that I can easily/safely remember.

	SELECT	']'+CAST(MyGUID as varchar(40))+'[' AS [Cast],
		']'+CONVERT(varchar(40),MyGUID)+'[' AS [Convert()]
	FROM
	(
		SELECT	NEWID() AS MyGUID	-- any GUID will do for the demo
	) AS T
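For what it's worth, the string form really is 36 characters - 32 hex digits plus 4 hyphens - which is easy to sanity-check:

	SELECT	LEN(CONVERT(varchar(40), NEWID())) AS [GuidLen]		-- 36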

I'd appreciate your thoughts on how you do this, what GotChas you encounter and what you consider to be Best Practice.


Depends on what you're going to do with the result. If it's going to an application, I'd do any conversions in the application, not SQL. If you're going to another table, match the datatype (and length!) to the target table to eliminate implicit conversions.
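A sketch of the second point, with hypothetical table/column names - sizing the explicit CONVERT to match the target column means nothing is left to an implicit conversion:

	-- assume dbo.Target(Code varchar(10)) and dbo.Source(MyINT int)
	INSERT INTO dbo.Target (Code)
	SELECT	CONVERT(varchar(10), MyINT)	-- explicit, and sized to match the target column
	FROM	dbo.Source

(inserting MyINT directly would still work, but via an implicit INT-to-varchar conversion)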


Good points, thanks.

Do you CONVERT, CAST, STR() ... or something else? Any reason why you do it that way - e.g. some considered opinion for it being more Defensive Programming, or avoiding some GotCha?


CAST unless I need the third argument of CONVERT


That's why I use CONVERT - I see it as being consistent whether I need the 3rd parameter or not.

I have to use SELECT for:

SELECT @MyErrNo = @@ERROR, @MyRowCount = @@ROWCOUNT

(SET cannot assign multiple variables in a single statement) so I always use SELECT to set @variables, and never use SET, for the same reason of consistency.
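The single-statement capture matters in its own right, too: @@ERROR and @@ROWCOUNT are reset by nearly every statement, so a SET-based pair cannot work - the first SET would clobber the values before the second ran. A sketch with a hypothetical table:

	DECLARE	@MyErrNo int, @MyRowCount int

	UPDATE	dbo.MyTable
	SET	SomeColumn = 1
	WHERE	SomeID = 123

	-- both captured in the ONE statement immediately after the UPDATE
	SELECT	@MyErrNo = @@ERROR, @MyRowCount = @@ROWCOUNT

	-- whereas SET @MyErrNo = @@ERROR followed by SET @MyRowCount = @@ROWCOUNT
	-- would report @MyRowCount = 1 (the row count of the first SET), not the UPDATE's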


I try to stick to ANSI SQL where possible, hence using CAST over CONVERT in some cases


I don't use any other flavour of SQL any more (we used to do Sybase, Oracle, Watcom :smile: - forgotten who that was sold to / what it became - and something by Gupta at some dim & distant point in the past), and there is zero chance of any of my MSSQL / T-SQL stuff migrating to A.N.Other database without a huge amount of work, so I am happy to use any MS-specific flavour that seems to be the most "useful" (the main criterion for "useful", for me, being how much it helps with "defensive programming")


Kristen, keep the good questions coming! In this case I am solely learning, but I do have an opinion about where the conversion should happen. I try to always pass something in the form that it is needed in on the other end. That way it is clear on both ends what is being passed. If it is a query passing to an application, the query will do the convert.


I am hopeful that the discussion is useful to others, and that there are at least as many lurkers as there are typers :smile:

You mean having the user's Data Entry Form enforce data validation rules?

Or something else perhaps?

I don't know if it is the right thing to do, but all our SProcs that are "fed" from our Web Forms believe & trust that the data they get is clean. Our Web Forms have very sophisticated validation abilities - our view is that the user prefers to be told that their data is invalid the moment that they type it, rather than "when they press SUBMIT". Immediately after completing a form field the thing they are supposed to be typing is fresh in their mind, and they have the appropriate piece of paper / email / telephone-conversation to hand (and they have "parsed" that information). Telling them later that the data is goofy means they have to repeat all that data-acquisition & parsing all over again.

Our SProcs which get data from Other Data Sources - particularly other databases which our clients are Living and Dying by!!! - we do not trust, not even one inch. In fact I am staggered that such clients make money, seemingly easily, with the rubbish systems they use ...

We have a Data Validation Errors module as part of our Data Integration module (which is responsible for sucking in data from "other systems"). If we see goofy data (i.e. anything for which we have a validation rule) we flag it in our Data Validation Errors table - their admin user gets an email alerting them to "new additions to the errors table", and there is a report they can interrogate (which attempts to show them very specifically which Source Record and which Column is at fault - in the hope that makes it easy for them to fix the problem in the source system).

I'm talking about things like "Invalid email address" in a column that stores an email address. How hard can that be? Even something simple like "Don't allow 2x @", otherwise users will put a list of EMail addresses in there (which might be fine if that's what the field is intended for, but useless otherwise, because the automatic mailing tool will not, by default, be expecting to get a list of TO addresses ...).

Anyway, clients who are making good money, and think they have Clean Data, are usually running with 10s if not 100s of thousands of errors that our Data Integration module has found ... and clearly :frowning: the client is not motivated to fix them ... it is depressing.
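The "2x @" rule (plus a crude shape check) is cheap to express in T-SQL - a hedged sketch, with hypothetical table/column names:

	SELECT	SourceID, EmailAddress
	FROM	dbo.ImportedContact			-- hypothetical staging table
	WHERE	EmailAddress LIKE '%@%@%'		-- two or more @ = probably a pasted list, flag it
		OR EmailAddress NOT LIKE '%_@_%.%'	-- doesn't even vaguely look like an email address

Rows matching either test would go into the Data Validation Errors table rather than being rejected outright.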

Hadn't thought about it before, but the chatter in the other recent thread about Static Code Analysis (http://forums.sqlteam.com/t/sql-lint-code-analysis-tools/2639) is also relevant to Client Data. The effort of converting fellow programmers to write tighter Defensive code ought to be repeatable - and saleable as a chargeable product!! - for Clients, who could improve the quality of their data using the same sort of static analysis, with users trained to recognise the benefit & cost saving to the organisation.


Last night I gave a SQL lecture at Ryerson University in Toronto. I used SQL Server for my demos, since it has Canadian heritage (WatCom)!


Much simpler than that, I think. If I am generating a spreadsheet and a column requires a calculated date, I would do the conversion in the query and return that column as the final date. Let's say the required date is the previous business day minus 1 - a common thing when the client report comes in on one day, is processed the next, and reported on the third business day. The conversion would be done in the query, and that column in the table/result returned would be the date the client did the work. On the application or receiving side, the date received already matches the spreadsheet column without needing processing.
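As a (heavily hedged) sketch of doing that conversion query-side - column names hypothetical, and naively ignoring weekends/holidays:

	SELECT	CONVERT(date, DATEADD(day, -2, ReceivedDate)) AS [ClientWorkDate]	-- "two business days back", naively
	FROM	dbo.ClientReport

so the receiving side just drops the column into the spreadsheet as-is.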

I realize that filling a web form and filling a spreadsheet have differences, but passing should be similar.


We used Watcom's C compiler (early 80's I guess) because the code it generated was hugely tighter than anything else on the market. We were still on 8-bit Intel platform at the time (Z80 or 8080 ...) and there was a maximum of around 48K of usable memory.

As I recall Watcom then branched out and created a "Light weight SQL", which suited some of our smaller clients. From (aging!) memory it had an optimisation that if you did a COUNT(*) on something there was a reasonable chance you would then ask for the underlying rows, so it pre-fetched them just in case. We used this (systems were relatively slow, back then) to display a message if the user's query would retrieve more than, say, 100 rows:

"Retrieve 1,246 records?"

By the time the user said YES the data was already cached by Watcom :smile: so the query appeared to be really quick, and made our APP look good :sunglasses:

By the 90's we were using Powersoft for our User Enquiry stuff and it came bundled with Watcom SQL for "small systems" ... and then Sybase bought Powersoft ...and that spawned Sybase SQL Anywhere ... whereas MS SQL came out of the Grown Up version of Sybase - hopefully none of your audience will check up on that though :smile:

My goodness MS SQL (and thus Sybase) was a dog back then. I think we started with 6.5, might even have been 6.0, and I remember the first install we did being a nightmare because there was so much we couldn't get working [mostly at the what-can-you-get-through-an-ODBC-connection level] (having only supported Watcom and Oracle up to that point, both of which were far more capable). My undying thanks to the DBA for the client we had in Philadelphia at the time who helped me enormously to migrate code to MS SQL and stopped my first visit to that client from looking like a completely hopeless case!

When fixed filesizes for database went away (SQL 7 IIRC?) our life was made SO much easier ...

I flew up to Toronto to visit a cousin of mine for the weekend on one of those visits. At the time there were advertisements, here in the UK, about how wonderful Air Canada's Executive Class service was - marvellous footage of Canadian comedian Mike McShane (who was well known in the UK) in a generous seat being waited on hand-and-foot, all the walls in the plane covered with Maple leaves in autumn colours ... it looked wonderful :smile: [this wasn't the one, but it was similar https://www.youtube.com/watch?v=CGjo7E0padA ] so I booked myself on Air Canada First Class Return from Philly ... they gave me a packet of peanuts on the way up, and a single Satay stick on the way back!!!