Date format conversion

Date time format conversion
I need to convert the date/time format of a column at the destination.
source : 12/15/2014 --> 12/215/2014 8:59:49.456

I am guessing that the 215 is a typo. Even so, if your source does not have the time portion, how/where do you get it? Is it the current time, or is there another column that stores the time?

12/15/2014 --> 12/15/2014 0:00:00.000. Sorry, that was copied from another column to indicate the format. Yes, if we don't have any time it should come through as zeros in the formatted field.

If your data type is DATETIME, it already has a time portion. For example:

DECLARE @today DATETIME = '20150824';
SELECT @today;

This prints out

2015-08-24 00:00:00.000

So you can think of the data as always being stored with a time portion, even if you never supply one, IF the data type is DATETIME.

If you want to output that in a specific format, you can use the CONVERT function or, on SQL Server 2012 or later, the FORMAT function.
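For example, here's one way to get the poster's target format (mm/dd/yyyy plus a 24-hour time with milliseconds) using two CONVERT styles glued together; the variable and its value are just for illustration:

```sql
DECLARE @d DATETIME = '20141215';  -- sample value; substitute your column

-- Style 101 = mm/dd/yyyy, style 114 = hh:mi:ss:mmm (24-hour).
-- STUFF swaps the colon before the milliseconds for a period.
SELECT CONVERT(CHAR(10), @d, 101) + ' '
     + STUFF(CONVERT(CHAR(12), @d, 114), 9, 1, '.') AS Formatted;
-- 12/15/2014 00:00:00.000
```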

Consider NOT using FORMAT. It's 44 times slower than CONVERT for dates and times.

Jeff, can you point me to any references or benchmarks that measure this performance penalty? I was under the impression that functions implemented in CLR (as FORMAT is) might be slow on the first run if SQL Server has not already loaded the required CLR libraries, but once it loads them, the performance should be comparable, if not better.

Apologies for the delay, James.

No need for a white paper. We can easily do our own benchmarks. Here's the code. See the "Messages" tab for the bad news and confirmation of the performance penalty. It's likely not a "first run of CLR" penalty. It's more like some foolish implementation of RegEx behind the scenes.

Run this and see for yourself.

--===== Do this in a nice, safe place that everyone has.
    USE tempdb
;
--===== Conditionally drop the test table to make reruns easier in SSMS.
     IF OBJECT_ID('tempdb.dbo.JBMTest','U') IS NOT NULL
        DROP TABLE tempdb.dbo.JBMTest
;
--=====  Create a substantial test table with the following columns and values.
     --  SomeID         = Unique Integers starting at 1 up thru the number of rows generated.
     --  SomeInt        = Random Integers 1 thru 50,000
     --  SomeLetters2   = Random letters "AA" thru "ZZ"
     --  SomeDecimal    = Random Decimal amounts from 0.00 up to and not including 100,000
     --  SomeDate       = Random Datetime from 2010-01-01 up to and not including 2020-01-01
     --  SomeHex        = Random hexadecimal characters with random locations of dashes.
 SELECT  TOP (1000000) 
         SomeID         = IDENTITY(INT,1,1)
        ,SomeInt        = ABS(CHECKSUM(NEWID()))%50000+1
        ,SomeLetters2   = CHAR(ABS(CHECKSUM(NEWID()))%26+65)
                        + CHAR(ABS(CHECKSUM(NEWID()))%26+65)
        ,SomeDecimal    = CAST(RAND(CHECKSUM(NEWID()))*100000 AS DECIMAL(9,2))
        ,SomeDate       = RAND(CHECKSUM(NEWID()))*DATEDIFF(dd,'2010','2020')+CAST('2010' AS DATETIME)
        ,SomeHex        = SUBSTRING(LEFT(NEWID(),36),ABS(CHECKSUM(NEWID()))%37,ABS(CHECKSUM(NEWID()))%37)
   INTO dbo.JBMTest  
   FROM      sys.all_columns t1 
  CROSS JOIN sys.all_columns t2 
;
--===== Add the PK
  ALTER TABLE dbo.JBMTest
    ADD PRIMARY KEY CLUSTERED (SomeID)
;
GO
  PRINT '========== FORMAT METHOD ==================================='
GO
DECLARE @BitBucket CHAR(8);
    SET STATISTICS TIME ON;
 SELECT @BitBucket = FORMAT(SomeDate,N'yyyyMMdd') -- Note: .NET format strings use MM for months; mm means minutes.
   FROM dbo.JBMTest;
    SET STATISTICS TIME OFF;
GO 3
  PRINT '========== CONVERT METHOD =================================='
GO
DECLARE @BitBucket CHAR(8);
    SET STATISTICS TIME ON;
 SELECT @BitBucket = CONVERT(CHAR(8),SomeDate,112)
   FROM dbo.JBMTest;
    SET STATISTICS TIME OFF;
GO 3

Here are the results from 2012 EE SP3 and the latest CU. Like I said, FORMAT is 44 times slower than CONVERT. In this case, it's 45 times slower. Since they refer us to a .NET page for the formatting part of the function when we look it up in BOL, I wonder if the other .NET based languages also have the same problem.

========== FORMAT METHOD ===================================
Beginning execution loop

 SQL Server Execution Times:
   CPU time = 36735 ms,  elapsed time = 38477 ms.

 SQL Server Execution Times:
   CPU time = 37000 ms,  elapsed time = 38577 ms.

 SQL Server Execution Times:
   CPU time = 36812 ms,  elapsed time = 38577 ms.
Batch execution completed 3 times.
========== CONVERT METHOD ==================================
Beginning execution loop

 SQL Server Execution Times:
   CPU time = 828 ms,  elapsed time = 834 ms.

 SQL Server Execution Times:
   CPU time = 813 ms,  elapsed time = 826 ms.

 SQL Server Execution Times:
   CPU time = 843 ms,  elapsed time = 850 ms.
Batch execution completed 3 times.

Thanks Jeff. That's awesome!!

Well, I didn't mean that the behavior is awesome, it kinda sucks!!:smiley:

It DOES suck! You would think that the folks in Redmond would have some inkling of what the word "performance" means. They did the same for "Preceding Rows" and the like for Running Totals. The "Quirky Update" still blows everything away for performance. I've re-verified the performance penalty of using "supported code" on 2012 SP3 with the latest CU using the fine test harness that Wayne Sheffield provided when it first came out. See the following URL for that testing.

http://blog.waynesheffield.com/wayne/archive/2011/08/running-totals-in-denali-ctp3/

Anyway, my apologies for not originally posting the code on a performance claim. I was in a hurry when I posted that. And thank you for the feedback.

Please see this for conversion:

SELECT TO_CHAR(Column_name,'DD-MON-YYYY,HH24:MI:SS') FROM Dual

This is a Microsoft SQL Server forum.
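For anyone who lands here looking for the equivalent, that Oracle TO_CHAR expression maps roughly to the following in T-SQL (GETDATE() stands in for the column; the output pieces are assumptions based on the documented CONVERT styles, and FORMAT carries the performance penalty demonstrated above):

```sql
-- FORMAT (SQL Server 2012+): dd-MMM-yyyy,HH:mm:ss, e.g. 24-Aug-2015,13:05:09
SELECT FORMAT(GETDATE(), 'dd-MMM-yyyy,HH:mm:ss');

-- Faster CONVERT-based alternative:
-- style 106 = dd mon yyyy (spaces replaced with dashes), style 108 = hh:mi:ss
SELECT REPLACE(CONVERT(CHAR(11), GETDATE(), 106), ' ', '-') + ','
     + CONVERT(CHAR(8), GETDATE(), 108);
```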