SQLTeam.com | Weblogs | Forums

Flat file to SQL conversion problem

Hello all,

I am trying to import a flat file to SQL Server:


data format: "1/2/2019 12:00:00 AM"

the target data type is datetime2 = 2019-01-02 00:00:00.0000000

I tried setting the flat file source column to data type DT_DBTIMESTAMP2 with scale 7,

but it fails with "conversion failed because the data value overflowed the specified data type".

Any suggestions and help please!

Can you post the DDL (CREATE TABLE) statement of the table you're trying to import into?

Also, since your textual dates are not formatted in ISO yyyy-mm-dd, it's possible you have a value that's interpreted with the wrong setting, e.g. D/M/YYYY rather than M/D/YYYY. That could overflow a month value, for instance.
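As a quick way to test that theory (assuming SQL Server 2012+ for TRY_CONVERT), you can see how the same string succeeds or fails depending on which day/month order the conversion style implies:

```sql
-- Style 101 reads the string as mm/dd/yyyy (US); style 103 as dd/mm/yyyy (British/French).
-- A value like "13/2/2019" is only valid if the 13 is treated as the day, so the
-- US-order conversion returns NULL (the month value "overflows") while the other succeeds.
SELECT
    TRY_CONVERT(datetime2(7), '13/2/2019', 101) AS mdy_order,  -- NULL
    TRY_CONVERT(datetime2(7), '13/2/2019', 103) AS dmy_order;  -- 2019-02-13
```

If any row in your file has a day greater than 12 and SSIS is assuming the wrong order, you'd get exactly the overflow error you're seeing.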

create table obj_test (
    obs_id int,
    per_id int,
    obs_sdate datetime2(7),
    obs_edate datetime2(7),
    ptype_id int
);

The problem is that if I do a straight flat-file import it works fine, but when I try to create an SSIS package to automate the process, it fails with the overflow error.

A straight insert is also fine; it only fails when I try to run the update from the lookup.


The best suggestion I have is to redirect the error rows to another table where that column is varchar, rather than a datetime type, and see if there's bad or badly formatted data.
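A minimal sketch of such an error-capture table (the names here are just illustrative, mirroring the DDL above):

```sql
-- Every column as varchar so nothing can overflow during the redirect;
-- inspect the raw text afterwards to find the offending values.
CREATE TABLE obj_test_errors (
    obs_id    varchar(50),
    per_id    varchar(50),
    obs_sdate varchar(50),
    obs_edate varchar(50),
    ptype_id  varchar(50)
);
```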

Or try changing the datetime2 columns to varchar and see if the import works. If it succeeds, then try a SELECT query like this:

SELECT * FROM obj_test
WHERE TRY_CONVERT(datetime2(7), obs_sdate) IS NULL
OR TRY_CONVERT(datetime2(7), obs_edate) IS NULL

If that DOESN'T return any rows/bad data, then I'm at a loss.

Here is one more scenario :)
I changed the source (flat file) column to DT_STR (as original),

and in the lookup I cast the datetime columns to varchar(50),

then mapped the columns. The insert went through successfully for 500k records, but the update part just hangs there and never finishes...


You mentioned a lookup operation, is that looking up a date value? Is that lookup hitting a table in the same database? Same SQL Server? How large is the table being looked up (rowcount) and is the lookup column indexed?

If the data is imported into the same database as the lookup, do you need to do that lookup in SSIS? From what you describe it sounds like it's scanning a heap or unindexed column, or the connection you're using is implementing a cursor behind the scenes.
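For example (hypothetical table and column names, assuming everything lives in the same database), a single set-based UPDATE with a supporting index will usually finish quickly where a row-by-row SSIS lookup/update hangs:

```sql
-- Index the join column on the lookup table first (skip if one already exists).
CREATE INDEX IX_lookup_per_id ON lookup_table (per_id);

-- One set-based update instead of 500k single-row updates issued from SSIS.
UPDATE t
SET    t.ptype_id = l.ptype_id
FROM   obj_test AS t
JOIN   lookup_table AS l
       ON l.per_id = t.per_id;
```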

SSIS does not recognize that format as a date/time format - and if you want it to be defined in SSIS as a date/time data type you need to use a derived column transformation or a data conversion transformation.

To use either, you need to know the format of the inbound data. Is it d/m/yyyy hh:mm:ss tt, or m/d/yyyy hh:mm:ss tt? Once you know that, you parse out each part and build the string as YYYY-MM-DD HH:MM:SS, which converts unambiguously to the SSIS date/time data types.

Or - you import the data into SQL Server as a string and perform the conversion there. But again - you need to know the style and use that in CONVERT.
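A sketch of that second approach, assuming the strings are US-order m/d/yyyy (style 101) and a hypothetical varchar staging table obj_test_stage:

```sql
-- Load the raw text into a staging table, then convert with an explicit style.
-- TRY_CONVERT (SQL Server 2012+) turns unparseable rows into NULL instead of
-- failing the whole batch, so bad rows are easy to find afterwards.
INSERT INTO obj_test (obs_id, per_id, obs_sdate, obs_edate, ptype_id)
SELECT obs_id,
       per_id,
       TRY_CONVERT(datetime2(7), obs_sdate, 101),  -- 101 = mm/dd/yyyy
       TRY_CONVERT(datetime2(7), obs_edate, 101),
       ptype_id
FROM   obj_test_stage;
```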

Casting as VARCHAR(50) in your lookup isn't going to work well if you are trying to perform a lookup based on that date/time value.


Thanks for the reply.

The data comes in from a CSV file (DT_STR), data format: "1/2/2019 12:00:00 AM"

The data type in the database is datetime2(7) - 2019-01-02 00:00:00.0000000

Agreed, casting in the lookup is not working well at all...



Found the solution by adding a derived column to convert the data type from DT_STR to DT_DBTIMESTAMP2:

Expression: (DT_DBTIMESTAMP2,7)((DT_DBDATE)(TOKEN(Date," ",1)))

But now it hangs on the deletion: I need to delete rows where the id doesn't exist in the source.

I tried an OLE DB task to execute delete from table where id != id (from source). What is a better way to do this task?
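If the source rows are staged in a table first (hypothetical name obj_stage below), a single set-based DELETE is usually the better option; an OLE DB Command in the data flow executes once per row, which is why it hangs on large sets:

```sql
-- Delete target rows whose id no longer exists in the staged source data.
-- Runs as one statement, not 500k individual deletes.
DELETE t
FROM   obj_test AS t
WHERE  NOT EXISTS (SELECT 1
                   FROM   obj_stage AS s
                   WHERE  s.obs_id = t.obs_id);
```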