Bulk Insert a fixed-width file

The single-CLOB load was just to get the data into SQL; of course you would parse and split it after that. The advantage is that the column delimiter and row delimiter can then be determined dynamically, or even be genuinely non-existent. Naturally, when SQL loads a CLOB it doesn't treat anything but the end-of-file marker as a special character - afaik.
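For reference, a minimal sketch of that single-CLOB load - the file path and table name are just placeholders:

    -- Pull the whole file into one varchar(max) value; nothing inside the file
    -- is treated as a delimiter, only end-of-file matters.
    SELECT  BulkColumn AS FileContents
    INTO    dbo.RawImportFile
    FROM    OPENROWSET(BULK 'C:\Import\SomeFixedWidthFile.txt', SINGLE_CLOB) AS f;

From there the row and column splitting is all done in T-SQL, so an odd or missing terminator on the final row is just another case to handle in code.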

I don't know how important "Missing final row terminator" is - whether it will prevent a Bulk Import from working? - but it's certainly something that I encounter [not infrequently] in data-to-be-imported that we are asked to build import routines for.

The other thing that we see reasonably frequently is [fixed-length] rows with the last column trimmed of any trailing spaces.

A PITA ... but they are the sorts of things that I think your single-CLOB approach would solve.

If the Bulk Import tool will handle any line-terminator issues (the only one I can think of is "missing from the final row"), then my preference would be to import the file split into [single-column] rows on the row terminator, and then split that single column into the individual columns under SQL control - that can take care of trimmed final column(s) and of checking the validity of the rows.
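Roughly what I mean, as a sketch only - the staging table, file path, and column positions are all made up for the example:

    -- 1) Stage each raw row into a single wide column.
    CREATE TABLE dbo.ImportStaging (RawLine varchar(4000) NULL);

    BULK INSERT dbo.ImportStaging
    FROM 'C:\Import\SomeFixedWidthFile.txt'
    WITH (ROWTERMINATOR = '\r\n',    -- or '\n' if the supplier sends Unix line endings
          FIELDTERMINATOR = '\0');   -- a terminator that never appears, so the whole row stays in one column

    -- 2) Split into the real columns under SQL control.
    --    SUBSTRING past the end of a short (trimmed) row just returns an empty string,
    --    so last-column-trimmed rows come through cleanly.
    INSERT INTO dbo.TargetTable (Col1, Col2, Col3)
    SELECT  RTRIM(SUBSTRING(RawLine,  1, 10)),
            RTRIM(SUBSTRING(RawLine, 11, 40)),
            RTRIM(SUBSTRING(RawLine, 51, 10))
    FROM    dbo.ImportStaging;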

We definitely get clients who get 0-byte files due to Disk Full (i.e. it's a common occurrence rather than a rarity), and we check for that prior to actually attempting to import them, along with "some files present, some missing" (we abort without importing anything and report to the User). But I imagine a Disk Full could also cause a truncated, rather than 0-byte, file. There is perhaps a reasonable chance - hopefully?!! - that it will NOT truncate exactly on a row boundary, so if rows are not normally trimmed then detecting a short final row would be an indicator of that (and it would probably cause any regular fixed-length import tool such as BCP or SSIS to just blow up with an error ...)
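For the "short final row" indicator, a rough post-staging check - the table name and widths below are invented for the example - might look like:

    -- DATALENGTH rather than LEN, because LEN ignores trailing spaces.
    -- @MinRowLen is set to the start position of the last column, so that
    -- rows with just the last column trimmed are still accepted.
    DECLARE @MinRowLen int = 80;

    IF EXISTS (SELECT 1 FROM dbo.ImportStaging WHERE DATALENGTH(RawLine) < @MinRowLen)
        RAISERROR('Row shorter than the fixed layout allows - file may be truncated.', 16, 1);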

I've worked for Clients where the DBAs frequently spend the whole morning scratching their heads over why some scheduled SSIS jobs have failed, because the error messages are so obtuse ... My preference is to build in resilience, and then add to that resilience each time yet-another-unexpected-hosing occurs, so that our time-to-diagnose is as short as possible.

Well, this was my first run at SSIS. It took some time to set the files up and map them properly because of the number of columns, but it worked. My only issue after that was getting it to look for the file that ends with today's date (e.g. agnt20150731.txt), but I am pleased to say that I just finished creating my first SSIS package. On to the next phase, which is actually getting the data into the appropriate tables and then making all of this part of a stored procedure.
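For the stored-procedure phase, one way to build the dated file name and fire the import from T-SQL - the path and staging table below are just placeholders - would be something like this:

    DECLARE @FileName varchar(260) =
            'C:\Import\agnt' + CONVERT(char(8), GETDATE(), 112) + '.txt';   -- 112 = yyyymmdd, e.g. agnt20150731.txt

    -- BULK INSERT won't take a variable for the file name, so build the statement dynamically.
    DECLARE @sql nvarchar(max) =
            N'BULK INSERT dbo.ImportStaging FROM ''' + @FileName + N''' WITH (ROWTERMINATOR = ''\r\n'');';

    EXEC (@sql);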

Thanks again for your help....all of you.

Scott

Apologies for stating the obvious, but if it were me I would throw some "deliberately broken" files at your SSIS package to see how it reacts. On my list would be:

Truncated file - broken somewhere in the middle of a row - i.e. the last row appears "short" (presumably SSIS can't know it is truncated, unless there is a separate import file with "Number of expected rows per file" or somesuch).

Final row terminator missing

and I suppose, although a bit belt-and-braces, one/several rows in the file that have the correct row terminator but are "short" (the last-column-trimmed syndrome).
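On the SQL side of things, a quick row-length profile of the staged rows (same hypothetical single-column staging table as in the sketches above) shows straight away whether any of those cases got through:

    -- Distribution of raw row lengths; anything unexpectedly short stands out immediately.
    -- DATALENGTH is used because LEN would ignore trailing spaces.
    SELECT  DATALENGTH(RawLine) AS RowLen, COUNT(*) AS NumRows
    FROM    dbo.ImportStaging
    GROUP BY DATALENGTH(RawLine)
    ORDER BY RowLen;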