SQLCMD: Import a CSV file from a different server into a SQL Server database (on a different server)

So, I have SQL Server on ServerA and the CSV file is on ServerB.
How can I use SQLCMD if the file needs to be on the same server as the database?

I can connect to ServerA from ServerB using SQLCMD.
I want to set up a batch job to import a CSV file from ServerB into SQL Server on ServerA. There is no network share drive.

I would use BCP rather than SQLCMD.

If the file is on ServerB there should be no problem running BCP on ServerB and "pointing it" at ServerA.
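For example, run from ServerB (a sketch only: the instance name, database, table, file path, and credentials below are all placeholders):

```shell
rem Run on ServerB; -S points BCP at the instance on ServerA, so the
rem data file only needs to exist locally on ServerB.
rem All names, paths, and credentials here are placeholders.
bcp ReportDb.dbo.TempTextData in D:\exports\combined.csv -c -t, -S ServerA -U importuser -P xxxxx
```

Here -c is character mode and -t, sets the field terminator to a comma.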

Was the CSV data exported from an MS SQL Server database? If so, I suggest it would be much better to export it using BCP with the NATIVE data format option, as that avoids all the hassle of converting datatypes to strings, and back again, and all the vagaries of delimiting the data as CSV.
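If that is an option, the round trip might look something like this (all names and paths are placeholders; -T uses a trusted connection on the source, swap for -U/-P as needed):

```shell
rem On the source server: export in native format (-n), which keeps the
rem datatypes binary and sidesteps CSV quoting/conversion entirely.
bcp SourceDb.dbo.NightlyData out D:\exports\nightly.dat -n -S SourceServer -T

rem Against ServerA: import the same file, also in native format.
bcp ReportDb.dbo.TempTextData in D:\exports\nightly.dat -n -S ServerA -U importuser -P xxxxx
```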

What process generates the CSV file? And what is the source of the data in it?

Hello,

The CSV file is created nightly by an in-house application.

Kristen - Thanks for the reply. I will try the BCP method.

Does the in-house application have some sort of database backend?

Yosiaz - No, it creates CSV files every night. Then another program combines all the CSV files into one file. Then I need to take the combined CSV and import it into our database.

But I am having an issue with BCP:
bcp report.dbo.TempTextData in C:\inetpub\wwwroot\TwilioSmsHandler\log\combined%latest% -t -c -S %sqlinstancename% -U %PRDlogin% -P %PRDpassword%

I get the error below. Please help.

Starting copy...
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC Driver 13 for SQL Server]Unexpected EOF encountered in
BCP data-file
(the same SQLState/Error pair is repeated ten times in total)

BCP copy in failed
Press any key to continue . . .

Yes, I understand that part. I am just wondering where the data in the CSV files comes from.

Assuming that your CSV file has combinations of:

comma separated fields (that part is fine)
Some columns wrapped with double-quotes (because they contain a comma)
Some columns additionally containing doubled-up quotes, because they are wrapped with double-quotes but themselves include a double quote

then I reckon that all bets are off trying to use BCP to import that.

The same goes if any field contains an embedded line break, or if any rows have fewer comma delimiters (e.g. because the trailing fields are all "empty").

IME all these things are common (and, indeed, acceptable) in CSV files. Unfortunately they are not (AFAIK) supported by BCP. Why? I don't know; it's 20 years since I started using BCP, CSV files are commonplace and the format is well understood, and short of using XML or binary/native-mode files there isn't a suitable alternative (and CSV is no use for fields with embedded line breaks, nor is any other simple text file format). Microsoft should have fixed all this decades ago ...
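You can see the quoting problem with a quick test: split a quoted CSV row on commas alone (which is effectively all that BCP's -t, does) and the field count comes out wrong. The sample row here is made up:

```shell
# A header plus one data row whose second field contains an embedded comma
# and whose third contains doubled-up quotes (sample data, made up).
printf '%s\n' 'id,name,notes' '1,"Smith, John","said ""hi"""' > /tmp/sample.csv

# A naive split on comma reports 4 fields on the data row, not 3:
awk -F',' 'NR==2 { print NF }' /tmp/sample.csv
```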

If the file's fields are delimited with commas, there are NO wrapping double-quotes, NO field has an embedded comma, and there are NO embedded line breaks, then you are probably fine.

You also need the target import table to have its columns in exactly the same sequence as the CSV file.

If you have any of the wrapper-quotes / embedded commas / short-rows problems then the only way that I know is to use BCP to import to a temporary table, treating each row as a SINGLE column, and then "splitting" the rows, into multiple columns, using SQL. This allows a lot of flexibility - e.g. columns NOT in the same sequence as the target table, and you could probably even handle embedded line breaks.
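A sketch of that staging approach (the table names, the tab terminator, and the two-column split are all illustrative; the T-SQL shown handles only the simple unquoted case):

```shell
rem 1) Stage each whole line as a single column. -t is set to tab on the
rem    assumption the data contains no tabs; pick any character that
rem    cannot occur in the file. Names and paths are placeholders.
bcp ReportDb.dbo.RawLines in D:\exports\combined.csv -c -t"\t" -r"\n" -S ServerA -U importuser -P xxxxx

rem 2) Split the staged lines into real columns with SQL. This simplified
rem    split assumes the first field contains no embedded comma.
sqlcmd -S ServerA -U importuser -P xxxxx -d ReportDb -Q "INSERT INTO dbo.TempTextData (Col1, Col2) SELECT LEFT(Line, CHARINDEX(',', Line) - 1), SUBSTRING(Line, CHARINDEX(',', Line) + 1, 8000) FROM dbo.RawLines;"
```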

I always use these parameters to store as much output / debug information as possible. BCP won't necessarily abort on error, and these output files may be the only after-the-fact clue that you get that an unattended batch operation has gone haywire!

-e ErrorFileName
-o OutputFileName
-m 1000 (or bigger)
-b (Suitable Batch Size value)

Consider also:

-w if you have Wide characters
-E if you have Identity column
-F if your first row contains column names (Case sensitive)
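Put together, a nightly batch line using those switches might look like this (every name, path, and size here is a placeholder; -F 2 skips a header row, so drop it if your file has none):

```shell
rem Character mode, comma-delimited, with error and output logs, up to
rem 1000 tolerated errors, 10000-row batches, and the header row skipped.
bcp ReportDb.dbo.TempTextData in D:\exports\combined.csv -c -t, -r"\n" -S ServerA -U importuser -P xxxxx -e D:\logs\combined.err -o D:\logs\combined.out -m 1000 -b 10000 -F 2
```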