Bulk Import issue

I am trying to automate an import of a CSV file into a table. My problem is that the CSV occasionally wraps a value in double quotes (speech marks) in a couple of columns when the cell contains a comma, and this in turn gives me an incorrect import.

Example of Inventory.csv

Computer Name,Manufacturer,Model,Service Tag,IP Address,Last Logon User,Warranty Expiry Date,Scan Status,OU Name
DEVICE1,"VMware, Inc.",VMware Virtual Platform,VMware-00 00 00 00 00,--,,,Success,VDi Critical Masters
DEVICE2,Dell Inc.,PowerEdge R510,45TERTT,,admin,"Sun, Feb 22 2015, 12:00:00",Success,File Services
DEVICE3,Dell Inc.,PowerEdge R510,7HGDGSD,,admin,"Sun, Feb 22 2015, 12:00:00",Success,File Services
DEVICE4,Dell Inc.,PowerEdge R310,TYUHHFG,,Admin,"Sun, Feb 22 2015, 12:00:00",Success,File Services
DEVICE5,Dell Inc.,PowerEdge R510,VBNCGHJ,,admin,"Sun, Feb 22 2015, 12:00:00",Success,File Services
DEVICE6,Dell Inc.,PowerEdge R210,JGDNFGN,,admin,"Sun, Feb 22 2015, 12:00:00",Success,File Services

T-SQL Script
USE [DatabaseName]

DELETE FROM [tbl_Inventory]

BULK INSERT [tbl_Inventory]
FROM 'D:\SQL\Collections\Inventory.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')


--Check the content of the table.
SELECT COUNT([Computer Name]) AS Entries
FROM [tbl_Inventory]

Any ideas?

Instead of comma-delimited, can you use tab-delimited data saved as .txt? Do you have any control over that?

No, I can save as xlsx

Where are you getting this data, and how are you saving it as a CSV file? In that process you want to ensure that the fields are quoted appropriately, and in your bulk insert, make sure you tell the system how to process the file:

[ [ , ] FORMAT = 'CSV' ]
[ [ , ] FIELDQUOTE = 'quote_characters']
[ [ , ] FIELDTERMINATOR = 'field_terminator' ]
[ [ , ] ROWTERMINATOR = 'row_terminator' ]

The issue you are probably having is that you have not told the system that the double quote is the field-quote character.
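On a version that supports these options, the load might look something like this (table name and path taken from the script above; the terminator values and header row are assumptions about the file):

```sql
BULK INSERT [tbl_Inventory]
FROM 'D:\SQL\Collections\Inventory.csv'
WITH (
    FORMAT = 'CSV',          -- parse the file as quoted CSV
    FIELDQUOTE = '"',        -- treat the double quote as the field quote
    FIRSTROW = 2,            -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```

With FORMAT = 'CSV' the field quote already defaults to the double quote, so FIELDQUOTE is shown here only to make the behaviour explicit.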

The csv file is a report generated by an application.

I've just tried this, but the options

FIELDQUOTE = 'quote_characters',

aren't working; I'm getting 'Incorrect syntax near FORMAT' and 'Incorrect syntax near FIELDQUOTE'.

What version of SQL Server are you using? The FORMAT and FIELDQUOTE options are available only in SQL Server 2017 (version 14.x) or later. If you are using an older version of SQL Server, BULK INSERT cannot handle embedded commas in quoted fields by itself.

With older versions, an alternative might be to use SSIS (or the Import/Export Wizard) and specify the field quote character there. I don't recall the exact version numbers, but in some older versions of SSIS that did not work very well.
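Since the data can also be saved as xlsx, another possible workaround on older versions is to read the workbook directly with OPENROWSET and the Microsoft ACE OLE DB provider. This is only a sketch under several assumptions: the ACE provider is installed on the SQL Server machine, 'Ad Hoc Distributed Queries' is enabled, and the worksheet name and file path are guesses:

```sql
-- Assumptions: ACE OLE DB provider installed on the server,
-- 'Ad Hoc Distributed Queries' enabled, worksheet named Sheet1,
-- header row present (HDR=YES), and this file path.
INSERT INTO [tbl_Inventory]
SELECT *
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0 Xml;HDR=YES;Database=D:\SQL\Collections\Inventory.xlsx',
    'SELECT * FROM [Sheet1$]'
);
```

Because Excel stores the cell values directly, the embedded commas never go through CSV quoting at all with this approach.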

If at all possible, have the source file generated using a different delimiter. That would make life much simpler. In my world, I ask vendors to use pipe as the delimiter when sending "csv" files.

Which is why I had asked how the file was being generated in the first place. If the file is in Excel, you can save it as a CSV file, but before saving, change your system to use | (pipe) as the list separator under Additional settings in the Region and Language applet in Control Panel.

If the file is coming from a custom application, and that application can be modified to use a different delimiter, that would be the better option.
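If the file can be produced pipe-delimited, embedded commas no longer need any quoting and the load works on any version of SQL Server. A sketch, assuming the export is named Inventory.txt:

```sql
BULK INSERT [tbl_Inventory]
FROM 'D:\SQL\Collections\Inventory.txt'  -- pipe-delimited export (assumed name)
WITH (
    FIRSTROW = 2,            -- skip the header row
    FIELDTERMINATOR = '|',   -- pipe instead of comma, so embedded commas are harmless
    ROWTERMINATOR = '\n'
);
```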

The xlsx extension suggests that the files, if not created in Excel, can at least be read by Excel. If that is the case, you may want to consider using VBA to replace the offending characters. I found a small script online that did this quickly and efficiently for some 90k csv files. I am not sure whether it is appropriate to post VBA scripts here. If it is, I will post it so that others may find it should they encounter a similar difficulty. If not, you could either search for the script online or message me and I will send it to you that way.