SQLTeam.com | Weblogs | Forums

Bulk insert ignore duplicates


The following is from my stored proc. I use BULK INSERT to copy data from a .txt file into my table. The table's primary key is tracking_number.
I need to ignore duplicates and move on to the next record. Is it possible to do this with BULK INSERT? Right now I am getting the following error:

Violation of PRIMARY KEY constraint 'PK_daily_shipments'. Cannot insert duplicate key in object 'dbo.daily_shipments'.

BULK INSERT [dbo].[daily_shipments] FROM 'D:\OUT\out.txt'


One way is to import into a staging table, then copy to the actual table while accounting for duplicates.
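A sketch of that approach, assuming a staging table shaped like dbo.daily_shipments (the column list here is a placeholder for whatever your real table has):

```sql
-- Assumed staging table mirroring dbo.daily_shipments; no PK so dups load cleanly
CREATE TABLE dbo.daily_shipments_stage (
    tracking_number varchar(50) NOT NULL
    -- ... remaining columns from daily_shipments ...
);

-- Load the file into the staging table first
BULK INSERT dbo.daily_shipments_stage
FROM 'D:\OUT\out.txt';

-- Copy only rows whose key is not already in the target table
INSERT INTO dbo.daily_shipments (tracking_number /*, other columns */)
SELECT s.tracking_number /*, other columns */
FROM dbo.daily_shipments_stage AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.daily_shipments AS d
    WHERE d.tracking_number = s.tracking_number
);
```

If the file itself can contain the same tracking_number twice, you also need to dedupe within the staging table (e.g. with ROW_NUMBER() OVER (PARTITION BY tracking_number ...)) before or during the final INSERT.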


Another option is to set IGNORE_DUP_KEY on the unique index. See here for details: https://msdn.microsoft.com/en-us/library/ms187019.aspx
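That would look roughly like this, reusing the constraint name from the error message (check the real key definition before dropping anything):

```sql
-- Rebuild the primary key with IGNORE_DUP_KEY on; duplicate inserts are
-- then discarded with a warning instead of failing the whole BULK INSERT.
ALTER TABLE dbo.daily_shipments
    DROP CONSTRAINT PK_daily_shipments;

ALTER TABLE dbo.daily_shipments
    ADD CONSTRAINT PK_daily_shipments
    PRIMARY KEY (tracking_number)
    WITH (IGNORE_DUP_KEY = ON);
```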

However, I don't like that option. In most cases, if there is an attempt to insert a dup, you do want the operation to fail and fail spectacularly. So I would recommend @djj55's suggestion.


I prefer djj55's method because it uses a staging table for validation. I never import directly from a file into a final table that already contains data. The staging table gives me the ability to mark rows that are bad or fail validation, so I can either repair them or send them back to the originator with feedback on what they're doing wrong.

There is, however, another option not yet mentioned. You could allow effectively unlimited errors and set up an error capture file. It works for both BULK INSERT and BCP. I still prefer djj55's method, though: bring it all in, find out what's wrong, fix it or provide feedback.
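For BULK INSERT that looks something like this (file paths are placeholders):

```sql
-- Tolerate bad rows and capture them in an error file instead of
-- aborting the load.
BULK INSERT dbo.daily_shipments
FROM 'D:\OUT\out.txt'
WITH (
    MAXERRORS = 2147483647,              -- effectively "unlimited"
    ERRORFILE = 'D:\OUT\out_errors.txt'  -- rejected rows land here
);
```

One caveat to verify against the documentation for your version: MAXERRORS and ERRORFILE are aimed at data conversion/format errors, and constraint violations such as duplicate keys may not be routed to the error file the same way.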