Hi,
I've been looking around the net for some concrete ways to accomplish the following.
Set up a bulk insert to load data from one table into an identical table on a different server that is used for reporting.
I obviously want to use fast load or bulk insert for this and commit a set number of rows per batch, ideally with the commit size equal to the batch size.
At this time I'm only concerned with restarting the insert from where the last batch failed and rolled back, not with handling error rows, as the task should only fail due to timeouts etc.
For example, if I run batches of 1 million rows against a table that has 10 million rows and the task fails at row 5,500,000, I'd want to be able to restart the package from the 5 million mark, since 5 million rows have already been committed.
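For reference, this is the kind of restartable pattern I have in mind, written in plain T-SQL rather than SSIS. It's only a minimal sketch: it assumes the table has a monotonically increasing key column (which I'm calling Id here) and a linked server named ReportServer pointing at the reporting box. All of the object names are just placeholders.

-- Minimal sketch of a restartable batched copy, keyed on a sequential ID.
-- Assumptions (all names are placeholders): dbo.SourceTable has a
-- monotonically increasing key column Id, and ReportServer is a linked
-- server pointing at the reporting database ReportDb.

DECLARE @BatchSize INT = 1000000;
DECLARE @LastId BIGINT;

-- Restart point: the highest key already committed on the reporting side.
-- On a clean first run this is 0; after a failure it picks up where the
-- last successful batch left off.
SELECT @LastId = ISNULL(MAX(Id), 0)
FROM ReportServer.ReportDb.dbo.TargetTable;

WHILE 1 = 1
BEGIN
    -- Each INSERT runs in its own implicit transaction, so every batch
    -- commits independently; a failure only rolls back the current batch.
    INSERT INTO ReportServer.ReportDb.dbo.TargetTable (Id, Col1, Col2)
    SELECT TOP (@BatchSize) Id, Col1, Col2
    FROM dbo.SourceTable
    WHERE Id > @LastId
    ORDER BY Id;

    IF @@ROWCOUNT = 0 BREAK;  -- source exhausted, we're done

    -- Advance the watermark to the last committed key.
    SELECT @LastId = MAX(Id)
    FROM ReportServer.ReportDb.dbo.TargetTable;
END;

In SSIS terms, I'm guessing the same watermark would just seed the source query, with the OLE DB Destination's fast load "Maximum insert commit size" set to match the batch size.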
Hope that makes sense. Does anyone have a robust solution for this?
Any help is appreciated.
Thanks