Transfer a table from one instance/server to another instance/server

Good afternoon, I am requesting some help and guidance in achieving the following task. I am fairly new to SQL, and a lot of what I have learned thus far has been with the help of this site, Google, YouTube, and of course the school of trial and error. So for that I want to say a big thank you to all who have answered my posts.

So I have a local instance of SQL Server on a PC and a remote instance on a server, and I am looking for the best approach to transfer a complete table from the local instance to the remote instance.

There have been some ideas tossed around, such as having a view on the remote instance that connects back to the local instance, then using a stored procedure to call that view and save the results in a table. But the file name changes on the local instance, so this method might not be the best.

I am certain others out there have probably done similar things, so if you could lend a helping hand and guide me in the best direction, I would appreciate it.

And just so you know, we are using SQL Server 2008.

Filenames should not matter to the SQL code as they are not relevant, but perhaps you meant Database Name?

In principle if the table already exists on both the Local and the Remote you can do:

INSERT INTO MyRemoteServer.MyRemoteDatabase.dbo.MyRemoteTableName
SELECT *
FROM dbo.MyLocalTableName
-- or, if the source table is in a different local database:
-- FROM MyLocalDatabase.dbo.MyLocalTableName

If you need to "empty" the target table first then

TRUNCATE TABLE MyRemoteDatabase.dbo.MyRemoteTableName

Note that TRUNCATE TABLE does not accept a linked-server 4-part name, so it has to be executed on the remote server itself - for example via EXEC ('TRUNCATE TABLE MyRemoteDatabase.dbo.MyRemoteTableName') AT MyRemoteServer.

(You also cannot TRUNCATE a table that is referenced by Foreign Keys, so in that case you would have to use DELETE instead - but beware that DELETE will LOG all the deletions, so on a big table it will use a lot of Log File space.)

If your table has an IDENTITY column then you need to SET IDENTITY_INSERT ON for the (remote) table before the operation, and SET it OFF afterwards - and with IDENTITY_INSERT ON the INSERT must specify an explicit column list.
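As a sketch (table and column names are placeholders; note that SET IDENTITY_INSERT may not be accepted for a remote 4-part name, in which case the whole batch needs to run on the remote server, e.g. via EXEC ... AT):

```sql
-- With IDENTITY_INSERT ON, the INSERT must list its columns explicitly.
SET IDENTITY_INSERT dbo.MyRemoteTableName ON;

INSERT INTO dbo.MyRemoteTableName (MyID, MyColumn1, MyColumn2)
SELECT MyID, MyColumn1, MyColumn2
FROM dbo.MyLocalTableName;

SET IDENTITY_INSERT dbo.MyRemoteTableName OFF;
```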

In order to be able to use "MyRemoteServer" that would first need to be set up as a LINKED SERVER.
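Setting one up looks roughly like this (the server name is a placeholder, and the security mapping very much depends on your environment - check with your DBA):

```sql
-- Register the remote SQL Server instance as a linked server
EXEC sp_addlinkedserver
    @server = N'MyRemoteServer',
    @srvproduct = N'SQL Server';

-- Map logins; @useself = 'True' connects using the caller's own credentials
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'MyRemoteServer',
    @useself = N'True';
```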

The 4-part-naming i.e. MyRemoteServer.MyRemoteDatabase.dbo.MyRemoteTableName is fine for one-off operations, but if you are going to write operational code, e.g. Stored Procedures and Views, that references the remote server (or even uses any 3-part naming to reference a table in a different database [on the Local server]) then I strongly recommend that you always use SYNONYMS.

So ... in the Local database you create a SYNONYM called "dbo.MyRemoteTableName" which physically refers to MyRemoteServer.MyRemoteDatabase.dbo.MyRemoteTableName
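For example:

```sql
CREATE SYNONYM dbo.MyRemoteTableName
    FOR MyRemoteServer.MyRemoteDatabase.dbo.MyRemoteTableName;
```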

Then (executing in your local database) your code becomes

INSERT INTO dbo.MyRemoteTableName
SELECT *
FROM dbo.MyLocalTableName

and in the event that anything about the Remote Server/Database/Table name changes you can just DROP and reCREATE the Synonyms pointing to the new, correct, location.
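For example (the new server name here is just a placeholder for wherever the table has moved to):

```sql
DROP SYNONYM dbo.MyRemoteTableName;

CREATE SYNONYM dbo.MyRemoteTableName
    FOR MyNewRemoteServer.MyRemoteDatabase.dbo.MyRemoteTableName;
```

None of the code that references dbo.MyRemoteTableName needs to change.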

All this said, copying data to a remote server can be slow (or tie up the network) if the table is big. So you could consider another approach which is to export the table(s) to a file(s), transfer the file (ZIP it up / whatever) and then import the files(s) at the far end. There are various ways to do this, but we use the external program BCP.
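A rough example of the BCP approach (server, database, and file names are all placeholders; -n exports in native format, -T uses a trusted connection):

```
rem Export from the local instance, then import at the far end
bcp MyLocalDatabase.dbo.MyLocalTableName out C:\Transfer\MyTable.bcp -n -S MyLocalPC\MyInstance -T
bcp MyRemoteDatabase.dbo.MyRemoteTableName in C:\Transfer\MyTable.bcp -n -S MyRemoteServer -T
```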

Benefits of this also include being able to control the batch size of the task - using the plain SQL statement INSERT INTO will do ALL rows in a single transaction, and that may be A Big Processing Job for the receiving server :frowning: so using a Bulk Import (and setting a sensible Batch Size) helps with that. You can also pre-sort the data, when exporting into the transfer file, into Clustered Index Order, AND provide a HINT to the Bulk Import that the file is pre-sorted, so SQL will import the data more efficiently.
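Sketched with BCP (names, batch size, and paths are placeholders; the clustered key column is whatever your target table is clustered on):

```
rem Export pre-sorted into Clustered Index order using queryout
bcp "SELECT * FROM MyLocalDatabase.dbo.MyLocalTableName ORDER BY MyClusteredKeyColumn" queryout C:\Transfer\MyTable.bcp -n -S MyLocalPC\MyInstance -T

rem Import in batches of 10,000 rows, with a hint that the file is pre-sorted
bcp MyRemoteDatabase.dbo.MyRemoteTableName in C:\Transfer\MyTable.bcp -n -b 10000 -h "ORDER(MyClusteredKeyColumn ASC)" -S MyRemoteServer -T
```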

Another performance suggestion, for import of larger tables, is to DROP all non-clustered indexes BEFORE the Import, and recreate them AFTERWARDS.
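That is, something along these lines for each non-clustered index on the target table (index and column names here are placeholders):

```sql
DROP INDEX IX_MyRemoteTableName_MyColumn ON dbo.MyRemoteTableName;

-- ... bulk import here ...

CREATE NONCLUSTERED INDEX IX_MyRemoteTableName_MyColumn
    ON dbo.MyRemoteTableName (MyColumn);
```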

You should also consider how you handle any referential integrity - if you transfer multiple tables which have FOREIGN KEY relationships. Ideally you would import PARENT Table first then CHILD Table, but sometimes relationships are circular - so sometimes it is easier to drop the Foreign keys before and recreate them after.
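Dropping and recreating a Foreign Key looks like this (constraint, table, and column names are placeholders; WITH CHECK makes SQL validate the existing rows when the constraint is re-added):

```sql
ALTER TABLE dbo.MyChildTable DROP CONSTRAINT FK_MyChildTable_MyParentTable;

-- ... import PARENT then CHILD here ...

ALTER TABLE dbo.MyChildTable WITH CHECK
    ADD CONSTRAINT FK_MyChildTable_MyParentTable
    FOREIGN KEY (MyParentID) REFERENCES dbo.MyParentTable (MyParentID);
```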

Also watch out if you have TRIGGERS on the remote tables - you might want to DISABLE TRIGGERS before the import, and ENABLE afterwards.
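For example (table name is a placeholder):

```sql
DISABLE TRIGGER ALL ON dbo.MyRemoteTableName;

-- ... import here ...

ENABLE TRIGGER ALL ON dbo.MyRemoteTableName;
```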

Beware of Log File space usage during long operations. If the target database is in FULL Recovery Model then I recommend that you set up a Log Backup to run EVERY MINUTE during the import. (How much this helps depends a bit on whether the import is a single transaction and a single table - multiple transactions (batches) and multiple tables will back up more often, in smaller pieces.)
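Each run of the Log Backup would be something like this (database name and backup path are placeholders; in practice you would schedule it, e.g. as a SQL Agent job, for the duration of the import):

```sql
BACKUP LOG MyRemoteDatabase
    TO DISK = N'D:\Backups\MyRemoteDatabase_Log.trn';
```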

Hopefully way more things there than you ACTUALLY need to worry about! Either way, it should help you (as one of my clients often says ...) "cover your Front, Back and BOTH Sides" :slight_smile:

I agree with @Kristen's solution, and you need to consider several performance factors while moving data from one machine to another, especially when you transfer multiple tables which have FOREIGN KEY relationships.
You could also try a third-party utility such as SysTools SQL Server Database Migrator, which can help you move data from one instance to another.