How to synchronize a large database automatically?

Hi all,

I have more than 2,000 tables in a database and I would like to synchronize it with another database every hour. What is the best solution for synchronizing the databases?

Thank You

Log shipping.


Hi Ahmed,

Thank you. I have researched log shipping, and it looks very useful. But I have one question: will this method perform well, given that my database contains more than 2,000 tables? I would like to synchronize it with another database every hour.

Log shipping is pretty lightweight, so the overhead (or performance impact) is minimal. However, you need to consider that you could lose up to one hour of data if your server fails (if you are using it as a DR solution).
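If it helps to see what the mechanism is actually doing, here is a minimal sketch of one log-shipping cycle done by hand; the database name and file paths are just placeholders, and in practice the log-shipping wizard sets up SQL Agent jobs that do this for you:

```sql
-- On the primary: back up the transaction log to a share the secondary can reach
BACKUP LOG MyBigDb
    TO DISK = N'\\backupshare\logship\MyBigDb_20240101_1000.trn';

-- (The copy job then moves the .trn file to the secondary server.)

-- On the secondary: apply the log, leaving the database ready for the next one
RESTORE LOG MyBigDb
    FROM DISK = N'\\backupshare\logship\MyBigDb_20240101_1000.trn'
    WITH NORECOVERY;
```

The cost of each cycle depends on how much log has been generated, not on how many tables you have, which is why 2,000 tables by itself isn't a problem.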

hope this helps

Reducing the sync interval to 15 minutes will let you recover to a more recent point in time if a disaster occurs, and the amount of transactions that has to be applied on your secondary server in each cycle will also be smaller.

Hi dennisc,

Thank you.

Hi ahmed,

Thank you

Hi all,

Can we log ship a database within a single server instance? I understand that this scenario won't help with disaster recovery.

You can do it, but the secondary database has to have a different name.
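As a rough sketch of how the secondary copy is seeded on the same instance (the database name, logical file names and paths below are assumptions), the initial restore has to rename the database and move its files so they don't collide with the primary's:

```sql
-- Restore the primary's full backup as a second, differently named database
RESTORE DATABASE MyBigDb_Secondary
    FROM DISK = N'D:\Backups\MyBigDb_full.bak'
    WITH MOVE N'MyBigDb'     TO N'D:\Data\MyBigDb_Secondary.mdf',
         MOVE N'MyBigDb_log' TO N'D:\Data\MyBigDb_Secondary_log.ldf',
         NORECOVERY;   -- left restoring so log backups can keep being applied
```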

Thank You

How much data changes in that time? (A lot ... or Not much?)

Hi Kris,
Not much... but the number of tables is 2,000.

Log shipping sounds good then. Watch out if you do a huge import/update, and probably index rebuilds etc. too. You always have the option to break log shipping, take a fresh full backup, and restart log shipping from that point (if that will use less bandwidth than shipping the logs themselves would). Dunno if that might be an issue for you?
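If you do go down the re-seed route, the sequence is roughly this (names and paths are placeholders):

```sql
-- On the primary: take a fresh full backup after the big import / index rebuild
BACKUP DATABASE MyBigDb
    TO DISK = N'\\backupshare\MyBigDb_reseed.bak';

-- On the secondary: restore it over the old copy, left in NORECOVERY so that
-- log shipping can resume from this new base
RESTORE DATABASE MyBigDb
    FROM DISK = N'\\backupshare\MyBigDb_reseed.bak'
    WITH NORECOVERY, REPLACE;
```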

We've rolled our own log shipping using RoboCopy, which allows bandwidth throttling; that lets the log shipping "fall behind" when the logs are huge and stops the network being swamped (during office hours only).
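For what it's worth, the copy step is just a single command along these lines; the share paths and the throttle value are only illustrative:

```
robocopy \\PrimaryServer\LogBackups \\SecondaryServer\LogBackups *.trn /Z /IPG:750 /R:5 /W:30
```

/IPG adds a delay between packets (the throttling), /Z makes the copy restartable, and /R and /W control retries if a file is locked.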

But if you are all on one machine that's not going to be a problem for you, and given that you will be making log backups anyway, disk space shouldn't be any different either.

P.S. With log shipping you won't need to restrict yourself to once an hour unless you want to (users learn that the data changes "on the hour" and is stable in between, if that suits them). You could have the log backups run every 10 minutes, or even every minute if you like. The total disk space used in a day will be about the same (give or take some overhead on each log backup); you will just have a lot more individual files.
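For example, a job step that runs every 10 minutes only has to do something like this (the file-naming scheme is just one made up for illustration):

```sql
-- Build a timestamped file name so each run produces its own small .trn file
DECLARE @stamp nvarchar(20) =
      CONVERT(nvarchar(8), GETDATE(), 112)                        -- yyyymmdd
    + N'_'
    + REPLACE(CONVERT(nvarchar(8), GETDATE(), 108), N':', N'');   -- hhmmss

DECLARE @file nvarchar(260) = N'\\backupshare\logship\MyBigDb_' + @stamp + N'.trn';

BACKUP LOG MyBigDb TO DISK = @file;
```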

Thank you Kristen... but I have one question. The secondary database is not operational; it can stay in either read-only mode or no-recovery mode. What does that mean? Can't I use that secondary database for anything else?

In read-only (standby) mode, users can query the database, but when the restore job starts the users will be disconnected.
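On the restore side the two modes look like this (a sketch with placeholder paths). STANDBY is the read-only option; the undo file is where uncommitted transactions are parked so the database can be read between restores:

```sql
-- Standby (read-only) mode: users can query between restores
RESTORE LOG MyBigDb
    FROM DISK = N'\\backupshare\logship\MyBigDb_20240101_1000.trn'
    WITH STANDBY = N'D:\Data\MyBigDb_undo.dat';

-- No-recovery mode: the database shows as "Restoring..." and cannot be queried at all
RESTORE LOG MyBigDb
    FROM DISK = N'\\backupshare\logship\MyBigDb_20240101_1000.trn'
    WITH NORECOVERY;
```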


Is that "solvable" by having another database, to which the users connect, which just has views/synonyms pointing at the read-only log-shipped database?
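Something like this, as a sketch; the database and table names are hypothetical, and with 2,000 tables you would generate the synonyms from a query against the secondary's catalog rather than writing them by hand:

```sql
-- A thin database that users connect to instead of the log-shipped copy
CREATE DATABASE ReportingDb;
GO
USE ReportingDb;
GO
-- One synonym (or view) per table, pointing at the read-only secondary
CREATE SYNONYM dbo.Orders    FOR MyBigDb_Secondary.dbo.Orders;
CREATE SYNONYM dbo.Customers FOR MyBigDb_Secondary.dbo.Customers;
```

Queries in flight against the underlying secondary would still be interrupted when the restore job needs exclusive access, but the database the users connect to never has to change.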