I want to take a database backup automatically and zip the backup automatically

How can I take a database backup automatically and zip the backup automatically in SQL Server 2008 R2?

Hi nagaraju,
I would suggest turning on backup compression rather than using zip, as backup compression can be used in SQL 2008 R2 Standard Edition as well, and it already makes the backup file pretty small.

I just tested one of my backup files: for a 20 GB database (11.5 GB used), the compressed backup file is only 1.36 GB. I tried to zip it after that, and it only got down to 1.26 GB. I don't think it is worth doing, given that you need to unzip it if you ever need to restore, plus all the overhead of zipping, when it only gives you less than 10% extra space saving.
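
If it helps, a minimal sketch of turning compression on as the server-wide default, so that every BACKUP is compressed unless it explicitly says otherwise (this is the standard instance-level option; run it once):

-- Make backup compression the default for this instance
EXEC sp_configure 'backup compression default', 1
RECONFIGURE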

If you are really looking to compress your backups further, try some 3rd-party software. I have tried some of them and they can give you a little space back, but in return you get higher CPU usage and maybe a slightly longer overall duration. However, they do integrate with SQL and can perform backup and restore directly (without unzipping first).

Hope this helps.

4 Likes

Is there any SQL code to zip the database backup automatically?

I mean, is there any script to back up the database and zip it automatically?

Is there anything in SSIS to take a database backup and zip it automatically?

Thanks & regards
Nagaraju

Why do you want to use ZIP rather than SQL's inbuilt backup compression?

ZIP requires multiple steps to achieve compression; in SQL you just add the COMPRESSION argument to the BACKUP command.
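
For example, a minimal sketch (database name and path are made up):

BACKUP DATABASE MyDatabase
TO DISK = N'X:\Backups\MyDatabase.bak'
WITH COMPRESSION, INIT, STATS = 10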

I think the test needs to use ZIP on an UNcompressed backup. Any file that is already compressed will not compress well a second time, and SQL's compression is "moderate" because it is done during streaming, rather than being able to review the whole file as a post-process.

I raised the question about compressed backups in this thread:
http://forums.sqlteam.com/t/pros-cons-of-database-backup-compression/2263

SQL's inbuilt compression has significant gains: because the output file is smaller, it takes less I/O to write. So although it uses more CPU than a normal backup, it uses less I/O, which is more likely to be critical on a server.

My tests gave 80-86% filesize saving and reduced backup time by ~40%

If you use ZIP then you have the original backup i.e. 100% filesize and backup time. THEN you have to ZIP the file, so the file has to be re-read, compressed, and re-written. Using normal ZIP compression I got about 2% more saving, which is negligible compared to SQL's compression. Using the most aggressive compression I was able to get 91% compression, but it took 13x as long, so unless filesize is absolutely critical I just can't see the point.

At the other end you have the Restore. A backup file compressed by SQL will restore more quickly (than an uncompressed one) because the I/O is less. For ZIP you have to read, uncompress, write and THEN restore, and the bigger, uncompressed, file will restore more slowly too. Thus your disaster recovery time is greatly increased.

2 Likes

We have 100 databases on a server, and I have to take all the databases into a single backup file at the same time, and it should be zipped (to decrease the file size) automatically. It will be scheduled to run once a week. Another point is that the zip file must be transferred to another server; this also must be done automatically.

Thanks & Regards
Nagaraju

Why all-in-one file? If anything goes wrong - such as disk full / network error - then you have nothing.

With individual files, you still have whichever files are not damaged / missing.

As I said, ZIP takes a LOT more CPU and time to create, and to restore, and is at best 10% smaller than using SQL compression. To get that 10% you will be spending hours of CPU time on large files; for "reasonably fast" ZIPping you will only get 2% saving compared to SQL compression.

My figures, in the thread I linked to, for a 5GB database were:
23 seconds Backup using SQL Compression
38 seconds Normal SQL backup
80 seconds Normal SQL backup + ZIP (fastest ZIP, least compression)
623 seconds Normal SQL backup + Best ZIP (most compression)

I would do this:

  1. Set up a backup task, using compression, that outputs to x:\path\MyDatabaseName_yymmdd_hhmm.BAK

  2. Use a RoboCopy task with "Mirror" parameter which copies the files to \\RemoteServer\Path

  3. Set up a task on Remote Server which detects a new file and restores it (assuming you need that option, maybe you just want to have the latest files on that server and not actually restore them?)

  4. Some sort of "purge" which deletes stale files on Source after a reasonable retention time. Using RoboCopy Mirror will cause the files to also be deleted from the Remote Server.

Notes:

At (1) I would have a Database Exclusion List so that you can prevent a database being backed up / copied - i.e. any database that you do NOT want to be transferred to remote server. We do this for any database with the name "RESTORE_*" so that we can make a temporary restore without it becoming part of our automatic backup process.
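
As a rough sketch of what that could look like (the path, the RESTORE_ naming pattern and the backup options are only examples, not our production script):

-- Back up every ONLINE user database except those matching the exclusion pattern
DECLARE @name sysname, @sql nvarchar(max)

DECLARE curDB CURSOR LOCAL FAST_FORWARD FOR
	SELECT name
	FROM sys.databases
	WHERE database_id > 4			-- skip system databases
		AND state_desc = 'ONLINE'
		AND name NOT LIKE 'RESTORE[_]%'	-- exclusion list

OPEN curDB
FETCH NEXT FROM curDB INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
	SET @sql = N'BACKUP DATABASE ' + QUOTENAME(@name)
		+ N' TO DISK = N''X:\Backups\' + @name + N'_'
		+ CONVERT(nvarchar(8), GetDate(), 112) + N'.BAK'''
		+ N' WITH COMPRESSION, INIT, CHECKSUM'
	EXEC (@sql)
	FETCH NEXT FROM curDB INTO @name
END
CLOSE curDB
DEALLOCATE curDB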

You might also want to use the COPY_ONLY option for a FULL BACKUP - otherwise you run the risk that the backup becomes the base for any subsequent DIFF backups. I guess that if you do not use DIFF backups then a regular Full Backup will do.

Personally, as size is important to you, I would take a TLog backup (if the database is not using the Simple Recovery Model) immediately before the Full Backup, as this will help make the full backup smaller.
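
A sketch of that step, with made-up name and path (only applicable if the database is in FULL or BULK_LOGGED Recovery Model):

-- Log backup taken immediately before the weekly Full Backup
BACKUP LOG MyDatabaseName
TO DISK = N'x:\Path\MyDatabaseName_PreFull.TRN'
WITH COMPRESSION, INIT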

At (3) you need to be careful not to start restoring before the file has finished copying :frowning:

1 Like

Is there any script to compress the backups automatically?
When the database backup has completed successfully, the backup file has to be compressed automatically, because I will schedule the job at 12:00 AM at night in SQL Server Agent.

We use zip to add encryption, then move the file to offsite storage.

This is done by a Windows task that is scheduled to run after the maximum time it takes to back up the database. Not the best way, but it works for us.

1 Like

Is there any script for that, so it can be scheduled as a job?

We just schedule the SQL job with a step like:

BACKUP DATABASE MyDatabase TO DISK = N'G:\SQLBackup\MyDatabase_Backup.bak' WITH INIT, NAME = N'MyDatabase Backup', SKIP, NOFORMAT, COMPRESSION

Then, once we know how long the backup will take, we create a Windows task to execute a batch file like this (using 7-Zip):

"\\OurServer\C$\Program Files (x86)\7-Zip\7z.exe" a -t7z "G:\SQLBackup\MyDatabase" "G:\SQLBackup\MyDatabase_Backup.bak" -m0=lzma2 -mx3 -mmt=on -mhe=on -p"OurPassword" -ssw -y

Hope this helps

1 Like

Hi,
As stated by Kristen and myself, you would be better off using the native backup. It might not be directly related, but I did a test comparing SQL native backup with compression to a 3rd-party tool (Redgate); you can see the result here.

I strongly suggest you review your backup strategy and ensure that you can restore your database in a disaster.

Hope this helps

1 Like

Good morning,
Is there any possibility of using SSIS to take a database backup automatically and compress the backup?
If it is possible, I need the link.

Thanks & Regards
Nagaraju

You haven't answered any of the points raised here; you just keep asking for the same thing.

You started out by telling us what your solution was without adequately explaining what problem you were trying to solve. You have now explained that you want to back up a large number of databases, once a week, and transfer them to a remote server; I assume from what you have said that transmission bandwidth is an issue, and thus you need compression.

Whilst it can be done the way you say - take a backup, zip, transfer, unzip - folk here have questioned whether that is a good idea and suggested that you would be much better off just using SQL's inbuilt compression. Yet you keep ignoring that advice and repeating your request for an SSIS script that will then ZIP the files. Why?

Are you trying to export just PART of the database perhaps?

1 Like

Actually I need the answer, but you are giving me questions; how can I give the solution to you?
If you give me a solution, I will try it on my server.

Actually I know there are some possibilities, but I don't know how to write a script for this. I am asking for the script.

If scripts existed for SSIS I expect someone would have pointed you in the right direction (or Google would find one for you), so I doubt that there is a solution via that route.

I am less sure about a script for precisely what you want. I think most people who want to "copy" a database to a remote server want to do it more often than once a week, so it is more usual to use Log Shipping or Replication [which copy the database continuously, in real-time or near-real-time].

For "less frequent" copying I think the most likely equivalent [to find a script for] would be Developers copying a Production Database to a Development server.

My way, copied from above, would be:

Set up a backup task, using compression, that outputs to x:\path\MyDatabaseName_yymmdd_hhmm.BAK

Scripts for Backup are readily available; however, it is not entirely straightforward. Will you use the same filename each time? (If so, how will you know whether the file is new, or just the old one from last week?) If you include the Date/Time in the filename then you have to determine a means of copying the newly named file (RoboCopy "Mirror" mode will do this automatically), but you also have to figure out how to delete the file (after a week or two, or after the Restore has been completed on the Remote - in which case the Remote has to communicate back "Restore completed successfully").

You don't have to have this level of sophistication ... but ... if you don't a) how will you know that it has worked? and b) how will you get an alert if & when it fails for some reason? maybe it doesn't matter if it fails?

A suitable backup command would be:

-- If database in FULL Recovery Model ideally make a LOG backup first
DECLARE	@strOutputFilename	varchar(256)
SELECT	@strOutputFilename
		= 'x:\Path\'
		+ 'MyDatabaseName'
		+ '_'
		+ CONVERT(varchar(8), GetDate(), 112)	-- yyyymmdd
		+ '_'
		+ REPLACE(CONVERT(varchar(8), GetDate(), 108), ':', '')	-- hhmmss
		+ '.BAK'
BACKUP DATABASE MyDatabaseName
TO DISK = @strOutputFilename	-- e.g. "x:\Path\MyDatabaseName_yyyymmdd_hhmmss.BAK"
WITH
	COPY_ONLY
	, COMPRESSION
--	, DESCRIPTION = 'text'
	, INIT	-- This will overwrite any existing file, rather than append
	, CHECKSUM
	, CONTINUE_AFTER_ERROR	-- But consider "STOP_ON_ERROR" instead
	, STATS = 10
--	, ENCRYPTION ... if you need that

Having said that, presumably you already have a backup of your databases? If not, you need to!! In which case, why don't you just copy the most recent full backup file, once a week, instead of making another one?

Use a RoboCopy task with "Mirror" parameter which copies the files to \\RemoteServer\Path

You would need to consider what options you need in RoboCopy - personally I would want logging to help diagnose any problems, and you might want specific file types only, all sub folders, and so on.

The basic command is:

robocopy x:\SourcePath \\RemoteServer\Share\DestinationPath /mir 

What about sending you an alert (e.g. by Email) if the copy fails? e.g. disk-full on Remote Server, or network down, or DNS cannot resolve the RemoteServer name? (RoboCopy will re-try the copy so intermittent network problems should be fine)
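
For what it is worth, a sketch of running the copy from a SQL job step via xp_cmdshell (which must be enabled), with a .BAK filter, limited retries and a log file - all paths and values are placeholders:

EXEC master..xp_cmdshell
	'robocopy "x:\SQLBackup" "\\RemoteServer\Share\SQLBackup" *.BAK /MIR /R:5 /W:30 /LOG+:"x:\SQLBackup\RoboCopy.log"'

RoboCopy exit codes of 8 or above indicate a failure, which is what you would check if you want the job step to fail and alert you.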

Set up a task on Remote Server which detects a new file and restores it (assuming you need that option, maybe you just want to have the latest files on that server and not actually restore them?)

You haven't said what you want to do with the file on the target server??
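
If you do want to restore it there, a bare-bones sketch (the logical file names and the data/log paths are assumptions - check them with RESTORE FILELISTONLY first):

-- Run on the Remote Server once the copy has completed
RESTORE DATABASE MyDatabaseName
FROM DISK = N'D:\IncomingBackups\MyDatabaseName_20160101_0100.BAK'
WITH REPLACE	-- overwrite the existing copy of the database
	, MOVE 'MyDatabaseName' TO N'D:\Data\MyDatabaseName.mdf'
	, MOVE 'MyDatabaseName_log' TO N'D:\Logs\MyDatabaseName_log.ldf'
	, STATS = 10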

Some sort of "purge" which deletes stale files on Source after a reasonable retention time. Using RoboCopy Mirror will cause the files to also be deleted from the Remote Server.

Do you need to keep the backup file after it is transferred? How long for? Do you need to know if the file has been "processed" successfully by the remote server?
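
If a simple age-based purge on the source is enough, one option is a scheduled step like this (the path and the 14-day retention are only examples; it needs xp_cmdshell, or you can run the same FORFILES command from a Windows scheduled task instead):

-- Delete local .BAK files older than 14 days
EXEC master..xp_cmdshell 'forfiles /P "x:\SQLBackup" /M *.BAK /D -14 /C "cmd /c del @path"'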

1 Like

Is there any possibility of using a maintenance plan in SQL Server to take a database backup?

Yes. Use the Maintenance Plan Wizard in SSMS.

BUT ... the maintenance plans are a lousy APP; they suffer from all sorts of fragility - e.g. if a database is changed from FULL to SIMPLE Recovery Model, or is set to READ_ONLY / OFFLINE / etc.

That said, a Maintenance Plan will give you a Scheduled Backup which will delete the backup file after an X-day retention period and Email you if it goes wrong, etc., PROVIDED that you answer all the questions correctly. There is no reason why you shouldn't, but there are an AWFUL LOT of questions, and not understanding / realising what the implications are might lead you into trouble.

A couple of days ago @gbritton told me about an excellent SQL Housekeeping APP called "Minion Backup" (they also do other APPs including Reindex and CheckDB, and Enterprise solutions)
http://minionware.net/backup/

that might be a better solution for you. I haven't tried it, but from what I have read you should be able to just install it and it will start making backups of all your databases, simple as that (VERY IMPORTANT: Make sure you edit the install script with the PATH LOCATION where you want the backup files stored)

NOTE that you will need to enable xp_cmdshell, if not already (it is DISABLED by default, because there are security implications to consider)
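
For reference, the standard way to enable it is via sp_configure (weigh up those security implications first):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE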

Just installing it will give you FULL Backups daily on the System databases and on Saturday evening for User Databases, with Differential Backups on the other evenings of the week, and Log Backups every 30 minutes on any database set to FULL Recovery Model. You can change the schedule for any/all of those, but from the moment you install it that is what you will get until you change something.

Just checking the DOCs I see that Minion Backup has an option to copy a backup file to another location, so that might be a way for you to get your Backup File to your remote server?

1 Like