SQLTeam.com | Weblogs | Forums

Database refresh from Production DB to Test DB


#1

How do I do a database refresh from a Production DB to a Test DB?

SQL Server 2014

Thank you!


#2

One time, or on an ongoing basis?


#3

I am looking for a best practice for doing this on a regular basis, so the development users don't have any issues and can perform their jobs as normal.


#4

This is what I have done in the past, but depending on your setup and environment (number of developers, number of dev servers, etc.) it can be as fancy an automated process as you want, or super simple.

  1. On demand: provide the devs a mechanism that does not require you to do a backup/restore every time they need one. For example, a dev drops a file listing the database(s) they want restored into a "hot" folder; a SQL job runs every few minutes, finds that file, locates the corresponding backup in the remote backup folder, copies it to the local dev server, and restores it.
  2. Scheduled: create a SQL job that, say, every day after hours pulls all of the databases from prod, restores them to dev, and then runs all of the SQL script changes the devs are working on from a designated feature folder.

Let me know which one you would prefer and I will detail the exact way of doing it.
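For the scheduled option, the restore step inside that nightly job might look roughly like this. This is a sketch only: the database name, logical file names, and paths are placeholders, not from this thread.

```sql
-- Hypothetical example: restore one database from the copied backup on dev,
-- replacing the existing dev copy and moving files to the dev server's paths.
RESTORE DATABASE [SalesDB]
FROM DISK = N'D:\RestoreDB\SalesDB.bak'
WITH REPLACE,                                       -- overwrite the existing dev database
     MOVE N'SalesDB_Data' TO N'D:\SQLData\SalesDB.mdf',
     MOVE N'SalesDB_Log'  TO N'D:\SQLLogs\SalesDB.ldf',
     RECOVERY;
```

The job would loop over each copied .bak file and run a statement like this per database.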


#5

One thing to keep in mind when restoring production to any non-prod environment is PCI data. You may have to run a procedure to remove this data from the lower environments.


#6

Thanks Yosiasz for the detailed outline of the procedure.
We don't need to refresh frequently; let's say once or twice a month, but not more often than that.
Either way I am fine, but the 2nd option looks like the easier way?
My purpose is to make test match production data-wise, but keep the test environment itself as is, so developers/users don't have any issues afterwards.


#7

Thanks Mike for the note about PCI data; we will keep it in mind.


#8

You will need to obfuscate the data if there is a lot of personal and sensitive data, as @mike01 mentions. Also touch base with your legal team in case of issues; you don't want to end up being shown the door. If you are the DBA, you are responsible for finding these things out.


#9

Both are easy to implement; it depends on what you want to do. What I have found is that #1 is better because it is a hands-off approach. CI/CD-ish :wink:


#10

Thanks again Yosiasz.
If you can send me the details you mentioned ("Which one would you prefer and I will detail the exact way of doing this"), that would be great!


#11

OK, for starters, so as not to swamp you with all of the details, let's take baby steps. I would expect you to do most of the legwork and to try to understand what this script is doing.
Please save this as a .ps1 (PowerShell) file, modify $SourceFolder and $DestinationFolder to match your specific environment, and let's try out the first part, which is:

  1. Copying from the prod to the dev server.

Please try this and post back whether it worked.

$SourceFolder = "\\your_backup_server\e$\Backup";
$DestinationFolder = "\\your_dev_server\d$\RestoreDB";

# Delete all previously staged files from the destination folder
Get-ChildItem -Path $DestinationFolder -File -Recurse | ForEach-Object { $_.Delete() }

# List all backup folders, excluding the system databases
$BackupFiles = Get-ChildItem $SourceFolder | Where-Object { ($_.Name -like "*bak*") -and ($_.Name -notlike "*master*") }
$Yesterday = (Get-Date).AddDays(-1).ToString("yyyy_MM_dd")

foreach ($backupfile in $BackupFiles)
{
    # Pick up only yesterday's backup files from this database's folder
    $yesterdaybackupfiles = Get-ChildItem $backupfile.FullName | Where-Object { $_.Name -match $Yesterday }

    foreach ($file in $yesterdaybackupfiles)
    {
        $destinationFileName = $file.Name
        # Strip the "_backup_<timestamp>" suffix so the copy is named <database>.bak
        $cleanfilename = $destinationFileName.Substring(0, $destinationFileName.IndexOf("_backup"))
        robocopy $backupfile.FullName $DestinationFolder $file.Name
        Rename-Item "$DestinationFolder\$destinationFileName" "$cleanfilename.bak"
    }
}

#12

Thanks Yosiasz for the detailed steps; I will run it and let you know.
Meanwhile, I found a solution and I think this will also work, since this is not an ongoing refresh, just once in a while. Please correct me if I'm wrong.

  1. Take backups of the production DBs (a bunch of different database backups).
  2. Create a login script to avoid SID mismatches: run the "Transfer logins and passwords between instances of SQL Server" script, which generates the sp_help_revlogin procedure.
  3. Restore each production database backup to the DEV server, then execute the sp_help_revlogin output so users keep their original logins after the refresh.
  4. Run any DB-related change scripts.

Do I need to do anything with SSIS or other jobs?
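The login/SID part of those steps can be sketched as follows; the database name here is a placeholder, not from this thread. sp_help_revlogin comes from Microsoft's "Transfer logins and passwords between instances" script: you run it on the PROD instance and execute its output on DEV, which recreates the logins with their original SIDs so database users map correctly after the restore.

```sql
-- Illustrative only: after restoring a prod backup on DEV and running the
-- sp_help_revlogin output there, check for any users still orphaned
-- (i.e., database users whose SID has no matching server login).
USE [AppDB];                          -- placeholder database name
EXEC sp_change_users_login 'Report';  -- lists remaining orphaned users
```

If the report comes back empty, the SIDs line up and no further login fix-up is needed.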

#13

Yes, your approach is very good! You don't need SSIS or SQL jobs if you will be running all of this manually.


#14

Thanks.
There are some tables we don't want to restore, and some data from DEV we need to keep. Any better idea, or should we just keep the deployment script and rerun it in DEV after the refresh?


#15

Your post-restore script should truncate those tables. Of course you should check whether there are foreign-key references and so on. So: restore, scrub, rinse, repeat.
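A minimal sketch of such a post-restore scrub; the table names are placeholders. One thing to watch: TRUNCATE TABLE fails on a table referenced by a foreign key, so for those tables you either DELETE instead or drop and recreate the constraints around the truncate.

```sql
-- Hypothetical post-restore scrub on DEV (table names are illustrative)
USE [AppDB];
DELETE FROM dbo.CustomerCardData;   -- FK-referenced table: DELETE works, TRUNCATE would fail
TRUNCATE TABLE dbo.AuditLog;        -- no inbound FKs: TRUNCATE is fine and faster
-- Then re-apply any DEV-only data / deployment scripts kept outside the refresh.
```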


#16

Thank you again.
If we implement TDE in SQL Server 2014 for data at rest, what happens during a refresh from PRODUCTION to TEST?
How do we take care of it?


#17

Do you have TDE implemented now? And why would you want to implement TDE in this refresh scenario?


#18

I don't have TDE yet, but we will be implementing it.


#19

Might it be overkill if you will be scrambling the data, or will you not be doing that?
Why do you feel it is necessary to implement it?
I try to keep all environments as identical as possible; that way there are no surprises on the application end, one less thing to worry about.
So if you implement it in one environment you should implement it in all environments.
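For the refresh itself, the practical impact of TDE is this: a backup of a TDE-enabled database cannot be restored on TEST until the certificate that protects the database encryption key exists on the TEST instance. A rough sketch, with the certificate name, file paths, and passwords all as placeholders:

```sql
-- On PROD: back up the TDE certificate and its private key.
USE master;
BACKUP CERTIFICATE TDECert
TO FILE = N'D:\Certs\TDECert.cer'
WITH PRIVATE KEY (FILE = N'D:\Certs\TDECert.pvk',
                  ENCRYPTION BY PASSWORD = N'<strong password>');

-- On TEST: create a database master key (if one doesn't exist) and
-- restore the certificate; after this, the normal RESTORE DATABASE
-- of the prod backup will succeed.
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = N'<strong password>';
CREATE CERTIFICATE TDECert
FROM FILE = N'D:\Certs\TDECert.cer'
WITH PRIVATE KEY (FILE = N'D:\Certs\TDECert.pvk',
                  DECRYPTION BY PASSWORD = N'<strong password>');
```

Without that certificate on TEST, the restore fails with a "cannot find server certificate" error, so the certificate backup should be part of your refresh runbook.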


#20

It's a company requirement; they are thinking in that direction.