Backing up partial data to a new DB

Hi,
We're running an MS SQL 2019 server with several databases containing thousands of tables; due to the nature of the business, all of that data has to be kept. However, as developers it feels a bit wrong to work on the live DB for development, so I downloaded SQL Server 2022 Developer Edition. What I am trying to achieve is loading a small-ish subset of data (say 1000 rows from each table) onto my local developer instance. I have tried the Generate Scripts wizard, but that seems to output the entire data set, and so does a backup.
Is there a tried and tested method to get 1000 rows from each table in a database and then insert them into another? I am using SSMS for most "manual" tasks.
TIA


I've done variations of this a few times; the following works in a command-line environment:


bcp "select table_schema, table_name FROM myDatabase.INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE_TABLE" queryout  all_tables.txt -S servername -T -c -t","

for /F "tokens=1,2 delims=," %a in (all_tables.txt) do bcp "select top 1000 * from myDatabase.[%a].[%b]" queryout "[%a].[%b].data" -S servername -T -c

for %a in (*.data) do bcp "myOtherDatabase.%~na in "%~fa" -S otherServerName -T -c

Fair warning: I just wrote that off the top of my head, so it might have some typos. The general logic is:

  1. The first bcp command gets a list of tables (and their owning schemas) into a plain text file. This is the source for the next step.

  2. The first "for" command opens the all_tables file and parses each row, putting each element into separate variables (%a and %b), which represent the schema and table name of each table. It then substitutes these into a generated bcp statement that gets the top 1000 rows from that table and stores them in a file whose name matches schema.table.data.

  3. The second "for" command enumerates over these .data files, constructing a reciprocal bcp statement to import those 1000-row data files into your other database. If you have identity values you need to preserve, add the "-E" parameter at the end (without quotes). This assumes the destination database has all the same tables with the same names, columns, types, column order, etc., otherwise you'll get a bcp error. (There's a worked example of what these expand to just below.)
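
To make that concrete, with a hypothetical dbo.Customers table, all_tables.txt from step 1 would contain rows like:

dbo,Customers
dbo,Orders

and for the first of those rows, the loops in steps 2 and 3 are intended to expand into commands like:

bcp "select top 1000 * from myDatabase.[dbo].[Customers]" queryout "[dbo].[Customers].data" -S servername -T -c

bcp "myOtherDatabase.[dbo].[Customers]" in "[dbo].[Customers].data" -S otherServerName -T -c

(In the actual loop, %~fa expands to the full path of the .data file.)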

If that doesn't work, let me know; there's an alternative way of writing it that might be better. You may also encounter problems if you have computed columns, blobs/varchar(max), or other character data that contains embedded tabs, line feeds, etc.


Thanks, I'll give it a go and let you know the outcome!

I tried your suggestion; however, this is the output I get:

usage: bcp {dbtable | query} {in | out | queryout | format} datafile
  [-m maxerrors]            [-f formatfile]          [-e errfile]
  [-F firstrow]             [-L lastrow]             [-b batchsize]
  [-n native type]          [-c character type]      [-w wide character type]
  [-N keep non-text native] [-V file format version] [-q quoted identifier]
  [-C code page specifier]  [-t field terminator]    [-r row terminator]
  [-i inputfile]            [-o outfile]             [-a packetsize]
  [-S server name]          [-U username]            [-P password]
  [-T trusted connection]   [-v version]             [-R regional enable]
  [-k keep null values]     [-E keep identity values][-G Azure Active Directory Authentication]
  [-h "load hints"]         [-x generate xml format file]
  [-d database name]        [-K application intent]  [-l login timeout]


Sorry about that; a typo, as I expected. This should fix it:

bcp "select table_schema, table_name FROM myDatabase.INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE_TABLE'" queryout  all_tables.txt -S servername -T -c -t","

for /F "tokens=1,2 delims=," %a in (all_tables.txt) do bcp "select top 1000 * from myDatabase.[%a].[%b]" queryout "[%a].[%b].data" -S servername -T -c

for %a in (*.data) do bcp "myOtherDatabase.%~na in "%~fa" -S otherServerName -T -c

Thanks, I get this:

Starting copy...

0 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total     : 79

Sorry, I should have explained better. You would replace "myDatabase" with the name of the database you want to export from, and "servername" with the name of your SQL Server instance. The same goes for "myOtherDatabase" and "otherServerName" where they appear.
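
For example, if your source database were called Sales on an instance called SQL01 (both hypothetical names), the second command would become:

for /F "tokens=1,2 delims=," %a in (all_tables.txt) do bcp "select top 1000 * from Sales.[%a].[%b]" queryout "[%a].[%b].data" -S SQL01 -T -c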

If you still get zero rows, that means there are no tables.


I did change those values; I'll double check. It shows 100s (if not 1000s) of tables in SSMS.

I double-checked and it still isn't working; I tried on several machines and again on the actual DB server. There are definitely tables present.

Ugh, I'm a dumbass.

It's BASE TABLE, not BASE_TABLE.
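
So the corrected first command (same placeholders as before) is:

bcp "select table_schema, table_name FROM myDatabase.INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE'" queryout all_tables.txt -S servername -T -c -t","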

Kids, don't drink and write SQL late at night.


I'll try it and let you know. It's fine, I appreciate the help!

You, sir, are a genius!
It worked fine getting the data out (well, I now have files to copy across to my backup DB).
I have another issue when trying to get the data in at the other end.

Copy direction must be either 'in', 'out' or 'format'.
usage: bcp {dbtable | query} {in | out | queryout | format} datafile
  [-m maxerrors]            [-f formatfile]          [-e errfile]
  [-F firstrow]             [-L lastrow]             [-b batchsize]
  [-n native type]          [-c character type]      [-w wide character type]
  [-N keep non-text native] [-V file format version] [-q quoted identifier]
  [-C code page specifier]  [-t field terminator]    [-r row terminator]
  [-i inputfile]            [-o outfile]             [-a packetsize]
  [-S server name]          [-U username]            [-P password]
  [-T trusted connection]   [-v version]             [-R regional enable]
  [-k keep null values]     [-E keep identity values][-G Azure Active Directory Authentication]
  [-h "load hints"]         [-x generate xml format file]
  [-d database name]        [-K application intent]  [-l login timeout]

Well, that doesn't help me troubleshoot. Give this a try:

for %a in (*.data) do @echo bcp "myOtherDatabase.%~na" in "%~fa" -S otherServerName -T -c

That should generate bcp commands; you can copy a few and try running them, or paste them here. If they do work, remove "@echo" from the command and the data should go right in.

If they don't work, you should get a more descriptive error, and paste that here as well.
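
For reference, with a hypothetical [dbo].[Customers].data file sitting in D:\Backup, the echoed command would look something like:

bcp "myOtherDatabase.[dbo].[Customers]" in "D:\Backup\[dbo].[Customers].data" -S otherServerName -T -c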


Works like an absolute charm. Once I remembered I hadn't got as far as setting up the database user permissions on the backup DB (I was getting access denied), it was all plain sailing.
Thank you so much for your help!

Edit: I've marked your first post as the solution; I don't know if you want to update it with the subsequent commands or just leave it.
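
Good idea. For anyone finding this later, the full corrected sequence from this thread (same placeholder names as before; note that if you put these commands in a batch file rather than typing them at a prompt, each % must be doubled to %%) is:

bcp "select table_schema, table_name FROM myDatabase.INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE'" queryout all_tables.txt -S servername -T -c -t","

for /F "tokens=1,2 delims=," %a in (all_tables.txt) do bcp "select top 1000 * from myDatabase.[%a].[%b]" queryout "[%a].[%b].data" -S servername -T -c

for %a in (*.data) do bcp "myOtherDatabase.%~na" in "%~fa" -S otherServerName -T -c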