How to import multiple SQL Server tables of the same schema into one SQL Server table using a stored procedure or SSIS

Hello All,

I have a data request to regularly import all SQL Server tables of the same schema into one new table (Master_table).

I have tables created daily like,

table_20150701, table_20150702, table_20150703, table_20150705 . . . table_20150716 and so on. Every day I get data feeds from external sources into a table in one of my databases; each table also has a lastUpdate column with the datestamp.

My request is to import all the tables' data into one single table (Master_table) and then add each next day's data directly into that new table.

The next day, when a table_2015xxxx... is created, its data should automatically load into the new table (Master_table).

This is a continuous process, so we can use the new table for analysis purposes.

Appreciate your quick help!!

Thanks

It's not clear to me what you intend this logic to do. You have listed a few requirements:

  • Migrate data from each "daily" table into the master table
  • Create a new table to hold a new day's imports
  • Load data into the "daily" table
  • Tomorrow "know" that the original daily tables have already been imported into master and only import the new table

Can you elaborate on your requirements?

Thanks for your response.

I have tables going back to 2012; I have close to 800 tables in my database now, named tablename_yyyymmdd.

Now I want to combine them all into one new table.

And from tomorrow on, when a table named tablename_yyyymmdd is created, a procedure or SSIS package should write all its data into the new table we have created.

The next step is to eliminate the temp/stage table and load the data directly into the master table (the new table).

Thanks again.

This (untested) script should get you the initial set of tables migrated to the master table. With a little tweaking you can get it to import "today's" table; there's a sketch of that after the script. You'll need to supply the actual table and column names:

[code]declare
    @PKey int = 1,
    @MaxPKey int,
    @sql nvarchar(max) = N'';

-- <TableName> is replaced per table below; replace <ColumnList> yourself with the real column list
declare @InsertTemplate nvarchar(max) = N'
insert into dbo.MasterTable( <ColumnList> )
select <ColumnList>
from <TableName>;';

declare @tbl table (
    pkey int not null identity(1, 1),
    sql nvarchar(max)
);

-- Build one INSERT statement per daily table
insert into @tbl(sql)
select
    Replace(
        @InsertTemplate,
        '<TableName>',
        t.name
    ) as sqlStmt
from
    sys.tables t
where
    t.name like 'tablename[_][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]';

select
    @PKey = 1,
    @MaxPKey = @@rowcount;    -- row count from the INSERT above


--- Main Loop

while @PKey <= @MaxPKey
begin

    --- Get single table entry

    select
        @sql = sql
    from
        @tbl
    where
        pkey = @PKey;


    --- Execute/Display current item

    exec sp_executesql @sql;


    --- End-of-loop processing

    set @PKey = @PKey + 1;
end[/code]
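
For the daily run, here's one possible tweak (a sketch, untested, assuming the same tablename_yyyymmdd naming convention; <ColumnList> is still a placeholder you'd need to fill in with the real column list) that loads only "today's" table:

[code]-- Sketch (untested): load only "today's" daily table into the master table.
-- <ColumnList> is a placeholder for the real column list; table names follow tablename_yyyymmdd.
declare @TodaysTable sysname =
    N'tablename_' + convert(char(8), getdate(), 112);   -- e.g. tablename_20150716

declare @sql nvarchar(max);

-- Only run if the external feed has already created today's table
if exists (select 1 from sys.tables where name = @TodaysTable)
begin
    set @sql = N'insert into dbo.MasterTable( <ColumnList> )
                 select <ColumnList>
                 from dbo.' + QuoteName(@TodaysTable) + N';';

    exec sp_executesql @sql;
end[/code]

You could schedule that in a SQL Agent job (or call it from an SSIS Execute SQL Task) after the daily feed lands.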


Could you just use a partitioned view instead?

You'd have to re-create it every day, but it would be easy enough to do that dynamically (e.g. interrogating sys.tables to find out what table names physically exist); there's a sketch of that at the end of this reply.

That would have the benefit that you could query all the data, given the right indexes :slight_smile: SQL will intelligently query only the physical tables that are required (i.e. based on any date range in the WHERE clause, provided each daily table has a suitable CHECK constraint on its date column), so performance should be good.

And you won't duplicate the data storage in both the yyyymmdd table AND the Master Table.

If this exercise is to MOVE the data from the yyyymmdd tables INTO the Master Table then ignore me! (Although ... if there are a LOT of rows in each yyyymmdd table then using a Partitioned View is probably a better, long-term, solution anyway)
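
For reference, here is a minimal sketch (untested) of how that daily re-create could look. dbo.Master_View and the tablename_yyyymmdd pattern are just assumed names, so adjust to suit:

[code]-- Sketch (untested): rebuild the view from whatever daily tables currently exist.
-- dbo.Master_View is an assumed name; the LIKE pattern matches tablename_yyyymmdd.
-- (For partition elimination, each daily table also needs a CHECK constraint on its date column.)
declare @sql nvarchar(max) = N'';

-- Glue together one SELECT per physical daily table with UNION ALL
select @sql = @sql
    + case when @sql = N'' then N'' else N' union all' end
    + N' select * from dbo.' + QuoteName(t.name)
from sys.tables t
where t.name like 'tablename[_][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]';

set @sql = N'create view dbo.Master_View as' + @sql + N';';

-- Drop yesterday's definition, then create it again with today's table list
if object_id(N'dbo.Master_View', N'V') is not null
    exec (N'drop view dbo.Master_View;');

exec sp_executesql @sql;[/code]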