Multiple Tables from Query (Import/Export Wizard)

This is so cool! I can just copy and paste an image into the post without having to upload it separately.

Anyway, on to my question. When running the Import/Export Wizard, is it possible to write a query (or multiple queries) that produces multiple destination tables?

And yes, I know I can select the other option to copy the tables directly, but I need to add a "WHERE" clause for a few tables before I import the data.
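To illustrate (the table and column names below are made up), these are the kinds of per-table filtered extracts I would like the wizard to run, ideally all in one pass:

```sql
-- Hypothetical examples of the filtered extracts I want; names are illustrative only.
SELECT *
FROM dbo.Orders
WHERE OrderDate >= '2020-01-01';   -- only recent orders for this table

SELECT *
FROM dbo.Customers
WHERE IsActive = 1;                -- only active customers for this table
```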

Thanks

I guess someone else asked this also.

https://www.tek-tips.com/viewthread.cfm?qid=526300

What is your end game? If you explain your end game, it will be easier to propose a solid solution.

No - you cannot use multiple embedded queries like that. What you can do is create views on the source system and select from those views.
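Something like this, for example (object names are purely illustrative): each filtered table gets its own view on the source, and the data flow source becomes a plain SELECT against that view.

```sql
-- Illustrative sketch only: a filtered view per table on the source system.
CREATE VIEW dbo.vw_Orders_Recent
AS
SELECT OrderID, CustomerID, OrderDate, Amount
FROM dbo.Orders
WHERE OrderDate >= DATEADD(DAY, -30, CAST(GETDATE() AS date));
GO

-- The SSIS (or wizard) source query is then just:
SELECT OrderID, CustomerID, OrderDate, Amount
FROM dbo.vw_Orders_Recent;
```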

Based on your other requests though - it appears you are looking for an easy way to build your SSIS packages automatically. There just isn't going to be a way to do this without spending some time and effort...regardless of the approach you take.

I suggest you take a look at BIML - and see how that would fit in with your development efforts. I don't utilize it myself, but from what I can see it shouldn't be too hard to build the scripts to generate the packages you need - with the ability to rebuild/regenerate the packages when something changes in the source system.


Thanks for the replies Yosiasz and Jeff.

Jeff, you are right on the mark, and thanks for taking the other posts into consideration. I have been looking into BIML as you suggested; that may be the solution for building the packages and auto-mapping the fields.

As for building views on the source system, that's a good suggestion, but the vendor won't let me. As a matter of fact, I had another method that would have done the job using OPENROWSET in an SSIS connection, but they do not support turning on "Ad Hoc Distributed Queries". It would have been a beautiful solution, but it went out the window.
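For reference, the OPENROWSET approach would have looked roughly like this (server, database, and table names are placeholders); it needs the "Ad Hoc Distributed Queries" option, which they refuse to enable:

```sql
-- Setting the vendor would have had to enable (they won't):
-- EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
-- EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

-- Roughly what the source query would have been (placeholder names):
SELECT src.*
FROM OPENROWSET(
         'MSOLEDBSQL',
         'Server=VendorServer;Database=VendorDb;Trusted_Connection=yes;',
         'SELECT * FROM dbo.Orders WHERE ModifiedDate >= DATEADD(DAY, -1, GETDATE())'
     ) AS src;
```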

For sure, linked servers SUCK!!! I have to agree with Graz on that.

Querying against the vendor's database with the SSIS Connection Manager (or in SSMS) is as fast as using an ADO.NET recordset. I have been playing with VB/C# code, and I may be very close to a solution: writing a wrapper and firing it off with xp_cmdshell, or using the CLR method. Have you worked with CLR?

https://docs.microsoft.com/en-us/sql/relational-databases/clr-integration/database-objects/getting-started-with-clr-integration?view=sql-server-ver15
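For what it's worth, both routes need server-level options enabled on our own box (not the vendor's); roughly something like this, with the paths and assembly names being purely hypothetical:

```sql
-- Option 1: fire the external wrapper off with xp_cmdshell (hypothetical path).
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;
-- EXEC xp_cmdshell 'C:\Jobs\VendorSync.exe';

-- Option 2: host the C#/VB code in SQL Server as a CLR assembly (hypothetical names).
EXEC sp_configure 'clr enabled', 1; RECONFIGURE;
-- CREATE ASSEMBLY VendorSync FROM 'C:\Jobs\VendorSync.dll' WITH PERMISSION_SET = EXTERNAL_ACCESS;
-- CREATE PROCEDURE dbo.usp_VendorSync AS EXTERNAL NAME VendorSync.[SyncRunner].[Run];
```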

I will keep you updated.

One more question, Jeff, since you understand what I am trying to achieve and I don't know enough about the subject of replication: could replication also be a solution for what I am looking for?

I guess Yosiasz's question about the end game is right. What I am looking for is to refresh our internal database against the vendor's database as close to real time as possible. Right now we are doing our reporting directly off of the vendor's database, but they have been recommending that we create a copy of their database internally and keep it in sync.

I have a package that flushes the entire database and then refreshes it, which takes about 60 minutes. I am trying to get the process down to minutes by flushing only the records that have been modified (based on the modified-timestamp field) and re-inserting those records from the vendor's database.
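The per-table pattern I have in mind is roughly this (table and column names are placeholders): pull only the rows changed since the last sync into a staging table, flush the matching local rows, and re-insert the current versions.

```sql
-- Rough per-table sketch; all object names are placeholders.
-- 1. The SSIS data flow loads staging.Orders from the vendor, filtered on the
--    modified-timestamp column (WHERE ModifiedDate > last recorded sync time).

-- 2. Flush the local copies of the changed rows.
DELETE tgt
FROM dbo.Orders AS tgt
JOIN staging.Orders AS stg ON stg.OrderID = tgt.OrderID;

-- 3. Re-insert the current versions.
INSERT INTO dbo.Orders (OrderID, CustomerID, OrderDate, Amount, ModifiedDate)
SELECT OrderID, CustomerID, OrderDate, Amount, ModifiedDate
FROM staging.Orders;

-- 4. Record the new high-water mark for the next run.
UPDATE dbo.SyncControl
SET LastSyncTime = (SELECT MAX(ModifiedDate) FROM staging.Orders)
WHERE TableName = 'Orders'
  AND EXISTS (SELECT 1 FROM staging.Orders);
```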

I will be researching replication, but would replication be a solution for this?

Replication would not be a solution in your case - the vendor would have to set it up and manage the replication from the source system to the destination (which I doubt they would be willing to do).

For reporting - do you really need near real-time data? Generally, I have found that most reporting does not require real-time data and can be performed on a copy with a lag. That is - if you populate the data today it would be valid up to midnight of yesterday.

For real-time reports, those would have to be run directly on production - but those types of reports generally look to current data and can be optimized to run very quickly. Or - they run for a single entity which can also be optimized to execute very quickly.

A daily refresh that takes 60 minutes would not be an issue for me; it could be scheduled to run in the early morning, with the system ready for users when they start the day.

Hi Jeff,

Thanks for the confirmation on replication, since I lack knowledge in that area. It is good to know it has to come from the source system.

As for the reporting, yes, we do need it near real time. You are correct that these reports look at current data, but we can also aggregate against daily data because we snapshot the deltas of the changes each night.

If we were only doing reports off of the vendor's data, then we could get away with a straight connection in SSRS.

Unfortunately, we are aggregating data across multiple vendors' databases. So we have a choice: either use linked servers or pull in the vendors' data at near real time.

As for the daily refresh, exactly! That is how we are setting it up.

Thanks for pointing me in the right direction and keeping me from going down the wrong path (such as replication). I will report back in a couple of days on how the C#/VB script performs.