I am trying to query a database in which I do not have create table permissions. I have a list of 50,000 IDs that I'd like to look up in a table in the database. The ID field is indexed. However, it is too many IDs to hard-code into the SQL statement itself. Ideally, I would import the list as a table and join to it to get results; however, I do not have create table privs. Is there a voodoo way to do this? Perhaps reference/link a flat file into the query as a pseudo-table somehow?
Hehehe ... Can you create a #TEMP table? (or even use an @TEMP Table Variable instead?)
The table will only exist for the duration of your session (or the batch, for a table variable) though ... which may be the problem?
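A minimal sketch of both options. The table and column names (`#IDList`, `dbo.TargetTable`, `ID`) are stand-ins for illustration, and the IDs are assumed to be integers:

```sql
-- Local temp table: visible only to this session, dropped when it ends.
CREATE TABLE #IDList (ID INT NOT NULL PRIMARY KEY);

-- ...populate #IDList somehow (that's the hard part here)...

SELECT t.*
FROM dbo.TargetTable AS t          -- hypothetical target table
JOIN #IDList AS i ON i.ID = t.ID;

-- Or a table variable, which needs no CREATE TABLE permission at all,
-- but only lives for the duration of the batch:
DECLARE @IDList TABLE (ID INT NOT NULL PRIMARY KEY);
```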
Thanks @Kristen, that's definitely a start. It's fine that the table will be only available until the session ends.
I can create local temp tables, but I do not have privileges to use BULK INSERT or OPENDATASOURCE in order to populate it. Any suggestions on that? Hopefully something other than 50,000 insert statements...
You can use OPENROWSET to import the data from an Excel (or Access) file into a table variable or temp table. But, given that you don't have permissions to create a table, this is likely to fail because it needs some additional permissions and configuration (the "Ad Hoc Distributed Queries" server option, for example).
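A sketch of what that would look like if the permissions were in place. The file path, sheet name, and column name are examples, and the ACE OLE DB provider would need to be installed on the server:

```sql
-- Requires the 'Ad Hoc Distributed Queries' server option to be enabled
-- and an OLE DB provider for Excel on the server.
INSERT INTO #IDList (ID)
SELECT ID
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\temp\ids.xlsx;HDR=YES',
                'SELECT ID FROM [Sheet1$]');
```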
Another crappy way of accomplishing this is putting the IDs into an XML fragment and parsing that into a table. I say it is crappy because using XML to split a set of tokens is one of the slowest methods.
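Slow or not, the XML shred itself is short. A rough sketch with a few sample values standing in for the 50,000:

```sql
-- Wrap each ID in an element, then shred the fragment into rows.
DECLARE @x XML = N'<i>101</i><i>102</i><i>103</i>';  -- sample values

SELECT n.value('(./text())[1]', 'INT') AS ID
FROM @x.nodes('/i') AS x(n);
```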
Splitting a comma-delimited string might be faster? (For that number of elements it's critical that an efficient function/algorithm is used.)
Still got to "read" the list from somewhere.
One could conceivably copy and paste the list into an SSMS window and use the regex find-and-replace to convert it into an XML string. Not sure if it will cause SSMS to crash and burn.
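For the delimited-string route, a sketch, assuming SQL Server 2016 or later where STRING_SPLIT is built in (older versions would need a user-defined splitter such as a DelimitedSplit8K-style function):

```sql
-- Paste the real comma-separated list into @list.
DECLARE @list VARCHAR(MAX) = '101,102,103';  -- sample values

INSERT INTO #IDList (ID)
SELECT CAST(value AS INT)
FROM STRING_SPLIT(@list, ',');
```

Note that a VARCHAR(MAX) variable sidesteps the 8K-per-string concern mentioned below, at some cost in splitter choice.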
If you have the list in a file, you can put it into Excel and create the 50,000 insert statements needed to populate your temp table.
Not an ideal solution, but adequate for a one-time process. I would use a global temporary table for this: create and populate it in one query window, then query the global temporary table from a separate query window.
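A sketch of the two-window pattern (names are stand-ins; the `##` prefix is what makes the table visible across sessions for as long as the creating session stays open):

```sql
-- Window 1: create and load the global temp table.
CREATE TABLE ##IDList (ID INT NOT NULL PRIMARY KEY);
INSERT INTO ##IDList (ID) VALUES (101), (102), (103);  -- ...50,000 rows...

-- Window 2: join against it like any other table.
SELECT t.*
FROM dbo.TargetTable AS t   -- hypothetical target table
JOIN ##IDList AS i ON i.ID = t.ID;
```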
If this needs to be a repeatable process - with different sets of ID values, then you need to get with your DBA so you can setup a proper import process to populate a permanent table.
Attempted, and evidently I'm not given privs for this, or else it's not enabled: "Ad Hoc Distributed Queries". So the OPENROWSET and, I guess, OPENQUERY capabilities are a no-go.
Definitely not ideal, but I may be stuck with the Excel approach: open the list in Excel, build a formula with cell references that creates the string for an individual insert statement for each row, drag the formula down, then copy the results over and run them.
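For the record, the formula is a one-liner, assuming the IDs sit in column A and the temp table is named `#IDList` (both assumptions):

```
="INSERT INTO #IDList (ID) VALUES (" & A1 & ");"
```

Drag that down 50,000 rows and paste the result into SSMS. One caution: if you instead batch multiple rows into a single INSERT with a VALUES list, SQL Server caps a table value constructor at 1,000 rows per statement.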
The ask is what I consider "repeatable ad hoc". It's a one-time request, not going into production, but this type of request will get repeated several times a year for various projects. It's just tough with a server that's so locked down; you can't build a script to run the whole process when you get into this kind of conundrum.
Thanks for all the suggestions.
I think a splitter would be a lot faster (as it is set-based) ... but it's a guess. I would certainly be interested to know how they compare. Where's @JeffModen? He may already have done that test when he was souping up his string splitter.
P.S. Might be important that each string being split is under 8K
P.P.S. If the split were fixed-length, that might be WAY better (no need to search for delimiters, just split based on a tally position). Perhaps all the IDs are either fixed length, or the max length is not very different from the min length and short ones could be padded?
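A sketch of that fixed-width idea, assuming every ID is padded to the same width (`@w` characters); the tally here is improvised from `sys.all_columns` purely as a row source to count with:

```sql
-- Four 8-character IDs concatenated, zero-padded: 101, 102, 103, 104.
DECLARE @s VARCHAR(MAX) = '00000101000001020000010300000104';
DECLARE @w INT = 8;

WITH Tally AS (
    SELECT TOP (LEN(@s) / @w)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS n
    FROM sys.all_columns            -- any object big enough to count from
)
SELECT CAST(SUBSTRING(@s, n * @w + 1, @w) AS INT) AS ID
FROM Tally;
```

No delimiter scan at all; each row is just a SUBSTRING at a computed offset.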
Yes, there are some pretty good "voodoo" methods for this, and some of them have already been identified by the good folks on this thread. I need more information on how the string of IDs is being passed, please. Where does this list of IDs exist? Is it just one string of 50,000, or does it live in a spreadsheet or in a file or ???