SQLTeam.com | Weblogs | Forums

I need a Skinny Method to Import XML into SQL Server



This example is lifted from the Microsoft docs.

Given this:

DECLARE @docHandle int,
	@XmlDocument nvarchar(1000)
SET @XmlDocument = N'<ROOT>
<Customer CustomerID="VINET">
   <ContactName>Paul Henriot</ContactName>
   <Order OrderID="10248" CustomerID="VINET" EmployeeID="5">
      <OrderDetail ProductID="11" Quantity="12"/>
      <OrderDetail ProductID="42" Quantity="10"/>
   </Order>
</Customer>
<Customer CustomerID="LILAS">
   <ContactName>Carlos Gonzlez</ContactName>
   <Order OrderID="10283" CustomerID="LILAS" EmployeeID="3">
      <OrderDetail ProductID="72" Quantity="3"/>
   </Order>
</Customer>
</ROOT>'

EXEC sp_xml_preparedocument @docHandle OUTPUT, @XmlDocument

INSERT INTO dbo.Customers
	(CustomerID, ContactName)
SELECT	CustomerID, ContactName
FROM OPENXML (@docHandle, '/ROOT/Customer', 3)
	WITH (
		CustomerID  varchar(10),
		ContactName varchar(20)
	)

EXEC sp_xml_removedocument @docHandle

why do all the examples I see use individual column definitions in the WITH clause (which seems like a bug-haven as the DDL changes over time ...) rather than just doing this: [also retaining the @docHandle stuff etc]

INSERT INTO dbo.Customers
	(CustomerID, ContactName)
SELECT	CustomerID, ContactName
FROM OPENXML (@docHandle, '/ROOT/Customer',3)
	WITH dbo.Customers

I want to "copy" some MetaData, used to control our APP, from one DB to another. I used to be able to use a LinkedServer to connect them with a Quick & Dirty INSERT/SELECT statement, but we now have servers that cannot see each other. My thought was to output some XML (of the relevant / recently changed rows), make an Import Page on the Master Database site, paste the XML into a Form Field, and then process that in SQL (allowing me to display a DIFF and make choices, such as manual override etc., before actually doing the UpSert).


Are your servers SQL Server 2005 or higher? If so, why not use XQuery/XML methods?


Yes, SQL 2012. I thought there must be a better way ... but everything I Google seems to point me to sp_xml_preparedocument etc., or to processes that involve "declaring" every column that I want to extract.

I am looking for a method that will let me Export and Import easily - i.e. both ends of this puzzle are identical tables / columns, so the data should easily fit into the target, and I would like a solution that is immune to DDL changes - an extra column is fine, but I don't want to have to change anything if a column changes size / datatype etc.

I'll go and look up XQuery ... but I think the examples I saw had an XPath associated with each column, or something (complex!) like that. Hopefully that's just my lack of XML knowledge ...


My perception, perhaps wrong?, is that I would have to do something like this:

DECLARE @xml xml

-- (file path is illustrative)
SELECT @xml = BulkColumn
FROM OPENROWSET(BULK 'C:\Import\users.xml', SINGLE_BLOB) AS b

INSERT INTO dbo.mail
    (Email, FullName, Title, Company, City)
SELECT
    Email    = t.c.value('@Email', 'VARCHAR(255)')
  , FullName = t.c.value('@FullName', 'VARCHAR(255)')
  , Title    = t.c.value('@Title', 'VARCHAR(255)')
  , Company  = t.c.value('@Company', 'VARCHAR(255)')
  , City     = t.c.value('@City', 'VARCHAR(255)')
FROM @xml.nodes('users/user') t(c)

which seems like a lot of code to me, particularly the column definitions - which I think should be obtained from the DB (i.e. in real time / dynamically), rather than being hardcoded in the import routine.


I agree about the dynamic/programmable typing, unfortunately the only workaround would be dynamic SQL against a table with definitions. Tricky but not insurmountable. My preference for XPath/XQuery is that it is much easier to dive into a deeply nested document than using sp_xml_preparedocument and such, and it seems cleaner syntax once you get the hang of it.
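As a rough sketch of that dynamic-SQL idea (table name and type handling are illustrative, and identity/computed columns would need filtering in practice), you could build the OPENXML WITH schema declaration from the catalog views so it always tracks the current table definition:

-- Hypothetical sketch: derive the WITH clause from sys.columns
DECLARE @cols nvarchar(max), @sql nvarchar(max)

SELECT @cols = STUFF((
    SELECT ', ' + QUOTENAME(c.name) + ' ' + t.name
         + CASE WHEN t.name IN ('varchar','nvarchar','char','nchar')
                THEN '(' + CASE WHEN c.max_length = -1 THEN 'max'
                                ELSE CAST(c.max_length
                                     / CASE WHEN t.name LIKE 'n%' THEN 2 ELSE 1 END
                                     AS varchar(10)) END + ')'
                ELSE '' END
    FROM sys.columns c
    JOIN sys.types t ON t.user_type_id = c.user_type_id
    WHERE c.object_id = OBJECT_ID('dbo.Customers')
    ORDER BY c.column_id
    FOR XML PATH('')), 1, 2, '')

SET @sql = N'INSERT INTO dbo.Customers SELECT * '
         + N'FROM OPENXML(@docHandle, ''/ROOT/Customer'', 3) WITH (' + @cols + N')'

EXEC sp_executesql @sql, N'@docHandle int', @docHandle

The SELECT * relies on the WITH clause emitting columns in column_id order, which is what makes this a sketch rather than production code.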


Yes, that's probably the biggest part of the problem for me ... but it would be a useful skill to have / improve.

I've stuck with sp_xml_preparedocument for now; it's Cheap & Cheerful for me with only a 2-level-deep XML stream value.

My previous Quick & Dirty Meta Data XFer was to generate a text script for SProc Execs that populated the Meta tables. I had a

usp_MyTableExport @ListOfIDs

to generate the export and

usp_MyTableImport @ID, @Col1='Value1', @Col2='Value2', ...

along with options to only update the target if one/many columns were different (and output "Unchanged" / "Updated" / "Inserted"). This was OK, but it required me to use a SQL Tool for the Import of course, and it was not much good at deleting any now-stale rows.

The XML route allows me to paste into a Form Field on the target system, which is a lot more usable 🙂. It iterates around the records and calls the original usp_MyTableImport, retaining all the original validation etc. logic ... but it's a LOOP 🙁 - relatively few rows in a batch though.
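That loop looks roughly like this (cursor name and column list are illustrative; only usp_MyTableImport is from the real system):

-- Sketch: iterate the parsed XML and call the existing import proc per row,
-- so all the original validation logic still runs
DECLARE @ID int, @Col1 varchar(100), @Col2 varchar(100)

DECLARE curImport CURSOR LOCAL FAST_FORWARD FOR
    SELECT ID, Col1, Col2
    FROM OPENXML(@docHandle, '/ROOT/Row', 3)
    WITH (ID int, Col1 varchar(100), Col2 varchar(100))

OPEN curImport
FETCH NEXT FROM curImport INTO @ID, @Col1, @Col2
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC usp_MyTableImport @ID, @Col1 = @Col1, @Col2 = @Col2
    FETCH NEXT FROM curImport INTO @ID, @Col1, @Col2
END
CLOSE curImport
DEALLOCATE curImport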

I also do a delete based on "This block of records is all children for Parent-X, so delete any Children, on Destination Child Table, that are not represented in the XML" - which is a bonus.
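A sketch of that stale-child delete (table and column names are illustrative): remove any destination children of the given parent that are not represented in the pasted XML.

-- Delete destination children of Parent-X that the XML block does not contain
DELETE d
FROM dbo.ChildTable AS d
WHERE d.ParentID = @ParentID
  AND NOT EXISTS (
        SELECT 1
        FROM OPENXML(@docHandle, '/ROOT/Row', 3)
        WITH (ID int) AS x
        WHERE x.ID = d.ID
      )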

I'll give some thought to dynamic SQL to handle this - I like the idea of that. For now I have gone with making a VIEW which identifies which/all columns in a table I want to transfer. Both the Export and the Import use the same VIEW, so they are "matched", so to speak. The VIEW allows me to fairly easily add additional columns if that becomes necessary over time ... and I have also added some code-lookup columns so that the XML contains elements that assist Human Reading - such as the Name / Description of the Parent record (formerly a Comment in my mechanically generated SQL script).
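The matched-VIEW approach can be sketched like this (the view name is illustrative); because OPENXML's WITH clause accepts a table or view name, the import side needs no hardcoded column list at all:

-- Export side: generate the XML from the shared view
SELECT * FROM dbo.MyTable_TransferView AS Row
FOR XML AUTO, ROOT('ROOT')

-- Import side: let OPENXML take its schema straight from the same view
SELECT *
FROM OPENXML(@docHandle, '/ROOT/Row', 3)
WITH dbo.MyTable_TransferView

Since both ends reference the same view, adding a column to the view changes Export and Import together.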

Perhaps I should just have made a binary-format BCP export / import. I could probably have represented that as text output, to allow cut & paste direct from one instance of the APP into a form field on another. Not sure that would have been any better than XML though ... and my browser does a lovely job of displaying indented, raw XML 🙂