SQLTeam.com | Weblogs | Forums

Can SQL Server create JSON files?

Need to create a large number of JSON files as part of a migration from SQL Server to a NoSQL DB.

The T-SQL-to-JSON conversion scripts are already written, but I can't find any method to create a physical JSON file. There are lots of articles on the web about FOR JSON AUTO output, but none of them show how to save the result as a .json file on disk.

The only approaches I can see are T-SQL to XML to JSON conversion, or saving the results as a file/text output and then changing the file extension in an additional step.
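For what it's worth, the "save as text, then change the extension" detour isn't needed if a small client-side script does the writing: run the FOR JSON query, grab the string, and write it straight to a .json file. A minimal sketch in Python; the pyodbc connection details and query in the comment are placeholders, only the file-writing helper is concrete:

```python
import json
import os
import tempfile

def write_json_doc(out_dir: str, name: str, json_text: str) -> str:
    """Validate the JSON text, then write it straight to a .json file."""
    json.loads(json_text)  # fail fast if the string isn't valid JSON
    path = os.path.join(out_dir, name + ".json")
    with open(path, "w", encoding="utf-8") as f:
        f.write(json_text)
    return path

# In the real migration the string would come from the FOR JSON query, e.g.
# (hypothetical connection details; note that long FOR JSON results are split
# across multiple rows, so you'd concatenate fetchall() in practice):
#   import pyodbc
#   conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
#                         "SERVER=myserver;Trusted_Connection=yes")
#   rows = conn.cursor().execute(
#       "SELECT ProductID, Name FROM Production.Product FOR JSON AUTO").fetchall()
#   write_json_doc(r"c:\JSON", "products", "".join(r[0] for r in rows))

# Stand-in demo so the sketch runs without a database:
demo_dir = tempfile.mkdtemp()
demo_path = write_json_doc(demo_dir, "demo", '[{"ProductID": 1}]')
```

The file is born with the right extension, so no rename step is needed.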

Would Azure Data Factory be a better tool?

In general, T-SQL doesn't do well at creating files on the file system. I'd usually reach for SSIS for that.

Thanks and agreed.

We tried SSIS, and I also managed to create the physical file using dynamic T-SQL. See below.

-- proc
CREATE PROCEDURE DemoProc
AS
DECLARE @ProductID INT -- ProductID
DECLARE db_cursor CURSOR FOR
    SELECT ProductID
    FROM Production.Product

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @ProductID

WHILE @@FETCH_STATUS = 0
BEGIN
    -- per-product SELECT goes here, e.g. ... WHERE ProductID = @ProductID
    some_sql
    FOR JSON AUTO

    FETCH NEXT FROM db_cursor INTO @ProductID
END

CLOSE db_cursor
DEALLOCATE db_cursor

------ Logic to create physical file

DECLARE @ProductID VARCHAR(100)
DECLARE @strFileLocation VARCHAR(100)
DECLARE @strFileName VARCHAR(1000)
DECLARE @bcpCommand VARCHAR(8000)
DECLARE @strSQL VARCHAR(2000)

SET @strFileLocation = 'c:\JSON\' -- trailing backslash; without it the file lands at c:\JSONdemo.json
SET @strFileName = 'demo.json'

SET @strSQL = 'Exec DB.dbo.demoproc'
SET @ProductID =2
SET @bcpCommand = 'bcp "' + @strSQL + ' '+ @ProductID + '" queryout "'
SET @bcpCommand = @bcpCommand + @strFileLocation + @strFileName + '" -T -c'

EXEC master..xp_cmdshell @bcpCommand

Of course, that presumes you've set up an account that has write access to the target file system. Still, it's a creative solution! But why the cursor?

Also, there's tons of fear-mongering around xp_cmdshell. FWIW, I think that's mostly nonsense.

Thinking out loud: you could do it all with a PoSH script in Agent, or, if you're on v.next, with Python in SQL.
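To make the Python option concrete: once a script outside the engine owns the file I/O, the cursor, bcp, and xp_cmdshell machinery all disappear. A rough sketch in which the fan-out logic is real but the sample rows stand in for a live cursor (your query, column names, and connection details would differ):

```python
import json
import os
import tempfile

def export_rows_as_json_files(rows, key, out_dir):
    """Write each row (a dict) to its own <key-value>.json file; return the paths."""
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    for row in rows:
        path = os.path.join(out_dir, f"{row[key]}.json")
        with open(path, "w", encoding="utf-8") as f:
            json.dump(row, f, indent=2)
        paths.append(path)
    return paths

# With a live connection the rows would come from the driver instead, e.g.
# (hypothetical details):
#   cursor.execute("SELECT ProductID, Name FROM Production.Product")
#   cols = [c[0] for c in cursor.description]
#   rows = [dict(zip(cols, r)) for r in cursor]
rows = [
    {"ProductID": 1, "Name": "Adjustable Race"},
    {"ProductID": 2, "Name": "Bearing Ball"},
]

out_dir = tempfile.mkdtemp()
paths = export_rows_as_json_files(rows, "ProductID", out_dir)
```

Note the trade-off: this serializes with json.dump on the client rather than FOR JSON in T-SQL, which is fine for a one-row-per-file migration and is easy to schedule from an Agent job.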

What consumes these JSON files, or is this a migration thing to the cloud where you consume the JSON files there?
Maybe create a REST API with Node and Express.js; Express will give you nice JSON data. Unless this is a one-time thing, of course.

// e.g. GET http://localhost:7203/api/currentuser
res.send(recordsets[0])

which returns:

[
  {
    "name": "Analyzer A",
    "uniquename": "analyzera",
    "type": "user"
  }
]
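If Node isn't already in the stack, the same idea can be mirrored with nothing but the Python standard library. A toy sketch (route name and sample record copied from the Express example above; not production code):

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the first recordset a real query would return.
RECORDS = [{"name": "Analyzer A", "uniquename": "analyzera", "type": "user"}]

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/currentuser":
            body = json.dumps(RECORDS).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

# Serve on an ephemeral port and fetch the route once, like the example URL.
server = HTTPServer(("127.0.0.1", 0), ApiHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/api/currentuser"
payload = json.loads(urllib.request.urlopen(url).read())
server.shutdown()
```

For a one-time migration this is overkill, but if something downstream keeps consuming the data, serving JSON on demand beats regenerating thousands of files.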