Array as parameter to function

Hello,
I have a three-dimensional set of data coming from a C# application, and that data must be updated or inserted in a SQL table.
The set can be several hundred records long.
It would be more efficient to do the processing on the server side, but that means I have to send that array as a parameter of the query.

Do you think that could be a problem (due to the size of the data sent)?

The other solution is to loop through the set on the app side and process the records one by one.
Thank you.

Hi,

I don't know if this may help you, but when I work with arrays from the front-end or reporting tier, I leverage the SPLIT and JOIN functions. When the array is passed to the database, I have a function that splits the parameter and sticks the values into a table.

For example, when I pass multiple IDs from SSRS (the reporting tier), I use the JOIN function in SSRS to convert the array to a string. Since the Split function creates an array, the Join function converts that array back to a string with a delimiter. I then pass that string into the SP using a parameter declared as VARCHAR(MAX). As a general practice, in the first section of my SP I stick the parameter values into a temp table for easier testing.
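A minimal sketch of that pattern, for the record. The procedure, table, and split-function names here are all hypothetical; substitute your own split function, such as the one linked further down:

```sql
-- Hypothetical names throughout: usp_GetOrders, dbo.Orders, and dbo.fnSplit
-- (dbo.fnSplit stands in for whatever split function you use; it is assumed
-- to return the pieces of the string in a column named Item).
CREATE PROCEDURE dbo.usp_GetOrders
    @IDList VARCHAR(MAX)   -- e.g. '101|205|310', built by JOIN() in SSRS
AS
BEGIN
    SET NOCOUNT ON;

    -- First section of the SP: stick the parameter values into a temp table
    SELECT Item AS OrderID
    INTO #IDs
    FROM dbo.fnSplit(@IDList, '|');

    -- Then join against it as if it were a regular table
    SELECT o.*
    FROM dbo.Orders AS o
    JOIN #IDs AS i ON o.OrderID = i.OrderID;
END
```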

If your data cannot fit into VARCHAR(MAX), then I would use ADO.NET or another method to import it into the database.

I use a pipe sign as the delimiter. Here is the split function that someone was kind enough to share:
http://www.sqlservercentral.com/blogs/querying-microsoft-sql-server/2013/09/19/how-to-split-a-string-by-delimited-char-in-sql-server/

It seems that starting with SQL Server 2016, STRING_SPLIT is an official built-in function.
https://docs.microsoft.com/en-us/sql/t-sql/functions/string-split-transact-sql?view=sql-server-2017
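With that built-in, no user-defined split function is needed. A quick sketch (the ID values are made up):

```sql
-- STRING_SPLIT (SQL Server 2016+, compatibility level 130+) returns one row
-- per element in a column named [value].
DECLARE @IDList VARCHAR(MAX) = '101|205|310';

SELECT TRY_CAST([value] AS INT) AS OrderID
FROM STRING_SPLIT(@IDList, '|');   -- the separator must be a single character
```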


I remember working for a company where it took 3 days to run their monthly reports. When I looked at the process, the data was being massaged on the client side. CPU and memory intensive! As soon as we moved that data to the data tier and moved the business logic into a stored procedure, BAM! It took less than 5 minutes to process the same amount of data (over 2 million records).


You can also try "Table Valued Parameters". See the following Google search.
https://www.google.com/search?q=table+valued+parameters+sql+server


From our AngularJS applications we send our arrays as XML (for some reason I am addicted to XML) via our REST API. On the other end, our stored procedure parses the XML to do whatever needs to be done. I am not sure how the XML size limitations compare to VARCHAR(MAX) or whatever data type is recommended to send down the pipeline, but I would go with what @JeffModen recommends for sure!
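For anyone curious, a sketch of what that parsing can look like on the SQL side using the XML type's nodes() and value() methods. The element and attribute names here are made up; match them to whatever shape your application actually sends:

```sql
-- Hypothetical payload shape: one <row> element per record.
DECLARE @Payload XML = N'
<rows>
    <row id="1" name="Widget" />
    <row id="2" name="Gadget" />
</rows>';

-- Shred the XML into a relational rowset
SELECT  r.value('@id',   'INT')           AS ID
     ,  r.value('@name', 'NVARCHAR(50)')  AS [Name]
FROM @Payload.nodes('/rows/row') AS t(r);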


Hi Jeff,

That is pretty cool. I had never heard of table-valued parameters, so I clicked on your provided link and went to the first result, which is from Microsoft. I played with Microsoft's code sample and came up with this sample for my personal script library. This is for the OP as well. Thanks.

--https://docs.microsoft.com/en-us/sql/relational-databases/tables/use-table-valued-parameters-database-engine?view=sql-server-2017
--table-valued parameters perform well for inserting less than 1000 rows.
USE [ReportServer]
GO  

--IF EXISTS (SELECT * FROM sys.objects WHERE [NAME] = 'tblTest' AND [TYPE] = 'u') DROP TABLE [dbo].[tblTest]; 
IF OBJECT_ID('[dbo].[tblTest]', 'U') IS NOT NULL DROP TABLE [dbo].[tblTest]; 
--IF OBJECT_ID('tempdb..#tempdbname') IS NOT NULL DROP TABLE #tempdbname;

CREATE TABLE [dbo].[tblTest]
(
	  [ID]					INT IDENTITY(1,1)	NOT NULL
	, [Main]				NVARCHAR(100)		NULL
	, [Field_Number]		NVARCHAR(20)		NULL
	, [Field_Alpha]			NVARCHAR(20)		NULL
) ON [PRIMARY]
GO
----------------------------------------------------------------------
/* Drop the procedure first: a type cannot be dropped while a procedure still references it. */
IF OBJECT_ID('dbo.usp_InsertProductionLocation', 'P') IS NOT NULL
    DROP PROCEDURE dbo.usp_InsertProductionLocation;
GO

IF TYPE_ID('dbo.tLocationTableType') IS NOT NULL DROP TYPE dbo.tLocationTableType;
GO

/* Create a table type. */
CREATE TYPE dbo.tLocationTableType AS TABLE
(
	  LocationName			VARCHAR(50)
	, CostRate				INT
)
GO
----------------------------------------------------------------------
/* Create a procedure to receive data for the table-valued parameter. */

CREATE PROCEDURE dbo.usp_InsertProductionLocation  
    @TVP			tLocationTableType	READONLY  
AS
BEGIN
	SET NOCOUNT ON  
	INSERT INTO [dbo].[tblTest]  
	(
		  --[ID]
		  [Main]
		, [Field_Number]
		, [Field_Alpha]
	)  
	SELECT LocationName, CostRate, GETDATE() FROM @TVP;  -- GETDATE() fills Field_Alpha via implicit conversion
END
GO
----------------------------------------------------------------------
/* Declare a variable that references the type. */  
DECLARE @TestTVP AS tLocationTableType;  

/* Add data to the table variable. */  
INSERT INTO @TestTVP (LocationName, CostRate)  
SELECT [RoleName], 55
FROM [ReportServer].[dbo].[Roles]  

/* Pass the table variable data to a stored procedure. */  
EXEC usp_InsertProductionLocation @TestTVP;  
GO

SELECT 
	  [ID]
	, [Main]
	, [Field_Number]
	, [Field_Alpha]
FROM [tblTest]
----------------------------------------------------------------------
/*
SELECT TOP (1000) [RoleID]
      ,[RoleName]
      ,[Description]
      ,[TaskMask]
      ,[RoleFlags]
FROM [ReportServer].[dbo].[Roles]
*/

Thanks for the feedback.

XML for parameter passing isn't so bad except for the toll it takes on the pipe. It seems (to me) that XML typically takes 8 to 16 times more bytes to transmit than a CSV, for example.
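A toy illustration of where that overhead comes from; the same two made-up rows as XML and as delimited text (the exact ratio depends on element/attribute names and row width):

```sql
DECLARE @AsXml VARCHAR(MAX) =
    '<rows><row id="101" name="Widget" /><row id="102" name="Gadget" /></rows>';
DECLARE @AsCsv VARCHAR(MAX) =
    '101,Widget|102,Gadget';

SELECT DATALENGTH(@AsXml) AS XmlBytes   -- markup repeats the names for every row
     , DATALENGTH(@AsCsv) AS CsvBytes;  -- only delimiters separate the values
```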


I concur, but... who is up for some benchmarking?

"It depends". What would you like to include in the testing?

XML vs CSV vs TVP as a parameter