Urgent Help in SQL

I want to know the best way to delete records from a production DB. The scenario is: the user can pick up a file from the application to delete the whole batch. The file could contain 30 rows or 200,000 rows. Right now I am just using a simple DELETE statement: DELETE MYTABLENAME WHERE ID = BATCHID. Now I am scared to use this statement because it could possibly lock the table. I would like to know from all the gurus out there what the best way is to delete the records in production. I am not deleting records manually. Is the best way to use TOP 1000 and then the next 1000...? If yes, could you please show me how it will work across 4 different tables? Please guide me, it is urgent.

Thank You.

I didn't get what you meant by "the user can Pick Up the file from Application to delete the whole batch".

Regardless, delete in small batches (and issue a CHECKPOINT between batches if you are in the simple recovery model, or take log backups otherwise). There are detailed examples here.
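A batched delete might look something like this (a minimal sketch; the table name, column name, and batch size are placeholders, and the CHECKPOINT applies only under the simple recovery model):

```sql
-- Sketch: delete a batch's rows 5000 at a time; names are hypothetical.
DECLARE @BatchID INT = 42;
DECLARE @rc INT = 1;

WHILE @rc > 0
BEGIN
    DELETE TOP (5000)
    FROM dbo.MyTableName
    WHERE ID = @BatchID;

    SET @rc = @@ROWCOUNT;

    CHECKPOINT;  -- simple recovery model: allows log space to be reused
END;
```

Each iteration is its own implicit transaction, so the log never has to hold more than one batch's worth of changes.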

Sorry for the confusion. I was saying the user will pick up the file/batch through the front-end application to delete the batch. Two questions:
1) Because I don't know the file size (it could be big or small), is it a good idea to delete the batch in small sets?
2) Will it help if I delete the batch in a loop?

Please advise. BTW thanks for your reply.

Is the table clustered on BatchId? If so, you can DELETE in rather large chunks, 5K-10K rows, without any real issue.

In one table, yes, a clustered index; the other three tables have non-clustered indexes.

Does the SQL below look good to you? My understanding of the DELETE statement with a loop below is that it will delete the top 5000 rows, then the next 5000 rows, and so on if there are more than 5000 rows. Am I right?
Second question: should I use BEGIN TRAN and COMMIT TRAN table by table?

BEGIN TRANSACTION trans

	DECLARE @Rowcount INT = 1
	WHILE @Rowcount > 0
	BEGIN
		DELETE TOP (5000) 
		FROM [FirstTable-Big]
		Where ID IN @BatchID
	SET @Rowcount = @@ROWCOUNT
	END

Delete MySecondTable-Small
Where ID in (@BatchID)

	DECLARE @Rowcount INT = 1
	WHILE @Rowcount > 0
	BEGIN
		DELETE TOP (5000) 
		FROM [ThirdTable-Big]
		Where ID IN @BatchID
	SET @Rowcount = @@ROWCOUNT
	END


	DECLARE @Rowcount INT = 1
	WHILE @Rowcount > 0
	BEGIN
		DELETE TOP (5000) 
		FROM [FourthTable-Big]
		Where ID IN @BatchID
	SET @Rowcount = @@ROWCOUNT
	END

COMMIT TRANSACTION trans

Your syntax does not seem right. What is in @BatchId?

Regarding whether or not to use a transaction: it depends on whether you want to allow partial deletes to stick if there is an error. If you do choose to delete in a transaction, insert this before the start of the script:

SET XACT_ABORT ON;
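For instance, the all-or-nothing version might be sketched like this (table names are hypothetical; with XACT_ABORT on, any run-time error rolls back the entire transaction):

```sql
SET XACT_ABORT ON;  -- a run-time error aborts and rolls back the whole batch

BEGIN TRANSACTION;

    DELETE FROM dbo.FirstTableBig   WHERE ID = @BatchID;
    DELETE FROM dbo.SecondTableSmall WHERE ID = @BatchID;
    DELETE FROM dbo.ThirdTableBig   WHERE ID = @BatchID;
    DELETE FROM dbo.FourthTableBig  WHERE ID = @BatchID;

COMMIT TRANSACTION;
```

Note that wrapping everything in one transaction means the log must hold all the deletes until the commit, which is exactly the trade-off discussed below.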

We don't usually delete more than 10K rows at a time; we then use WAITFOR to delay for a couple of seconds before looping to delete the next block of rows.

However, that is not atomic - if we hold a transaction for the duration of the delete, we might as well do it in a single statement - the strain on the TLog will be the same, etc.
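The batch-plus-delay pattern described above might be sketched as follows (not atomic by design; table and column names are placeholders):

```sql
-- Sketch: delete up to 10K rows per iteration, pausing between batches
-- so other workloads get a chance at the table. Names are hypothetical.
DECLARE @rc INT = 1;

WHILE @rc > 0
BEGIN
    DELETE TOP (10000)
    FROM dbo.MyTableName
    WHERE ID = @BatchID;

    SET @rc = @@ROWCOUNT;

    IF @rc > 0
        WAITFOR DELAY '00:00:02';  -- breathe for a couple of seconds
END;
```

Because each batch commits independently, a failure partway through leaves earlier batches deleted - acceptable for a purge, but not if the delete must be all-or-nothing.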

Another way might be to have a "Deleted" BIT flag and UPDATE that to set it (in place of any actual delete). All queries would need to exclude rows with [Deleted] = 1 - so all queries would need to be changed (or rename the table to YourTableName_V2 and create a view with the original YourTableName which excludes any row where [Deleted] = 1). Then you can have a physical-delete batch process - e.g. running in the middle of the night. It won't matter that it is not atomic, and it can delete in small-ish batches in a way that won't strain the system, be greedy for log disk space, nor steal all the CPU that users need.
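A rough sketch of that soft-delete arrangement (all object and column names here are hypothetical):

```sql
-- Sketch: soft delete via a flag, hidden behind a view with the old name.
-- Assumes the table has already been renamed to YourTableName_V2.
ALTER TABLE dbo.YourTableName_V2
    ADD [Deleted] BIT NOT NULL
        CONSTRAINT DF_YourTableName_Deleted DEFAULT (0);
GO

CREATE VIEW dbo.YourTableName
AS
    SELECT ID, Col1, Col2          -- list the real columns explicitly
    FROM dbo.YourTableName_V2
    WHERE [Deleted] = 0;
GO

-- The "delete" the application performs is now a fast flag update:
UPDATE dbo.YourTableName_V2
SET [Deleted] = 1
WHERE ID = @BatchID;

-- An overnight job then physically removes flagged rows in small batches.
```

Existing queries keep working against dbo.YourTableName unchanged, while the physical purge can run whenever the system is quiet.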


Of course!! If you put it in a transaction, then there is no point in doing all the rigmarole of deleting in chunks and all that stuff! Thanks Krisen!!

Better to use a loop though (for large numbers of deletes)? or have you got a different idea / viewpoint?