SQLTeam.com | Weblogs | Forums

Trigger with WHERE clause



Is it possible to have a Trigger, that only triggers if a WHERE clause is met?

Something like this,

CREATE TRIGGER dba.new_840
ON dba.proditem where prodref like '840%'
declare prodref varchar(25)
exec create_links

IF UPDATE(prodref)
    IF prodref LIKE '840%' EXECUTE create_links;

Might this work?


Thanks - with me being an SQL newb, am I replacing my trigger with this statement or adding it to my existing statement?


The trigger will be called once, however many rows are in the INSERT statement, so you need to allow for that - it's a common misconception that a trigger is called once for each row, sorry :frowning: but that's not the case.

Your Procedure will only operate on a single row (well ... it could operate on multiple rows, but it's a fair bit of work to transfer a "list" of all the IDs/PKeys of the newly inserted rows to the Procedure ... so I'll assume you don't want to do that).

Thus I think the only way you could do this is to loop round the rows (which the Trigger will make available in the pseudo table [inserted] ) and then EXEC your create_links procedure for each one.

I expect your create_links procedure is going to need a parameter to know which row to process? (Unless it processes all newly-inserted-rows perhaps?)

Something like this perhaps

CREATE TRIGGER dba.new_840
ON dba.proditem
AFTER INSERT
AS
BEGIN
    DECLARE @MyPKey int = 0,
            @intRowCount int = 1    -- Force first iteration

    WHILE @intRowCount <> 0
    BEGIN
        SELECT TOP 1 @MyPKey = MyPKey
        FROM inserted
        WHERE MyPKey > @MyPKey
              AND prodref LIKE '840%'
        ORDER BY MyPKey    -- walk the keys in order so no row is skipped
        SELECT @intRowCount = @@ROWCOUNT

        IF @intRowCount = 1 EXEC create_links @MyPKey = @MyPKey
    END
END


Hi Kristen,

Thanks for the response.

Just so I am clear: will I need to clear down the [inserted] table once the rows have been inserted into the 'real' table?


I'd use a temp table, or a table variable passed to the proc, to avoid having to loop in a trigger. For example:

CREATE TRIGGER dba.new_840
ON dba.proditem
AFTER INSERT
AS
IF EXISTS(SELECT 1 FROM inserted WHERE prodref LIKE '840%')
BEGIN
    IF OBJECT_ID('tempdb.dbo.#proditem_new_840') IS NOT NULL
        DROP TABLE #proditem_new_840
    SELECT prodref
    INTO #proditem_new_840
    FROM inserted
    WHERE prodref LIKE '840%'    -- only queue the rows the proc should handle
    EXEC dba.create_links --stored proc uses table #proditem_new_840, if it exists, and processes those row(s)
END /*IF*/
GO --end of trigger


No, the rows will already be present in the target table (by default Triggers are AFTER, but it is also possible to specify them as INSTEAD OF - in which case then YOU have to insert any data, or not, into the actual table).
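For comparison, an INSTEAD OF trigger might look something like this (a sketch only - the column list is invented for illustration); note that the trigger itself has to perform the actual INSERT, or the data never reaches the table:

-- Hypothetical INSTEAD OF trigger: the original INSERT is intercepted,
-- and the trigger body is responsible for writing (or not writing) the rows
CREATE TRIGGER dba.new_840_instead
ON dba.proditem
INSTEAD OF INSERT
AS
INSERT INTO dba.proditem (prodref)   -- column list assumed for this example
SELECT prodref
FROM inserted
GO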


Looking at your other posts it looks as though you could possibly be using ASA and not SQL Server.

If you are using ASA, you should really look for advice on an ASA forum as Triggers have different performance implications and different ways of working depending on the vendor.
(I think ASA may have the option of using row level triggers.)

With SQL Server, triggers really need to be kept as simple as possible, otherwise they are a performance issue waiting to happen. If possible, I try to avoid triggers, and I would certainly not be keen to have a loop in one. When anything complicated needs to happen in a trigger, I normally make it asynchronous, either by using Service Broker or just by inserting the basic data into another table to be dealt with by another process. Making triggers asynchronous allows your normal transactions to commit as quickly as possible, but it can get quite complicated. You may want to google something like 'sql server asynchronous trigger'.
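To illustrate the "another table" variant of that idea (a sketch only - the queue table and its columns are invented for this example): the trigger just records which rows need work, in one fast set-based statement, and a separate scheduled job processes the queue later.

-- Hypothetical queue table; a separate job polls it and does the heavy work
CREATE TABLE dba.proditem_queue
(
    queue_id  int IDENTITY(1,1) PRIMARY KEY,
    prodref   varchar(25) NOT NULL,
    queued_at datetime NOT NULL DEFAULT GETDATE()
)
GO

CREATE TRIGGER dba.new_840_queue
ON dba.proditem
AFTER INSERT
AS
-- Set-based and quick to commit: just record which rows need processing
INSERT INTO dba.proditem_queue (prodref)
SELECT prodref
FROM inserted
WHERE prodref LIKE '840%'
GO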


Agree 100% with that. We avoid any complicated processes in triggers, but we do use them for:

Final data validation - to prevent anything goofy getting into the DB. (This is mainly done in a middle layer, but we can't prevent everything there - clients and their "other trusted parties" [that's a joke!] usually require some sort of direct access to the tables.)

Copying the "before" data into an Audit / History / Archive table

Placing some sort of indicator (either an ID, some columns, or the whole row) into a Queue Table, or Service Broker - to be processed by something else

All these operations can easily be performed on a set of data (e.g. if the INSERT, UPDATE or DELETE is multi-row) and IME perform very quickly. We have triggers on every table, Archive tables on every table, the triggers for all those tables do the same tasks of basic data validation and copy-to-Archive-table, and we've never had a problem with performance [when using them in that way]
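As a rough sketch of the copy-to-Archive idea (the archive table and its columns are invented here), the [deleted] pseudo table gives you the "before" image of every affected row in a single set-based statement, so it handles multi-row UPDATEs and DELETEs with no loop:

-- Hypothetical archive trigger: copy the "before" rows on every change
CREATE TRIGGER dba.proditem_audit
ON dba.proditem
AFTER UPDATE, DELETE
AS
INSERT INTO dba.proditem_archive (prodref, archived_at)
SELECT prodref, GETDATE()
FROM deleted
GO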


@Ifor @Kristen - Thank you both for your help, I would like to say I understand it fully, but I'm not quite there yet.

I have to use both ASA and SQL Server in my day to day job (Marketing/IT), and I generally find SQL Server forums better.