
Slow trigger performance with big update batches

I have an update trigger that inserts into auditing tables. We had no problem until someone decided to update over 1 million records... (That's my bad: I didn't think it would be a problem when I developed it.) Now, facing reality, I need to find a solution...

I've been doing a lot of tests and research to try to figure out how to fix the trigger's poor performance. I've come to the conclusion that, to minimize the cost of the "Table Insert" operator in the execution plan, I need to insert in smaller batches.

The question is: since I'm not sure where all the different updates can come from, how can I insert the auditing records in batches from within the trigger itself?

For example, an update of 1 million records on the main table would fire the trigger, which would then insert the audit records 100 thousand at a time in some kind of loop, along the lines of the sketch below.
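
To make the idea concrete, here is a rough, untested sketch of what I'm imagining inside the trigger (the @BatchSize value, the #changed_rows temp table name, and the ROW_NUMBER ordering are just placeholders; the ChangedColumns logic is left out for brevity):

Create TRIGGER PriceHist_trig_U ON MyPriceTable FOR UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Stage the changed rows once, numbered so the loop can take slices of them
    SELECT ROW_NUMBER() OVER (ORDER BY INS.SKU, INS.PriceGroup, INS.PriceLevel) AS rn,
           INS.SKU, INS.PriceGroup, INS.PriceLevel, INS.Price, INS.Qty, INS.ManyOtherColumns
    INTO #changed_rows
    FROM INSERTED INS
    JOIN DELETED DEL ON DEL.SKU = INS.SKU
                    AND DEL.PriceGroup = INS.PriceGroup
                    AND DEL.PriceLevel = INS.PriceLevel
    WHERE INS.Price <> DEL.Price
       OR INS.Qty <> DEL.Qty
       OR INS.ManyOtherColumns <> DEL.ManyOtherColumns;

    DECLARE @BatchSize int = 100000;
    DECLARE @From int = 1;
    DECLARE @Max int = (SELECT MAX(rn) FROM #changed_rows);

    -- Insert the audit rows one slice at a time
    WHILE @From <= @Max
    BEGIN
        INSERT INTO price_history (SKU, PriceGroup, PriceLevel, Price, Qty, ManyOtherColumns, HistoryDate, ChangedColumns)
        SELECT SKU, PriceGroup, PriceLevel, Price, Qty, ManyOtherColumns, getdate(), ''  -- ChangedColumns logic omitted here
        FROM #changed_rows
        WHERE rn BETWEEN @From AND @From + @BatchSize - 1;

        SET @From = @From + @BatchSize;
    END
END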

Is this possible? If so, how would you suggest doing it? If not, how else can I improve the table insert in the execution plan?

Test scripts added to reproduce the issue:

This is a simplified version of the real thing:

-- drop trigger PriceHist_trig_U
-- drop table MyPriceTable
-- drop table price_history

Create Table MyPriceTable (
    SKU varchar(13),
    PriceGroup varchar(5),
    PriceLevel int,
    Price float,
    Qty float,
    ManyOtherColumns varchar(100),
    CONSTRAINT [PRICE_TAB_P01] PRIMARY KEY CLUSTERED
    (
        SKU ASC,
        PriceGroup ASC,
        PriceLevel ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
            ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

Declare @Id int
Set @Id = 1
While @Id <= 1000000
Begin
    insert into MyPriceTable
    values (right('000000000000' + CAST(@Id as nvarchar(10)), 13),
            'Grp ' + CAST(@Id % 10 as nvarchar(10)),
            @Id % 3,
            RAND() * (25 - 10) + 10,
            1,
            'there are many other columns')
    Print @Id
    Set @Id = @Id + 1
End

-- Drop table price_history
create table price_history (
    SKU varchar(13),
    PriceGroup varchar(5),
    PriceLevel int,
    Price float,
    Qty float,
    ManyOtherColumns varchar(100),
    historyDate datetime,
    ChangedColumns varchar(max)
)

CREATE NONCLUSTERED INDEX price_history_nc1 ON price_history
(
    HistoryDate ASC,
    SKU ASC,
    PriceGroup ASC,
    PriceLevel ASC
)
go

Create TRIGGER PriceHist_trig_U ON MyPriceTable FOR UPDATE
AS
INSERT INTO price_history (SKU, PriceGroup, PriceLevel, Price, Qty, ManyOtherColumns, HistoryDate, ChangedColumns)
SELECT INS.SKU, INS.PriceGroup, INS.PriceLevel, INS.Price, INS.Qty, INS.ManyOtherColumns, getdate(),
       CASE WHEN update(Price) and INS.Price <> DEL.Price THEN 'Price-' ELSE '' END +
       CASE WHEN update(Qty) and INS.Qty <> DEL.Qty THEN 'Qty-' ELSE '' END +
       CASE WHEN update(ManyOtherColumns) and INS.ManyOtherColumns <> DEL.ManyOtherColumns THEN 'other-' ELSE '' END
FROM INSERTED INS
JOIN DELETED DEL ON DEL.SKU = INS.SKU
                AND DEL.PriceGroup = INS.PriceGroup
                AND DEL.PriceLevel = INS.PriceLevel
WHERE (update(Price) and INS.Price <> DEL.Price)
   OR (update(Qty) and INS.Qty <> DEL.Qty)
   OR (update(ManyOtherColumns) and INS.ManyOtherColumns <> DEL.ManyOtherColumns)
go

/* tests */
update MyPriceTable set price = price - 1

When I run this with the trigger disabled, the update completes in 2 seconds. With the trigger enabled, it takes 32 seconds. The execution plan shows 98% of the cost on the "Table Insert" operator.
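
For reference, this is roughly how I compare the two runs (the timings above come from SSMS; DISABLE/ENABLE TRIGGER and SET STATISTICS TIME are just the standard T-SQL way to reproduce the comparison):

-- Baseline: run the update with the trigger disabled
DISABLE TRIGGER PriceHist_trig_U ON MyPriceTable;
SET STATISTICS TIME ON;
update MyPriceTable set price = price - 1;   -- ~2 seconds
SET STATISTICS TIME OFF;

-- Same update with the trigger firing
ENABLE TRIGGER PriceHist_trig_U ON MyPriceTable;
SET STATISTICS TIME ON;
update MyPriceTable set price = price - 1;   -- ~32 seconds
SET STATISTICS TIME OFF;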

I've been trying to figure out how to improve that table insert, but I can't find anything concrete.

I've also tried a clustered index on the history table, and the performance was worse.
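
(To be clear about what I mean by a clustered index here: something along these lines, replacing the nonclustered index above. The exact definition is illustrative only, not copied from my real script.)

-- Illustrative only: clustered instead of nonclustered index on the history table
DROP INDEX price_history_nc1 ON price_history;
CREATE CLUSTERED INDEX price_history_c1 ON price_history
(
    HistoryDate ASC,
    SKU ASC,
    PriceGroup ASC,
    PriceLevel ASC
);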

Any help would be appreciated.

