"No one is harder on a talented person than the person themselves" - Linda Wilkinson ; "Trust your guts and don't follow the herd" ; "Validate direction not destination" ;

July 19, 2009

Deleting Huge Recordset from Table in batches

While deleting a set of rows matching a condition from a huge table (for example, millions of records), we would get a transaction log full error. The best practice is to do a batch delete. A good example is provided here; one way, using a recurring step, is mentioned here.

I would recommend the solution mentioned here:

--CREATE Testt Table
create table testt
(id int identity primary key not null
,name char(10)
)

--Insert Entries
declare @i int
set @i=0
while (@i <10000)
  begin
      insert into testt (name)
      select 'name'+CONVERT(varchar(10),@i)  --varchar(10) avoids the char(30) default padding
      set @i=@i+1
  end

--Delete Entries
set rowcount 5000  --each delete below affects at most 5,000 rows
while 1=1
  begin
      delete from testt where id > 1000
      if @@ROWCOUNT = 0
      break;
  end
set rowcount 0  --reset so subsequent statements are not limited

select count(1) from testt (nolock)
--1000

An additional benefit is that once each batch is committed, the portion of the log file it used can be reused, since that data is already committed. Also, locks are held only on the range of records in each batch, so the possibility of escalating to a table lock is lower.

Lock escalation is triggered when a single Transact-SQL statement acquires at least 5,000 locks on a single nonpartitioned table or index (source here).
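
Given that threshold, keeping the batch size below 5,000 also helps. Here is a minimal sketch of the same delete written with DELETE TOP (n) (assumes SQL Server 2005 or later; the batch size of 4000 is an arbitrary value kept under the 5,000-lock threshold):

--Batched delete using DELETE TOP (sketch)
while 1=1
  begin
      delete top (4000) from testt where id > 1000
      if @@ROWCOUNT = 0
      break;
  end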

Performing batched updates
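
The same loop pattern works for updates. A minimal sketch against the same testt table (the 'upd' value is only an illustration):

--Batched update (sketch): each statement touches at most 4000 rows
while 1=1
  begin
      update top (4000) testt
      set name = 'upd'
      where name <> 'upd'
      if @@ROWCOUNT = 0
      break;
  end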

While using BCP for faster inserts, the recovery model of the target database must be either Simple or Bulk-Logged. From MSDN link. Another reference link.
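
As a quick sketch of checking and switching the recovery model before a bulk load (TestDb is a placeholder database name; revert to Full after the load if that is your normal model):

--Check the current recovery model (TestDb is a placeholder name)
select name, recovery_model_desc from sys.databases where name = 'TestDb'

--Switch to Bulk-Logged for the load, then revert
alter database TestDb set recovery bulk_logged
--run bcp / BULK INSERT here
alter database TestDb set recovery full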

DELETE TOP x rows avoiding a table scan
Update Record in Batches - TSQL coding patterns I
Gradually Deleting Data in SQL Server

Happy Learning!!!
