Paul van Oordt SQL Services
Freelance SQL Server specialist and troubleshooter, Utrecht / Antwerp
0627400408 - WhatsApp - info@vanoordt.nl
LinkedIn - curriculum vitae
vanoordt.nl - Nederlands
newsletter
Every now and then I teach a few days of SQL Server programming, and often the class and I work through the example of a bank transfer. Below you will find the entire implementation, and a nice illustration of the performance that SQL Server achieves.
If you don't want to receive my newsletter, let me know.
Best regards,
Paul van Oordt
PS I saw this one coming for some time: last September I celebrated my 25-year freelance jubilee. What a great profession I have chosen. I will keep on SQL Servering for a while!
The bank transfer is the standard example of a transaction: there is a debit and a credit, and either both or neither must be made. There is also a balance check, a record that the transfer has been made, and postponement if the balance check fails. The concurrency issues are non-trivial, requiring an update lock and serializable isolation. In a course it is an ideal case for discussing things such as server-side programming, stored procedures, transactions, locking and error handling. And of course, as with every query, performance and scalability.
The implementation below involves three tables: account, account_mutation and transfer. In addition to the inserts and updates, there are two queries: finding the transfers to be made and (especially) the balance check, based on an opening balance in the account table plus the account mutations. Both queries are made scalable with a good index.
Transfers must be processed in a loop, because each one must succeed or fail on its own. So we get one short procedure that is executed once per transfer.
The code is concise and understandable and should perform well. But how well?
I populated the tables with 10,000 accounts and a million transfers to be processed, leading to two million account mutations if all balance checks succeed.
The performance: 100,000 transfers in 25 seconds, 4,000 per second, when there is an average of 100 account mutations to be summed per account. On my laptop (with power supply plugged in). On one core. That is how fast SQL Server is!
The entire code including test data can be found below. And if you and your colleagues want to learn more about SQL Server programming, take a look at my site: SQL Server crash course. I look forward to visiting you.
Thanks for reading.
use master
go
create database bank_test;
go
use bank_test;
go
create table dbo.account
(accno int primary key,
begin_balance dec(18) not null default (0),
last_processed_mutation int not null default (0), -- mutations up to this id are included in begin_balance
minimum_balance dec(18) not null default (0));
go
create table dbo.[transfer]
(id int identity not null primary key,
debaccno int not null references dbo.account (accno),
credaccno int not null references dbo.account (accno),
amount dec(18) not null,
fromdate date not null default (getUTCdate()),
processed bit not null default (0));
go
-- index to find the transfers to be processed
-- especially if this is a small subset
create index idx_transfer_not_processed on dbo.[transfer] (id)
include (debaccno, credaccno, amount, fromdate, processed)
where processed = 0;
go
create table dbo.account_mutation
(id int identity not null primary key,
[transfer] int not null references [transfer] (id),
accno int not null references account (accno),
amount dec(18) not null,
date_time datetime2 not null default (getUTCdate()));
go
-- index to determine the balance
-- this index fragments and range-scans are done; so it should be defragmented now and then
create index idx_account_mutation_accno on dbo.account_mutation (accno, id) include (amount);
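Since new mutations are inserted for every account between existing keys, this index fragments quickly. A minimal maintenance sketch (the choice between reorganize and rebuild, and the scheduling, are assumptions, not part of the example):

```sql
-- hedged sketch: occasional maintenance for the fragmenting index;
-- reorganize runs online and is cheap, rebuild is more thorough
alter index idx_account_mutation_accno on dbo.account_mutation reorganize;
-- or, during a maintenance window:
-- alter index idx_account_mutation_accno on dbo.account_mutation rebuild;
```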
go
-- test data
-- 10K accounts
with n0 as (select 1 as c union all select 1) -- 2 rows
,n1 as (select 1 as c from n0 as a cross join n0 as b) -- 4 rows
,n2 as (select 1 as c from n1 as a cross join n1 as b) -- 16 rows
,n3 as (select 1 as c from n2 as a cross join n2 as b) -- 256 rows
,n4 as (select 1 as c from n3 as a cross join n3 as b) -- 65536 rows
,nums as (select row_number() over(order by (select null)) as n from n4)
insert dbo.account (accno, begin_balance, minimum_balance)
select n, n%10*100, -n%10*100
from nums
where n <= 10000;
-- 1M transfers to be processed
with n0 as (select 1 as c union all select 1) -- 2 rows
,n1 as (select 1 as c from n0 as a cross join n0 as b) -- 4 rows
,n2 as (select 1 as c from n1 as a cross join n1 as b) -- 16 rows
,n3 as (select 1 as c from n2 as a cross join n2 as b) -- 256 rows
,n4 as (select 1 as c from n3 as a cross join n3 as b) -- 65536 rows
,n5 as (select 1 as c from n4 as a cross join n2 as b) --1048576 rows
,nums as (select row_number() over(order by (select null)) as n from n5)
insert dbo.[transfer] (debaccno, credaccno, amount)
select (n%10000)+1, ((n+1)%10000)+1, n%100
from nums
where n <= 1000000;
go
-- inline table valued function to determine current and minimum balance
create or alter function dbo.tvf_acc_balance (@accno int) returns table
as
return (
select isnull(sum(AM.amount), 0)+A.begin_balance balance, A.minimum_balance
from dbo.account A
left join dbo.account_mutation AM with (updlock)
on AM.accno = A.accno
and AM.id > A.last_processed_mutation
where A.accno = @accno
group by A.accno, A.begin_balance, A.minimum_balance, A.last_processed_mutation
);
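The function can also be queried on its own, for example for a single account (account number 1 is just an illustration):

```sql
-- current and minimum balance for one account
select balance, minimum_balance
from dbo.tvf_acc_balance(1);
```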
go
-- procedure for processing a single transfer
create or alter proc dbo.usp_transfer
@id int
,@debaccno int
,@credaccno int
,@amount dec(18)
as
begin try;
begin tran;
-- balance check
if exists
(select *
from dbo.tvf_acc_balance(@debaccno)
where balance - @amount >= minimum_balance)
begin; -- balance is ok
-- credit the receiving account
insert dbo.account_mutation (accno, amount, [transfer])
values (@credaccno, @amount, @id);
-- debit the sending account
insert dbo.account_mutation (accno, amount, [transfer])
values (@debaccno, -@amount, @id);
-- status update
update dbo.[transfer]
set processed = 1
where id = @id;
end;
else
begin; -- insufficient balance, postpone processing by 1 day
update dbo.[transfer]
set fromdate = dateadd(dd, 1, getUTCdate())
where id = @id;
end;
commit tran;
end try
begin catch
-- this transfer fails on its own; release the transaction if still open
if @@trancount > 0 rollback tran;
end catch;
go
-- process batch of transfers
create or alter proc process_transfers_serially @number int
as
set nocount on;
set transaction isolation level serializable;
declare @id int
,@debaccno int
,@credaccno int
,@amount dec(18);
declare c_transfer cursor local fast_forward for
select top (@number) id, debaccno, credaccno, amount
from dbo.[transfer]
where processed = 0
and fromdate <= getUTCdate()
order by id;
open c_transfer;
fetch next from c_transfer into @id, @debaccno, @credaccno, @amount;
while @@fetch_status = 0
begin;
exec dbo.usp_transfer @id, @debaccno, @credaccno, @amount;
fetch next from c_transfer into @id, @debaccno, @credaccno, @amount;
end;
close c_transfer;
deallocate c_transfer;
go
-- test batch of 100,000
exec process_transfers_serially @number = 100000;
-- results
select count(*) from dbo.account_mutation;
select top 100 * from dbo.account_mutation;
select top 100 A.accno, A.minimum_balance, A.begin_balance, S.balance
from account A
cross apply dbo.tvf_acc_balance(A.accno) S;
select top 100 * -- transfers postponed
from [transfer]
where fromdate > getUTCdate();
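The column last_processed_mutation implies a periodic roll-up step: folding summed mutations into begin_balance so that the balance query has fewer rows to aggregate. That step is not part of the example above; a hedged sketch, assuming no transfers are being processed while it runs:

```sql
-- fold unsummarized mutations into begin_balance
-- (run only while no transfers are executing)
update A
set begin_balance = A.begin_balance + M.total
   ,last_processed_mutation = M.max_id
from dbo.account A
cross apply
   (select sum(AM.amount) as total, max(AM.id) as max_id
    from dbo.account_mutation AM
    where AM.accno = A.accno
      and AM.id > A.last_processed_mutation) M
where M.max_id is not null;
```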
December 2022; the newsletter appears a couple of times per year; subscribe / unsubscribe