.NET Core, async/await and TransactionScope

Most of our code uses a Repository Pattern for database access.  We create a new connection in each method, make the call, usually with Dapper, then Dispose the connection.  This served us well until recently, when we tried to combine a TransactionScope with async/await calls in the repositories.

public async Task<IEnumerable<int>> GetBatch(int count, int status)
{
    using (var connection = _orderDbConnectionProvider.GetNewConnection())
    {
        return await connection.QueryAsync<int>("GET_ORDER_BATCH",
            new { count, status },
            commandType: CommandType.StoredProcedure);
    }
}

Internally, TransactionScope uses thread-local storage by default, which doesn’t work with async/await, because you don’t know which thread the code after an await will run on.  That’s an easy fix.  It’s even easier if you have a helper class to create transactions.

public TransactionScope CreateTransactionScope()
{
    var transactionOptions = new TransactionOptions
    {
        IsolationLevel = IsolationLevel.ReadCommitted,
        Timeout = TransactionManager.MaximumTimeout // use a sensible value here
    };
    return new TransactionScope(TransactionScopeOption.Required,
        transactionOptions,
        TransactionScopeAsyncFlowOption.Enabled);
}

The “TransactionScopeAsyncFlowOption.Enabled” argument is what fixes async/await.  Once that was in place, we still had a failing integration test.  We tend to test with NUnit, exercising the real database calls, to make sure everything is mapped to the stored procedures correctly.  We often use a TransactionScope there to roll back changes made to the database during the test.  It is our development database, but we still don’t want to clutter it.

[TestFixture, Category("Integration")]
public class OrderRepositoryTest
{
    private IOrderRepository _orderRepository;
    private TransactionScope _transactionScope;
    private const string _connectionString = "Data Source=devdb;Initial Catalog=somedatabase;Integrated Security=SSPI;";

    [SetUp]
    public void Setup()
    {
        _transactionScope = new TransactionScope(TransactionScopeOption.Required,
            new TransactionOptions
            {
                IsolationLevel = IsolationLevel.ReadCommitted
            }, TransactionScopeAsyncFlowOption.Enabled);
        _orderRepository = new OrderRepository(new OrderDbConnectionProvider(_connectionString));
    }

    [TearDown]
    public void TearDown()
    {
        // Disposing without calling Complete() rolls the transaction back
        _transactionScope.Dispose();
    }

    [Test]
    public async Task UpdateOrderStatusTest()
    {
        var orderId = await _orderRepository.SaveOrder(1234, DateTimeOffset.Now, "1234", "1234", "1234", DateTimeOffset.Now, "1234", DateTimeOffset.Now);
        await _orderRepository.UpdateStatus(orderId, 2);
        var result = await _orderRepository.GetOrder(orderId);
        Assert.AreEqual(2, (int)result.OrderStatusId);
    }
}

The test code above works.  However, a previous version was missing one await call in the test method: await _orderRepository.UpdateStatus(orderId, 2);

The developer forgot the await because UpdateStatus doesn’t return a value, just a Task, so the compiler only emits a warning.  That let the next line execute before the update had finished and broke our transaction.  Once the missing await was added, everything worked.
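The failure mode is easy to reproduce in isolation.  The sketch below uses hypothetical names and a Task.Delay standing in for the database call; it is not our repository code, just an illustration of why the un-awaited version misbehaves:

```csharp
using System;
using System.Threading.Tasks;

class MissingAwaitDemo
{
    static int _status;

    // Stands in for UpdateStatus: the write happens after an async delay,
    // the way a real database round-trip would.
    static async Task UpdateStatus(int newStatus)
    {
        await Task.Delay(100);
        _status = newStatus;
    }

    static async Task Main()
    {
        _status = 1;
        UpdateStatus(2);              // BUG: not awaited (compiler warning CS4014)
        Console.WriteLine(_status);   // very likely still 1 - the update hasn't run yet

        _status = 1;
        await UpdateStatus(2);        // FIX: await even though there is no return value
        Console.WriteLine(_status);   // 2 - the update completed before we read
    }
}
```

The compiler flags the un-awaited call with warning CS4014; promoting that warning to an error in test projects catches this class of bug at build time.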

Batch Image Processing and Uploading to S3

I run another blog where my images are stored in AWS S3.  I also upload the images in two sizes, and don’t upload the original.  I use Linux on my personal computers, so I wrote a shell script to make this easier.

#!/bin/bash
# create full size images
mkdir -p ./"$1"/1000
mogrify -auto-orient -strip -resize 1000x1000 -quality 60 -path ./"$1"/1000 ./"$1"/*.jpg
# create thumbnails
mkdir -p ./"$1"/400
mogrify -auto-orient -strip -resize 400x400 -quality 60 -path ./"$1"/400 ./"$1"/*.jpg
# delete originals
trash-put ./"$1"/*.jpg
# upload to S3
aws s3 cp ./"$1" "s3://pics-fatguy-org/$1" --recursive

This assumes you have ImageMagick, aws-cli tools, and trash-cli installed.  You can skip trash-cli and change the trash-put line to rm.
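With trash-cli skipped, the delete step becomes a plain rm.  In this standalone snippet album_name stands in for the script’s $1 so it runs on its own:

```shell
# delete originals permanently instead of sending them to trash
album="${1:-album_name}"   # album folder, normally the script's $1
rm -f ./"$album"/*.jpg     # -f: don't error if the glob matches nothing
```

Unlike trash-put, there is no undo, so it is worth double-checking the folder name first.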

I keep the shell script, named fatguy, in the root folder of my local copy of the images.  There are subfolders, each an album.  I drop a copy of the full-size images into an album folder and run the script against it:

./fatguy album_name

This creates two subfolders “400” and “1000” with the resized images, deletes (to trash) the originals, then uploads what is left to my S3 bucket.

I may change it later to output the full URL of each image for easy pasting into WordPress, but this works well enough for now.
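That extension could look something like the sketch below, assuming virtual-hosted-style S3 URLs (the bucket name matches the script above; album_name stands in for $1 so the snippet runs on its own; adjust if the bucket sits behind a custom domain):

```shell
# print the public URL of every resized image in an album folder
album="${1:-album_name}"
bucket="pics-fatguy-org"
for size in 400 1000; do
    for img in ./"$album"/"$size"/*.jpg; do
        [ -e "$img" ] || continue   # skip when the glob matches nothing
        echo "https://${bucket}.s3.amazonaws.com/${album}/${size}/$(basename "$img")"
    done
done
```

Appending these lines to the script after the upload step would print one ready-to-paste URL per image.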