.NET Core web API + Dapper – Redis Cache

The IDistributedCache has Memory, SQL Server and Redis implementations so I wanted to explore how the Stack Exchange Redis library works. The ConnectionMultiplexer class in the Stack Exchange Redis library hides the details of managing connections to multiple Redis servers, connection timeouts etc. The object is fairly “chunky” so it should be initialized once and reused for the lifetime of the program.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddApplicationInsightsTelemetry();

    // Add services to the container.
    builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

    builder.Services.AddSingleton<IConnectionMultiplexer>(s => ConnectionMultiplexer.Connect(builder.Configuration.GetConnectionString("Redis")));

    var app = builder.Build();

    // Configure the HTTP request pipeline.
    app.UseHttpsRedirection();
    app.MapControllers();

    app.Run();
}

I trialed the initial versions of my Redis project with Memurai on my development machine, then configured an Azure Cache for Redis. I then load tested the project with several Azure AppService clients and there was a significant improvement in response time.

[ApiController]
[Route("api/[controller]")]
public class StockItemsController : ControllerBase
{
    private const int StockItemSearchMaximumRowsToReturn = 15;
    private readonly TimeSpan StockItemListExpiration = new TimeSpan(0, 5, 0);

    private const string sqlCommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]";
    //private const string sqlCommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]; WAITFOR DELAY '00:00:02'";

    private readonly ILogger<StockItemsController> logger;
    private readonly IDbConnection dbConnection;
    private readonly IDatabase redisCache;

    public StockItemsController(ILogger<StockItemsController> logger, IDapperContext dapperContext, IConnectionMultiplexer connectionMultiplexer)
    {
        this.logger = logger;
        this.dbConnection = dapperContext.ConnectionCreate();
        this.redisCache = connectionMultiplexer.GetDatabase();
    }

    [HttpGet]
    public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
    {
        var cached = await redisCache.StringGetAsync("StockItems");
        if (cached.HasValue)
        {
            return Content(cached, "application/json");
        }

        var stockItems = await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);

#if SERIALISER_SOURCE_GENERATION
        string json = JsonSerializer.Serialize(stockItems, typeof(List<Model.StockItemListDtoV1>), Model.StockItemListDtoV1GenerationContext.Default);
#else
        string json = JsonSerializer.Serialize(stockItems);
#endif

        await redisCache.StringSetAsync("StockItems", json, expiry: StockItemListExpiration);

        return Content(json, "application/json");
    }

...

    [HttpDelete()]
    public async Task<ActionResult> ListCacheDelete()
    {
        await redisCache.KeyDeleteAsync("StockItems");

        logger.LogInformation("StockItems list removed");

        return this.Ok();
    }
}

Like Regular Expressions in .NET, the System.Text.Json object serialisations can be compiled to MSIL code instead of high-level internal instructions. This allows .NET’s just-in-time (JIT) compiler to convert the serialisation to native machine code for higher performance.

public class StockItemListDtoV1
{
    public int Id { get; set; }

    public string Name { get; set; }

    public decimal RecommendedRetailPrice { get; set; }

    public decimal TaxRate { get; set; }
}

[JsonSourceGenerationOptions(PropertyNamingPolicy = JsonKnownNamingPolicy.CamelCase)]
[JsonSerializable(typeof(List<StockItemListDtoV1>))]
public partial class StockItemListDtoV1GenerationContext : JsonSerializerContext
{
}

The cost of constructing the Serialiser may be higher, but the cost of performing serialisation with it is much smaller.

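To get a rough feel for the one-off construction cost versus the per-call cost, a minimal Stopwatch-based sketch (the iteration count and variable names are illustrative) could look like this:

// Rough micro-benchmark sketch. Assumes stockItems is a List<Model.StockItemListDtoV1> already loaded from the database.
var stopwatch = System.Diagnostics.Stopwatch.StartNew();
for (int i = 0; i < 10; i++)
{
    // Reflection-based serialisation
    JsonSerializer.Serialize(stockItems);
}
stopwatch.Stop();
Console.WriteLine($"Reflection:{stopwatch.Elapsed.TotalSeconds / 10:F6} sec");

stopwatch.Restart();
for (int i = 0; i < 10; i++)
{
    // Source-generated serialisation - the first call pays the one-off construction cost
    JsonSerializer.Serialize(stockItems, typeof(List<Model.StockItemListDtoV1>), Model.StockItemListDtoV1GenerationContext.Default);
}
stopwatch.Stop();
Console.WriteLine($"Source generated:{stopwatch.Elapsed.TotalSeconds / 10:F6} sec");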

I used Telerik Fiddler to empty the cache, then loaded the StockItems list 10 times (more tests would improve the quality of the results). The first trial was with the “conventional” serialiser.

The average time for the conventional serialiser was 0.028562 seconds.

The average time for the generated version was 0.030546 seconds. But, if the initial compilation step was ignored, the average duration dropped to 0.000223 seconds, a significant improvement.

.NET Core web API + Dapper – MiniProfiler Revisited

While exploring some of the functionality of MiniProfiler, there were some third-party examples which caught my attention.

using (SqlConnection connection = new SqlConnection(@"Data Source=...; Initial Catalog=SyncDB; Trusted_Connection=Yes"))
{
    using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection(connection, MiniProfiler.Current))
    {
        if (profiledDbConnection.State != System.Data.ConnectionState.Open)
            profiledDbConnection.Open();

        using (SqlCommand command = new SqlCommand("Select * From Authors", connection))
        {
            using (ProfiledDbCommand profiledDbCommand = new ProfiledDbCommand(command, connection, MiniProfiler.Current))
            {
                var data = profiledDbCommand.ExecuteReader();

                //Write code here to populate the list of Authors
            }
        }
    }
}

“Inspired” by code like this, my first attempt to retrieve a list of stock items didn’t look right.

[HttpGet("AdoProfiledOtt")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetAdoProfiledOtt()
{
    List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();

    using (SqlConnection connection = new SqlConnection(configuration.GetConnectionString("default")))
    {
        using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection(connection, MiniProfiler.Current))
        {
            await profiledDbConnection.OpenAsync();

            using (SqlCommand command = new SqlCommand(sqlCommandText, connection))
            {
                using (ProfiledDbCommand profiledDbCommand = new ProfiledDbCommand(command, profiledDbConnection, MiniProfiler.Current))
                {
                    using (SqlDataReader reader = await command.ExecuteReaderAsync())
                    {
                        using (ProfiledDbDataReader profiledDbDataReader = new ProfiledDbDataReader(reader, MiniProfiler.Current))
                        {
                            var rowParser = profiledDbDataReader.GetRowParser<Model.StockItemListDtoV1>();

                            while (await profiledDbDataReader.ReadAsync())
                            {
                                response.Add(rowParser(profiledDbDataReader));
                            }
                        }
                    }
                }
            }

            await profiledDbConnection.CloseAsync();
        }
    }

    return this.Ok(response);
}

Dapper often has functionality, like closing the connection if it had to open it, which reduces the amount of code required, so this code looked verbose.

[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
	IEnumerable<Model.StockItemListDtoV1> response = null;

	using (SqlConnection db = new SqlConnection(this.connectionString))
	{
		response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]", commandType: CommandType.Text);
	}
	return this.Ok(response);
}

It seemed a bit odd that so many “usings” were needed, so I had a look at ProfiledDbConnection.cs.

/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbConnection"/> class.
/// Returns a new <see cref="ProfiledDbConnection"/> that wraps <paramref name="connection"/>,
/// providing query execution profiling. If profiler is null, no profiling will occur.
/// </summary>
/// <param name="connection"><c>Your provider-specific flavour of connection, e.g. SqlConnection, OracleConnection</c></param>
/// <param name="profiler">The currently started <see cref="MiniProfiler"/> or null.</param>
/// <exception cref="ArgumentNullException">Throws when <paramref name="connection"/> is <c>null</c>.</exception>
public ProfiledDbConnection(DbConnection connection, IDbProfiler? profiler)
{
    _connection = connection ?? throw new ArgumentNullException(nameof(connection));
    _connection.StateChange += StateChangeHandler;

    if (profiler != null)
    {
        _profiler = profiler;
    }
}
...
/// <summary>
/// Dispose the underlying connection.
/// </summary>
/// <param name="disposing">false if preempted from a <c>finalizer</c></param>
protected override void Dispose(bool disposing)
{
    if (disposing && _connection != null)
    {
        _connection.StateChange -= StateChangeHandler;
        _connection.Dispose();
    }
    base.Dispose(disposing);
    _connection = null!;
    _profiler = null;
}

One less “using” required, as ProfiledDbConnection “automagically” disposes the SqlConnection. It also seemed a bit odd that the SqlCommand had a “using”, so I had a look at ProfiledDbCommand.cs.

 /// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbCommand"/> class.
/// </summary>
/// <param name="command">The command.</param>
/// <param name="connection">The connection.</param>
/// <param name="profiler">The profiler.</param>
/// <exception cref="ArgumentNullException">Throws when <paramref name="command"/> is <c>null</c>.</exception>
public ProfiledDbCommand(DbCommand command, DbConnection? connection, IDbProfiler? profiler)
{
    _command = command ?? throw new ArgumentNullException(nameof(command));

    if (connection != null)
    {
        _connection = connection;
        UnwrapAndAssignConnection(connection);
    }

    if (profiler != null)
    {
        _profiler = profiler;
    }
}
...
/// <summary>
/// Releases all resources used by this command.
/// </summary>
/// <param name="disposing">false if this is being disposed in a <c>finalizer</c>.</param>
protected override void Dispose(bool disposing)
{
   if (disposing && _command != null)
   {
       _command.Dispose();
    }
    _command = null!;
    base.Dispose(disposing);
}

Another “using” not required, as ProfiledDbCommand “automagically” disposes the SqlCommand as well. It also seemed a bit odd that the SqlDataReader had a “using”, so I had a look at ProfiledDbDataReader.cs.

/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class (with <see cref="CommandBehavior.Default"/>).
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, IDbProfiler profiler) : this(reader, CommandBehavior.Default, profiler) { }

/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class.
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="behavior">The behavior specified during command execution.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, CommandBehavior behavior, IDbProfiler? profiler)
{
    WrappedReader = reader;
    Behavior = behavior;
    _profiler = profiler;
}
...
/// <summary>
/// The <see cref="DbDataReader"/> that is being used.
/// </summary>
public DbDataReader WrappedReader { get; }

/// <inheritdoc cref="DbDataReader.Dispose(bool)"/>
protected override void Dispose(bool disposing)
{
    // reader can be null when we're not profiling, but we've inherited from ProfiledDbCommand and are returning a
    // an unwrapped reader from the base command
    WrappedReader?.Dispose();
    base.Dispose(disposing);
}

Another “using” not required as ProfiledDbDataReader “automagically” disposes the SqlDataReader. This was my final version of profiling the System.Data.SqlClient code to retrieve a list of stock items.

[HttpGet("AdoProfiled")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetProfiledAdo()
{
    List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();

    using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection((SqlConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        await profiledDbConnection.OpenAsync();

        using (ProfiledDbCommand profiledDbCommand = new ProfiledDbCommand(new SqlCommand(sqlCommandText), profiledDbConnection, MiniProfiler.Current))
        {
            DbDataReader reader = await profiledDbCommand.ExecuteReaderAsync();

            using (ProfiledDbDataReader profiledDbDataReader = new ProfiledDbDataReader(reader, MiniProfiler.Current))
            {
                var rowParser = profiledDbDataReader.GetRowParser<Model.StockItemListDtoV1>();

                while (await profiledDbDataReader.ReadAsync())
                {
                    response.Add(rowParser(profiledDbDataReader));
                }
            }
        }
    }

    return this.Ok(response);
}

The ProfiledDbDataReader.cs implementation was “sparse” and, when loading a longer list of stock items, there were some ReadAsync calls which took a bit longer.

/// <summary>
/// The profiled database data reader.
/// </summary>
public class ProfiledDbDataReader : DbDataReader
{
    private readonly IDbProfiler? _profiler;

    /// <summary>
    /// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class (with <see cref="CommandBehavior.Default"/>).
    /// </summary>
    /// <param name="reader">The reader.</param>
    /// <param name="profiler">The profiler.</param>
    public ProfiledDbDataReader(DbDataReader reader, IDbProfiler profiler) : this(reader, CommandBehavior.Default, profiler) { }

    /// <summary>
    /// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class.
    /// </summary>
    /// <param name="reader">The reader.</param>
    /// <param name="behavior">The behavior specified during command execution.</param>
    /// <param name="profiler">The profiler.</param>
    public ProfiledDbDataReader(DbDataReader reader, CommandBehavior behavior, IDbProfiler? profiler)
    {
        WrappedReader = reader;
        Behavior = behavior;
        _profiler = profiler;
    }

    /// <summary>Gets the behavior specified during command execution.</summary>
    public CommandBehavior Behavior { get; }

    /// <inheritdoc cref="DbDataReader.Depth"/>
    public override int Depth => WrappedReader.Depth;

    /// <inheritdoc cref="DbDataReader.FieldCount"/>
    public override int FieldCount => WrappedReader.FieldCount;

    /// <inheritdoc cref="DbDataReader.HasRows"/>
    public override bool HasRows => WrappedReader.HasRows;

    /// <inheritdoc cref="DbDataReader.IsClosed"/>
    public override bool IsClosed => WrappedReader.IsClosed;

    /// <inheritdoc cref="DbDataReader.RecordsAffected"/>
    public override int RecordsAffected => WrappedReader.RecordsAffected;

    /// <summary>
    /// The <see cref="DbDataReader"/> that is being used.
    /// </summary>
    public DbDataReader WrappedReader { get; }

    /// <inheritdoc cref="DbDataReader.this[string]"/>
    public override object this[string name] => WrappedReader[name];

    /// <inheritdoc cref="DbDataReader.this[int]"/>
    public override object this[int ordinal] => WrappedReader[ordinal];
...
    /// <inheritdoc cref="DbDataReader.GetString(int)"/>
    public override string GetString(int ordinal) => WrappedReader.GetString(ordinal);

    /// <inheritdoc cref="DbDataReader.GetValue(int)"/>
    public override object GetValue(int ordinal) => WrappedReader.GetValue(ordinal);

    /// <inheritdoc cref="DbDataReader.GetValues(object[])"/>
    public override int GetValues(object[] values) => WrappedReader.GetValues(values);

    /// <inheritdoc cref="DbDataReader.IsDBNull(int)"/>
    public override bool IsDBNull(int ordinal) => WrappedReader.IsDBNull(ordinal);

    /// <inheritdoc cref="DbDataReader.IsDBNullAsync(int, CancellationToken)"/>
    public override Task<bool> IsDBNullAsync(int ordinal, CancellationToken cancellationToken) => WrappedReader.IsDBNullAsync(ordinal, cancellationToken);

    /// <inheritdoc cref="DbDataReader.NextResult()"/>
    public override bool NextResult() => WrappedReader.NextResult();

    /// <inheritdoc cref="DbDataReader.NextResultAsync(CancellationToken)"/>
    public override Task<bool> NextResultAsync(CancellationToken cancellationToken) => WrappedReader.NextResultAsync(cancellationToken);

    /// <inheritdoc cref="DbDataReader.Read()"/>
    public override bool Read() => WrappedReader.Read();

    /// <inheritdoc cref="DbDataReader.ReadAsync(CancellationToken)"/>
    public override Task<bool> ReadAsync(CancellationToken cancellationToken) => WrappedReader.ReadAsync(cancellationToken);

    /// <inheritdoc cref="DbDataReader.Close()"/>
    public override void Close()
    {
        // reader can be null when we're not profiling, but we've inherited from ProfiledDbCommand and are returning a
        // an unwrapped reader from the base command
        WrappedReader?.Close();
        _profiler?.ReaderFinish(this);
    }

    /// <inheritdoc cref="DbDataReader.GetSchemaTable()"/>
    public override DataTable? GetSchemaTable() => WrappedReader.GetSchemaTable();

    /// <inheritdoc cref="DbDataReader.Dispose(bool)"/>
    protected override void Dispose(bool disposing)
    {
        // reader can be null when we're not profiling, but we've inherited from ProfiledDbCommand and are returning a
        // an unwrapped reader from the base command
        WrappedReader?.Dispose();
        base.Dispose(disposing);
    }
}

In the [HttpGet(“DapperProfiledQueryMultipleStep”)] method I wrapped ReadAsync and could see in the profiling that every so often a call did take significantly longer.

using (MiniProfiler.Current.Step("invoiceSummaryLine.ReadAsync"))
{
   response.InvoiceLines = await invoiceSummary.ReadAsync<Model.InvoiceLineSummaryListDtoV1>();
}

I did consider modifying ProfiledDbDataReader.cs to add some instrumentation to the Read… and Get… methods but the authors of MiniProfiler are way smarter than me, so there must be a reason why they didn’t.
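For what it’s worth, a minimal sketch of the instrumentation I had in mind (hypothetical, not part of MiniProfiler; it relies on ProfiledDbDataReader.ReadAsync being an overridable override) could wrap just ReadAsync in a step:

// Hypothetical sketch only - not part of MiniProfiler.
public class SteppedDbDataReader : ProfiledDbDataReader
{
    public SteppedDbDataReader(DbDataReader reader, IDbProfiler profiler) : base(reader, profiler)
    {
    }

    public override async Task<bool> ReadAsync(CancellationToken cancellationToken)
    {
        // Record a MiniProfiler step for each ReadAsync so the occasional slow call shows up in the results.
        using (MiniProfiler.Current.Step("ReadAsync"))
        {
            return await base.ReadAsync(cancellationToken);
        }
    }
}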

.NET Core web API + Dapper – Distributed Cache

I have used LazyCache for several projects (The Things Network V2 HTTP, The Things Industries V2 MQTT, The Things Industries V3 and Swarm Space Azure IoT Connector etc.) to cache Azure IoT Hub DeviceClient and other object instances.

The note on the wiki page, “For LazyCache v2+ users, you should consider switching away from LazyCache to IDistributedCache. More information at #59”, caught my attention.

I have written other posts about caching Dapper query results with the Dapper Extension Library, which worked well but had some configuration limitations. I also have posts about off-loading read-only workloads with Azure SQL Active geo-replication or SQL Data Sync for Azure, which worked well in some scenarios but had limitations (performance and operational costs).

The IDistributedCache has Memory, SQL Server and Redis implementations, so I built an Azure AppService to explore the functionality in more detail. In another project I had been working with the Azure SignalR Service, and the use of the MessagePack library (rather than serialised JSON) caught my attention, so I have added basic support for that as well.

I explored the in-memory implementation (AddDistributedMemoryCache) on my development machine and found “tinkering” with the configuration options had little impact on the performance of my trivial sample application.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddApplicationInsightsTelemetry();

    // Add services to the container.
    builder.Services.AddSingleton<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

#if SERIALISATION_MESSAGE_PACK
    //MessagePackSerializer.DefaultOptions = MessagePack.Resolvers.ContractlessStandardResolver.Options;
    //MessagePackSerializer.DefaultOptions = MessagePack.Resolvers.ContractlessStandardResolver.Options.WithCompression(MessagePackCompression.Lz4Block);
    MessagePackSerializer.DefaultOptions = MessagePack.Resolvers.ContractlessStandardResolver.Options.WithCompression(MessagePackCompression.Lz4BlockArray);
#endif

#if DISTRIBUTED_CACHE_MEMORY
    builder.Services.AddDistributedMemoryCache(options =>
    {
        options.SizeLimit = 1000 * 1024 * 1024; // 1000MB
    });
#endif

#if DISTRIBUTED_CACHE_REDIS
    var configurationOptions = new ConfigurationOptions
    {
        EndPoints = { builder.Configuration.GetSection("RedisConnection").GetValue<string>("EndPoints") },
        AllowAdmin = true,
        Password = builder.Configuration.GetSection("RedisConnection").GetValue<string>("Password"),
        Ssl = true,
        ConnectRetry = 5,
        ConnectTimeout = 10000,
        SslProtocols = System.Security.Authentication.SslProtocols.Tls12,
        AbortOnConnectFail = false,
    };

    builder.Services.AddStackExchangeRedisCache(options =>
    {
        options.InstanceName = "Dapper WebAPI Instance";
        options.ConfigurationOptions = configurationOptions;
    });
#endif

#if DISTRIBUTED_CACHE_SQL_SERVER
    builder.Services.AddDistributedSqlServerCache(options =>
    {
        options.ConnectionString = builder.Configuration.GetConnectionString("CacheDatabase");
        options.SchemaName = "dbo";
        options.TableName = "StockItemsCache";
    });
#endif

    var app = builder.Build();

    // Configure the HTTP request pipeline.
    app.UseHttpsRedirection();
    app.MapControllers();
    app.Run();
}

I tested the SQL Server implementation (AddDistributedSqlServerCache) using the SQL Server on my development machine, and Azure SQL, as backing stores. I did consider using Azure SQL In-Memory OLTP, but the performance improvement with my trivial example would most probably not be worth the additional cost of the required SKU.

CREATE TABLE [dbo].[StockItemsCache](
	[Id] [nvarchar](449) NOT NULL,
	[Value] [varbinary](max) NOT NULL,
	[ExpiresAtTime] [datetimeoffset](7) NOT NULL,
	[SlidingExpirationInSeconds] [bigint] NULL,
	[AbsoluteExpiration] [datetimeoffset](7) NULL,
PRIMARY KEY CLUSTERED 
(
	[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
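This schema essentially matches the one generated by the dotnet-sql-cache tool (from the Microsoft.Extensions.Caching.SqlConfig.Tools package), so the table can also be created from the command line (connection string elided):

dotnet tool install --global dotnet-sql-cache
dotnet sql-cache create "..." dbo StockItemsCache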

The table used to store the data wasn’t very complex, and I could view the data associated with a cache key in SQL Server Management Studio.

SQL Server Management Studio displaying cache table contents

One of the applications I work on uses a complex SQL Server stored procedure to load reference data (updated daily), and being able to purge the cache at the end of this process might be useful. For a geographically distributed application, putting the Azure SQL instance “closer” to the application’s users might be worth considering.
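A purge at the end of the reference data load could be as simple as removing the key from the IDistributedCache, mirroring the Redis ListCacheDelete method earlier (a sketch using the same “StockItems” key):

[HttpDelete]
public async Task<ActionResult> ListCacheDelete()
{
    // RemoveAsync deletes the entry from the backing store (memory, SQL Server or Redis) so the next Get reloads from the database.
    await distributedCache.RemoveAsync("StockItems");

    return this.Ok();
}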

I trialed the Redis implementation with Memurai (on my development machine) and Azure Cache for Redis with multiple Azure AppService clients and there was a significant improvement in performance.

[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
    var utcNow = DateTime.UtcNow;

    var cached = await distributedCache.GetAsync("StockItems");
    if (cached != null)
    {
#if SERIALISATION_JSON
        return this.Ok(JsonSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
#if SERIALISATION_MESSAGE_PACK
        return this.Ok(MessagePackSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
    }

    var stockItems = await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);

#if SERIALISATION_JSON
    await distributedCache.SetAsync("StockItems", JsonSerializer.SerializeToUtf8Bytes(stockItems), new DistributedCacheEntryOptions()
#endif
#if SERIALISATION_MESSAGE_PACK
    await distributedCache.SetAsync("StockItems", MessagePackSerializer.Serialize(stockItems), new DistributedCacheEntryOptions()
#endif
    {
        AbsoluteExpiration = new DateTime(utcNow.Year, utcNow.Month, DateTime.DaysInMonth(utcNow.Year, utcNow.Month), StockItemListAbsoluteExpiration.Hours, StockItemListAbsoluteExpiration.Minutes, StockItemListAbsoluteExpiration.Seconds)
    });

    return this.Ok(stockItems);
}

[HttpGet("NoLoad")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetNoLoad()
{
    var cached = await distributedCache.GetAsync("StockItems");
    if (cached == null)
    {
        return this.NoContent();
    }

#if SERIALISATION_JSON
    return this.Ok(JsonSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
#if SERIALISATION_MESSAGE_PACK
    return this.Ok(MessagePackSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
}

In my test environment the JSON payload for a list of stock items was a bit “chunky” at 25K bytes, so I added compile-time configurable support for the MessagePack library. This significantly reduced the size of the payload, to roughly 5K bytes with LZ4Block and 5.2K bytes with LZ4BlockArray, which should reduce network traffic.

Assuming the overheads of JSON vs. MessagePack serialisation are similar, given the much smaller MessagePack payload I would most probably use MessagePack with LZ4BlockArray compression (for improved compatibility with other implementations).
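A quick way to compare the payload sizes is to serialise the same list both ways and look at the byte counts (a sketch, reusing the MessagePack options configured in Program.cs):

// stockItems is a List<Model.StockItemListDtoV1> loaded from the database
byte[] jsonBytes = JsonSerializer.SerializeToUtf8Bytes(stockItems);

byte[] messagePackBytes = MessagePackSerializer.Serialize(stockItems, MessagePack.Resolvers.ContractlessStandardResolver.Options.WithCompression(MessagePackCompression.Lz4BlockArray));

Console.WriteLine($"JSON:{jsonBytes.Length} bytes MessagePack LZ4BlockArray:{messagePackBytes.Length} bytes");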

.NET Core web API + Dapper – Distributed Cache Redis HMGET Error

Initially, I was running the WebAPI Dapper DistributedCache application on my development box, and it ran the third/fourth time after sorting out the Azure Cache for Redis connection string and firewall configuration.

StockItems list served from the cache running on my Desktop
The Azure Cache for Redis firewall configuration for my home office

I then deployed the WebAPI Dapper DistributedCache application to an Azure AppService and it failed (spoiler alert: it wasn’t DNS).

Azure AppService hosted StockItems list code failing in web browser

I then fired up Azure Application Insights to look at the logging information.

Azure Application Insights diagnostic information

There were many questions and tentative answers on Stack Overflow; I tried some of them, but they didn’t work in my scenario.

No connection is active/available to service this operation: HMGET Dapper WebAPI InstanceStockItems; 
UnableToConnect on xxxxxxxxxxxxxxxxxxxxxxxxxxx.redis.cache.windows.net:6380/Interactive, 
Initializing/NotStarted, 
last: NONE, 
origin: BeginConnectAsync, 
outstanding: 0, 
last-read: 10s ago, last-write: 10s ago, keep-alive: 60s, 
state: Connecting, 
mgr: 10 of 10 available, 
last-heartbeat: never, global: 10s ago, 
v: 2.2.4.27433, 
mc: 1/1/0, mgr: 10 of 10 available, 
clientName: XXXXXXXXXXXX, <----------------------This was important
IOCP: (Busy=0,Free=1000,Min=1,Max=1000), 
WORKER: (Busy=5,Free=1018,Min=1,Max=1023), 
v: 2.2.4.27433 
UnableToConnect on xxxxxxxxxxxxxxxxxxxxxxxxxxx.cache.windows.net:6380/Interactive, 
Initializing/NotStarted, 
last: NONE, 
origin: BeginConnectAsync, 
outstanding: 0, 
last-read: 10s ago, last-write: 10s ago, keep-alive: 60s, 
state: Connecting, 
mgr: 10 of 10 available, 
last-heartbeat: never, global: 10s ago, 
v: 2.2.4.27433 

I then recreated my configuration from scratch so that none of my random changes based on Stack Overflow posts would mess up my next round of debugging.

Azure AppService Virtual, Outbound and Additional Outbound IP Addresses

The error appeared to be a networking issue, most probably the firewall blocking connections, so I added the Azure AppService Virtual IP Address, which didn’t work…

Firewall configuration for Azure AppService Virtual IP Address

I then went back and re-read the error message and noticed “clientName: XXXXXXXXXXXX,”. The Azure AppService client name wasn’t the default domain…

Firewall configuration for Azure AppService Virtual and outbound IP Addresses

I then added the Outbound IP addresses to the firewall and the Azure AppService could connect to my Azure Cache for Redis instance.
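The Virtual IP Address, outbound and additional outbound IP addresses are all visible in the portal, or can be listed with the Azure CLI (the resource group and AppService names below are placeholders):

az webapp show --resource-group MyResourceGroup --name MyAppService --query outboundIpAddresses --output tsv
az webapp show --resource-group MyResourceGroup --name MyAppService --query possibleOutboundIpAddresses --output tsv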

StockItems list served from the cache running in an AppService

As soon as I realised it wasn’t DNS I should have stopped, counted to ten, debugged the issue carefully, and not immediately gone to Stack Overflow.

.NET Core web API + Dapper – LazyCache

I have used Alastair Crabtree’s LazyCache (read-through in-memory cache) on a couple of other projects e.g. The Things Industries V3 Connector (DeviceClient instances) and Swarm Space Azure IoT Connector (DeviceClient Instances and CS-Script binaries).

Adding LazyCache support to an ASP.Net Core WebAPI application is one line of code.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddApplicationInsightsTelemetry();

    // Add services to the container.
    builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

    // Register IAppCache as a singleton CachingService
    builder.Services.AddLazyCache();

    var app = builder.Build();

    // Configure the HTTP request pipeline.

    app.UseHttpsRedirection();

    app.MapControllers();

    app.Run();
}

One of the applications I work on uses a lot of “reference” data which is effectively static, e.g. national/regional holidays in countries where they have customers, and data which is updated on a schedule, e.g. exchange rates downloaded at a known time every day.

//...
 private readonly TimeSpan StockItemListAbsoluteExpirationTime = new TimeSpan(23, 59, 59);
//....

private async Task<IEnumerable<Model.StockItemListDtoV1>> GetStockItemsAsync()
{
    return await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
}

[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
    MemoryCacheEntryOptions memoryCacheEntryOptions = new MemoryCacheEntryOptions()
    {
        Priority = CacheItemPriority.NeverRemove,
        AbsoluteExpiration = DateTime.UtcNow.Date.Add(StockItemListAbsoluteExpirationTime)
    };

    return this.Ok(await cache.GetOrAddAsync("StockItems", () => GetStockItemsAsync(), memoryCacheEntryOptions));
}

[HttpDelete()]
public ActionResult Delete()
{
    cache.Remove("StockItems");

    return this.Ok();
}

For the holiday information updates, the dataset could be deleted (“burped”) then refreshed the next time it is referenced, or deleted then re-loaded. For exchange rate updates, the cache could automatically expire the dataset shortly after the scheduled download.
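For example, if the exchange rates were downloaded at 02:00 UTC daily, the absolute expiration could be calculated so the cached dataset expires shortly afterwards (a sketch; the download time and 30-minute margin are made up):

// Hypothetical: rates are downloaded at 02:00 UTC, so expire the cached dataset at 02:30 UTC.
DateTime utcNow = DateTime.UtcNow;

DateTime expiration = utcNow.Date.AddHours(2).AddMinutes(30);
if (expiration <= utcNow)
{
    expiration = expiration.AddDays(1);
}

MemoryCacheEntryOptions memoryCacheEntryOptions = new MemoryCacheEntryOptions()
{
    Priority = CacheItemPriority.NeverRemove,
    AbsoluteExpiration = expiration
};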

When I deployed my demo application to an Azure AppService, the requests (using the composer tab of Telerik Fiddler) took longer than when the application was running on my development box, which was unexpected. I think this was because the JavaScript Object Notation (JSON) response payload was larger than the Azure SQL Tabular Data Stream (TDS) payload.

The LazyCache source code hasn’t been updated for quite a while (but it is quite stable and works well), and moving to other libraries has been recommended in the wiki.

.NET Core web API + Dapper – MiniProfiler

The time to open the home page of one of the “legacy” applications I was working on was slowly increasing over time. After a quick investigation, it looked like a couple of Azure SQL stored procedures, which were called many times as the home page was opening, were the problem.

The application used a combination of ADO.Net and Dapper to access the Azure SQL database, so Stack Overflow MiniProfiler looked like a good option.

There were quite a few samples online like the one below which “inspired” my approach.

List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();

using (SqlConnection connection = new SqlConnection(_configuration.GetConnectionString("default")))
{
    using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection(connection, MiniProfiler.Current))
    {
        await profiledDbConnection.OpenAsync();

        using (ProfiledDbCommand command = (ProfiledDbCommand)profiledDbConnection.CreateCommand())
        {
            command.CommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]";
            command.CommandType = CommandType.Text;

            using (ProfiledDbDataReader reader = (ProfiledDbDataReader)await command.ExecuteReaderAsync())
            {
                var rowParser = reader.GetRowParser<Model.StockItemListDtoV1>();

                while (await reader.ReadAsync())
                {
                    response.Add(rowParser(reader));
                }

                await reader.CloseAsync();
            }
        }

        await profiledDbConnection.CloseAsync();
    }
}

A simple modification of the WebAPI Dapper PerformanceProfiling project program.cs file enabled MiniProfiler for the project.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

    builder.Services.AddMiniProfiler(options =>
    {
        options.RouteBasePath = "/profiler";
        //options.SqlFormatter = new StackExchange.Profiling.SqlFormatters.InlineFormatter();
        options.TrackConnectionOpenClose = true;
        //options.PopupShowTrivial = false
    });

    var app = builder.Build();

    app.UseMiniProfiler();

    // Configure the HTTP request pipeline.

    app.UseHttpsRedirection();

    app.MapControllers();

    app.Run();
}

For simple Dapper calls, the modifications to WebAPI controllers were minimal, and could be done largely with Visual Studio 2022 “search & replace.”

[HttpGet("Dapper")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapper()
{
    IEnumerable<Model.StockItemListDtoV1> response;

    using (IDbConnection db = dapperContext.ConnectionCreate())
    {
        response = await db.QueryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
    }

    return this.Ok(response);
}
World Wide Importers database list of stockitems

The World Wide Importers database has approximately 250 StockItems, which was representative of one of the problematic queries.

[HttpGet("DapperProfiled")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapperProfiled()
{
    IEnumerable<Model.StockItemListDtoV1> response;

    using (IDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        response = await db.QueryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
    }

    return this.Ok(response);
 }

When I executed the GetDapperProfiled() method of the StockItems controller on my development box it took roughly 2.4 seconds.

MiniProfiler results for StockItems query running on my desktop

I sometimes ran the website on my development box, so when I used “toggle trivial gaps” it was easier to see where the delays were.

When I executed the GetDapperProfiled() method of the StockItems controller running in an Azure AppService it took roughly 20 msec.

MiniProfiler results for StockItems query running in an Azure AppService

In one application a QueryMultipleAsync is used to retrieve information about a product and a list of its attributes. The World Wide Importers database has Invoices, which have invoice lines and transactions, and this was representative of another problematic query.

[HttpGet("DapperProfiledQueryMultiple")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapperProfiledQueryMultiple([Required][Range(1, int.MaxValue, ErrorMessage = "Invoice id must be greater than 0")] int id)
{
    Model.InvoiceSummaryGetDtoV1 response = null;

    using (ProfiledDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        var invoiceSummary = await db.QueryMultipleAsync("[Sales].[InvoiceSummaryGetV1]", param: new { InvoiceId = id }, commandType: CommandType.StoredProcedure);

        response = await invoiceSummary.ReadSingleOrDefaultAsync<Model.InvoiceSummaryGetDtoV1>();
        if (response == default)
        {
            return this.NotFound($"Invoice:{id} not found");
        }

        response.InvoiceLines = await invoiceSummary.ReadAsync<Model.InvoiceLineSummaryListDtoV1>();

        response.StockItemTransactions = await invoiceSummary.ReadAsync<Model.StockItemTransactionSummaryListDtoV1>();
    }

    return this.Ok(response);
}
World Wide Importers database list of invoice lines and transactions for a StockItem
MiniProfiler results for Invoice Item Query Multiple running in an Azure AppService

I couldn’t see any results for reading the StockItem invoice lines and transactions, so I wrapped each ReadAsync with a MiniProfiler.Current.Step.

[HttpGet("DapperProfiledQueryMultipleStep")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapperProfiledQueryMultipleStep([Required][Range(1, int.MaxValue, ErrorMessage = "Invoice id must be greater than 0")] int id)
{
    Model.InvoiceSummaryGetDtoV1 response = null;

    using (IDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        SqlMapper.GridReader invoiceSummary;

        using (MiniProfiler.Current.Step("db.QueryMultipleAsync"))
        {
            invoiceSummary = await db.QueryMultipleAsync("[Sales].[InvoiceSummaryGetV1]", param: new { InvoiceId = id }, commandType: CommandType.StoredProcedure);
        }

        using (MiniProfiler.Current.Step("invoiceSummary.ReadSingleOrDefaultAsync"))
        {
            response = await invoiceSummary.ReadSingleOrDefaultAsync<Model.InvoiceSummaryGetDtoV1>();
        }
        if (response == default)
        {
            return this.NotFound($"Invoice:{id} not found");
        }

        using (MiniProfiler.Current.Step("invoiceSummaryLine.ReadAsync"))
        {
            response.InvoiceLines = await invoiceSummary.ReadAsync<Model.InvoiceLineSummaryListDtoV1>();
        }

        using (MiniProfiler.Current.Step("TransactionSummary.ReadAsync"))
        {
            response.StockItemTransactions = await invoiceSummary.ReadAsync<Model.StockItemTransactionSummaryListDtoV1>();
        }
    }

    return this.Ok(response);
}

With larger lists, every so often there were ReadAsync calls that took more than a “trivial” amount of time. I surmise this was some sort of batching done by the underlying ReadAsync + NextResultAsync methods of a SqlDataReader.

Need to investigate the use of

using (IDbConnection db = new ProfiledDbConnection(new SqlConnection(_configuration.GetConnectionString("default")), MiniProfiler.Current))
{
//...
}

.NET Core web API + Dapper – HTTP Patch

The application I’m currently working on has some tables with many columns, and these were proving painful to update with HTTP PUT methods. Over the last couple of releases, I have been extending the customer-facing API with PATCH methods so the client can specify only the values to be changed.

[
    {
        "op": "replace",
        "path": "/name",
        "value": "USB missile launcher (Green)."
    },
    {
        "op": "replace",
        "path": "/UnitPrice",
        "value": 25
    },
    {
        "op": "replace",
        "path": "/recommendedRetailPrice",
        "value": 37.41
    }
]

JSON Patch is a format for specifying updates to be applied to a resource.

Stock Items list before HTTP Patch operation

A JSON Patch document has an array of operations which identify a particular type of change.
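The request also needs the JSON Patch specific content type; something like this (the host, port and stock item id are placeholders from my test environment):

PATCH https://localhost:5001/api/StockItems/1 HTTP/1.1
Content-Type: application/json-patch+json

[ { "op": "replace", "path": "/name", "value": "USB missile launcher (Green)." } ]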

Using Telerik Fiddler Composer functionality to apply an HTTP PATCH
Stock Items list after HTTP Patch operation

The StockItemPatchDtoV1 class is decorated with DataAnnotations to ensure the contents are valid.

public class StockItemPatchDtoV1
{
    [Required]
    [StringLength(100, MinimumLength = 1, ErrorMessage = "The name text must be at least {2} and no more than {1} characters long")]  // These would be constants in a real application
    public string Name { get; set; }

    [Required]
    [Range(0.0, 100.0)] // These would be constants in a real application
    public decimal UnitPrice { get; set; }

    [Required]
    [Range(0.0, 1000000.0)] // These would be constants in a real application
    public decimal RecommendedRetailPrice { get; set; }
}

The StockItemsController [HttpPatch(“{id}”)] method retrieves the stock item to be updated, then uses the ApplyTo method and TryValidateModel to update only the specified fields.

[HttpPatch("{id}")]
public async Task<ActionResult<Model.StockItemGetDtoV1>> Patch([FromBody] JsonPatchDocument<Model.StockItemPatchDtoV1> stockItemPatch, int id)
{
    Model.StockItemGetDtoV1 stockItem;

    using (IDbConnection db = dapperContext.ConnectionCreate())
    {
        stockItem = await db.QuerySingleOrDefaultWithRetryAsync<Model.StockItemGetDtoV1>(sql: "[Warehouse].[StockItemsStockItemLookupV1]", param: new { stockItemId = id }, commandType: CommandType.StoredProcedure);

        if (stockItem == default)
        {
            logger.LogInformation("StockItem:{id} not found", id);

            return this.NotFound($"StockItem:{id} not found");
        }

        Model.StockItemPatchDtoV1 stockItemPatchDto = mapper.Map<Model.StockItemPatchDtoV1>(stockItem);

        stockItemPatch.ApplyTo(stockItemPatchDto, ModelState);

        if (!ModelState.IsValid || !TryValidateModel(stockItemPatchDto))
        {
            logger.LogInformation("stockItemPatchDto invalid {0}", string.Join(Environment.NewLine, ModelState.Values.SelectMany(v => v.Errors).Select(v => v.ErrorMessage + " " + v.Exception))); // would extract this out into shared module

            return BadRequest(ModelState);
        }

        mapper.Map(stockItemPatchDto, stockItem);

        await db.ExecuteWithRetryAsync(sql: "UPDATE Warehouse.StockItems SET StockItemName=@Name, UnitPrice=@UnitPrice, RecommendedRetailPrice=@RecommendedRetailPrice WHERE StockItemId=@Id", param: stockItem, commandType: CommandType.Text);
    }

    return this.Ok();
}

Initially the HTTP Patch method returned this error message.

HTTP/1.1 400 Bad Request
Content-Type: application/problem+json; charset=utf-8
Date: Tue, 27 Jun 2023 09:20:30 GMT
Server: Kestrel
Transfer-Encoding: chunked

1d7
{"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1","title":"One or more validation errors occurred.","status":400,"traceId":"00-665a6ee9ed1105a105237c421793af5d-1719bda40c0b7d5d-00","errors":{"$":["The JSON value could not be converted to Microsoft.AspNetCore.JsonPatch.JsonPatchDocument`1[devMobile.WebAPIDapper.HttpPatch.Model.StockItemPatchDtoV1]. Path: $ | LineNumber: 0 | BytePositionInLine: 1."],"stockItemPatch":["The stockItemPatch field is required."]}}
0

After some research I worked out that I had forgotten to wire up the Newtonsoft JSON support with builder.Services.AddControllers().AddNewtonsoftJson();
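The fix is one line in Program.cs (the Microsoft.AspNetCore.Mvc.NewtonsoftJson NuGet package also has to be referenced):

// JsonPatchDocument<T> model binding is implemented with Newtonsoft.Json, so the Newtonsoft formatters must be registered.
builder.Services.AddControllers().AddNewtonsoftJson();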

.NET Core web API + Dapper – Dependency Injection

So far, to keep the code really obvious (which tends to be more verbose), I have limited the use of Dependency Injection (DI). I have been “injecting” an instance of an IConfiguration interface, then retrieving the database connection string and other configuration. This isn’t a great approach, as the database connection string name is in multiple files etc.

public CustomerController(IConfiguration configuration, ILogger<CustomerController> logger)
{
    this.connectionString = configuration.GetConnectionString("WorldWideImportersDatabase");
    this.logger = logger;
}

I wanted to explore the impact(s) of different DI approaches on ADO.Net connection pooling. With the system idle I used exec sp_who to see how many connections there were to my SQL Azure database.

SQL Server Management Studio(SSMS) sp_who query – Idle

Dapper Context approach

The application I’m currently working on uses a Command and Query Responsibility Segregation(CQRS) like approach. The application is largely “read only” so we have replicas of the database to improve the performance of queries hence the ConnectionReadCreate and ConnectionWriteCreate methods.

namespace devMobile.WebAPIDapper.ListsDIBasic
{
   using System.Data;
   using System.Data.SqlClient;

   using Microsoft.Extensions.Configuration;

   public interface IDapperContext 
   {
      public IDbConnection ConnectionCreate();
      public IDbConnection ConnectionCreate(string connectionStringName);

      public IDbConnection ConnectionReadCreate();
      public IDbConnection ConnectionWriteCreate();
   }

   public class DapperContext : IDapperContext
   {
      private readonly IConfiguration _configuration;

      public DapperContext(IConfiguration configuration)
      {
         _configuration = configuration;
      }

      public IDbConnection ConnectionCreate()
      { 
         return new SqlConnection(_configuration.GetConnectionString("default"));
      }

      public IDbConnection ConnectionCreate(string connectionStringName)
      {
         return new SqlConnection(_configuration.GetConnectionString(connectionStringName));
      }

      public IDbConnection ConnectionReadCreate()
      {
         return new SqlConnection(_configuration.GetConnectionString("default-read"));
      }

      public IDbConnection ConnectionWriteCreate()
      {
         return new SqlConnection(_configuration.GetConnectionString("default-write"));
      }
   }
}

I have experimented with how the IDapperContext context is constructed in the application startup with builder.Services.AddSingleton, builder.Services.AddScoped and builder.Services.AddTransient.

public class Program
{
    public static void Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);

        // Add services to the container.
        builder.Services.AddSingleton<IDapperContext>(s => new DapperContext(builder.Configuration));
        //builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));
        //builder.Services.AddScoped<IDapperContext>(s => new DapperContext(builder.Configuration));

        builder.Services.AddControllers();

        var app = builder.Build();

        // Configure the HTTP request pipeline.

        app.UseHttpsRedirection();

        app.MapControllers();

        app.Run();
    }
}

Then, in the StockItems controller, the IDapperContext interface implementation is used to create an IDbConnection for Dapper operations to use. I also added “WAITFOR DELAY '00:00:02'” to the query to extend the duration of the requests.

[ApiController]
[Route("api/[controller]")]
public class StockItemsController : ControllerBase
{
   private readonly ILogger<StockItemsController> logger;
   private readonly IDapperContext dapperContext;

   public StockItemsController(ILogger<StockItemsController> logger, IDapperContext dapperContext)
   {
      this.logger = logger;

      this.dapperContext = dapperContext;
   }

   [HttpGet]
   public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
   {
      IEnumerable<Model.StockItemListDtoV1> response;

      using (IDbConnection db = dapperContext.ConnectionCreate())
      {
         //response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]", commandType: CommandType.Text);
         response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]; WAITFOR DELAY '00:00:02'", commandType: CommandType.Text);
      }

      return this.Ok(response);
   }
...
}

I ran a stress testing application which simulated 50 concurrent users. When the stress test rig was stopped all the connections in the pool were closed after roughly 5 minutes.

SQL Server Management Studio(SSMS) sp_who query – stress test

Nasty approach

I then tried using DI to create a new connection for each request using builder.Services.AddScoped.

public class Program
{
    public static void Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);

        // Add services to the container.

        //builder.Services.AddSingleton<IDbConnection>(s => new SqlConnection(builder.Configuration.GetConnectionString("default")));
        builder.Services.AddScoped<IDbConnection>(s => new SqlConnection(builder.Configuration.GetConnectionString("default")));
        //builder.Services.AddTransient<IDbConnection>(s => new SqlConnection(builder.Configuration.GetConnectionString("default")));

        builder.Services.AddControllers();

        var app = builder.Build();

        app.UseHttpsRedirection();

        app.MapControllers();

        app.Run();
    }
}

The code in the Get method was reduced. I also added “WAITFOR DELAY '00:00:02'” to the query to extend the duration of the requests.

public class StockItemsController : ControllerBase
{
   private readonly ILogger<StockItemsController> logger;
   private readonly IDbConnection dbConnection;

   public StockItemsController(ILogger<StockItemsController> logger, IDbConnection dbConnection)
   {
      this.logger = logger;

      this.dbConnection = dbConnection;
   }

   [HttpGet]
   public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
   {
      //return this.Ok(await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]", commandType: CommandType.Text));
      return this.Ok(await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]; WAITFOR DELAY '00:00:02';", commandType: CommandType.Text));
    }
...
}

With the stress test rig running the number of active connections was roughly the same as the DapperContext based implementation.

I don’t like this approach, so I will stick with the DapperContext approach.

.NET Core web API + Dapper – Parameters

Different Approaches…

While working on customer ASP.NET Core web API (WebAPI) + Microsoft SQL Server (MSSQL) applications I have encountered several different ways of passing parameters to stored procedures and embedded Structured Query Language (SQL) statements. I have created five examples which query the [Warehouse].[StockItems] table in the World Wide Importers database to illustrate the different approaches.

A customer with a large application which had a lot of ADO.Net code was comfortable with Dapper DynamicParameters. Hundreds of stored procedures with input (and some output) parameters were used to manage access to data. The main advantage of this approach was “familiarity”, and the use of DynamicParameters made the mapping of C# variables and stored procedure parameters (with different naming conventions) obvious.

[HttpGet("Dynamic")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDynamic(
            [Required][MinLength(3, ErrorMessage = "The name search text must be at least {1} characters long"), MaxLength(20, ErrorMessage = "The name search text must be no more that {1} characters long")] string searchText,
            [Required][Range(1, int.MaxValue, ErrorMessage = "MaximumRowsToReturn must be greater than or equal to {1}")] int maximumRowsToReturn)
{
    IEnumerable<Model.StockItemListDtoV1> response = null;

    using (SqlConnection db = new SqlConnection(this.connectionString))
    {
        DynamicParameters parameters = new DynamicParameters();

        parameters.Add("MaximumRowsToReturn", maximumRowsToReturn);
        parameters.Add("SearchText", searchText);

        response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: "[Warehouse].[StockItemsNameSearchV1]", param: parameters, commandType: CommandType.StoredProcedure);
    }

    return this.Ok(response);
}
Error message displayed when SearchText field missing
Error message displayed when SearchText is too short
Error message displayed when SearchText too long
Successful query of StockItems table

The developers at another company used anonymous typed variables everywhere. They also had similar C# and stored procedure parameter naming conventions, so there was minimal mapping required (in the example code only maximumRowsToReturn vs. stockItemsMaximum). They found mapping stored procedure output parameters was problematic, and for longer parameter lists they struggled with formatting the code in a way which was readable.

 [HttpGet("Anonymous")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetAnonymous(
            [Required][MinLength(3, ErrorMessage = "The name search text must be at least {1} characters long"), MaxLength(20, ErrorMessage = "The name search text must be no more that {1} characters long")] string searchText,
            [Required][Range(1, 100, ErrorMessage = "The maximum number of stock items to return must be greater than or equal to {1} and less then or equal {2}")] int stockItemsMaximum)
{
   IEnumerable<Model.StockItemListDtoV1> response = null;

   using (SqlConnection db = new SqlConnection(this.connectionString))
   {
      response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: "[Warehouse].[StockItemsNameSearchV1]", new { searchText, maximumRowsToReturn = stockItemsMaximum }, commandType: CommandType.StoredProcedure);
   }

   return this.Ok(response);
}

At another customer the developers used Data Transfer Objects(DTOs)/Plain Old CLR Objects(POCOs) and they had some control over the naming of the stored procedure/embedded SQL parameters.

public class StockItemNameSearchDtoV1
{
   [Required]
   [MinLength(3, ErrorMessage = "The name search text must be at least {1} characters long"), MaxLength(20, ErrorMessage = "The name search text must be no more that {1} characters long")]
   public string SearchText { get; set; }

   [Required]
   [Range(1, 100, ErrorMessage = "The maximum number of rows to return must be greater than or equal to {1} and less then or equal {2}")]
   public int MaximumRowsToReturn { get; set; }
}
[HttpGet("AutomagicDefault")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDefault([FromQuery] Model.StockItemNameSearchDtoV1 request)
{
   IEnumerable<Model.StockItemListDtoV1> response = null;

   using (SqlConnection db = new SqlConnection(this.connectionString))
   {
      response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: "[Warehouse].[StockItemsNameSearchV1]", param: request, commandType: CommandType.StoredProcedure);
   }

   return this.Ok(response);
}

At another customer the developers used Data Transfer Objects(DTOs)/Plain Old CLR Objects(POCOs) to access the database which had several hundred stored procedures. They had no control over the stored procedure parameter names so they mapped query string parameters to the properties of their POCOs.

This took some experimentation, as System.Text.Json/Newtonsoft.Json decorations didn’t work (a query string is not JavaScript Object Notation (JSON)). They decorated the properties of their DTOs with the [FromQuery] attribute.

public class StockItemNameSearchDtoV2
{
   [Required]
   [FromQuery(Name = "SearchText")]
   [MinLength(3, ErrorMessage = "The name search text must be at least {1} characters long"), MaxLength(20, ErrorMessage = "The name search text must be no more than {1} characters long")]
   public string SearchText { get; set; }

   [Required]
   [FromQuery(Name = "StockItemsMaximum")]
   [Range(1, 100, ErrorMessage = "The maximum number of stock items to return must be greater than or equal to {1} and less than or equal to {2}")]
   public int MaximumRowsToReturn { get; set; }
}
[HttpGet("AutomagicMapped")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetMapperDecorated([FromQuery] Model.StockItemNameSearchDtoV2 request)
{
   IEnumerable<Model.StockItemListDtoV1> response = null;

   using (SqlConnection db = new SqlConnection(this.connectionString))
   {
      response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: "[Warehouse].[StockItemsNameSearchV1]", param: request, commandType: CommandType.StoredProcedure);
   }

   return this.Ok(response);
}

I don’t think [FromQuery] decorations on POCOs are a good idea. If the classes are only used by one method I would consider moving them into the controller file.
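A sketch (not code from the project) of what that could look like, with a single-use request DTO nested inside the controller that binds it:

// Sketch only: nesting a single-use request DTO inside the controller keeps
// the [FromQuery] mapping next to the only action that depends on it.
[ApiController]
[Route("api/[controller]")]
public class StockItemsNameSearchController : ControllerBase
{
   public class NameSearchRequest
   {
      [Required]
      [FromQuery(Name = "SearchText")]
      [MinLength(3), MaxLength(20)]
      public string SearchText { get; set; }
   }

   [HttpGet]
   public ActionResult Get([FromQuery] NameSearchRequest request)
   {
      // ... Dapper query as in the examples above ...
      return this.Ok();
   }
}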

//
// https://localhost:5001/api/StockItemsParameter/Array?StockItemId=1&StockItemId=5&StockItemId=10
//
[HttpGet("Array")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetArray(
   [FromQuery(Name = "stockItemID")][Required(), MinLength(1, ErrorMessage = "Minimum of {1} StockItem id(s)"), MaxLength(100, ErrorMessage = "Maximum {1} StockItem ids")] int[] stockItemIDs)
{
    IEnumerable<Model.StockItemListDtoV1> response = null;

    using (SqlConnection db = new SqlConnection(this.connectionString))
    {
        response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems] WHERE StockItemID IN @StockItemIds", param: new { StockItemIds = stockItemIDs }, commandType: CommandType.Text);
    }

    return this.Ok(response);
}

A customer wanted users to be able to search for items selected in a multiple-selection list, so a Dapper WHERE IN value array was used.

Dapper WHERE IN with no StockItemIds on the query string
Dapper WHERE IN with several StockItemIds on query string

To explore how this worked I downloaded the Dapper source code and referenced the project in my solution.

After single stepping through the Dapper source code I found where the array of StockItems was getting mapped into a “generated” parameterised SQL statement.

Dapper generated parameterised SQL Statement
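
For the three StockItemIds in the example query string, Dapper rewrites the IN clause and adds one parameter per array element, producing something like WHERE StockItemID IN (@StockItemIds1, @StockItemIds2, @StockItemIds3) (parameter names illustrative).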

Based on my customers’ experiences, a “mix ’n’ match” approach to parameterising Dapper queries looks reasonable.

.NET Core web API + Dapper – Asynchronicity Revisited

Asynchronous is always better, maybe…

For a trivial ASP.NET Core web API controller like the one below, the difference between using synchronous and asynchronous calls is most probably negligible, especially as the sample Wide World Importers database [Warehouse].[StockItems] table only has 227 records.

[HttpGet("IEnumerableSmall")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetIEnumerableSmall([FromQuery] bool buffered = false)
{
	IEnumerable<Model.StockItemListDtoV1> response = null;

	using (SqlConnection db = new SqlConnection(this.connectionString))
	{
		logger.LogInformation("IEnumerableSmall start Buffered:{buffered}", buffered);

		response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
			sql: @"SELECT [SI1].[StockItemID] as ""ID"", [SI1].[StockItemName] as ""Name"", [SI1].[RecommendedRetailPrice], [SI1].[TaxRate] " +
				   "FROM [Warehouse].[StockItems] as SI1",
			buffered: buffered,
			commandType: CommandType.Text);

		logger.LogInformation("IEnumerableSmall done");
	}

	return this.Ok(response);
}

The easiest way to increase the size of the returned result set was with CROSS JOIN(s). This is the first (and most probably the last) time I have used a cross join in a “real” application.

[HttpGet("IEnumerableMedium")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetIEnumerableMedium([FromQuery] bool buffered = false)
{
	IEnumerable<Model.StockItemListDtoV1> response = null;

	using (SqlConnection db = new SqlConnection(this.connectionString))
	{
		logger.LogInformation("IEnumerableMedium start Buffered:{buffered}", buffered);

		response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
					sql: @"SELECT [SI2].[StockItemID] as ""ID"", [SI2].[StockItemName] as ""Name"", [SI2].[RecommendedRetailPrice], [SI2].[TaxRate] " +
									"FROM [Warehouse].[StockItems] as SI1 " +
									"CROSS JOIN [Warehouse].[StockItems] as SI2",
					buffered: buffered,
					commandType: CommandType.Text);

		logger.LogInformation("IEnumerableMedium done");
	}

	return this.Ok(response);
}

The medium controller returns 51,529 (227 x 227) rows and the large controller up to 11,697,083 (227 x 227 x 227) rows.

[HttpGet("IEnumerableLarge")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetIEnumerableLarge()
{
	IEnumerable<Model.StockItemListDtoV1> response = null;

	using (SqlConnection db = new SqlConnection(this.connectionString))
	{
		logger.LogInformation("IEnumerableLarge start");

		response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
				sql: @"SELECT [SI3].[StockItemID] as ""ID"", [SI3].[StockItemName] as ""Name"", [SI3].[RecommendedRetailPrice], [SI3].[TaxRate] " +
						"FROM [Warehouse].[StockItems] as SI1 " +
						"CROSS JOIN [Warehouse].[StockItems] as SI2 " +
						"CROSS JOIN [Warehouse].[StockItems] as SI3",
				commandType: CommandType.Text);

		logger.LogInformation("IEnumerableLarge done");
	}

	return this.Ok(response);
}

The first version of “IEnumerableLarge” returned some odd Hyper Text Transfer Protocol (HTTP) error codes and Opera kept running out of memory.

After a roughly 3 minute delay the Opera browser displayed a 500 error

I think this error was due to the Azure App Service Load Balancer 230 second timeout.

Opera displaying out of memory error

I added query string parameters to the IEnumerable and IAsyncEnumerable methods so the number of records returned by QueryWithRetryAsync could be limited (using a TOP clause).
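
The snippet below, from the Dapper source code, shows the buffered vs. unbuffered code paths in the QueryAsync implementation: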

if (command.Buffered)
{
   var buffer = new List<T>();
   var convertToType = Nullable.GetUnderlyingType(effectiveType) ?? effectiveType;
   while (await reader.ReadAsync(cancel).ConfigureAwait(false))
   {
      object val = func(reader);
      buffer.Add(GetValue<T>(reader, effectiveType, val));
   }
   while (await reader.NextResultAsync(cancel).ConfigureAwait(false)) 
   { /* ignore subsequent result sets */ }
   command.OnCompleted();
   return buffer;
}
else
{
   // can't use ReadAsync / cancellation; but this will have to do
   wasClosed = false; // don't close if handing back an open reader; rely on the command-behavior
   var deferred = ExecuteReaderSync<T>(reader, func, command.Parameters);
   reader = null; // to prevent it being disposed before the caller gets to see it
   return deferred;
 }

The QueryWithRetryAsync method (my wrapper around Dapper’s QueryAsync) also has a buffered vs. unbuffered reader parameter (defaults to true) and I wanted to see if that had any impact.
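
The wrapper itself isn’t shown in this post; a minimal sketch of what such an extension method could look like, assuming Polly for the retry policy (the retry details below are assumptions, not the real implementation):

// Sketch only: a QueryWithRetryAsync-style extension method. The real wrapper
// is not shown in this post; Polly and the back-off strategy are assumptions.
using System;
using System.Collections.Generic;
using System.Data;
using System.Threading.Tasks;
using Dapper;
using Polly;
using Polly.Retry;

public static class DapperRetryExtensions
{
   // Retry up to three times with exponential back-off when SQL Server throws
   // (a production implementation would filter for transient error numbers).
   private static readonly AsyncRetryPolicy RetryPolicy = Policy
      .Handle<Microsoft.Data.SqlClient.SqlException>()
      .WaitAndRetryAsync(3, attempt => TimeSpan.FromSeconds(Math.Pow(2, attempt)));

   public static Task<IEnumerable<T>> QueryWithRetryAsync<T>(
      this IDbConnection connection,
      string sql,
      object param = null,
      bool buffered = true,
      CommandType? commandType = null)
   {
      // CommandFlags.Buffered materialises the whole result set into a list;
      // CommandFlags.None hands back a lazily evaluated sequence instead.
      CommandFlags flags = buffered ? CommandFlags.Buffered : CommandFlags.None;

      return RetryPolicy.ExecuteAsync(() => connection.QueryAsync<T>(
         new CommandDefinition(sql, param, commandType: commandType, flags: flags)));
   }
}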

[HttpGet("IEnumerableLarge")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetIEnumerableLarge([FromQuery] bool buffered = false, [FromQuery] int recordCount = 10)
{
	IEnumerable<Model.StockItemListDtoV1> response = null;

	using (SqlConnection db = new SqlConnection(this.connectionString))
	{
		logger.LogInformation("IEnumerableLarge start RecordCount:{recordCount} Buffered:{buffered}", recordCount, buffered);

		response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
			sql: $@"SELECT TOP({recordCount}) [SI3].[StockItemID] as ""ID"", [SI3].[StockItemName] as ""Name"", [SI3].[RecommendedRetailPrice], [SI3].[TaxRate] " +
					"FROM [Warehouse].[StockItems] as SI1 " +
					"CROSS JOIN [Warehouse].[StockItems] as SI2 " +
					"CROSS JOIN [Warehouse].[StockItems] as SI3",
			buffered: buffered,
			commandType: CommandType.Text);

		logger.LogInformation("IEnumerableLarge done");
	}

	return this.Ok(response);
}

I used Telerik Fiddler to call the StockItemsIAsyncEnumerable controller IEnumerable and IAsyncEnumerable methods. The Azure App Service was hosted in an Azure Application Plan (S1, 100 total ACU, 1.75 GB). I found Telerik Fiddler had problems with larger responses, and it would crash if the body of a larger response was viewed.

IEnumerableLarge method (buffered=false) response sizes and timings
IEnumerableLarge method (buffered=true) response sizes and timings

The unbuffered version had a slower Time To Last Byte (TTLB) and failed earlier, which I was expecting.

[HttpGet("IAsyncEnumerableLarge")]
public async Task<ActionResult<IAsyncEnumerable<Model.StockItemListDtoV1>>> GetAsyncEnumerableLarge([FromQuery] bool buffered = false, [FromQuery] int recordCount = 10)
{
    IEnumerable<Model.StockItemListDtoV1> response = null;

    using (SqlConnection db = new SqlConnection(this.connectionString))
    {
        logger.LogInformation("IAsyncEnumerableLarge start RecordCount:{recordCount} Buffered:{buffered}", recordCount, buffered);

        response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
            sql: $@"SELECT TOP({recordCount}) [SI3].[StockItemID] as ""ID"", [SI3].[StockItemName] as ""Name"", [SI3].[RecommendedRetailPrice], [SI3].[TaxRate] " +
                    "FROM [Warehouse].[StockItems] as SI1 " +
                    "CROSS JOIN [Warehouse].[StockItems] as SI2 " +
                    "CROSS JOIN [Warehouse].[StockItems] as SI3",
            buffered: buffered,
            commandType: CommandType.Text);

        logger.LogInformation("IAsyncEnumerableLarge done");
    }

    return this.Ok(response);
}

IAsyncEnumerableLarge method response sizes and timings

[HttpGet("IAsyncEnumerableLargeYield")]
public async IAsyncEnumerable<Model.StockItemListDtoV1> GetAsyncEnumerableLargeYield([FromQuery] int recordCount = 10)
{
	int rowCount = 0;

	using (SqlConnection db = new SqlConnection(this.connectionString))
	{
		logger.LogInformation("IAsyncEnumerableLargeYield start RecordCount:{recordCount}", recordCount);

		CommandDefinition commandDefinition = new CommandDefinition(
			$@"SELECT TOP({recordCount}) [SI3].[StockItemID] as ""ID"", [SI3].[StockItemName] as ""Name"", [SI3].[RecommendedRetailPrice], [SI3].[TaxRate] " +
						"FROM [Warehouse].[StockItems] as SI1 " +
						"CROSS JOIN [Warehouse].[StockItems] as SI2 " +
						"CROSS JOIN [Warehouse].[StockItems] as SI3",
			//commandTimeout:
			commandType: CommandType.Text
			//flags: CommandFlags.Pipelined
		);

		using var reader = await db.ExecuteReaderWithRetryAsync(commandDefinition);

		var rowParser = reader.GetRowParser<Model.StockItemListDtoV1>();

		while (await reader.ReadAsync())
		{
			rowCount++;

			if ((rowCount % 10000) == 0)
			{
				logger.LogInformation("Row count:{0}", rowCount);
			}

			yield return rowParser(reader);
		}
		logger.LogInformation("IAsyncEnumerableLargeYield done");
	}
}

When this post was written (August 2022) my understanding of Dapper and IAsyncEnumerable was limited, so I trialed the approach suggested in a StackOverflow post.

IAsyncEnumerableLargeYield method response sizes and timings

The IAsyncEnumerableLargeYield method was faster to start responding, the overall duration was less, and it returned significantly more records (7,000,000 vs. 13,000,000). I assume this was because the response was streamed so there wasn’t a timeout.
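
If the project targets .NET 6 or later this lines up with how ASP.NET Core handles IAsyncEnumerable<T> results: the System.Text.Json output formatter streams items to the client as they are produced rather than buffering the whole sequence, so the first bytes arrive well inside the load balancer’s 230 second window.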

Azure Application Insights displaying the IAsyncEnumerable with yield method executing

The results of my tests should be treated as “indicative” rather than “definitive”. In a future post I will compare the scalability of the different approaches. The number of records returned by IAsyncEnumerableLargeYield is not realistic, and in a “real-world” scenario paging or an alternate approach should be used.