.NET Core web API + Dapper – Distributed Cache

I have used LazyCache for several projects (The Things Network V2 HTTP, The Things Industries V2 MQTT, The Things Industries V3, the Swarm Space Azure IoT Connector, etc.) to cache Azure IoT Hub DeviceClient and other object instances.

The note on the wiki page, “For LazyCache v2+ users, you should consider switching away from LazyCache to IDistributedCache. More information at #59”, caught my attention.

I have written other posts about caching Dapper query results with the Dapper Extension Library, which worked well but had some configuration limitations. I also have posts about off-loading read-only workloads with Azure Active geo-replication or SQL Data Sync for Azure, which worked well in some scenarios but had limitations (performance and operational costs).

IDistributedCache has in-memory, SQL Server and Redis implementations, so I built an Azure AppService to explore the functionality in more detail. In another project I had been working with the Azure SignalR Service, where the use of the MessagePack library (rather than serialised JSON) caught my attention, so I have added basic support for that as well.

I explored the in-memory implementation (AddDistributedMemoryCache) on my development machine and found “tinkering” with the configuration options had little impact on the performance of my trivial sample application.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddApplicationInsightsTelemetry();

    builder.Services.AddSingleton<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

#if SERIALISATION_MESSAGE_PACK
    //MessagePackSerializer.DefaultOptions = MessagePack.Resolvers.ContractlessStandardResolver.Options;
    //MessagePackSerializer.DefaultOptions = MessagePack.Resolvers.ContractlessStandardResolver.Options.WithCompression(MessagePackCompression.Lz4Block);
    MessagePackSerializer.DefaultOptions = MessagePack.Resolvers.ContractlessStandardResolver.Options.WithCompression(MessagePackCompression.Lz4BlockArray);
#endif

#if DISTRIBUTED_CACHE_MEMORY
    builder.Services.AddDistributedMemoryCache(options =>
    {
        options.SizeLimit = 1000 * 1024 * 1024; // 1000MB
    });
#endif

#if DISTRIBUTED_CACHE_REDIS
    var configurationOptions = new ConfigurationOptions
    {
        EndPoints = { builder.Configuration.GetSection("RedisConnection").GetValue<string>("EndPoints") },
        AllowAdmin = true,
        Password = builder.Configuration.GetSection("RedisConnection").GetValue<string>("Password"),
        Ssl = true,
        ConnectRetry = 5,
        ConnectTimeout = 10000,
        SslProtocols = System.Security.Authentication.SslProtocols.Tls12,
        AbortOnConnectFail = false,
    };

    builder.Services.AddStackExchangeRedisCache(options =>
    {
        options.InstanceName = "Dapper WebAPI Instance";
        options.ConfigurationOptions = configurationOptions;
    });
#endif

#if DISTRIBUTED_CACHE_SQL_SERVER
    builder.Services.AddDistributedSqlServerCache(options =>
    {
        options.ConnectionString = builder.Configuration.GetConnectionString("CacheDatabase");
        options.SchemaName = "dbo";
        options.TableName = "StockItemsCache";
    });
#endif

    var app = builder.Build();

    // Configure the HTTP request pipeline.
    app.UseHttpsRedirection();
    app.MapControllers();
    app.Run();
}

I tested the SQL Server implementation (AddDistributedSqlServerCache) using the SQL Server on my development machine, and Azure SQL as a backing store. I did consider using SQL Azure In-Memory OLTP, but the performance improvement with my trivial example would most probably not be worth the additional cost of the required SKU.

CREATE TABLE [dbo].[StockItemsCache](
	[Id] [nvarchar](449) NOT NULL,
	[Value] [varbinary](max) NOT NULL,
	[ExpiresAtTime] [datetimeoffset](7) NOT NULL,
	[SlidingExpirationInSeconds] [bigint] NULL,
	[AbsoluteExpiration] [datetimeoffset](7) NULL,
PRIMARY KEY CLUSTERED 
(
	[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

The table used to store the data wasn’t very complex, and I could view the data associated with a cache key in SQL Server Management Studio.

SQL Server Management Studio displaying cache table contents

One of the applications I work on uses a complex SQL Server stored procedure to load reference data (updated daily), and being able to purge the cache at the end of this process might be useful. For a geographically distributed application, putting the Azure SQL instance “closer” to the application’s users might be worth considering.
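Purging could be as simple as removing the entry by its key; a minimal sketch, assuming an injected IDistributedCache instance and the same cache key the StockItems controller below uses.

[HttpDelete]
public async Task<ActionResult> Delete()
{
    // Remove the cached stock items so the next request re-loads them from the database
    await distributedCache.RemoveAsync("StockItems");

    return this.Ok();
}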

I trialed the Redis implementation with Memurai (on my development machine) and Azure Cache for Redis with multiple Azure AppService clients, and there was a significant improvement in performance.

[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
    var utcNow = DateTime.UtcNow;

    var cached = await distributedCache.GetAsync("StockItems");
    if (cached != null)
    {
#if SERIALISATION_JSON
        return this.Ok(JsonSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
#if SERIALISATION_MESSAGE_PACK
        return this.Ok(MessagePackSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
    }

    var stockItems = await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);

#if SERIALISATION_JSON
    await distributedCache.SetAsync("StockItems", JsonSerializer.SerializeToUtf8Bytes(stockItems), new DistributedCacheEntryOptions()
#endif
#if SERIALISATION_MESSAGE_PACK
    await distributedCache.SetAsync("StockItems", MessagePackSerializer.Serialize(stockItems), new DistributedCacheEntryOptions()
#endif
    {
        AbsoluteExpiration = new DateTime(utcNow.Year, utcNow.Month, DateTime.DaysInMonth(utcNow.Year, utcNow.Month), StockItemListAbsoluteExpiration.Hours, StockItemListAbsoluteExpiration.Minutes, StockItemListAbsoluteExpiration.Seconds)
    });

    return this.Ok(stockItems);
}

[HttpGet("NoLoad")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetNoLoad()
{
    var cached = await distributedCache.GetAsync("StockItems");
    if (cached == null)
    {
        return this.NoContent();
    }

#if SERIALISATION_JSON
    return this.Ok(JsonSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
#if SERIALISATION_MESSAGE_PACK
    return this.Ok(MessagePackSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
}

In my test environment the JSON payload for a list of stock items was a bit “chunky” at 25K bytes, so I added compile-time configurable support for the MessagePack library. This significantly reduced the size of the payload, to roughly 5K bytes with LZ4Block and 5.2K bytes with LZ4BlockArray, which should reduce network traffic.

Assuming the overheads of JSON and MessagePack serialisation are similar, given the much smaller MessagePack payload I would most probably use MessagePack with LZ4BlockArray compression (for improved compatibility with other implementations).
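A minimal sketch of how the payload sizes could be compared (assuming the stockItems list has already been loaded with the Dapper query).

// Rough payload size comparison - JSON vs. MessagePack with LZ4BlockArray compression
byte[] jsonBytes = JsonSerializer.SerializeToUtf8Bytes(stockItems);

byte[] messagePackBytes = MessagePackSerializer.Serialize(stockItems,
    MessagePack.Resolvers.ContractlessStandardResolver.Options.WithCompression(MessagePackCompression.Lz4BlockArray));

Console.WriteLine($"JSON:{jsonBytes.Length} bytes MessagePack+LZ4BlockArray:{messagePackBytes.Length} bytes");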

.NET Core web API + Dapper – Distributed Cache Redis HMGET Error

Initially, I was running the WebAPI Dapper DistributedCache application on my development box, and it ran on the third/fourth attempt after sorting out the Azure Cache for Redis connection string and firewall configuration.

StockItems list served from the cache running on my Desktop
The Azure Cache for Redis firewall configuration for my home office

I then deployed the WebAPI Dapper DistributedCache application to an Azure AppService and it failed (spoiler alert: it wasn’t DNS).

Azure AppService hosted StockItems list code failing in web browser

I then fired up Azure Application Insights to look at the logging information.

Azure Application Insights diagnostic information

There were many questions and tentative answers on Stack Overflow; I tried some of them, but they didn’t work in my scenario.

No connection is active/available to service this operation: HMGET Dapper WebAPI InstanceStockItems; 
UnableToConnect on xxxxxxxxxxxxxxxxxxxxxxxxxxx.redis.cache.windows.net:6380/Interactive, 
Initializing/NotStarted, 
last: NONE, 
origin: BeginConnectAsync, 
outstanding: 0, 
last-read: 10s ago, last-write: 10s ago, keep-alive: 60s, 
state: Connecting, 
mgr: 10 of 10 available, 
last-heartbeat: never, global: 10s ago, 
v: 2.2.4.27433, 
mc: 1/1/0, mgr: 10 of 10 available, 
clientName: XXXXXXXXXXXX, <----------------------This was important
IOCP: (Busy=0,Free=1000,Min=1,Max=1000), 
WORKER: (Busy=5,Free=1018,Min=1,Max=1023), 
v: 2.2.4.27433 
UnableToConnect on xxxxxxxxxxxxxxxxxxxxxxxxxxx.cache.windows.net:6380/Interactive, 
Initializing/NotStarted, 
last: NONE, 
origin: BeginConnectAsync, 
outstanding: 0, 
last-read: 10s ago, last-write: 10s ago, keep-alive: 60s, 
state: Connecting, 
mgr: 10 of 10 available, 
last-heartbeat: never, global: 10s ago, 
v: 2.2.4.27433 

I then recreated my configuration from scratch so that none of my random changes based on Stack Overflow posts would mess up my next round of debugging.

Azure AppService Virtual, Outbound and Additional Outbound IP Addresses

The error appeared to be a networking issue, most probably the firewall blocking connections, so I added the Azure AppService Virtual IP Address, which didn’t work…

Firewall configuration for Azure AppService Virtual IP Address

I then went back and re-read the error message and noticed “clientName: XXXXXXXXXXXX,”. The Azure AppService client name wasn’t the default domain…

Firewall configuration for Azure AppService Virtual and outbound IP Addresses

I then added the Outbound IP addresses to the firewall and the Azure AppService could connect to my Azure Cache for Redis instance.

StockItems list served from the cache running in an AppService

As soon as I realised it wasn’t DNS I should have “stopped and counted to ten”, debugged the issue carefully, and not immediately gone to Stack Overflow.

.NET Core web API + Dapper – LazyCache

I have used Alastair Crabtree’s LazyCache (read-through in-memory cache) on a couple of other projects e.g. The Things Industries V3 Connector (DeviceClient instances) and Swarm Space Azure IoT Connector (DeviceClient Instances and CS-Script binaries).

Adding LazyCache support to an ASP.Net Core WebAPI application is one line of code.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddApplicationInsightsTelemetry();

    // Add services to the container.
    builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

    // Register IAppCache as a singleton CachingService
    builder.Services.AddLazyCache();

    var app = builder.Build();

    // Configure the HTTP request pipeline.

    app.UseHttpsRedirection();

    app.MapControllers();

    app.Run();
}

One of the applications I work on uses a lot of “reference” data which is effectively static, e.g. national/regional holidays in the countries where it has customers, and data which is updated on a schedule, e.g. exchange rates downloaded at a known time every day.

//...
 private readonly TimeSpan StockItemListAbsoluteExpirationTime = new TimeSpan(23, 59, 59);
//....

private async Task<IEnumerable<Model.StockItemListDtoV1>> GetStockItemsAsync()
{
    return await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
}

[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
    MemoryCacheEntryOptions memoryCacheEntryOptions = new MemoryCacheEntryOptions()
    {
        Priority = CacheItemPriority.NeverRemove,
        AbsoluteExpiration = DateTime.UtcNow.Date.Add(StockItemListAbsoluteExpirationTime)
    };

    return this.Ok(await cache.GetOrAddAsync("StockItems", () => GetStockItemsAsync(), memoryCacheEntryOptions));
}

[HttpDelete()]
public ActionResult Delete()
{
    cache.Remove("StockItems");

    return this.Ok();
}

For the holiday information updates, the dataset could be deleted (“burped”) then refreshed the next time it is referenced, or deleted then immediately re-loaded. For exchange rate updates, the cache could automatically expire the dataset shortly after the scheduled download.
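For the exchange rate scenario the absolute expiration could be calculated relative to the scheduled download; a minimal sketch (the download time and grace period are assumptions, and would be configuration in a real application).

// Expire the cached exchange rates shortly after the scheduled daily download
// e.g. rates downloaded at 08:00 UTC, cache entry expires at 08:30 UTC
TimeSpan downloadTimeUtc = new TimeSpan(8, 0, 0);
TimeSpan gracePeriod = new TimeSpan(0, 30, 0);

DateTime expiresAtUtc = DateTime.UtcNow.Date.Add(downloadTimeUtc).Add(gracePeriod);
if (expiresAtUtc <= DateTime.UtcNow)
{
    expiresAtUtc = expiresAtUtc.AddDays(1); // Today's expiry has passed so expire after tomorrow's download
}

MemoryCacheEntryOptions memoryCacheEntryOptions = new MemoryCacheEntryOptions()
{
    AbsoluteExpiration = expiresAtUtc
};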

When I deployed my demo application to an Azure AppService, the requests (using the composer tab of Telerik Fiddler) took longer than when the application was running on my development box, which was unexpected. I think this was because the JavaScript Object Notation (JSON) response payload was larger than the Azure SQL Tabular Data Stream (TDS) payload.

The LazyCache source code hasn’t been updated for quite a while (though it is stable and works well), and moving to other libraries has been recommended in the wiki.

.NET Core web API + Dapper – MiniProfiler

The time to open the home page of one of the “legacy” applications I was working on was slowly increasing over time. After a quick investigation, it looked like the problem was a couple of Azure SQL stored procedures which were called many times as the home page was opening.

The application used a combination of ADO.Net and Dapper to access the Azure SQL database, so Stack Overflow MiniProfiler looked like a good option.

There were quite a few samples online like the one below which “inspired” my approach.

List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();

using (SqlConnection sqlConnection = new SqlConnection(_configuration.GetConnectionString("default")))
using (ProfiledDbConnection connection = new ProfiledDbConnection(sqlConnection, MiniProfiler.Current))
{
    await connection.OpenAsync();

    using (ProfiledDbCommand command = (ProfiledDbCommand)connection.CreateCommand())
    {
        command.CommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems];";
        command.CommandType = CommandType.Text;

        using (ProfiledDbDataReader reader = (ProfiledDbDataReader)await command.ExecuteReaderAsync())
        {
            var rowParser = reader.GetRowParser<Model.StockItemListDtoV1>();

            while (await reader.ReadAsync())
            {
                response.Add(rowParser(reader));
            }

            await reader.CloseAsync();
        }
    }

    await connection.CloseAsync();
}

A simple modification of the WebAPI Dapper PerformanceProfiling project program.cs file enabled MiniProfiler for the project.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

    builder.Services.AddMiniProfiler(options =>
    {
        options.RouteBasePath = "/profiler";
        //options.SqlFormatter = new StackExchange.Profiling.SqlFormatters.InlineFormatter();
        options.TrackConnectionOpenClose = true;
        //options.PopupShowTrivial = false
    });

    var app = builder.Build();

    app.UseMiniProfiler();

    // Configure the HTTP request pipeline.

    app.UseHttpsRedirection();

    app.MapControllers();

    app.Run();
}

For simple Dapper calls, the modifications to WebAPI controllers were minimal, and could be done largely with Visual Studio 2022 “search & replace.”

[HttpGet("Dapper")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapper()
{
    IEnumerable<Model.StockItemListDtoV1> response;

    using (IDbConnection db = dapperContext.ConnectionCreate())
    {
        response = await db.QueryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
    }

    return this.Ok(response);
}
World Wide Importers database list of stock items

The World Wide Importers database has approximately 250 StockItems which was representative of one of the problematic queries.

[HttpGet("DapperProfiled")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapperProfiled()
{
    IEnumerable<Model.StockItemListDtoV1> response;

    using (IDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        response = await db.QueryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
    }

    return this.Ok(response);
}

When I executed the GetDapperProfiled() method of the StockItems controller on my development box it took roughly 2.4 seconds.

MiniProfiler results for StockItems query running on my desktop

I sometimes ran the website on my development box, so when I used “toggle trivial gaps” it was easier to see where the delays were.

When I executed the GetDapperProfiled() method of the StockItems controller running in an Azure AppService it took roughly 20 msec.

MiniProfiler results for StockItems query running in an Azure AppService

In one application a QueryMultipleAsync is used to retrieve information about a product and a list of its attributes. The World Wide Importers database has Invoices which have invoice lines and Transactions which was representative of another problematic query.

[HttpGet("DapperProfiledQueryMultiple")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapperProfiledQueryMultiple([Required][Range(1, int.MaxValue, ErrorMessage = "Invoice id must be greater than 0")] int id)
{
    Model.InvoiceSummaryGetDtoV1 response = null;

    using (ProfiledDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        var invoiceSummary = await db.QueryMultipleAsync("[Sales].[InvoiceSummaryGetV1]", param: new { InvoiceId = id }, commandType: CommandType.StoredProcedure);

        response = await invoiceSummary.ReadSingleOrDefaultAsync<Model.InvoiceSummaryGetDtoV1>();
        if (response == default)
        {
            return this.NotFound($"Invoice:{id} not found");
        }

        response.InvoiceLines = await invoiceSummary.ReadAsync<Model.InvoiceLineSummaryListDtoV1>();

        response.StockItemTransactions = await invoiceSummary.ReadAsync<Model.StockItemTransactionSummaryListDtoV1>();
    }

    return this.Ok(response);
}
World Wide Importers database list of invoice lines and transactions for a StockItem
MiniProfiler results for Invoice Item Query Multiple running in an Azure AppService

I couldn’t see any results for reading the StockItem Invoice lines and Transactions so I wrapped each ReadAsync with a MiniProfiler.Current.Step.

[HttpGet("DapperProfiledQueryMultipleStep")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapperProfiledQueryMultipleStep([Required][Range(1, int.MaxValue, ErrorMessage = "Invoice id must be greater than 0")] int id)
{
    Model.InvoiceSummaryGetDtoV1 response = null;

    using (IDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        SqlMapper.GridReader invoiceSummary;

        using (MiniProfiler.Current.Step("db.QueryMultipleAsync"))
        {
            invoiceSummary = await db.QueryMultipleAsync("[Sales].[InvoiceSummaryGetV1]", param: new { InvoiceId = id }, commandType: CommandType.StoredProcedure);
        }

        using (MiniProfiler.Current.Step("invoiceSummary.ReadSingleOrDefaultAsync"))
        {
            response = await invoiceSummary.ReadSingleOrDefaultAsync<Model.InvoiceSummaryGetDtoV1>();
        }
        if (response == default)
        {
            return this.NotFound($"Invoice:{id} not found");
        }

        using (MiniProfiler.Current.Step("invoiceSummaryLine.ReadAsync"))
        {
            response.InvoiceLines = await invoiceSummary.ReadAsync<Model.InvoiceLineSummaryListDtoV1>();
        }

        using (MiniProfiler.Current.Step("TransactionSummary.ReadAsync"))
        {
            response.StockItemTransactions = await invoiceSummary.ReadAsync<Model.StockItemTransactionSummaryListDtoV1>();
        }
    }

    return this.Ok(response);
}

With larger lists, every so often there were ReadAsync calls that took more than a “trivial” amount of time. I surmise this was some sort of batching done by the underlying ReadAsync + NextResultAsync methods of a SqlDataReader.
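To confirm this, the result set transitions could be profiled with a DbDataReader rather than Dapper’s GridReader; a minimal sketch wrapping each NextResultAsync in a MiniProfiler step (not the production code).

using (ProfiledDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
{
    await db.OpenAsync();

    using (DbCommand command = db.CreateCommand())
    {
        command.CommandText = "[Sales].[InvoiceSummaryGetV1]";
        command.CommandType = CommandType.StoredProcedure;

        DbParameter invoiceId = command.CreateParameter();
        invoiceId.ParameterName = "@InvoiceId";
        invoiceId.Value = id;
        command.Parameters.Add(invoiceId);

        using (DbDataReader reader = await command.ExecuteReaderAsync())
        {
            while (await reader.ReadAsync()) { } // Invoice summary result set

            using (MiniProfiler.Current.Step("NextResultAsync-InvoiceLines"))
            {
                await reader.NextResultAsync();
            }
            while (await reader.ReadAsync()) { } // Invoice lines result set

            using (MiniProfiler.Current.Step("NextResultAsync-StockItemTransactions"))
            {
                await reader.NextResultAsync();
            }
            while (await reader.ReadAsync()) { } // Stock item transactions result set
        }
    }
}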

I need to investigate the use of

using (IDbConnection db = new ProfiledDbConnection(new SqlConnection(_configuration.GetConnectionString("default")), MiniProfiler.Current))
{
//...
}

.NET Core web API + Dapper – HTTP Patch

The application I’m currently working on has some tables with many columns, and these were proving painful to update with HTTP PUT methods. Over the last couple of releases, I have been extending the customer-facing API with PATCH methods so the client can specify only the values to be changed.

[
    {
        "op": "replace",
        "path": "/name",
        "value": "USB missile launcher (Green)."
    },
    {
        "op": "replace",
        "path": "/UnitPrice",
        "value": 25
    },
    {
        "op": "replace",
        "path": "/recommendedRetailPrice",
        "value": 37.41
    }
]

JSON Patch is a format for specifying updates to be applied to a resource.

Stock Items list before HTTP Patch operation

A JSON Patch document is an array of operations, each of which identifies a particular type of change.
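JSON Patch (RFC 6902) defines six operation types: add, remove, replace, move, copy and test, e.g. (illustrative paths, not ones the StockItemPatchDtoV1 below supports).

[
  { "op": "test", "path": "/name", "value": "USB missile launcher (Green)." },
  { "op": "copy", "from": "/unitPrice", "path": "/recommendedRetailPrice" },
  { "op": "remove", "path": "/marketingComments" }
]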

Using Telerik Fiddler Composer functionality to apply an HTTP PATCH
Stock Items list after HTTP Patch operation

The StockItemPatchDtoV1 class is decorated with DataAnnotations to ensure the contents are valid.

public class StockItemPatchDtoV1
{
    [Required]
    [StringLength(100, MinimumLength = 1, ErrorMessage = "The name text must be at least {2} and no more than {1} characters long")]  // These would be constants in a real application
    public string Name { get; set; }

    [Required]
    [Range(0.0, 100.0)] // These would be constants in a real application
    public decimal UnitPrice { get; set; }

    [Required]
    [Range(0.0, 1000000.0)] // These would be constants in a real application
    public decimal RecommendedRetailPrice { get; set; }
}

The StockItemsController [HttpPatch(“{id}”)] method retrieves the stock item to be updated, then uses the ApplyTo method and TryValidateModel to update only the specified fields.

[HttpPatch("{id}")]
public async Task<ActionResult<Model.StockItemGetDtoV1>> Patch([FromBody] JsonPatchDocument<Model.StockItemPatchDtoV1> stockItemPatch, int id)
{
    Model.StockItemGetDtoV1 stockItem;

    using (IDbConnection db = dapperContext.ConnectionCreate())
    {
        stockItem = await db.QuerySingleOrDefaultWithRetryAsync<Model.StockItemGetDtoV1>(sql: "[Warehouse].[StockItemsStockItemLookupV1]", param: new { stockItemId = id }, commandType: CommandType.StoredProcedure);

        if (stockItem == default)
        {
            logger.LogInformation("StockItem:{id} not found", id);

            return this.NotFound($"StockItem:{id} not found");
        }

        Model.StockItemPatchDtoV1 stockItemPatchDto = mapper.Map<Model.StockItemPatchDtoV1>(stockItem);

        stockItemPatch.ApplyTo(stockItemPatchDto, ModelState);

        if (!ModelState.IsValid || !TryValidateModel(stockItemPatchDto))
        {
            logger.LogInformation("stockItemPatchDto invalid {0}", string.Join(Environment.NewLine, ModelState.Values.SelectMany(v => v.Errors).Select(v => v.ErrorMessage + " " + v.Exception))); // would extract this out into shared module

            return BadRequest(ModelState);
        }

        mapper.Map(stockItemPatchDto, stockItem);

        await db.ExecuteWithRetryAsync(sql: "UPDATE Warehouse.StockItems SET StockItemName=@Name, UnitPrice=@UnitPrice, RecommendedRetailPrice=@RecommendedRetailPrice WHERE StockItemId=@Id", param: stockItem, commandType: CommandType.Text);
    }

    return this.Ok();
}

Initially the HTTP Patch method returned this error message.

HTTP/1.1 400 Bad Request
Content-Type: application/problem+json; charset=utf-8
Date: Tue, 27 Jun 2023 09:20:30 GMT
Server: Kestrel
Transfer-Encoding: chunked

1d7
{"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1","title":"One or more validation errors occurred.","status":400,"traceId":"00-665a6ee9ed1105a105237c421793af5d-1719bda40c0b7d5d-00","errors":{"$":["The JSON value could not be converted to Microsoft.AspNetCore.JsonPatch.JsonPatchDocument`1[devMobile.WebAPIDapper.HttpPatch.Model.StockItemPatchDtoV1]. Path: $ | LineNumber: 0 | BytePositionInLine: 1."],"stockItemPatch":["The stockItemPatch field is required."]}}
0

After some research I worked out that I had forgotten to wire up the Newtonsoft JSON support with builder.Services.AddControllers().AddNewtonsoftJson();
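For reference, the wiring in Program.cs looks like this (the Microsoft.AspNetCore.Mvc.NewtonsoftJson NuGet package is required).

// JsonPatchDocument<T> model binding is implemented with Newtonsoft.Json so the
// Newtonsoft.Json based input/output formatters have to be registered
builder.Services.AddControllers().AddNewtonsoftJson();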

.NET Core web API + Dapper – Dependency Injection

So far, to keep the code really obvious (which tends to be more verbose), I have limited the use of Dependency Injection (DI). I have been “injecting” an instance of the IConfiguration interface, then retrieving the database connection string and other configuration. This isn’t a great approach, as the database connection string name ends up in multiple files etc.

public CustomerController(IConfiguration configuration, ILogger<CustomerController> logger)
{
    this.connectionString = configuration.GetConnectionString("WorldWideImportersDatabase");
    this.logger = logger;
}

I wanted to explore the impact(s) of different DI approaches on ADO.Net connection pooling. With the system idle, I used exec sp_who to see how many connections there were to my SQL Azure database.

SQL Server Management Studio(SSMS) sp_who query – Idle

Dapper Context approach

The application I’m currently working on uses a Command and Query Responsibility Segregation(CQRS) like approach. The application is largely “read only” so we have replicas of the database to improve the performance of queries hence the ConnectionReadCreate and ConnectionWriteCreate methods.

namespace devMobile.WebAPIDapper.ListsDIBasic
{
   using System.Data;
   using System.Data.SqlClient;

   using Microsoft.Extensions.Configuration;

   public interface IDapperContext 
   {
      public IDbConnection ConnectionCreate();
      public IDbConnection ConnectionCreate(string connectionStringName);

      public IDbConnection ConnectionReadCreate();
      public IDbConnection ConnectionWriteCreate();
   }

   public class DapperContext : IDapperContext
   {
      private readonly IConfiguration _configuration;

      public DapperContext(IConfiguration configuration)
      {
         _configuration = configuration;
      }

      public IDbConnection ConnectionCreate()
      { 
         return new SqlConnection(_configuration.GetConnectionString("default"));
      }

      public IDbConnection ConnectionCreate(string connectionStringName)
      {
         return new SqlConnection(_configuration.GetConnectionString(connectionStringName));
      }

      public IDbConnection ConnectionReadCreate()
      {
         return new SqlConnection(_configuration.GetConnectionString("default-read"));
      }

      public IDbConnection ConnectionWriteCreate()
      {
         return new SqlConnection(_configuration.GetConnectionString("default-write"));
      }
   }
}

I have experimented with how the IDapperContext context is constructed in the application startup with builder.Services.AddSingleton, builder.Services.AddScoped and builder.Services.AddTransient.

public class Program
{
    public static void Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);

        // Add services to the container - one of the following registrations was uncommented for each experiment.
        builder.Services.AddSingleton<IDapperContext>(s => new DapperContext(builder.Configuration));
        //builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));
        //builder.Services.AddScoped<IDapperContext>(s => new DapperContext(builder.Configuration));

        builder.Services.AddControllers();

        var app = builder.Build();

        // Configure the HTTP request pipeline.

        app.UseHttpsRedirection();

        app.MapControllers();

        app.Run();
    }
}

Then in the StockItems controller the IDapperContext interface implementation is used to create an IDbConnection for Dapper operations to use. I also added “WAITFOR DELAY '00:00:02'” to the query to extend the duration of the requests.

[ApiController]
[Route("api/[controller]")]
public class StockItemsController : ControllerBase
{
   private readonly ILogger<StockItemsController> logger;
   private readonly IDapperContext dapperContext;

   public StockItemsController(ILogger<StockItemsController> logger, IDapperContext dapperContext)
   {
      this.logger = logger;

      this.dapperContext = dapperContext;
   }

   [HttpGet]
   public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
   {
      IEnumerable<Model.StockItemListDtoV1> response;

      using (IDbConnection db = dapperContext.ConnectionCreate())
      {
         //response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]", commandType: CommandType.Text);
         response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]; WAITFOR DELAY '00:00:02'", commandType: CommandType.Text);
      }

      return this.Ok(response);
   }
...
}

I ran a stress testing application which simulated 50 concurrent users. When the stress test rig was stopped, all the connections in the pool were closed after roughly 5 minutes.

SQL Server Management Studio(SSMS) sp_who query – stress test
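A load test along these lines can be scripted in a few lines of C#; a minimal sketch where the endpoint URL, user count and iteration count are assumptions.

using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

const int ConcurrentUsers = 50;
const int RequestsPerUser = 100;

HttpClient httpClient = new HttpClient();

// Each "user" is a task issuing requests in a loop
var users = Enumerable.Range(0, ConcurrentUsers).Select(async user =>
{
    for (int i = 0; i < RequestsPerUser; i++)
    {
        using (HttpResponseMessage response = await httpClient.GetAsync("https://localhost:5001/api/StockItems"))
        {
            response.EnsureSuccessStatusCode();
        }
    }
});

await Task.WhenAll(users);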

Nasty approach

I then tried using DI to create a new connection for each request using builder.Services.AddScoped.

public class Program
{
    public static void Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);

        // Add services to the container.

        //builder.Services.AddSingleton<IDbConnection>(s => new SqlConnection(builder.Configuration.GetConnectionString("default")));
        builder.Services.AddScoped<IDbConnection>(s => new SqlConnection(builder.Configuration.GetConnectionString("default")));
        //builder.Services.AddTransient<IDbConnection>(s => new SqlConnection(builder.Configuration.GetConnectionString("default")));

        builder.Services.AddControllers();

        var app = builder.Build();

        app.UseHttpsRedirection();

        app.MapControllers();

        app.Run();
    }
}

The code in the Get method was reduced. I also added “WAITFOR DELAY '00:00:02'” to the query to extend the duration of the requests.

public class StockItemsController : ControllerBase
{
   private readonly ILogger<StockItemsController> logger;
   private readonly IDbConnection dbConnection;

   public StockItemsController(ILogger<StockItemsController> logger, IDbConnection dbConnection)
   {
      this.logger = logger;

      this.dbConnection = dbConnection;
   }

   [HttpGet]
   public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
   {
      return this.Ok(await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]; WAITFOR DELAY '00:00:02';", commandType: CommandType.Text));
      // return this.Ok(await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]", commandType: CommandType.Text));
    }
...
}

With the stress test rig running the number of active connections was roughly the same as the DapperContext based implementation.

I don’t like this approach, so I will stick with the DapperContext.

.NET nanoFramework RAK11200 – Brownout Voltage Revisited

The voltage my test setup was calculating looked wrong; then I realised that the sample calculation in the RAK Wireless forums wasn’t applicable to my setup.

I reassembled my RAK11200 WisBlock WiFi Module, RAK19001 WisBlock Base Board, RAK1901 WisBlock Temperature and Humidity Sensor, 1200mAh Lithium Polymer (LiPo) battery, SKU920100 Solar Board test setup, put a new 9V battery in my multimeter (I had forgotten to turn it off last time), then collected some data. A = ReadValue(), C = ReadRatio(), E = measured battery voltage.

Excel spreadsheet for calculating ratio
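The scaling factor is the average ratio of the ADC reading to the multimeter voltage; a minimal sketch of the spreadsheet calculation (the sample values are placeholders, not my measurements).

// A = ReadValue(), E = measured battery voltage, C = A / E
double[] adcReadValues = { 2950.0, 2888.0, 2825.0 }; // placeholder samples
double[] measuredVoltages = { 4.08, 3.99, 3.90 };    // placeholder samples

double ratioSum = 0.0;
for (int i = 0; i < adcReadValues.Length; i++)
{
    ratioSum += adcReadValues[i] / measuredVoltages[i];
}

double scalingFactor = ratioSum / adcReadValues.Length; // approximately 723.7685 for my setup

double batteryVoltage = adcReadValues[0] / scalingFactor;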

I updated the formula used to calculate the battery voltage and deployed the application.

public static void Main()
{
    Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} devMobile.IoT.RAK.Wisblock.AzureIoTHub.RAK11200.PowerSleep starting");

    Thread.Sleep(5000);

    try
    {
        double batteryVoltage;

        Configuration.SetPinFunction(Gpio.IO04, DeviceFunction.I2C1_DATA);
        Configuration.SetPinFunction(Gpio.IO05, DeviceFunction.I2C1_CLOCK);

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Battery voltage measurement");

        // Configure Analog input (AIN0) port then read the "battery charge"
        AdcController adcController = new AdcController();

        using (AdcChannel batteryVoltageAdcChannel = adcController.OpenChannel(AdcControllerChannel))
        {
            batteryVoltage = batteryVoltageAdcChannel.ReadValue() / 723.7685;

            Debug.WriteLine($" BatteryVoltage {batteryVoltage:F2}");

            if (batteryVoltage < Config.BatteryVoltageBrownOutThreshold)
            {
                Sleep.EnableWakeupByTimer(Config.FailureRetryInterval);
                Sleep.StartDeepSleep();
            }
        }
        //...
    }
    catch (Exception ex)
    {
        //...
    }
}

To test the accuracy of the voltage calculation I am going to run my setup on the office windowsill for a week regularly measuring the voltage. Then, turn the solar panel over (so the battery is not getting charged) and monitor the battery discharging until the RAK11200 WisBlock WiFi Module won’t connect to the network.

.NET nanoFramework RAK11200 – Brownout Voltage

My test setup was a RAK11200 WisBlock WiFi Module, RAK19001 WisBlock Base Board, RAK1901 WisBlock Temperature and Humidity Sensor, 1200mAh Lithium Polymer (LiPo) battery and SKU920100 Solar Board. The test setup uploads temperature, humidity and battery voltage telemetry to an Azure IoT Hub every 5 minutes (a short delay, so battery life is reduced).

The first step was to check that I could get a “battery voltage” value from the RAKWireless RAK11200 WisBlock WiFi Module on a RAK19001 WisBlock Base Board for managing “brownouts”, and send it to my Azure IoT Hub.

RAK19001 Power supply schematic

The RAK19001 WisBlock Base Board has a voltage divider (R4 & R5, with output ADC_VBAT) which is connected to pin 21 (AIN0) on the CPU slot connector.

RAK19001 connector schematic

The RAK19001 WisBlock Base Board has quite a low leakage current, so the majority of the power consumption should be from the RAK11200 WisBlock WiFi Module.

RAK19001 leakage current from specifications

I used AdcController + AdcChannel to read AIN0 and modified the code using the formula (for a RAK4631 module) in the RAK Wireless forums to calculate the battery voltage. (UPDATE: this calculation is not applicable to my scenario.)

RAK11200 Schematic with battery voltage analog input highlighted

When “slept”, the RAK11200 WisBlock WiFi Module power consumption is very low.

RAK11200 low power current from specifications
public static void Main()
{
    Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} devMobile.IoT.RAK.Wisblock.AzureIoTHub.RAK11200.PowerSleep starting");

    Thread.Sleep(5000); // This delay is so the debugger can attach; consider removing in release version

    try
    {
        double batteryVoltage;

        Configuration.SetPinFunction(Gpio.IO04, DeviceFunction.I2C1_DATA);
        Configuration.SetPinFunction(Gpio.IO05, DeviceFunction.I2C1_CLOCK);

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Battery voltage measurement");

        // Configure Analog input (AIN0) port then read the "battery charge"
        AdcController adcController = new AdcController();

        using (AdcChannel batteryVoltageAdcChannel = adcController.OpenChannel(AdcControllerChannel))
        {

            // https://forum.rakwireless.com/t/custom-li-ion-battery-voltage-calculation-in-rak4630/4401/7
            // When I checked with multimeter I had to increase 1.72 to 1.9
            batteryVoltage = batteryVoltageAdcChannel.ReadValue() * (3.0 / 4096) * 1.9;

            Debug.WriteLine($" BatteryVoltage {batteryVoltage:F2}");

            if (batteryVoltage < Config.BatteryVoltageBrownOutThreshold)
            {
                Sleep.EnableWakeupByTimer(Config.FailureRetryInterval);
                Sleep.StartDeepSleep();
            }
        }

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Wifi connecting");

        if (!WifiNetworkHelper.ConnectDhcp(Config.Ssid, Config.Password, requiresDateTime: true))
        {
            if (NetworkHelper.HelperException != null)
            {
                Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} WifiNetworkHelper.ConnectDhcp failed {NetworkHelper.HelperException}");
            }

            Sleep.EnableWakeupByTimer(Config.FailureRetryInterval);
            Sleep.StartDeepSleep();
        }
        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Wifi connected");

        // Configure the SHTC3 
        I2cConnectionSettings settings = new(I2cDeviceBusID, Shtc3.DefaultI2cAddress);

        string payload;

        using (I2cDevice device = I2cDevice.Create(settings))
        using (Shtc3 shtc3 = new(device))
        {
            if (shtc3.TryGetTemperatureAndHumidity(out var temperature, out var relativeHumidity))
            {
                Debug.WriteLine($" Temperature {temperature.DegreesCelsius:F1}°C Humidity {relativeHumidity.Value:F0}% BatteryVoltage {batteryVoltage:F2}");

                payload = $"{{\"RelativeHumidity\":{relativeHumidity.Value:F0},\"Temperature\":{temperature.DegreesCelsius:F1}, \"BatteryVoltage\":{batteryVoltage:F2}}}";
            }
            else
            {
                Debug.WriteLine($" BatteryVoltage {batteryVoltage:F2}");

                payload = $"{{\"BatteryVoltage\":{batteryVoltage:F2}}}";
            }

#if SLEEP_SHT3C
            shtc3.Sleep();
#endif
        }

        // Configure the HttpClient uri, certificate, and authorization
        string uri = $"{Config.AzureIoTHubHostName}.azure-devices.net/devices/{Config.DeviceID}";

        HttpClient httpClient = new HttpClient()
        {
            SslProtocols = System.Net.Security.SslProtocols.Tls12,
            HttpsAuthentCert = new X509Certificate(Config.DigiCertBaltimoreCyberTrustRoot),
            BaseAddress = new Uri($"https://{uri}/messages/events?api-version=2020-03-13"),
        };
        httpClient.DefaultRequestHeaders.Add("Authorization", SasTokenGenerate(uri, Config.Key, DateTime.UtcNow.Add(Config.SasTokenRenewFor)));

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub device {Config.DeviceID} telemetry update start");

        HttpResponseMessage response = httpClient.Post("", new StringContent(payload));

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Response code:{response.StatusCode}");

        response.EnsureSuccessStatusCode();
    }
    catch (Exception ex)
    {
        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub telemetry update failed:{ex.Message} {ex?.InnerException?.Message}");

        Sleep.EnableWakeupByTimer(Config.FailureRetryInterval);
        Sleep.StartDeepSleep();
    }

    Sleep.EnableWakeupByTimer(Config.TelemetryUploadInterval);
#if SLEEP_LIGHT
    Sleep.StartLightSleep();
#endif
#if SLEEP_DEEP
    Sleep.StartDeepSleep();
#endif
}

The nanoFramework.Hardware.Esp32.Sleep functionality supports LightSleep and DeepSleep states. The ESP32 device can be “woken up” by GPIO pin(s), Touch pad activity or by a Timer.

RAK11200+RAK19007+RAK1901+ LiPo battery test rig

After some “tinkering” I found the voltage calculation was surprisingly accurate (usually within 0.01V) for my RAK19001 and RAK19007 base boards.

When the battery voltage was close to the minimum working voltage of the ESP32, the device would reboot when the WifiNetworkHelper.ConnectDhcp method was called. This would quickly drain the battery flat, even when the solar panel was trying to charge it.

Now, before trying to connect to the wireless network, the battery voltage is checked, and if it is too low (more experimentation required) the device goes into a deep sleep for a configurable period (more experimentation required). This is so the solar panel can charge the battery to a level where wireless connectivity will work.

Swarm Space – Asset Tracker Payload Formatter

After writing Swarm Space – Payload Formatter Debugging, I tested the approach by creating a new payload formatter for my new Swarm Asset Tracker.

Swarm Asset Tracker device

The Swarm Asset Tracker has a slightly different payload from the Swarm Eval Kit; it is detailed in the product manual.

Swarm Asset Tracker JSON payload

The first message, sent shortly after I powered up the device, had the latitude and longitude of Null Island.

The Asset Tracker UserApplicationId is 65002, and the payload is similar to the Swarm Eval Kit’s. I created some message payloads (the location of Christchurch Cathedral) for testing.

The JSON payload sent by my Swarm Asset Tracker

{
  "dt": 1677396395,
  "lt": -43.5333,
  "ln": 172.6333,
  "al": 25,
  "sp": 0,
  "hd": 126,
  "gj": 92,
  "gs": 1,
  "bv": 4103,
  "tp": 20,
  "rs": -110,
  "tr": -107,
  "ts": 3,
  "td": 1677396357,
  "hp": 166,
  "vp": 187,
  "tf": 36526
}

The Base64 representation of the payload sent by my Swarm Asset Tracker

ew0KICAiZHQiOiAxNjc3Mzk2Mzk1LA0KICAibHQiOiAtNDMuNTMzMywNCiAgImxuIjogMTcyLjYzMzMsDQogICJhbCI6IDI1LA0KICAic3AiOiAwLA0KICAiaGQiOiAxMjYsDQogICJnaiI6IDkyLA0KICAiZ3MiOiAxLA0KICAiYnYiOiA0MTAzLA0KICAidHAiOiAyMCwNCiAgInJzIjogLTExMCwNCiAgInRyIjogLTEwNywNCiAgInRzIjogMywNCiAgInRkIjogMTY3NzM5NjM1NywNCiAgImhwIjogMTY2LA0KICAidnAiOiAxODcsDQogICJ0ZiI6IDM2NTI2DQp9
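For reference, the Base64 representation of a test payload can be generated with a couple of lines (the file name is an assumption).

// Convert a sample JSON payload file to its Base64 representation
byte[] payloadBytes = File.ReadAllBytes("AssetTrackerPayload.json");

Console.WriteLine(Convert.ToBase64String(payloadBytes));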

The initial version of my payload formatter

using System;
using System.Collections.Generic;
using System.Globalization;
using Newtonsoft.Json.Linq;

public class FormatterUplink : PayloadFormatter.IFormatterUplink
{
    public JObject Evaluate(IDictionary<string, string> properties, uint organisationId, uint deviceId, byte deviceType, ushort userApplicationId, JObject payloadJson, string payloadText, byte[] payloadBytes)
    {
        JObject telemetryEvent = new JObject();

        if ((payloadText != "") && (payloadJson != null))
        {
            JObject location = new JObject();

            location.Add("lat", payloadJson.GetValue("lt"));
            location.Add("lon", payloadJson.GetValue("ln"));
            location.Add("alt", payloadJson.GetValue("al"));

            telemetryEvent.Add("DeviceLocation", location);

            // Course & speed
            telemetryEvent.Add("Course", payloadJson.GetValue("hd"));
            telemetryEvent.Add("Speed", payloadJson.GetValue("sp"));

            // Battery voltage
            telemetryEvent.Add("BatteryVoltage", payloadJson.GetValue("bv"));

            // RSSI
            telemetryEvent.Add("RSSI", payloadJson.GetValue("rs"));

            properties.Add("iothub-creation-time-utc", DateTimeOffset.FromUnixTimeSeconds((long)payloadJson.GetValue("dt")).ToString("s", CultureInfo.InvariantCulture));
        }

        return telemetryEvent;
    }
}

The PayloadFormatterMaintenanceApplication command line I used for testing my Swarm Asset Tracker payload formatter

The console output of my Swarm Asset Tracker payload formatter

The PayloadFormatterMaintenanceApplication is better than trying to debug a payload formatter in a staging/production environment.

Currently the payload formatters still have to be manually uploaded to the application’s Azure Blob Storage for final testing.

Swarm Space – Payload Formatter Debugging

After Swarm Space – Uplink Payload Formatters revisited, I wrote a couple of payload formatters; they were easy to get wrong, and the Azure Application Insights error messages were unhelpful.

namespace PayloadFormatter // Additional namespace for shortening interface when usage in formatter code
{
    using System.Collections.Generic;

    using Newtonsoft.Json.Linq;

    public interface IFormatterUplink
    {
        public JObject Evaluate(IDictionary<string, string> properties, uint organisationId, uint deviceId, byte deviceType, ushort userApplicationId, JObject payloadJson, string payloadText, byte[] payloadBytes);
    }

    public interface IFormatterDownlink
    {
        public byte[] Evaluate(IDictionary<string, string> properties, uint organisationId, uint deviceId, byte deviceType, ushort userApplicationId, JObject payloadJson, string payloadText, byte[] payloadBytes);
    }
}

The definitions of the uplink & downlink payload formatter evaluator interfaces have been updated and shifted to a new project.

Visual Studio 2022 Solution with payloadformatter maintenance application

I built a console application to help with developing and debugging uplink or downlink formatters. The application has a number of command line parameters which specify the formatter to be used, UserApplicationId, OrganizationId, DeviceType etc.

public class CommandLineOptions
{
    [Option('d', "Direction", Required = true, HelpText = "Test Uplink or DownLink formatter")]
    public string Direction { get; set; }

    [Option('p', "filename", HelpText = "Uplink or Downlink Payload file name")]
    public string PayloadFilename { get; set; } = string.Empty;

    [Option('o', "OrganisationId", Required = true, HelpText = "Organisation unique identifier")]
    public uint OrganizationId { get; set; }

    [Option('i', "DeviceId", Required = true, HelpText = "Device unique identitifer")]
    public uint DeviceId { get; set; }

    [Option('t', "DeviceType", Required = true, HelpText = "Device type number")]
    public byte DeviceType { get; set; }

    [Option('u', "UserApplicationId", Required = true, HelpText = "User Application Id")]
    public ushort UserApplicationId { get; set; }

    [Option('h', "SwarmHiveReceivedAtUtc", HelpText = "Swarm Hive received at time UTC")]
    public DateTime? SwarmHiveReceivedAtUtc { get; set; }

    [Option('w', "UplinkWebHookReceivedAtUtc", HelpText = "Webhook received at time UTC")]
    public DateTime? UplinkWebHookReceivedAtUtc { get; set; }

    [Option('s', "Status", HelpText = "Uplink message status")]
    public byte? Status { get; set; }

    [Option('c', "Client", HelpText = "Client identifier")]
    public string Client { get; set; }
}
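A hypothetical invocation for testing a downlink formatter, based on the options above (the identifiers are placeholders).

PayloadFormatterMaintenanceApplication -d Downlink -p FanStatus.json -o 65535 -i 1234567 -t 1 -u 65002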

The downlink formatter (a similar approach is used for the uplink one) loads the sample file as an array of bytes, then tries to convert it to text, and finally to JSON. The formatter code is then “compiled” and executed with the file payload and command line parameters.

private static async Task DownlinkFormatterCore(CommandLineOptions options)
{
    Dictionary<string, string> properties = new Dictionary<string, string>();

    string formatterFolder = Path.Combine(Environment.CurrentDirectory, "downlink");
    Console.WriteLine($"Downlink- uplinkFormatterFolder: {formatterFolder}");

    string formatterFile = Path.Combine(formatterFolder, $"{options.UserApplicationId}.cs");
    Console.WriteLine($"Downlink- UserApplicationId: {options.UserApplicationId}");
    Console.WriteLine($"Downlink- Payload formatter file: {formatterFile}");

    PayloadFormatter.IFormatterDownlink evaluator;
    try
    {
        evaluator = CSScript.Evaluator.LoadFile<PayloadFormatter.IFormatterDownlink>(formatterFile);
    }
    catch (CSScriptLib.CompilerException cex)
    {
        Console.Write($"Loading or compiling file:{formatterFile} failed Exception:{cex}");
        return;
    }

    string payloadFilename = Path.Combine(formatterFolder, options.PayloadFilename);
    Console.WriteLine($"Downlink- payloadFilename:{payloadFilename}");
    byte[] payloadBytes;

    try
    {
        payloadBytes = File.ReadAllBytes(payloadFilename);
    }
    catch (DirectoryNotFoundException dex)
    {
        Console.WriteLine($"Downlink payload filename directory {formatterFolder} not found:{dex}");
        return;
    }
    catch (FileNotFoundException fnfex)
    {
        Console.WriteLine($"Downlink payload filename {payloadFilename} not found:{fnfex}");
        return;
    }

    // See if payload can be converted to a string
    string payloadText = string.Empty;
    try
    {
        payloadText = Encoding.UTF8.GetString(payloadBytes);
    }
    catch (ArgumentException aex)
    {
        Console.WriteLine("Encoding.UTF8.GetString failed:{0}", aex.Message);
    }

    // See if payload can be converted to JSON
    JObject payloadJson;
    try
    {
        payloadJson = JObject.Parse(payloadText);
    }
    catch (JsonReaderException jrex)
    {
        Console.WriteLine("JObject.Parse failed Exception:{0}", jrex);

        payloadJson = new JObject();
    }

    Console.WriteLine("Properties");
    foreach (var property in properties)
    {
        Console.WriteLine($"{property.Key}:{property.Value}");
    }

    // Transform the byte and optional text and JSON payloads
    byte[] payload;
    try
    {
        payload = evaluator.Evaluate(properties, options.OrganizationId, options.DeviceId, options.DeviceType, options.UserApplicationId, payloadJson, payloadText, payloadBytes);
    }
    catch (Exception ex)
    {
        Console.WriteLine($"evaluator.Evaluate failed Exception:{ex}");
        return;
    }

    Console.WriteLine("Payload");
    Console.WriteLine(Convert.ToBase64String(payload));
}

The sample JSON payload is what would be sent by Azure IoT Central to a device to configure the fan speed.

Azure IoT Central M138 Breakout device template with the Fan Status command selected
{
  "FanStatus": 2
}

If the downlink payload formatter is compiled and executes successfully, the Base64 representation of the output is displayed.

using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

public class FormatterDownlink : PayloadFormatter.IFormatterDownlink
{
    public byte[] Evaluate(IDictionary<string, string> properties, uint organisationId, uint deviceId, byte deviceType, ushort userApplicationId, JObject payloadJson, string payloadText, byte[] payloadBytes)
    {
        byte? status = payloadJson.Value<byte?>("FanStatus");

        if ( status.HasValue ) 
        { 
            return new byte[] { status.Value };
        }

        return new byte[]{};
    }
}

If the downlink payload formatter syntax is incorrect, e.g. { status.Value ; };, an error message with the line and column is displayed.

using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

public class FormatterDownlink : PayloadFormatter.IFormatterDownlink
{
    public byte[] Evaluate(IDictionary<string, string> properties, uint organisationId, uint deviceId, byte deviceType, ushort userApplicationId, JObject payloadJson, string payloadText, byte[] payloadBytes)
    {
        byte? status = payloadJson.Value<byte?>("FanStatus");

        if ( status.HasValue ) 
        {
            return new byte[] { status.Value ; };
        }

        return new byte[]{};
    }
}

If the downlink payload formatter syntax is correct but execution fails (in the example code, division by zero), an error message is displayed.

using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

public class FormatterDownlink : PayloadFormatter.IFormatterDownlink
{
    public byte[] Evaluate(IDictionary<string, string> properties, uint organisationId, uint deviceId, byte deviceType, ushort userApplicationId, JObject payloadJson, string payloadText, byte[] payloadBytes)
    {
        byte? status = payloadJson.Value<byte?>("FanStatus");

        if ( status.HasValue ) 
        {
            int divideByZero = 10;

            divideByZero = divideByZero / 0;

            return new byte[] { status.Value };
        }

        return new byte[]{};
    }
}

The PayloadFormatterMaintenanceApplication makes it significantly easier to develop formatters. Currently the payload formatters have to be manually uploaded to the application’s Azure Blob Storage for final testing.