Myriota device Uplink Serialisation

The Myriota Developer documentation has some sample webhook data payloads, so I used JSON2csharp to generate a Data Transfer Object(DTO) to deserialise the payload. The format of the message is a bit “odd”: the “Data” value contains an “escaped” JSON object.

{
  "EndpointRef": "ksnb8GB_TuGj:__jLfs2BQJ2d",
  "Timestamp": 1692928585,
  "Data": "{\"Packets\": [{\"Timestamp\": 1692927646796, \"TerminalId\": \"0001020304\", \"Value\": \"00008c9512e624cce066adbae764cccccccccccc\"}]}",
  "Id": "a5c1bffe-4b62-4233-bbe9-d4ecc4f8b6cb",
  "CertificateUrl": "https://security.myriota.com/data-13f7751f3c5df569a6c9c42a9ce73a8a.crt",
  "Signature": "FDJpQdWHwCY+tzCN/WvQdnbyjgu4BmP/t3cJIOEF11sREGtt7AH2L9vMUDji6X/lxWBYa4K8tmI0T914iPyFV36i+GtjCO4UHUGuFPJObCtiugVV8934EBM+824xgaeW8Hvsqj9eDeyJoXH2S6C1alcAkkZCVt0pUhRZSZZ4jBJGGEEQ1Gm+SOlYjC2exUOf0mCrI5Pct+qyaDHbtiHRd/qNGW0LOMXrB/9difT+/2ZKE1xvDv9VdxylXi7W0/mARCfNa0J6aWtQrpvEXJ5w22VQqKBYuj3nlGtL1oOuXCZnbFYFf4qkysPaXON31EmUBeB4WbZMyPaoyFK0wG3rwA=="
}
namespace devMobile.IoT.myriotaAzureIoTConnector.myriota.UplinkWebhook.Models
{
    public class UplinkPayloadWebDto
    {
        public string EndpointRef { get; set; }
        public long Timestamp { get; set; } 
        public string Data { get; set; } // Embedded JSON ?
        public string Id { get; set; }
        public string CertificateUrl { get; set; }
        public string Signature { get; set; }
    }
}
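
The embedded JSON in the “Data” property then needs its own DTO; a minimal sketch (class and property names are my assumptions based on the sample payload) is below.

using System.Collections.Generic;

namespace devMobile.IoT.myriotaAzureIoTConnector.myriota.UplinkWebhook.Models
{
    // Sketch only: class and property names are assumptions based on the sample payload
    public class UplinkPayloadDataDto
    {
        public List<PacketDto> Packets { get; set; }
    }

    public class PacketDto
    {
        public long Timestamp { get; set; }   // milliseconds since Unix epoch
        public string TerminalId { get; set; }
        public string Value { get; set; }     // hexadecimal encoded payload bytes
    }
}

// e.g. UplinkPayloadDataDto data = System.Text.Json.JsonSerializer.Deserialize<UplinkPayloadDataDto>(payloadWeb.Data);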

The UplinkWebhook controller “automagically” deserialises the message, then in code the embedded JSON is deserialised and “unpacked”; finally the processed message is inserted into an Azure Storage queue.

namespace devMobile.IoT.myriotaAzureIoTConnector.myriota.UplinkWebhook.Controllers
{
    [Route("[controller]")]
    [ApiController]
    public class UplinkController : ControllerBase
    {
        private readonly Models.ApplicationSettings _applicationSettings;
        private readonly ILogger<UplinkController> _logger;
        private readonly QueueServiceClient _queueServiceClient;

        public UplinkController(IOptions<Models.ApplicationSettings> applicationSettings, QueueServiceClient queueServiceClient, ILogger<UplinkController> logger)
        {
            _applicationSettings = applicationSettings.Value;
            _queueServiceClient = queueServiceClient;
            _logger = logger;
        }

        [HttpPost]
        public async Task<IActionResult> Post([FromBody] Models.UplinkPayloadWebDto payloadWeb)
        {
            _logger.LogInformation("SendAsync queue name:{QueueName}", _applicationSettings.QueueName);

            QueueClient queueClient = _queueServiceClient.GetQueueClient(_applicationSettings.QueueName);

            var serializeOptions = new JsonSerializerOptions
            {
                WriteIndented = true,
                Encoder = System.Text.Encodings.Web.JavaScriptEncoder.UnsafeRelaxedJsonEscaping
            };

            await queueClient.SendMessageAsync(Convert.ToBase64String(JsonSerializer.SerializeToUtf8Bytes(payloadWeb, serializeOptions)));

            return this.Ok();
        }
    }
}

The webhook application uses the QueueClientBuilderExtensions AddQueueServiceClient method so a QueueServiceClient can be injected into the webhook controller.

namespace devMobile.IoT.myriotaAzureIoTConnector.myriota.UplinkWebhook
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var builder = WebApplication.CreateBuilder(args);

            // Add services to the container.
            builder.Services.AddControllers();

            builder.Services.AddApplicationInsightsTelemetry(i => i.ConnectionString = builder.Configuration.GetConnectionString("ApplicationInsights"));

            builder.Services.Configure<Models.ApplicationSettings>(builder.Configuration.GetSection("Application"));

            builder.Services.AddAzureClients(azureClient =>
            {
                azureClient.AddQueueServiceClient(builder.Configuration.GetConnectionString("AzureWebApi"));
            });

            var app = builder.Build();

            // Configure the HTTP request pipeline.

            app.UseHttpsRedirection();

            app.MapControllers();

            app.Run();
        }
    }
}
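
A minimal sketch of the ApplicationSettings class the “Application” configuration section binds to (QueueName is the only property used by the controller; the rest of the class layout is an assumption) is below.

namespace devMobile.IoT.myriotaAzureIoTConnector.myriota.UplinkWebhook.Models
{
    // Sketch only: QueueName is the property the UplinkController uses
    public class ApplicationSettings
    {
        public string QueueName { get; set; }
    }
}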

After debugging the application on my desktop with Telerik Fiddler, I deployed the application to one of my Azure subscriptions.

Azure Resource Group for the myriota Azure IoT Connector
Adding a new Destination in the myriota device manager

As part of configuring a new device, test messages can be sent to the configured destinations.

Testing a new Destination in the myriota device manager
{
  "EndpointRef": "N_HlfTNgRsqe:uyXKvYTmTAO5",
  "Timestamp": 1563521870,
  "Data": "{\"Packets\": [{\"Timestamp\": 1563521870359, \"TerminalId\": \"f74636ec549f9bde50cf765d2bcacbf9\", \"Value\": \"0101010101010101010101010101010101010101\"}]}",
  "Id": "fe77e2c7-8e9c-40d0-8980-43720b9dab75",
  "CertificateUrl": "https://security.myriota.com/data-13f7751f3c5df569a6c9c42a9ce73a8a.crt",
  "Signature": "k2OIBppMRmBT520rUlIvMxNg+h9soJYBhQhOGSIWGdzkppdT1Po2GbFr7jbg..."
}

The DTO generated with JSON2csharp needed some manual “tweaking” after examining how a couple of the sample messages were deserialised.

Azure Storage Explorer messages

I left the Myriota Developer Toolkit device (running the tracker sample) outside overnight, and the following day I could see a couple of messages in the Azure Storage Queue with Azure Storage Explorer.

Myriota device configuration

For a couple of weeks the Myriota Developer Toolkit has been sitting under my desk, and today I got some time to set up a device, register it, then upload some data.

Myriota Developer Toolkit

The first step was to download and install the Myriota Configurator so I could get the device registration information and install the tracker example application.

Using Windows File Explorer to “unblock” the downloaded file

After “unblocking” the zip file and upgrading my pip install, the install script worked.

Myriota Configurator installation script

The application had to be run from the command line with “python MyriotaConfigurator.py”

Myriota Configurator main menu
Myriota Configurator retrieving device registration code

On the device I’m using the Tracker sample application to generate some sample payloads.

Myriota Configurator downloading tracker sample to device

The next step was to “register” my device and configure the destination(s) for its messages.

Myriota Device Manager Device configuration

Once the device and device manager configuration were sorted, I put the Tracker out on the back lawn on top of a large flowerpot.

Device Manager Access Times

On the “Access Times” page I could see that there were several periods when a satellite was overhead and overnight a couple of messages were uploaded.

ASP.NET Core authentication – In the beginning

While building my ASP.NET Core identity Dapper custom storage provider I found there wasn’t a lot of discussion of the AspNetUserClaims functionality for “fine-grained” permissions.

ASP.NET Core identity initial data model

ASP.NET Core identity Roles can also have individual claims, but with the authorisation model of the legacy application I work on this functionality hasn’t been useful. We use role-based authentication with a few user claims to minimise the size of our JSON Web Tokens(JWT).
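
As an illustration, an individual user claim can be added with the standard UserManager API; this sketch (the claim type and value are illustrative, not from the application) shows the shape of the call.

using System.Security.Claims;
using System.Threading.Tasks;

using Microsoft.AspNetCore.Identity;

// Sketch only: adding an individual claim to a user, the claim type and value are illustrative
public class UserClaimExample
{
    private readonly UserManager<IdentityUser> _userManager;

    public UserClaimExample(UserManager<IdentityUser> userManager)
    {
        _userManager = userManager;
    }

    public async Task AddRegionClaimAsync(IdentityUser user)
    {
        // The claim ends up in the AspNetUserClaims table
        IdentityResult result = await _userManager.AddClaimAsync(user, new Claim("Region", "NZ"));

        if (!result.Succeeded)
        {
            // Handle/log the failure as appropriate for the application
        }
    }
}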

Visual Studio 2022 ASP.NET Core Web Application template options

The first step was to create a “bare-bones” ASP.NET Core Razor Pages Web Application project with Individual Accounts authentication.

Default ASP.NET Core identity Web application Homepage

I tried to minimise the modifications to the application. I added EnableRetryOnFailure, some changes to namespaces etc. I also added support for email address confirmation with SendGrid and an “Authentication” link to the navbar in _Layout.cshtml.

@page
@model RolesModel
@{
    <table class="table">
        <thead>
            <tr>
                <th>Role</th>
            </tr>
        </thead>
        <tbody>
            @foreach (var role in Model.Roles)
            {
                <tr>
                    <td>
                        @Html.DisplayFor(modelItem => role.Value)
                    </td>
                </tr>
            }
        </tbody>
    </table>
    <br/>
        <table class="table">
        <thead>
            <tr>
                <th>Claim Subject</th>
                <th>Value</th>
            </tr>
        </thead>

        <tbody>
            @foreach (var claim in Model.Claims)
            {
                <tr>
                    <td>
                        @Html.DisplayFor(modelItem => claim.Type)
                    </td>
                    <td>
                        @Html.DisplayFor(modelItem => claim.Value)
                    </td>
                </tr>
            }
        </tbody>
    </table>
}

The “Authentication” page displays the logged-in User’s Roles and Claims.

namespace devMobile.AspNetCore.Identity.WebApp.EFCore.Pages
{
    [Authorize()]
    public class RolesModel : PageModel
    {
        private readonly ILogger<RolesModel> _logger;

        public List<Claim> Roles { get; set; }
        public List<Claim> Claims { get; set; }

        public RolesModel(ILogger<RolesModel> logger)
        {
            _logger = logger;
        }

        public void OnGet()
        {
            Roles = User.Claims.Where(c => c.Type == ClaimTypes.Role).ToList();

            Claims = User.Claims.Where(c => c.Type != ClaimTypes.Role).ToList();
        }
    }
}

Each user can have role(s), with optional claims, and some optional individual claims.
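
A rough sketch of creating a role, attaching a claim to it, and assigning a user to the role with the standard RoleManager/UserManager APIs (the role and claim names are illustrative only) is below.

using System.Security.Claims;
using System.Threading.Tasks;

using Microsoft.AspNetCore.Identity;

// Sketch only: role and claim names are illustrative
public class RoleSetupExample
{
    private readonly UserManager<IdentityUser> _userManager;
    private readonly RoleManager<IdentityRole> _roleManager;

    public RoleSetupExample(UserManager<IdentityUser> userManager, RoleManager<IdentityRole> roleManager)
    {
        _userManager = userManager;
        _roleManager = roleManager;
    }

    public async Task SetupAsync(IdentityUser user)
    {
        if (!await _roleManager.RoleExistsAsync("Administrator"))
        {
            await _roleManager.CreateAsync(new IdentityRole("Administrator"));
        }

        // Role level claim (stored in AspNetRoleClaims)
        IdentityRole role = await _roleManager.FindByNameAsync("Administrator");
        await _roleManager.AddClaimAsync(role, new Claim("CanDeleteStock", "true"));

        // User to role assignment (stored in AspNetUserRoles)
        await _userManager.AddToRoleAsync(user, "Administrator");
    }
}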

ASP.NET Core identity application Authentication information page

The WebApp.EFCore project is intended to be the starting point for a series of posts about ASP.NET Core identity so I have not included Cross-Origin Resource Sharing (CORS), Cross Site Request Forgery (CSRF) etc. functionality.

Swarm Space – Underlying Architecture sorted

After figuring out that calling an Azure HTTP Trigger function to load the cache wasn’t going to work reliably, I have revisited the architecture one last time and significantly refactored the SwarmSpaceAzureIoTConnector project.

Visual Studio 2022 solution

The application now has a StartUpService which loads the Azure DeviceClient cache (Lazy Cache) in the background as the application starts up. If an uplink message is received from a Swarm device before it has been loaded by the StartUpService, the DeviceClient information is cached at that point so another connection to the Azure IoT Hub is not established.

...
using Microsoft.Azure.Functions.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(devMobile.IoT.SwarmSpaceAzureIoTConnector.Connector.StartUpService))]
namespace devMobile.IoT.SwarmSpaceAzureIoTConnector.Connector
{
...
    public class StartUpService : BackgroundService
    {
        private readonly ILogger<StartUpService> _logger;
        private readonly ISwarmSpaceBumblebeeHive _swarmSpaceBumblebeeHive;
        private readonly Models.ApplicationSettings _applicationSettings;
        private readonly IAzureDeviceClientCache _azureDeviceClientCache;

        public StartUpService(ILogger<StartUpService> logger, IAzureDeviceClientCache azureDeviceClientCache, ISwarmSpaceBumblebeeHive swarmSpaceBumblebeeHive, IOptions<Models.ApplicationSettings> applicationSettings)//, IOptions<Models.AzureIoTSettings> azureIoTSettings)
        {
            _logger = logger;
            _azureDeviceClientCache = azureDeviceClientCache;
            _swarmSpaceBumblebeeHive = swarmSpaceBumblebeeHive;
            _applicationSettings = applicationSettings.Value;
        }

        protected override async Task ExecuteAsync(CancellationToken cancellationToken)
        {
            await Task.Yield();

            _logger.LogInformation("StartUpService.ExecuteAsync start");

            try
            {
                _logger.LogInformation("BumblebeeHiveCacheRefresh start");

                foreach (SwarmSpace.BumblebeeHiveClient.Device device in await _swarmSpaceBumblebeeHive.DeviceListAsync(cancellationToken))
                {
                    _logger.LogInformation("BumblebeeHiveCacheRefresh DeviceId:{DeviceId} DeviceName:{DeviceName}", device.DeviceId, device.DeviceName);

                    Models.AzureIoTDeviceClientContext context = new Models.AzureIoTDeviceClientContext()
                    {
                        OrganisationId = _applicationSettings.OrganisationId,
                        DeviceType = (byte)device.DeviceType,
                        DeviceId = (uint)device.DeviceId,
                    };

                    await _azureDeviceClientCache.GetOrAddAsync(context.DeviceId, context);
                }

                _logger.LogInformation("BumblebeeHiveCacheRefresh finish");
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "StartUpService.ExecuteAsync error");

                throw;
            }

            _logger.LogInformation("StartUpService.ExecuteAsync finish");
        }
    }
}

The uplink and downlink payload formatters are stored in Azure Blob Storage, compiled (CS-Script) as they are loaded, then cached (Lazy Cache), as sketched after the loader code below.

Azure Storage explorer displaying list of uplink payload formatter blobs.
Azure Storage explorer displaying list of downlink payload formatter blobs.
private async Task<IFormatterDownlink> DownlinkLoadAsync(int userApplicationId)
{
    BlobClient blobClient = new BlobClient(_payloadFormatterConnectionString, _applicationSettings.PayloadFormattersDownlinkContainer, $"{userApplicationId}.cs");

    if (!await blobClient.ExistsAsync())
    {
        _logger.LogInformation("PayloadFormatterDownlink- UserApplicationId:{0} Container:{1} not found using default:{2}", userApplicationId, _applicationSettings.PayloadFormattersDownlinkContainer, _applicationSettings.PayloadFormatterDownlinkBlobDefault);

        blobClient = new BlobClient(_payloadFormatterConnectionString, _applicationSettings.PayloadFormattersDownlinkContainer, _applicationSettings.PayloadFormatterDownlinkBlobDefault);
    }

    BlobDownloadResult downloadResult = await blobClient.DownloadContentAsync();

    return CSScript.Evaluator.LoadCode<PayloadFormatter.IFormatterDownlink>(downloadResult.Content.ToString());
}
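
The compiled formatters are cached with Lazy Cache so a formatter is only downloaded and compiled once; a minimal sketch of that wrapper (the IAppCache field name, method name, and cache key format are my assumptions) is below.

// Sketch only: caching the compiled downlink formatter with Lazy Cache
// The _payloadFormatterCache field name and the cache key format are assumptions
private readonly LazyCache.IAppCache _payloadFormatterCache = new LazyCache.CachingService();

public async Task<IFormatterDownlink> DownlinkGetAsync(int userApplicationId)
{
    // The formatter blob is only downloaded and compiled the first time it is requested
    return await _payloadFormatterCache.GetOrAddAsync($"D{userApplicationId}", () => DownlinkLoadAsync(userApplicationId));
}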

The uplink and downlink formatters can be edited in Visual Studio 2022 with syntax highlighting (currently they have to be manually uploaded).

The SwarmSpaceBumblebeeHive module no longer has public login or logout methods.

    public interface ISwarmSpaceBumblebeeHive
    {
        public Task<ICollection<Device>> DeviceListAsync(CancellationToken cancellationToken);

        public Task SendAsync(uint organisationId, uint deviceId, byte deviceType, ushort userApplicationId, byte[] payload);
    }

The DeviceListAsync and SendAsync methods now call the BumblebeeHive login method after a configurable period of inactivity.

public async Task<ICollection<Device>> DeviceListAsync(CancellationToken cancellationToken)
{
    if ((_TokenActivityAtUtC + _bumblebeeHiveSettings.TokenValidFor) < DateTime.UtcNow)
    {
        await Login();
    }

    using (HttpClient httpClient = _httpClientFactory.CreateClient())
    {
        Client client = new Client(httpClient);

        client.BaseUrl = _bumblebeeHiveSettings.BaseUrl;

        httpClient.DefaultRequestHeaders.Add("Authorization", $"bearer {_token}");

        return await client.GetDevicesAsync(null, null, null, null, null, null, null, null, null, cancellationToken);
    }
}

I’m looking at building a “webby” user interface where users can interactively list, create, edit, and delete formatters with syntax highlighter support, and execute a formatter with sample payloads.

Swarm Space Azure IoT Connector Identity Translation Gateway Architecture

This approach uses most of the existing building blocks, and that’s it, no more changes.

.NET Core web API + Dapper – Readonly query workloads with Data Sync

Azure SQL Data Synchronisation Process

Read-only replicas of an Azure SQL Database with Active geo-replication are easy to set up, but there are some disadvantages: bi-directional synchronisation is not supported, not all tables (or all the columns of some tables) are needed or should be accessible for reporting, the overhead of replicating tables used for transaction processing might impact the performance of the solution, etc. Azure SQL Data Sync is a service built on Azure SQL Database that can synchronise selected data bi-directionally across multiple databases, both on-premises and in the cloud.

The first step was to remove all the Microsoft SQL Server features used in the World Wide Importers database (e.g. Sequence Numbers, Column Store indexes etc.) which are not supported (see general limitations) by Azure SQL Data Sync. I then used the “Deploy Database Wizard” to copy my modified World Wide Importers database to an Azure SQL Database.

Deploy Database to Microsoft Azure SQL Database running
Microsoft Azure SQL Database Servers with WorldWideImporters and ReadonlyReplicaHub database
Data Sync Group creation on ReadOnlyReplicaHub database

For my “read-only replicas” scenario, if there are any update conflicts the source database “wins”.

Data Sync Group created
Data Sync Group configuration database management
Data Sync Group configuration adding “source” database synchronisation to Hub
Data Sync Group configuration adding “destination” database(s) synchronisation from Hub
Data Sync Group selecting source database StockItems table and columns to synchronise
Data Sync Group “destination” databases configured
Initial synchronisation of only StockItems table and seed data
Regular synchronisation of a subset of StockItems columns to destination databases
Source Azure SQL Database regular synchronisation compute utilisation
StockItemsReadOnlyReplicas Controller JSON after first replication completed
Azure application Insights Dependencies showing usage of different synchronised databases
StockItems table in source database with updated RRP
StockItems table in destination database with updated RRP after next scheduled synchronisation
StockItems table in destination database after next scheduled synchronisation

The Azure SQL Database Data Sync was pretty easy to set up (configuration in the hub database tripped me up initially). For a production scenario where only a portion of the database is required (e.g. shaped by Customer, Geography, security considerations, or a bi-directional requirement) it would be an effective solution, though for some applications the delay between synchronisations might be an issue.

.NET Core web API + Dapper – Readonly query workloads with Geo Replication

Geo Replication process

One of the easiest ways to create read-only replicas of an Azure SQL Database is with Active geo-replication (it’s also useful for disaster recovery with geo-failover to a geo-secondary in a different Azure Region).

The first step was to remove all the Microsoft SQL Server features used in the World Wide Importers database (e.g. Memory Optimized tables) which were not supported by the Azure SQL Database tier I was using.

SQL Server Management Studio(SSMS) Deploy to Azure wizard

I then used the “Deploy Database Wizard” to copy my modified World Wide Importers database to an Azure SQL Database.

Azure Portal Primary Database in SQL Server Management Studio(SSMS)

I then created replicas in the same region for the read-only queries (if the application had a global customer base, creating read-only geo-replicas in regions close to users might be worth considering).

Azure SQL Database no replicas configured
Azure Portal Create Geo Replica

I created four replicas, which is the maximum number supported. If more replicas were required, a secondary of a secondary (a process known as chaining) could be used to create additional geo-replicas.

Azure Portal Primary Database and four Geo-replicas
Azure Portal Primary Database and Geo-replicas in SQL Server Management Studio(SSMS)
Azure Application Insights showing multiple Geo-Replicas being used.

The Azure SQL Database geo-replication was pretty easy to set up. For a production scenario where only a portion of the database (e.g. shaped by Customer or Geography) is required it might not be the “right hammer”.

WebAPI Dapper Azure Resource Group

The other limitation I encountered was the resources used by the replication of “transaction processing” tables (in the World Wide Importers database, tables like Sales.OrderLines, Sales.CustomerTransactions etc.) which often wouldn’t be required for read-only applications.

.NET Core web API + Dapper – Offloading readonly query workloads.

A Scaling Out Scenario

Initially this was going to be a post about Sharding, but after some discussion with my manager at one of the companies I work for, it evolved into a post about using the Dapper Object Relational Mapper(ORM) with Azure SQL Database and Active geo-replication or SQL Data Sync for Azure to offload read-only query workloads and improve scalability.

The company builds a Software as a Service(SaaS) product for managing portfolios of foreign currency forwards, options, swaps etc. Part of the solution has an application which customers use to get an “aggregated” view of their purchases.

The database queries that look up reference data (forward curves etc.) and return a shaped dataset for each supported instrument type, followed by “aggregating” the information with C# code, consume significant database and processing resources.

The first step was to remove all the Microsoft SQL Server features used in the World Wide Importers database (e.g. Memory Optimized tables) which were not supported by the Azure SQL Database vCore or DTU tier I was using.

I then uploaded my modified World Wide Importers database to an Azure SQL Database Server.

Azure SQL Database SQL Server Management Studio (SSMS) initial database

I then created read only replicas of the original database to use for scaling out in my demo application.

Azure SQL Database SQL Server Management Studio (SSMS) with replicas databases

The connection strings of the read-only replicas are loaded as the application starts.

// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
	services.AddControllers();

	var errorHandlerSettings = Configuration.GetSection(nameof(ErrorHandlerSettings));
	services.Configure<ErrorHandlerSettings>(errorHandlerSettings);

	var readonlyReplicaServersConnectionStringSettings = Configuration.GetSection("ReadonlyReplicaServersConnectionStringSettings");
	services.Configure<List<string>>(readonlyReplicaServersConnectionStringSettings);

	services.AddResponseCaching();

	services.AddDapperForMSSQL();
#if DAPPER_EXTENSIONS_CACHE_MEMORY
	services.AddDapperCachingInMemory(new MemoryConfiguration
	{
		AllMethodsEnableCache = false
 	});
#endif
#if DAPPER_EXTENSIONS_CACHE_REDIS
	services.AddDapperCachingInRedis(new RedisConfiguration
	{
		AllMethodsEnableCache = false,
		KeyPrefix = Configuration.GetValue<string>("RedisKeyPrefix"),
		ConnectionString = Configuration.GetConnectionString("RedisConnection")
	}); 
#endif
	services.AddApplicationInsightsTelemetry();
}

Then code was added to the controller to randomly select which read-only replica to use. More complex approaches were considered but not implemented for the initial version (one possible alternative is sketched after the controller code below).

[ApiController]
[Route("api/[controller]")]
public class StockItemsReadonlyReplicasController : ControllerBase
{
    private readonly ILogger<StockItemsReadonlyReplicasController> logger;
    private readonly List<string> readonlyReplicasConnectionStrings;

    public StockItemsReadonlyReplicasController(ILogger<StockItemsReadonlyReplicasController> logger, IOptions<List<string>> readonlyReplicasServerConnectionStrings)
    {
        this.logger = logger;

        this.readonlyReplicasConnectionStrings = readonlyReplicasServerConnectionStrings.Value;
    }

    [HttpGet]
    public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
    {
        IEnumerable<Model.StockItemListDtoV1> response = null;

        if (readonlyReplicasConnectionStrings.Count == 0)
        {
            logger.LogError("No readonly replica server Connection strings configured");

            return this.StatusCode(StatusCodes.Status500InternalServerError);
        }

        Random random = new Random(); // instantiated every call; a shared instance would be cheaper, but System.Random is not thread safe

        string connectionString = readonlyReplicasConnectionStrings[random.Next(0, readonlyReplicasConnectionStrings.Count)];

        logger.LogTrace("Connection string {connectionString}", connectionString);

        using (SqlConnection db = new SqlConnection(connectionString))
        {
            response = await db.QueryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]", commandType: CommandType.Text);
        }

        return this.Ok(response);
    }
}
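
As an illustration of a “more complex approach”, a thread-safe round-robin selector could replace the per-request Random; a rough sketch (not code from the sample project) is below.

using System.Collections.Generic;
using System.Linq;
using System.Threading;

// Sketch only: a thread-safe round-robin alternative to new-ing up Random on every request
public class RoundRobinConnectionStringSelector
{
    private readonly string[] connectionStrings;
    private int index = -1;

    public RoundRobinConnectionStringSelector(IEnumerable<string> connectionStrings)
    {
        this.connectionStrings = connectionStrings.ToArray();
    }

    public string Next()
    {
        // Interlocked.Increment keeps the counter consistent across concurrent requests
        uint next = unchecked((uint)Interlocked.Increment(ref index));

        return connectionStrings[next % (uint)connectionStrings.Length];
    }
}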

The Read-only replica server connection string setup template in appsettings.Development.json.

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  },
  "ConnectionStrings": {
    "WorldWideImportersDatabase": "ThisIsNotTheDatabaseConnectionStringYouAreLookingFor",
    "DefaultConnection": "ThisIsNotTheDefaultDatabaseConnectionStringYouAreLookingFor",
    "RedisConnection": "ThisIsNotTheRedisConnectionStringYouAreLookingFor"
  },
  "ReadonlyReplicaServersConnectionStringSettings": [
    "ThisIsNotTheReadonlyReplicaDatabaseConnectionStringYouAreLookingFor",
    "ThisIsNotTheReadonlyReplicaDatabaseConnectionStringYouAreLookingFor",
    "ThisIsNotTheReadonlyReplicaDatabaseConnectionStringYouAreLookingFor",
    "ThisIsNotTheReadonlyReplicaDatabaseConnectionStringYouAreLookingFor"
  ],
  "ApplicationInsights": {
    "ConnectionString": "ThisIsNotTheApplicationInsightsConnectionStringYouAreLookingFor"
  },
  "ErrorHandlerSettings": {
    "Detail": "Default detail",
    "Title": "Default title",
    "UrlSpecificSettings": {
      "localhost": {
        "Title": "Title for localhost",
        "Detail": "Detail for localhost"
      },
      "127.0.0.1": {
        "Title": "Title for 127.0.0.1",
        "Detail": "Detail for 127.0.0.1"
      }
    }
  }
}

The Manage User Secrets(secrets.json) functionality was used for testing on my development machine. In the production Azure App Service the array of connection strings was configured with the ReadonlyReplicaServersConnectionStringSettings:0, ReadonlyReplicaServersConnectionStringSettings:1 etc. syntax.

Sample application Azure App Service Configuration
Azure Application Insights with connections to different read-only replicas highlighted

I had incorrectly configured the firewall on one of the read-only replica database servers so roughly one in four connection attempts failed.

Azure Application Insights failed database connection displayed

The customer’s application was also fairly compute intensive, so we configured the Azure App Service to auto-scale based on CPU load.

Azure app service horizontal auto scale configuration

I’ll outline the configuration of Active geo-replication or SQL Data Sync for Azure in a couple of future posts.

The current solution works, but I need to tidy up a few issues like the StockItemsReadonlyReplicasController constructor getting a bit “chunky”.

.NET Core web API + Dapper – Error handling

Failure is an Option

For some historical reason I can’t remember, my controllers often had an outer try/catch and associated logging. I think it may have been to ensure no “sensitive” information was returned to the caller even if the application was incorrectly deployed. So I could revisit my approach, I added a controller with two methods: one which returns an HTTP 500 error and another which has an un-caught exception.

[Route("api/[controller]")]
[ApiController]
public class StockItemsNok500Controller : ControllerBase
{
	private readonly string connectionString;
	private readonly ILogger<StockItemsNok500Controller> logger;

	public StockItemsNok500Controller(IConfiguration configuration, ILogger<StockItemsNok500Controller> logger)
	{
		this.connectionString = configuration.GetConnectionString("WorldWideImportersDatabase");

		this.logger = logger;
	}

	public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get500()
	{
		IEnumerable<Model.StockItemListDtoV1> response = null;

		try
		{
			using (SqlConnection db = new SqlConnection(this.connectionString))
			{
				response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItem500]", commandType: CommandType.Text);
			}
		}
		catch (SqlException ex)
		{
			logger.LogError(ex, "Retrieving list of StockItems");

			return this.StatusCode(StatusCodes.Status500InternalServerError);
		}

		return this.Ok(response);
	}
}

The information returned to a caller was generic and the only useful information was the “traceId”.

StockItemsNok500Controller error page
[Route("api/[controller]")]
[ApiController]
public class StockItemsNokExceptionController : ControllerBase
{
	private readonly string connectionString;

	public StockItemsNokExceptionController(IConfiguration configuration)
	{
		this.connectionString = configuration.GetConnectionString("WorldWideImportersDatabase");
	}

	public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetException()
	{
		IEnumerable<Model.StockItemListDtoV1> response = null;

		using (SqlConnection db = new SqlConnection(this.connectionString))
		{
			response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItemsException]", commandType: CommandType.Text);
		}

		return this.Ok(response);
	}
}

In “Development” mode the information returned to the caller contains a detailed stack trace that reveals implementation details which are useful for debugging but would also be useful to an attacker.

Developer StockItemsNok Controller Exception page

When not in “Development” mode no additional information is returned (not even a TraceId).

Production StockItemsNok500Controller Exception

The diagnostic stack trace information logged by the two different controllers was essentially the same.

System.Data.SqlClient.SqlException:
   at System.Data.SqlClient.SqlCommand+<>c.<ExecuteDbDataReaderAsync>b__126_0 (System.Data.SqlClient, Version=4.6.1.3, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
   at System.Threading.Tasks.ContinuationResultTaskFromResultTask`2.InnerInvoke (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Threading.Tasks.Task+<>c.<.cctor>b__272_0 (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Threading.ExecutionContext.RunInternal (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Threading.ExecutionContext.RunInternal (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Dapper.SqlMapper+<QueryAsync>d__33`1.MoveNext (Dapper, Version=2.0.0.0, Culture=neutral, PublicKeyToken=null: /_/Dapper/SqlMapper.Async.cs:418)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Polly.Retry.AsyncRetryEngine+<ImplementationAsync>d__0`1.MoveNext (Polly, Version=7.0.0.0, Culture=neutral, PublicKeyToken=c8a3ffc3f8f825cc)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1+ConfiguredTaskAwaiter.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at Polly.AsyncPolicy+<ExecuteAsync>d__21`1.MoveNext (Polly, Version=7.0.0.0, Culture=neutral, PublicKeyToken=c8a3ffc3f8f825cc)
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
   at devMobile.WebAPIDapper.Lists.Controllers.StockItemsNokController+<Get500>d__4.MoveNext (ListsClassic, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null: C:\Users\BrynLewis\source\repos\WebAPIDapper\Lists\Controllers\14.StockItemsNokController.cs:70)

One customer wanted their client application to display a corporate help desk number for staff to call for support. This information was made configurable.

namespace devMobile.WebAPIDapper.Lists
{
	public class ErrorHandlerSettings
	{
		public string Detail { get; set; } = "devMobile Lists Classic API failure";

		public string Title { get; set; } = "System Error";
	}
}
public void ConfigureServices(IServiceCollection services)
{
	services.AddControllers();

	var errorHandlerSettings = Configuration.GetSection(nameof(ErrorHandlerSettings));
	services.Configure<ErrorHandlerSettings>(errorHandlerSettings);

	services.AddResponseCaching();

	services.AddDapperForMSSQL();
#if DAPPER_EXTENSIONS_CACHE_MEMORY
	services.AddDapperCachingInMemory(new MemoryConfiguration
	{
		AllMethodsEnableCache = false
	});
#endif
#if DAPPER_EXTENSIONS_CACHE_REDIS
	services.AddDapperCachingInRedis(new RedisConfiguration
	{
		AllMethodsEnableCache = false,
		KeyPrefix = Configuration.GetConnectionString("RedisKeyPrefix"),
		ConnectionString = Configuration.GetConnectionString("RedisConnection")
	}); 
#endif
	services.AddApplicationInsightsTelemetry();
}
{
 ...
  },
  "ErrorHandlerSettings": {
    "Title": "Webpage has died",
    "Detail": "Something has gone wrong call the help desk on 0800-RebootIt"
  },
...
}
namespace devMobile.WebAPIDapper.Lists.Controllers
{
	using Microsoft.AspNetCore.Mvc;
	using Microsoft.Extensions.Hosting;
	using Microsoft.Extensions.Options;


	[ApiController]
	public class ErrorController : Controller
	{
		private readonly ErrorHandlerSettings errorHandlerSettings;

		public ErrorController(IOptions<ErrorHandlerSettings> errorHandlerSettings)
		{
			this.errorHandlerSettings = errorHandlerSettings.Value;
		}

		[Route("/error")]
		public IActionResult HandleError([FromServices] IHostEnvironment hostEnvironment)
		{
			return Problem(detail: errorHandlerSettings.Detail, title: errorHandlerSettings.Title);
		}
	}
}
StockItemsNok Controller Error page with configurable title and details
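
For un-caught exceptions to be routed to the /error endpoint, the exception handler middleware needs to point at it; a minimal sketch of a Configure method (not the project’s actual pipeline; the middleware order shown is illustrative) is below.

// Sketch only: routing un-caught exceptions to the ErrorController, middleware order is illustrative
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
	if (env.IsDevelopment())
	{
		// Keeps the detailed stack trace page while debugging locally
		app.UseDeveloperExceptionPage();
	}
	else
	{
		// Re-executes failed requests against the /error route of the ErrorController
		app.UseExceptionHandler("/error");
	}

	app.UseHttpsRedirection();

	app.UseRouting();

	app.UseEndpoints(endpoints =>
	{
		endpoints.MapControllers();
	});
}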

Another customer wanted their client application to display a corporate help desk number based on the source hostname.

  • ClientA.SaasApplicationProvider.co.nz
  • ClientB.SaasApplicationProvider.co.nz
  • ClientC.SaasApplicationProvider.co.nz
  • SaasApplication.ClientD.co.nz

This information was also made configurable.

namespace devMobile.WebAPIDapper.Lists
{
	using System.Collections.Generic;

	public class UrlSpecificSetting
	{
		public string Title { get; set; } = "";
		
		public string Detail { get; set; } = "";

		public UrlSpecificSetting()
		{
		}

		public UrlSpecificSetting(string title, string detail)
		{
			this.Title = title;
			this.Detail = detail;
		}
	}

	public class ErrorHandlerSettings
	{
		public string Title { get; set; } = "System Error";

		public string Detail { get; set; } = "devMobile Lists Classic API failure";

		public Dictionary<string, UrlSpecificSetting> UrlSpecificSettings { get; set; }

		public ErrorHandlerSettings()
		{
		}

		public ErrorHandlerSettings(string title, string detail, Dictionary<string, UrlSpecificSetting> urlSpecificSettings )
		{
			Title = title;

			Detail = detail;

			UrlSpecificSettings = urlSpecificSettings;
		}
	}
}

We considered storing the title and details message in the database but that approach was discounted as we wanted to minimise dependencies.

{
 ...
  "ErrorHandlerSettings": {
    "Detail": "Default detail",
    "Title": "Default title",
    "UrlSpecificSettings": {
      "localhost": {
        "Title": "Title for localhost",
        "Detail": "Detail for localhost"
      },
      "127.0.0.1": {
        "Title": "Title for 127.0.0.1",
        "Detail": "Detail for 127.0.0.1"
      }
    }
  }
}
namespace devMobile.WebAPIDapper.Lists.Controllers
{
	using Microsoft.AspNetCore.Mvc;
	using Microsoft.Extensions.Hosting;
	using Microsoft.Extensions.Options;


	[ApiController]
	public class ErrorController : Controller
	{
		private readonly ErrorHandlerSettings errorHandlerSettings;

		public ErrorController(IOptions<ErrorHandlerSettings> errorHandlerSettings)
		{
			this.errorHandlerSettings = errorHandlerSettings.Value;
		}

		[Route("/error")]
		public IActionResult HandleError([FromServices] IHostEnvironment hostEnvironment)
		{
			if (!this.errorHandlerSettings.UrlSpecificSettings.ContainsKey(this.Request.Host.Host))
			{
				return Problem(detail: errorHandlerSettings.Detail, title: errorHandlerSettings.Title);
			}

			return Problem(detail: errorHandlerSettings.UrlSpecificSettings[this.Request.Host.Host].Detail, title: errorHandlerSettings.UrlSpecificSettings[this.Request.Host.Host].Title);
		}
	}
}

The sample configuration has custom title and details text for localhost and 127.0.0.1 with a default title and details text for all other hostnames.

StockItemsNok Controller Error page with 127.0.0.1 specific title and details
StockItemsNok Controller Error page with localhost specific title and details

One customer had a staff member who would take a photo of the client application error page with their mobile and email it to us, which made it really easy to track down issues. This was especially useful as they were in an awkward timezone.

Application Insights TraceId search
Application Insights TraceId search result with exception details

With a customisable error page my approach with the outer try/catch has limited benefit and just adds complexity.

TTI V3 Connector Device Provisioning Service(DPS) support

The previous versions of my The Things Industries(TTI) and The Things Network(TTN) connectors supported the Azure IoT Hub Device Provisioning Service(DPS) with Symmetric Key Attestation(SAS) to “automagically” set up the LoRaWAN devices in a TTI Application. (See my V2 Gateway DPS setup post for more detail.)

Azure Device Provisioning Service configuring Azure IoT Hubs

I used an “evenly weighted distribution” to spread the devices across five Azure IoT Hubs.

Azure IoT Hub no registered devices

In the Azure Portal I configured the DPS ID Scope (AzureSettings:DeviceProvisioningServiceSettings:IdScope) and the Group Enrollment Key (AzureSettings:DeviceProvisioningServiceSettings:GroupEnrollmentKey), then saved the configuration, which restarted the AppService.

Azure Portal AppService configuration
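
A rough sketch of the settings classes these configuration keys could bind to (the class layout is my assumption; the property names match those used in the uplink processing code) is below.

// Sketch only: settings classes matching the AzureSettings:DeviceProvisioningServiceSettings configuration keys
public class AzureSettings
{
	public string IoTHubConnectionString { get; set; }

	public DeviceProvisioningServiceSettings DeviceProvisioningServiceSettings { get; set; }
}

public class DeviceProvisioningServiceSettings
{
	public string IdScope { get; set; }

	public string GroupEnrollmentKey { get; set; }
}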

The first time a device sends an uplink message the cache query fails and the RegisterAsync method of the ProvisioningDeviceClient is called to provision the device and get a connection to the assigned Azure IoT Hub.

	logger.LogInformation("Uplink-ApplicationID:{0} DeviceID:{1} Port:{2} Payload Raw:{3}", applicationId, deviceId, port, payload.UplinkMessage.PayloadRaw);

	if (!_DeviceClients.TryGetValue(deviceId, out DeviceClient deviceClient))
	{
		logger.LogInformation("Uplink-Unknown device for ApplicationID:{0} DeviceID:{1}", applicationId, deviceId);

		// Check that at least one of Azure IoT Hub connection string or DPS is configured
		if (string.IsNullOrEmpty(_azureSettings.IoTHubConnectionString) && (_azureSettings.DeviceProvisioningServiceSettings == null))
		{
			logger.LogError("Uplink-Neither Azure IoT Hub connection string or Device Provisioning Service configured");

			return req.CreateResponse(HttpStatusCode.UnprocessableEntity);
		}

		// Check that Azure IoT Hub connection string and DPS are not both configured
		if (!string.IsNullOrEmpty(_azureSettings.IoTHubConnectionString) && (_azureSettings.DeviceProvisioningServiceSettings != null))
		{
			logger.LogError("Uplink-Both Azure IoT Hub connection string and Device Provisioning Service configured");

			return req.CreateResponse(HttpStatusCode.UnprocessableEntity);
		}

		// Use the Azure IoT Hub connection string if configured and the Device Provisioning Service isn't
		if (!string.IsNullOrEmpty(_azureSettings.IoTHubConnectionString))
		{
			deviceClient = DeviceClient.CreateFromConnectionString(_azureSettings.IoTHubConnectionString, deviceId, transportSettings);

			try
			{
				await deviceClient.OpenAsync();
			}
			catch (DeviceNotFoundException)
			{
				logger.LogWarning("Uplink-Unknown DeviceID:{0}", deviceId);

				return req.CreateResponse(HttpStatusCode.NotFound);
			}
		}

		// Azure IoT Hub Device provisioning service if configured
		if (_azureSettings.DeviceProvisioningServiceSettings != null) 
		{
			string deviceKey;

			if ( string.IsNullOrEmpty(_azureSettings.DeviceProvisioningServiceSettings.IdScope) || string.IsNullOrEmpty(_azureSettings.DeviceProvisioningServiceSettings.GroupEnrollmentKey))
			{
				logger.LogError("Uplink-Device Provisioning Service requires ID Scope and Group Enrollment Key configured");

				return req.CreateResponse(HttpStatusCode.UnprocessableEntity);
			}

			using (var hmac = new HMACSHA256(Convert.FromBase64String(_azureSettings.DeviceProvisioningServiceSettings.GroupEnrollmentKey)))
			{
				deviceKey = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(deviceId)));
			}

			using (var securityProvider = new SecurityProviderSymmetricKey(deviceId, deviceKey, null))
			{
				using (var transport = new ProvisioningTransportHandlerAmqp(TransportFallbackType.TcpOnly))
				{
					ProvisioningDeviceClient provClient = ProvisioningDeviceClient.Create(
						Constants.AzureDpsGlobalDeviceEndpoint,
						_azureSettings.DeviceProvisioningServiceSettings.IdScope,
						securityProvider,
						transport);

					DeviceRegistrationResult result = await provClient.RegisterAsync();

					if (result.Status != ProvisioningRegistrationStatusType.Assigned)
					{
						_logger.LogError("Config-DeviceID:{0} Status:{1} RegisterAsync failed ", deviceId, result.Status);

						return req.CreateResponse(HttpStatusCode.FailedDependency);
					}

					IAuthenticationMethod authentication = new DeviceAuthenticationWithRegistrySymmetricKey(result.DeviceId, (securityProvider as SecurityProviderSymmetricKey).GetPrimaryKey());

					deviceClient = DeviceClient.Create(result.AssignedHub, authentication, transportSettings);

					await deviceClient.OpenAsync();
				}
			}
		}

		if (!_DeviceClients.TryAdd(deviceId, deviceClient))
		{
			logger.LogWarning("Uplink-TryAdd failed for ApplicationID:{0} DeviceID:{1}", applicationId, deviceId);

			return req.CreateResponse(HttpStatusCode.Conflict);
		}

		Models.AzureIoTHubReceiveMessageHandlerContext context = new Models.AzureIoTHubReceiveMessageHandlerContext()
		{
			DeviceId = deviceId,
			ApplicationId = applicationId,
			WebhookId = _theThingsIndustriesSettings.WebhookId,
			WebhookBaseURL = _theThingsIndustriesSettings.WebhookBaseURL,
			ApiKey = _theThingsIndustriesSettings.ApiKey
		};

		await deviceClient.SetReceiveMessageHandlerAsync(AzureIoTHubClientReceiveMessageHandler, context);

		await deviceClient.SetMethodDefaultHandlerAsync(AzureIoTHubClientDefaultMethodHandler, context);
	}

	JObject telemetryEvent = new JObject
	{
		{ "ApplicationID", applicationId },
		{ "DeviceID", deviceId },
		{ "Port", port },
		{ "Simulated", payload.Simulated },
		{ "ReceivedAtUtc", payload.UplinkMessage.ReceivedAtUtc.ToString("s", CultureInfo.InvariantCulture) },
		{ "PayloadRaw", payload.UplinkMessage.PayloadRaw }
	};

	// If the payload has been decoded by payload formatter, put it in the message body.
	if (payload.UplinkMessage.PayloadDecoded != null)
	{
		telemetryEvent.Add("PayloadDecoded", payload.UplinkMessage.PayloadDecoded);
	}

	// Send the message to Azure IoT Hub
	using (Message ioTHubmessage = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(telemetryEvent))))
	{
		// Ensure the displayed time is the acquired time rather than the uploaded time. 
		ioTHubmessage.Properties.Add("iothub-creation-time-utc", payload.UplinkMessage.ReceivedAtUtc.ToString("s", CultureInfo.InvariantCulture));
		ioTHubmessage.Properties.Add("ApplicationId", applicationId);
		ioTHubmessage.Properties.Add("DeviceEUI", payload.EndDeviceIds.DeviceEui);
		ioTHubmessage.Properties.Add("DeviceId", deviceId);
		ioTHubmessage.Properties.Add("port", port.ToString());
		ioTHubmessage.Properties.Add("Simulated", payload.Simulated.ToString());

		await deviceClient.SendEventAsync(ioTHubmessage);

		logger.LogInformation("Uplink-DeviceID:{0} SendEventAsync success", payload.EndDeviceIds.DeviceId);
	}
}
catch (Exception ex)
{
	logger.LogError(ex, "Uplink-Message processing failed");

	return req.CreateResponse(HttpStatusCode.InternalServerError);
}

I used Telerik Fiddler and some sample payloads copied from my Azure Storage Queue sample to simulate many devices and the registrations were spread across my five Azure IoT Hubs.

DPS Device Registrations tab showing distribution of LoRaWAN Devices

I need to review the HTTP Error codes returned for different errors and ensure failures are handled robustly.

TTI V3 Connector Minimalist Cloud to Device(C2D)

In a previous version of my The Things Industries(TTI)/The Things Network(TTN) connector I queried The Things Stack(TTS) Application Programming Interface(API) to get a list of Applications and their Devices. For a large number of Applications and/or Devices this process could take many tens of seconds. Application and Device creation and deletion then had to be tracked to keep the AzureDeviceClient connection list current, which added significant complexity.

In this version a downlink message can be sent to a device only after an uplink message. I’m looking at adding an Azure Function which initiates a connection to the configured Azure IoT Hub for the specified device to mitigate this issue.

To send a TTN downlink message to a device the minimum required information is the LoRaWAN port number (specified in a custom property on the Azure IoT Hub cloud to device message), the device Id (from the uplink message payload, which has been validated by a successful Azure IoT Hub connection), the webhook id, the webhook base URL, and an API Key (the webhook parameters are stored in the Connector configuration).

Azure IoT Explorer Cloud to Device message with LoRaWAN Port number custom parameter

When a LoRaWAN device sends an uplink message a session is established using the Advanced Message Queuing Protocol(AMQP) so connections can be multiplexed.

I used Azure IoT Explorer to send Cloud to Device messages to the Azure IoT Hub (to initiate the sending of a downlink message to the Device by the Connector) after simulating a TTN uplink message with Telerik Fiddler and a modified TTN sample payload.

BEWARE – TTN URLs and Azure IoT Hub device identifiers are case sensitive

...
if (!_DeviceClients.TryGetValue(deviceId, out DeviceClient deviceClient))
{
   logger.LogInformation("Uplink-Unknown device for ApplicationID:{0} DeviceID:{1}", applicationId, deviceId);

   deviceClient = DeviceClient.CreateFromConnectionString(_configuration.GetConnectionString("AzureIoTHub"), deviceId, 
                    new ITransportSettings[]
                    {
                        new AmqpTransportSettings(TransportType.Amqp_Tcp_Only)
                        {
                            AmqpConnectionPoolSettings = new AmqpConnectionPoolSettings()
                            {
                                Pooling = true,
                            }
                        }
                    });

   try
   {
      await deviceClient.OpenAsync();
   }
   catch (DeviceNotFoundException)
   {
      logger.LogWarning("Uplink-Unknown DeviceID:{0}", deviceId);

      return req.CreateResponse(HttpStatusCode.NotFound);
   }

   if (!_DeviceClients.TryAdd(deviceId, deviceClient))
   {
      logger.LogWarning("Uplink-TryAdd failed for ApplicationID:{0} DeviceID:{1}", applicationId, deviceId);

      return req.CreateResponse(HttpStatusCode.Conflict);
   }

   Models.AzureIoTHubReceiveMessageHandlerContext context = new Models.AzureIoTHubReceiveMessageHandlerContext()
   { 
      DeviceId = deviceId,
      ApplicationId = applicationId,
      WebhookId = _configuration.GetSection("TheThingsIndustries").GetSection("WebhookId").Value,
      WebhookBaseURL = _configuration.GetSection("TheThingsIndustries").GetSection("WebhookBaseURL").Value,
      ApiKey = _configuration.GetSection("TheThingsIndustries").GetSection("APiKey").Value,
   };      

   await deviceClient.SetReceiveMessageHandlerAsync(AzureIoTHubClientReceiveMessageHandler, context);

   await deviceClient.SetMethodDefaultHandlerAsync(AzureIoTHubClientDefaultMethodHandler, context);
 }

...

An Azure IoT Hub can invoke methods(synchronous) or send messages(asynchronous) to a device for processing. The Azure IoT Hub DeviceClient has two methods SetMethodDefaultHandlerAsync and SetReceiveMessageHandlerAsync which enable the processing of direct methods and messages.

private async Task<MethodResponse> AzureIoTHubClientDefaultMethodHandler(MethodRequest methodRequest, object userContext)
{
	if (methodRequest.DataAsJson != null)
	{
		_logger.LogWarning("AzureIoTHubClientDefaultMethodHandler name:{0} payload:{1}", methodRequest.Name, methodRequest.DataAsJson);
	}
	else
	{
		_logger.LogWarning("AzureIoTHubClientDefaultMethodHandler name:{0} payload:NULL", methodRequest.Name);
	}

	return new MethodResponse(404);
}

After some experimentation in previous TTN Connectors I found the synchronous nature of Direct Methods didn’t work well with LoRaWAN “irregular” connectivity, so currently they are ignored.

public partial class Integration
{
	private async Task AzureIoTHubClientReceiveMessageHandler(Message message, object userContext)
	{
		try
		{
			Models.AzureIoTHubReceiveMessageHandlerContext receiveMessageHandlerContext = (Models.AzureIoTHubReceiveMessageHandlerContext)userContext;

			if (!_DeviceClients.TryGetValue(receiveMessageHandlerContext.DeviceId, out DeviceClient deviceClient))
			{
				_logger.LogWarning("Downlink-DeviceID:{0} unknown", receiveMessageHandlerContext.DeviceId);
				return;
			}

			using (message)
			{
				string payloadText = Encoding.UTF8.GetString(message.GetBytes()).Trim();

				if (!AzureDownlinkMessage.PortTryGet(message.Properties, out byte port))
				{
					_logger.LogWarning("Downlink-Port property is invalid");

					await deviceClient.RejectAsync(message);
					return;
				}

				// Downlink message to be sent via the TTN webhook, populated from the port property and the message body
				Models.Downlink downlink = new Models.Downlink()
				{
					Port = port
				};

				// Split over multiple lines in an attempt to improve readability. In this scenario a valid JSON string should start/end with {/} for an object or [/] for an array
				if ((payloadText.StartsWith("{") && payloadText.EndsWith("}"))
														||
					((payloadText.StartsWith("[") && payloadText.EndsWith("]"))))
				{
					try
					{
						downlink.PayloadDecoded = JToken.Parse(payloadText);
					}
					catch (JsonReaderException)
					{
						downlink.PayloadRaw = payloadText;
					}
				}
				else
				{
					downlink.PayloadRaw = payloadText;
				}

				_logger.LogInformation("Downlink-IoT Hub DeviceID:{0} MessageID:{1} LockToken:{2} Port:{3}",
					receiveMessageHandlerContext.DeviceId,
					message.MessageId,
					message.LockToken,
					downlink.Port);

				Models.DownlinkPayload Payload = new Models.DownlinkPayload()
				{
					Downlinks = new List<Models.Downlink>()
					{
						downlink
					}
				};

				string url = $"{receiveMessageHandlerContext.WebhookBaseURL}/{receiveMessageHandlerContext.ApplicationId}/webhooks/{receiveMessageHandlerContext.WebhookId}/devices/{receiveMessageHandlerContext.DeviceId}/down/replace";

				using (var client = new WebClient())
				{
					client.Headers.Add("Authorization", $"Bearer {receiveMessageHandlerContext.ApiKey}");

					client.UploadString(new Uri(url), JsonConvert.SerializeObject(Payload));
				}

				_logger.LogInformation("Downlink-DeviceID:{0} LockToken:{1} success", receiveMessageHandlerContext.DeviceId, message.LockToken);
			}
		}
		catch (Exception ex)
		{
			_logger.LogError(ex, "Downlink-ReceiveMessage processing failed");
		}
	}
}

If the message body contains a valid JavaScript Object Notation(JSON) payload it is “spliced” into the downlink message decoded_payload field, otherwise the Base64 encoded frm_payload is populated.

The Things Industries “Live data” tab downlink message

The SetReceiveMessageHandlerAsync context information is used to construct a TTN downlink message payload and request URL (with default queuing, message priority and confirmation options).
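
A rough sketch of what the Models.Downlink and Models.DownlinkPayload classes could look like (the JSON property names are my mapping onto The Things Stack downlink fields; the class layout is an assumption) is below.

using System.Collections.Generic;

using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// Sketch only: the JSON property names map onto The Things Stack downlink fields, the class layout is an assumption
public class Downlink
{
	[JsonProperty("f_port")]
	public byte Port { get; set; }

	[JsonProperty("frm_payload", NullValueHandling = NullValueHandling.Ignore)]
	public string PayloadRaw { get; set; }

	[JsonProperty("decoded_payload", NullValueHandling = NullValueHandling.Ignore)]
	public JToken PayloadDecoded { get; set; }
}

public class DownlinkPayload
{
	[JsonProperty("downlinks")]
	public List<Downlink> Downlinks { get; set; }
}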

Arduino Serial Monitor displaying Uplink and Downlink messages