Myriota Connector – Payload formatters revisited again

The myriota Azure IoT Hub Cloud Identity Translation Gateway payload formatters use compiled C# code to convert uplink packet payloads to JSON and downlink payloads to byte arrays. While trying out different formatters I had “compile” and “evaluation” errors which would have been a lot easier to debug if there had been more diagnostic information in the Azure Application Insights logging.

namespace PayloadFormatter // Additional namespace to shorten interface usage in formatter code
{
    using System.Collections.Generic;

    using Newtonsoft.Json.Linq;

    public interface IFormatterUplink
    {
        public JObject Evaluate(IDictionary<string, string> properties, string terminalId, DateTime timestamp, byte[] payloadBytes);
    }

    public interface IFormatterDownlink
    {
        public byte[] Evaluate(IDictionary<string, string> properties, string terminalId, JObject? payloadJson, byte[] payloadBytes);
    }
}

An uplink payload formatter is loaded from Azure Storage Blob, compiled with Oleg Shilo’s CS-Script then cached in memory with Alastair Crabtree’s LazyCache.

// Get the payload formatter from Azure Storage container, compile, and then cache binary.
IFormatterUplink formatterUplink;

try
{
   formatterUplink = await _payloadFormatterCache.UplinkGetAsync(context.PayloadFormatterUplink, cancellationToken);
}
catch (Azure.RequestFailedException aex)
{
   _logger.LogError(aex, "Uplink- PayloadID:{0} payload formatter load failed", payload.Id);

   return payload;
}
catch (NullReferenceException nex)
{
   _logger.LogError(nex, "Uplink- PayloadID:{id} formatter:{formatter} compilation failed missing interface", payload.Id, context.PayloadFormatterUplink);

   return payload;
}
catch (CSScriptLib.CompilerException cex)
{
   _logger.LogError(cex, "Uplink- PayloadID:{id} formatter:{formatter} compiler failed", payload.Id, context.PayloadFormatterUplink);

   return payload;
}
catch (Exception ex)
{
   _logger.LogError(ex, "Uplink- PayloadID:{id} formatter:{formatter} compilation failed", payload.Id, context.PayloadFormatterUplink);

   return payload;
}

If the Azure Storage blob is missing or the payload formatter code is incorrect an exception is thrown. I added specialised exception handlers for Azure.RequestFailedException, NullReferenceException and CSScriptLib.CompilerException to add more detail to the Azure Application Insights logging.
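
For context, a cache method like UplinkGetAsync might look roughly like the sketch below, assuming LazyCache’s IAppCache (the _payloadFormattersUplink field name is illustrative) and the container setting names used elsewhere in the connector; it is not the actual implementation. The specialised exception handlers above catch failures thrown from inside the add-item factory.

public async Task<IFormatterUplink> UplinkGetAsync(string payloadFormatterBlobName, CancellationToken cancellationToken)
{
    return await _payloadFormattersUplink.GetOrAddAsync<IFormatterUplink>(payloadFormatterBlobName, async () =>
    {
        BlobClient blobClient = new BlobClient(_payloadFormatterConnectionString, _applicationSettings.PayloadFormattersUplinkContainer, payloadFormatterBlobName);

        // Throws Azure.RequestFailedException if the blob is missing
        BlobDownloadResult downloadResult = await blobClient.DownloadContentAsync(cancellationToken);

        // Invalid C# surfaces as a CSScriptLib.CompilerException; a class that doesn't
        // implement the interface surfaces as the NullReferenceException handled above
        return CSScript.Evaluator.LoadCode<PayloadFormatter.IFormatterUplink>(downloadResult.Content.ToString());
    });
}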

// Process the payload with configured formatter
Dictionary<string, string> properties = new Dictionary<string, string>();
JObject telemetryEvent;

try
{
   telemetryEvent = formatterUplink.Evaluate(properties, packet.TerminalId, packet.Timestamp, payloadBytes);
}
catch (Exception ex)
{
   _logger.LogError(ex, "Uplink- PayloadId:{0} TerminalId:{1} Value:{2} Bytes:{3} payload formatter evaluate failed", payload.Id, packet.TerminalId, packet.Value, Convert.ToHexString(payloadBytes));

   return payload;
}

if (telemetryEvent is null)
{
   _logger.LogError("Uplink- PayloadId:{0} TerminalId:{1} Value:{2} Bytes:{3} payload formatter evaluate failed returned null", payload.Id, packet.TerminalId, packet.Value, Convert.ToHexString(payloadBytes));

   return payload;
}

The Evaluate method can throw many different types of exceptions, so in the initial version only the “generic” exception is caught and logged.

using System;
using System.Collections.Generic;

using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public class FormatterUplink : PayloadFormatter.IFormatterUplink
{
    public JObject Evaluate(IDictionary<string, string> properties, string terminalId, DateTime timestamp, byte[] payloadBytes)
    {
        JObject telemetryEvent = new JObject();

        telemetryEvent.Add("Bytes", BitConverter.ToString(payloadBytes));
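        // Adding the same property name a second time throws an ArgumentException at evaluation time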
        telemetryEvent.Add("Bytes", BitConverter.ToString(payloadBytes));

        return telemetryEvent;
    }
}

There is a growing number of test uplink/downlink payload formatters for exercising different compile and execution failures.
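
For example, a hypothetical test formatter that compiles cleanly but deliberately does not implement PayloadFormatter.IFormatterUplink exercises the “compilation failed missing interface” (NullReferenceException) handler shown earlier.

using System;
using System.Collections.Generic;

using Newtonsoft.Json.Linq;

// Deliberately does not implement PayloadFormatter.IFormatterUplink
public class FormatterUplink
{
    public JObject Evaluate(IDictionary<string, string> properties, string terminalId, DateTime timestamp, byte[] payloadBytes)
    {
        return new JObject();
    }
}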

Azure Storage Explorer container with sample formatter blobs.

I used Azure Storage Explorer to upload my test payload formatters to the uplink/downlink Azure Storage containers.

Myriota Connector – Uplink Payload formatters revisited

The myriota Azure IoT Hub Cloud Identity Translation Gateway payload formatters use compiled C# code to convert uplink packet payloads to JSON.

namespace PayloadFormatter
{
    using System.Collections.Generic;

    using Newtonsoft.Json.Linq;

    public interface IFormatterUplink
    {
        public JObject Evaluate(IDictionary<string, string> properties, string application, string terminalId, DateTime timestamp, JObject payloadJson, string payloadText, byte[] payloadBytes);
    }
...
}

The myriota uplink packet payload is only 20 bytes long, so it is very unlikely that the payloadText and payloadJson parameters would ever be populated, and I removed them from the interface. The uplink message handler has been updated and the code that converted (where possible) the payload bytes to text and then to JSON has been deleted.

namespace PayloadFormatter
{
    using System.Collections.Generic;

    using Newtonsoft.Json.Linq;

    public interface IFormatterUplink
    {
        public JObject Evaluate(IDictionary<string, string> properties, string application, string terminalId, DateTime timestamp, byte[] payloadBytes);
    }
...
}

All of the sample payload formatters have been updated to reflect the updated parameters. The sample Tracker.cs payload formatter unpacks a message from a Myriota Dev Kit running the Tracker sample and returns an Azure IoT Central compatible location telemetry payload.

/*
myriota tracker payload format

typedef struct {
  uint16_t sequence_number;
  int32_t latitude;   // scaled by 1e7, e.g. -891234567 (south 89.1234567)
  int32_t longitude;  // scaled by 1e7, e.g. 1791234567 (east 179.1234567)
  uint32_t time;      // epoch timestamp of last fix
} __attribute__((packed)) tracker_message; 

*/ 
using System;
using System.Collections.Generic;
using System.Globalization;

using Newtonsoft.Json;
using Newtonsoft.Json.Linq;


public class FormatterUplink : PayloadFormatter.IFormatterUplink
{
    public JObject Evaluate(IDictionary<string, string> properties, string application, string terminalId, DateTime timestamp, byte[] payloadBytes)
    {
        JObject telemetryEvent = new JObject();

        if (payloadBytes is null)
        {
            return telemetryEvent;
        }

        telemetryEvent.Add("SequenceNumber", BitConverter.ToUInt16(payloadBytes));

        JObject location = new JObject();

        double latitude = BitConverter.ToInt32(payloadBytes, 2) / 10000000.0;
        location.Add("lat", latitude);

        double longitude = BitConverter.ToInt32(payloadBytes, 6) / 10000000.0;
        location.Add("lon", longitude);

        location.Add("alt", 0);

        telemetryEvent.Add("DeviceLocation", location);

        UInt32 packetimestamp = BitConverter.ToUInt32(payloadBytes, 10);

        DateTime fixAtUtc = DateTime.UnixEpoch.AddSeconds(packetimestamp);

        telemetryEvent.Add("FixAtUtc", fixAtUtc);

        properties.Add("iothub-creation-time-utc", fixAtUtc.ToString("s", CultureInfo.InvariantCulture));

        return telemetryEvent;
    }
}

If a message payload is text or JSON it can still be converted inside the payload formatter.
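
A minimal sketch of how a formatter could do that conversion itself, assuming the payload bytes are UTF-8 text that may contain JSON (an illustration, not one of the repository samples):

using System;
using System.Collections.Generic;
using System.Text;

using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public class FormatterUplink : PayloadFormatter.IFormatterUplink
{
    public JObject Evaluate(IDictionary<string, string> properties, string application, string terminalId, DateTime timestamp, byte[] payloadBytes)
    {
        JObject telemetryEvent = new JObject();

        // Interpret the payload bytes as UTF-8 text
        string payloadText = Encoding.UTF8.GetString(payloadBytes);

        telemetryEvent.Add("Text", payloadText);

        // If the text is valid JSON include it as a nested object
        try
        {
            telemetryEvent.Add("Json", JObject.Parse(payloadText));
        }
        catch (JsonReaderException)
        {
            // Not JSON, only the text is reported
        }

        return telemetryEvent;
    }
}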

Myriota – Uplink Payload formatters and caching

My myriota Azure IoT Hub Cloud Identity Translation Gateway payload formatters use C# code (compiled with CS-Script, cached with Alastair Crabtree’s LazyCache) to convert uplink packet payloads to JSON.

I have found that putting the C/C++ structure for the uplink payload at the top of the converter is really helpful.

/*
myriota tracker payload format

typedef struct {
  uint16_t sequence_number;
  int32_t latitude;   // scaled by 1e7, e.g. -891234567 (south 89.1234567)
  int32_t longitude;  // scaled by 1e7, e.g. 1791234567 (east 179.1234567)
  uint32_t time;      // epoch timestamp of last fix
} __attribute__((packed)) tracker_message; 

*/ 
using System;
using System.Collections.Generic;
using System.Globalization;

using Newtonsoft.Json;
using Newtonsoft.Json.Linq;


public class FormatterUplink : PayloadFormatter.IFormatterUplink
{
    public JObject Evaluate(IDictionary<string, string> properties, string application, string terminalId, DateTime timestamp, JObject payloadJson, string payloadText, byte[] payloadBytes)
    {
        JObject telemetryEvent = new JObject();

        telemetryEvent.Add("SequenceNumber", BitConverter.ToUInt16(payloadBytes));

        double latitude = BitConverter.ToInt32(payloadBytes, 2) / 10000000.0;
        telemetryEvent.Add("Latitude", latitude);

        double longitude = BitConverter.ToInt32(payloadBytes, 6) / 10000000.0;
        telemetryEvent.Add("Longitude", longitude);

        UInt32 packetimestamp = BitConverter.ToUInt32(payloadBytes, 10);
        DateTime lastFix = DateTime.UnixEpoch.AddSeconds(packetimestamp);

        properties.Add("iothub-creation-time-utc", lastFix.ToString("s", CultureInfo.InvariantCulture));

        return telemetryEvent;
    }
}

The sample Tracker.cs payload formatter unpacks a message from a Myriota Dev Kit running the Tracker sample and returns an Azure IoT Central compatible location telemetry payload.

BEWARE: I think the Azure IoT Central Position lat, lon and alt property names might be case sensitive.

Azure IoT Explorer displaying Tracker.cs payload formatter output

The payload formatter to use is configured as part of the Destination webhook Uniform Resource Locator (URL).

Myriota Destination configuration application name URL configuration
namespace devMobile.IoT.MyriotaAzureIoTConnector.Connector.Models
{
    public class UplinkPayloadQueueDto
    {
        public string Application { get; set; }
        public string EndpointRef { get; set; }
        public DateTime PayloadReceivedAtUtc { get; set; }
        public DateTime PayloadArrivedAtUtc { get; set; }
        public QueueData Data { get; set; }
        public string Id { get; set; }
        public Uri CertificateUrl { get; set; }
        public string Signature { get; set; }
    }

    public class QueueData
    {
        public List<QueuePacket> Packets { get; set; }
    }

    public class QueuePacket
    {
        public string TerminalId { get; set; }

        public DateTime Timestamp { get; set; }

        public string Value { get; set; }
    }
}

A pair of Azure Blob Storage containers are used to store the uplink/downlink (coming soon) formatter files. The compiled payload formatters are cached with Uplink/Downlink + Application (from the UplinkPayloadQueueDto) as the key.

Azure Storage Explorer uplink payload formatters

The default uplink and downlink formatters used when there is no payload formatter for “Application” are configured in the application settings.

Myriota device Uplink Serialisation

The Myriota Developer documentation has some sample webhook data payloads so I used JSON2csharp to generate a Data Transfer Object (DTO) to deserialise the payload. The format of the message is a bit “odd”; the “Data” value contains an “escaped” JSON object.

{
  "EndpointRef": "ksnb8GB_TuGj:__jLfs2BQJ2d",
  "Timestamp": 1692928585,
  "Data": "{"Packets": [{"Timestamp": 1692927646796, "TerminalId": "0001020304", "Value": "00008c9512e624cce066adbae764cccccccccccc"}]}",
  "Id": "a5c1bffe-4b62-4233-bbe9-d4ecc4f8b6cb",
  "CertificateUrl": "https://security.myriota.com/data-13f7751f3c5df569a6c9c42a9ce73a8a.crt",
  "Signature": "FDJpQdWHwCY+tzCN/WvQdnbyjgu4BmP/t3cJIOEF11sREGtt7AH2L9vMUDji6X/lxWBYa4K8tmI0T914iPyFV36i+GtjCO4UHUGuFPJObCtiugVV8934EBM+824xgaeW8Hvsqj9eDeyJoXH2S6C1alcAkkZCVt0pUhRZSZZ4jBJGGEEQ1Gm+SOlYjC2exUOf0mCrI5Pct+qyaDHbtiHRd/qNGW0LOMXrB/9difT+/2ZKE1xvDv9VdxylXi7W0/mARCfNa0J6aWtQrpvEXJ5w22VQqKBYuj3nlGtL1oOuXCZnbFYFf4qkysPaXON31EmUBeB4WbZMyPaoyFK0wG3rwA=="
}
namespace devMobile.IoT.myriotaAzureIoTConnector.myriota.UplinkWebhook.Models
{
    public class UplinkPayloadWebDto
    {
        public string EndpointRef { get; set; }
        public long Timestamp { get; set; } 
        public string Data { get; set; } // Embedded JSON ?
        public string Id { get; set; }
        public string CertificateUrl { get; set; }
        public string Signature { get; set; }
    }
}

The UplinkWebhook controller “automagically” deserialises the message, the embedded JSON is then deserialised and “unpacked” in code, and finally the processed message is inserted into an Azure Storage queue.

namespace devMobile.IoT.myriotaAzureIoTConnector.myriota.UplinkWebhook.Controllers
{
    [Route("[controller]")]
    [ApiController]
    public class UplinkController : ControllerBase
    {
        private readonly Models.ApplicationSettings _applicationSettings;
        private readonly ILogger<UplinkController> _logger;
        private readonly QueueServiceClient _queueServiceClient;

        public UplinkController(IOptions<Models.ApplicationSettings> applicationSettings, QueueServiceClient queueServiceClient, ILogger<UplinkController> logger)
        {
            _applicationSettings = applicationSettings.Value;
            _queueServiceClient = queueServiceClient;
            _logger = logger;
        }

        [HttpPost]
        public async Task<IActionResult> Post([FromBody] Models.UplinkPayloadWebDto payloadWeb)
        {
            _logger.LogInformation("SendAsync queue name:{QueueName}", _applicationSettings.QueueName);

            QueueClient queueClient = _queueServiceClient.GetQueueClient(_applicationSettings.QueueName);

            var serializeOptions = new JsonSerializerOptions
            {
                WriteIndented = true,
                Encoder = System.Text.Encodings.Web.JavaScriptEncoder.UnsafeRelaxedJsonEscaping
            };

            await queueClient.SendMessageAsync(Convert.ToBase64String(JsonSerializer.SerializeToUtf8Bytes(payloadWeb, serializeOptions)));

            return this.Ok();
        }
    }
}
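
The controller above queues the payload as received, so the “escaped” Data JSON still has to be unpacked downstream. A minimal sketch of that unpacking, assuming Newtonsoft.Json and noting that the packet Timestamp values are Unix epoch milliseconds (this is illustrative, not the actual uplink message handler):

// Unpack the embedded "Data" JSON from the queued webhook payload
JObject data = JObject.Parse(payloadWeb.Data);

foreach (JToken packet in data["Packets"])
{
    string terminalId = packet.Value<string>("TerminalId");
    string value = packet.Value<string>("Value");

    // Packet timestamps are Unix epoch milliseconds
    DateTime timestamp = DateTimeOffset.FromUnixTimeMilliseconds(packet.Value<long>("Timestamp")).UtcDateTime;
}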

The webhook application uses the QueueClientBuilderExtensions.AddQueueServiceClient extension so a QueueServiceClient can be injected into the webhook controller.

namespace devMobile.IoT.myriotaAzureIoTConnector.myriota.UplinkWebhook
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var builder = WebApplication.CreateBuilder(args);

            // Add services to the container.
            builder.Services.AddControllers();

            builder.Services.AddApplicationInsightsTelemetry(i => i.ConnectionString = builder.Configuration.GetConnectionString("ApplicationInsights"));

            builder.Services.Configure<Models.ApplicationSettings>(builder.Configuration.GetSection("Application"));

            builder.Services.AddAzureClients(azureClient =>
            {
                azureClient.AddQueueServiceClient(builder.Configuration.GetConnectionString("AzureWebApi"));
            });

            var app = builder.Build();

            // Configure the HTTP request pipeline.

            app.UseHttpsRedirection();

            app.MapControllers();

            app.Run();
        }
    }
}

After debugging the application on my desktop with Telerik Fiddler, I deployed it to one of my Azure subscriptions.

Azure Resource Group for the myriota Azure IoT Connector
Adding a new Destination in the myriota device manager

As part of configuring a new device, test messages can be sent to the configured destinations.

Testing a new Destination in the myriota device manager
{
  "EndpointRef": "N_HlfTNgRsqe:uyXKvYTmTAO5",
  "Timestamp": 1563521870,
  "Data": "{"Packets": [{"Timestamp": 1563521870359,
    "TerminalId": "f74636ec549f9bde50cf765d2bcacbf9",
    "Value": "0101010101010101010101010101010101010101"}]}",
  "Id": "fe77e2c7-8e9c-40d0-8980-43720b9dab75",
  "CertificateUrl":    "https://security.myriota.com/data-13f7751f3c5df569a6c9c42a9ce73a8a.crt",
  "Signature": "k2OIBppMRmBT520rUlIvMxNg+h9soJYBhQhOGSIWGdzkppdT1Po2GbFr7jbg..."
}

The DTO generated with JSON2csharp needed some manual “tweaking” after examining how a couple of the sample messages were deserialised.

Azure Storage Explorer messages

I left the Myriota Developer Toolkit device (running the tracker sample) outside overnight and the following day I could see a couple of messages in the Azure Storage queue with Azure Storage Explorer.

Myriota device configuration

For a couple of weeks a Myriota Developer Toolkit has been sitting under my desk, and today I got some time to set up a device, register it, then upload some data.

Myriota Developer Toolkit

The first step was to download and install the Myriota Configurator so I could get the device registration information and install the tracker example application.

Using Windows File Explorer to “unblock” the downloaded file

After “unblocking” the zip file and upgrading pip, the install script worked.

Myriota Configurator installation script

The application had to be run from the command line with “python MyriotaConfigurator.py”.

Myriota Configurator main menu
Myriota Configurator retrieving device registration code

On the device I’m using the Tracker sample application to generate some sample payloads.

Myriota Configurator downloading tracker sample to device

The next step was to “register” my device and configure the destination(s) for its messages.

Myriota Device Manager Device configuration

Once the device and device manager configuration were sorted, I put the Tracker out on the back lawn on top of a large flowerpot.

Device Manager Access Times

On the “Access Times” page I could see that there were several periods when a satellite was overhead and overnight a couple of messages were uploaded.

Swarm Space – Underlying Architecture sorted

After figuring out that calling an Azure Http Trigger function to load the cache wasn’t going to work reliably, I have revisited the architecture one last time and significantly refactored the SwarmSpaceAzureIoTConnector project.

Visual Studio 2022 solution

The application now has a StartUpService which loads the Azure DeviceClient cache (Lazy Cache) in the background as the application starts up. If an uplink message is received from a Swarm device before it has been loaded by the FunctionsStartup, the DeviceClient information is cached and another connection to the Azure IoT Hub is not established.

...
using Microsoft.Azure.Functions.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(devMobile.IoT.SwarmSpaceAzureIoTConnector.Connector.StartUpService))]
namespace devMobile.IoT.SwarmSpaceAzureIoTConnector.Connector
{
...
    public class StartUpService : BackgroundService
    {
        private readonly ILogger<StartUpService> _logger;
        private readonly ISwarmSpaceBumblebeeHive _swarmSpaceBumblebeeHive;
        private readonly Models.ApplicationSettings _applicationSettings;
        private readonly IAzureDeviceClientCache _azureDeviceClientCache;

        public StartUpService(ILogger<StartUpService> logger, IAzureDeviceClientCache azureDeviceClientCache, ISwarmSpaceBumblebeeHive swarmSpaceBumblebeeHive, IOptions<Models.ApplicationSettings> applicationSettings)//, IOptions<Models.AzureIoTSettings> azureIoTSettings)
        {
            _logger = logger;
            _azureDeviceClientCache = azureDeviceClientCache;
            _swarmSpaceBumblebeeHive = swarmSpaceBumblebeeHive;
            _applicationSettings = applicationSettings.Value;
        }

        protected override async Task ExecuteAsync(CancellationToken cancellationToken)
        {
            await Task.Yield();

            _logger.LogInformation("StartUpService.ExecuteAsync start");

            try
            {
                _logger.LogInformation("BumblebeeHiveCacheRefresh start");

                foreach (SwarmSpace.BumblebeeHiveClient.Device device in await _swarmSpaceBumblebeeHive.DeviceListAsync(cancellationToken))
                {
                    _logger.LogInformation("BumblebeeHiveCacheRefresh DeviceId:{DeviceId} DeviceName:{DeviceName}", device.DeviceId, device.DeviceName);

                    Models.AzureIoTDeviceClientContext context = new Models.AzureIoTDeviceClientContext()
                    {
                        OrganisationId = _applicationSettings.OrganisationId,
                        DeviceType = (byte)device.DeviceType,
                        DeviceId = (uint)device.DeviceId,
                    };

                    await _azureDeviceClientCache.GetOrAddAsync(context.DeviceId, context);
                }

                _logger.LogInformation("BumblebeeHiveCacheRefresh finish");
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "StartUpService.ExecuteAsync error");

                throw;
            }

            _logger.LogInformation("StartUpService.ExecuteAsync finish");
        }
    }
}

The uplink and downlink payload formatters are stored in Azure Blob Storage, compiled (CS-Script) as they are loaded, then cached (Lazy Cache).

Azure Storage explorer displaying list of uplink payload formatter blobs.
Azure Storage explorer displaying list of downlink payload formatter blobs.
private async Task<IFormatterDownlink> DownlinkLoadAsync(int userApplicationId)
{
    BlobClient blobClient = new BlobClient(_payloadFormatterConnectionString, _applicationSettings.PayloadFormattersDownlinkContainer, $"{userApplicationId}.cs");

    if (!await blobClient.ExistsAsync())
    {
        _logger.LogInformation("PayloadFormatterDownlink- UserApplicationId:{0} Container:{1} not found using default:{2}", userApplicationId, _applicationSettings.PayloadFormattersUplinkContainer, _applicationSettings.PayloadFormatterUplinkBlobDefault);

        blobClient = new BlobClient(_payloadFormatterConnectionString, _applicationSettings.PayloadFormatterDownlinkBlobDefault, _applicationSettings.PayloadFormatterDownlinkBlobDefault);
    }

    BlobDownloadResult downloadResult = await blobClient.DownloadContentAsync();

    return CSScript.Evaluator.LoadCode<PayloadFormatter.IFormatterDownlink>(downloadResult.Content.ToString());
}

The uplink and downlink formatters can be edited in Visual Studio 2022 with syntax highlighting (currently they have to be manually uploaded).

The SwarmSpaceBumblebeeHive module no longer has public login or logout methods.

    public interface ISwarmSpaceBumblebeeHive
    {
        public Task<ICollection<Device>> DeviceListAsync(CancellationToken cancellationToken);

        public Task SendAsync(uint organisationId, uint deviceId, byte deviceType, ushort userApplicationId, byte[] payload);
    }

The DeviceListAsync and SendAsync methods now call the BumblebeeHive login method after a configurable period of inactivity.

public async Task<ICollection<Device>> DeviceListAsync(CancellationToken cancellationToken)
{
    if ((_TokenActivityAtUtC + _bumblebeeHiveSettings.TokenValidFor) < DateTime.UtcNow)
    {
        await Login();
    }

    using (HttpClient httpClient = _httpClientFactory.CreateClient())
    {
        Client client = new Client(httpClient);

        client.BaseUrl = _bumblebeeHiveSettings.BaseUrl;

        httpClient.DefaultRequestHeaders.Add("Authorization", $"bearer {_token}");

        return await client.GetDevicesAsync(null, null, null, null, null, null, null, null, null, cancellationToken);
    }
}

I’m looking at building a web user interface where users can interactively list, create, edit, and delete formatters with syntax highlighting support, and execute a formatter against sample payloads.

Swarm Space Azure IoT Connector Identity Translation Gateway Architecture

This approach uses most of the existing building blocks, and that’s it, no more changes.

Swarm Space – Uplink with WebAPI Revisited again

After reviewing my ASP .NET Core WebAPI Swarm Space Delivery Method webhook implementation I have made a final round of changes.

There are now separate Data Transfer Objects (DTOs) for the uplink and queue message payloads, mainly because the UplinkPayloadQueueDto has additional fields for the client (based on the x-api-key) and for when the webhook was called.

public class UplinkPayloadQueueDto
{
    public ulong PacketId { get; set; }
    public byte DeviceType { get; set; }
    public uint DeviceId { get; set; }
    public ushort UserApplicationId { get; set; }
    public uint OrganizationId { get; set; }
    public string Data { get; set; } = string.Empty;
    public byte Length { get; set; }
    public int Status { get; set; }
    public DateTime SwarmHiveReceivedAtUtc { get; set; }
    public DateTime UplinkWebHookReceivedAtUtc { get; set; }
    public string Client { get; set; } = string.Empty;
 }

public class UplinkPayloadWebDto
{
    public ulong PacketId { get; set; }
    public byte DeviceType { get; set; }
    public uint DeviceId { get; set; }
    public ushort UserApplicationId { get; set; }
    public uint OrganizationId { get; set; }
    public string Data { get; set; } = string.Empty;

    [Range(Constants.PayloadLengthMinimum, Constants.PayloadLengthMaximum)]
    public byte Len { get; set; }
    public int Status { get; set; }

    public DateTime HiveRxTime { get; set; }
}

I did consider using AutoMapper to copy the values from the UplinkPayloadWebDto to the UplinkPayloadQueueDto, but the additional complexity/configuration required for one mapping wasn’t worth it.

The UplinkController has a single POST method which takes a JSON payload (FromBody) and a single header (FromHeader), “x-api-key”, which is used to secure the method and identify the caller.

[HttpPost]
public async Task<IActionResult> Post([FromHeader(Name = "x-api-key")] string xApiKeyValue, [FromBody] Models.UplinkPayloadWebDto payloadWeb)
{
    if (!_applicationSettings.XApiKeys.TryGetValue(xApiKeyValue, out string apiKeyName))
    {
        _logger.LogWarning("Authentication unsuccessful X-API-KEY value:{xApiKeyValue}", xApiKeyValue);

        return this.Unauthorized("Unauthorized client");
    }

    _logger.LogInformation("Authentication successful X-API-KEY value:{apiKeyName}", apiKeyName);

    // Could have used AutoMapper but didn't seem worth it for one place
    Models.UplinkPayloadQueueDto payloadQueue = new()
    {
        PacketId = payloadWeb.PacketId,
        DeviceType = payloadWeb.DeviceType,
        DeviceId = payloadWeb.DeviceId,
        UserApplicationId = payloadWeb.UserApplicationId,
        OrganizationId = payloadWeb.OrganizationId,
        Data = payloadWeb.Data,
        Length = payloadWeb.Len,
        Status = payloadWeb.Status,
        SwarmHiveReceivedAtUtc = payloadWeb.HiveRxTime,
        UplinkWebHookReceivedAtUtc = DateTime.UtcNow,
        Client = apiKeyName,
    };

    _logger.LogInformation("SendAsync queue name:{QueueName}", _applicationSettings.QueueName);

    QueueClient queueClient = _queueServiceClient.GetQueueClient(_applicationSettings.QueueName);

    await queueClient.SendMessageAsync(Convert.ToBase64String(JsonSerializer.SerializeToUtf8Bytes(payloadQueue)));

    return this.Ok();
}

I’ve also used dependency injection (DI) to get a QueueClient just because “it’s always better with DI”.

Azure Web App Application settings with x-api-key configuration

The “x-api-key” values can also be updated without having to redeploy the application.
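
The x-api-key lookup in the controller above is just a dictionary in the application settings; a minimal sketch of what that settings class might look like (the XApiKeys and QueueName property names come from the controller code, everything else is an assumption):

using System.Collections.Generic;

public class ApplicationSettings
{
    public string QueueName { get; set; } = string.Empty;

    // Maps an x-api-key value to a friendly client name used for logging
    public Dictionary<string, string> XApiKeys { get; set; } = new Dictionary<string, string>();
}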

Swarm Space – Uplink with Azure Functions or WebAPI

This post could have been much longer with more screen grabs and code snippets, so this is the “highlights package”. This post took a lot longer than I expected as building, testing locally, then deploying the different implementations was time consuming.

Swarm Space Connector Functions Projects

I built the projects to investigate the different options, taking into account reliability, robustness, amount of code, and performance (I think slow startup could be a problem). The code is very “plain”: I used the default options, no copyright notices, default formatting, and context sensitive error messages to add any required “using” statements, libraries etc.

The desktop emulator hosting the six functions

I also deployed the Azure Functions and ASP .NET Core WebAPI applications to check there were no differences (beyond performance) in the way they worked. I included a “default” function (generated by the new project wizard) for reference while I was building the others.

The function application with six functions deployed to Azure

The “dynamic” typed function worked but broke when the JavaScript Object Notation (JSON) was invalid or fields were missing, and it didn’t enforce that the payload was correct.

namespace WebhookHttpTrigger
{
    public static class Dynamic
    {
        [FunctionName("Dynamic")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] dynamic input,
            ILogger log)
        {
            log.LogInformation($"C# HTTP Dynamic trigger function processed a request PacketId:{input.packetId}.");

            return new OkObjectResult("Hello, This HTTP triggered Dynamic function executed successfully.");
        }
    }
}
Dynamic Trigger function failing because PacketId format was invalid (x in numeric field)

The “TypedAutomagic” function worked: it ensured the JavaScript Object Notation (JSON) was valid and the payload format was correct, but it didn’t enforce the System.ComponentModel.DataAnnotations attributes.

namespace WebhookHttpTrigger
{
    public static class TypedAutomagic
    {
        [FunctionName("TypedAutomagic")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] UplinkPayload payload,
            ILogger log)
        {
            log.LogInformation($"C# HTTP trigger function typed TypedAutomagic UplinkPayload processed a request PacketId:{payload.PacketId}");

            return new OkObjectResult("Hello, This HTTP triggered automagic function executed successfully.");
        }
    }
}
Successful execution of TypedAutomagic function

The “TypedAutomagic” implementation also detected when the JSON property values in the payload couldn’t be deserialised successfully, but if the hiveRxTime was invalid the value was set to 1/1/0001 12:00:00 am.

TypedAutomagic hiveRxTime deserialisation failing

The “TypedDeserializeObject” function worked: it ensured the JavaScript Object Notation (JSON) was valid and the payload format was correct, but it also didn’t enforce the System.ComponentModel.DataAnnotations attributes.

namespace WebhookHttpTrigger
{
    public static class TypedDeserializeObject
    {
        [FunctionName("TypedDeserializeObject")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] string httpPayload,
            ILogger log)
        {
            UplinkPayload uplinkPayload;

            try
            {
                uplinkPayload = JsonConvert.DeserializeObject<UplinkPayload>(httpPayload);
            }
            catch(Exception ex) 
            {
                log.LogWarning(ex, "JsonConvert.DeserializeObject failed");

                return new BadRequestObjectResult(ex.Message);
            }

            log.LogInformation($"C# HTTP trigger function typed DeserializeObject UplinkPayload processed a request PacketId:{uplinkPayload.PacketId}");

            return new OkObjectResult("Hello, This HTTP triggered DeserializeObject function executed successfully.");
        }
    }
}
TypedDeserializeObject function failing because PacketId format was invalid (x in numeric field)
TypedDeserializeObject function failing because deviceId value is negative but datatype is unsigned
Successful execution of TypedDeserializeObject function

The “TypedDeserializeObjectAnnotations” function worked: it ensured the JavaScript Object Notation (JSON) was valid, the payload format was correct, and it enforced the System.ComponentModel.DataAnnotations attributes.

namespace WebhookHttpTrigger
{
    public static class TypedDeserializeObjectAnnotations
    {
        [FunctionName("TypedDeserializeObjectAnnotations")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] string httpPayload,
            ILogger log)
        {
            UplinkPayload uplinkPayload;

            try
            {
                uplinkPayload = JsonConvert.DeserializeObject<UplinkPayload>(httpPayload);
            }
            catch (Exception ex)
            {
                log.LogWarning(ex, "JsonConvert.DeserializeObject failed");

                return new BadRequestObjectResult(ex.Message);
            }

            var context = new ValidationContext(uplinkPayload, serviceProvider: null, items: null);

            var results = new List<ValidationResult>();

            var isValid = Validator.TryValidateObject(uplinkPayload, context, results, true);

            if (!isValid)
            {
                log.LogWarning("Validator.TryValidateObject failed results:{results}", results);

                return new BadRequestObjectResult(results);
            }

            log.LogInformation($"C# HTTP trigger function typed DeserializeObject UplinkPayload processed a request PacketId:{uplinkPayload.PacketId}");

            return new OkObjectResult("Hello, This HTTP triggered DeserializeObject function executed successfully.");
        }
    }
}

I built an ASP .NET Core WebAPI version with two uplink method implementations, one which used dependency injection (DI) and the other that didn’t. I also added code to validate the deserialisation of HiveRxTimeUtc.

....
[HttpPost]
public async Task<IActionResult> Post([FromBody] UplinkPayload payload)
{
    if ( payload.HiveRxTimeUtc == DateTime.MinValue)
    {
        _logger.LogWarning("HiveRxTimeUtc validation failed");

        return this.BadRequest();
    }

    QueueClient queueClient = _queueServiceClient.GetQueueClient("uplink");

    await queueClient.SendMessageAsync(Convert.ToBase64String(JsonSerializer.SerializeToUtf8Bytes(payload)));

    return this.Ok();
}
...
 [HttpPost]
public async Task<IActionResult> Post([FromBody] UplinkPayload payload)
{
    // Check that the post data is good
    if (!this.ModelState.IsValid)
    {
        _logger.LogWarning("QueuedController validation failed {0}", this.ModelState.ToString());

        return this.BadRequest(this.ModelState);
    }

    if ( payload.HiveRxTimeUtc == DateTime.MinValue)
    {
        _logger.LogWarning("HiveRxTimeUtc validation failed");

        return this.BadRequest();
    }

    try
    {
        QueueClient queueClient = new QueueClient(_configuration.GetConnectionString("AzureWebApi"), "uplink");

        //await queueClient.CreateIfNotExistsAsync();

        await queueClient.SendMessageAsync(Convert.ToBase64String(JsonSerializer.SerializeToUtf8Bytes(payload)));
    }
    catch (Exception ex)
    {
        _logger.LogError(ex,"Unable to open/create queue or send message", ex);

        return this.Problem("Unable to open queue (creating if it doesn't exist) or send message", statusCode: 500, title: "Uplink payload not sent");
    }

    return this.Ok();
}

In Telerik Fiddler I could see that calls to the Azure Functions and the ASP .NET Core WebAPI were taking similar times to execute (though I did see 5+ seconds), and the ASP .NET Core WebAPI appeared to take much longer to start up (I did see 100+ seconds when I made four requests as the ASP .NET Core WebAPI was starting).

I’m going to use the ASP .NET Core WebAPI with dependency injection (DI) approach just because “it’s always better with DI”.

I noticed some other “oddness” while implementing then testing the Azure Http Trigger functions and ASP .NET Core WebAPI which I will cover off in some future posts.

Swarm Space – Underlying Architecture Revisited

After figuring out that calling a CS-Script uplink payload formatter inside an Azure Http Trigger function wasn’t going to work I needed a new architecture.

Swarm Space Azure IoT Connector Identity Translation Gateway Architecture

The new approach uses most of the existing building blocks but adds an Azure HTTP Trigger which receives the Swarm Space Bumblebee Hive Webhook Delivery Method calls and writes them to an Azure Storage Queue.

Swarm Space Bumblebee Hive Webhook Delivery Method

The uplink and downlink formatters are now called asynchronously so they have limited impact on the overall performance of the application.
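
A minimal sketch of how the asynchronous side might be wired up, assuming an Azure Storage Queue triggered function with the default Base64 message encoding; the function name, queue name and connection setting are illustrative rather than the actual connector code.

[FunctionName("UplinkMessageProcessor")]
public async Task UplinkMessageProcessor([QueueTrigger("uplink", Connection = "AzureFunctionsStorage")] string messageText, ILogger log)
{
    Models.UplinkPayloadQueueDto payload = JsonConvert.DeserializeObject<Models.UplinkPayloadQueueDto>(messageText);

    log.LogInformation("UplinkMessageProcessor PacketId:{PacketId} DeviceId:{DeviceId}", payload.PacketId, payload.DeviceId);

    // Load/compile/cache the payload formatter, evaluate it, then send the telemetry
    // event to the Azure IoT Hub with the cached DeviceClient
}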

Swarm Space – Uplink Payload Startup Problem

I initially noticed a couple of duplicate Swarm Space message PacketIds in Azure IoT Central.

Azure IoT Central with consecutive duplicate PacketIds

Then I started to pay more attention and noticed that duplicate PacketIds could be interleaved.

Azure IoT Central with interleaved duplicate PacketIds

Shortly after noticing the interleaved PacketIds I checked the Delivery Method and found there were message delivery timeouts.

Swarm Space Delivery Method with timeouts

In Azure Application Insights I could see that the UplinkController was taking up to 15 seconds to execute which was longer than the bumblebee hive delivery timeout.

Azure Application Insights displaying UplinkController metrics.

In Telerik Fiddler I could see calls to the UplinkController taking 16 seconds to execute. (I did see 30+ seconds)

Telerik Fiddler showing duration of Uplink controller calls

To see if the problem was loading CS-Script, I added code to load a simple function as the application started. Averaged over many executions, there was little difference in the duration.

public interface IApplication
{
    public DateTime Startup(DateTime utcNow);
}
...
protected override async Task ExecuteAsync(CancellationToken cancellationToken)
{
    await Task.Yield();

    _logger.LogInformation("StartUpService.ExecuteAsync start");
            
    // Force the loading and startup of CS Script evaluator
    dynamic application = CSScript.Evaluator
        .LoadCode(
                @"using System;
                public class Application : IApplication
                {
                    public DateTime Startup(DateTime utcNow)
                    {
                        return utcNow;
                    }
                }");

    DateTime result = application.Startup(DateTime.UtcNow);
            
    try
    {
        await _swarmSpaceBumblebeeHive.Login(cancellationToken);

        await _azureIoTDeviceClientCache.Load(cancellationToken);
    }
    catch (Exception ex)
    {
        _logger.LogError(ex, "StartUpService.ExecuteAsync error");

        throw;
    }

    _logger.LogInformation("StartUpService.ExecuteAsync finish");
}

The Swarm Eval Kit uplink formatter (UserApplicationId 65535.cs) “unpacks” the uplink JavaScript Object Notation (JSON) message and adds an Azure IoT Central compatible location, which requires a number of libraries to be loaded.

using System;
using System.Globalization;
using System.Text;

using Microsoft.Azure.Devices.Client;

using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public class FormatterUplink : PayloadFormatter.IFormatterUplink
{
    public Message Evaluate(int organisationId, int deviceId, int deviceType, int userApplicationId, JObject telemetryEvent, JObject payloadJson, string payloadText, byte[] payloadBytes)
    {
        if ((payloadText != "") && (payloadJson != null))
        {
            JObject location = new JObject();

            location.Add("lat", payloadJson.GetValue("lt"));
            location.Add("lon", payloadJson.GetValue("ln"));
            location.Add("alt", payloadJson.GetValue("a"));

            telemetryEvent.Add("DeviceLocation", location);
        }

        Message ioTHubmessage = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(telemetryEvent)));

        ioTHubmessage.Properties.Add("iothub-creation-time-utc", DateTimeOffset.FromUnixTimeSeconds((long)payloadJson.GetValue("d")).ToString("s", CultureInfo.InvariantCulture));

        return ioTHubmessage;
    }
}

I then added code to load the most complex uplink and downlink formatters as the application started. There was a significant reduction in the UplinkController execution durations, but it could still take more than 30 seconds.

try
{
    await _swarmSpaceBumblebeeHive.Login(cancellationToken);

    await _azureIoTDeviceClientCache.Load(cancellationToken);

    await _formatterCache.UplinkGetAsync(65535);

    await _formatterCache.DownlinkGetAsync(20);
}
catch (Exception ex)
{
    _logger.LogError(ex, "StartUpService.ExecuteAsync error");

    throw;
}

I then added detailed telemetry to the code and found that the duration (and its variability) was a combination of Azure IoT Device Provisioning Service (DPS) registration, Azure IoT Hub connection establishment, CS-Script payload formatter loading/compilation/execution, application startup tasks and message uploading durations.
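
A minimal sketch of the sort of timing instrumentation added, assuming System.Diagnostics.Stopwatch; the cache and formatter method names come from the code above, but the call sites and variable names are illustrative.

Stopwatch stopwatch = Stopwatch.StartNew();

// Azure IoT Hub DPS registration + connection establishment (cached after the first call)
DeviceClient deviceClient = await _azureIoTDeviceClientCache.GetOrAddAsync(context.DeviceId, context);
_logger.LogInformation("Uplink- DeviceClient load took {Duration}mSec", stopwatch.ElapsedMilliseconds);

stopwatch.Restart();

// CS-Script payload formatter load/compile (cached after the first call)
IFormatterUplink formatterUplink = await _formatterCache.UplinkGetAsync(userApplicationId);
_logger.LogInformation("Uplink- Formatter load took {Duration}mSec", stopwatch.ElapsedMilliseconds);

stopwatch.Restart();

// Message upload to the Azure IoT Hub
await deviceClient.SendEventAsync(ioTHubMessage);
_logger.LogInformation("Uplink- SendEventAsync took {Duration}mSec", stopwatch.ElapsedMilliseconds);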

After much experimentation it looks like “synchronously” calling the payload processing code from the Uplink controller is not a viable approach, as the Swarm Space Bumblebee Hive calls regularly time out, resulting in duplicate messages.