TTI V3 Gateway provisioning Dragino LHT65 Uplink

This very long post is about how to connect a Dragino LHT65 Temperature and Humidity sensor to Azure IoT Central using my TTI/TTN V3 Azure IoT Connector and the Digital Twin Definition Language (DTDL).

Dragino LHT65 temperature and Humidity sensor

The first step was to add an application(dragino-lht65) in my The Things Industries(TTI) tenant

TTI/TTN application for my Dragino LHT65 devices
Adding devMobile as a collaborator on the new application
TTI Application API Key configuration

The new Application API Key used by the MQTTnet managed client only needs to have write downlink and read uplink traffic enabled.

FTDI Adapter and modified LHT65 cable

So that I could reliably connect to my LHT65 devices to configure them, I modified a programming cable so I could use it with a spare FTDI adaptor without jumper wires. To do this I used a small jeweller's screwdriver to “pop” out the VCC cable and move the transmit data line.

After entering the device password and checking the firmware version I used the AT+CFG command to display the device settings

AT+CFG: Print all configurations

[334428]***** UpLinkCounter= 0 *****
[334430]TX on freq 923200000 Hz at DR 2
[334804]txDone
[339807]RX on freq 923200000 Hz at DR 2
[339868]rxTimeOut
[340807]RX on freq 923200000 Hz at DR 2
[340868]rxTimeOut

Correct Password

Stop Tx events,Please wait for all configurations to print
Printf all config...
AT+DEUI=a8 .. .. .. .. .. .. d6
AT+DADDR=01......D6

AT+APPKEY=9d .. .. .. .. .. .. .. .. .. .. .. .. .. .. 2e
AT+NWKSKEY=f6 .. .. .. .. .. .. .. .. .. .. .. .. .. .. 69
AT+APPSKEY=4c 35 .. .. .. .. .. .. .. .. .. .. .. .. .. 3d
AT+APPEUI=a0 .. .. .. .. .. .. 00
AT+ADR=1
AT+TXP=0
AT+DR=0
AT+DCS=0
AT+PNM=1
AT+RX2FQ=923200000
AT+RX2DR=2
AT+RX1DL=1000
AT+RX2DL=2000
AT+JN1DL=5000
AT+JN2DL=6000
AT+NJM=1
AT+NWKID=00 00 00 00
AT+FCU=0
AT+FCD=0
AT+CLASS=A
AT+NJS=0
AT+RECVB=0:
AT+RECV=0:
AT+VER=v1.7 AS923

AT+CFM=0
AT+CFS=0
AT+SNR=0
AT+RSSI=0
AT+TDC=1200000
AT+PORT=2
AT+PWORD=123456
AT+CHS=0
AT+DATE=21/3/26 07:49:15
AT+SLEEP=0
AT+EXT=4,2
AT+RTP=20
AT+BAT=3120
AT+WMOD=0
AT+ARTEMP=-40,125
AT+CITEMP=1
Start Tx events OK


[399287]***** UpLinkCounter= 0 *****

[399289]TX on freq 923400000 Hz at DR 2

[399663]txDone

[404666]RX on freq 923400000 Hz at DR 2

[404726]rxTimeOut

[405666]RX on freq 923200000 Hz at DR 2

[405726]rxTimeOut

I copied the AppEUI and DevEUI for use on the Dragino LHT65 "Register end device" form provided by TTI/TTN.

TTI/TTN Dragino LHT65 Register end device

The Dragino LHT65 uses the DeviceEUI as the DeviceID which meant I had to do more redaction in my TTI/TTN and Azure Application Insights screen captures. The rules around the re-use of EndDevice IDs were a pain in the arse (PITA) in my development-focused tenant.

Dragino LHT65 Device uplink payload formatter

The connector supports both uplink and downlink messages with JSON encoded payloads. The Dragino LHT65 has a vendor supplied formatter which is automatically configured when an EndDevice is created. The EndDevice formatter configuration can also be overridden at the Application level in the app.settings.json file.

Device Live Data Uplink Data Payload

Once an EndDevice is configured in TTI/TTN I usually use the “Live data Uplink Payload” to work out the decoded payload JSON property names and data types.
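As an illustration, a deserialisation class for the LHT65 decoded payload might look something like this sketch; the JSON property names below are assumptions and need to be checked against the Live data view before use.

using Newtonsoft.Json;

// Hypothetical DTO matching the decoded_payload fields I would expect from the vendor
// formatter - the property names are assumptions and must be verified against the
// TTI/TTN "Live data" uplink payload view.
public class Lht65UplinkPayload
{
   [JsonProperty("BatV")]
   public double BatteryVoltage { get; set; }

   [JsonProperty("TempC_SHT")]
   public double TemperatureInternal { get; set; }

   [JsonProperty("Hum_SHT")]
   public double Humidity { get; set; }

   [JsonProperty("TempC_DS")]
   public double TemperatureExternalProbe { get; set; }
}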

LHT65 Uplink only Azure IoT Central Device Template
LHT65 Device Template View Identity

For Azure IoT Central “automagic” provisioning the DTDLModelId has to be copied from the Azure IoT Central Template into the TTI/TTN EndDevice or app.settings.json file application configuration.

LHT65 Device Template copy DTDL @ID
TTI EndDevice configuring the DTDLV2 @ID at the device level

Configuring the DTDLV2 @ID at the TTI application level in the app.settings.json file

{
  "Logging": {
    "LogLevel": {
      "Default": "Debug",
      "Microsoft": "Debug",
      "Microsoft.Hosting.Lifetime": "Debug"
    },
    "ApplicationInsights": {
      "LogLevel": {
        "Default": "Debug"
      }
    }
  },

  "ProgramSettings": {
    "Applications": {
      "application1": {
        "AzureSettings": {
          "DeviceProvisioningServiceSettings": {
            "IdScope": "0ne...DD9",
            "GroupEnrollmentKey": "eFR...w=="
          }
        },
        "DTDLModelId": "dtmi:ttnv3connectorclient:FezduinoWisnodeV14x8;4",
        "MQTTAccessKey": "NNSXS.HCY...RYQ",
        "DeviceIntegrationDefault": false,
        "MethodSettings": {
          "Reboot": {
            "Port": 21,
            "Confirmed": true,
            "Priority": "normal",
            "Queue": "push"
          },
          "value_0": {
            "Port": 30,
            "Confirmed": true,
            "Priority": "normal",
            "Queue": "push"
          },
          "value_1": {
            "Port": 30,
            "Confirmed": true,
            "Priority": "normal",
            "Queue": "push"
          },
          "TemperatureOOBAlertMinimumAndMaximum": {
            "Port": 30,
            "Confirmed": true,
            "Priority": "normal",
            "Queue": "push"
          }
        }
      },
      "seeeduinolorawan": {
        "AzureSettings": {
          "DeviceProvisioningServiceSettings": {
            "IdScope": "0ne...DD9",
            "GroupEnrollmentKey": "AtN...g=="
          }
        },
        "DTDLModelId": "dtmi:ttnv3connectorclient:SeeeduinoLoRaWAN4cz;1",
        "MQTTAccessKey": "NNSXS.V44...42A",
        "DeviceIntegrationDefault": true,
        "DevicePageSize": 10
      },
      "dragino-lht65": {
        "AzureSettings": {
          "DeviceProvisioningServiceSettings": {
            "IdScope": "0ne...DD9",
            "GroupEnrollmentKey": "SLB...w=="
          }
        },
        "DTDLModelId": "dtmi:ttnv3connectorclient:DraginoLHT656w6;1",
        "MQTTAccessKey": "NNSXS.RIJ...NZQ",
        "DeviceIntegrationDefault": true,
        "DevicePageSize": 10
      }
    },
    "TheThingsIndustries": {
      "MqttServerName": "eu1.cloud.thethings.industries",
      "MqttClientId": "MQTTClient",
      "MqttAutoReconnectDelay": "00:00:05",
      "Tenant": "...-test",
      "ApiBaseUrl": "https://...-test.eu1.cloud.thethings.industries/api/v3",
      "ApiKey": "NNSXS.NR7...ZSA",
      "Collaborator": "devmobile",
      "DevicePageSize": 10,
      "DeviceIntegrationDefault": true
    }
  }
}

The Azure Device Provisioning Service(DPS) is configured at the TTI application level in the app.settings.json file. The IDScope and one of the Primary or Secondary Shared Access Signature(SAS) keys should be copied into DeviceProvisioningServiceSettings of an Application in the app.settings.json file. I usually set the “Automatically connect devices in this group” flag as part of the “automagic” provisioning process.

Azure IoT Central Group Enrollment Key
The device templates then need to be mapped to an Enrollment Group and then to a Device Group.
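Behind the scenes the group enrollment works because DPS derives a per-device symmetric key from the group enrollment key and the registration ID. A minimal sketch of that derivation (not code from the connector) looks like this:

using System;
using System.Security.Cryptography;
using System.Text;

// Derive a per-device symmetric key from the DPS enrollment group key.
// deviceKey = Base64(HMAC-SHA256(Base64Decode(groupKey), UTF8(registrationId)))
static string DeriveDeviceKey(string groupEnrollmentKey, string registrationId)
{
   using (var hmac = new HMACSHA256(Convert.FromBase64String(groupEnrollmentKey)))
   {
      return Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(registrationId)));
   }
}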

For testing the connector application can be run locally with diagnostic information displayed in the application console window as it “automagically” provisions devices and uploads telemetry data.

Connector application Diagnostics
Azure IoT Central Device list before my LHT65 device is “automagically” provisioned
Azure IoT Central Device list after my LHT65 device is “automagically” provisioned

Once a device has been provisioned I check on the raw data display that all the fields I configured have been mapped correctly.

Azure IoT Central raw data display

I then created a dashboard to display the telemetry data from the LHT65 sensors.

Azure IoT Central dashboard displaying LHT65 temperature, humidity and battery voltage graphs.

The dashboard also has a few KPI displays which highlighted an issue which occurs a couple of times a month with the LHT65 onboard temperature sensor values (327.7°). I have contacted Dragino technical support and have also been unable to find a way to remove the current and/or filter out future aberrant values.
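The 327.7° value looks suspiciously like an error sentinel (Int16.MaxValue divided by the 100 multiplier is 327.67), so as a stopgap a range check like this sketch could be added to the connector or a downstream processor to drop the aberrant readings. The limits come from the AT+ARTEMP=-40,125 setting shown above; the helper name is hypothetical.

// Drop readings outside the sensor's stated operating range (AT+ARTEMP=-40,125)
// before forwarding telemetry; 327.67 is Int16.MaxValue / 100 which suggests an
// error sentinel rather than a real measurement.
static bool IsPlausibleTemperature(double celsius)
{
   const double minimum = -40.0;
   const double maximum = 125.0;

   return celsius >= minimum && celsius <= maximum;
}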

Azure Application Insights logging

I also noticed, after trying to search for one of my Seeedstudio LoRaWAN devices by its DeviceEUI, that the formatting of the DeviceEUI values in the Application Insights logging was incorrect.
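By consistent formatting I mean something like this sketch (not the connector's actual logging code), which produces a plain hex string without separators that can be pasted straight into a log search.

using System;

byte[] devEui = { 0xA8, 0x40, 0x41, 0x00, 0x01, 0x02, 0x03, 0xD6 }; // illustrative bytes only

// BitConverter.ToString produces "A8-40-41-..." so the separators are stripped to
// match the DevEUI hex string shown in the TTI/TTN console.
string devEuiText = BitConverter.ToString(devEui).Replace("-", "");
Console.WriteLine(devEuiText);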

The Things Network Cayenne LPP Support

Uplink Encoding

In my applications the myDevices Cayenne Low Power Payload(LPP) uplink messages from my *duino devices are decoded by the built-in The Things Network(TTN) decoder. I can also see the nicely formatted values in the device data view.

Downlink Encoding

I could successfully downlink raw data to the device but I found that manually unpacking it on the device was painful.

Raw data

I really wanted to send LPP formatted messages to my devices so I could use a standard LPP library. I initially populated the payload fields in the downlink message JSON. The TTN documentation appeared to indicate this was possible.

Downlink JSON payload format

Initially I tried a more complex data type because I was looking at sending a location down to the device.

Complex data type

I could see nicely formatted values in the device data view but they didn’t arrive at the device. I then tried a simpler data type to see if the complex data type was the issue.

Simple Data Types

At this point I asked a few questions on the TTN forums and started to dig into the TTN source code.

Learning Go on demand

I had a look at the TTN Go code and learnt a lot as I figured out how the “baked in” encoder/decoder worked. I hadn’t done any Go coding before so it took a while to get comfortable with the syntax. The code may look a bit odd as a Pascal formatter was the closest I could get to Go.

In core/handler/cayennelpp there were

func (e *Encoder) Encode(fields map[string]interface{}, fPort uint8) ([]byte, bool, error)

func (d *Decoder) Decode(payload []byte, fPort uint8) (map[string]interface{}, bool, error)

Which was a positive sign…

Then in core/handler/convert_fields.go there are these two functions

> // ConvertFieldsUp converts the payload to fields using the application's payload formatter
> func (h *handler) ConvertFieldsUp(ctx ttnlog.Interface, _ *pb_broker.DeduplicatedUplinkMessage, appUp *types.UplinkMessage, dev *device.Device) error {
> 	// Find Application

and

> // ConvertFieldsDown converts the fields into a payload
> func (h *handler) ConvertFieldsDown(ctx ttnlog.Interface, appDown *types.DownlinkMessage, ttnDown *pb_broker.DownlinkMessage, _ *device.Device) error {

Then further down in the second function is this call

var encoder PayloadEncoder
	switch app.PayloadFormat {
	case application.PayloadFormatCustom:
		encoder = &CustomDownlinkFunctions{
			Encoder: app.CustomEncoder,
			Logger:  functions.Ignore,
		}
	case application.PayloadFormatCayenneLPP:
		encoder = &cayennelpp.Encoder{}
	default:
		return nil
	}

Which I think calls

// Encode encodes the fields to CayenneLPP
func (e *Encoder) Encode(fields map[string]interface{}, fPort uint8) ([]byte, bool, error) {
	encoder := protocol.NewEncoder()
	for name, value := range fields {
		key, channel, err := parseName(name)
		if err != nil {
			continue
		}
		switch key {
		case valueKey:
			if val, ok := value.(float64); ok {
				encoder.AddPort(channel, float32(val))
			}
		}
	}
	return encoder.Bytes(), true, nil
}

Then right down at the very bottom of the call stack in keys.go

func parseName(name string) (string, uint8, error) {
	parts := strings.Split(name, "_")
	if len(parts) < 2 {
		return "", 0, errors.New("Invalid name")
	}
	key := strings.Join(parts[:len(parts)-1], "_")
	if key == "" {
		return "", 0, errors.New("Invalid key")
	}
	channel, err := strconv.Atoi(parts[len(parts)-1])
	if err != nil {
		return "", 0, err
	}
	if channel < 0 || channel > 255 {
		return "", 0, errors.New("Invalid range")
	}
	return key, uint8(channel), nil
}

At this point I started to hit the limits of my Go skills but with some trial and error I figured it out…

Executive Summary

The downlink payload values are sent as 2-byte signed values with a 100 multiplier. The fields have to be named “value_X” where X is a byte value.

Dictionary<string, object> payloadFields = new Dictionary<string, object>();
payloadFields.Add("value_0", 0.0);
//00-00-00
payloadFields.Add("value_1", 1.0);
//01-00-64
payloadFields.Add("value_2", 2.0);
//02-00-C8
payloadFields.Add("value_3", 3.0);
//03-01-2C
payloadFields.Add("value_4", 4.0);
//04-01-90

// Separate run for the negative values
payloadFields = new Dictionary<string, object>();
payloadFields.Add("value_0", -0.0);
//00-00-00
payloadFields.Add("value_1", -1.0);
//01-FF-9C
payloadFields.Add("value_2", -2.0);
//02-FF-38
payloadFields.Add("value_3", -3.0);
//03-FE-D4
payloadFields.Add("value_4", -4.0);
//04-FE-70

I could see these arrive on my TinyCLR plus RAK811 device and could manually unpack them.
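Unpacking them is straightforward once the layout is clear: each field is the channel byte (the X in value_X) followed by a big-endian Int16 of the value multiplied by 100. This C# sketch is my reading of the Go encoder above rather than an official format description; the helper names are mine.

// Pack a "value_X" field the way the TTN Cayenne LPP downlink encoder appears to:
// one channel byte followed by a big-endian Int16 of the value scaled by 100.
static byte[] EncodeValueField(byte channel, float value)
{
   short scaled = (short)Math.Round(value * 100.0f);

   return new byte[] { channel, (byte)(scaled >> 8), (byte)(scaled & 0xFF) };
}

// Reverse the packing on the device side.
static float DecodeValueField(byte[] payload, out byte channel)
{
   channel = payload[0];
   short scaled = (short)((payload[1] << 8) | payload[2]);

   return scaled / 100.0f;
}

// e.g. EncodeValueField(2, -2.0f) returns 02-FF-38, matching the byte dump above.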

The stream of bytes can be decoded on an Arduino using the Electronic Cats library (needs a small modification) with code like this

byte data[] = {0xff,0x38} ; // bytes which represent -2 
float value = lpp.getValue( data, 2, 100, 1);
Serial.print("value:");
Serial.println(value);

It is possible to use the “baked in” Cayenne Encoder/Decoder to send payload fields to a device but I’m not certain this is quite what myDevices/TTN intended.

The Things Network HTTP Integration Part13

Connection multiplexing

For the Proof of Concept(PoC) I had used a cache to store Azure IoT Hub connections to reduce the number of calls to the Device Provisioning Service(DPS).

Number of connections with no pooling

When stress testing with thousands of devices my program hit the host connection limit, so I enabled Advanced Message Queuing Protocol(AMQP) connection pooling.

return DeviceClient.Create(result.AssignedHub,
                  authentication,
                  new ITransportSettings[]
                  {
                     new AmqpTransportSettings(TransportType.Amqp_Tcp_Only)
                     {
                        PrefetchCount = 0,
                        AmqpConnectionPoolSettings = new AmqpConnectionPoolSettings()
                        {
                           Pooling = true,
                        }
                     }
                  }
               );

My first attempt failed as I hadn’t configured “TransportType.Amqp_Tcp_Only”, so the AMQP implementation could fall back to other protocols which don’t support pooling.

Exception caused by not using TransportType.Amqp_Tcp_Only

I then deployed the updated code and ran my 1000 device stress test (note the different x axis scales)

Number of connections with pooling

This confirmed what I found in the Azure.AMQP source code

/// <summary>
/// The default size of the pool
/// </summary>
/// <remarks>
/// Allows up to 100,000 devices
/// </remarks>
private const uint DefaultPoolSize = 100;
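If the default of 100 pooled connections (the “up to 100,000 devices” in the comment above) ever became a limit, AmqpConnectionPoolSettings also exposes a MaxPoolSize property. A sketch of raising it, which I haven’t needed yet (the value 200 is purely illustrative):

return DeviceClient.Create(result.AssignedHub,
   authentication,
   new ITransportSettings[]
   {
      new AmqpTransportSettings(TransportType.Amqp_Tcp_Only)
      {
         PrefetchCount = 0,
         AmqpConnectionPoolSettings = new AmqpConnectionPoolSettings()
         {
            Pooling = true,
            // Default is 100 pooled connections; only raise this if device numbers demand it.
            MaxPoolSize = 200,
         }
      }
   });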

The Things Network HTTP Integration Part12

Removing the DIY cache

For the Proof of Concept(PoC) I had written a simple cache using a ConcurrentDictionary to store Azure IoT Hub connections to reduce the number of calls to the Device Provisioning Service(DPS).

Device Provisioning Service calls in stress test

For a PoC the DIY cache was ok but I wanted to replace it with something more robust like the .Net ObjectCache which is in the System.Runtime.Caching namespace.

I started by replacing the ConcurrentDictionary declaration

static readonly ConcurrentDictionary<string, DeviceClient> DeviceClients = new ConcurrentDictionary<string, DeviceClient>();
     

With an ObjectCache declaration.

static readonly ObjectCache DeviceClients = MemoryCache.Default;
  

Then, where there were compiler errors I updated the method call.

// See if the device has already been provisioned or is being provisioned on another thread.
if (DeviceClients.Add(registrationId, deviceContext, cacheItemPolicy))
{
   log.LogInformation("RegID:{registrationId} Device provisioning start", registrationId);
...

One difference I found was that ObjectCache throws an exception if the value is null. I was using a null value to indicate that the Device Provisioning Service(DPS) process had been initiated on another thread and was underway.

I have been planning to add support for downlink messages so I added a new class to store the uplink (Azure IoT Hub DeviceClient) and downlink ( downlink_url in the uplink message) details.

public class DeviceContext
{
   public DeviceClient Uplink { get; set; }
   public Uri Downlink { get; set; }
}

For the first version the only functionality I’m using is sliding expiration which is set to one day

CacheItemPolicy cacheItemPolicy = new CacheItemPolicy()
{
   SlidingExpiration = new TimeSpan(1, 0, 0, 0),
   //RemovedCallback
};

DeviceContext deviceContext = new DeviceContext()
{
   Uplink = null,
   Downlink = new Uri(payload.DownlinkUrl)
};
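The commented-out RemovedCallback is where I intend to tidy up the Azure IoT Hub connection when an entry expires. A sketch of what that might look like (not yet in the deployed code):

CacheItemPolicy cacheItemPolicy = new CacheItemPolicy()
{
   SlidingExpiration = new TimeSpan(1, 0, 0, 0),
   RemovedCallback = (CacheEntryRemovedArguments arguments) =>
   {
      // Dispose of the Azure IoT Hub connection when the cache entry is evicted so
      // connections for devices which have gone quiet don't leak.
      if (arguments.CacheItem.Value is DeviceContext removedContext && removedContext.Uplink != null)
      {
         removedContext.Uplink.Dispose();
      }
   }
};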

I didn’t have to make many changes and I’ll double check my implementation in the next round of stress and soak testing.

The Things Network HTTP Integration Part11

Moving Secrets to KeyVault

The application configuration file contained sensitive information like Device Provisioning Service(DPS) Group Enrollment Symmetric Keys and Azure IoT Hub connection strings, which is OK for a proof of concept (PoC) but sub-optimal for production deployments.

"DeviceProvisioningService": {
      "GlobalDeviceEndpoint": "global.azure-devices-provisioning.net",
      "ScopeID": "",
      "EnrollmentGroupSymmetricKeyDefault": "TopSecretKey",
      "DeviceProvisioningPollingDelay": 500,
      "ApplicationEnrollmentGroupMapping": {
         "Application1": "TopSecretKey1",
         "Application2": "TopSecretKey2"
      }
   }

The Azure Key Vault is intended for securing sensitive information like connection strings so I added one to my resource group.

Azure Key Vault overview and basic metrics

I wrote a wrapper which resolves configuration settings based on The Things Network(TTN) application identifier and port information in the uplink message payload. The resolve methods start by looking for configuration for the applicationId and port (separated by a "-"), then the applicationId, and then finally falling back to a default value. This functionality is used for AzureIoTHub connection strings, DPS IDScopes, and DPS Enrollment Group Symmetric Keys, and is also used to format the cache keys.

public class ApplicationConfiguration
{
const string DpsGlobaDeviceEndpointDefault = "global.azure-devices-provisioning.net";

private IConfiguration Configuration;

public void Initialise( )
{
   // Check that KeyVault URI is configured in environment variables. Not a lot we can do if it isn't....
   if (Configuration == null)
   {
      string keyVaultUri = Environment.GetEnvironmentVariable("KeyVaultURI");
      if (string.IsNullOrWhiteSpace(keyVaultUri))
      {
         throw new ApplicationException("KeyVaultURI environment variable not set");
      }

      // Load configuration from KeyVault 
      Configuration = new ConfigurationBuilder()
         .AddEnvironmentVariables()
         .AddAzureKeyVault(keyVaultUri)
         .Build();
   }
}

public string DpsGlobaDeviceEndpointResolve()
{
   string globaDeviceEndpoint = Configuration.GetSection("DPSGlobaDeviceEndpoint").Value;
   if (string.IsNullOrWhiteSpace(globaDeviceEndpoint))
   {
      globaDeviceEndpoint = DpsGlobaDeviceEndpointDefault;
   }

   return globaDeviceEndpoint;
}

public string ConnectionStringResolve(string applicationId, int port)
{
   // Check to see if there is application + port specific configuration
   string connectionString = Configuration.GetSection($"AzureIotHubConnectionString-{applicationId}-{port}").Value;
   if (!string.IsNullOrWhiteSpace(connectionString))
   {
      return connectionString;
   }

   // Check to see if there is application specific configuration, otherwise run with default
   connectionString = Configuration.GetSection($"AzureIotHubConnectionString-{applicationId}").Value;
   if (!string.IsNullOrWhiteSpace(connectionString))
   {
      return connectionString;
   }

   // get the default as not a specialised configuration
   connectionString = Configuration.GetSection("AzureIotHubConnectionStringDefault").Value;

   return connectionString;
}

public string DpsIdScopeResolve(string applicationId, int port)
{
   // Check to see if there is application + port specific configuration
   string idScope = Configuration.GetSection($"DPSIDScope-{applicationId}-{port}").Value;
   if (!string.IsNullOrWhiteSpace(idScope))
   {
      return idScope;
   }

   // Check to see if there is application specific configuration, otherwise run with default
   idScope = Configuration.GetSection($"DPSIDScope-{applicationId}").Value;
   if (!string.IsNullOrWhiteSpace(idScope))
   {
      return idScope;
   }

   // get the default as not a specialised configuration
   idScope = Configuration.GetSection("DPSIDScopeDefault").Value;

   if (string.IsNullOrWhiteSpace(idScope))
   {
      throw new ApplicationException($"DPSIDScope configuration invalid");
   }

   return idScope;
}
}

The values of Azure function configuration settings are replaced by a reference to the secret in the Azure Key Vault.

Azure Function configuration value replacement

In the Azure Key Vault “Access Policies” I configured an “Application Access Policy” so my Azure TTNAzureIoTHubMessageV2Processor function identity could retrieve secrets.

Azure Key Vault Secrets

I kept on making typos in the secret names and types which was frustrating.

Azure Key Vault secret

While debugging in Visual Studio you may need to configure the Azure Identity so the application can access the Azure Key Vault.
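With the newer Azure.Extensions.AspNetCore.Configuration.Secrets and Azure.Identity packages, the equivalent would be something like this sketch (an assumption, not the code in my function); DefaultAzureCredential will try environment variables, managed identity and then the account Visual Studio is signed in with.

// Sketch only: build configuration from Key Vault using the newer Azure.Identity
// credential chain instead of the older AddAzureKeyVault(string) overload used above.
Configuration = new ConfigurationBuilder()
   .AddEnvironmentVariables()
   .AddAzureKeyVault(new Uri(keyVaultUri), new DefaultAzureCredential())
   .Build();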

Azure IoT Hub MQTT/AMQP oddness

This is a long post which covers some oddness I noticed when changing the protocol used by an Azure IoT Hub client from Message Queuing Telemetry Transport(MQTT) to Advanced Message Queuing Protocol (AMQP). I wanted to build a console application to test the pooling of AMQP connections so I started with an MQTT client written for another post.

class Program
{
   private static string payload;

   static async Task Main(string[] args)
   {
      string filename;
      string azureIoTHubconnectionString;
      DeviceClient azureIoTHubClient;

      if (args.Length != 2)
      {
         Console.WriteLine("[JOSN file] [AzureIoTHubConnectionString]");
         Console.WriteLine("Press <enter> to exit");
         Console.ReadLine();
         return;
      }

      filename = args[0];
      azureIoTHubconnectionString = args[1];

      try
      {
         payload = File.ReadAllText(filename);

         // Open up the connection
         azureIoTHubClient = DeviceClient.CreateFromConnectionString(azureIoTHubconnectionString, TransportType.Mqtt);
         //azureIoTHubClient = DeviceClient.CreateFromConnectionString(azureIoTHubconnectionString, TransportType.Mqtt_Tcp_Only);
         //azureIoTHubClient = DeviceClient.CreateFromConnectionString(azureIoTHubconnectionString, TransportType.Mqtt_WebSocket_Only);

         await azureIoTHubClient.OpenAsync();

         await azureIoTHubClient.SetMethodDefaultHandlerAsync(MethodCallbackDefault, null);

         Timer MessageSender = new Timer(TimerCallback, azureIoTHubClient, new TimeSpan(0, 0, 10), new TimeSpan(0, 0, 10));


         Console.WriteLine("Press <enter> to exit");
         Console.ReadLine();
      }
      catch (Exception ex)
      {
         Console.WriteLine(ex.Message);
         Console.WriteLine("Press <enter> to exit");
         Console.ReadLine();
      }
   }

   public static async void TimerCallback(object state)
   {
      DeviceClient azureIoTHubClient = (DeviceClient)state;

      try
      {
         // I know having the payload as a global is a bit nasty but this is a demo..
         using (Message message = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(payload))))
         {
            Console.WriteLine(" {0:HH:mm:ss} AzureIoTHubDeviceClient SendEventAsync start", DateTime.UtcNow);
            await azureIoTHubClient.SendEventAsync(message);
            Console.WriteLine(" {0:HH:mm:ss} AzureIoTHubDeviceClient SendEventAsync finish", DateTime.UtcNow);
         }
      }
      catch (Exception ex)
      {
         Console.WriteLine(ex.Message);
      }
   }

   private static async Task<MethodResponse> MethodCallbackDefault(MethodRequest methodRequest, object userContext)
   {
      Console.WriteLine($"Default handler method {methodRequest.Name} was called.");

      return new MethodResponse(200);
   }
}

I configured an Azure IoT Hub then used Azure IoT Explorer to create a device and get the connection string for my application. After fixing up the application’s command line parameters I could see the timer code was successfully sending telemetry messages to my Azure IoT Hub. I also explored the different MQTT connection options TransportType.Mqtt, TransportType.Mqtt_Tcp_Only, and TransportType.Mqtt_WebSocket_Only which worked as expected.

MQTT Console application displaying sent telemetry
Azure IoT Hub displaying received telemetry

I could also initiate Direct Method calls to my console application from Azure IoT explorer.

Azure IoT Explorer initiating a Direct Method
MQTT console application displaying direct method call.

I then changed the protocol to AMQP

class Program
{
   private static string payload;

   static async Task Main(string[] args)
   {
      string filename;
      string azureIoTHubconnectionString;
      DeviceClient azureIoTHubClient;
      Timer MessageSender;

      if (args.Length != 2)
      {
         Console.WriteLine("[JOSN file] [AzureIoTHubConnectionString]");
         Console.WriteLine("Press <enter> to exit");
         Console.ReadLine();
         return;
      }

      filename = args[0];
      azureIoTHubconnectionString = args[1];

      try
      {
         payload = File.ReadAllText(filename);

         // Open up the connection
         azureIoTHubClient = DeviceClient.CreateFromConnectionString(azureIoTHubconnectionString, TransportType.Amqp);
         //azureIoTHubClient = DeviceClient.CreateFromConnectionString(azureIoTHubconnectionString, TransportType.Amqp_Tcp_Only);
         //azureIoTHubClient = DeviceClient.CreateFromConnectionString(azureIoTHubconnectionString, TransportType.Amqp_WebSocket_Only);

         await azureIoTHubClient.OpenAsync();

         await azureIoTHubClient.SetMethodDefaultHandlerAsync(MethodCallbackDefault, null);

         //MessageSender = new Timer(TimerCallbackAsync, azureIoTHubClient, new TimeSpan(0, 0, 10), new TimeSpan(0, 0, 10));
         MessageSender = new Timer(TimerCallbackSync, azureIoTHubClient, new TimeSpan(0, 0, 10), new TimeSpan(0, 0, 10));

#if MESSAGE_PUMP
         Console.WriteLine("Press any key to exit");
         while (!Console.KeyAvailable)
         {
            await Task.Delay(100);
         }
#else
         Console.WriteLine("Press <enter> to exit");
         Console.ReadLine();
#endif
      }
      catch (Exception ex)
      {
         Console.WriteLine(ex.Message);
         Console.WriteLine("Press <enter> to exit");
         Console.ReadLine();
      }
   }

   public static async void TimerCallbackAsync(object state)
   {
      DeviceClient azureIoTHubClient = (DeviceClient)state;

      try
      {
         // I know having the payload as a global is a bit nasty but this is a demo..
         using (Message message = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(payload))))
         {
            Console.WriteLine(" {0:HH:mm:ss} AzureIoTHubDeviceClient SendEventAsync start", DateTime.UtcNow);
            await azureIoTHubClient.SendEventAsync(message);
            Console.WriteLine(" {0:HH:mm:ss} AzureIoTHubDeviceClient SendEventAsync finish", DateTime.UtcNow);
         }
      }
      catch (Exception ex)
      {
         Console.WriteLine(ex.Message);
      }
   }

   public static void TimerCallbackSync(object state)
   {
      DeviceClient azureIoTHubClient = (DeviceClient)state;

      try
      {
         // I know having the payload as a global is a bit nasty but this is a demo..
         using (Message message = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(payload))))
         {
            Console.WriteLine(" {0:HH:mm:ss} AzureIoTHubDeviceClient SendEventAsync start", DateTime.UtcNow);
            azureIoTHubClient.SendEventAsync(message).GetAwaiter();
            Console.WriteLine(" {0:HH:mm:ss} AzureIoTHubDeviceClient SendEventAsync finish", DateTime.UtcNow);
         }
      }
      catch (Exception ex)
      {
         Console.WriteLine(ex.Message);
      }
   }


   private static async Task<MethodResponse> MethodCallbackDefault(MethodRequest methodRequest, object userContext)
   {
      Console.WriteLine($"Default handler method {methodRequest.Name} was called.");

      return new MethodResponse(200);
   }
}

In the first version of my console application I could see the SendEventAsync method was getting called but was not returning

AMQP Console application displaying sent telemetry failure

Even though the SendEventAsync call was not returning the telemetry messages were making it to my Azure IoT Hub.

Azure IoT Hub displaying AMQP telemetry

When I tried to initiate a Direct Method call from Azure IoT Explorer it failed after a while with a timeout.

Azure IoT Explorer initiating a Direct Method

The first successful approach I tried was to change the Console.ReadLine to a “message pump” (flashbacks to Win32 API programming).

Console.WriteLine("Press any key to exit");
while (!Console.KeyAvailable)
{
   await Task.Delay(100);
}

After some more experimentation I found that changing the timer method from asynchronous to synchronous also worked.

public static void TimerCallbackSync(object state)
{
   DeviceClient azureIoTHubClient = (DeviceClient)state;

   try
   {
      // I know having the payload as a global is a bit nasty but this is a demo..
      using (Message message = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(payload))))
      {
         Console.WriteLine(" {0:HH:mm:ss} AzureIoTHubDeviceClient SendEventAsync start", DateTime.UtcNow);
         azureIoTHubClient.SendEventAsync(message).GetAwaiter();
         Console.WriteLine(" {0:HH:mm:ss} AzureIoTHubDeviceClient SendEventAsync finish", DateTime.UtcNow);
      }
   }
   catch (Exception ex)
   {
      Console.WriteLine(ex.Message);
   }
}

I also had to change the method declaration and modify the SendEventAsync call to use a GetAwaiter.

AMQP Console application displaying sent telemetry
Azure IoT Hub displaying received telemetry
Azure IoT Explorer initiating a Direct Method
MQTT console application displaying direct method call.

It took a while to figure out enough about what was going on so I could do a search with the right keywords (DeviceClient AMQP async await SendEventAsync) to confirm my suspicion that MQTT and AMQP clients did behave differently.

For anyone who reads this post, I think this Github issue about task handling and blocking calls is most probably the answer (October 2020).
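A simple way to sidestep the problem altogether (a sketch of what I would try next rather than what the console application above does) is to drop the Timer and drive the sends from an awaited loop in Main, so every SendEventAsync completion is observed before the next send starts:

// Send telemetry from an awaited loop in the async Main instead of an async void
// Timer callback, so SendEventAsync is genuinely awaited between sends.
while (!Console.KeyAvailable)
{
   using (Message message = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(payload))))
   {
      await azureIoTHubClient.SendEventAsync(message);
   }

   await Task.Delay(new TimeSpan(0, 0, 10));
}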

The Things Network HTTP Azure IoT Integration Soak Testing

I wanted to do some testing to make sure the application would reliably process messages from thousands of devices…

The first thing I learnt was “don’t forget to restart your Azure Function after deleting all the devices from the Azure IoT Hub” as the DeviceClients are cached. Also make sure you delete the devices from both your Azure Device Provisioning service(DPS) and Azure IoT Hub instances.

Applications Insights provisioning event tracking

The next “learning” was that if you forget to enable “always on” the caching won’t work and your application will call the DPS way more often than expected.

Azure Application “always on” configuration

The next “learning” was if your soak test sends 24000 messages it will start to fail just after you go out to get a coffee because of the 8000 msgs/day limit on the free version of IoT Hub.

Azure IoT Hub Free tier 8000 messages/day limit

After these “learnings” the application appeared to be working and every so often a message would briefly appear in Azure Storage Explorer queue view.

Azure storage explorer view of uplink messages queue

The console test application simulated 1000 devices each sending 24 messages every so often and took roughly 8 hours to complete.

Message generator finished

In the Azure IoT Hub telemetry 24000 messages had been received after roughly 8 hours confirming the test rig was working as expected.

The notch was another “learning”: if you go and do some gardening, then after roughly 40 minutes of inactivity your desktop PC will go into power save mode and the test client will stop sending messages.

The caching of settings appeared to be working as there were only a couple of requests to my Azure Key Vault where sensitive information like connection strings, symmetric keys etc. are stored.

Memory consumption didn’t look too bad and topped out at roughly 120MB.

In the application logging you can see the 1000 calls to DPS at the beginning (the yellow dependency events) then the regular processing of messages.

Application Insights logging

Even with the “learnings” the testing went pretty well overall. I do need to run the test rig for longer and with even more simulated devices.

I think this should do

48K Telemetry messages

If you get lots of “Host thresholds exceeded: [Connections]…” errors in the logs you might need to bump your plan to something a bit larger.

The Things Network HTTP Azure IoT Central Integration

This post is an overview of the Azure IoT Central configuration required to process The Things Network(TTN) HTTP integration uplink messages. I have assumed that the reader is already reasonably familiar with these products. There is an overview of configuring TTN HTTP integration in my “Simplicating and securing the HTTP handler” post.

The first step is to copy the IDScope from the Device connection blade.

Device connection blade

Then copy one of the primary or secondary keys

For more complex deployments the ApplicationEnrollmentGroupMapping configuration enables The Things Network(TTN) devices to be provisioned using different GroupEnrollment keys based on the applicationid in the Uplink message which initiates their provisioning.

"DeviceProvisioningService": {
      "GlobalDeviceEndpoint": "global.azure-devices-provisioning.net",
      "IDScope": "",
      "EnrollmentGroupSymmetricKeyDefault": "TopSecretKey",
      "DeviceProvisioningPollingDelay": 500,
      "ApplicationEnrollmentGroupMapping": {
         "Application1": "TopSecretKey1",
         "Application2": "TopSecretKey2"
      }
   }
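Resolving the key is then just a dictionary lookup with a fallback to the default; a sketch of the intent rather than the exact function code (the helper name is mine):

// Pick the GroupEnrollment key for the TTN application ID, falling back to the default.
static string ResolveEnrollmentGroupKey(IDictionary<string, string> applicationEnrollmentGroupMapping, string applicationId, string enrollmentGroupSymmetricKeyDefault)
{
   if ((applicationEnrollmentGroupMapping != null) && applicationEnrollmentGroupMapping.TryGetValue(applicationId, out string enrollmentGroupSymmetricKey))
   {
      return enrollmentGroupSymmetricKey;
   }

   return enrollmentGroupSymmetricKeyDefault;
}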

Shortly after the first uplink message from a TTN device is processed, it will be listed in the “Unassociated devices” blade with the DevEUI as the Device ID.

Unassociated devices blade

The device can then be associated with an Azure IoT Central Device Template.

Unassociated devices blade showing recently associated device

The device template provides for the mapping of uplink message payload_fields to measurements. In this example the payload field has been generated by the TTN HTTP integration Cayenne Low Power Protocol(LPP) decoder. Many LoRaWAN devices use LPP to minimise the size of the network payload.

Azure IoT Central Device template blade

Once the device has been associated with a template a user friendly device name etc. can be configured.

Azure IoT Central Device properties blade

In the telemetry event payload sent to Azure IoT Central there are some extra fields to help with debugging and tracing.

// Assemble the JSON payload to send to Azure IoT Hub/Central.
log.LogInformation($"{messagePrefix} Payload assembly start");
JObject telemetryEvent = new JObject();
try
{
   JObject payloadFields = (JObject)payloadObect.payload_fields;
   telemetryEvent.Add("HardwareSerial", payloadObect.hardware_serial);
   telemetryEvent.Add("Retry", payloadObect.is_retry);
   telemetryEvent.Add("Counter", payloadObect.counter);
   telemetryEvent.Add("DeviceID", payloadObect.dev_id);
   telemetryEvent.Add("ApplicationID", payloadObect.app_id);
   telemetryEvent.Add("Port", payloadObect.port);
   telemetryEvent.Add("PayloadRaw", payloadObect.payload_raw);
   telemetryEvent.Add("ReceivedAtUTC", payloadObect.metadata.time);

   // If the payload has been unpacked in TTN backend add fields to telemetry event payload
   if (payloadFields != null)
   {
      foreach (JProperty child in payloadFields.Children())
      {
         EnumerateChildren(telemetryEvent, child);
      }
   }
}
catch (Exception ex)
{
   log.LogError(ex, $"{messagePrefix} Payload processing or Telemetry event assembly failed");
   throw;
}

Azure IoT Central has mapping functionality which can be used to display the location of a device.

Azure Device

The format of the location payload generated by the TTN LPP decoder is different to the one required by Azure IoT Central. I have added temporary code (“a cost effective modification to expedite deployment” aka. a hack) to format the TelemetryEvent payload so it can be processed.

if (token.First is JValue)
{
   // Temporary dirty hack for Azure IoT Central compatibility
   if (token.Parent is JObject possibleGpsProperty)
   {
      if (possibleGpsProperty.Path.StartsWith("GPS", StringComparison.OrdinalIgnoreCase))
      {
         if (string.Compare(property.Name, "Latitude", true) == 0)
         {
            jobject.Add("lat", property.Value);
         }
         if (string.Compare(property.Name, "Longitude", true) == 0)
         {
            jobject.Add("lon", property.Value);
         }
         if (string.Compare(property.Name, "Altitude", true) == 0)
         {
            jobject.Add("alt", property.Value);
         }
      }
   }
   jobject.Add(property.Name, property.Value);
}

I need to review the IoT Plug and Play specification documentation to see what other payload transformations may be required.

I did observe that if a device had not reported its position the default location was zero degrees latitude and zero degrees longitude which is about 610 KM south of Ghana and 1080 KM west of Gabon in the Atlantic Ocean.

Azure IoT Central mapping default position

After configuring a device template, associating my devices with the template, and modifying each device’s properties I could create a dashboard to view the temperature and humidity information returned by my Seeeduino LoRaWAN devices.

Azure IoT Central dashboard

The Things Network HTTP Azure IoT Hub Integration

This post provides an overview of the required Azure Device Provisioning Service(DPS) and Azure IoT Hub configuration to process The Things Network(TTN) HTTP integration uplink messages. I have assumed that the reader is already familiar with all these products. There is an overview of configuring TTN HTTP integration in my “Simplicating and securing the HTTP handler” post.

The first step is to configure a DPS Enrollment Group

DPS Group Enrollment blade

The scopeID and the primary/secondary key need to be configured in the appsettings.json file of the uplink message processing Azure QueueTrigger function.

For more complex deployments the ApplicationEnrollmentGroupMapping configuration enables The Things Network(TTN) devices to be provisioned using different GroupEnrollment keys based on the applicationid in the first Uplink message which initiates provisioning.

"DeviceProvisioningService": {
      "GlobalDeviceEndpoint": "global.azure-devices-provisioning.net",
      "ScopeID": "",
      "EnrollmentGroupSymmetricKeyDefault": "TopSecretKey",
      "DeviceProvisioningPollingDelay": 500,
      "ApplicationEnrollmentGroupMapping": {
         "Application1": "TopSecretKey1",
         "Application2": "TopSecretKey2"
      }
   }

DPS Group Enrolment with no provisioned devices

Then as uplink messages from the TTN integration are processed devices are “automagically” created in the DPS.

Simultaneously devices are created in the Azure IoT Hub

Then shortly afterwards telemetry events are available for applications to process, or to inspect with tools like Azure IoT Explorer.

In the telemetry event payload sent to the Azure IoT Hub there are some extra fields to help with debugging and tracing. The raw payload is also included so messages not decoded by TTN can be processed by the client application(s).

// Assemble the JSON payload to send to Azure IoT Hub/Central.
log.LogInformation($"{messagePrefix} Payload assembly start");
JObject telemetryEvent = new JObject();
try
{
   JObject payloadFields = (JObject)payloadObect.payload_fields;
   telemetryEvent.Add("HardwareSerial", payloadObect.hardware_serial);
   telemetryEvent.Add("Retry", payloadObect.is_retry);
   telemetryEvent.Add("Counter", payloadObect.counter);
   telemetryEvent.Add("DeviceID", payloadObect.dev_id);
   telemetryEvent.Add("ApplicationID", payloadObect.app_id);
   telemetryEvent.Add("Port", payloadObect.port);
   telemetryEvent.Add("PayloadRaw", payloadObect.payload_raw);
   telemetryEvent.Add("ReceivedAtUTC", payloadObect.metadata.time);
 
   // If the payload has been unpacked in TTN backend add fields to telemetry event payload
   if (payloadFields != null)
   {
      foreach (JProperty child in payloadFields.Children())
      {
         EnumerateChildren(telemetryEvent, child);
      }
   }
}
catch (Exception ex)
{
   log.LogError(ex, $"{messagePrefix} Payload processing or Telemetry event assembly failed");
   throw;
}

Beware, the Azure Storage Account and storage queue names have a limited character set. This caused me problems several times when I used camel cased queue names etc.
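Queue names have to be 3 to 63 characters of lowercase letters, numbers and hyphens (starting and ending with a letter or number, no consecutive hyphens), so a quick check like this sketch saves a confusing runtime failure later:

using System.Text.RegularExpressions;

// Azure Storage queue names: 3-63 characters, lowercase letters, numbers and hyphens,
// starting and ending with a letter or number, and no consecutive hyphens.
static bool IsValidQueueName(string queueName)
{
   return Regex.IsMatch(queueName, @"^[a-z0-9](?!.*--)[a-z0-9-]{1,61}[a-z0-9]$");
}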

The Things Network HTTP Integration Part10

Assembling the components

After a series of articles exploring how portions of the solution could be built

I now had working code for receiving The Things Network(TTN) HTTP integration JSON messages with an Azure Function using an HTTPTrigger (secured with an APIKey) and then putting them into an Azure Storage Queue for processing. This code was intentionally kept as small and as simple as possible so there was less to go wrong. The required configuration is also minimal.

HTTP Endpoint handler application

In the last couple of posts I had been building an Azure Function with a QueueTrigger to process the uplink messages. The function used custom bindings so that the CloudQueueMessage could be accessed, and loaded the Azure Storage account plus queue name from configuration. I’m still using classes generated by JSON2CSharp (with minimal modifications) for deserialising the payloads with JSON.Net.

The message processor Azure Function uses a ConcurrentDictionary to store DeviceClient objects constructed using the information returned by the Azure Device Provisioning Service(DPS). This is so the DPS doesn’t have to be called for the connection details for every message. (When the Azure Function is restarted the dictionary of DeviceClient objects has to be repopulated.) If there is a backlog of messages the message processor can process more than a dozen messages concurrently, so the telemetry events displayed in an application like Azure IoT Central can arrive out of order.

The solution uses DPS Group Enrollment with Symmetric Key Attestation so Azure IoT Hub devices can be “automagically” created when a message from a new device is processed. The processing code is multi-threaded and relies on many error conditions being handled by the Azure Function retry mechanism. After a number of failed retries the messages are moved to a poison queue. Azure Storage Explorer is a good tool for viewing payloads and moving poison messages back to the processing queue.

public static class UplinkMessageProcessor
{
   static readonly ConcurrentDictionary<string, DeviceClient> DeviceClients = new ConcurrentDictionary<string, DeviceClient>();

   [FunctionName("UplinkMessageProcessor")]
   public static async Task Run(
      [QueueTrigger("%UplinkQueueName%", Connection = "AzureStorageConnectionString")]
      CloudQueueMessage cloudQueueMessage, // Used to get CloudQueueMessage.Id for logging
      Microsoft.Azure.WebJobs.ExecutionContext context,
      ILogger log)
   {
      PayloadV5 payloadObect;
      DeviceClient deviceClient = null;
      DeviceProvisioningServiceSettings deviceProvisioningServiceConfig;

      string environmentName = Environment.GetEnvironmentVariable("ENVIRONMENT");

      // Load configuration for DPS. Refactor approach and store securely...
      var configuration = new ConfigurationBuilder()
      .SetBasePath(context.FunctionAppDirectory)
      .AddJsonFile($"appsettings.json")
      .AddJsonFile($"appsettings.{environmentName}.json")
      .AddEnvironmentVariables()
      .Build();

      // Load configuration for DPS. Refactor approach and store securely...
      try
      {
         deviceProvisioningServiceConfig = (DeviceProvisioningServiceSettings)configuration.GetSection("DeviceProvisioningService").Get<DeviceProvisioningServiceSettings>();
      }
      catch (Exception ex)
      {
         log.LogError(ex, $"Configuration loading failed");
         throw;
      }

      // Deserialise uplink message from Azure storage queue
      try
      {
         payloadObect = JsonConvert.DeserializeObject<PayloadV5>(cloudQueueMessage.AsString);
      }
      catch (Exception ex)
      {
         log.LogError(ex, $"MessageID:{cloudQueueMessage.Id} uplink message deserialisation failed");
         throw;
      }

      // Extract the device ID as it's used lots of places
      string registrationID = payloadObect.hardware_serial;

      // Construct the prefix used in all the logging
      string messagePrefix = $"MessageID: {cloudQueueMessage.Id} DeviceID:{registrationID} Counter:{payloadObect.counter} Application ID:{payloadObect.app_id}";
      log.LogInformation($"{messagePrefix} Uplink message device processing start");

      // See if the device has already been provisioned
      if (DeviceClients.TryAdd(registrationID, deviceClient))
      {
         log.LogInformation($"{messagePrefix} Device provisioning start");

         string enrollmentGroupSymmetricKey = deviceProvisioningServiceConfig.EnrollmentGroupSymmetricKeyDefault;

         // figure out if there is a custom mapping for the TTN applicationID
         if (deviceProvisioningServiceConfig.ApplicationEnrollmentGroupMapping != null)
         {
            enrollmentGroupSymmetricKey = deviceProvisioningServiceConfig.ApplicationEnrollmentGroupMapping.GetValueOrDefault(payloadObect.app_id, deviceProvisioningServiceConfig.EnrollmentGroupSymmetricKeyDefault);
         }

         // Do DPS magic first time device seen
         await DeviceRegistration(log, messagePrefix, deviceProvisioningServiceConfig.GlobalDeviceEndpoint, deviceProvisioningServiceConfig.ScopeID, enrollmentGroupSymmetricKey, registrationID);
      }

      // Wait for the Device Provisioning Service to complete on this or other thread
      log.LogInformation($"{messagePrefix} Device provisioning polling start");
      if (!DeviceClients.TryGetValue(registrationID, out deviceClient))
      {
         log.LogError($"{messagePrefix} Device provisioning polling TryGet before while failed");

         throw new ApplicationException($"{messagePrefix} Device provisioning polling TryGet before while failed");
      }

      while (deviceClient == null)
      {
         log.LogInformation($"{messagePrefix} provisioning polling delay");
         await Task.Delay(deviceProvisioningServiceConfig.DeviceProvisioningPollingDelay);

         if (!DeviceClients.TryGetValue(registrationID, out deviceClient))
         {
            log.LogError($"{messagePrefix} Device provisioning polling TryGet while loop failed");

            throw new ApplicationException($"{messagePrefix} Device provisioning polling TryGet while loop failed");
         }
      }

      // Assemble the JSON payload to send to Azure IoT Hub/Central.
      log.LogInformation($"{messagePrefix} Payload assembly start");
      JObject telemetryEvent = new JObject();
      try
      {
         JObject payloadFields = (JObject)payloadObect.payload_fields;
         telemetryEvent.Add("HardwareSerial", payloadObect.hardware_serial);
         telemetryEvent.Add("Retry", payloadObect.is_retry);
         telemetryEvent.Add("Counter", payloadObect.counter);
         telemetryEvent.Add("DeviceID", payloadObect.dev_id);
         telemetryEvent.Add("ApplicationID", payloadObect.app_id);
         telemetryEvent.Add("Port", payloadObect.port);
         telemetryEvent.Add("PayloadRaw", payloadObect.payload_raw);
         telemetryEvent.Add("ReceivedAt", payloadObect.metadata.time);

         // If the payload has been unpacked in TTN backend add fields to telemetry event payload
         if (payloadFields != null)
         {
            foreach (JProperty child in payloadFields.Children())
            {
               EnumerateChildren(telemetryEvent, child);
            }
         }
      }
      catch (Exception ex)
      {
         if (DeviceClients.TryRemove(registrationID, out deviceClient))
         {
            log.LogWarning($"{messagePrefix} TryRemove payload assembly failed");
         }

         log.LogError(ex, $"{messagePrefix} Payload assembly failed");
         throw;
      }

      // Send the message to Azure IoT Hub/Azure IoT Central
      log.LogInformation($"{messagePrefix} Payload SendEventAsync start");
      try
      {
         using (Message ioTHubmessage = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(telemetryEvent))))
         {
            // Ensure the displayed time is the acquired time rather than the uploaded time, especially important for messages that end up in the poison queue
            ioTHubmessage.Properties.Add("iothub-creation-time-utc", payloadObect.metadata.time.ToString("s", CultureInfo.InvariantCulture));
            await deviceClient.SendEventAsync(ioTHubmessage);
         }
      }
      catch (Exception ex)
      {
         if (DeviceClients.TryRemove(registrationID, out deviceClient))
         {
            log.LogWarning($"{messagePrefix} TryRemove SendEventAsync failed");
         }

         log.LogError(ex, $"{messagePrefix} SendEventAsync failed");
         throw;
      }

   log.LogInformation($"{messagePrefix} Uplink message device processing completed");
   }
}

There is also support for using a specific GroupEnrollment based on the application_id in the uplink message payload.

"DeviceProvisioningService": {
      "GlobalDeviceEndpoint": "global.azure-devices-provisioning.net",
      "ScopeID": "",
      "EnrollmentGroupSymmetricKeyDefault": "TopSecretKey",
      "DeviceProvisioningPollingDelay": 500,
      "ApplicationEnrollmentGroupMapping": {
         "Application1": "TopSecretKey1",
         "Application2": "TopSecretKey2"
      }
   }

In addition to the appsettings.json there is configuration for Application Insights, the uplink message queue name and Azure Storage connection strings. The “Environment” setting is important as it specifies which appsettings.json file should be used if code is being debugged etc.

TTN Integration uplink message processor configuration

The deployed solution consists of Azure IoT Hub and DPS instances. There are two Azure Functions, one for putting the messages from the TTN into a queue and the other for processing them. The Azure Functions are hosted in an Azure AppService plan.

Azure solution deployment

An Azure Storage account is used for the queue and Azure Function synchronisation information and Azure Application Insights is used to monitor the solution.