MATH131 Numerical methods was useful

Back in 1986, in my second first year at the University of Canterbury, I did “MATH131 Numerical Methods”, which was a year of looking at why mathematics in FORTRAN, C, and Pascal sometimes didn’t return the result you were expecting…

While testing my GHI Electronics TinyCLR2 RAK Wireless RAK811 LoRaWAN client I noticed the temperature numbers didn’t quite match…

Visual Studio 2019 Debug output window
The Things Network Device Application Data tab

I have implemented my own Cayenne Low Power Payload(LPP) encoder in C#, based on the sample Mbed C++ code.

uint8_t CayenneLPP::addTemperature(uint8_t channel, float celsius) {
    if ((cursor + LPP_TEMPERATURE_SIZE) > maxsize) {
        return 0;
    }
    int16_t val = celsius * 10;
    buffer[cursor++] = channel; 
    buffer[cursor++] = LPP_TEMPERATURE; 
    buffer[cursor++] = val >> 8; 
    buffer[cursor++] = val; 

    return cursor;
}

My translation of that code to C#

public void TemperatureAdd(byte channel, float celsius)
{
   if ((index + TemperatureSize) > buffer.Length)
   {
      throw new ApplicationException("TemperatureAdd insufficient buffer capacity");
   }

   short val = (short)(celsius * 10);

   buffer[index++] = channel;
   buffer[index++] = (byte)DataType.Temperature;
   buffer[index++] = (byte)(val >> 8);
   buffer[index++] = (byte)val;
}

After looking at the code I think the issue was most probably due to the representation of the constant: 10 (int32), 10.0 (double), and 10.0f (single). To confirm my theory I modified the client to send the temperature with the calculation done with the three different constants.
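A minimal sketch of that comparison (the temperature value is illustrative, and Debug.WriteLine is System.Diagnostics); in C# the type of the constant determines the precision the multiplication is done in:

float celsius = 22.7f;

// 10 is an int32, so it is converted to float and the multiplication is done in single precision
short valInt32 = (short)(celsius * 10);

// 10.0 is a double, so celsius is promoted to double and the multiplication is done in double precision
short valDouble = (short)(celsius * 10.0);

// 10.0f keeps everything in single precision
short valSingle = (short)(celsius * 10.0f);

Debug.WriteLine($"int32:{valInt32} double:{valDouble} single:{valSingle}");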

Visual Studio 2019 Debug output window
The Things Network(TTN) Message Queue Telemetry Transport(MQTT) client

After some trial and error I settled on this C# code for my encoder

public void TemperatureAdd(byte channel, float celsius)
{
   if ((index + TemperatureSize) > buffer.Length)
   {
      throw new ApplicationException("TemperatureAdd insufficient buffer capacity");
   }

   short val = (short)(celsius * 10.0f);

   buffer[index++] = channel;
   buffer[index++] = (byte)DataType.Temperature;
   buffer[index++] = (byte)(val >> 8);
   buffer[index++] = (byte)val;
}

I don’t think this is specifically an issue with TinyCLR V2, just with the number type used for the constant.

The Things Network V2 MQTT SQL Connector

This code was written to solve a problem I had debugging and testing an application which processed data from sensors attached to The Things Network(TTN) and I figured others might find it useful.

As part of my series of TTN projects I wanted to verify that the data from a number of LoRaWAN sensors connected to TTN was reasonable and complete. I’m familiar with Microsoft SQL Server so I built a .Net Core console application which uses the TTN Message Queue Telemetry Transport(MQTT) Data API (so it can run alongside my existing TTN integration) to receive messages from all the devices in a TTN application and store them in a database for post-processing.

The console application uses MQTTNet to connect to the TTN MQTT Data API. It subscribes to an application device uplink topic, then uses Stack Overflow’s Dapper with Microsoft SQL Server tables and stored procedures to store the device data points. I re-generated the classes I had used in my other projects, added any obvious missing fields, and fine-tuned the data types by delving into the TTN V2 Go code.

The core of the application is in the MQTTNet application message received handler.

private static void MqttClient_ApplicationMessageReceived(MqttApplicationMessageReceivedEventArgs e)
{
   PayloadUplinkV2 payload;

   log.InfoFormat($"Receive Start Topic:{e.ApplicationMessage.Topic}");

   string connectionString = configuration.GetSection("TTNDatabase").Value;

   try
   {
      payload = JsonConvert.DeserializeObject<PayloadUplinkV2>(e.ApplicationMessage.ConvertPayloadToString());
   }
   catch (Exception ex)
   {
      log.Error("DeserializeObject failed", ex);
      return;
   }

   try
   {
      if (payload.PayloadFields != null)
      {
         var parameters = new DynamicParameters();

         EnumerateChildren(parameters, payload.PayloadFields);

         log.Debug($"Parameters:{parameters.ParameterNames.Aggregate((i, j) => i + ',' + j)}");

         foreach (string storedProcedure in storedProcedureMappings.Keys)
         {
            if (Enumerable.SequenceEqual(parameters.ParameterNames, storedProcedureMappings[storedProcedure].Split(',', StringSplitOptions.RemoveEmptyEntries), StringComparer.InvariantCultureIgnoreCase))
            {
               log.Info($"Payload fields processing with:{storedProcedure}");

               using (SqlConnection db = new SqlConnection(connectionString))
               {
                  parameters.Add("@ReceivedAtUtc", payload.Metadata.ReceivedAtUtc);
                  parameters.Add("@DeviceID", payload.DeviceId);
                  parameters.Add("@DeviceEui", payload.DeviceEui);
                  parameters.Add("@ApplicationID", payload.ApplicationId);
                  parameters.Add("@IsConfirmed", payload.IsConfirmed);
                  parameters.Add("@IsRetry", payload.IsRetry);
                  parameters.Add("@Port", payload.Port);

                  db.Execute(sql: storedProcedure, param: parameters, commandType: CommandType.StoredProcedure);
               }
            }
         }
      }
      else
      {
         foreach (string storedProcedure in storedProcedureMappings.Keys)
         {
            if (string.Compare(storedProcedureMappings[storedProcedure], "payload_raw", true) == 0)
            {
               log.Info($"Payload raw processing with:{storedProcedure}");

               using (SqlConnection db = new SqlConnection(connectionString))
               {
                  var parameters = new DynamicParameters();

                  parameters.Add("@ReceivedAtUtc", payload.Metadata.ReceivedAtUtc);
                  parameters.Add("@DeviceID", payload.DeviceId);
                  parameters.Add("@DeviceEui", payload.DeviceEui);
                  parameters.Add("@ApplicationID", payload.ApplicationId);
                  parameters.Add("@IsConfirmed", payload.IsConfirmed);
                  parameters.Add("@IsRetry", payload.IsRetry);
                  parameters.Add("@Port", payload.Port);
                  parameters.Add("@Payload", payload.PayloadRaw);

                  db.Execute(sql: storedProcedure, param: parameters, commandType: CommandType.StoredProcedure);
               }
            }
         }
      }
   }
   catch (Exception ex)
   {
      log.Error("Message processing failed", ex);
   }
}

For messages with payload fields the code attempts to match the list of field names (there may be more than one match) with the parameter lists of the stored procedures in the AppSettings.json file. The Enumerable.SequenceEqual uses a case-insensitive comparison but order is important. I did consider sorting the two lists of parameters but wasn’t certain the added complexity was worth it.

{
   "TTNDatabase": "Server=DESKTOP-1234567;Initial Catalog=Rak7200TrackerTest;Persist Security Info=False;User ID=TopSecret;Password=TopSecret;Connection Timeout=30",
   "MqttServer": "eu.thethings.network",
   "MqttPassword": "ttn-account-TopSecret",
   "ApplicationId": "rak811wisnodetest",
   "MqttClientId": "TTNSQLClient",
   "StoredProcedureMappings": {
      "EnvironmentalSensorProcess": "relative_humidity_0,temperature_0",
      "PayloadRawProcess": "payload_raw",
      "WeatherSensorProcess": "barometric_pressure_0,temperature_0",
      "PositionReportProcess": "accelerometer_3x,accelerometer_3y,accelerometer_3z,analog_in_10,analog_in_11,analog_in_8,analog_in_9,gps_1altitude,gps_1latitude,gps_1longitude,gyrometer_5x,gyrometer_5y,gyrometer_5z"
   }
}
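For reference, a sketch of how the StoredProcedureMappings section could be loaded into the storedProcedureMappings dictionary used above (this assumes Microsoft.Extensions.Configuration with the JSON file provider):

IConfiguration configuration = new ConfigurationBuilder()
   .AddJsonFile("AppSettings.json")
   .Build();

Dictionary<string, string> storedProcedureMappings = configuration
   .GetSection("StoredProcedureMappings")
   .GetChildren()
   .ToDictionary(x => x.Key, x => x.Value);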

To reduce the scope for mistakes (especially with longer parameter lists) I usually copy them from the Log4Net RollingFileAppender file or ManagedColoredConsoleAppender console output.

Environmental sensor output with flat data format

I created a database table to store the temperature and humidity values.

CREATE TABLE [dbo].[EnvironmentalSensorReport](
	[WeatherSensorReportUID] [UNIQUEIDENTIFIER] NOT NULL,
	[ReceivedAtUtC] [DATETIME] NOT NULL,
	[DeviceID] [NVARCHAR](32) NOT NULL,
	[DeviceEui] [NVARCHAR](32) NOT NULL,
	[ApplicationID] [NVARCHAR](32) NOT NULL,
	[IsConfirmed] [BIT] NOT NULL,
	[IsRetry] [BIT] NOT NULL,
	[Port] [SMALLINT] NOT NULL,
	[Temperature] [FLOAT] NOT NULL,
	[Humidity] [FLOAT] NOT NULL,
CONSTRAINT [PK_EnvironmentalSensorReport] PRIMARY KEY CLUSTERED 
(
	[WeatherSensorReportUID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

ALTER TABLE [dbo].[EnvironmentalSensorReport] ADD  CONSTRAINT [DF_EnvironmentalSensorReport_EnvironmentalSensorReporttUID]  DEFAULT (NEWID()) FOR [WeatherSensorReportUID]
GO

The stored procedure must have the parameters @ReceivedAtUtc, @DeviceID, @DeviceEui, @ApplicationID, @IsRetry, @IsConfirmed and @Port. In this example the payload-specific fields generated by the Cayenne Low Power Payload(LPP) decoder are @Temperature_0 and @relative_humidity_0.

CREATE PROCEDURE [dbo].[EnvironmentalSensorProcess]
   @ReceivedAtUtc AS DATETIME,
   @DeviceID AS NVARCHAR(32),
   @DeviceEui AS NVARCHAR(32),
   @ApplicationID AS NVARCHAR(32),
   @IsRetry AS BIT,
   @IsConfirmed AS BIT,
   @Port AS SMALLINT,
   @Temperature_0 AS FLOAT,
   @relative_humidity_0 AS FLOAT
AS
BEGIN
   SET NOCOUNT ON;
 
   INSERT INTO [dbo].[EnvironmentalSensorReport]
           ([ReceivedAtUtc]
           ,[DeviceID]
           ,[DeviceEui]
           ,[ApplicationID]
           ,[IsConfirmed]
           ,[IsRetry]
           ,[Port]
	   ,Temperature
	   ,Humidity)
   VALUES
   (
      @ReceivedAtUtc,
      @DeviceID,
      @DeviceEui,
      @ApplicationID,
      @IsConfirmed,
      @IsRetry,
      @port,
      @Temperature_0,
      @relative_humidity_0)
END
Environmental sensor data displayed in SQL Server Management Studio(SSMS)

To store more complex nested payload fields (e.g. latitude, longitude and altitude values), I flattened the hierarchy.

private static void EnumerateChildren(DynamicParameters parameters, JToken token, string prefix ="")
{
   if (token is JProperty)
      if (token.First is JValue)
      {
         JProperty property = (JProperty)token;
         parameters.Add($"@{prefix}{property.Name}", property.Value.ToString());
      }
      else
      {
         JProperty property = (JProperty)token;
         prefix += property.Name;
      }

   foreach (JToken token2 in token.Children())
   {
      EnumerateChildren(parameters,token2, prefix);
   }
}
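For example (the values are hypothetical), a nested gps_1 object gets flattened into parameter names which line up with the gps_1… entries in the StoredProcedureMappings above:

// Hypothetical payload_fields from the Cayenne LPP decoder
// {
//    "analog_in_8": 3.3,
//    "gps_1": { "latitude": -43.5, "longitude": 172.6, "altitude": 6.0 }
// }
//
// EnumerateChildren(parameters, payload.PayloadFields) flattens this into the Dapper parameters
// @analog_in_8, @gps_1latitude, @gps_1longitude and @gps_1altitude, which are then matched
// (case insensitively) against the stored procedure parameter lists.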
Unpacked LPP payload from GPS tracker displayed in TTN application data view
Flattened location, acceleration and rotation information

CREATE TABLE [dbo].[PositionReport](
      [PositionReportUID] [UNIQUEIDENTIFIER] NOT NULL,
      [ReceivedAtUtC] [DATETIME] NOT NULL,
      [DeviceID] [NVARCHAR](32) NOT NULL,
      [DeviceEui] [NVARCHAR](32) NOT NULL,
      [ApplicationID] [NVARCHAR](32) NOT NULL,
      [IsConfirmed] [BIT] NOT NULL,
      [IsRetry] [BIT] NOT NULL,
      [Port] [SMALLINT] NOT NULL,
      [Latitude] [FLOAT] NOT NULL,
      [Longitude] [FLOAT] NOT NULL,
      [Altitude] [FLOAT] NOT NULL,
 CONSTRAINT [PK_PositionReport] PRIMARY KEY CLUSTERED 
(
	[PositionReportUID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

The database table above stores only the values of the fields I cared about.

CREATE PROCEDURE [dbo].[PositionReportProcess]
      @ReceivedAtUtc AS DATETIME,
      @DeviceID AS NVARCHAR(32),
      @DeviceEui AS NVARCHAR(32),
      @ApplicationID AS NVARCHAR(32),
      @IsRetry AS Bit,
      @IsConfirmed AS BIT,
      @Port AS SMALLINT,
      @accelerometer_3x AS FLOAT,
      @accelerometer_3y AS FLOAT,
      @accelerometer_3z AS FLOAT,
      @analog_in_8 AS FLOAT,
      @analog_in_9 AS FLOAT,
      @analog_in_10 AS FLOAT,
      @analog_in_11 AS FLOAT,
      @gps_1Latitude AS FLOAT,
      @gps_1Longitude AS FLOAT,
      @gps_1Altitude AS FLOAT,
      @gyrometer_5x  AS FLOAT, 
      @gyrometer_5y  AS FLOAT, 
      @gyrometer_5z  AS FLOAT 
AS
BEGIN
   SET NOCOUNT ON;

   INSERT INTO [dbo].[PositionReport]
      ([PositionReportUID]
      ,[ReceivedAtUtc]
      ,[DeviceID]
      ,[DeviceEui]
      ,[ApplicationID]
      ,[IsConfirmed]
      ,[IsRetry]
      ,[Port]
      ,Latitude
      ,Longitude
      ,Altitude)
   VALUES
   (
      NEWID(),
      @ReceivedAtUtc,
      @DeviceID,
      @DeviceEui,
      @ApplicationID,
      @IsConfirmed,
      @IsRetry,
      @port,
      @gps_1Latitude,
      @gps_1Longitude,
      @gps_1Altitude)
END

The stored procedure for storing the GPS tracker payload has to have parameters matching each payload field but some of the fields are not used.

Location data displayed in SQL Server Management Studio(SSMS)

For uplink messages with no payload fields the message processor looks for a stored procedure with a single parameter called “payload_raw” (there may be more than one match).

CREATE TABLE [dbo].[PayloadReport](
      [PayloadReportUID] [UNIQUEIDENTIFIER] NOT NULL,
      [ReceivedAtUtC] [DATETIME] NOT NULL,
      [DeviceID] [NVARCHAR](32) NOT NULL,
      [DeviceEui] [NVARCHAR](32) NOT NULL,
      [ApplicationID] [NVARCHAR](32) NOT NULL,
      [IsConfirmed] [BIT] NOT NULL,
      [IsRetry] [BIT] NOT NULL,
      [Port] [SMALLINT] NOT NULL,
      [Payload] [NVARCHAR](128) NOT NULL,
CONSTRAINT [PK_PayloadReport] PRIMARY KEY CLUSTERED 
(
      [PayloadReportUID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

ALTER TABLE [dbo].[PayloadReport] ADD  CONSTRAINT [DF_PayloadReport_PayloadReportUID]  DEFAULT (NEWID()) FOR [PayloadReportUID]
GO
CREATE PROCEDURE [dbo].[PayloadRawProcess]
      @ReceivedAtUtc AS DATETIME,
      @DeviceID AS NVARCHAR(32),
      @DeviceEui AS NVARCHAR(32),
      @ApplicationID AS NVARCHAR(32),
      @IsRetry AS Bit,
      @IsConfirmed AS BIT,
      @Port AS SMALLINT,
      @Payload AS NVARCHAR(128)
AS
BEGIN
      SET NOCOUNT ON;

      INSERT INTO [dbo].[PayloadReport]
         ([ReceivedAtUtc]
         ,[DeviceID]
         ,[DeviceEui]
         ,[ApplicationID]
         ,[IsConfirmed]
         ,[IsRetry]
         ,[Port]
         ,[Payload])
     VALUES(@ReceivedAtUtc,
         @DeviceID,
         @DeviceEui,
         @ApplicationID,
         @IsConfirmed,
         @IsRetry,
         @port,
         @Payload)
END
Raw payload data displayed in SQL Server Management Studio(SSMS)

Initially the application just used Console.WriteLine for logging, then I added Log4Net because it would be useful to persist information about failures and so I could copy and paste parameter lists into the AppSettings.json file.

To make the application more robust, adding retries to the Dapper Execute with the Enterprise Library Transient Fault Handling and Configuration blocks, or Polly, would be a good idea. It also wouldn’t take much work to get the application to run in Microsoft Azure as a “headless” webapp.
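A minimal sketch of what that could look like with Polly (the exception type and back-off values are illustrative, not something I have load tested):

// Retry the Dapper Execute a few times with exponential back-off on SQL failures
var retryPolicy = Policy
   .Handle<SqlException>()
   .WaitAndRetry(3, attempt => TimeSpan.FromSeconds(Math.Pow(2, attempt)));

retryPolicy.Execute(() =>
{
   using (SqlConnection db = new SqlConnection(connectionString))
   {
      db.Execute(sql: storedProcedure, param: parameters, commandType: CommandType.StoredProcedure);
   }
});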

Dapper supports a number of database platforms so in theory this application (with a little bit of effort) should be platform portable.

The Things Network V3 MQTT Client Uplink

In preparation for the impending(delayed) deployment of The Things Network(TTN) V3 I wanted to build a new Message Queue Telemetry Transport(MQTT) integration. As per my usual approach I built a .Net Core console application which sends and receives messages.

The console application uses MQTTNet to connect to TTN. It subscribes to the TTN application device uplink topic (I did try subscribing to the uplink messages for all the devices in the application), and the downlink message scheduled, sent and acknowledged topics.

I tried a lot of topic formats with and without wildcards to see which worked best

//downlinkTopic = $"v3/{applicationId}/devices/{deviceId}/down/push";
//uplinkTopic = $"v3/+";
//uplinkTopic = $"v3/#";
//uplinkTopic = $"v3/{applicationId}/+"; //exception
//uplinkTopic = $"v3/{applicationId}/*";
//uplinkTopic = $"v3/devices/+";
//uplinkTopic = $"v3/devices/#";
//uplinkTopic = $"v3/devices/+/events/+";
//uplinkTopic = $"v3/{applicationId}/devices/+/events/+";
//uplinkTopic = $"v3/{applicationId}/devices/{deviceId}/events/update";
//uplinkTopic = $"v3/{applicationId}/devices/{deviceId}/events/create";
//uplinkTopic = $"v3/{applicationId}/devices/{deviceId}/events/delete";
//uplinkTopic = $"v3/{applicationId}/devices/+/events/+";
//uplinkTopic = $"v3/{applicationId}/devices/+/events/create";
//uplinkTopic = $"v3/{applicationId}/devices/+/events/update";
//uplinkTopic = $"v3/{applicationId}/devices/+/events/delete";
//uplinkTopic = $"v3/{applicationId}/devices/+/events/+";
//uplinkTopic = $"v3/{applicationId}/devices/{deviceId}/up";

string downlinkTopic = $"v3/{applicationId}/devices/{deviceId}/down/push";
string downlinkQueuedTopic = $"v3/{applicationId}/devices/{deviceId}/down/queued";
string downlinkSentTopic = $"v3/{applicationId}/devices/{deviceId}/down/sent";
string downlinkAckTopic = $"v3/{applicationId}/devices/{deviceId}/down/ack";
string downlinkNakTopic = $"v3/{applicationId}/devices/{deviceId}/down/nack";
string downlinkFailedTopic = $"v3/{applicationId}/devices/{deviceId}/down/failed";
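A sketch of how the uplink subscription gets wired up with MQTTNet (the topic is the device uplink one from the list above):

string uplinkTopic = $"v3/{applicationId}/devices/{deviceId}/up";
await mqttClient.SubscribeAsync(uplinkTopic, MQTTnet.Protocol.MqttQualityOfServiceLevel.AtLeastOnce);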

I generated new classes from the ones provided in the documentation, then added any obvious missing fields and fine-tuned the data types by delving into the TTN V3 Go code.

The new message payloads have significant differences to the V2 ones. I have refactored the generated classes to reduce code duplication and fix up data types, e.g. int32 vs. ulong where JSON2CSharp couldn’t infer the size of the number.

namespace devMobile.TheThingsNetwork.Models
{
   public class ApplicationIds
   {
      public string application_id { get; set; }
   }

   public class EndDeviceIds
   {
      public string device_id { get; set; }
      public ApplicationIds application_ids { get; set; }
      public string dev_eui { get; set; }
      public string join_eui { get; set; }
      public string dev_addr { get; set; }
   }
}

I wonder about the naming of the ApplicationIds class as it appears that it could only ever contain a single application_id.

I installed the tooling for Go support into Visual Studio Code and went looking for the uplink message definition, which I think is in messages.pb.go (still learning Go and how the TTN Go source is structured).

type ApplicationUplink struct {
	// Join Server issued identifier for the session keys used by this uplink.
	SessionKeyID []byte `protobuf:"bytes,1,opt,name=session_key_id,json=sessionKeyId,proto3" json:"session_key_id,omitempty"`
	FPort        uint32 `protobuf:"varint,2,opt,name=f_port,json=fPort,proto3" json:"f_port,omitempty"`
	FCnt         uint32 `protobuf:"varint,3,opt,name=f_cnt,json=fCnt,proto3" json:"f_cnt,omitempty"`
	// The frame payload of the uplink message.
	// The payload is still encrypted if the skip_payload_crypto field of the EndDevice
	// is true, which is indicated by the presence of the app_s_key field.
	FRMPayload []byte `protobuf:"bytes,4,opt,name=frm_payload,json=frmPayload,proto3" json:"frm_payload,omitempty"`
	// The decoded frame payload of the uplink message.
	// This field is set by the message processor that is configured for the end device (see formatters) or application (see default_formatters).
	DecodedPayload *types.Struct `protobuf:"bytes,5,opt,name=decoded_payload,json=decodedPayload,proto3" json:"decoded_payload,omitempty"`
	// Warnings generated by the message processor while decoding the frm_payload.
	DecodedPayloadWarnings []string `protobuf:"bytes,12,rep,name=decoded_payload_warnings,json=decodedPayloadWarnings,proto3" json:"decoded_payload_warnings,omitempty"`
	// A list of metadata for each antenna of each gateway that received this message.
	RxMetadata []*RxMetadata `protobuf:"bytes,6,rep,name=rx_metadata,json=rxMetadata,proto3" json:"rx_metadata,omitempty"`
	// Settings for the transmission.
	Settings TxSettings `protobuf:"bytes,7,opt,name=settings,proto3" json:"settings"`
	// Server time when the Network Server received the message.
	ReceivedAt time.Time `protobuf:"bytes,8,opt,name=received_at,json=receivedAt,proto3,stdtime" json:"received_at"`
	// The AppSKey of the current session.
	// This field is only present if the skip_payload_crypto field of the EndDevice
	// is true.
	// Can be used to decrypt uplink payloads and encrypt downlink payloads.
	AppSKey *KeyEnvelope `protobuf:"bytes,9,opt,name=app_s_key,json=appSKey,proto3" json:"app_s_key,omitempty"`
	// The last AFCntDown of the current session.
	// This field is only present if the skip_payload_crypto field of the EndDevice
	// is true.
	// Can be used with app_s_key to encrypt downlink payloads.
	LastAFCntDown uint32 `protobuf:"varint,10,opt,name=last_a_f_cnt_down,json=lastAFCntDown,proto3" json:"last_a_f_cnt_down,omitempty"`
	Confirmed     bool   `protobuf:"varint,11,opt,name=confirmed,proto3" json:"confirmed,omitempty"`
	// Consumed airtime for the transmission of the uplink message. Calculated by Network Server using the RawPayload size and the transmission settings.
	ConsumedAirtime *time.Duration `protobuf:"bytes,13,opt,name=consumed_airtime,json=consumedAirtime,proto3,stdduration" json:"consumed_airtime,omitempty"`
	// End device location metadata, set by the Application Server while handling the message.
	Locations            map[string]*Location `protobuf:"bytes,14,rep,name=locations,proto3" json:"locations,omitempty" protobuf_key:"bytes,1,opt,name=key,proto3" protobuf_val:"bytes,2,opt,name=value,proto3"`
	XXX_NoUnkeyedLiteral struct{}             `json:"-"`
	XXX_sizecache        int32                `json:"-"`
}

I also need to deploy some more gateways and devices to check that I haven’t missed any fields available in more realistic environments.

TTN V3 MQTT Console client

In the TTN Device data tab I could see messages being sent to, and received from, the simulated device.

TTN V3 MQTT Device Live Data

The next step is to get downlink messages working, then connect up a couple of gateways and trial with some real devices.

TTN V3 EndDevice API Basic Client

The next step was to enumerate all the EndDevices of a The Things Network(TTN) Application and display their attributes. I have to establish an Azure DeviceClient connection to an Azure IoT Hub for each TTN EndDevice to get downlink messages. To do this I will have to enumerate the TTN Applications in the instance then enumerate the LoRaWAN EndDevices.

using (HttpClient httpClient = new HttpClient())
{
	EndDeviceRegistryClient endDeviceRegistryClient = new EndDeviceRegistryClient(baseUrl, httpClient)
	{
		ApiKey = apiKey
	};

	try
	{
#if FIELDS_MINIMUM
		string[] fieldMaskPathsDevice = { "attributes" }; // think this is the bare minimum required for integration
#else
		string[] fieldMaskPathsDevice = { "name", "description", "attributes" };
#endif
		V3EndDevices endDevices = await endDeviceRegistryClient.ListAsync(applicationID, field_mask_paths:fieldMaskPathsDevice);
		if ((endDevices != null) && (endDevices.End_devices != null)) // If there are no devices returns null rather than empty list
		{
			foreach (V3EndDevice endDevice in endDevices.End_devices)
			{
#if FIELDS_MINIMUM
				Console.WriteLine($"EndDevice ID:{endDevice.Ids.Device_id}");
#else
				Console.WriteLine($"Device ID:{endDevice.Ids.Device_id} Name:{endDevice.Name} Description:{endDevice.Description}");
				Console.WriteLine($"  CreatedAt: {endDevice.Created_at:dd-MM-yy HH:mm:ss} UpdatedAt: {endDevice.Updated_at:dd-MM-yy HH:mm:ss}");
#endif
				if (endDevice.Attributes != null)
				{
					Console.WriteLine("  EndDevice attributes");

					foreach (KeyValuePair<string, string> attribute in endDevice.Attributes)
					{
						Console.WriteLine($"    Key: {attribute.Key} Value: {attribute.Value}");
					}
				}
				Console.WriteLine();
			}
		}
	}
	catch (Exception ex)
	{
		Console.WriteLine(ex.Message);
	}

	Console.WriteLine("Press <enter> to exit");
	Console.ReadLine();
}

Like the applicationRegistryClient.ListAsync call the endDeviceRegistryClient.ListAsync also returns null rather than an empty list.

I also wanted to explore whether I could use EndDevice attributes to populate the ClientOptions ModelId of my CreateFromConnectionString call. The modelId would contain the Digital Twins Definition Language(DTDL) ID of the LoRaWAN device so it could be automatically provisioned.
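A rough sketch of what I have in mind; the attribute name and the model ID lookup are hypothetical:

// Hypothetical EndDevice attribute holding the DTDL model ID of the device
string modelId = endDevice.Attributes["dtdlmodelid"];

DeviceClient deviceClient = DeviceClient.CreateFromConnectionString(
   connectionString,
   TransportType.Amqp_Tcp_Only,
   new ClientOptions() { ModelId = modelId });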

TTN V3 Application API Basic Paging and Filtering Client

The next step was to enumerate The Things Network(TTN) Applications so I could connect only to the required Azure IoT hub(s). There would also be a single configuration setting for the client (establish a connection for every TTN application, or don’t establish a connection for any) and this could be overridden with a TTN application attribute.

long pageSize = long.Parse(args[3]);
Console.WriteLine($"Page size: {pageSize}");

Console.WriteLine();

using (HttpClient httpClient = new HttpClient())
{
	ApplicationRegistryClient applicationRegistryClient = new ApplicationRegistryClient(baseUrl, httpClient)
	{
		ApiKey = apiKey
	};

	try
	{
		int page = 1;

		string[] fieldMaskPathsApplication = { "attributes" }; // think this is the bare minimum required for integration

		V3Applications applications = await applicationRegistryClient.ListAsync(collaborator, field_mask_paths: fieldMaskPathsApplication, limit: pageSize, page: page);
		while ((applications != null) && (applications.Applications != null)) 
		{
			Console.WriteLine($"Applications:{applications.Applications.Count} Page:{page} Page size:{pageSize}");
			foreach (V3Application application in applications.Applications)
			{
				bool applicationIntegration = ApplicationAzureintegrationDefault;

				Console.WriteLine($"Application ID:{application.Ids.Application_id}");
				if (application.Attributes != null)
				{
					string ApplicationAzureIntegrationValue = string.Empty;
					if (application.Attributes.TryGetValue(ApplicationAzureIntegrationField, out ApplicationAzureIntegrationValue))
					{
						bool.TryParse(ApplicationAzureIntegrationValue, out applicationIntegration);
					}

					if (applicationIntegration)
					{
						Console.WriteLine("  Application attributes");

						foreach (KeyValuePair<string, string> attribute in application.Attributes)
						{
							Console.WriteLine($"   Key: {attribute.Key} Value: {attribute.Value}");
						}
					}
				}
				Console.WriteLine();
			}
			page += 1;
			applications = await applicationRegistryClient.ListAsync(collaborator, field_mask_paths: fieldMaskPathsApplication, limit: pageSize, page: page);
		};
	}
	catch (Exception ex)
	{
		Console.WriteLine(ex.Message);
	}

	Console.WriteLine("Press <enter> to exit");
	Console.ReadLine();
}

I used the field_mask_paths parameter (don’t need created_at, updated_at, name etc.) to minimise the data returned to my client.

public async System.Threading.Tasks.Task<V3Applications> ListAsync(string collaborator_organization_ids_organization_id = null, string collaborator_user_ids_user_id = null, string collaborator_user_ids_email = null, System.Collections.Generic.IEnumerable<string> field_mask_paths = null, string order = null, long? limit = null, long? page = null, System.Threading.CancellationToken cancellationToken = default(System.Threading.CancellationToken))
{
   var urlBuilder_ = new System.Text.StringBuilder();
   urlBuilder_.Append(BaseUrl != null ? BaseUrl.TrimEnd('/') : "").Append("/applications?");
   if (collaborator_organization_ids_organization_id != null) 
   {
         urlBuilder_.Append(System.Uri.EscapeDataString("collaborator.organization_ids.organization_id") + "=").Append(System.Uri.EscapeDataString(ConvertToString(collaborator_organization_ids_organization_id, System.Globalization.CultureInfo.InvariantCulture))).Append("&");
   }
   if (collaborator_user_ids_user_id != null) 
   {
         urlBuilder_.Append(System.Uri.EscapeDataString("collaborator.user_ids.user_id") + "=").Append(System.Uri.EscapeDataString(ConvertToString(collaborator_user_ids_user_id, System.Globalization.CultureInfo.InvariantCulture))).Append("&");
   }
   if (collaborator_user_ids_email != null) 
   {
         urlBuilder_.Append(System.Uri.EscapeDataString("collaborator.user_ids.email") + "=").Append(System.Uri.EscapeDataString(ConvertToString(collaborator_user_ids_email, System.Globalization.CultureInfo.InvariantCulture))).Append("&");
   }
   if (field_mask_paths != null) 
   {
         foreach (var item_ in field_mask_paths) { urlBuilder_.Append(System.Uri.EscapeDataString("field_mask.paths") + "=").Append(System.Uri.EscapeDataString(ConvertToString(item_, System.Globalization.CultureInfo.InvariantCulture))).Append("&"); }
   }
   if (order != null) 
   {
         urlBuilder_.Append(System.Uri.EscapeDataString("order") + "=").Append(System.Uri.EscapeDataString(ConvertToString(order, System.Globalization.CultureInfo.InvariantCulture))).Append("&");
   }
   if (limit != null) 
   {
         urlBuilder_.Append(System.Uri.EscapeDataString("limit") + "=").Append(System.Uri.EscapeDataString(ConvertToString(limit, System.Globalization.CultureInfo.InvariantCulture))).Append("&");
   }
   if (page != null) 
   {
         urlBuilder_.Append(System.Uri.EscapeDataString("page") + "=").Append(System.Uri.EscapeDataString(ConvertToString(page, System.Globalization.CultureInfo.InvariantCulture))).Append("&");
   }
}

I was hoping that there would be a way to further “shape” the returned data, but the way the NSwag-generated code constructs the URL with the field_mask_paths, order, limit, and page parameters means this appears not to be possible.

TTN V3 Application API Basic Paging Client

The next step was to enumerate The Things Network(TTN) Applications and their attributes. I’m planning on using attributes to manage which applications (and in future EndDevices) are enabled in my Advanced Message Queuing Protocol(AMQP) client.

In the code I have left the different paging implementations which I trialled but abandoned.

using (HttpClient httpClient = new HttpClient())
{
	ApplicationRegistryClient applicationRegistryClient = new ApplicationRegistryClient(baseUrl, httpClient)
	{
		ApiKey = apiKey
	};

	try
	{
		int page = 1;
		string[] fieldMaskPathsApplication = { "attributes" };

		V3Applications applications = await applicationRegistryClient.ListAsync(collaborator, field_mask_paths: fieldMaskPathsApplication, limit:pageSize, page: page);
		while ((applications != null) && (applications.Applications != null))
		{ 
			Console.WriteLine($"Applications:{applications.Applications.Count} Page:{page} Page size:{pageSize}");
			foreach (V3Application application in applications.Applications)
			{
				Console.WriteLine($"Application ID:{application.Ids.Application_id}"); 
				if (application.Attributes != null)
				{
					Console.WriteLine("  Application attributes");

					foreach (KeyValuePair<string, string> attribute in application.Attributes)
					{
						Console.WriteLine($"   Key: {attribute.Key} Value: {attribute.Value}");
					}
				}
				Console.WriteLine();
			}
			page += 1;
			applications = await applicationRegistryClient.ListAsync(collaborator, field_mask_paths: fieldMaskPathsApplication, limit: pageSize, page: page);
		}
	}
	catch (Exception ex)
	{
		Console.WriteLine(ex.Message);
	}
}

For each LoRaWAN client I have to have an open connection to the Azure IoT Hub to get Cloud to Device (C2D) messages, so I’m looking at using connection pooling to reduce the overall number of connections.

I think the Azure DeviceClient library supports up to 995 devices per connection and has quite a lot of additional functionality.

/// <summary>
/// contains Amqp Connection Pool settings for DeviceClient
/// </summary>
public sealed class AmqpConnectionPoolSettings
{
   private static readonly TimeSpan s_defaultConnectionIdleTimeout = TimeSpan.FromMinutes(2);
    private uint _maxPoolSize;
    internal const uint MaxDevicesPerConnection = 995; // IotHub allows upto 999 tokens per connection. Setting the threshold just below that.

    /// <summary>
    /// The default size of the pool
    /// </summary>
    /// <remarks>
    /// Allows up to 100,000 devices
    /// </remarks>
    private const uint DefaultPoolSize = 100;

    /// <summary>
    /// The maximum value that can be used for the MaxPoolSize property
    /// </summary>
     public const uint AbsoluteMaxPoolSize = ushort.MaxValue;

    /// <summary>
    /// Creates an instance of AmqpConnecitonPoolSettings with default properties
    /// </summary>
    public AmqpConnectionPoolSettings()
    {
       _maxPoolSize = DefaultPoolSize;
       Pooling = false;
    }

Whereas I think AMQPNetLite may support more, but it will require me to implement more of the Azure IoT client interface.

/// <summary>
/// The default maximum frame size used by the library.
/// </summary>
public const uint DefaultMaxFrameSize = 64 * 1024;
internal const ushort DefaultMaxConcurrentChannels = 8 * 1024;
internal const uint DefaultMaxLinkHandles = 256 * 1024;
internal const uint DefaultHeartBeatInterval = 90000;
internal const uint MinimumHeartBeatIntervalMs = 5 * 1000;

I have got to do some more research to see which library is easier to use, requires more code, is more complex, and scales better.

TTN V3 Application API Basic Client

After reviewing the initial implementation I found I had to have one connection per The Things Network(TTN) device. To do this I first have to enumerate the LoRaWAN Devices for each Application in my instance. First I had to add the TTN API key to the application and device registry requests.

namespace devMobile.TheThingsNetwork.API
{
	public partial class EndDeviceRegistryClient
	{
		public string ApiKey { set; get; }

		partial void PrepareRequest(System.Net.Http.HttpClient client, System.Net.Http.HttpRequestMessage request, string url)
		{
			if (!client.DefaultRequestHeaders.Contains("Authorization"))
			{
				client.DefaultRequestHeaders.Add("Authorization", $"Bearer {ApiKey}");
			}
		}
	}

	public partial class ApplicationRegistryClient
	{
		public string ApiKey { set; get; }

		partial void PrepareRequest(System.Net.Http.HttpClient client, System.Net.Http.HttpRequestMessage request, string url)
		{
			if (!client.DefaultRequestHeaders.Contains("Authorization"))
			{
				client.DefaultRequestHeaders.Add("Authorization", $"Bearer {ApiKey}");
			}
		}
	}
}

The first step was to enumerate Applications and their attributes

#if FIELDS_MINIMUM
	string[] fieldMaskPathsApplication = { "attributes" }; // think this is the bare minimum required for integration
#else
	string[] fieldMaskPathsApplication = { "name", "description", "attributes" };
#endif

	V3Applications applications = await applicationRegistryClient.ListAsync(collaborator, field_mask_paths: fieldMaskPathsApplication);
	if ((applications != null) && (applications.Applications != null)) // If there are no applications returns null rather than empty list
	{
		foreach (V3Application application in applications.Applications)
		{
#if FIELDS_MINIMUM
			Console.WriteLine($"Application ID:{application.Ids.Application_id}");
#else
			Console.WriteLine($"Application ID:{application.Ids.Application_id} Name:{application.Name} Description:{application.Description}");
			Console.WriteLine($"  CreatedAt: {application.Created_at:dd-MM-yy HH:mm:ss} UpdatedAt: {application.Updated_at:dd-MM-yy HH:mm:ss}");
#endif
			if (application.Attributes != null)
			{
				Console.WriteLine("  Application attributes");

				foreach (KeyValuePair<string, string> attribute in application.Attributes)
				{
					Console.WriteLine($"    Key: {attribute.Key} Value: {attribute.Value}");
				}
			}
			Console.WriteLine();
		}
	}
}

The applicationRegistryClient.ListAsync call returns null rather than an empty list which tripped me up. I only found this when I deleted all the applications in my instance and started from scratch.

The Things Network Cayenne LPP Support

Uplink Encoding

In my applications the myDevices Cayenne Low Power Payload(LPP) uplink messages from my *duino devices are decoded by the built-in The Things Network(TTN) decoder. I can also see the nicely formatted values in the device data view.

Downlink Encoding

I could successfully download raw data to the device but I found that manually unpacking it on the device was painful.

Raw data

I really wanted to send LPP-formatted messages to my devices so I could use a standard LPP library. I initially populated the payload fields in the downlink message JSON. The TTN documentation appeared to indicate this was possible.

Download JSON payload format

Initially I tried a more complex data type because I was looking at downloading a location to the device.

Complex data type

I could see nicely formatted values in the device data view but they didn’t arrive at the device. I then tried a simpler data type to see if the complex data type was an issue.

Simple Data Types

At this point I asked a few questions on the TTN forums and started to dig into the TTN source code.

Learning Go on demand

I had a look at the TTN Go code and learnt a lot as I figured out how the “baked in” encoder/decoder worked. I haven’t done any Go coding so it took a while to get comfortable with the syntax. The code may look a bit odd as a Pascal formatter was the closest I could get to Go.

In core/handler/cayennelpp/encoder.go there was

func (e *Encoder) Encode(fields map[string]interface{}, fPort uint8) ([]byte, bool, error)

and

func (d *Decoder) Decode(payload []byte, fPort uint8) (map[string]interface{}, bool, error)

Which was a positive sign…

Then in core/handler/convert_fields.go there are these two functions

// ConvertFieldsUp converts the payload to fields using the application's payload formatter
func (h *handler) ConvertFieldsUp(ctx ttnlog.Interface, _ *pb_broker.DeduplicatedUplinkMessage, appUp *types.UplinkMessage, dev *device.Device) error {
	// Find Application

and

// ConvertFieldsDown converts the fields into a payload
func (h *handler) ConvertFieldsDown(ctx ttnlog.Interface, appDown *types.DownlinkMessage, ttnDown *pb_broker.DownlinkMessage, _ *device.Device) error {

Then further down in the second function is this call

var encoder PayloadEncoder
	switch app.PayloadFormat {
	case application.PayloadFormatCustom:
		encoder = &CustomDownlinkFunctions{
			Encoder: app.CustomEncoder,
			Logger:  functions.Ignore,
		}
	case application.PayloadFormatCayenneLPP:
		encoder = &cayennelpp.Encoder{}
	default:
		return nil
	}

Which I think calls

// Encode encodes the fields to CayenneLPP
func (e *Encoder) Encode(fields map[string]interface{}, fPort uint8) ([]byte, bool, error) {
	encoder := protocol.NewEncoder()
	for name, value := range fields {
		key, channel, err := parseName(name)
		if err != nil {
			continue
		}
		switch key {
		case valueKey:
			if val, ok := value.(float64); ok {
				encoder.AddPort(channel, float32(val))
			}
		}
	}
	return encoder.Bytes(), true, nil
}

Then right down at the very bottom of the call stack in keys.go

func parseName(name string) (string, uint8, error) {
	parts := strings.Split(name, "_")
	if len(parts) < 2 {
		return "", 0, errors.New("Invalid name")
	}
	key := strings.Join(parts[:len(parts)-1], "_")
	if key == "" {
		return "", 0, errors.New("Invalid key")
	}
	channel, err := strconv.Atoi(parts[len(parts)-1])
	if err != nil {
		return "", 0, err
	}
	if channel < 0 || channel > 255 {
		return "", 0, errors.New("Invalid range")
	}
	return key, uint8(channel), nil
}

At this point I started to hit the limits of my Go skills but with some trial and error I figured it out…

Executive Summary

The downlink payload values are sent as 2-byte signed values with a 100 multiplier (the float is multiplied by 100 and sent as a big-endian int16). The fields have to be named “value_X” where X is a byte value.

Dictionary<string, object> payloadFields = new Dictionary<string, object>();
payloadFields.Add("value_0", 0.0);
//00-00-00
payloadFields.Add("value_1", 1.0);
//01-00-64
payloadFields.Add("value_2", 2.0);
//02-00-C8
payloadFields.Add("value_3", 3.0);
//03-01-2C
payloadFields.Add("value_4", 4.0);
//04-01-90

payloadFields.Add("value_0", -0.0);
//00-00-00
payloadFields.Add("value_1", -1.0);
//01-FF-9C
payloadFields.Add("value_2", -2.0);
//02-FF-38
payloadFields.Add("value_3", -3.0);
//03-FE-D4
payloadFields.Add("value_4", -4.0);
//04-FE-70

I could see these arrive on my TinyCLR plus RAK811 device and could manually unpack them.
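A sketch of that manual unpacking (assuming the channel byte followed by the big-endian value, as in the examples above):

// e.g. payload 0x02 0xFF 0x38 is channel 2, value -2.0 (the value is sent as value * 100 in a big-endian int16)
byte channel = payload[0];
short raw = (short)((payload[1] << 8) | payload[2]);
float value = raw / 100.0f;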

The stream of bytes can be decoded on an Arduino using the Electronic Cats library (it needs a small modification) with code like this

byte data[] = {0xff,0x38} ; // bytes which represent -2 
float value = lpp.getValue( data, 2, 100, 1);
Serial.print("value:");
Serial.println(value);

It is possible to use the “baked in” Cayenne Encoder/Decoder to send payload fields to a device but I’m not certain this is quite what myDevices/TTN intended.

The Things Network HTTP Integration Part 13

Connection multiplexing

For the Proof of Concept(PoC) I had used a cache to store Azure IoT Hub connections to reduce the number of calls to the Device Provisioning Service(DPS).

Number of connections with no pooling

When stress testing with 1000s of devices my program hit the host connection limit, so I enabled Advanced Message Queuing Protocol(AMQP) connection pooling.

return DeviceClient.Create(result.AssignedHub,
                  authentication,
                  new ITransportSettings[]
                  {
                     new AmqpTransportSettings(TransportType.Amqp_Tcp_Only)
                     {
                        PrefetchCount = 0,
                        AmqpConnectionPoolSettings = new AmqpConnectionPoolSettings()
                        {
                           Pooling = true,
                        }
                     }
                  }
               );

My first attempt failed as I hadn’t configured “TransportType.Amqp_Tcp_Only”, which meant the AMQP implementation could fall back to other protocols which don’t support pooling.

Exception caused by not using TransportType.Amqp_Tcp_Only

I then deployed the updated code and ran my 1000 device stress test (note the different x axis scales)

Number of connections with pooling

This confirmed what I found in the Azure.AMQP source code

/// <summary>
/// The default size of the pool
/// </summary>
/// <remarks>
/// Allows up to 100,000 devices
/// </remarks>
private const uint DefaultPoolSize = 100;

The Things Network V2 MQTT Client

Another option I had been looking at for connecting an Azure IoT Hub and The Things Network(TTN) was a Message Queue Telemetry Transport(MQTT) integration.

To trial this approach I built a .Net Core console application which sent messages to and received messages from an application running on a GHI Electronics TinyCLR V2 Fezduino with a RAK Wireless Wisduino Evaluation Board(EVB).

The console application uses MQTTNet to connect to TTN. It subscribes to the TTN application device uplink topic (I did try subscribing to the uplink messages for all the devices in the application but this was too noisy), and the downlink message scheduled, sent and acknowledged topics. To send messages to the device I published them on the device downlink topic.

//string uplinktopic = $"{applicationId}/devices/+/up";
string uplinktopic = $"{applicationId}/devices/{deviceId}/up";
await mqttClient.SubscribeAsync(uplinktopic, MQTTnet.Protocol.MqttQualityOfServiceLevel.AtLeastOnce);

string downlinkAcktopic = $"{applicationId}/devices/{deviceId}/events/down/acks";
await mqttClient.SubscribeAsync(downlinkAcktopic, MQTTnet.Protocol.MqttQualityOfServiceLevel.AtLeastOnce);

string downlinkScheduledtopic = $"{applicationId}/devices/{deviceId}/events/down/scheduled";
await mqttClient.SubscribeAsync(downlinkScheduledtopic, MQTTnet.Protocol.MqttQualityOfServiceLevel.AtLeastOnce);

string downlinkSenttopic = $"{applicationId}/devices/{deviceId}/events/down/sent";
await mqttClient.SubscribeAsync(downlinkSenttopic, MQTTnet.Protocol.MqttQualityOfServiceLevel.AtLeastOnce);

string downlinktopic = $"{applicationId}/devices/{deviceId}/down";
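A sketch of publishing a downlink on that topic, roughly the shape the TTN V2 MQTT downlink expects (the port and payload_raw bytes are illustrative, payload_raw is Base64 encoded):

string downlinkPayload = "{\"port\":1,\"confirmed\":false,\"payload_raw\":\"AQIDBA==\"}";

await mqttClient.PublishAsync(new MqttApplicationMessageBuilder()
   .WithTopic(downlinktopic)
   .WithPayload(downlinkPayload)
   .WithAtLeastOnceQoS()
   .Build());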

I used the classes from one of my earlier blog posts to deserialise the uplink message payload so I could display a subset of the fields.

MQTTNet based .Net Core console client
Things Network Device Data view

In the TTN Device data tab I could see messages being sent to, and received from, the device.

Visual Studio 2019 Tiny CLR debugger Output

In the Visual Studio 2019 debugger output window I could see messages being sent and received by the Fezduino.

Malformed TTN downlink payload

I had some problems with the downlink messages silently failing as the TTN sample payload JSON was malformed and I had copied it without noticing.

I have a working TTN HTTP Integration (uplink messages only) but have been exploring alternatives using TTN MQTT and Azure IoT Hub AMQP clients.

The next step is to build an Azure IoT Hub client (using native AMQP) then join them together.