Earlier in the year I built The Things Network (TTN) V2 and V3 connectors, and after using these in production applications I have learnt a lot about what I got wrong, what I got less wrong, and what I got right.
Using a TTN V3 MQTT Application integration wasn't a great idea. The management of state was very complex. Storing application keys in an app.settings file made configuration easy but was bad for security.
The use of Azure Key Vault in the TTN V2 connector was a good approach, but the process of creating and updating the settings needs to be easier.
Using the TTN device registry as the "single source of truth" was a good decision, as managing the amount of LoRaWAN network, application and device specific configuration in an Azure IoT Hub would be non-trivial.
The first step was to configure an Azure IoT Central enrollment group (ensure "Automatically connect devices in this group" is on) and copy the ID Scope and Group Enrollment key to the appsettings.json file (see the sample file below for more detail).
At startup the TTI Gateway enumerates the devices in each application configured in the appsettings.json. The Azure Device Provisioning Service (DPS) is used to retrieve each device's connection string and configure it in Azure IoT Central if required.
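A cut-down appsettings.json shows the shape of the configuration. The section and property names below are illustrative, not the connector's exact schema:

{
   "DeviceProvisioningService": {
      "IdScope": "0ne00000000",
      "GroupEnrollmentKey": "TopSecretGroupEnrollmentKey="
   },
   "Applications": {
      "myapplication1": {
         "DtdlModelId": "dtmi:mycompany:mysensornode;1"
      }
   }
}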
Azure IoT Central Device Group with no provisioned Devices
TTI Connector application connecting and provisioning EndDevices
Azure IoT Central devices mapped to an Azure IoT Central Template via the modelID
using (var transport = new ProvisioningTransportHandlerAmqp(TransportFallbackType.TcpOnly))
{
   ProvisioningDeviceClient provClient = ProvisioningDeviceClient.Create(
      Constants.AzureDpsGlobalDeviceEndpoint,
      deviceProvisiongServiceSettings.IdScope,
      securityProvider,
      transport);

   DeviceRegistrationResult result;

   if (!string.IsNullOrEmpty(modelId))
   {
      // Specify the model ID so Azure IoT Central can associate the device with a template
      ProvisioningRegistrationAdditionalData provisioningRegistrationAdditionalData = new ProvisioningRegistrationAdditionalData()
      {
         JsonData = $"{{\"modelId\": \"{modelId}\"}}"
      };

      result = await provClient.RegisterAsync(provisioningRegistrationAdditionalData, stoppingToken);
   }
   else
   {
      result = await provClient.RegisterAsync(stoppingToken);
   }

   if (result.Status != ProvisioningRegistrationStatusType.Assigned)
   {
      _logger.LogError("Config-DeviceID:{0} Status:{1} RegisterAsync failed ", deviceId, result.Status);

      return false;
   }

   IAuthenticationMethod authentication = new DeviceAuthenticationWithRegistrySymmetricKey(result.DeviceId, (securityProvider as SecurityProviderSymmetricKey).GetPrimaryKey());

   deviceClient = DeviceClient.Create(result.AssignedHub, authentication, transportSettings);
}
My implementation was "inspired" by the TemperatureController project in the PnP Device Samples.
Azure IoT Central Dashboard with Seeeduino LoRaWAN devices around my house that were “automagically” provisioned
I need to do some testing to confirm my code works reliably with both DPS and user-provided connection strings. The RegisterAsync call currently takes about four seconds, which could be an issue for TTI applications with many devices.
Device Twin Explorer displaying telemetry from one of the Seeeduino devices
My integration uses only queued messages, as often they won't be delivered to the sensor node immediately, especially if the sensor node only sends an uplink message every 30 minutes, hour, or day.
The confirmed flag should be used with care, as the Azure IoT Hub messages may expire before a delivery Ack/Nack/Failed is received from TTI.
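For context, the TTI downlink API (whether used via MQTT or a webhook) takes a JSON message like the one below; the confirmed flag is what triggers the delivery Ack/Nack/Failed messages. The values shown are illustrative:

{
   "downlinks": [
      {
         "f_port": 15,
         "frm_payload": "vu8=",
         "priority": "NORMAL",
         "confirmed": true
      }
   ]
}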
PowerBI graph of temperature and humidity in my garage over 24 hours
Device explorer displaying a raw payload message which has been confirmed delivered
TTI device live data tab displaying raw payload in downlink message information tab
Azure IoT Connector console application sending raw payload to sensor node with confirmation ack
Arduino monitor displaying received raw payload from TTI
If the Azure IoT Hub message payload is valid JSON it is copied into the payload decoded downlink message property, and if it is not valid JSON it is assumed to be a Base64 encoded value and copied into the payload raw downlink message property.
try
{
   // Split over multiple lines in an attempt to improve readability. A valid JSON string should start/end with {/} for an object or [/] for an array
   if (!(payloadText.StartsWith("{") && payloadText.EndsWith("}"))
      &&
      (!(payloadText.StartsWith("[") && payloadText.EndsWith("]"))))
   {
      throw new JsonReaderException();
   }

   downlink.PayloadDecoded = JToken.Parse(payloadText);
}
catch (JsonReaderException)
{
   downlink.PayloadRaw = payloadText;
}
Like the Azure IoT Central JSON validation, I had to add a check that the string started with a "{" and finished with a "}" (a JSON object) or started with a "[" and finished with a "]" (a JSON array) as part of the validation process.
Device explorer displaying a JSON payload message which has been confirmed delivered
I normally wouldn't use exceptions for flow control, but I can't see a better way of doing this.
TTI device live data tab displaying JSON payload in downlink message information tab
Azure IoT Connector console application sending JSON payload to sensor node with confirmation ack
Arduino monitor displaying received JSON payload from TTI
The built-in TTI decoder only supports downlink decoded payloads with property names "value_0" through "value_x"; custom encoders may support other property names.
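So a downlink decoded payload for the built-in decoder looks something like this (values illustrative):

{
   "value_0": 0,
   "value_1": 30
}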
The first step was to display the temperature and barometric pressure values from the Seeedstudio Grove BMP180 attached to my sensor node.
Sensor node displaying temperature and barometric pressure values
Azure IoT Central temperature and barometric pressure telemetry configuration
Azure IoT Central Telemetry Dashboard displaying temperature and barometric pressure values
The next step was to configure a simple Azure IoT Central command to send to the sensor node. This was a queued request with no payload. An example of this sort of command would be a request for a sensor node to reboot or turn on an actuator.
My integration uses only offline queued commands, as often messages won't be delivered to the sensor node immediately, especially if the sensor node only sends a message every half hour, hour, or day. The confirmed flag should be used with care, as the Azure IoT Hub messages may expire before a delivery Ack/Nack/Failed is received from TTI, and confirmation consumes downlink bandwidth.
if (message.Properties.ContainsKey("method-name"))
{
   // The method-name property identifies which Azure IoT Central command initiated
   // the downlink message (command-specific processing elided here)
}
Azure IoT Central configuration for a command without a request payload
To send a downlink message, TTI needs a LoRaWAN port number (plus optional queue, confirmed and priority values), which can't be provided via the Azure IoT Central command setup, so these values are configured in the app.settings file.
Each TTI application has zero or more Azure IoT Central command configurations which supply the port, confirmed, priority and queue settings.
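A sketch of what a per-command configuration in the app.settings file could look like; the property names here are illustrative assumptions, not the connector's exact schema:

"MethodSettings": {
   "Reboot": {
      "Port": 21,
      "Confirmed": true,
      "Priority": "normal",
      "Queue": "push"
   },
   "value_0": {
      "Port": 21,
      "Confirmed": true,
      "Priority": "normal",
      "Queue": "push"
   }
}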
Azure IoT Central simple command dashboard
Azure IoT Central simple command initiation
Azure IoT TTI connector application sending a simple command to my sensor node
Sensor node displaying simple command information. Note the message payload is empty
The next step was to configure a more complex Azure IoT Central command to send to the sensor node. This was a queued request with a single value payload. An example of this sort of command could be setting the speed of a fan or the maximum temperature of a freezer for an out of band (OOB) notification to be sent.
Azure IoT Central single value command configuration
The value_0 settings are for the minimum temperature; the value_1 settings are for the maximum temperature.
Azure IoT Central single value command initiation
Azure IoT TTI connector application sending a single value command to my sensor node
Sensor node displaying single value command information. There are two downlink messages and each payload contains a single value
The single value command payload contains the textual representation of the value, e.g. "true"/"false" or "1.23", which are also valid JSON. This initially caused issues as I was trying to splice a single value into the decoded payload.
As with the raw payload handling, I had to add a check that the string started with a "{" and finished with a "}" (a JSON object) or started with a "[" and finished with a "]" (a JSON array) as part of the validation process.
For a single value command, the payload decoded has a single property with the method-name value as the name and the payload as the value. For a command with a JSON payload, the message payload is copied into the PayloadDecoded.
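For example (values illustrative), a single value command with the method-name value_0 and the payload "-25" produces the decoded payload:

{
   "value_0": -25
}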
Again, I normally wouldn't use exceptions for flow control, but I can't see a better way of doing this.
try
{
   // A valid JSON string should start/end with {/} for an object or [/] for an array
   if (!(payloadText.StartsWith("{") && payloadText.EndsWith("}"))
      &&
      (!(payloadText.StartsWith("[") && payloadText.EndsWith("]"))))
   {
      throw new JsonReaderException();
   }

   downlink.PayloadDecoded = JToken.Parse(payloadText);
}
catch (JsonReaderException)
{
   try
   {
      // A single value payload e.g. "1.23" is valid JSON; wrap it in an object
      // using the method-name as the property name
      JToken value = JToken.Parse(payloadText);

      downlink.PayloadDecoded = new JObject(new JProperty(methodName, value));
   }
   catch (JsonReaderException)
   {
      // Otherwise treat the payload as a plain string value
      downlink.PayloadDecoded = new JObject(new JProperty(methodName, payloadText));
   }
}
The final step was to configure another Azure IoT Central command, this time with a JSON payload. A "real-world" example of this sort of command would be setting the minimum and maximum temperatures of a freezer in a single downlink message.
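The request payload for this command might look like the following; the property names are constrained by the built-in decoder and the values are illustrative:

{
   "value_0": -25,
   "value_1": -15
}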
Azure IoT Central JSON payload command setup
Azure IoT Central JSON payload command payload configuration
Azure IoT TTI connector application sending a JSON payload command to my sensor node
Sensor node displaying JSON command information. There is a single payload which contains two values
The built-in TTI decoder only supports downlink decoded payloads with property names "value_0" through "value_x", which results in some odd command names and JSON payload property names (custom encoders may support other property names). Case sensitivity of some configuration values also tripped me up.