Random wanderings through Microsoft Azure esp. PaaS plumbing, the IoT bits, AI on Micro controllers, AI on Edge Devices, .NET nanoFramework, .NET Core on *nix and ML.NET+ONNX
For my HTTP Integration I need to reliably forward messages to an Azure IoT Hub or Azure IoT Central. This solution needs to be robust and not lose any messages even when portions of the system are unavailable because of failures or sudden spikes in inbound traffic.
[Route("[controller]")]
[ApiController]
public class Queued : ControllerBase
{
private readonly string storageConnectionString;
private readonly string queueName;
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public Queued( IConfiguration configuration)
{
this.storageConnectionString = configuration.GetSection("AzureStorageConnectionString").Value;
this.queueName = configuration.GetSection("UplinkQueueName").Value;
}
public string Index()
{
return "Queued move along nothing to see";
}
[HttpPost]
public async Task<IActionResult> Post([FromBody] PayloadV5 payload)
{
string payloadFieldsUnpacked = string.Empty;
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("QueuedController validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
try
{
QueueClient queueClient = new QueueClient(storageConnectionString, queueName);
await queueClient.CreateIfNotExistsAsync();
await queueClient.SendMessageAsync(Convert.ToBase64String(JsonSerializer.SerializeToUtf8Bytes(payload)));
}
catch( Exception ex)
{
log.Error("Unable to open/create queue or send message", ex);
return this.Problem("Unable to open queue (creating if it doesn't exist) or send message", statusCode:500, title:"Uplink payload not sent" );
}
return this.Ok();
}
}
An Azure Function with a Queue Trigger processes the messages and, for this test, pauses for 2 seconds (simulating a call to the Device Provisioning Service (DPS)). It keeps track of the number of concurrent processing threads and when the first message for each device was received since the program was started.
public static class UplinkMessageProcessor
{
static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
static ConcurrentDictionary<string, PayloadV5> DevicesSeen = new ConcurrentDictionary<string, PayloadV5>();
static int ConcurrentThreadCount = 0;
[FunctionName("UplinkMessageProcessor")]
public static void Run([QueueTrigger("%UplinkQueueName%", Connection = "AzureStorageConnectionString")] string myQueueItem, Microsoft.Azure.WebJobs.ExecutionContext executionContext)
{
Interlocked.Increment(ref ConcurrentThreadCount);
var logRepository = LogManager.GetRepository(Assembly.GetEntryAssembly());
XmlConfigurator.Configure(logRepository, new FileInfo(Path.Combine(executionContext.FunctionAppDirectory, "log4net.config")));
log.Info($"Uplink message function triggered: {myQueueItem}");
PayloadV5 payloadMessage = (PayloadV5)JsonSerializer.Deserialize(myQueueItem, typeof(PayloadV5));
PayloadV5 payload = (PayloadV5)DevicesSeen.GetOrAdd(payloadMessage.dev_id, payloadMessage);
log.Info($"Uplink message DevEui:{payload.dev_id} Threads:{ConcurrentThreadCount} First:{payload.metadata.time} Current:{payloadMessage.metadata.time} PayloadRaw:{payload.payload_raw}");
Thread.Sleep(2000);
Interlocked.Decrement(ref ConcurrentThreadCount);
}
}
To explore how this processing worked I sent 1000 uplink messages from my Seeeduino LoRaWAN devices; the messages were buffered in an Azure Storage queue.
Azure Storage Explorer: 1000 queued messages
Application Insights: 1000 events
I processed thousands of messages with the Azure Function, but every so often 10-20% of the logging messages wouldn't turn up in the logs. I'm using log4net and I think the most probable cause is that messages aren't flushed before the function finishes.
For development and testing being able to provision an individual device is really useful, though for Azure IoT Central it is not easy (especially with the deprecation of DPS-KeyGen). With an Azure IoT Hub device connection strings are available in the portal which is convenient but not terribly scalable.
Initially the enrollment group had no registration records so I ran my command-line application to generate group enrollment keys for one of my devices.
Device registration before running my command line application
Then I ran the command-line application with my scopeID, registrationID (LoRaWAN deviceEUI) and the device group enrollment key I had generated in the previous step.
Registering a device and sending a message to my Azure IoT Hub
After running the command line application the device was visible in the enrollment group registration records.
Device registration after running my command line application
Provisioning a device with an individual enrollment has a different workflow. I had to run my command-line application with the RegistrationID, ScopeID, and one of the symmetric keys from the DPS individual enrollment device configuration.
DPS Individual enrollment configuration
A major downside to an individual enrollment is that either the primary or the secondary symmetric key for the device has to be deployed on the device, which could be problematic if the device has no secure storage.
With a group enrollment only the registration ID and the derived symmetric key have to be deployed on the device which is more secure.
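The derived symmetric key is produced by signing the registration ID with the group enrollment key using HMAC-SHA256, which is the standard DPS group enrollment scheme. A minimal sketch of the computation (the class and method names are mine, not from my application):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class DpsKeyHelper
{
    // Derive a per-device symmetric key from the enrollment group key (Base64)
    // by signing the registration ID with HMAC-SHA256.
    public static string ComputeDerivedSymmetricKey(string enrollmentGroupKeyBase64, string registrationId)
    {
        using (var hmac = new HMACSHA256(Convert.FromBase64String(enrollmentGroupKeyBase64)))
        {
            byte[] hash = hmac.ComputeHash(Encoding.UTF8.GetBytes(registrationId));
            return Convert.ToBase64String(hash);
        }
    }
}
```

Only the registration ID and this derived key need to go on the device; the group key itself stays off the hardware.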
Registering a device and sending a message to my Azure IoT Hub
In Azure IoT Explorer I could see messages from both my group and individually enrolled devices arriving at my Azure IoT Hub.
After some initial issues I found DPS was quite reliable and surprisingly easy to configure. I did find the DPS ProvisioningDeviceClient.RegisterAsync method sometimes took several seconds to execute which may have some ramifications when my application is doing this on demand.
To connect to an Azure IoT Hub I copied the connection string from the portal.
Azure IoT Hub connection string components
Retrieving a connection string for a device connected to Azure IoT Central (without using the Device Provisioning Service (DPS)) is a bit more involved. There is a deprecated command-line application dps-keygen which calls the DPS with a device ID, device SAS key, and scope ID and returns a connection string.
Azure IoT Central Device Connection Information
Azure DPS-Keygen command-line
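For comparison, the device connection string copied from the portal (or returned by dps-keygen) is just three semicolon-separated parts; a minimal sketch of assembling one (names are mine, values are placeholders):

```csharp
public static class ConnectionStringHelper
{
    // Assemble an Azure IoT Hub device connection string from its components:
    // the hub hostname, the device ID, and the device's shared access key.
    public static string BuildDeviceConnectionString(string hostName, string deviceId, string sharedAccessKey)
    {
        return $"HostName={hostName};DeviceId={deviceId};SharedAccessKey={sharedAccessKey}";
    }
}
```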
Using Azure IoT Explorer I could see reformatted JSON messages from my client application.
Azure IoT Explorer displaying message payload
These two approaches are fine for testing but wouldn't scale well and would be painful to use if there were 1000s, 100s, or even 10s of devices.
Unpacking the payload_fields property was causing me some issues. I tried many different approaches but they all failed.
public class PayloadV4
{
public string app_id { get; set; }
public string dev_id { get; set; }
public string hardware_serial { get; set; }
public int port { get; set; }
public int counter { get; set; }
public bool is_retry { get; set; }
public string payload_raw { get; set; }
//public JsonObject payload_fields { get; set; }
//public JObject payload_fields { get; set; }
//public JToken payload_fields { get; set; }
//public JContainer payload_fields { get; set; }
//public dynamic payload_fields { get; set; }
public Object payload_fields { get; set; }
public MetadataV4 metadata { get; set; }
public string downlink_url { get; set; }
}
I tried using the excellent JsonSubTypes library to build a polymorphic converter, which failed.
...
public class PolymorphicJsonConverter : JsonConverter
{
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
JObject item = JObject.Load(reader);
var type = item["type"].Value<string>();
if (type == "PayloadFieldDigitalInput")
{
return item.ToObject<PayloadFieldDigitalInput>();
}
else if (type == "PayloadFieldDigitalOutput")
{
return item.ToObject<PayloadFieldDigitalOutput>();
}
else if (type == "PayloadFieldAnalogInput")
{
return item.ToObject<PayloadFieldAnalogInput>();
}
else if (type == "PayloadFieldAnalogOutput")
{
return item.ToObject<PayloadFieldAnalogOutput>();
}
else
{
return null;
}
}
...
}
It was about this point I figured that I was down a very deep rabbit hole and I should just embrace my “stupid”.
I realised I shouldn’t unpack the payload as the number of generated classes required and the complexity of other approaches was going to rapidly get out of hand. Using an Object and recursively traversing its contents with System.Text.Json looked like a viable approach.
public class GatewayV4
{
public string gtw_id { get; set; }
public ulong timestamp { get; set; }
public DateTime time { get; set; }
public int channel { get; set; }
public int rssi { get; set; }
public double snr { get; set; }
public int rf_chain { get; set; }
public double latitude { get; set; }
public double longitude { get; set; }
public int altitude { get; set; }
}
public class MetadataV4
{
public string time { get; set; }
public double frequency { get; set; }
public string modulation { get; set; }
public string data_rate { get; set; }
public string coding_rate { get; set; }
public List<GatewayV4> gateways { get; set; }
}
public class PayloadV4
{
public string app_id { get; set; }
public string dev_id { get; set; }
public string hardware_serial { get; set; }
public int port { get; set; }
public int counter { get; set; }
public bool is_retry { get; set; }
public string payload_raw { get; set; }
// finally settled on an Object
public Object payload_fields { get; set; }
public MetadataV4 metadata { get; set; }
public string downlink_url { get; set; }
}
So, I added yet another controller to my application to deserialise the body of the POST from the TTN Application Integration.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV4Fields : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV4 payload)
{
string payloadFieldsUnpacked = string.Empty;
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV4Fields validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
JsonElement jsonElement = (JsonElement)payload.payload_fields;
foreach (var property in jsonElement.EnumerateObject())
{
// Special handling for nested properties
if (property.Name.StartsWith("gps_") || property.Name.StartsWith("accelerometer_") || property.Name.StartsWith("gyrometer_"))
{
payloadFieldsUnpacked += $"Property Name:{property.Name}\r\n";
JsonElement gpsElement = (JsonElement)property.Value;
foreach (var gpsProperty in gpsElement.EnumerateObject())
{
payloadFieldsUnpacked += $" Property Name:{gpsProperty.Name} Property Value:{gpsProperty.Value}\r\n";
}
}
else
{
payloadFieldsUnpacked += $"Property Name:{property.Name} Property Value:{property.Value}\r\n";
}
}
log.Info(payloadFieldsUnpacked);
return this.Ok();
}
}
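The controller above only handles one level of nesting by prefix-matching property names; a fully recursive traversal of the payload_fields JsonElement can be sketched like this (the class and method names are mine, not from the original code):

```csharp
using System.Text;
using System.Text.Json;

public static class JsonElementWalker
{
    // Recursively flatten a JsonElement into "path:value" lines,
    // handling objects and arrays at any depth.
    public static void Walk(JsonElement element, string path, StringBuilder output)
    {
        switch (element.ValueKind)
        {
            case JsonValueKind.Object:
                foreach (JsonProperty property in element.EnumerateObject())
                {
                    Walk(property.Value, path.Length == 0 ? property.Name : $"{path}.{property.Name}", output);
                }
                break;
            case JsonValueKind.Array:
                int index = 0;
                foreach (JsonElement item in element.EnumerateArray())
                {
                    Walk(item, $"{path}[{index++}]", output);
                }
                break;
            default:
                output.AppendLine($"{path}:{element}");
                break;
        }
    }
}
```

This avoids having to enumerate the gps_/accelerometer_/gyrometer_ prefixes by hand and copes with any nesting the decoder produces.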
In the body of the events in Azure Application Insights I could see my messages, and the format looked fine for simple payloads.
In part 1 & part 2 I had been ignoring the payload_fields property of the Payload class. The documentation indicates that payload_fields property is populated when an uplink message is Decoded.
There is a built-in decoder for Low Power Payload (LPP) which looked like the simplest option to start with.
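For context, a Cayenne LPP payload is a sequence of (channel, type, value) records. A temperature record, for example, can be decoded in a few lines. This is a sketch based on the LPP V1 encoding, where type 0x67 is temperature in 0.1 °C units, big-endian signed; the class and method names are mine:

```csharp
using System;

public static class LppDecoder
{
    // Decode a Cayenne LPP temperature record: [channel, 0x67, valueHi, valueLo].
    // The value is a big-endian signed 16-bit integer in 0.1 degC units.
    public static double DecodeTemperature(byte[] payload, int offset)
    {
        if (payload[offset + 1] != 0x67)
        {
            throw new ArgumentException("Not an LPP temperature record");
        }
        short raw = (short)((payload[offset + 2] << 8) | payload[offset + 3]);
        return raw / 10.0;
    }
}
```

The TTN decoder does this unpacking server-side and delivers the result in payload_fields, which is what the generated classes below deserialise.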
I used JSON2Csharp to generate C# classes which would deserialise the above uplink message.
// Third version of classes for unpacking HTTP payload
public class Gps1V3
{
public int altitude { get; set; }
public double latitude { get; set; }
public double longitude { get; set; }
}
public class PayloadFieldsV3
{
public double analog_in_1 { get; set; }
public int digital_in_1 { get; set; }
public Gps1V3 gps_1 { get; set; }
public int luminosity_1 { get; set; }
public double temperature_1 { get; set; }
}
public class GatewayV3
{
public string gtw_id { get; set; }
public ulong timestamp { get; set; }
public DateTime time { get; set; }
public int channel { get; set; }
public int rssi { get; set; }
public double snr { get; set; }
public int rf_chain { get; set; }
public double latitude { get; set; }
public double longitude { get; set; }
public int altitude { get; set; }
}
public class MetadataV3
{
public string time { get; set; }
public double frequency { get; set; }
public string modulation { get; set; }
public string data_rate { get; set; }
public string coding_rate { get; set; }
public List<GatewayV3> gateways { get; set; }
}
public class PayloadV3
{
public string app_id { get; set; }
public string dev_id { get; set; }
public string hardware_serial { get; set; }
public int port { get; set; }
public int counter { get; set; }
public bool is_retry { get; set; }
public string payload_raw { get; set; }
public PayloadFieldsV3 payload_fields { get; set; }
public MetadataV3 metadata { get; set; }
public string downlink_url { get; set; }
}
I added a new controller to my application which used the generated classes to deserialise the body of the POST from the TTN Application Integration.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV3Fields : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV3 payload)
{
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV3Fields validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
log.Info($"DevEUI:{payload.hardware_serial} Payload Base64:{payload.payload_raw} analog_in_1:{payload.payload_fields.analog_in_1} digital_in_1:{payload.payload_fields.digital_in_1} gps_1:{payload.payload_fields.gps_1.latitude},{payload.payload_fields.gps_1.longitude},{payload.payload_fields.gps_1.altitude} luminosity_1:{payload.payload_fields.luminosity_1} temperature_1:{payload.payload_fields.temperature_1}");
return this.Ok();
}
}
I then updated the TTN application integration to send messages to my new endpoint. In the body of the Application Insights events I could see the devEUI, port, and the payload fields had been extracted from the message.
This arrangement was pretty nasty and sort of worked but in the “real world” would not have been viable. I would need to generate lots of custom classes for each application taking into account the channel numbers (e.g. analog_in_1, analog_in_2) and datatypes used.
I also explored which datatypes were supported by the TTN decoder; after some experimentation (Aug 2019) it looks like only the LPPV1 ones are:
AnalogInput
AnalogOutput
DigitalInput
DigitalOutput
GPS
Accelerometer
Gyrometer
Luminosity
Presence
BarometricPressure
RelativeHumidity
Temperature
What I need is a more flexible way to store and decode the payload_fields property.
I used JSON2Csharp and a payload I downloaded in Part 1 to generate C# classes which would deserialise my minimalist messages.
// First version of classes for unpacking HTTP payload https://json2csharp.com/
public class GatewayV1
{
public string gtw_id { get; set; }
public int timestamp { get; set; }
public DateTime time { get; set; }
public int channel { get; set; }
public int rssi { get; set; }
public double snr { get; set; }
public int rf_chain { get; set; }
public double latitude { get; set; }
public double longitude { get; set; }
public int altitude { get; set; }
}
public class MetadataV1
{
public string time { get; set; }
public double frequency { get; set; }
public string modulation { get; set; }
public string data_rate { get; set; }
public string coding_rate { get; set; }
public List<GatewayV1> gateways { get; set; }
}
public class PayloadV1
{
public string app_id { get; set; }
public string dev_id { get; set; }
public string hardware_serial { get; set; }
public int port { get; set; }
public int counter { get; set; }
public bool confirmed { get; set; }
public string payload_raw { get; set; }
public MetadataV1 metadata { get; set; }
public string downlink_url { get; set; }
}
I added a new controller to my application which used the generated classes to deserialise the body of the POST from the TTN Application Integration.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV1 : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV1 payload)
{
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV1 validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
log.Info($"DevEUI:{payload.hardware_serial} Payload Base64:{payload.payload_raw}");
return this.Ok();
}
}
I then updated the TTN application integration to send messages to my new endpoint.
TTN Application configuration overview
In the body of the Application Insights events I could see the devEUI, port, and the raw payload had been extracted from the message.
I then added another controller which decoded the Base64 encoded payload_raw.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV2Base64Decoded : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV2 payload)
{
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV2BCDDecoded validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
log.Info($"DevEUI:{payload.hardware_serial} Port:{payload.port} Payload:{ Encoding.UTF8.GetString(Convert.FromBase64String(payload.payload_raw))}");
return this.Ok();
}
}
Then after a while the deserialisation started to fail with an HTTP 400 Bad Request. When I ran the same request with Telerik Fiddler on my desktop the raw response was:
HTTP/1.1 400 Bad Request
Transfer-Encoding: chunked
Content-Type: application/problem+json; charset=utf-8
Server: Microsoft-IIS/10.0
Request-Context: appId=cid-v1:f4f72f2e-1144-4578-923f-d3ebdcfb7766
X-Powered-By: ASP.NET
Date: Mon, 31 Aug 2020 09:07:30 GMT
17a
{"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1",
"title":"One or more validation errors occurred.",
"status":400,
"traceId":"00-45033ec030b63d4ebb82b95b67cb8142-9fc52a18be202848-00",
"errors":{
"$.metadata.gateways[0].timestamp":["The JSON value could not be converted to System.Int32.
Path: $.metadata.gateways[0].timestamp | LineNumber: 21 | BytePositionInLine: 35."]}}
0
The line in the payload was the gateway timestamp. The value was 2,426,973,100, which is larger than 2,147,483,647, the maximum value that can be stored in a signed 32-bit integer. The JSON2CSharp generator had made a reasonable choice of datatype but in this case the range was not sufficient.
public class GatewayV2
{
public string gtw_id { get; set; }
public ulong timestamp { get; set; }
public DateTime time { get; set; }
public int channel { get; set; }
public int rssi { get; set; }
public double snr { get; set; }
public int rf_chain { get; set; }
public double latitude { get; set; }
public double longitude { get; set; }
public int altitude { get; set; }
}
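The failure and the fix can be reproduced in isolation with System.Text.Json (the class names here are mine; the value is the one from the error above):

```csharp
using System;
using System.Text.Json;

public class GatewayIntTimestamp { public int timestamp { get; set; } }
public class GatewayULongTimestamp { public ulong timestamp { get; set; } }

public static class TimestampOverflowDemo
{
    // System.Text.Json throws a JsonException ("The JSON value could not be
    // converted to System.Int32") when the number exceeds Int32.MaxValue.
    public static bool IntFails(string json)
    {
        try
        {
            JsonSerializer.Deserialize<GatewayIntTimestamp>(json);
            return false;
        }
        catch (JsonException)
        {
            return true;
        }
    }

    // Widening the property to ulong lets the same document deserialise cleanly.
    public static ulong ULongSucceeds(string json)
    {
        return JsonSerializer.Deserialize<GatewayULongTimestamp>(json).timestamp;
    }
}
```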
I checked the TTN code where the variable was declared as an unsigned 64-bit integer.
This issue could occur for other variables so I need to manually check all the generated classes.
This is the first in a series of posts about building an HTTP Integration for a The Things Network(TTN) application. I have assumed that readers are familiar with the configuration and operation of a TTN instance so I’m not going to cover that in detail.
#include <LoRaWan.h>
unsigned char data[] = {0x53, 0x65, 0x65, 0x65, 0x64, 0x75, 0x69, 0x6E, 0x6F, 0x20, 0x4C, 0x6F, 0x52, 0x61, 0x57, 0x41, 0x4E};
char buffer[256];
void setup(void)
{
SerialUSB.begin(9600);
while (!SerialUSB);
lora.init();
memset(buffer, 0, 256);
lora.getVersion(buffer, 256, 1);
SerialUSB.print("Ver:");
SerialUSB.print(buffer);
memset(buffer, 0, 256);
lora.getId(buffer, 256, 1);
SerialUSB.print("ID:");
SerialUSB.print(buffer);
lora.setKey(NULL, NULL, "12345678901234567890123456789012");
lora.setId(NULL, "1234567890123456", "1234567890123456");
lora.setPort(10);
lora.setDeciveMode(LWOTAA);
lora.setDataRate(DR0, AS923);
lora.setDutyCycle(false);
lora.setJoinDutyCycle(false);
lora.setPower(14);
while (!lora.setOTAAJoin(JOIN, 10))
{
SerialUSB.println("");
}
SerialUSB.println( "Joined");
}
void loop(void)
{
bool result = false;
//result = lora.transferPacket("Hello World!", 10);
result = lora.transferPacket(data, sizeof(data));
if (result)
{
short length;
short rssi;
memset(buffer, 0, 256);
length = lora.receivePacket(buffer, 256, &rssi);
if (length)
{
SerialUSB.print("Length is: ");
SerialUSB.println(length);
SerialUSB.print("RSSI is: ");
SerialUSB.println(rssi);
SerialUSB.print("Data is: ");
for (unsigned char i = 0; i < length; i ++)
{
SerialUSB.print("0x");
SerialUSB.print(buffer[i], HEX);
SerialUSB.print(" ");
}
SerialUSB.println();
}
}
delay( 10000);
}
The setKey and setId parameters are not obvious and it would be much easier if there were two methods, one for Over The Air Activation (OTAA) and the other for Activation By Personalisation (ABP). I then built a .NET Core 3.1 Web API application which had a single controller to receive messages from TTN.
[Route("[controller]")]
[ApiController]
public class Raw : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
[HttpGet]
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public void PostRaw([FromBody]JsonElement body)
{
string json = JsonSerializer.Serialize(body);
log.Info(json);
}
}
I then configured my TTN application integration to send messages to my shiny new endpoint.
TTN Application configuration overview
My controller logged events to Azure Application Insights so I could see if there were any errors and inspect message payloads. The TTN developers provide sample payloads to illustrate the message format but they were a bit chunky for my initial testing.
Application Insights event list
I could then inspect individual events and payloads
Application Insights event display
At this point I could download message payloads and save them locally.
These were useful because I could then use a tool like Telerik Fiddler to submit messages to my application when it was running locally in the Visual Studio 2019 debugger.
In a previous post I couldn't add a TTN V3EndDevice to an application using the REST API (I'm going to try again soon), so I figured I would try out the MQTT API. My aim was to get notifications when a Device was created/updated/deleted in an Application.
After some tinkering with the format of MQTT usernames and passwords I can connect to my V3 instance and successfully subscribe to topics. But, currently (Aug 2020) I'm not receiving any messages when I create, update or delete a Device. I have tried different Quality of Service (QoS) settings etc. and I wonder if my topic names aren't quite right.
.Net Core MQTT Client
I wanted notifications so I could “automagically” provision a device in an Azure IoT Hub (maybe with a tag indicating it’s an “orphan” so it is discoverable) or in Azure IoT Central when a Device was created in TTN.
This looked like a good approach as my Azure IoT Hub applications have other devices which are not connected via LoRaWAN, and there are many specialised LoRaWAN settings which would need to be validated, stored etc. by my software (maybe TTN device templates would make this easier). The TTN software is pretty good at managing devices, so why would I “re-invent the wheel”?
I built a “nasty” console application using MQTTNet so that I could figure out how to connect to my V3 setup and subscribe to topics.
namespace devMobile.TheThingsNetwork.MqttClient
{
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;
using MQTTnet;
using MQTTnet.Client;
using MQTTnet.Client.Disconnecting;
using MQTTnet.Client.Options;
using MQTTnet.Client.Receiving;
using MQTTnet.Client.Subscribing;
class Program
{
private static IMqttClient mqttClient = null;
private static IMqttClientOptions mqttOptions = null;
private static string server;
private static string username;
private static string password;
private static string clientId;
static async Task Main(string[] args)
{
MqttFactory factory = new MqttFactory();
mqttClient = factory.CreateMqttClient();
if (args.Length != 4)
{
Console.WriteLine("[MQTT Server] [UserName] [Password] [ClientID]");
Console.WriteLine("Press <enter> to exit");
Console.ReadLine();
return;
}
server = args[0];
username = args[1];
password = args[2];
clientId = args[3];
mqttOptions = new MqttClientOptionsBuilder()
.WithTcpServer(server)
.WithCredentials(username, password)
.WithClientId(clientId)
.WithTls()
.Build();
mqttClient.UseDisconnectedHandler(new MqttClientDisconnectedHandlerDelegate(e => MqttClient_Disconnected(e)));
mqttClient.UseApplicationMessageReceivedHandler(new MqttApplicationMessageReceivedHandlerDelegate(e => MqttClient_ApplicationMessageReceived(e)));
await mqttClient.ConnectAsync(mqttOptions);
// Different topics I have tried
string topic;
topic = $"v3/{username}/devices/{clientId}/events/update";
//topic = $"v3/{username}/devices/{clientId}/events/create";
//topic = $"v3/{username}/devices/{clientId}/events/delete";
//topic = $"v3/{username}/devices/+/events/+";
//topic = $"v3/{username}/devices/+/events/create";
//topic = $"v3/{username}/devices/+/events/update";
//topic = $"v3/{username}/devices/+/events/delete";
//topic = $"v3/{username}/devices/+/events/+";
MqttClientSubscribeResult result;
// Different QoS I have tried
//result = await mqttClient.SubscribeAsync(topic, MQTTnet.Protocol.MqttQualityOfServiceLevel.AtMostOnce);
result = await mqttClient.SubscribeAsync(topic, MQTTnet.Protocol.MqttQualityOfServiceLevel.AtLeastOnce);
//result = await mqttClient.SubscribeAsync(topic, MQTTnet.Protocol.MqttQualityOfServiceLevel.ExactlyOnce);
Console.WriteLine("SubscribeAsync Result");
foreach ( var resultItem in result.Items)
{
Console.WriteLine($"ResultCode:{resultItem.ResultCode} TopicFilter:{resultItem.TopicFilter}");
}
Console.WriteLine("Press any key to terminate wait");
while (!Console.KeyAvailable)
{
Console.Write(".");
Thread.Sleep(30100);
}
Console.WriteLine("Press <enter> to exit");
Console.ReadLine();
return;
}
private static void MqttClient_ApplicationMessageReceived(MqttApplicationMessageReceivedEventArgs e)
{
Console.WriteLine($"ClientId:{e.ClientId} Topic:{e.ApplicationMessage.Topic} Payload:{e.ApplicationMessage.ConvertPayloadToString()}");
}
private static async void MqttClient_Disconnected(MqttClientDisconnectedEventArgs e)
{
Debug.WriteLine("Disconnected");
await Task.Delay(TimeSpan.FromSeconds(5));
try
{
await mqttClient.ConnectAsync(mqttOptions);
}
catch (Exception ex)
{
Debug.WriteLine("Reconnect failed {0}", ex.Message);
}
}
}
}
I’m going to post some questions on the TTN forums and Slack community to see if what I’m trying to do is supported/possible.
I got some helpful responses on the TTN forums and it looks like what I want to do is not supported by the V3 stack (Aug 2020) and I will have to use gRPC.
I then used nSwagStudio to generate a C# client from a local copy of the API swagger (in the future I will download the swagger and use the command-line tools).
nSwag User Interface
At this point I had a basic client for the TTN network stack API which lacked support for the TTN security model etc. After looking at the TTN API documentation I figured out I need to add a header which contained an API Key from the TTN application configuration.
namespace TheThingsNetwork.API
{
public partial class EndDeviceRegistryClient
{
public string ApiKey { set; get; }
partial void PrepareRequest(System.Net.Http.HttpClient client, System.Net.Http.HttpRequestMessage request, string url)
{
if (!client.DefaultRequestHeaders.Contains("Authorization"))
{
client.DefaultRequestHeaders.Add("Authorization", $"Bearer {ApiKey}");
}
}
}
}
In the TTN console on the overview page for my application I created an Access Key.
I then added some attributes to one of my devices so I had some additional device configuration data to display (I figured these could be useful for Azure IoT Hub configuration parameters etc., more about this later).
Basic Device configuration in TTN Enterprise
I built a nasty console application which displayed some basic device configuration information to confirm I could authenticate and enumerate.
//---------------------------------------------------------------------------------
// Copyright (c) August 2020, devMobile Software
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
//
// SECURITY_ANONYMISE
//---------------------------------------------------------------------------------
namespace TheThingsNetwork.EndDeviceClient
{
using System;
using System.Collections.Generic;
using System.Net.Http;
using TheThingsNetwork.API;
class Program
{
static void Main(string[] args)
{
Console.WriteLine("TheThingsNetwork.EndDeviceClient starting");
if (args.Length != 3)
{
Console.WriteLine("EndDeviceClient <baseURL> <applicationId> <apiKey>");
Console.WriteLine("Press <enter> to exit");
Console.ReadLine();
return;
}
string baseUrl = args[0];
#if !SECURITY_ANONYMISE
Console.WriteLine($"baseURL: {baseUrl}");
#endif
string applicationId = args[1];
#if !SECURITY_ANONYMISE
Console.WriteLine($"applicationId: {applicationId}");
#endif
string apiKey = args[2];
#if !SECURITY_ANONYMISE
Console.WriteLine($"apiKey: {apiKey}");
Console.WriteLine();
#endif
using (HttpClient httpClient = new HttpClient())
{
EndDeviceRegistryClient endDeviceRegistryClient = new EndDeviceRegistryClient(baseUrl, httpClient);
endDeviceRegistryClient.ApiKey = apiKey;
try
{
V3EndDevices endDevices = endDeviceRegistryClient.ListAsync(applicationId).GetAwaiter().GetResult();
foreach (V3EndDevice v3EndDevice in endDevices.End_devices)
{
#if SECURITY_ANONYMISE
// DevEUI is 8 bytes - zero the low 5 bytes so the full EUI isn't revealed
v3EndDevice.Ids.Dev_eui[3] = 0x0;
v3EndDevice.Ids.Dev_eui[4] = 0x0;
v3EndDevice.Ids.Dev_eui[5] = 0x0;
v3EndDevice.Ids.Dev_eui[6] = 0x0;
v3EndDevice.Ids.Dev_eui[7] = 0x0;
#endif
Console.WriteLine($"Device ID:{v3EndDevice.Ids.Device_id} DevEUI:{Convert.ToBase64String(v3EndDevice.Ids.Dev_eui)}");
Console.WriteLine($" CreatedAt: {v3EndDevice.Created_at:dd-MM-yy HH:mm:ss} UpdatedAt: {v3EndDevice.Updated_at:dd-MM-yy HH:mm:ss}");
string[] fieldMaskPaths = { "name", "description", "attributes" };
var endDevice = endDeviceRegistryClient.GetAsync(applicationId, v3EndDevice.Ids.Device_id, field_mask_paths: fieldMaskPaths).GetAwaiter().GetResult();
Console.WriteLine($" Name: {endDevice.Name}");
Console.WriteLine($" Description: {endDevice.Description}");
if (endDevice.Attributes != null)
{
foreach (KeyValuePair<string, string> attribute in endDevice.Attributes)
{
Console.WriteLine($" Key: {attribute.Key} Name: {attribute.Value}");
}
}
Console.WriteLine();
}
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
Console.WriteLine("Press <enter> to exit");
Console.ReadLine();
}
}
}
}
I added some code to anonymise the displayed configuration so I could take screen grabs without revealing any sensitive information.
TTN API Client V1
Initially I struggled with versioning issues as the TTN community network is running V2 and the GitHub repository was for V3. I approached TTN and they gave me access to a “limited” account on the enterprise network.
I also struggled with the number of blank fields in responses and spent some time learning GO (the programming language TTN is built with) to figure out how to use fieldMaskPaths etc.
Application Insights logging with message unpacking
Application Insights logging message payload
The last log entry contains the decoded message payload.
/*
Copyright ® 2020 Feb devMobile Software, All Rights Reserved
MIT License
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE
Default URL for triggering event grid function in the local environment.
http://localhost:7071/runtime/webhooks/EventGrid?functionName=functionname
*/
namespace EventGridProcessorAzureIotHub
{
using System;
using System.IO;
using System.Reflection;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using log4net;
using log4net.Config;
using Newtonsoft.Json;
public static class Telemetry
{
[FunctionName("Telemetry")]
public static void Run([EventGridTrigger]Microsoft.Azure.EventGrid.Models.EventGridEvent eventGridEvent, ExecutionContext executionContext )//, TelemetryClient telemetryClient)
{
ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
var logRepository = LogManager.GetRepository(Assembly.GetEntryAssembly());
XmlConfigurator.Configure(logRepository, new FileInfo(Path.Combine(executionContext.FunctionAppDirectory, "log4net.config")));
log.Info($"eventGridEvent.Data-{eventGridEvent}");
log.Info($"eventGridEvent.Data.ToString()-{eventGridEvent.Data.ToString()}");
IotHubDeviceTelemetryEventData iOThubDeviceTelemetryEventData = (IotHubDeviceTelemetryEventData)JsonConvert.DeserializeObject(eventGridEvent.Data.ToString(), typeof(IotHubDeviceTelemetryEventData));
log.Info($"iOThubDeviceTelemetryEventData.Body.ToString()-{iOThubDeviceTelemetryEventData.Body.ToString()}");
byte[] base64EncodedBytes = System.Convert.FromBase64String(iOThubDeviceTelemetryEventData.Body.ToString());
log.Info($"System.Text.Encoding.UTF8.GetString(-{System.Text.Encoding.UTF8.GetString(base64EncodedBytes)}");
}
}
}
Overall it took roughly half a page of code (mainly generated by a tool) to unpack and log the contents of an Azure IoT Hub EventGrid payload to Application Insights.