For more complex deployments the ApplicationEnrollmentGroupMapping configuration enables The Things Network (TTN) devices to be provisioned with different GroupEnrollment keys, based on the application ID in the first uplink message, which initiates provisioning.
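The configuration is bound to a settings class; a minimal sketch of its likely shape (property names taken from how the configuration is consumed by the message processor below; the Dictionary type is an assumption) might look like this:
public class DeviceProvisioningServiceSettings
{
   public string GlobalDeviceEndpoint { get; set; }
   public string ScopeID { get; set; }
   public string EnrollmentGroupSymmetricKeyDefault { get; set; }
   public int DeviceProvisioningPollingDelay { get; set; }

   // Maps a TTN application ID to the symmetric key of the GroupEnrollment to use
   public Dictionary<string, string> ApplicationEnrollmentGroupMapping { get; set; }
}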
Then, as uplink messages from the TTN integration are processed, devices are “automagically” created in the DPS and, at the same time, in the Azure IoT Hub. Shortly afterwards, telemetry events are available for applications to process, or to inspect with tools like Azure IoT Explorer.
The telemetry event payload sent to the Azure IoT Hub includes some extra fields to help with debugging and tracing. The raw payload is also included so messages not decoded by TTN can be processed by the client application(s).
// Assemble the JSON payload to send to Azure IoT Hub/Central.
log.LogInformation($"{messagePrefix} Payload assembly start");
JObject telemetryEvent = new JObject();
try
{
   JObject payloadFields = (JObject)payloadObject.payload_fields;

   telemetryEvent.Add("HardwareSerial", payloadObject.hardware_serial);
   telemetryEvent.Add("Retry", payloadObject.is_retry);
   telemetryEvent.Add("Counter", payloadObject.counter);
   telemetryEvent.Add("DeviceID", payloadObject.dev_id);
   telemetryEvent.Add("ApplicationID", payloadObject.app_id);
   telemetryEvent.Add("Port", payloadObject.port);
   telemetryEvent.Add("PayloadRaw", payloadObject.payload_raw);
   telemetryEvent.Add("ReceivedAtUTC", payloadObject.metadata.time);

   // If the payload has been unpacked in the TTN backend, add the fields to the telemetry event payload
   if (payloadFields != null)
   {
      foreach (JProperty child in payloadFields.Children())
      {
         EnumerateChildren(telemetryEvent, child);
      }
   }
}
catch (Exception ex)
{
   log.LogError(ex, $"{messagePrefix} Payload processing or telemetry event assembly failed");
   throw;
}
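The EnumerateChildren helper isn't shown above. A minimal sketch of one possible implementation (my reconstruction, assuming nested payload fields like gps_1 should become child objects in the telemetry event):
static void EnumerateChildren(JObject jobject, JToken token)
{
   if (token is JProperty property)
   {
      if (property.Value is JObject nested)
      {
         // Nested payload fields (e.g. gps_1) become a child JObject
         JObject child = new JObject();
         foreach (JToken grandChild in nested.Children())
         {
            EnumerateChildren(child, grandChild);
         }
         jobject.Add(property.Name, child);
      }
      else
      {
         // Simple fields are added as-is
         jobject.Add(property.Name, property.Value);
      }
   }
}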
Beware, Azure Storage Account and storage queue names have a restricted character set; queue names must be all lowercase, which caused me problems several times when I used camel-cased queue names etc.
In the last couple of posts I had been building an Azure Function with a QueueTrigger to process the uplink messages. The function used custom bindings so that the CloudQueueMessage could be accessed, and so that the Azure Storage account plus queue name could be loaded from configuration. I’m still using classes generated by JSON2CSharp (with minimal modifications) for deserialising the payloads with JSON.Net.
The message processor Azure Function uses a ConcurrentDictionary to store DeviceClient objects constructed with the information returned by the Azure Device Provisioning Service (DPS). This is so the DPS doesn’t have to be called for the connection details for every message. (When the Azure Function is restarted the dictionary of DeviceClient objects has to be repopulated.) If there is a backlog of messages the message processor can process more than a dozen messages concurrently, so the telemetry events displayed in an application like Azure IoT Central can arrive out of order.
The solution uses DPS Group Enrollment with Symmetric Key Attestation so Azure IoT Hub devices can be “automagically” created when a message from a new device is processed. The processing code is multi-threaded and relies on many error conditions being handled by the Azure Function retry mechanism. After a number of failed retries the messages are moved to a poison queue. Azure Storage Explorer is a good tool for viewing payloads and moving poison messages back to the processing queue.
public static class UplinkMessageProcessor
{
   static readonly ConcurrentDictionary<string, DeviceClient> DeviceClients = new ConcurrentDictionary<string, DeviceClient>();

   [FunctionName("UplinkMessageProcessor")]
   public static async Task Run(
      [QueueTrigger("%UplinkQueueName%", Connection = "AzureStorageConnectionString")]
      CloudQueueMessage cloudQueueMessage, // Used to get CloudQueueMessage.Id for logging
      Microsoft.Azure.WebJobs.ExecutionContext context,
      ILogger log)
   {
      PayloadV5 payloadObject;
      DeviceClient deviceClient = null;
      DeviceProvisioningServiceSettings deviceProvisioningServiceConfig;

      string environmentName = Environment.GetEnvironmentVariable("ENVIRONMENT");

      // Load configuration for DPS. Refactor approach and store securely...
      var configuration = new ConfigurationBuilder()
         .SetBasePath(context.FunctionAppDirectory)
         .AddJsonFile($"appsettings.json")
         .AddJsonFile($"appsettings.{environmentName}.json")
         .AddEnvironmentVariables()
         .Build();

      try
      {
         deviceProvisioningServiceConfig = configuration.GetSection("DeviceProvisioningService").Get<DeviceProvisioningServiceSettings>();
      }
      catch (Exception ex)
      {
         log.LogError(ex, $"Configuration loading failed");
         throw;
      }

      // Deserialise uplink message from Azure storage queue
      try
      {
         payloadObject = JsonConvert.DeserializeObject<PayloadV5>(cloudQueueMessage.AsString);
      }
      catch (Exception ex)
      {
         log.LogError(ex, $"MessageID:{cloudQueueMessage.Id} uplink message deserialisation failed");
         throw;
      }

      // Extract the device ID as it's used lots of places
      string registrationID = payloadObject.hardware_serial;

      // Construct the prefix used in all the logging
      string messagePrefix = $"MessageID: {cloudQueueMessage.Id} DeviceID:{registrationID} Counter:{payloadObject.counter} Application ID:{payloadObject.app_id}";
      log.LogInformation($"{messagePrefix} Uplink message device processing start");

      // See if the device has already been provisioned
      if (DeviceClients.TryAdd(registrationID, deviceClient))
      {
         log.LogInformation($"{messagePrefix} Device provisioning start");

         string enrollmentGroupSymmetricKey = deviceProvisioningServiceConfig.EnrollmentGroupSymmetricKeyDefault;

         // Figure out if there is a custom mapping for the TTN applicationID
         if (deviceProvisioningServiceConfig.ApplicationEnrollmentGroupMapping != null)
         {
            enrollmentGroupSymmetricKey = deviceProvisioningServiceConfig.ApplicationEnrollmentGroupMapping.GetValueOrDefault(payloadObject.app_id, deviceProvisioningServiceConfig.EnrollmentGroupSymmetricKeyDefault);
         }

         // Do DPS magic first time device seen
         await DeviceRegistration(log, messagePrefix, deviceProvisioningServiceConfig.GlobalDeviceEndpoint, deviceProvisioningServiceConfig.ScopeID, enrollmentGroupSymmetricKey, registrationID);
      }

      // Wait for the Device Provisioning Service to complete on this or other thread
      log.LogInformation($"{messagePrefix} Device provisioning polling start");
      if (!DeviceClients.TryGetValue(registrationID, out deviceClient))
      {
         log.LogError($"{messagePrefix} Device provisioning polling TryGet before while failed");
         throw new ApplicationException($"{messagePrefix} Device provisioning polling TryGet before while failed");
      }

      while (deviceClient == null)
      {
         log.LogInformation($"{messagePrefix} provisioning polling delay");
         await Task.Delay(deviceProvisioningServiceConfig.DeviceProvisioningPollingDelay);

         if (!DeviceClients.TryGetValue(registrationID, out deviceClient))
         {
            log.LogError($"{messagePrefix} Device provisioning polling TryGet while loop failed");
            throw new ApplicationException($"{messagePrefix} Device provisioning polling TryGet while loop failed");
         }
      }

      // Assemble the JSON payload to send to Azure IoT Hub/Central.
      log.LogInformation($"{messagePrefix} Payload assembly start");
      JObject telemetryEvent = new JObject();
      try
      {
         JObject payloadFields = (JObject)payloadObject.payload_fields;

         telemetryEvent.Add("HardwareSerial", payloadObject.hardware_serial);
         telemetryEvent.Add("Retry", payloadObject.is_retry);
         telemetryEvent.Add("Counter", payloadObject.counter);
         telemetryEvent.Add("DeviceID", payloadObject.dev_id);
         telemetryEvent.Add("ApplicationID", payloadObject.app_id);
         telemetryEvent.Add("Port", payloadObject.port);
         telemetryEvent.Add("PayloadRaw", payloadObject.payload_raw);
         telemetryEvent.Add("ReceivedAtUTC", payloadObject.metadata.time);

         // If the payload has been unpacked in the TTN backend, add the fields to the telemetry event payload
         if (payloadFields != null)
         {
            foreach (JProperty child in payloadFields.Children())
            {
               EnumerateChildren(telemetryEvent, child);
            }
         }
      }
      catch (Exception ex)
      {
         if (DeviceClients.TryRemove(registrationID, out deviceClient))
         {
            log.LogWarning($"{messagePrefix} TryRemove payload assembly failed");
         }
         log.LogError(ex, $"{messagePrefix} Payload assembly failed");
         throw;
      }

      // Send the message to Azure IoT Hub/Azure IoT Central
      log.LogInformation($"{messagePrefix} Payload SendEventAsync start");
      try
      {
         using (Message ioTHubmessage = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(telemetryEvent))))
         {
            // Ensure the displayed time is the acquired time rather than the uploaded time. Esp. important for messages that end up in the poison queue
            ioTHubmessage.Properties.Add("iothub-creation-time-utc", payloadObject.metadata.time.ToString("s", CultureInfo.InvariantCulture));
            await deviceClient.SendEventAsync(ioTHubmessage);
         }
      }
      catch (Exception ex)
      {
         if (DeviceClients.TryRemove(registrationID, out deviceClient))
         {
            log.LogWarning($"{messagePrefix} TryRemove SendEventAsync failed");
         }
         log.LogError(ex, $"{messagePrefix} SendEventAsync failed");
         throw;
      }

      log.LogInformation($"{messagePrefix} Uplink message device processing completed");
   }
}
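The DeviceRegistration method isn't included in the listing above. A hedged sketch of what it plausibly does, following the documented DPS symmetric key attestation flow (Microsoft.Azure.Devices.Provisioning.Client and Microsoft.Azure.Devices.Client NuGet packages); the key derivation, AMQP transport choice, and the TryUpdate of the null placeholder are my assumptions, and the method sits in the same class so it can see DeviceClients:
static async Task DeviceRegistration(ILogger log, string messagePrefix, string globalDeviceEndpoint, string idScope, string enrollmentGroupSymmetricKey, string registrationId)
{
   // Derive the per-device key from the group enrollment key (documented DPS approach)
   string deviceKey;
   using (var hmac = new HMACSHA256(Convert.FromBase64String(enrollmentGroupSymmetricKey)))
   {
      deviceKey = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(registrationId)));
   }

   using (var securityProvider = new SecurityProviderSymmetricKey(registrationId, deviceKey, null))
   using (var transport = new ProvisioningTransportHandlerAmqp(TransportFallbackType.TcpOnly))
   {
      ProvisioningDeviceClient provisioningClient = ProvisioningDeviceClient.Create(globalDeviceEndpoint, idScope, securityProvider, transport);

      DeviceRegistrationResult result = await provisioningClient.RegisterAsync();
      if (result.Status != ProvisioningRegistrationStatusType.Assigned)
      {
         log.LogError($"{messagePrefix} Device provisioning failed Status:{result.Status}");
         throw new ApplicationException($"{messagePrefix} Device provisioning failed Status:{result.Status}");
      }

      IAuthenticationMethod authentication = new DeviceAuthenticationWithRegistrySymmetricKey(result.DeviceId, securityProvider.GetPrimaryKey());
      DeviceClient deviceClient = DeviceClient.Create(result.AssignedHub, authentication, TransportType.Amqp);

      // Replace the null placeholder so the polling loop in Run can proceed
      DeviceClients.TryUpdate(registrationId, deviceClient, null);
   }
}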
There is also support for using a specific GroupEnrollment based on the application_id in the uplink message payload.
In addition to the appsettings.json there is configuration for Application Insights, the uplink message queue name, and the Azure Storage connection strings. The “Environment” setting is important as it specifies which appsettings.json file should be used when code is being debugged etc.
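A representative appsettings.json with the shape the configuration loading code expects; all values are placeholders and the application ID in the mapping is hypothetical:
{
  "AzureStorageConnectionString": "...",
  "UplinkQueueName": "ttnuplinkmessages",
  "DeviceProvisioningService": {
    "GlobalDeviceEndpoint": "global.azure-devices-provisioning.net",
    "ScopeID": "0ne...",
    "EnrollmentGroupSymmetricKeyDefault": "...",
    "DeviceProvisioningPollingDelay": 500,
    "ApplicationEnrollmentGroupMapping": {
      "seeeduinolorawan": "..."
    }
  }
}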
The deployed solution consists of Azure IoT Hub and DPS instances, plus two Azure Functions: one puts the messages from TTN into a queue, the other processes them. The Azure Functions are hosted in an Azure AppService plan.
Azure solution deployment
An Azure Storage account is used for the queue and Azure Function synchronisation information and Azure Application Insights is used to monitor the solution.
There was lots of code in nested classes for deserialising The Things Network (TTN) JSON uplink messages in my WebAPI project. It looked a bit fragile, and if the process failed uplink messages could be lost.
My first attempt at an Azure HTTP Trigger Function to handle an uplink message didn’t work. I had decorated the HTTP Trigger method with an Azure Storage Queue as the destination for the output.
public static class UplinkProcessor
{
   [FunctionName("UplinkProcessor")]
   [return: Queue("%UplinkQueueName%", Connection = "AzureStorageConnectionString")]
   public static async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest request, ILogger log)
   {
      string payload;

      log.LogInformation("C# HTTP trigger function processed a request.");

      using (StreamReader streamReader = new StreamReader(request.Body))
      {
         payload = await streamReader.ReadToEndAsync();
      }

      return new OkObjectResult(payload);
   }
}
There were a couple of other versions which failed with encoding issues.
Invalid uplink event JSON
public static class UplinkProcessor
{
   [FunctionName("UplinkProcessor")]
   [return: Queue("%UplinkQueueName%", Connection = "AzureStorageConnectionString")]
   public static async Task<string> Run([HttpTrigger("post", Route = null)] HttpRequest request, ILogger log)
   {
      string payload;

      log.LogInformation("C# HTTP trigger function processed a request.");

      using (StreamReader streamReader = new StreamReader(request.Body))
      {
         payload = await streamReader.ReadToEndAsync();
      }

      return payload;
   }
}
I finally settled on returning a string, which with the benefit of hindsight was obvious.
Valid JSON message
By storing the raw uplink event JSON from TTN the application can recover if messages can’t be deserialised (e.g. the message format has changed or there are generated class issues). The queue processor won’t be able to process those uplink event messages, so they will end up in the poison message queue after being retried a few times.
In the Azure management portal I generated a method scope API key.
Azure HTTP function API key management
I then added an x-functions-key header in the TTN application integration configuration; it worked on the second attempt, the first failing due to a copy-and-paste mistake.
The Things Network Application integration
To confirm my API key setup was correct I changed the header name and my requests started to fail with a 401 Unauthorised error.
After some experimentation it took less than two dozen lines of C# to create a secure endpoint to receive uplink messages and put them in an Azure Storage queue.
While testing the processing of queued The Things Network (TTN) uplink messages I had noticed that some of the Azure Application Insights events from my Log4Net setup were missing. I could see the MessagesProcessed counter was correct but there weren’t enough events.
I assume the missing events were because I wasn’t “flushing“ at the end of the Run method. There was also a lot of “plumbing” code (including loading configuration files) to setup Log4Net.
Application Insights API in Application Insights Event viewer
I assume there were no missing events because the using statement was “flushing” every time the Run method completed. There was still a bit of “plumbing” code which it would be good to get rid of.
This implementation had even less code and all the messages were visible in the Azure Application Insights event viewer.
All the Azure functions for logging
While building the Proof of Concept (PoC) implementations I added a configurable “runtag” so I could search for the messages relating to a session in the Azure Application Insights event viewer. The queue name and storage account were “automagically” loaded by the runtime, which also reduced the amount of code.
At this point I had minimised the amount and complexity of the code required to process messages in the ttnuplinkmessages queue. Reducing the amount of “startup” required should make my QueueTrigger Azure Function faster, but there was still a lot of boilerplate code for deserialising the body of the message, which added complexity.
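A minimal sketch of that pared-down shape, assuming the runtime-injected ILogger replaces the Log4Net plumbing and the binding resolves the queue name and storage account from configuration:
public static class UplinkMessageProcessor
{
   [FunctionName("UplinkMessageProcessor")]
   public static void Run(
      [QueueTrigger("%UplinkQueueName%", Connection = "AzureStorageConnectionString")] string myQueueItem,
      ILogger log)
   {
      // The runtime handles retries, poison queue moves, and flushing this logger
      log.LogInformation($"Uplink message:{myQueueItem}");
   }
}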
At this point I realised I had a lot of code across multiple projects which had helped me breakdown the problem into manageable chunks but didn’t add a lot of value.
For my HTTP Integration I need to reliably forward messages to an Azure IoT Hub or Azure IoT Central. This solution needs to be robust and not lose any messages even when portions of the system are unavailable because of failures or sudden spikes in inbound traffic.
[Route("[controller]")]
[ApiController]
public class Queued : ControllerBase
{
private readonly string storageConnectionString;
private readonly string queueName;
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public Queued( IConfiguration configuration)
{
this.storageConnectionString = configuration.GetSection("AzureStorageConnectionString").Value;
this.queueName = configuration.GetSection("UplinkQueueName").Value;
}
public string Index()
{
return "Queued move along nothing to see";
}
[HttpPost]
public async Task<IActionResult> Post([FromBody] PayloadV5 payload)
{
string payloadFieldsUnpacked = string.Empty;
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("QueuedController validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
try
{
QueueClient queueClient = new QueueClient(storageConnectionString, queueName);
await queueClient.CreateIfNotExistsAsync();
await queueClient.SendMessageAsync(Convert.ToBase64String(JsonSerializer.SerializeToUtf8Bytes(payload)));
}
catch( Exception ex)
{
log.Error("Unable to open/create queue or send message", ex);
return this.Problem("Unable to open queue (creating if it doesn't exist) or send message", statusCode:500, title:"Uplink payload not sent" );
}
return this.Ok();
}
}
An Azure Function with a Queue Trigger processes the messages and, for this test, pauses for 2 seconds (simulating a call to the Device Provisioning Service (DPS)). It keeps track of the number of concurrent processing threads, and of when the first message for each device was received since the program started.
public static class UplinkMessageProcessor
{
   static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
   static ConcurrentDictionary<string, PayloadV5> DevicesSeen = new ConcurrentDictionary<string, PayloadV5>();
   static int ConcurrentThreadCount = 0;

   [FunctionName("UplinkMessageProcessor")]
   public static void Run([QueueTrigger("%UplinkQueueName%", Connection = "AzureStorageConnectionString")] string myQueueItem, Microsoft.Azure.WebJobs.ExecutionContext executionContext)
   {
      Interlocked.Increment(ref ConcurrentThreadCount);

      var logRepository = LogManager.GetRepository(Assembly.GetEntryAssembly());
      XmlConfigurator.Configure(logRepository, new FileInfo(Path.Combine(executionContext.FunctionAppDirectory, "log4net.config")));

      log.Info($"Uplink message function triggered: {myQueueItem}");

      PayloadV5 payloadMessage = (PayloadV5)JsonSerializer.Deserialize(myQueueItem, typeof(PayloadV5));
      PayloadV5 payload = (PayloadV5)DevicesSeen.GetOrAdd(payloadMessage.dev_id, payloadMessage);

      log.Info($"Uplink message DevEui:{payload.dev_id} Threads:{ConcurrentThreadCount} First:{payload.metadata.time} Current:{payloadMessage.metadata.time} PayloadRaw:{payload.payload_raw}");

      Thread.Sleep(2000);

      Interlocked.Decrement(ref ConcurrentThreadCount);
   }
}
To explore how this processing worked I sent 1000 uplink messages from my Seeeduino LoRaWAN devices which were buffered in a queue.
Azure Storage Explorer 1000 queued messages / Application Insights 1000 events
I processed 1000’s of messages with the Azure Function but every so often 10-20% of the logging messages wouldn’t turn up in the logs. I’m using Log4Net and I think it is most probably caused by not flushing the messages before the function finishes
For development and testing, being able to provision an individual device is really useful, though for Azure IoT Central it is not easy (especially with the deprecation of dps-keygen). With an Azure IoT Hub, device connection strings are available in the portal, which is convenient but not terribly scalable.
Initially the enrollment group had no registration records so I ran my command-line application to generate group enrollment keys for one of my devices.
Device registration before running my command line application
Then I ran the command-line application with my scopeID, registrationID (LoRaWAN deviceEUI) and the device group enrollment key I had generated in the previous step.
Registering a device and sending a message to my Azure IoT Hub
After running the command line application the device was visible in the enrollment group registration records.
Device registration after running my command line application
Provisioning a device with an individual enrollment has a different workflow. I had to run my command-line application with the RegistrationID, ScopeID, and one of the symmetric keys from the DPS individual enrollment device configuration.
DPS Individual enrollment configuration
A major downside to an individual enrollment is that either the primary or the secondary symmetric key for the device has to be deployed on the device, which could be problematic if the device has no secure storage.
With a group enrollment only the registration ID and the derived symmetric key have to be deployed on the device which is more secure.
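The derived symmetric key is an HMAC-SHA256 of the registration ID, keyed with the group enrollment key; a minimal sketch of the documented calculation:
using System;
using System.Security.Cryptography;
using System.Text;

// Derive a device-specific key from the group enrollment key and registration ID
static string ComputeDerivedSymmetricKey(string enrollmentGroupKey, string registrationId)
{
   using (var hmac = new HMACSHA256(Convert.FromBase64String(enrollmentGroupKey)))
   {
      return Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(registrationId)));
   }
}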
Registering a device and sending a message to my Azure IoT Hub
In Azure IoT Explorer I could see messages from both my group and individually enrolled devices arriving at my Azure IoT Hub.
After some initial issues I found DPS was quite reliable and surprisingly easy to configure. I did find the DPS ProvisioningDeviceClient.RegisterAsync method sometimes took several seconds to execute which may have some ramifications when my application is doing this on demand.
To connect to an Azure IoT Hub I copied the connection string from the portal.
Azure IoT Hub connection string components
Retrieving a connection string for a device connected to Azure IoT Central (without using the Device Provisioning Service (DPS)) is a bit more involved. There is a deprecated command-line application, dps-keygen, which calls the DPS with a device ID, device SAS key and scope ID and returns a connection string.
Azure IoT Central Device Connection Information / Azure DPS-Keygen command-line
Using Azure IoT Explorer I could see reformatted JSON messages from my client application.
Azure IoT Explorer displaying message payload
These two approaches are fine for testing but wouldn’t scale well, and would be painful to use if there were 10s, let alone 100s or 1000s, of devices.
Unpacking the payload_fields property was causing me some issues. I tried many different approaches but they all failed.
public class PayloadV4
{
   public string app_id { get; set; }
   public string dev_id { get; set; }
   public string hardware_serial { get; set; }
   public int port { get; set; }
   public int counter { get; set; }
   public bool is_retry { get; set; }
   public string payload_raw { get; set; }
   //public JsonObject payload_fields { get; set; }
   //public JObject payload_fields { get; set; }
   //public JToken payload_fields { get; set; }
   //public JContainer payload_fields { get; set; }
   //public dynamic payload_fields { get; set; }
   public Object payload_fields { get; set; }
   public MetadataV4 metadata { get; set; }
   public string downlink_url { get; set; }
}
I tried using the excellent JsonSubTypes library to build a polymorphic converter, which failed.
...
public class PolymorphicJsonConverter : JsonConverter
{
   public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
   {
      JObject item = JObject.Load(reader);

      var type = item["type"].Value<string>();
      if (type == "PayloadFieldDigitalInput")
      {
         return item.ToObject<PayloadFieldDigitalInput>();
      }
      else if (type == "PayloadFieldDigitalOutput")
      {
         return item.ToObject<PayloadFieldDigitalOutput>();
      }
      else if (type == "PayloadFieldAnalogInput")
      {
         return item.ToObject<PayloadFieldAnalogInput>();
      }
      else if (type == "PayloadFieldAnalogOutput")
      {
         return item.ToObject<PayloadFieldAnalogOutput>();
      }
      else
      {
         return null;
      }
   }
   ...
}
It was about this point I figured that I was down a very deep rabbit hole and I should just embrace my “stupid”.
I realised I shouldn’t unpack the payload as the number of generated classes required and the complexity of other approaches was going to rapidly get out of hand. Using an Object and recursively traversing its contents with System.Text.Json looked like a viable approach.
public class GatewayV4
{
   public string gtw_id { get; set; }
   public ulong timestamp { get; set; }
   public DateTime time { get; set; }
   public int channel { get; set; }
   public int rssi { get; set; }
   public double snr { get; set; }
   public int rf_chain { get; set; }
   public double latitude { get; set; }
   public double longitude { get; set; }
   public int altitude { get; set; }
}

public class MetadataV4
{
   public string time { get; set; }
   public double frequency { get; set; }
   public string modulation { get; set; }
   public string data_rate { get; set; }
   public string coding_rate { get; set; }
   public List<GatewayV4> gateways { get; set; }
}

public class PayloadV4
{
   public string app_id { get; set; }
   public string dev_id { get; set; }
   public string hardware_serial { get; set; }
   public int port { get; set; }
   public int counter { get; set; }
   public bool is_retry { get; set; }
   public string payload_raw { get; set; }
   // finally settled on an Object
   public Object payload_fields { get; set; }
   public MetadataV4 metadata { get; set; }
   public string downlink_url { get; set; }
}
So, I added yet another controller to my application to deserialise the body of the POST from the TTN Application Integration.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV4Fields : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV4 payload)
{
string payloadFieldsUnpacked = string.Empty;
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV4Fields validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
JsonElement jsonElement = (JsonElement)payload.payload_fields;
foreach (var property in jsonElement.EnumerateObject())
{
// Special handling for nested properties
if (property.Name.StartsWith("gps_") || property.Name.StartsWith("accelerometer_") || property.Name.StartsWith("gyrometer_"))
{
payloadFieldsUnpacked += $"Property Name:{property.Name}\r\n";
JsonElement gpsElement = (JsonElement)property.Value;
foreach (var gpsProperty in gpsElement.EnumerateObject())
{
payloadFieldsUnpacked += $" Property Name:{gpsProperty.Name} Property Value:{gpsProperty.Value}\r\n";
}
}
else
{
payloadFieldsUnpacked += $"Property Name:{property.Name} Property Value:{property.Value}\r\n";
}
}
log.Info(payloadFieldsUnpacked);
return this.Ok();
}
}
In the body of the events in Azure Application Insights I could see the messages, and the format looked fine for simple payloads.
In part 1 & part 2 I had been ignoring the payload_fields property of the Payload class. The documentation indicates that the payload_fields property is populated when an uplink message is decoded.
I used JSON2CSharp to generate C# classes which would deserialise the above uplink message.
// Third version of classes for unpacking HTTP payload
public class Gps1V3
{
   public int altitude { get; set; }
   public double latitude { get; set; }
   public double longitude { get; set; }
}

public class PayloadFieldsV3
{
   public double analog_in_1 { get; set; }
   public int digital_in_1 { get; set; }
   public Gps1V3 gps_1 { get; set; }
   public int luminosity_1 { get; set; }
   public double temperature_1 { get; set; }
}

public class GatewayV3
{
   public string gtw_id { get; set; }
   public ulong timestamp { get; set; }
   public DateTime time { get; set; }
   public int channel { get; set; }
   public int rssi { get; set; }
   public double snr { get; set; }
   public int rf_chain { get; set; }
   public double latitude { get; set; }
   public double longitude { get; set; }
   public int altitude { get; set; }
}

public class MetadataV3
{
   public string time { get; set; }
   public double frequency { get; set; }
   public string modulation { get; set; }
   public string data_rate { get; set; }
   public string coding_rate { get; set; }
   public List<GatewayV3> gateways { get; set; }
}

public class PayloadV3
{
   public string app_id { get; set; }
   public string dev_id { get; set; }
   public string hardware_serial { get; set; }
   public int port { get; set; }
   public int counter { get; set; }
   public bool is_retry { get; set; }
   public string payload_raw { get; set; }
   public PayloadFieldsV3 payload_fields { get; set; }
   public MetadataV3 metadata { get; set; }
   public string downlink_url { get; set; }
}
I added a new controller to my application which used the generated classes to deserialise the body of the POST from the TTN Application Integration.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV3Fields : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV3 payload)
{
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV3Fields validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
log.Info($"DevEUI:{payload.hardware_serial} Payload Base64:{payload.payload_raw} analog_in_1:{payload.payload_fields.analog_in_1} digital_in_1:{payload.payload_fields.digital_in_1} gps_1:{payload.payload_fields.gps_1.latitude},{payload.payload_fields.gps_1.longitude},{payload.payload_fields.gps_1.altitude} luminosity_1:{payload.payload_fields.luminosity_1} temperature_1:{payload.payload_fields.temperature_1}");
return this.Ok();
}
}
I then updated the TTN application integration to send messages to my new endpoint. In the body of the Application Insights events I could see the devEUI, port, and the payload fields had been extracted from the message.
This arrangement was pretty nasty and sort of worked, but in the “real world” it would not have been viable. I would need to generate lots of custom classes for each application, taking into account the channel numbers (e.g. analog_in_1, analog_in_2) and datatypes used.
I also explored which datatypes were supported by the TTN decoder; after some experimentation (Aug 2019) it looks like only the LPPV1 ones are:
AnalogInput
AnalogOutput
DigitalInput
DigitalOutput
GPS
Accelerometer
Gyrometer
Luminosity
Presence
BarometricPressure
RelativeHumidity
Temperature
What I need is a more flexible way to store and decode the payload_fields property.
I used JSON2CSharp and a payload I downloaded in Part 1 to generate C# classes which would deserialise my minimalist messages.
// First version of classes for unpacking HTTP payload https://json2csharp.com/
public class GatewayV1
{
   public string gtw_id { get; set; }
   public int timestamp { get; set; }
   public DateTime time { get; set; }
   public int channel { get; set; }
   public int rssi { get; set; }
   public double snr { get; set; }
   public int rf_chain { get; set; }
   public double latitude { get; set; }
   public double longitude { get; set; }
   public int altitude { get; set; }
}

public class MetadataV1
{
   public string time { get; set; }
   public double frequency { get; set; }
   public string modulation { get; set; }
   public string data_rate { get; set; }
   public string coding_rate { get; set; }
   public List<GatewayV1> gateways { get; set; }
}

public class PayloadV1
{
   public string app_id { get; set; }
   public string dev_id { get; set; }
   public string hardware_serial { get; set; }
   public int port { get; set; }
   public int counter { get; set; }
   public bool confirmed { get; set; }
   public string payload_raw { get; set; }
   public MetadataV1 metadata { get; set; }
   public string downlink_url { get; set; }
}
I added a new controller to my application which used the generated classes to deserialise the body of the POST from the TTN Application Integration.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV1 : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV1 payload)
{
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV1 validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
log.Info($"DevEUI:{payload.hardware_serial} Payload Base64:{payload.payload_raw}");
return this.Ok();
}
}
I then updated the TTN application integration to send messages to my new endpoint.
TTN Application configuration overview
In the body of the Application Insights events I could see the devEUI, port, and the raw payload had been extracted from the message.
I then added another controller which decoded the Base64 encoded payload_raw.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV2Base64Decoded : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV2 payload)
{
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV2BCDDecoded validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
log.Info($"DevEUI:{payload.hardware_serial} Port:{payload.port} Payload:{ Encoding.UTF8.GetString(Convert.FromBase64String(payload.payload_raw))}");
return this.Ok();
}
}
Then, after a while, the deserialisation started to fail with an HTTP 400 Bad Request. When I ran the same request with Telerik Fiddler on my desktop the raw response was:
HTTP/1.1 400 Bad Request
Transfer-Encoding: chunked
Content-Type: application/problem+json; charset=utf-8
Server: Microsoft-IIS/10.0
Request-Context: appId=cid-v1:f4f72f2e-1144-4578-923f-d3ebdcfb7766
X-Powered-By: ASP.NET
Date: Mon, 31 Aug 2020 09:07:30 GMT
17a
{"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1",
"title":"One or more validation errors occurred.",
"status":400,
"traceId":"00-45033ec030b63d4ebb82b95b67cb8142-9fc52a18be202848-00",
"errors":{
"$.metadata.gateways[0].timestamp":["The JSON value could not be converted to System.Int32.
Path: $.metadata.gateways[0].timestamp | LineNumber: 21 | BytePositionInLine: 35."]}}
0
The failing line in the payload was the gateway timestamp. The value was 2,426,973,100, which is larger than 2,147,483,647, the maximum value that can be stored in a signed 32 bit integer. The JSON2CSharp generator had made a reasonable choice of datatype, but in this case the range was not sufficient.
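A minimal repro sketch (class names are mine) showing the deserialisation failure with an int property and the fix with a ulong, which is what the updated GatewayV2 class below uses:
using System;
using System.Text.Json;

public class GatewayTimestampInt { public int timestamp { get; set; } }
public class GatewayTimestampULong { public ulong timestamp { get; set; } }

public static class TimestampOverflowDemo
{
   public static void Main()
   {
      string json = "{\"timestamp\":2426973100}";

      // 2,426,973,100 > Int32.MaxValue (2,147,483,647) so this throws a JsonException
      try
      {
         JsonSerializer.Deserialize<GatewayTimestampInt>(json);
      }
      catch (JsonException ex)
      {
         Console.WriteLine(ex.Message);
      }

      // As a ulong the value deserialises without error
      GatewayTimestampULong gateway = JsonSerializer.Deserialize<GatewayTimestampULong>(json);
      Console.WriteLine(gateway.timestamp);
   }
}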
public class GatewayV2
{
   public string gtw_id { get; set; }
   public ulong timestamp { get; set; }
   public DateTime time { get; set; }
   public int channel { get; set; }
   public int rssi { get; set; }
   public double snr { get; set; }
   public int rf_chain { get; set; }
   public double latitude { get; set; }
   public double longitude { get; set; }
   public int altitude { get; set; }
}
I checked the TTN code, where the variable was declared as an unsigned 64 bit integer. This issue could occur for other variables, so I need to manually check all the generated classes.