The Things Network HTTP Integration Part 10

Assembling the components

After a series of articles exploring how portions of the solution could be built, it was time to put the pieces together.

I now had working code for receiving The Things Network (TTN) HTTP integration JSON messages with an Azure Function HTTPTrigger (secured with an API key) and putting them into an Azure Storage queue for processing. This code was intentionally kept as small and simple as possible so there was less to go wrong, and the required configuration is minimal.

HTTP Endpoint handler application
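As a refresher, a minimal sketch of what that HTTP endpoint handler looks like is below. The function name and the use of a Queue output binding are assumptions rather than the exact code from the earlier posts, and here the API key is simply the Azure Functions function key (AuthorizationLevel.Function).

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class UplinkMessageWebhook
{
   [FunctionName("UplinkMessageWebhook")]
   [return: Queue("%UplinkQueueName%", Connection = "AzureStorageConnectionString")]
   public static async Task<string> Run(
      [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest request,
      ILogger log)
   {
      // Read the TTN HTTP integration JSON payload; returning it hands it to the
      // Queue output binding which writes it to the uplink message queue
      string payload = await new StreamReader(request.Body).ReadToEndAsync();

      log.LogInformation($"Uplink message webhook called, payload {payload.Length} characters");

      return payload;
   }
}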

In the last couple of posts I had been building an Azure Function with a QueueTrigger to process the uplink messages. The function uses custom bindings so that the CloudQueueMessage can be accessed and the Azure Storage account plus queue name can be loaded from configuration. I’m still using classes generated by JSON2CSharp (with minimal modifications) for deserialising the payloads with JSON.Net.
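For context, a trimmed sketch of those classes is below — only the fields used later in this post are shown, and the property names follow the TTN V2 uplink JSON.

public class PayloadV5
{
   public string app_id { get; set; }
   public string dev_id { get; set; }
   public string hardware_serial { get; set; }
   public int port { get; set; }
   public int counter { get; set; }
   public bool is_retry { get; set; }
   public string payload_raw { get; set; }
   public object payload_fields { get; set; }
   public MetadataV5 metadata { get; set; }
}

public class MetadataV5
{
   public DateTime time { get; set; }
   // Gateway list, frequency, data rate etc. omitted for brevity
}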

The message processor Azure Function uses a ConcurrentDictionary to store DeviceClient objects constructed using the information returned by the Azure Device Provisioning Service (DPS). This is so DPS doesn’t have to be called for the connection details for every message. (When the Azure Function is restarted the dictionary of DeviceClient objects has to be repopulated.) If there is a backlog of messages the message processor can process more than a dozen messages concurrently, so the telemetry events displayed in an application like Azure IoT Central can arrive out of order.

The solution uses a DPS group enrollment with symmetric key attestation so Azure IoT Hub devices can be “automagically” created when a message from a new device is processed. The processing code is multi-threaded and relies on many error conditions being handled by the Azure Function retry mechanism. After a number of failed retries a message is moved to a poison queue. Azure Storage Explorer is a good tool for viewing payloads and moving poison messages back to the processing queue.

public static class UplinkMessageProcessor
{
   static readonly ConcurrentDictionary<string, DeviceClient> DeviceClients = new ConcurrentDictionary<string, DeviceClient>();

   [FunctionName("UplinkMessageProcessor")]
   public static async Task Run(
      [QueueTrigger("%UplinkQueueName%", Connection = "AzureStorageConnectionString")]
      CloudQueueMessage cloudQueueMessage, // Used to get CloudQueueMessage.Id for logging
      Microsoft.Azure.WebJobs.ExecutionContext context,
      ILogger log)
   {
      PayloadV5 payloadObject;
      DeviceClient deviceClient = null;
      DeviceProvisioningServiceSettings deviceProvisioningServiceConfig;

      string environmentName = Environment.GetEnvironmentVariable("ENVIRONMENT");

      // Load configuration for DPS. Refactor approach and store securely...
      var configuration = new ConfigurationBuilder()
         .SetBasePath(context.FunctionAppDirectory)
         .AddJsonFile($"appsettings.json")
         .AddJsonFile($"appsettings.{environmentName}.json")
         .AddEnvironmentVariables()
         .Build();

      // Bind the Device Provisioning Service configuration section
      try
      {
         deviceProvisioningServiceConfig = configuration.GetSection("DeviceProvisioningService").Get<DeviceProvisioningServiceSettings>();
      }
      catch (Exception ex)
      {
         log.LogError(ex, $"Configuration loading failed");
         throw;
      }

      // Deserialise uplink message from Azure storage queue
      try
      {
         payloadObject = JsonConvert.DeserializeObject<PayloadV5>(cloudQueueMessage.AsString);
      }
      catch (Exception ex)
      {
         log.LogError(ex, $"MessageID:{cloudQueueMessage.Id} uplink message deserialisation failed");
         throw;
      }

      // Extract the device ID as it's used in lots of places
      string registrationID = payloadObject.hardware_serial;

      // Construct the prefix used in all the logging
      string messagePrefix = $"MessageID: {cloudQueueMessage.Id} DeviceID:{registrationID} Counter:{payloadObject.counter} Application ID:{payloadObject.app_id}";
      log.LogInformation($"{messagePrefix} Uplink message device processing start");

      // See if the device has already been provisioned
      if (DeviceClients.TryAdd(registrationID, deviceClient))
      {
         log.LogInformation($"{messagePrefix} Device provisioning start");

         string enrollmentGroupSymmetricKey = deviceProvisioningServiceConfig.EnrollmentGroupSymmetricKeyDefault;

         // Use a specific enrollment group key if there is a custom mapping for the TTN application ID
         if (deviceProvisioningServiceConfig.ApplicationEnrollmentGroupMapping != null)
         {
            enrollmentGroupSymmetricKey = deviceProvisioningServiceConfig.ApplicationEnrollmentGroupMapping.GetValueOrDefault(payloadObject.app_id, deviceProvisioningServiceConfig.EnrollmentGroupSymmetricKeyDefault);
         }

         // Do DPS magic first time device seen
         await DeviceRegistration(log, messagePrefix, deviceProvisioningServiceConfig.GlobalDeviceEndpoint, deviceProvisioningServiceConfig.ScopeID, enrollmentGroupSymmetricKey, registrationID);
      }

      // Wait for the Device Provisioning Service to complete on this or other thread
      log.LogInformation($"{messagePrefix} Device provisioning polling start");
      if (!DeviceClients.TryGetValue(registrationID, out deviceClient))
      {
         log.LogError($"{messagePrefix} Device provisioning polling TryGet before while failed");

         throw new ApplicationException($"{messagePrefix} Device provisioning polling TryGet before while failed");
      }

      while (deviceClient == null)
      {
         log.LogInformation($"{messagePrefix} provisioning polling delay");
         await Task.Delay(deviceProvisioningServiceConfig.DeviceProvisioningPollingDelay);

         if (!DeviceClients.TryGetValue(registrationID, out deviceClient))
         {
            log.LogError($"{messagePrefix} Device provisioning polling TryGet while loop failed");

            throw new ApplicationException($"{messagePrefix} Device provisioning polling TryGet while loop failed");
         }
      }

      // Assemble the JSON payload to send to Azure IoT Hub/Central.
      log.LogInformation($"{messagePrefix} Payload assembly start");
      JObject telemetryEvent = new JObject();
      try
      {
         JObject payloadFields = (JObject)payloadObject.payload_fields;
         telemetryEvent.Add("HardwareSerial", payloadObject.hardware_serial);
         telemetryEvent.Add("Retry", payloadObject.is_retry);
         telemetryEvent.Add("Counter", payloadObject.counter);
         telemetryEvent.Add("DeviceID", payloadObject.dev_id);
         telemetryEvent.Add("ApplicationID", payloadObject.app_id);
         telemetryEvent.Add("Port", payloadObject.port);
         telemetryEvent.Add("PayloadRaw", payloadObject.payload_raw);
         telemetryEvent.Add("ReceivedAt", payloadObject.metadata.time);

         // If the payload has been unpacked in TTN backend add fields to telemetry event payload
         if (payloadFields != null)
         {
            foreach (JProperty child in payloadFields.Children())
            {
               EnumerateChildren(telemetryEvent, child);
            }
         }
      }
      catch (Exception ex)
      {
         if (DeviceClients.TryRemove(registrationID, out deviceClient))
         {
            log.LogWarning($"{messagePrefix} TryRemove payload assembly failed");
         }

         log.LogError(ex, $"{messagePrefix} Payload assembly failed");
         throw;
      }

      // Send the message to Azure IoT Hub/Azure IoT Central
      log.LogInformation($"{messagePrefix} Payload SendEventAsync start");
      try
      {
         using (Message ioTHubmessage = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(telemetryEvent))))
         {
            // Ensure the displayed time is the acquired time rather than the uploaded time, especially important for messages that end up in the poison queue
            ioTHubmessage.Properties.Add("iothub-creation-time-utc", payloadObject.metadata.time.ToString("s", CultureInfo.InvariantCulture));
            await deviceClient.SendEventAsync(ioTHubmessage);
         }
      }
      catch (Exception ex)
      {
         if (DeviceClients.TryRemove(registrationID, out deviceClient))
         {
            log.LogWarning($"{messagePrefix} TryRemove SendEventAsync failed");
         }

         log.LogError(ex, $"{messagePrefix} SendEventAsync failed");
         throw;
      }

   log.LogInformation($"{messagePrefix} Uplink message device processing completed");
   }
}
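The DeviceRegistration method called above isn’t shown; a sketch of the approach (not the exact code) is below. It lives in the same UplinkMessageProcessor class, derives a per-device key from the group enrollment key, registers the device with DPS, then swaps the resulting DeviceClient into the ConcurrentDictionary placeholder added by TryAdd. It assumes the Microsoft.Azure.Devices.Client, Microsoft.Azure.Devices.Provisioning.Client and Microsoft.Azure.Devices.Shared packages plus System.Security.Cryptography.

static async Task DeviceRegistration(ILogger log, string messagePrefix, string globalDeviceEndpoint, string idScope, string enrollmentGroupSymmetricKey, string registrationId)
{
   try
   {
      // Derive the device-specific key from the group enrollment key (HMAC-SHA256 of the registration ID)
      string deviceKey;
      using (var hmac = new HMACSHA256(Convert.FromBase64String(enrollmentGroupSymmetricKey)))
      {
         deviceKey = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(registrationId)));
      }

      using (var securityProvider = new SecurityProviderSymmetricKey(registrationId, deviceKey, null))
      using (var transport = new ProvisioningTransportHandlerAmqp(TransportFallbackType.TcpOnly))
      {
         ProvisioningDeviceClient provisioningClient = ProvisioningDeviceClient.Create(globalDeviceEndpoint, idScope, securityProvider, transport);

         DeviceRegistrationResult result = await provisioningClient.RegisterAsync();
         if (result.Status != ProvisioningRegistrationStatusType.Assigned)
         {
            throw new ApplicationException($"{messagePrefix} DPS registration status:{result.Status}");
         }

         IAuthenticationMethod authentication = new DeviceAuthenticationWithRegistrySymmetricKey(result.DeviceId, securityProvider.GetPrimaryKey());

         DeviceClient deviceClient = DeviceClient.Create(result.AssignedHub, authentication, TransportType.Amqp);
         await deviceClient.OpenAsync();

         // Replace the null placeholder added by TryAdd so waiting threads pick up the client
         DeviceClients.TryUpdate(registrationId, deviceClient, null);
      }
   }
   catch (Exception ex)
   {
      // Remove the placeholder so the message is retried and provisioning is attempted again
      DeviceClients.TryRemove(registrationId, out _);

      log.LogError(ex, $"{messagePrefix} Device provisioning failed");
      throw;
   }
}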

There is also support for using a specific group enrollment key based on the app_id in the uplink message payload.

"DeviceProvisioningService": {
      "GlobalDeviceEndpoint": "global.azure-devices-provisioning.net",
      "ScopeID": "",
      "EnrollmentGroupSymmetricKeyDefault": "TopSecretKey",
      "DeviceProvisioningPollingDelay": 500,
      "ApplicationEnrollmentGroupMapping": {
         "Application1": "TopSecretKey1",
         "Application2": "TopSecretKey2"
      }
   }
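That configuration section binds to a settings class along these lines (a sketch; the real class may have additional members):

public class DeviceProvisioningServiceSettings
{
   public string GlobalDeviceEndpoint { get; set; }
   public string ScopeID { get; set; }
   public string EnrollmentGroupSymmetricKeyDefault { get; set; }
   public int DeviceProvisioningPollingDelay { get; set; }
   public Dictionary<string, string> ApplicationEnrollmentGroupMapping { get; set; }
}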

In addition to the appsettings.json there is configuration for Application Insights, the uplink message queue name, and the Azure Storage connection strings. The “Environment” setting is important as it specifies which appsettings.json file should be used when the code is being debugged etc.

TTN Integration uplink message processor configuration
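For local development those settings end up in a local.settings.json along these lines — the queue name and connection string values are placeholders, and only the keys referenced by the bindings above are significant.

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "APPINSIGHTS_INSTRUMENTATIONKEY": "",
    "ENVIRONMENT": "Development",
    "UplinkQueueName": "uplinkmessages",
    "AzureStorageConnectionString": "UseDevelopmentStorage=true"
  }
}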

The deployed solution consists of Azure IoT Hub and DPS instances plus two Azure Functions: one for putting the messages from TTN into a queue, the other for processing them. The Azure Functions are hosted in an Azure App Service plan.

Azure solution deployment

An Azure Storage account is used for the queue and the Azure Function synchronisation information, and Azure Application Insights is used to monitor the solution.

NLog and Application Insights Revisited

Just a few small changes to my NLog sample that logs to Azure Application Insights.

I modified the application so I could provide the InstrumentationKey via the command line or the ApplicationInsights.config file. (I have a minimalist config for this sample.)

namespace devMobile.Azure.ApplicationInsightsNLogClient
{
   using System;

   using Microsoft.ApplicationInsights.Extensibility;
   using NLog;

   class Program
   {
      private static Logger log = LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType.ToString());

      static void Main(string[] args)
      {
         if ((args.Length != 0) && (args.Length != 1))
         {
            Console.WriteLine("Usage ApplicationInsightsNLogClient");
            Console.WriteLine("      ApplicationInsightsNLogClient <instrumentationKey>");
            return;
         }

         if (args.Length == 1)
         {
            TelemetryConfiguration.Active.InstrumentationKey = args[0];
         }

         log.Trace("This is an nLog Trace message");
         log.Debug("This is an nLog Debug message");
         log.Info("This is an nLog Info message");
         log.Warn("This is an nLog Warning message");
         log.Error("This is an nLog Error message");
         log.Fatal("This is an nLog Fatal message");

         TelemetryConfiguration.Active.TelemetryChannel.Flush();

			Console.WriteLine("Press <enter> to exit>");
			Console.ReadLine();
		}
	}
}
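For reference, the NLog side of the wiring is just the Application Insights target; a minimal nlog.config (assuming the Microsoft.ApplicationInsights.NLogTarget package) looks something like this:

<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
   <extensions>
      <add assembly="Microsoft.ApplicationInsights.NLogTarget" />
   </extensions>
   <targets>
      <target xsi:type="ApplicationInsightsTarget" name="applicationInsights" />
   </targets>
   <rules>
      <logger name="*" minlevel="Trace" writeTo="applicationInsights" />
   </rules>
</nlog>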

Code for my sample console application is here.

Azure Function Log4Net configuration Revisited

In a previous post I showed how I configured Apache Log4net and Azure Application Insights to work with an Azure Function; this is the code updated to .Net Core V3.1.

With the different versions of the libraries involved (early April 2020) this was what I found worked for me, so YMMV.

Initially the logging to Application Insights wasn’t working even though it was configured in the ApplicationInsights.config file. After some experimentation I found that setting the APPINSIGHTS_INSTRUMENTATIONKEY environment variable was the only way I could get it to work.
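For local debugging that meant adding the key to the Values section of local.settings.json (value redacted):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "APPINSIGHTS_INSTRUMENTATIONKEY": "<your instrumentation key>"
  }
}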

namespace ApplicationInsightsAzureFunctionLog4NetClient
{
	using System;
	using System.IO;
	using System.Reflection;
	using log4net;
	using log4net.Config;
	using Microsoft.ApplicationInsights;
	using Microsoft.ApplicationInsights.Extensibility;
	using Microsoft.Azure.WebJobs;

	public static class ApplicationInsightsTimer
	{
		[FunctionName("ApplicationInsightsTimerLog4Net")]
		public static void Run([TimerTrigger("0 */1 * * * *")]TimerInfo myTimer, ExecutionContext executionContext)
		{
         ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

         using (TelemetryConfiguration telemetryConfiguration = TelemetryConfiguration.CreateDefault())
         {
            TelemetryClient telemetryClient = new TelemetryClient(telemetryConfiguration);
 
            var logRepository = LogManager.GetRepository(Assembly.GetEntryAssembly());
            XmlConfigurator.Configure(logRepository, new FileInfo(Path.Combine(executionContext.FunctionAppDirectory, "log4net.config")));

            log.Debug("This is a Log4Net Debug message");
            log.Info("This is a Log4Net Info message");
            log.Warn("This is a Log4Net Warning message");
            log.Error("This is a Log4Net Error message");
            log.Fatal("This is a Log4Net Fatal message");

            telemetryClient.Flush();
         }
      }
   }
}

I did notice that there were a number of exceptions which warrant further investigation.

'func.exe' (CoreCLR: clrhost): Loaded 'C:\Users\BrynLewis\source\repos\AzureApplicationInsightsClients\ApplicationInsightsAzureFunctionLog4NetClient\bin\Debug\netcoreapp3.1\bin\log4net.dll'. 
Exception thrown: 'System.IO.FileNotFoundException' in System.Private.CoreLib.dll
Exception thrown: 'System.IO.FileNotFoundException' in System.Private.CoreLib.dll
Exception thrown: 'System.IO.FileNotFoundException' in System.Private.CoreLib.dll
Exception thrown: 'System.IO.FileNotFoundException' in System.Private.CoreLib.dll
Exception thrown: 'System.IO.FileNotFoundException' in System.Private.CoreLib.dll
Exception thrown: 'System.IO.FileNotFoundException' in System.Private.CoreLib.dll
'func.exe' (CoreCLR: clrhost): Loaded 'C:\Users\BrynLewis\AppData\Local\AzureFunctionsTools\Releases\2.47.1\cli_x64\System.Xml.XmlDocument.dll'. 
'func.exe' (CoreCLR: clrhost): Loaded 'C:\Users\BrynLewis\source\repos\AzureApplicationInsightsClients\ApplicationInsightsAzureFunctionLog4NetClient\bin\Debug\netcoreapp3.1\bin\Microsoft.ApplicationInsights.Log4NetAppender.dll'. 
'func.exe' (CoreCLR: clrhost): Loaded 'C:\Users\BrynLewis\AppData\Local\AzureFunctionsTools\Releases\2.47.1\cli_x64\System.Reflection.TypeExtensions.dll'. 
Application Insights Telemetry: {"name":"Microsoft.ApplicationInsights.64b1950b90bb46aaa36c26f5dce0cad3.Message","time":"2020-04-09T09:22:33.2274370Z","iKey":"1234567890123-1234-12345-123456789012","tags":{"ai.cloud.roleInstance":"DESKTOP-C9IPNQ1","ai.operation.id":"bc6c4d10cebd954c9d815ad06add2582","ai.operation.parentId":"|bc6c4d10cebd954c9d815ad06add2582.d8fa83b88b175348.","ai.operation.name":"ApplicationInsightsTimerLog4Net","ai.location.ip":"0.0.0.0","ai.internal.sdkVersion":"log4net:2.13.1-12554","ai.internal.nodeName":"DESKTOP-C9IPNQ1"},"data":{"baseType":"MessageData","baseData":{"ver":2,"message":"This is a Log4Net Info message","severityLevel":"Information","properties":{"Domain":"NOT AVAILABLE","InvocationId":"91063ef9-70d0-4318-a1e0-e49ade07c51b","ThreadName":"14","ClassName":"?","LogLevel":"Information","ProcessId":"15824","Category":"Function.ApplicationInsightsTimerLog4Net","MethodName":"?","Identity":"NOT AVAILABLE","FileName":"?","LoggerName":"ApplicationInsightsAzureFunctionLog4NetClient.ApplicationInsightsTimer","LineNumber":"?"}}}}

The latest code for my Azure Function Log4net to Application Insights sample is available here.

Apache Log4net .NET Core and Application Insights

In the previous post I revisited my sample .NET application that used Apache log4net and Azure Application Insights. This post updates the application to .NET Core V3.1.

I had to remove the ability to set the instrumentation key via the command line as I couldn’t get it to work.

I tried initialising the logger after loading the telemetry configuration, passing the InstrumentationKey in as a parameter of the TelemetryConfiguration constructor, etc. and it made no difference.

The only other option that appeared to work was setting the instrumentation key via an environment variable called APPINSIGHTS_INSTRUMENTATIONKEY.

   class Program
   {
      private static ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

      static void Main(string[] args)
      {
         using (TelemetryConfiguration telemetryConfiguration = TelemetryConfiguration.CreateDefault())
         {
            TelemetryClient telemetryClient = new TelemetryClient(telemetryConfiguration);

            var logRepository = LogManager.GetRepository(Assembly.GetEntryAssembly());
            XmlConfigurator.Configure(logRepository, new FileInfo(Path.Combine(Environment.CurrentDirectory, "log4net.config")));

            log.Debug("This is a Log4Net Debug message");
            log.Info("This is a Log4Net Info message");
            log.Warn("This is a Log4Net Warning message");
            log.Error("This is a Log4Net Error message");
            log.Fatal("This is a Log4Net Fatal message");

            telemetryClient.Flush();
         }

         Console.WriteLine("Press <enter> to exit");
         Console.ReadLine();
      }
   }

I updated the Log4Net setup to use the ManagedColoredConsoleAppender, which required a couple of modifications to the Log4Net.config file. (Initially it was failing because I was using the non-US spelling of log4net.Appender.ManagedColoredConsoleAppender.)

 <appender name="ColoredConsoleAppender" type="log4net.Appender.ManagedColoredConsoleAppender">
      <mapping>
         <level value="ERROR" />
         <foreColor value="White" />
         <backColor value="Red" />
      </mapping>
      <mapping>
         <level value="DEBUG" />
         <backColor value="Green" />
      </mapping>
      <layout type="log4net.Layout.PatternLayout">
         <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
      </layout>
   </appender>

I did notice that, after several seconds while waiting for the enter key to be pressed, there were a number of exceptions which warrant further investigation.

devMobile.Azure.ApplicationInsightsLog4NetCoreClient.Program: 2020-04-08 17:14:23,455 [1] FATAL devMobile.Azure.ApplicationInsightsLog4NetCoreClient.Program – This is a Log4Net Fatal message
‘ApplicationInsightsLog4NetCoreClient.exe’ (CoreCLR: clrhost): Loaded ‘C:\Program Files\dotnet\shared\Microsoft.NETCore.App\3.1.3\System.Security.Cryptography.Encoding.dll’.
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Net.Http.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Net.Http.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Net.Http.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
Exception thrown: ‘System.Threading.Tasks.TaskCanceledException’ in System.Private.CoreLib.dll
The program ‘[13920] ApplicationInsightsLog4NetCoreClient.exe’ has exited with code 0 (0x0).
The program ‘[13920] ApplicationInsightsLog4NetCoreClient.exe: Program Trace’ has exited with code 0 (0x0).

A sample project is available here.

“Don’t forget to flush” .Net Core Application Insights

This post updates a previous post, “Don’t forget to flush” Application Insights Revisited, for .Net Core 3.X and shows the small change required by the deprecation of one of the TelemetryClient constructor overloads.

warning CS0618: ‘TelemetryClient.TelemetryClient()’ is obsolete: ‘We do not recommend using TelemetryConfiguration.Active on .NET Core. See https://github.com/microsoft/ApplicationInsights-dotnet/issues/1152 for more details’

   class Program
   {
      static void Main(string[] args)
      {
#if INSTRUMENTATION_KEY_TELEMETRY_CONFIGURATION
         if (args.Length != 1)
         {
            Console.WriteLine("Usage AzureApplicationInsightsClientConsole <instrumentationKey>");
            return;
         }

         TelemetryConfiguration telemetryConfiguration = new TelemetryConfiguration(args[0]);
         TelemetryClient telemetryClient = new TelemetryClient(telemetryConfiguration);
         telemetryClient.TrackTrace("INSTRUMENTATION_KEY_TELEMETRY_CONFIGURATION", SeverityLevel.Information);
#endif
#if INSTRUMENTATION_KEY_APPLICATION_INSIGHTS_CONFIG
         TelemetryConfiguration telemetryConfiguration = TelemetryConfiguration.CreateDefault();
         TelemetryClient telemetryClient = new TelemetryClient(telemetryConfiguration);
         telemetryClient.TrackTrace("INSTRUMENTATION_KEY_APPLICATION_INSIGHTS_CONFIG", SeverityLevel.Information);
#endif
         telemetryClient.Context.User.Id = Environment.UserName;
         telemetryClient.Context.Device.Id = Environment.MachineName;
         telemetryClient.Context.Operation.Name = "Test harness";

         telemetryClient.TrackTrace("This is a .Net Core AI API Verbose message", SeverityLevel.Verbose);
         telemetryClient.TrackTrace("This is a .Net Core AI API Information message", SeverityLevel.Information);
         telemetryClient.TrackTrace("This is a .Net Core AI API Warning message", SeverityLevel.Warning);
         telemetryClient.TrackTrace("This is a .Net Core AI API Error message", SeverityLevel.Error);
         telemetryClient.TrackTrace("This is a .Net Core AI API Critical message", SeverityLevel.Critical);

         telemetryClient.Flush();

         telemetryConfiguration.Dispose(); // In real-world code use a using block or similar to ensure the configuration is cleaned up

         Console.WriteLine("Press <enter> to exit");
         Console.ReadLine();
      }
   }

A sample project is available here.

Apache Log4net and Application Insights Revisited

In a previous post I showed how we configured a client’s application to use Apache log4net and Azure Application Insights.

I modified the code to allow the Instrumentation Key to be provided via a command line parameter or from the ApplicationInsights.config file.

class Program
{
   private static ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

   static void Main(string[] args)
   {
      if (( args.Length != 0) && (args.Length != 1 ))
      {
         Console.WriteLine("Usage AzureApplicationInsightsClientConsole");
         Console.WriteLine("      AzureApplicationInsightsClientConsole <instrumentationKey>");
         return;
      }

      if (args.Length == 1)
      {
         TelemetryConfiguration.Active.InstrumentationKey = args[0];
      }

      log.Debug("This is a Log4Net Debug message");
      log.Info("This is a Log4Net Info message");
      log.Warn("This is a Log4Net Warning message");
      log.Error("This is an Log4Net Error message");
      log.Fatal("This is a Log4Net Fatal message");

      TelemetryConfiguration.Active.TelemetryChannel.Flush();

      Console.WriteLine("Press <enter> to exit>");
      Console.ReadLine();
   }
}

I updated the Log4Net setup to use the ManagedColoredConsoleAppender which required a couple of modifications to the Log4Net.config file. (I had to remove HighIntensity)

 <appender name="ColoredConsoleAppender" type="log4net.Appender.ManagedColoredConsoleAppender">
      <mapping>
         <level value="ERROR" />
         <foreColor value="White" />
         <backColor value="Red" />
      </mapping>
      <mapping>
         <level value="DEBUG" />
         <backColor value="Green" />
      </mapping>
      <layout type="log4net.Layout.PatternLayout">
         <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
      </layout>
   </appender>

A sample project is available here.

“Don’t forget to flush” Application Insights Revisited

This post revisits a previous post “Don’t forget to flush” Application insights and shows how to configure the instrumentation key in code or via the ApplicationInsights.config file.

 class Program
   {
      static void Main(string[] args)
      {
#if INSTRUMENTATION_KEY_TELEMETRY_CONFIGURATION
         if (args.Length != 1)
         {
            Console.WriteLine("Usage AzureApplicationInsightsClientConsole <instrumentationKey>");
            return;
         }

         TelemetryConfiguration telemetryConfiguration = new TelemetryConfiguration(args[0]);
         TelemetryClient telemetryClient = new TelemetryClient(telemetryConfiguration);
         telemetryClient.TrackTrace("INSTRUMENTATION_KEY_TELEMETRY_CONFIGURATION", SeverityLevel.Information);
#endif
#if INSTRUMENTATION_KEY_APPLICATION_INSIGHTS_CONFIG
         TelemetryClient telemetryClient = new TelemetryClient();
         telemetryClient.TrackTrace("INSTRUMENTATION_KEY_APPLICATION_INSIGHTS_CONFIG", SeverityLevel.Information);
#endif
         telemetryClient.TrackTrace("This is an AI API Verbose message", SeverityLevel.Verbose);
         telemetryClient.TrackTrace("This is an AI API Information message", SeverityLevel.Information);
         telemetryClient.TrackTrace("This is an AI API Warning message", SeverityLevel.Warning);
         telemetryClient.TrackTrace("This is an AI API Error message", SeverityLevel.Error);
         telemetryClient.TrackTrace("This is an AI API Critical message", SeverityLevel.Critical);

         telemetryClient.Flush();

         Console.WriteLine("Press <enter> to exit");
         Console.ReadLine();
      }
   }

A sample project is available here.

Azure IoT Hub, Event Grid to Application Insights

For a second Proof of Concept (PoC) I wanted to upload sensor data from my MQTT LoRa Telemetry Field Gateway to an Azure IoT Hub, then use Azure Event Grid to subscribe to the stream of telemetry events and log the payloads in Azure Application Insights (the aim was minimal code, so no database etc.).

The first step was to create and deploy a simple Azure Function for unpacking the telemetry event payload.

Azure IoT Hub Azure Function Handler

Then wire the Azure Function to the Microsoft.Devices.DeviceTelemetry event type.

Azure IoT Hub Event Metrics

On the Windows 10 IoT Core device, in the Event Tracing for Windows (ETW) logging, I could see LoRa messages arriving and being unpacked.

Windows 10 Device ETW showing message payload

Then in Application Insights, after some mucking around with code, I could see the event payload in a series of Trace statements as it was unpacked.

{"id":"29108ebf-e5d5-7b95-e739-7d9048209d53","topic":"/SUBSCRIPTIONS/12345678-9012-3456-7890-123456789012/RESOURCEGROUPS/AZUREIOTHUBEVENTGRIDAZUREFUNCTION/PROVIDERS/MICROSOFT.DEVICES/IOTHUBS/FIELDGATEWAYHUB",
"subject":"devices/MQTTNetClient",
"eventType":"Microsoft.Devices.DeviceTelemetry",
"eventTime":"2020-02-01T04:30:51.427Z",
"data":
{
 "properties":{},
"systemProperties":{"iothub-connection-device-id":"MQTTNetClient","iothub-connection-auth-method":"{\"scope\":\"device\",\"type\":\"sas\",\"issuer\":\"iothub\",\"acceptingIpFilterRule\":null}",
"iothub-connection-auth-generation-id":"637149890997219611",
"iothub-enqueuedtime":"2020-02-01T04:30:51.427Z",
"iothub-message-source":"Telemetry"
},
"body":"eyJPZmZpY2VUZW1wZXJhdHVyZSI6IjIyLjUiLCJPZmZpY2VIdW1pZGl0eSI6IjkyIn0="
},
"dataVersion":"",
"metadataVersion":"1"
}
Application Insights logging with message unpacking
Application Insights logging message payload

Then the last log entry showed the decoded message payload.
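Decoding the Base64 body field shown above gives:

{"OfficeTemperature":"22.5","OfficeHumidity":"92"}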

/*
    Copyright ® 2020 Feb devMobile Software, All Rights Reserved
 
    MIT License

    Permission is hereby granted, free of charge, to any person obtaining a copy
    of this software and associated documentation files (the "Software"), to deal
    in the Software without restriction, including without limitation the rights
    to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
    copies of the Software, and to permit persons to whom the Software is
    furnished to do so, subject to the following conditions:

    The above copyright notice and this permission notice shall be included in all
    copies or substantial portions of the Software.

    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
    IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
    FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
    AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
    LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
    OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
    SOFTWARE

    Default URL for triggering event grid function in the local environment.
    http://localhost:7071/runtime/webhooks/EventGrid?functionName=functionname
 */
namespace EventGridProcessorAzureIotHub
{
   using System;
   using System.IO;
   using System.Reflection;

   using Microsoft.Azure.WebJobs;
   using Microsoft.Azure.EventGrid.Models;
   using Microsoft.Azure.WebJobs.Extensions.EventGrid;

   using log4net;
   using log4net.Config;
   using Newtonsoft.Json;

   public static class Telemetry
   {
      [FunctionName("Telemetry")]
      public static void Run([EventGridTrigger]Microsoft.Azure.EventGrid.Models.EventGridEvent eventGridEvent, ExecutionContext executionContext) //, TelemetryClient telemetryClient)
      {
         ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

         var logRepository = LogManager.GetRepository(Assembly.GetEntryAssembly());
         XmlConfigurator.Configure(logRepository, new FileInfo(Path.Combine(executionContext.FunctionAppDirectory, "log4net.config")));

         log.Info($"eventGridEvent.Data-{eventGridEvent}");

         log.Info($"eventGridEvent.Data.ToString()-{eventGridEvent.Data.ToString()}");

         IotHubDeviceTelemetryEventData iOThubDeviceTelemetryEventData = (IotHubDeviceTelemetryEventData)JsonConvert.DeserializeObject(eventGridEvent.Data.ToString(), typeof(IotHubDeviceTelemetryEventData));

         log.Info($"iOThubDeviceTelemetryEventData.Body.ToString()-{iOThubDeviceTelemetryEventData.Body.ToString()}");

         byte[] base64EncodedBytes = System.Convert.FromBase64String(iOThubDeviceTelemetryEventData.Body.ToString());

         log.Info($"System.Text.Encoding.UTF8.GetString(-{System.Text.Encoding.UTF8.GetString(base64EncodedBytes)}");
      }
   }
}

Overall it took roughly half a page of code (mainly generated by a tool) to unpack and log the contents of an Azure IoT Hub EventGrid payload to Application Insights.

Azure Function Log4Net configuration

This post was inspired by the couple of hours lost from my life yesterday while I figured out how to get Apache Log4Net and Azure Application Insights working in an Azure Function built with .Net Core 2.X.

After extensive searching I found a couple of relevant blog posts, but these took complex approaches and I wanted to keep the churn in the codebase I was working on to an absolute minimum.

With the different versions of the libraries involved (Late March 2019) this was what worked for me so YMMV. To provide the simplest possible example I have created a TimerTrigger which logs information via Log4Net to Azure Application Insights.

Initially the Log4Net configuration wasn’t loaded because its location is usually configured in the AssemblyInfo.cs file and .Net Core 2.x code doesn’t have one.

// You can specify all the values or you can default the Build and Revision Numbers
// by using the '*' as shown below:
// [assembly: AssemblyVersion("1.0.*")]
[assembly: AssemblyVersion("1.0.0.0")]
[assembly: AssemblyFileVersion("1.0.0.0")]
[assembly: log4net.Config.XmlConfigurator]

I figured I would have to manually load the Log4Net configuration, and had to look at the file system of the machine running the function to figure out where the Log4Net XML configuration file was getting copied to.

The “Copy to output directory” setting is important

Then I had to get the Dependency Injection (DI) framework to build an ExecutionContext for me so I could get the FunctionAppDirectory to combine with the Log4Net config file name. I used Path.Combine which is more robust and secure than manually concatenating segments of a path together.

/*
    Copyright ® 2019 March devMobile Software, All Rights Reserved
 
    MIT License
...
*/
namespace ApplicationInsightsAzureFunctionLog4NetClient
{
	using System;
	using System.IO;
	using System.Reflection;
	using log4net;
	using log4net.Config;
	using Microsoft.ApplicationInsights.Extensibility;
	using Microsoft.Azure.WebJobs;

	public static class ApplicationInsightsTimer
	{
		[FunctionName("ApplicationInsightsTimerLog4Net")]
		public static void Run([TimerTrigger("0 */1 * * * *")]TimerInfo myTimer, ExecutionContext executionContext)
		{
			ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

			TelemetryConfiguration.Active.InstrumentationKey = Environment.GetEnvironmentVariable("InstrumentationKey", EnvironmentVariableTarget.Process);

			var logRepository = LogManager.GetRepository(Assembly.GetEntryAssembly());
			XmlConfigurator.Configure(logRepository, new FileInfo(Path.Combine(executionContext.FunctionAppDirectory, "log4net.config")));

			log.Debug("This is a Log4Net Debug message");
			log.Info("This is a Log4Net Info message");
			log.Warn("This is a Log4Net Warning message");
			log.Error("This is a Log4Net Error message");
			log.Fatal("This is a Log4Net Fatal message");

			TelemetryConfiguration.Active.TelemetryChannel.Flush();
		}
	}
}
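The log4net.config being loaded above isn’t shown; a minimal version (assuming the Microsoft.ApplicationInsights.Log4NetAppender package, which provides the appender) looks something like this:

<log4net>
   <appender name="aiAppender" type="Microsoft.ApplicationInsights.Log4NetAppender.ApplicationInsightsAppender, Microsoft.ApplicationInsights.Log4NetAppender">
      <layout type="log4net.Layout.PatternLayout">
         <conversionPattern value="%message%newline" />
      </layout>
   </appender>
   <root>
      <level value="ALL" />
      <appender-ref ref="aiAppender" />
   </root>
</log4net>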

Log4Net logging in Azure Application Insights

The latest code for my Azure Function Log4net to Applications Insights sample along with some samples for other logging platforms is available on GitHub.

ASP MVC Core V2.1 and Cross-Origin Resource Sharing

I’m working on a project for a customer which implements a number of Application Programming Interfaces (APIs) for a Single Page Application (SPA) and other clients. We are using entity tags (ETags) for versioning, and the front-end developers found they couldn’t access them from JavaScript running in mainstream browser clients (June 2018).

The problem was understanding how Cross-Origin Resource Sharing (CORS) worked and how it interacted with our security model (API key and OAuth 2.0, depending on the client).

In our scenario we first found the pre-flight check wasn’t working because our X-API-KEY check was failing for the HyperText Transfer Protocol (HTTP) OPTIONS method.

OPTIONS http://xyz.azurewebsites.net/api/portfolio HTTP/1.1
...
Access-Control-Request-Headers: x-api-key
Access-Control-Request-Method: GET
Accept-Encoding: gzip, deflate
Content-Length: 0
Host: xyz.azurewebsites.net
Connection: Keep-Alive
Pragma: no-cache

HTTP/1.1 400 Bad Request
Transfer-Encoding: chunked
Server: Kestrel
X-Powered-By: ASP.NET
...
Date: Sun, 24 Jun 2018 05:48:30 GMT

13
API Key is invalid.
0

So I disabled X-API-KEY validation for OPTIONS requests in the API key validation middleware wired up in startup.cs

public async Task Invoke(HttpContext context)
{
   if (context.Request.Method == "OPTIONS")
   {
      await this.next.Invoke(context);
      return;
   }

   var claims = new List<Claim>();
…

The OPTIONS request then got past the API key check (though without CORS enabled it returned a 404)

OPTIONS http://xyz.azurewebsites.net/api/portfolio HTTP/1.1
...
Access-Control-Request-Headers: x-api-key
Access-Control-Request-Method: GET
Accept-Encoding: gzip, deflate
Content-Length: 0
Host: xyz.azurewebsites.net
Connection: Keep-Alive
Pragma: no-cache

HTTP/1.1 404 Not Found
Server: Kestrel
X-Powered-By: ASP.NET
...
Date: Sun, 24 Jun 2018 05:52:20 GMT
Content-Length: 0

I then turned on CORS, allowing pretty much anything

public void ConfigureServices(IServiceCollection services)
{
   services.AddCors(options =>
   {
      options.AddPolicy("CorsPolicy",
      builder => builder.AllowAnyOrigin()
         .AllowAnyMethod()
         .AllowAnyHeader()
         .AllowCredentials());
   });
   services.AddMvc();
}

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
   if (env.IsDevelopment())
   {
      app.UseDeveloperExceptionPage();
   }

   TelemetryConfiguration.Active.InstrumentationKey = this.configuration.GetSection("ApplicationInsights").GetSection("InstrumentationKey").Value;

   loggerFactory.AddLog4Net();
   this.log.Info("Startup.Configure called");

   app.ApplyUserKeyValidation();
   app.UseCors("CorsPolicy");
   app.UseMvc();
   }
}

OPTIONS then worked

OPTIONS http://xyz.azurewebsites.net/api/portfolio HTTP/1.1
...
Access-Control-Request-Headers: x-api-key
Access-Control-Request-Method: GET
Accept-Encoding: gzip, deflate
Content-Length: 0
Host: xyz.azurewebsites.net
Connection: Keep-Alive
Pragma: no-cache

HTTP/1.1 204 No Content
Vary: Origin
Server: Kestrel
Access-Control-Allow-Credentials: true
Access-Control-Allow-Headers: x-api-key
Access-Control-Allow-Origin: file://
X-Powered-By: ASP.NET
...
Date: Sun, 24 Jun 2018 05:57:33 GMT

GET then worked

GET http://xyz.azurewebsites.net/api/portfolio HTTP/1.1
...
X-API-KEY: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Accept-Language: en-NZ
Accept-Encoding: gzip, deflate
If-None-Match: 00-00-00-00-00-00-00-76
Host: xyz.azurewebsites.net
Connection: Keep-Alive

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
ETag: 00-00-00-00-00-00-00-76
Vary: Origin,Accept-Encoding
Server: Kestrel
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: file://
X-Powered-By: ASP.NET
...
Date: Sun, 24 Jun 2018 05:57:34 GMT
Content-Length: 2216

[{"...."}}

But HEAD didn’t work

HEAD http://xyz.azurewebsites.net/api/portfolio HTTP/1.1
...
X-API-KEY: ABCDEFGHIJKLMNOPQRSTUVWXYZ
Accept-Language: en-NZ
Accept-Encoding: gzip, deflate
If-None-Match: 00-00-00-00-00-00-00-76
Host: xyz.azurewebsites.net
Connection: Keep-Alive

HTTP/1.1 400 Bad Request
Content-Length: 0
Vary: Origin
Server: Kestrel
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: file://
X-Powered-By: ASP.NET
...
Date: Sun, 24 Jun 2018 05:59:55 GMT

From the Application Insights logging and the RestTest client (which I ran locally and remotely) I could see that the client-side code couldn’t access the value of our ETag. It had to be “exposed”.

public void ConfigureServices(IServiceCollection services)
{
   services.AddCors(options =>
   {
      options.AddPolicy("CorsPolicy",
         builder => builder.AllowAnyOrigin()
            .AllowAnyMethod()
            .AllowAnyHeader()
            .WithExposedHeaders("etag")
            .AllowCredentials());
   });
   services.AddMvc();
}
...

GET then worked

GET http://xyz.azurewebsites.net/api/portfolio HTTP/1.1
...
X-API-KEY: ABCDEFGHIJKLMNOPQRSTUVWXYZ
ETag: 00-00-00-00-00-00-00-76
Accept-Language: en-NZ
Accept-Encoding: gzip, deflate
If-None-Match: 00-00-00-00-00-00-00-76
Host: xyz.azurewebsites.net
Connection: Keep-Alive

HTTP/1.1 200 OK
Transfer-Encoding: chunked
Content-Type: application/json; charset=utf-8
Content-Encoding: gzip
ETag: 00-00-00-00-00-00-00-76
Vary: Origin,Accept-Encoding
Server: Kestrel
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: file://
X-Powered-By: ASP.NET
...
Date: Sun, 24 Jun 2018 07:53:41 GMT

[{"...."}}

HEAD then worked

OPTIONS http://xyz.azurewebsites.net/api/portfolio HTTP/1.1
...
Access-Control-Request-Headers: x-api-key,etag
Access-Control-Request-Method: HEAD
Accept-Encoding: gzip, deflate
Content-Length: 0
Host: xyz.azurewebsites.net
Connection: Keep-Alive
Pragma: no-cache

HTTP/1.1 204 No Content
Vary: Origin
Server: Kestrel
Access-Control-Allow-Credentials: true
Access-Control-Allow-Headers: x-api-key,etag
Access-Control-Allow-Origin: file://
X-Powered-By: ASP.NET
...
Date: Sun, 24 Jun 2018 07:57:31 GMT

HEAD http://xyz.azurewebsites.net/api/portfolio HTTP/1.1
...
X-API-KEY: ABCDEFGHIJKLMNOPQRSTUVWXYZ
ETag: 00-00-00-00-00-00-00-76
Accept-Language: en-NZ
Accept-Encoding: gzip, deflate
Host: xyz.azurewebsites.net
Connection: Keep-Alive
Pragma: no-cache

HTTP/1.1 304 Not Modified
Vary: Origin
Server: Kestrel
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: file://
X-Powered-By: ASP.NET
...
Date: Sun, 24 Jun 2018 07:57:31 GMT

I had some oddness when releasing code updates which I think was down to caching of pre-flight request responses.
Next steps: tidy up the headers etc. and lock the CORS configuration down to expose the minimum necessary for the application to work.
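A locked-down policy would look something like this — the origin, header and method lists are placeholders to be trimmed to what the SPA and other clients actually need:

public void ConfigureServices(IServiceCollection services)
{
   services.AddCors(options =>
   {
      options.AddPolicy("CorsPolicy",
         builder => builder.WithOrigins("https://spa.example.com")
            .WithMethods("GET", "HEAD", "OPTIONS")
            .WithHeaders("X-API-KEY", "Content-Type")
            .WithExposedHeaders("ETag")
            .AllowCredentials());
   });
   services.AddMvc();
}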