Building Cloud AI with Copilot – Faster R-CNN Azure HTTP Function Performance Setup

Introduction

The Faster R-CNN Azure HTTP Trigger function performed (not unexpectedly) differently when invoked with Fiddler Classic against the Azure Functions emulator vs. when deployed to an Azure App Service plan.

The code used is a “tidied up” version of the code from the Building Cloud AI with Copilot – Faster R-CNN Azure HTTP Function “Dog Food” post.

public class Function1
{
   private readonly ILogger<Function1> _logger;
   private readonly List<string> _labels;
   private readonly InferenceSession _session;

   public Function1(ILogger<Function1> logger)
   {
      _logger = logger;
      _labels = File.ReadAllLines(Path.Combine(AppContext.BaseDirectory, "labels.txt")).ToList();
      _session = new InferenceSession(Path.Combine(AppContext.BaseDirectory, "FasterRCNN-10.onnx"));
   }

   [Function("ObjectDetectionFunction")]
   public async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req)
   {
      if (string.IsNullOrEmpty(req.ContentType) || !req.ContentType.StartsWith("image/"))
         return new BadRequestObjectResult("Content-Type must be an image.");

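      // Buffer the request body so ImageSharp can decode it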
      using var ms = new MemoryStream();
      await req.Body.CopyToAsync(ms);
      ms.Position = 0;

      using var image = Image.Load<Rgb24>(ms);
      var inputTensor = PreprocessImage(image);

      var inputs = new List<NamedOnnxValue>
                  {
                      NamedOnnxValue.CreateFromTensor("image", inputTensor)
                  };

      using IDisposableReadOnlyCollection<DisposableNamedOnnxValue> results = _session.Run(inputs);
      var output = results.ToDictionary(x => x.Name, x => x.Value);

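      // Output tensor names are specific to FasterRCNN-10.onnx (inspected with Netron)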
      var boxes = (DenseTensor<float>)output["6379"];
      var labels = (DenseTensor<long>)output["6381"];
      var scores = (DenseTensor<float>)output["6383"];

      var detections = new List<object>();
      for (int i = 0; i < scores.Length; i++)
      {
         if (scores[i] > 0.5)
         {
            detections.Add(new
            {
               label = _labels[(int)labels[i]],
               score = scores[i],
               box = new
               {
                  x1 = boxes[i, 0],
                  y1 = boxes[i, 1],
                  x2 = boxes[i, 2],
                  y2 = boxes[i, 3]
               }
            });
         }
      }
      return new OkObjectResult(detections);
   }

   private static DenseTensor<float> PreprocessImage(Image<Rgb24> image)
   {
      // Step 1: Resize so that min(H, W) = 800, max(H, W) <= 1333, keeping aspect ratio
      int origWidth = image.Width;
      int origHeight = image.Height;
      int minSize = 800;
      int maxSize = 1333;

      float scale = Math.Min((float)minSize / Math.Min(origWidth, origHeight),
                             (float)maxSize / Math.Max(origWidth, origHeight));

      int resizedWidth = (int)Math.Round(origWidth * scale);
      int resizedHeight = (int)Math.Round(origHeight * scale);

      image.Mutate(x => x.Resize(resizedWidth, resizedHeight));

      // Step 2: Pad so that both dimensions are divisible by 32
      int padWidth = ((resizedWidth + 31) / 32) * 32;
      int padHeight = ((resizedHeight + 31) / 32) * 32;

      var paddedImage = new Image<Rgb24>(padWidth, padHeight);
      paddedImage.Mutate(ctx => ctx.DrawImage(image, new Point(0, 0), 1f));

      // Step 3: Convert to BGR and normalize
      float[] mean = { 102.9801f, 115.9465f, 122.7717f };
      var tensor = new DenseTensor<float>(new[] { 3, padHeight, padWidth });

      for (int y = 0; y < padHeight; y++)
      {
         for (int x = 0; x < padWidth; x++)
         {
            Rgb24 pixel = default;
            if (x < resizedWidth && y < resizedHeight)
               pixel = paddedImage[x, y];

            tensor[0, y, x] = pixel.B - mean[0];
            tensor[1, y, x] = pixel.G - mean[1];
            tensor[2, y, x] = pixel.R - mean[2];
         }
      }

      paddedImage.Dispose();

      return tensor;
   }
}

For my initial testing against the Azure Functions emulator I used Fiddler Classic to manually generate 10 requests, then replayed them sequentially, and finally concurrently.

The manual and sequential results were fairly consistent, but the 10 concurrent requests each took more than 10x longer. In addition, the CPU was at 100% usage while the concurrently executing functions were running.
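My (unverified) suspicion is that ONNX Runtime defaults to using all available cores for a single inference, so ten concurrent inferences end up fighting over the same CPUs. A minimal sketch of one possible mitigation, capping ONNX Runtime's threading with a SessionOptions when the InferenceSession is constructed:

// Sketch: limit ONNX Runtime threading so concurrent requests don't
// oversubscribe the CPU. The values are illustrative, not tuned.
var sessionOptions = new SessionOptions()
{
   IntraOpNumThreads = 2, // threads used within a single operator
   ExecutionMode = ExecutionMode.ORT_SEQUENTIAL // don't run operators in parallel
};

_session = new InferenceSession(Path.Combine(AppContext.BaseDirectory, "FasterRCNN-10.onnx"), sessionOptions);

Whether this helps depends on where the contention actually is, so it is something to measure rather than assume.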

Cloud Deployment

To see how the Faster R-CNN Azure HTTP Trigger function performed I created four resource groups.

The first contained resources used by the three different deployment models being tested.

The second resource group was for testing a Dedicated hosting plan deployment.

The third resource group was for testing an Azure Functions Consumption plan hosting.

The fourth resource group was for testing Azure Functions Flex Consumption plan hosting.

Summary

The next couple of posts will compare the deployment options and look at ways of improving the “performance” (scalability, execution duration, latency, jitter, billing etc.) of the GitHub Copilot generated code.

Building Cloud AI with Copilot – Faster R-CNN Azure HTTP Function SKU Results

Introduction

While testing the FasterRCNNObjectDetectionHttpTrigger function with Telerik Fiddler Classic and my “standard” test image I noticed the response bodies were different sizes.

Initially the App Service plan was an S1 SKU (1 vCPU, 1.75GB RAM).

The output JSON was 641 bytes

[
  {
    "label": "person",
    "score": 0.9998331,
    "box": {
      "x1": 445.9223, "y1": 124.11987, "x2": 891.18915, "y2": 696.37164
    }
  },
  {
    "label": "person",
    "score": 0.9994991,
    "box": {
      "x1": 0, "y1": 330.16595, "x2": 471.0475, "y2": 761.35846
    }
  },
  {
    "label": "baseball bat",
    "score": 0.9952342,
    "box": { "x1": 869.8053, "y1": 336.96188, "x2": 1063.2261, "y2": 467.74136
    }
  },
  {
    "label": "sports ball",
    "score": 0.9945949,
    "box": { "x1": 1040.916, "y1": 372.41507, "x2": 1071.8958, "y2": 402.50424
    }
  },
  {
    "label": "baseball glove",
    "score": 0.9943546,
    "box": {
      "x1": 377.8922, "y1": 431.95053, "x2": 458.4937, "y2": 536.52124
    }
  },
  {
    "label": "person",
    "score": 0.51779467,
    "box": {
      "x1": 0, "y1": 239.91418, "x2": 60.342667, "y2": 397.17004
    }
  }
]

The App Service plan was scaled to a Premium v3 P0V3 (1 vCPU, 4GB RAM).

The output JSON was 637 bytes

[
  {
    "label": "person",
    "score": 0.9998332,
    "box": {
      "x1": 445.9223, "y1": 124.1199, "x2": 891.18915, "y2": 696.3716
    }
  },
  {
    "label": "person",
    "score": 0.9994991,
    "box": { "x1": 0, "y1": 330.16595, "x2": 471.0475, "y2": 761.35846
    }
  },
  {
    "label": "baseball bat",
    "score": 0.9952342,
    "box": {
      "x1": 869.8053, "y1": 336.9619, "x2": 1063.2261, "y2": 467.74133
    }
  },
  {
    "label": "sports ball",
    "score": 0.994595,
    "box": {
      "x1": 1040.916, "y1": 372.41507, "x2": 1071.8958, "y2": 402.50424
    }
  },
  {
    "label": "baseball glove",
    "score": 0.9943546,
    "box": {
      "x1": 377.8922, "y1": 431.95053, "x2": 458.4937, "y2": 536.52124
    }
  },
  {
    "label": "person",
    "score": 0.51779467,
    "box": {
      "x1": 0, "y1": 239.91418, "x2": 60.342667, "y2": 397.17004
    }
  }
]

The App Service plan was scaled to a Premium v3 P1V3 (2 vCPU, 8GB RAM).

The output JSON was 641 bytes

[
  {
    "label": "person",
    "score": 0.9998331,
    "box": {
      "x1": 445.9223, "y1": 124.11987, "x2": 891.18915, "y2": 696.37164
    }
  },
  {
    "label": "person",
    "score": 0.9994991,
    "box": {
      "x1": 0, "y1": 330.16595, "x2": 471.0475, "y2": 761.35846
    }
  },
  {
    "label": "baseball bat",
    "score": 0.9952342,
    "box": {
      "x1": 869.8053, "y1": 336.96188, "x2": 1063.2261, "y2": 467.74136
    }
  },
  {
    "label": "sports ball",
    "score": 0.9945949,
    "box": {
      "x1": 1040.916, "y1": 372.41507, "x2": 1071.8958, "y2": 402.50424
    }
  },
  {
    "label": "baseball glove",
    "score": 0.9943546,
    "box": {
      "x1": 377.8922, "y1": 431.95053, "x2": 458.4937, "y2": 536.52124
    }
  },
  {
    "label": "person",
    "score": 0.51779467,
    "box": {
      "x1": 0, "y1": 239.91418, "x2": 60.342667, "y2": 397.17004
    }
  }
]

The App Service plan was scaled to a Premium v3 P2V3 (4 vCPU, 16GB RAM).

The output JSON was 641 bytes

[
  {
    "label": "person",
    "score": 0.9998331,
    "box": {
      "x1": 445.9223, "y1": 124.11987, "x2": 891.18915, "y2": 696.37164
    }
  },
  {
    "label": "person",
    "score": 0.9994991,
    "box": {
      "x1": 0, "y1": 330.16595, "x2": 471.0475, "y2": 761.35846
    }
  },
  {
    "label": "baseball bat",
    "score": 0.9952342,
    "box": {
      "x1": 869.8053, "y1": 336.96188, "x2": 1063.2261, "y2": 467.74136
    }
  },
  {
    "label": "sports ball",
    "score": 0.9945949,
    "box": {
      "x1": 1040.916, "y1": 372.41507, "x2": 1071.8958, "y2": 402.50424
    }
  },
  {
    "label": "baseball glove",
    "score": 0.9943546,
    "box": {
      "x1": 377.8922, "y1": 431.95053, "x2": 458.4937, "y2": 536.52124 }
  },
  {
    "label": "person",
    "score": 0.51779467,
    "box": {
      "x1": 0, "y1": 239.91418, "x2": 60.342667, "y2": 397.17004
    }
  }
]

The App Service plan was scaled to a Premium v2 P1V2 (1 vCPU, 3.5GB RAM).

The output JSON was 637 bytes

[
  {
    "label": "person",
    "score": 0.9998332,
    "box": {
      "x1": 445.9223, "y1": 124.1199, "x2": 891.18915, "y2": 696.3716
    }
  },
  {
    "label": "person",
    "score": 0.9994991,
    "box": {
      "x1": 0, "y1": 330.16595, "x2": 471.0475, "y2": 761.35846
    }
  },
  {
    "label": "baseball bat",
    "score": 0.9952342,
    "box": {
      "x1": 869.8053, "y1": 336.9619, "x2": 1063.2261, "y2": 467.74133
    }
  },
  {
    "label": "sports ball",
    "score": 0.994595,
    "box": {
      "x1": 1040.916, "y1": 372.41507, "x2": 1071.8958, "y2": 402.50424
    }
  },
  {
    "label": "baseball glove",
    "score": 0.9943546,
    "box": {
      "x1": 377.8922, "y1": 431.95053, "x2": 458.4937, "y2": 536.52124
    }
  },
  {
    "label": "person",
    "score": 0.51779467,
    "box": {
      "x1": 0, "y1": 239.91418, "x2": 60.342667, "y2": 397.17004
    }
  }
]

Summary

The differences between the 637 byte and 641 byte responses were small, just the low-order digits of some of the scores and box coordinates.

I'm not certain why this happens; my current best guess is memory pressure.

Building Cloud AI with Copilot – Faster R-CNN Azure HTTP Function “Dog Food”

Introduction

A couple of months ago a web crawler visited every page on my website (it would be interesting to know if my GitHub repositories were crawled as well) and I wondered if this might impact my Copilot or GitHub Copilot experiments. My blogging about Azure HTTP Trigger functions with Ultralytics YOLO, YoloSharp, ResNet, Faster R-CNN, Open Neural Network Exchange (ONNX) etc. is fairly “niche” so any improvements in the understanding of the problems and generated code might be visible.

please write an httpTrigger azure function that uses Faster RCNN and ONNX to detect the object in an image uploaded in the body of an HTTP Post

GitHub Copilot had used SixLabors ImageSharp, the ILogger was injected into the constructor, the code checked that the image was in the body of the HTTP POST, and the object classes were loaded from a text file. I had to manually add some NuGet packages and using directives before the code compiled and ran in the emulator, but this was a definite improvement.

To test the implementation, I used Telerik Fiddler Classic to HTTP POST my “standard” test image to the function.

Github Copilot had generated code that checked that the image was in the body of the HTTP POST so I had to modify the Telerik Fiddler Classic request.

I also had to fix up the Content-Type header.

The path to the ONNX file was wrong, and I had to create a labels.txt file from Python code.

The Azure HTTP Trigger function ran but failed because the preprocessing of the image didn’t implement the specified preprocessing steps.

Change DenseTensor to BGR (based on https://github.com/onnx/models/tree/main/validated/vision/object_detection_segmentation/faster-rcnn#preprocessing-steps)

Normalise colour values with mean = [102.9801, 115.9465, 122.7717]

The Azure HTTP Trigger function ran but failed because the output tensor names were incorrect.

I used Netron to inspect the model properties to get the correct names for the output tensors.
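A code-only alternative (a sketch using the _session and _logger fields from the constructor below) is to log the model's input and output metadata at startup rather than opening the model in Netron:

foreach (var input in _session.InputMetadata)
{
   _logger.LogInformation("Input name:{0} dimensions:{1}", input.Key, string.Join(",", input.Value.Dimensions));
}

foreach (var output in _session.OutputMetadata)
{
   _logger.LogInformation("Output name:{0} dimensions:{1}", output.Key, string.Join(",", output.Value.Dimensions));
}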

I had a couple of attempts at resizing the image to see what impact this had on the accuracy of the confidence values and minimum bounding rectangles (MBRs).

resize the image such that both height and width are within the range of [800, 1333], and then pad the image with zeros such that both height and width are divisible by 32.

modify the code to resize the image such that both height and width are within the range of [800, 1333], and then pad the image with zeros such that both height and width are divisible by 32 and the aspect ratio is not changed.

The final version of the image processing code scaled the image, then padded it on the right and bottom, keeping the aspect ratio and the MBR coordinates correct.

As a final test I deployed the code to Azure, and the first time I ran the function it failed: the labels file couldn’t be found because Unix file paths are case sensitive (labels.txt vs. Labels.txt).
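The fix, which is what the “tidied up” version of the code in the Performance Setup post does, was to load both files relative to AppContext.BaseDirectory with casing that exactly matches the deployed filenames:

_labels = File.ReadAllLines(Path.Combine(AppContext.BaseDirectory, "labels.txt")).ToList();
_session = new InferenceSession(Path.Combine(AppContext.BaseDirectory, "FasterRCNN-10.onnx"));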

The inferencing time was a bit longer than I expected.

// please write an httpTrigger azure function that uses Faster RCNN and ONNX to detect the object in an image uploaded in the body of an HTTP Post
//    manually added the ML.Net ONNX NuGet + using directives
//    manually added the ImageSharp NuGet + using directives
//    Used Copilot to add Microsoft.ML.OnnxRuntime.Tensors using directive
//    Manually added ONNX File + labels file, sorted out paths
//    Used Netron to fixup output tensor names
// Change DenseTensor to BGR (based on https://github.com/onnx/models/tree/main/validated/vision/object_detection_segmentation/faster-rcnn#preprocessing-steps)
// Normalise colour values with mean = [102.9801, 115.9465, 122.7717]
// resize the image such that both height and width are within the range of [800, 1333], and then pad the image with zeros such that both height and width are divisible by 32.
// modify the code to resize the image such that both height and width are within the range of [800, 1333], and then pad the image with zeros such that both height and width are divisible by 32 and the aspect ratio is not changed.
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;
using SixLabors.ImageSharp; // Couldn't get IntelliSense after adding NuGet package
using SixLabors.ImageSharp.PixelFormats; // Couldn't get IntelliSense after adding NuGet package
using SixLabors.ImageSharp.Processing; // Couldn't get IntelliSense after adding NuGet package


namespace FasterRCNNObjectDetectionHttpTriggerGithubCopilot
{
   public class Function1
   {
      private readonly ILogger<Function1> _logger;
      private readonly InferenceSession _session;
      private readonly List<string> _labels;

      public Function1(ILogger<Function1> logger)
      {
         _logger = logger;
         _session = new InferenceSession("FasterRCNN-10.onnx");
         _labels = File.ReadAllLines("labels.txt").ToList();
      }

      [Function("ObjectDetectionFunction")]
      public async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req)
      {
         if (!req.ContentType.StartsWith("image/"))
            return new BadRequestObjectResult("Content-Type must be an image.");

         using var ms = new MemoryStream();
         await req.Body.CopyToAsync(ms);
         ms.Position = 0;

         using var image = Image.Load<Rgb24>(ms);
         var inputTensor = PreprocessImage(image);

         var inputs = new List<NamedOnnxValue>
                  {
                      NamedOnnxValue.CreateFromTensor("image", inputTensor)
                  };

         using IDisposableReadOnlyCollection<DisposableNamedOnnxValue> results = _session.Run(inputs);
         var output = results.ToDictionary(x => x.Name, x => x.Value);

         var boxes = (DenseTensor<float>)output["6379"];
         var labels = (DenseTensor<long>)output["6381"];
         var scores = (DenseTensor<float>)output["6383"];

         var detections = new List<object>();
         for (int i = 0; i < scores.Length; i++)
         {
            if (scores[i] > 0.5)
            {
               detections.Add(new
               {
                  label = _labels[(int)labels[i]],
                  score = scores[i],
                  box = new
                  {
                     x1 = boxes[i, 0],
                     y1 = boxes[i, 1],
                     x2 = boxes[i, 2],
                     y2 = boxes[i, 3]
                  }
               });
            }
         }

         return new OkObjectResult(detections);
      }

      private static DenseTensor<float> PreprocessImage(Image<Rgb24> image)
      {
         // Step 1: Resize so that min(H, W) = 800, max(H, W) <= 1333, keeping aspect ratio
         int origWidth = image.Width;
         int origHeight = image.Height;
         int minSize = 800;
         int maxSize = 1333;

         float scale = Math.Min((float)minSize / Math.Min(origWidth, origHeight),
                                (float)maxSize / Math.Max(origWidth, origHeight));
         /*
         float scale = 1.0f;

         // If either dimension is less than 800, scale up so the smaller is 800
         if (origWidth < minSize || origHeight < minSize)
         {
            scale = Math.Max((float)minSize / origWidth, (float)minSize / origHeight);
         }
         // If either dimension is greater than 1333, scale down so the larger is 1333
         if (origWidth * scale > maxSize || origHeight * scale > maxSize)
         {
            scale = Math.Min((float)maxSize / origWidth, (float)maxSize / origHeight);
         }
         */

         int resizedWidth = (int)Math.Round(origWidth * scale);
         int resizedHeight = (int)Math.Round(origHeight * scale);

         image.Mutate(x => x.Resize(resizedWidth, resizedHeight));

         // Step 2: Pad so that both dimensions are divisible by 32
         int padWidth = ((resizedWidth + 31) / 32) * 32;
         int padHeight = ((resizedHeight + 31) / 32) * 32;

         var paddedImage = new Image<Rgb24>(padWidth, padHeight);
         paddedImage.Mutate(ctx => ctx.DrawImage(image, new Point(0, 0), 1f));

         // Step 3: Convert to BGR and normalize
         float[] mean = { 102.9801f, 115.9465f, 122.7717f };
         var tensor = new DenseTensor<float>(new[] { 3, padHeight, padWidth });

         for (int y = 0; y < padHeight; y++)
         {
            for (int x = 0; x < padWidth; x++)
            {
               Rgb24 pixel = default;
               if (x < resizedWidth && y < resizedHeight)
                  pixel = paddedImage[x, y];

               tensor[0, y, x] = pixel.B - mean[0];
               tensor[1, y, x] = pixel.G - mean[1];
               tensor[2, y, x] = pixel.R - mean[2];
            }
         }

         paddedImage.Dispose();
         return tensor;
      }
   }
}

It took roughly an hour to “vibe code” the function, but it would have taken much longer for someone not familiar with the problem domain.

Summary

The GitHub Copilot generated code was okay but would be fragile, performance would suck, and it would not scale terribly well.

The Copilot generated code in this post is not suitable for production.

Building Cloud AI with Copilot – ResNet50 Azure HTTP Function

Introduction

This is another awfully long post about my experience using Copilot to write an Azure HTTP Trigger function that runs a ResNet50 v2.7 Open Neural Network Exchange (ONNX) model on an image in the body of the HTTP POST.

For testing I was uploading the images with Telerik Fiddler Classic.

I forgot to specify language, so Copilot assumed (reasonably) that I wanted a Python Azure HTTP Trigger function.

The initial C# version wouldn’t compile because of the FunctionName attribute, which is used for in-process Azure Functions. It did seem a bit odd that Copilot would generate code for a programming model whose support ends on November 10, 2026.

public static class Function1
{
   private static readonly InferenceSession session = new InferenceSession("resnet50.onnx");

   [FunctionName("ImageClassification")]
   public static IActionResult Run(
       [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
       ILogger log)
   {
      log.LogInformation("Processing image classification request...");

      try
      {
         using var ms = new MemoryStream();
         req.Body.CopyTo(ms);
         using var image = Image.FromStream(ms);

         var inputTensor = PreprocessImage(image);

         var inputName = session.InputMetadata.Keys.First();
         var outputName = session.OutputMetadata.Keys.First();
         var result = session.Run(new Dictionary<string, NamedOnnxValue>
            {
                { inputName, NamedOnnxValue.CreateFromTensor(inputName, inputTensor) }
            });

         var predictions = result.First().AsTensor<float>().ToArray();

         return new JsonResult(new { predictions });
      }
      catch (Exception ex)
      {
         log.LogError($"Error: {ex.Message}");
         return new BadRequestObjectResult("Invalid image or request.");
      }
   }
...
}

It was just easier to change the FunctionName attribute manually.

public static class Function1
{
   private static readonly InferenceSession session = new InferenceSession("resnet50.onnx");

   [Function("ImageClassification")]
   public static IActionResult Run(
       [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
       ILogger log)
   {
      log.LogInformation("Processing image classification request...");

      try
      {
         using var ms = new MemoryStream();
         req.Body.CopyTo(ms);
         using var image = Image.FromStream(ms);

         var inputTensor = PreprocessImage(image);

         var inputName = session.InputMetadata.Keys.First();
         var outputName = session.OutputMetadata.Keys.First();
         var inputList = new List<NamedOnnxValue>
            {
                NamedOnnxValue.CreateFromTensor(inputName, inputTensor)
            };

         var result = session.Run(inputList);

         var predictions = result.First().AsTensor<float>().ToArray();

         return new JsonResult(new { predictions });
      }
      catch (Exception ex)
      {
         log.LogError($"Error: {ex.Message}");
         return new BadRequestObjectResult("Invalid image or request.");
      }
   }

The Azure HTTP Trigger function ran but failed when I tried to classify an image.

The initialisation of the ILogger injected into the Run method was broken so I used Copilot to update the code to use constructor Dependency Injection (DI).

public static class Function1
{
   private static readonly ILogger logger;
   private static readonly InferenceSession session = new InferenceSession("resnet50-v2-7.onnx");

   // Static constructor to initialize logger
   static Function1()
   {
      var loggerFactory = LoggerFactory.Create(builder =>
      {
         builder.AddConsole();
      });
      logger = loggerFactory.CreateLogger("Function1Logger");
   }

   [Function("ImageClassification")]
   public static IActionResult Run([HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
   {
      logger.LogInformation("Processing image classification request...");

      try
      {
         using var ms = new MemoryStream();
         req.Body.CopyTo(ms);
         using var image = Image.FromStream(ms);

         var inputTensor = PreprocessImage(image);

         var inputName = session.InputMetadata.Keys.First();
         var outputName = session.OutputMetadata.Keys.First();
         var inputList = new List<NamedOnnxValue>
            {
                NamedOnnxValue.CreateFromTensor(inputName, inputTensor)
            };

         var result = session.Run(inputList);

         var predictions = result.First().AsTensor<float>().ToArray();

         return new JsonResult(new { predictions });
      }
      catch (Exception ex)
      {
         logger.LogError($"Error: {ex.Message}");
         return new BadRequestObjectResult("Invalid image or request.");
      }
   }
...
}

It was a bit odd that Copilot generated a static function and constructor, unlike the equivalent YoloSharp Azure HTTP Trigger.

The Azure HTTP Trigger function ran but failed with a 400 Bad Request when I tried to classify an image.

After some debugging I realised that Telerik Fiddler Classic was sending the image as form data, so I modified the “composer” payload configuration.
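For anyone reproducing the test without Fiddler, a minimal sketch of an equivalent request (the image filename and emulator URL are placeholders for my test setup) posts the raw image bytes rather than multipart form data:

// Sketch: POST the raw image bytes with an image/* Content-Type
using var httpClient = new HttpClient();

using var content = new ByteArrayContent(await File.ReadAllBytesAsync("test.jpg"));
content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("image/jpeg");

var response = await httpClient.PostAsync("http://localhost:7071/api/ImageClassification", content);

Console.WriteLine(await response.Content.ReadAsStringAsync());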

The Azure HTTP Trigger function then ran, but the confidence values were incorrect, so I checked the ResNet50 pre-processing instructions

The image needs to be preprocessed before fed to the network. The first step is to extract a 224x224 crop from the center of the image. For this, the image is first scaled to a minimum size of 256x256, while keeping aspect ratio. That is, the shortest side of the image is resized to 256 and the other side is scaled accordingly to maintain the original aspect ratio. After that, the image is normalized with mean = 255*[0.485, 0.456, 0.406] and std = 255*[0.229, 0.224, 0.225]. Last step is to transpose it from HWC to CHW layout.

 private static Tensor<float> PreprocessImage(Image image)
 {
    var resized = new Bitmap(image, new Size(224, 224));
    var tensorData = new float[1 * 3 * 224 * 224];

    float[] mean = { 0.485f, 0.456f, 0.406f };
    float[] std = { 0.229f, 0.224f, 0.225f };

    for (int y = 0; y < 224; y++)
    {
       for (int x = 0; x < 224; x++)
       {
          var pixel = resized.GetPixel(x, y);

          tensorData[(0 * 3 * 224 * 224) + (0 * 224 * 224) + (y * 224) + x] = (pixel.R / 255.0f - mean[0]) / std[0];
          tensorData[(0 * 3 * 224 * 224) + (1 * 224 * 224) + (y * 224) + x] = (pixel.G / 255.0f - mean[1]) / std[1];
          tensorData[(0 * 3 * 224 * 224) + (2 * 224 * 224) + (y * 224) + x] = (pixel.B / 255.0f - mean[2]) / std[2];
       }
    }

    return new DenseTensor<float>(tensorData, new[] { 1, 3, 224, 224 });
 }
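Worth noting: the code above resizes straight to 224x224 rather than following the documented scale-to-256-then-centre-crop steps, which may be part of the discrepancy. A sketch of the documented preprocessing written against SixLabors ImageSharp (the library the function was later migrated to); the ResizeMode.Min and Crop combination is my assumption, not Copilot's output:

private static Tensor<float> PreprocessImage(Image<Rgb24> image)
{
   // Scale so the shortest side is 256, preserving the aspect ratio
   image.Mutate(x => x.Resize(new ResizeOptions() { Size = new Size(256, 256), Mode = ResizeMode.Min }));

   // Extract a 224x224 crop from the centre of the scaled image
   image.Mutate(x => x.Crop(new Rectangle((image.Width - 224) / 2, (image.Height - 224) / 2, 224, 224)));

   float[] mean = { 0.485f, 0.456f, 0.406f };
   float[] std = { 0.229f, 0.224f, 0.225f };

   var tensor = new DenseTensor<float>(new[] { 1, 3, 224, 224 });

   // Normalise each channel and transpose from HWC to CHW layout
   for (int y = 0; y < 224; y++)
   {
      for (int x = 0; x < 224; x++)
      {
         Rgb24 pixel = image[x, y];

         tensor[0, 0, y, x] = (pixel.R / 255.0f - mean[0]) / std[0];
         tensor[0, 1, y, x] = (pixel.G / 255.0f - mean[1]) / std[1];
         tensor[0, 2, y, x] = (pixel.B / 255.0f - mean[2]) / std[2];
      }
   }

   return tensor;
}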

When the “normalisation” code was implemented and the Azure HTTP Trigger function was run, the confidence values were still incorrect.

The Azure HTTP Trigger function was working reliably, but the number of results and the size of the response payload were unnecessarily large.

The Azure HTTP Trigger function ran but the confidence values were still incorrect, so I again checked the ResNet50 post-processing instructions.

Postprocessing
The post-processing involves calculating the softmax probability scores for each class. You can also sort them to report the most probable classes. Check imagenet_postprocess.py for code.
 // Compute exponentials for all scores
 var expScores = predictions.Select(MathF.Exp).ToArray();

 // Compute sum of exponentials
 float sumExpScores = expScores.Sum();

 // Normalize scores into probabilities
 var softmaxResults = expScores.Select(score => score / sumExpScores).ToArray();

 // Get top 10 predictions (label ID and confidence)
 var top10 = softmaxResults
     .Select((confidence, labelId) => new { labelId, confidence, label = labelId < labels.Count ? labels[labelId] : $"Unknown-{labelId}" })
     .OrderByDescending(p => p.confidence)
     .Take(10)
     .ToList();

The Azure HTTP Trigger function should run on multiple platforms, so System.Drawing.Common had to be replaced with SixLabors ImageSharp.

The Azure HTTP Trigger function ran but the SixLabors ImageSharp based image classification failed.

After some debugging I realised that the MemoryStream used to copy the HttpRequest body was not being reset.

[Function("ImageClassification")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
{
   logger.LogInformation("Processing image classification request...");

   try
   {
      using var ms = new MemoryStream();
      await req.Body.CopyToAsync(ms);

      ms.Seek(0, SeekOrigin.Begin);

      using var image = Image.Load<Rgb24>(ms);

      var inputTensor = PreprocessImage(image);
...   
   }
   catch (Exception ex)
   {
      logger.LogError($"Error: {ex.Message}");
      return new BadRequestObjectResult("Invalid image or request.");
   }
}

The odd thing was that the confidence values changed slightly when the code was modified to use SixLabors ImageSharp.

The Azure HTTP Trigger function worked but the labelId wasn’t that “human readable”.

public static class Function1
{
   private static readonly ILogger logger;
   private static readonly InferenceSession session = new InferenceSession("resnet50-v2-7.onnx");
   private static readonly List<string> labels = LoadLabels("labels.txt");
...
   [Function("ImageClassification")]
   public static async Task<IActionResult> Run(
       [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
   {
      logger.LogInformation("Processing image classification request...");

      try
      {
...
         // Get top 10 predictions (label ID and confidence)
         var top10 = softmaxResults
             .Select((confidence, labelId) => new { labelId, confidence, label = labelId < labels.Count ? labels[labelId] : $"Unknown-{labelId}" })
             .OrderByDescending(p => p.confidence)
             .Take(10)
             .ToList();

         return new JsonResult(new { predictions = top10 });
      }
      catch (Exception ex)
      {
         logger.LogError($"Error: {ex.Message}");
         return new BadRequestObjectResult("Invalid image or request.");
      }
   }
...
   private static List<string> LoadLabels(string filePath)
   {
      try
      {
         return File.ReadAllLines(filePath).ToList();
      }
      catch (Exception ex)
      {
         logger.LogError($"Error loading labels file: {ex.Message}");
         return new List<string>(); // Return empty list if file fails to load
      }
   }
}

Summary

The GitHub Copilot generated code was okay but would be fragile and not scale terribly well. The confidence values changing very slightly when the code was updated for SixLabors ImageSharp was disconcerting, but not surprising.

The Copilot generated code in this post is not suitable for production.

Azure Function SendGrid Binding Fail

This post is for Azure Function developers having issues with the SendGrid binding throwing exceptions like the one below.

System.Private.CoreLib: Exception while executing function: Functions.AzureBlobFileUploadEmailer. Microsoft.Azure.WebJobs.Extensions.SendGrid: A 'To' address must be specified for the message

My Azure BlobTrigger Function sends an email (with SendGrid) when a file is uploaded to an Azure Blob Storage Container (a couple of times a day).

public class FileUploadEmailer(ILogger<FileUploadEmailer> logger, IOptions<EmailSettings> emailSettings)
{
   private readonly ILogger<FileUploadEmailer> _logger = logger;
   private readonly EmailSettings _emailSettings = emailSettings.Value;

   [Function(nameof(AzureBlobFileUploadEmailer))]
   [SendGridOutput(ApiKey = "SendGridAPIKey")]
   public string Run([BlobTrigger("filestobeprocesed/{name}", Connection = "upload-file-storage")] Stream stream, string name)
   {
      _logger.LogInformation("FileUploadEmailer Blob trigger function Processed blob Name:{0} start", name);

      try
      {
         var message = new SendGridMessage();

         message.SetFrom(_emailSettings.From);
         message.AddTo(_emailSettings.To);
         message.Subject = _emailSettings.Subject;

         message.AddContent(MimeType.Html, string.Format(_emailSettings.BodyFormat, name, DateTime.UtcNow));

         // WARNING - Use Newtonsoft JSON serializer to produce JSON string. System.Text.Json won't work because property annotations are different
         var messageJson = Newtonsoft.Json.JsonConvert.SerializeObject(message);

         _logger.LogInformation("FileUploadEmailer Blob trigger function Processed blob Name:{0} finish", name);

         return messageJson;
      }
      catch (Exception ex)
      {
         _logger.LogError(ex, "FileUploadEmailer Blob trigger function Processed blob Name: {0}", name);

         throw;
      }
   }
}
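The EmailSettings class isn't shown in this post; based on how _emailSettings is used above it would look something like this (my assumption, not the actual class):

// Hypothetical shape of the EmailSettings options class, inferred from the
// _emailSettings usage above (From, To, Subject, BodyFormat)
public class EmailSettings
{
   public string From { get; set; }
   public string To { get; set; }
   public string Subject { get; set; }
   public string BodyFormat { get; set; }
}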

I missed the first clue when I looked at the JSON: the Tos, Ccs, and Bccs property names, rather than the to, cc, and bcc the SendGrid API expects.

{
"From":{"Name":"Foo","Email":"bryn.lewis@devmobile.co.nz"},
"Subject":"Hi 30/09/2024 1:27:49 pm",
"Personalizations":[{"Tos":[{"Name":"Bar","Email":"bryn.lewis@devmobile.co.nz"}],
"Ccs":null,
"Bccs":null,
"From":null,
"Subject":null,
"Headers":null,
"Substitutions":null,
"CustomArgs":null,
"SendAt":null,
"TemplateData":null}],
"Contents":[{"Type":"text/html","Value":"\u003Ch2\u003EHello AssemblyInfo.cs\u003C/h2\u003E"}],
"PlainTextContent":null,
"HtmlContent":null,
"Attachments":null,
"TemplateId":null,
"Headers":null,
"Sections":null,
"Categories":null,
"CustomArgs":null,
"SendAt":null,
"Asm":null,
"BatchId":null,
"IpPoolName":null,
"MailSettings":null,
"TrackingSettings":null,
"ReplyTo":null,
"ReplyTos":null
}

I wasn’t paying close enough attention to the sample code and used System.Text.Json rather than Newtonsoft.Json to serialize the SendGridMessage object. They use different attributes for property names etc. so the JSON generated was wrong.
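A minimal repro of the difference (the addresses are placeholders): Newtonsoft.Json honours the [JsonProperty] attributes on the SendGridMessage object model, while System.Text.Json ignores them and falls back to the C# property names, which is where the Tos, Ccs, and Bccs in the JSON above came from.

var message = new SendGridMessage();

message.SetFrom("sender@example.com", "Foo");
message.AddTo("recipient@example.com", "Bar");

// Newtonsoft.Json respects [JsonProperty] -> "from", "personalizations", "to"
Console.WriteLine(Newtonsoft.Json.JsonConvert.SerializeObject(message));

// System.Text.Json ignores [JsonProperty] -> "From", "Personalizations", "Tos"
Console.WriteLine(System.Text.Json.JsonSerializer.Serialize(message));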

Initially, I tried adding System.Text.Json attributes to the SendGridMessage class

namespace SendGrid.Helpers.Mail
{
   /// <summary>
   /// Class SendGridMessage builds an object that sends an email through Twilio SendGrid.
   /// </summary>
   [JsonObject(IsReference = false)]
   public class SendGridMessage
   {
      /// <summary>
      /// Gets or sets an email object containing the email address and name of the sender. Unicode encoding is not supported for the from field.
      /// </summary>
      //[JsonProperty(PropertyName = "from")]
      [JsonPropertyName("from")]
      public EmailAddress From { get; set; }

      /// <summary>
      /// Gets or sets the subject of your email. This may be overridden by personalizations[x].subject.
      /// </summary>
      //[JsonProperty(PropertyName = "subject")]
      [JsonPropertyName("subject")]
      public string Subject { get; set; }

      /// <summary>
      /// Gets or sets a list of messages and their metadata. Each object within personalizations can be thought of as an envelope - it defines who should receive an individual message and how that message should be handled. For more information, please see our documentation on Personalizations. Parameters in personalizations will override the parameters of the same name from the message level.
      /// https://sendgrid.com/docs/Classroom/Send/v3_Mail_Send/personalizations.html.
      /// </summary>
      //[JsonProperty(PropertyName = "personalizations", IsReference = false)]
      [JsonPropertyName("personalizations")]
      public List<Personalization> Personalizations { get; set; }
...
}

SendGridMessage uses other classes like EmailAddress, which worked because the property names matched the JSON.

namespace SendGrid.Helpers.Mail
{
    /// <summary>
    /// An email object containing the email address and name of the sender or recipient.
    /// </summary>
    [JsonObject(IsReference = false)]
    public class EmailAddress : IEquatable<EmailAddress>
    {
        /// <summary>
        /// Initializes a new instance of the <see cref="EmailAddress"/> class.
        /// </summary>
        public EmailAddress()
        {
        }

        /// <summary>
        /// Initializes a new instance of the <see cref="EmailAddress"/> class.
        /// </summary>
        /// <param name="email">The email address of the sender or recipient.</param>
        /// <param name="name">The name of the sender or recipient.</param>
        public EmailAddress(string email, string name = null)
        {
            this.Email = email;
            this.Name = name;
        }

        /// <summary>
        /// Gets or sets the name of the sender or recipient.
        /// </summary>
        [JsonProperty(PropertyName = "name")]
        public string Name { get; set; }

        /// <summary>
        /// Gets or sets the email address of the sender or recipient.
        /// </summary>
        [JsonProperty(PropertyName = "email")]
        public string Email { get; set; }
...
}

Many of the property name “mismatch” issues were in the Personalization class with the Tos, Ccs, Bccs etc. properties.

namespace SendGrid.Helpers.Mail
{
    /// <summary>
    /// An array of messages and their metadata. Each object within personalizations can be thought of as an envelope - it defines who should receive an individual message and how that message should be handled. For more information, please see our documentation on Personalizations. Parameters in personalizations will override the parameters of the same name from the message level.
    /// https://sendgrid.com/docs/Classroom/Send/v3_Mail_Send/personalizations.html.
    /// </summary>
    [JsonObject(IsReference = false)]
    public class Personalization
    {
        /// <summary>
        /// Gets or sets an array of recipients. Each email object within this array may contain the recipient’s name, but must always contain the recipient’s email.
        /// </summary>
        [JsonProperty(PropertyName = "to", IsReference = false)]
        [JsonConverter(typeof(RemoveDuplicatesConverter<EmailAddress>))]
        public List<EmailAddress> Tos { get; set; }

        /// <summary>
        /// Gets or sets an array of recipients who will receive a copy of your email. Each email object within this array may contain the recipient’s name, but must always contain the recipient’s email.
        /// </summary>
        [JsonProperty(PropertyName = "cc", IsReference = false)]
        [JsonConverter(typeof(RemoveDuplicatesConverter<EmailAddress>))]
        public List<EmailAddress> Ccs { get; set; }

        /// <summary>
        /// Gets or sets an array of recipients who will receive a blind carbon copy of your email. Each email object within this array may contain the recipient’s name, but must always contain the recipient’s email.
        /// </summary>
        [JsonProperty(PropertyName = "bcc", IsReference = false)]
        [JsonConverter(typeof(RemoveDuplicatesConverter<EmailAddress>))]
        public List<EmailAddress> Bccs { get; set; }

        /// <summary>
        /// Gets or sets the from email address. The domain must match the domain of the from email property specified at root level of the request body.
        /// </summary>
        [JsonProperty(PropertyName = "from")]
        public EmailAddress From { get; set; }

        /// <summary>
        /// Gets or sets the subject line of your email.
        /// </summary>
        [JsonProperty(PropertyName = "subject")]
        public string Subject { get; set; }
...
}

After a couple of failed attempts at decorating the SendGrid SendGridMessage, EmailAddress, Personalization etc. classes I gave up and reverted to the Newtonsoft.Json serialiser.

Note to self – pay closer attention to the samples.

Myriota Connector – Azure IoT Hub DTDL Support

The Myriota connector supports the use of Digital Twin Definition Language (DTDL) for Azure IoT Hub Connection Strings and the Azure IoT Hub Device Provisioning Service (DPS).

{
  "ConnectionStrings": {
    "ApplicationInsights": "...",
    "UplinkQueueStorage": "...",
    "PayloadFormattersStorage": "..."
  },
  "AzureIoT": {
    ...
    "ApplicationToDtdlModelIdMapping": {
      "tracker": "dtmi:myriotaconnector:Tracker_2lb;1"
    }
  }
  ...
}

The Digital Twin Definition Language (DTDL) configuration used when a device is provisioned, or when it connects, is determined by the payload application, which is based on the Myriota Destination endpoint.

The Azure Function Configuration of Application to DTDL Model ID

BEWARE – The application key in ApplicationToDtdlModelIdMapping is case sensitive!
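One way to make the lookup forgiving (a sketch, assuming ApplicationToDtdlModelIdMapping is deserialised into a Dictionary<string, string>) is to rebuild it with a case-insensitive comparer when the settings are loaded:

// Sketch: copy the configuration mapping into a case-insensitive dictionary
var applicationToDtdlModelIdMapping = new Dictionary<string, string>(
   _azureIoTSettings.ApplicationToDtdlModelIdMapping, StringComparer.OrdinalIgnoreCase);

// "tracker", "Tracker" and "TRACKER" all resolve to the same model ID
applicationToDtdlModelIdMapping.TryGetValue("Tracker", out string? modelId);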

Azure IoT Central Device Template Configuration

I used Azure IoT Central Device Template functionality to create my Azure Digital Twin definitions.

Azure IoT Hub Device Connection String

The DeviceClient CreateFromConnectionString method has an optional ClientOptions parameter which specifies the DTDL model ID for the duration of the connection.

private async Task<DeviceClient> AzureIoTHubDeviceConnectionStringConnectAsync(string terminalId, string application, object context)
{
    DeviceClient deviceClient;

    if (_azureIoTSettings.ApplicationToDtdlModelIdMapping.TryGetValue(application, out string? modelId))
    {
        ClientOptions clientOptions = new ClientOptions()
        {
            ModelId = modelId
        };

        deviceClient = DeviceClient.CreateFromConnectionString(_azureIoTSettings.AzureIoTHub.ConnectionString, terminalId, TransportSettings, clientOptions);
    }
    else
    { 
        deviceClient = DeviceClient.CreateFromConnectionString(_azureIoTSettings.AzureIoTHub.ConnectionString, terminalId, TransportSettings);
    }

    await deviceClient.OpenAsync();

    return deviceClient;
}
Azure IoT Explorer Telemetry message with DTDL Model ID

Azure IoT Hub Device Provisioning Service

The ProvisioningDeviceClient RegisterAsync method has an optional ProvisioningRegistrationAdditionalData parameter. The PnpConvention CreateDpsPayload method is used to generate the JsonData property, which specifies the DTDL model ID used when the device is initially provisioned.

private async Task<DeviceClient> AzureIoTHubDeviceProvisioningServiceConnectAsync(string terminalId, string application, object context)
{
    DeviceClient deviceClient;

    string deviceKey;
    using (var hmac = new HMACSHA256(Convert.FromBase64String(_azureIoTSettings.AzureIoTHub.DeviceProvisioningService.GroupEnrollmentKey)))
    {
        deviceKey = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(terminalId)));
    }

    using (var securityProvider = new SecurityProviderSymmetricKey(terminalId, deviceKey, null))
    {
        using (var transport = new ProvisioningTransportHandlerAmqp(TransportFallbackType.TcpOnly))
        {
            DeviceRegistrationResult result;

            ProvisioningDeviceClient provClient = ProvisioningDeviceClient.Create(
                _azureIoTSettings.AzureIoTHub.DeviceProvisioningService.GlobalDeviceEndpoint,
                _azureIoTSettings.AzureIoTHub.DeviceProvisioningService.IdScope,
                securityProvider,
            transport);

            if (_azureIoTSettings.ApplicationToDtdlModelIdMapping.TryGetValue(application, out string? modelId))
            {
                ProvisioningRegistrationAdditionalData provisioningRegistrationAdditionalData = new ProvisioningRegistrationAdditionalData()
                {
                    JsonData = PnpConvention.CreateDpsPayload(modelId)
                };
                result = await provClient.RegisterAsync(provisioningRegistrationAdditionalData);
            }
            else
            {
                result = await provClient.RegisterAsync();
            }
  
            if (result.Status != ProvisioningRegistrationStatusType.Assigned)
            {
                _logger.LogWarning("Uplink-DeviceID:{0} RegisterAsync status:{1} failed ", terminalId, result.Status);

                throw new ApplicationException($"Uplink-DeviceID:{terminalId} RegisterAsync status:{result.Status} failed");
            }

            IAuthenticationMethod authentication = new DeviceAuthenticationWithRegistrySymmetricKey(result.DeviceId, (securityProvider as SecurityProviderSymmetricKey).GetPrimaryKey());

            deviceClient = DeviceClient.Create(result.AssignedHub, authentication, TransportSettings);
        }
    }

    await deviceClient.OpenAsync();

    return deviceClient;
}
Azure IoT Central Device Connection Group configuration

Azure IoT Central Device connection groups can be configured to “automagically” provision devices.

Myriota Connector – Azure IoT Hub Connectivity

The Myriota connector supports the use of Azure IoT Hub Connection Strings and the Azure IoT Hub Device Provisioning Service(DPS) for device management. I use Alastair Crabtree’s LazyCache to store Azure IoT Hub connections which are opened the first time they are used.

 public async Task<DeviceClient> GetOrAddAsync(string terminalId, object context)
 {
     DeviceClient deviceClient;

     switch (_azureIoTSettings.AzureIoTHub.ConnectionType)
     {
         case Models.AzureIotHubConnectionType.DeviceConnectionString:
             deviceClient = await _azuredeviceClientCache.GetOrAddAsync(terminalId, (ICacheEntry x) => AzureIoTHubDeviceConnectionStringConnectAsync(terminalId, context));
             break;
         case Models.AzureIotHubConnectionType.DeviceProvisioningService:
             deviceClient = await _azuredeviceClientCache.GetOrAddAsync(terminalId, (ICacheEntry x) => AzureIoTHubDeviceProvisioningServiceConnectAsync(terminalId, context));
             break;
         default:
             _logger.LogError("Uplink- Azure IoT Hub ConnectionType unknown {0}", _azureIoTSettings.AzureIoTHub.ConnectionType);

             throw new NotImplementedException("AzureIoT Hub unsupported ConnectionType");
     }

     return deviceClient;
 }

The IAzureDeviceClientCache.GetOrAddAsync method returns an open Azure IoT Hub DeviceClient connection from the cache, or establishes one using the connection method specified in the application configuration.

Azure IoT Hub Device Connection String

The Azure IoT Hub delegate uses a Device Connection String which is retrieved from the application configuration.

{
  "ConnectionStrings": {
    "ApplicationInsights": "...",
    "UplinkQueueStorage": "...",
    "PayloadFormattersStorage": "..."
  },
  "AzureIoT": {
    "AzureIoTHub": {
      "ConnectionType": "DeviceConnectionString",
      "connectionString": "HostName=....azure-devices.net;SharedAccessKeyName=device;SharedAccessKey=..."
    }
  }
  ...
}
Azure Function with IoT Hub Device connection string configuration
private async Task<DeviceClient> AzureIoTHubDeviceConnectionStringConnectAsync(string terminalId, object context)
{
    DeviceClient deviceClient = DeviceClient.CreateFromConnectionString(_azureIoTSettings.AzureIoTHub.ConnectionString, terminalId, TransportSettings);

    await deviceClient.OpenAsync();

    return deviceClient;
 }
Azure IoT Hub Device Shared Access Policy for Device Connection String

One of my customers uses an Azure Logic Application to manage Myriota and Azure IoT Connector configuration.

Azure IoT Hub manual Device configuration

Azure IoT Hub Device Provisioning Service

The Azure IoT Hub Device Provisioning Service(DPS) delegate uses Symmetric Key Attestation with the Global Device Endpoint, ID Scope and Group Enrollment Key retrieved from the application configuration.

{
  "ConnectionStrings": {
    "ApplicationInsights": "...",
    "UplinkQueueStorage": "...",
    "PayloadFormattersStorage": "..."
  },
  "AzureIoT": {
    "ConnectionType": "DeviceProvisioningService",
    "DeviceProvisioningServiceIoTHub": {
      "GlobalDeviceEndpoint": "global.azure-devices-provisioning.net",
      "IDScope": ".....",
      "GroupEnrollmentKey": "...."
    }
  }
}
Azure IoT Function with Azure IoT Hub Device Provisioning Service(DPS) configuration

Symmetric key attestation with the Azure IoT Hub Device Provisioning Service(DPS) is performed using the same security tokens supported by Azure IoT Hubs to securely connect devices. The symmetric key of an enrollment group isn’t used directly by devices in the provisioning process. Instead, devices that provision through an enrollment group do so using a derived device key.

private async Task<DeviceClient> AzureIoTHubDeviceProvisioningServiceConnectAsync(string terminalId, object context)
{
    DeviceClient deviceClient;

    string deviceKey;
    using (var hmac = new HMACSHA256(Convert.FromBase64String(_azureIoTSettings.AzureIoTHub.DeviceProvisioningService.GroupEnrollmentKey)))
    {
        deviceKey = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(terminalId)));
    }

    using (var securityProvider = new SecurityProviderSymmetricKey(terminalId, deviceKey, null))
    {
        using (var transport = new ProvisioningTransportHandlerAmqp(TransportFallbackType.TcpOnly))
        {
            DeviceRegistrationResult result;

            ProvisioningDeviceClient provClient = ProvisioningDeviceClient.Create(
                _azureIoTSettings.AzureIoTHub.DeviceProvisioningService.GlobalDeviceEndpoint,
                _azureIoTSettings.AzureIoTHub.DeviceProvisioningService.IdScope,
                securityProvider,
                transport);

            result = await provClient.RegisterAsync();
  
            if (result.Status != ProvisioningRegistrationStatusType.Assigned)
            {
                _logger.LogWarning("Uplink-DeviceID:{0} RegisterAsync status:{1} failed ", terminalId, result.Status);

                throw new ApplicationException($"Uplink-DeviceID:{terminalId} RegisterAsync status:{result.Status} failed");
            }

            IAuthenticationMethod authentication = new DeviceAuthenticationWithRegistrySymmetricKey(result.DeviceId, (securityProvider as SecurityProviderSymmetricKey).GetPrimaryKey());

            deviceClient = DeviceClient.Create(result.AssignedHub, authentication, TransportSettings);
        }
    }

    await deviceClient.OpenAsync();

    return deviceClient;
}

The derived device key is a hash of the device’s registration ID and is computed using the symmetric key of the enrollment group. The device can then use its derived device key to sign the SAS token it uses to register with DPS.

Azure Device Provisioning Service Adding Enrollment Group Attestation
Azure Device Provisioning Service Add Enrollment Group IoT Hub(s) selection.
Azure Device Provisioning Service Manager Enrollments

For initial development and testing I ran the function application in the desktop emulator and simulated Myriota Device Manager webhook calls with Azure Storage Explorer and modified sample payloads.

Azure Storage Explorer Storage Account Queued Messages

I then used Azure IoT Explorer to configure devices, view uplink traffic etc.

Azure IoT Explorer Devices

When I connected to my Azure IoT Hub shortly after starting the Myriota Azure IoT Connector Function my test devices started connecting as messages arrived.

Azure IoT Explorer Device Telemetry

I then deployed my function to Azure and configured the Azure IoT Hub connection string, Azure Application Insights connection string etc.

Azure Portal Myriota Resource Group
Azure Portal Myriota IoT Hub Metrics

There was often a significant delay for the Device Status to update, which shouldn’t be a problem.

Swarm Space – Underlying Architecture sorted

After figuring out that calling an Azure HTTP Trigger function to load the cache wasn’t going to work reliably, I have revisited the architecture one last time and significantly refactored the SwarmSpaceAzureIoTConnector project.

Visual Studio 2022 solution

The application now has a StartUpService which loads the Azure DeviceClient cache (LazyCache) in the background as the application starts up. If an uplink message is received from a Swarm device before the StartUpService has loaded it, the DeviceClient information is cached by the uplink handler and another connection to the Azure IoT Hub is not established.

...
using Microsoft.Azure.Functions.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(devMobile.IoT.SwarmSpaceAzureIoTConnector.Connector.StartUpService))]
namespace devMobile.IoT.SwarmSpaceAzureIoTConnector.Connector
{
...
    public class StartUpService : BackgroundService
    {
        private readonly ILogger<StartUpService> _logger;
        private readonly ISwarmSpaceBumblebeeHive _swarmSpaceBumblebeeHive;
        private readonly Models.ApplicationSettings _applicationSettings;
        private readonly IAzureDeviceClientCache _azureDeviceClientCache;

        public StartUpService(ILogger<StartUpService> logger, IAzureDeviceClientCache azureDeviceClientCache, ISwarmSpaceBumblebeeHive swarmSpaceBumblebeeHive, IOptions<Models.ApplicationSettings> applicationSettings)//, IOptions<Models.AzureIoTSettings> azureIoTSettings)
        {
            _logger = logger;
            _azureDeviceClientCache = azureDeviceClientCache;
            _swarmSpaceBumblebeeHive = swarmSpaceBumblebeeHive;
            _applicationSettings = applicationSettings.Value;
        }

        protected override async Task ExecuteAsync(CancellationToken cancellationToken)
        {
            await Task.Yield();

            _logger.LogInformation("StartUpService.ExecuteAsync start");

            try
            {
                _logger.LogInformation("BumblebeeHiveCacheRefresh start");

                foreach (SwarmSpace.BumblebeeHiveClient.Device device in await _swarmSpaceBumblebeeHive.DeviceListAsync(cancellationToken))
                {
                    _logger.LogInformation("BumblebeeHiveCacheRefresh DeviceId:{DeviceId} DeviceName:{DeviceName}", device.DeviceId, device.DeviceName);

                    Models.AzureIoTDeviceClientContext context = new Models.AzureIoTDeviceClientContext()
                    {
                        OrganisationId = _applicationSettings.OrganisationId,
                        DeviceType = (byte)device.DeviceType,
                        DeviceId = (uint)device.DeviceId,
                    };

                    await _azureDeviceClientCache.GetOrAddAsync(context.DeviceId, context);
                }

                _logger.LogInformation("BumblebeeHiveCacheRefresh finish");
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, "StartUpService.ExecuteAsync error");

                throw;
            }

            _logger.LogInformation("StartUpService.ExecuteAsync finish");
        }
    }
}

The uplink and downlink payload formatters are stored in Azure Blob Storage, compiled (with CS-Script) as they are loaded, and then cached (LazyCache).

Azure Storage explorer displaying list of uplink payload formatter blobs.
Azure Storage explorer displaying list of downlink payload formatter blobs.
private async Task<IFormatterDownlink> DownlinkLoadAsync(int userApplicationId)
{
    BlobClient blobClient = new BlobClient(_payloadFormatterConnectionString, _applicationSettings.PayloadFormattersDownlinkContainer, $"{userApplicationId}.cs");

    if (!await blobClient.ExistsAsync())
    {
        _logger.LogInformation("PayloadFormatterDownlink- UserApplicationId:{0} Container:{1} not found using default:{2}", userApplicationId, _applicationSettings.PayloadFormattersUplinkContainer, _applicationSettings.PayloadFormatterUplinkBlobDefault);

        blobClient = new BlobClient(_payloadFormatterConnectionString, _applicationSettings.PayloadFormatterDownlinkBlobDefault, _applicationSettings.PayloadFormatterDownlinkBlobDefault);
    }

    BlobDownloadResult downloadResult = await blobClient.DownloadContentAsync();

    return CSScript.Evaluator.LoadCode<PayloadFormatter.IFormatterDownlink>(downloadResult.Content.ToString());
}

The uplink and downlink formatters can be edited in Visual Studio 2022 with syntax highlighting (currently they have to be manually uploaded).

The SwarmSpaceBumblebeeHive module no longer has public login or logout methods.

    public interface ISwarmSpaceBumblebeeHive
    {
        public Task<ICollection<Device>> DeviceListAsync(CancellationToken cancellationToken);

        public Task SendAsync(uint organisationId, uint deviceId, byte deviceType, ushort userApplicationId, byte[] payload);
    }

The DeviceListAsync and SendAsync methods now call the BumblebeeHive login method after a configurable period of inactivity.

public async Task<ICollection<Device>> DeviceListAsync(CancellationToken cancellationToken)
{
        if ((_TokenActivityAtUtC + _bumblebeeHiveSettings.TokenValidFor) < DateTime.UtcNow)
        {
            await Login();
        }

        using (HttpClient httpClient = _httpClientFactory.CreateClient())
        {
            Client client = new Client(httpClient);

            client.BaseUrl = _bumblebeeHiveSettings.BaseUrl;

            httpClient.DefaultRequestHeaders.Add("Authorization", $"bearer {_token}");

            return await client.GetDevicesAsync(null, null, null, null, null, null, null, null, null, cancellationToken);
        }
}
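
The private Login implementation isn’t shown in this post; a minimal sketch of the token refresh, assuming an NSwag-generated client with a PostLoginAsync method and LoginForm type (both names are assumptions), follows.

// Sketch only: private login which refreshes the cached bearer token and the
// _TokenActivityAtUtC timestamp checked by DeviceListAsync/SendAsync. The
// PostLoginAsync method and LoginForm type names are assumptions about the
// generated BumblebeeHive client.
private async Task Login()
{
    using (HttpClient httpClient = _httpClientFactory.CreateClient())
    {
        Client client = new Client(httpClient);

        client.BaseUrl = _bumblebeeHiveSettings.BaseUrl;

        _token = await client.PostLoginAsync(new LoginForm()
        {
            Username = _bumblebeeHiveSettings.UserName,
            Password = _bumblebeeHiveSettings.Password
        });

        _TokenActivityAtUtC = DateTime.UtcNow;
    }
}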

I’m looking at building a web user interface where users can interactively list, create, edit, and delete formatters with syntax highlighter support, and then execute a formatter with sample payloads.

Swarm Space Azure IoT Connector Identity Translation Gateway Architecture

This approach uses most of the existing building blocks, and that’s it: no more changes.

Swarm Space – DeviceClient Cache warming with HTTPTrigger

For C2D messaging to work a device must have a DeviceClient “connection” established to the Azure IoT Hub, which is a problem for irregularly connected devices. Sometimes establishing a connection on the first D2C message is sufficient, especially for devices which only support D2C messaging. An Identity Translation Gateway establishes a connection for each device (see discussion about AMQP Connection multiplexing) so that C2D messages can be sent immediately.
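
As a rough illustration of the multiplexing, the Azure IoT Device Client can pool AMQP connections so that many cached DeviceClient instances share a handful of TCP connections rather than holding one each; a minimal sketch, with the pool size an assumption, follows.

// Sketch only: enabling AMQP connection pooling (Microsoft.Azure.Devices.Client)
// so cached DeviceClient instances multiplex over shared AMQP connections.
// connectionString and deviceId are placeholders; MaxPoolSize is an assumption.
ITransportSettings[] transportSettings = new ITransportSettings[]
{
    new AmqpTransportSettings(TransportType.Amqp_Tcp_Only)
    {
        AmqpConnectionPoolSettings = new AmqpConnectionPoolSettings()
        {
            Pooling = true,
            MaxPoolSize = 10,
        }
    }
};

DeviceClient deviceClient = DeviceClient.CreateFromConnectionString(connectionString, deviceId, transportSettings);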

I initially tried building a cache loader with a BackgroundService so that the DeviceClient cache would start loading as the application started, but interdependencies became a problem.

public partial class Connector
{
    [Function("BumblebeeHiveCacheRefresh")]
    public async Task<IActionResult> BumblebeeHiveCacheRefreshRun([HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req, CancellationToken cancellationToken)
    {
        _logger.LogInformation("BumblebeeHiveCacheRefresh start");

        await _swarmSpaceBumblebeeHive.Login(cancellationToken);

        foreach (SwarmSpace.BumblebeeHiveClient.Device device in await _swarmSpaceBumblebeeHive.DeviceListAsync(cancellationToken))
        {
            _logger.LogInformation("BumblebeeHiveCacheRefresh DeviceId:{DeviceId} DeviceName:{DeviceName}", device.DeviceId, device.DeviceName);

            Models.AzureIoTDeviceClientContext context = new Models.AzureIoTDeviceClientContext()
            {
                // TODO seems a bit odd getting this from application settings
                OrganisationId = _applicationSettings.OrganisationId, 
                //UserApplicationId = device.UserApplicationId, deprecated
                DeviceType = (byte)device.DeviceType,
                DeviceId = (uint)device.DeviceId,
            };

            switch (_azureIoTSettings.ApplicationType)
            {
                case Models.ApplicationType.AzureIotHub:
                    switch (_azureIoTSettings.AzureIotHub.ConnectionType)
                    {
                        case Models.AzureIotHubConnectionType.DeviceConnectionString:
                            await _azureDeviceClientCache.GetOrAddAsync<DeviceClient>(device.DeviceId.ToString(), (ICacheEntry x) => AzureIoTHubDeviceConnectionStringConnectAsync(device.DeviceId.ToString(), context));
                            break;
                        case Models.AzureIotHubConnectionType.DeviceProvisioningService:
                            await _azureDeviceClientCache.GetOrAddAsync<DeviceClient>(device.DeviceId.ToString(), (ICacheEntry x) => AzureIoTHubDeviceProvisioningServiceConnectAsync(device.DeviceId.ToString(), context, _azureIoTSettings.AzureIotHub.DeviceProvisioningService));
                            break;
                        default:
                            _logger.LogError("Azure IoT Hub ConnectionType unknown {0}", _azureIoTSettings.AzureIotHub.ConnectionType);

                            throw new NotImplementedException("AzureIoT Hub unsupported ConnectionType");
                    }
                    break;

                case Models.ApplicationType.AzureIoTCentral:
                    await _azureDeviceClientCache.GetOrAddAsync<DeviceClient>(device.DeviceId.ToString(), (ICacheEntry x) => AzureIoTHubDeviceProvisioningServiceConnectAsync(device.DeviceId.ToString(), context, _azureIoTSettings.AzureIoTCentral.DeviceProvisioningService));
                    break;

                default:
                    _logger.LogError("AzureIoT application type unknown {0}", _azureIoTSettings.ApplicationType);

                    throw new NotImplementedException("AzureIoT unsupported ApplicationType");
            }
        }

        _logger.LogInformation("BumblebeeHiveCacheRefresh finish");

        return new OkResult();
    }
}

The WEBSITE_WARMUP_PATH environment variable is used to call the Azure HTTPTrigger Function (e.g. set to /api/BumblebeeHiveCacheRefresh), and the call is secured with an x-functions-key header.

Azure Function App Configuration

In the short term, loading the cache with a call to an Azure HTTPTrigger Function works but may have timeout issues. When I ran the connector with my hundreds-of-devices simulator the function timed out every so often.

Swarm Space – Uplink with Azure Functions or WebAPI

This post could have been much longer with more screen grabs and code snippets, so this is the “highlights package”. This post took a lot longer than I expected as building, testing locally, then deploying the different implementations was time consuming.

Swarm Space Connector Functions Projects

I built the projects to investigate the different options, taking into account reliability, robustness, amount of code, and performance (I think slow startup could be a problem). The code is very “plain”: I used the default options, no copyright notices, and default formatting, and the context-sensitive error messages were used to add any required “using” statements, libraries etc.

The desktop emulator hosting the six functions

I also deployed the Azure Functions and ASP .NET Core WebAPI application to check there were no differences (beyond performance) in the way they worked. I included a “default” function (generated by the new project wizard) for reference while I was building the others.

The function application with six functions deployed to Azure

The “dynamic” type function worked but broke when the JavaScript Object Notation (JSON) was invalid or fields were missing, and it didn’t enforce that the payload was correct.

namespace WebhookHttpTrigger
{
    public static class Dynamic
    {
        [FunctionName("Dynamic")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] dynamic input,
            ILogger log)
        {
            log.LogInformation($"C# HTTP Dynamic trigger function processed a request PacketId:{input.packetId}.");

            return new OkObjectResult("Hello, This HTTP triggered Dynamic function executed successfully.");
        }
    }
}
Dynamic Trigger function failing because PacketId format was invalid (x in numeric field)

The “TypedAutomagic” function worked: it ensured the JavaScript Object Notation (JSON) was valid and the payload format was correct, but it didn’t enforce the System.ComponentModel.DataAnnotations attributes.

namespace WebhookHttpTrigger
{
    public static class TypedAutomagic
    {
        [FunctionName("TypedAutomagic")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] UplinkPayload payload,
            ILogger log)
        {
            log.LogInformation($"C# HTTP trigger function typed TypedAutomagic UplinkPayload processed a request PacketId:{payload.PacketId}");

            return new OkObjectResult("Hello, This HTTP triggered automagic function executed successfully.");
        }
    }
}
Successful execution of TypedAutomagic function

The “TypedAutomagic” implementation also detected when the JSON property values in the payload couldn’t be deserialised successfully, but if the hiveRxTime was invalid the value was silently set to 1/1/0001 12:00:00 am (DateTime.MinValue).

TypedAutomagic hiveRxTime deserialisation failing

The “TypedDeserializeObject” function worked: it ensured the JavaScript Object Notation (JSON) was valid and the payload format was correct, but it also didn’t enforce the System.ComponentModel.DataAnnotations attributes.

namespace WebhookHttpTrigger
{
    public static class TypedDeserializeObject
    {
        [FunctionName("TypedDeserializeObject")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] string httpPayload,
            ILogger log)
        {
            UplinkPayload uplinkPayload;

            try
            {
                uplinkPayload = JsonConvert.DeserializeObject<UplinkPayload>(httpPayload);
            }
            catch (Exception ex)
            {
                log.LogWarning(ex, "JsonConvert.DeserializeObject failed");

                return new BadRequestObjectResult(ex.Message);
            }

            log.LogInformation($"C# HTTP trigger function typed DeserializeObject UplinkPayload processed a request PacketId:{uplinkPayload.PacketId}");

            return new OkObjectResult("Hello, This HTTP triggered DeserializeObject function executed successfully.");
        }
    }
}
TypedDeserializeObject function failing because PacketId format was invalid (x in numeric field)
TypedDeserializeObject function failing because deviceId value is negative but datatype is unsigned
Successful execution of TypedDeserializeObject function

The “TypedDeserializeObjectAnnotations” function worked: it ensured the JavaScript Object Notation (JSON) was valid, the payload format was correct, and it enforced the System.ComponentModel.DataAnnotations attributes.

namespace WebhookHttpTrigger
{
    public static class TypedDeserializeObjectAnnotations
    {
        [FunctionName("TypedDeserializeObjectAnnotations")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] string httpPayload,
            ILogger log)
        {
            UplinkPayload uplinkPayload;

            try
            {
                uplinkPayload = JsonConvert.DeserializeObject<UplinkPayload>(httpPayload);
            }
            catch (Exception ex)
            {
                log.LogWarning(ex, "JsonConvert.DeserializeObject failed");

                return new BadRequestObjectResult(ex.Message);
            }

            var context = new ValidationContext(uplinkPayload, serviceProvider: null, items: null);

            var results = new List<ValidationResult>();

            var isValid = Validator.TryValidateObject(uplinkPayload, context, results, true);

            if (!isValid)
            {
                log.LogWarning("Validator.TryValidateObject failed results:{results}", results);

                return new BadRequestObjectResult(results);
            }

            log.LogInformation($"C# HTTP trigger function typed DeserializeObject UplinkPayload processed a request PacketId:{uplinkPayload.PacketId}");

            return new OkObjectResult("Hello, This HTTP triggered DeserializeObject function executed successfully.");
        }
    }
}
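
For reference, a cut-down sketch of what the UplinkPayload class with System.ComponentModel.DataAnnotations attributes might look like; the property names are based on the fields discussed in this post, and the ranges and attributes are assumptions.

// Sketch only: a cut-down UplinkPayload for the Newtonsoft.Json based functions.
// Property names follow the fields mentioned in this post; the ranges/attributes
// are assumptions, and the production class has more fields.
using System;
using System.ComponentModel.DataAnnotations;
using Newtonsoft.Json;

public class UplinkPayload
{
    [Range(1, uint.MaxValue)]
    public uint PacketId { get; set; }

    [Range(1, uint.MaxValue)]
    public uint DeviceId { get; set; }  // unsigned, so negative deviceId values fail deserialisation

    public byte DeviceType { get; set; }

    [Range(1, ushort.MaxValue)]
    public ushort UserApplicationId { get; set; }

    [Required]
    [MinLength(1)]
    public string Data { get; set; }  // Base64 encoded payload

    [JsonProperty("hiveRxTime")]
    public DateTime HiveRxTimeUtc { get; set; }  // invalid/missing value deserialises to DateTime.MinValue
}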

I built an ASP .NET Core WebAPI version with two uplink method implementations, one which used dependency injection (DI) and the other that didn’t. I also added code to validate the deserialisation of HiveRxTimeUtc.

...
[HttpPost]
public async Task<IActionResult> Post([FromBody] UplinkPayload payload)
{
    if ( payload.HiveRxTimeUtc == DateTime.MinValue)
    {
        _logger.LogWarning("HiveRxTimeUtc validation failed");

        return this.BadRequest();
    }

    QueueClient queueClient = _queueServiceClient.GetQueueClient("uplink");

    await queueClient.SendMessageAsync(Convert.ToBase64String(JsonSerializer.SerializeToUtf8Bytes(payload)));

    return this.Ok();
}
...
[HttpPost]
public async Task<IActionResult> Post([FromBody] UplinkPayload payload)
{
    // Check that the post data is good
    if (!this.ModelState.IsValid)
    {
        _logger.LogWarning("QueuedController validation failed {0}", this.ModelState.ToString());

        return this.BadRequest(this.ModelState);
    }

    if ( payload.HiveRxTimeUtc == DateTime.MinValue)
    {
        _logger.LogWarning("HiveRxTimeUtc validation failed");

        return this.BadRequest();
    }

    try
    {
        QueueClient queueClient = new QueueClient(_configuration.GetConnectionString("AzureWebApi"), "uplink");

        //await queueClient.CreateIfNotExistsAsync();

        await queueClient.SendMessageAsync(Convert.ToBase64String(JsonSerializer.SerializeToUtf8Bytes(payload)));
    }
    catch (Exception ex)
    {
        _logger.LogError(ex,"Unable to open/create queue or send message", ex);

        return this.Problem("Unable to open queue (creating if it doesn't exist) or send message", statusCode: 500, title: "Uplink payload not sent");
    }

    return this.Ok();
}
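
For the DI implementation the QueueServiceClient has to be registered at startup; a minimal Program.cs sketch using the Microsoft.Extensions.Azure client factory, with the “AzureWebApi” connection string name taken from the non-DI version, follows.

// Sketch only: registering QueueServiceClient for constructor injection in
// Program.cs with Microsoft.Extensions.Azure. The "AzureWebApi" connection
// string name matches the non-DI implementation above.
builder.Services.AddAzureClients(azureClientBuilder =>
{
    azureClientBuilder.AddQueueServiceClient(builder.Configuration.GetConnectionString("AzureWebApi"));
});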

In Telerik Fiddler I could see that calls to the Azure Functions and the ASP .NET Core WebAPI were taking a similar time to execute (though I did see 5+ seconds), and the ASP .NET Core WebAPI appeared to take much longer to start up (I did see 100+ seconds when I made four requests as the ASP .NET Core WebAPI was starting).

I’m going to use the ASP .NET Core WebAPI with dependency injection (DI) approach just because “it’s always better with DI”.

I noticed some other “oddness” while implementing and then testing the Azure Http Trigger functions and ASP .NET Core WebAPI, which I will cover in some future posts.