Random wanderings through Microsoft Azure esp. PaaS plumbing, the IoT bits, AI on Micro controllers, AI on Edge Devices, .NET nanoFramework, .NET Core on *nix and ML.NET+ONNX
The Myriota Azure IoT Hub Cloud Identity Translation Gateway uplink message handler, an Azure Storage Queue Trigger Function, wasn’t handling “transient” vs. “permanent” failures well. Sometimes a “permanent” failure message would be retried multiple times by the function runtime before getting moved to the poison queue.
After some experimentation, using an Azure Storage Queue output binding to move messages to the poison queue looked like a reasonable approach. (Though returning null to indicate that the message should be removed from the queue was not obvious from the documentation.)
[Function("UplinkMessageProcessor")]
[QueueOutput(queueName: "uplink-poison", Connection = "UplinkQueueStorage")]
public async Task<Models.UplinkPayloadQueueDto> UplinkMessageProcessor([QueueTrigger(queueName: "uplink", Connection = "UplinkQueueStorage")] Models.UplinkPayloadQueueDto payload, CancellationToken cancellationToken)
{
...
// Process each packet in the payload. Myriota docs say only one packet per payload but just in case...
foreach (Models.QueuePacket packet in payload.Data.Packets)
{
// Lookup the device client in the cache or create a new one
Models.DeviceConnectionContext context;
try
{
context = await _deviceConnectionCache.GetOrAddAsync(packet.TerminalId, cancellationToken);
}
catch (DeviceNotFoundException dnfex)
{
_logger.LogError(dnfex, "Uplink- PayloadId:{0} TerminalId:{1} terminal not found", payload.Id, packet.TerminalId);
return payload;
}
catch (Exception ex) // Maybe just send to the poison queue, or figure out if it is a transient error?
{
_logger.LogError(ex, "Uplink- PayloadId:{0} TerminalId:{1} ", payload.Id, packet.TerminalId);
throw;
}
...
}
// Processing successful, message can be deleted by QueueTrigger plumbing
return null;
}
After building and testing an Azure Storage Queue output binding implementation I’m not certain that it is a good approach. The code is a bit “chunky” and I have had to implement more of the retry processing logic myself.
The Myriota Azure IoT Hub Cloud Identity Translation Gateway downlink message handler was getting a bit “chunky”. So, I started by stripping the code back to the absolute bare minimum that would “work”.
The code was then extended so it worked for “sunny day” scenarios: the payload formatter was successfully retrieved from the configured Azure Storage Blob, CS-Script successfully compiled the payload formatter, the message payload was valid text, the message text was valid JavaScript Object Notation (JSON), the JSON was successfully processed by the compiled payload formatter, and finally the payload was accepted by the Myriota Cloud API.
Finally, the code was modified to gracefully handle broken payloads returned by the payload formatter evaluation, some comments were added, and the unmanaged resources of the DeviceClient.Message were disposed.
public async Task AzureIoTHubMessageHandler(Message message, object userContext)
{
Models.DeviceConnectionContext context = (Models.DeviceConnectionContext)userContext;
_logger.LogInformation("Downlink- IoT Hub TerminalId:{terminalId} LockToken:{LockToken}", context.TerminalId, message.LockToken);
// Use default formatter and replace with message specific formatter if configured.
string payloadFormatter;
if (!message.Properties.TryGetValue(Constants.IoTHubDownlinkPayloadFormatterProperty, out payloadFormatter) || string.IsNullOrEmpty(payloadFormatter))
{
payloadFormatter = context.PayloadFormatterDownlink;
}
_logger.LogInformation("Downlink- IoT Hub TerminalID:{terminalId} LockToken:{LockToken} Payload formatter:{payloadFormatter}", context.TerminalId, message.LockToken, payloadFormatter);
try
{
// If this fails the payload is broken
byte[] messageBytes = message.GetBytes();
// This will fail for some messages, payload formatter gets bytes only
string messageText = string.Empty;
try
{
messageText = Encoding.UTF8.GetString(messageBytes);
}
catch (ArgumentException aex)
{
_logger.LogInformation("Downlink- IoT Hub TerminalID:{TerminalId} LockToken:{LockToken} messageBytes:{messageBytes} not valid text", context.TerminalId, message.LockToken, BitConverter.ToString(messageBytes));
}
// This will fail for some messages, payload formatter gets bytes only
JObject? messageJson = null;
try
{
messageJson = JObject.Parse(messageText);
}
catch ( JsonReaderException jex)
{
_logger.LogInformation("Downlink- IoT Hub TerminalID:{TerminalId} LockToken:{LockToken} messageText:{messageText} not valid JSON", context.TerminalId, message.LockToken, messageText);
}
// This shouldn't fail, but it could for lots of different reasons: invalid path to blob, syntax error, interface broken etc.
IFormatterDownlink payloadFormatterDownlink = await _payloadFormatterCache.DownlinkGetAsync(payloadFormatter);
// This shouldn't fail, but it could for lots of different reasons, null references, divide by zero, out of range etc.
byte[] payloadBytes = payloadFormatterDownlink.Evaluate(message.Properties, context.TerminalId, messageJson, messageBytes);
// Validate payload before calling Myriota control message send API method
if (payloadBytes is null)
{
_logger.LogWarning("Downlink- IoT Hub TerminalID:{terminalId} LockToken:{LockToken} payload formatter:{payloadFormatter} Evaluate returned null", context.TerminalId, message.LockToken, payloadFormatter);
await context.DeviceClient.RejectAsync(message);
return;
}
if ((payloadBytes.Length < Constants.DownlinkPayloadMinimumLength) || (payloadBytes.Length > Constants.DownlinkPayloadMaximumLength))
{
_logger.LogWarning("Downlink- IoT Hub TerminalID:{terminalId} LockToken:{LockToken} payloadData length:{Length} invalid must be {DownlinkPayloadMinimumLength} to {DownlinkPayloadMaximumLength} bytes", context.TerminalId, message.LockToken, payloadBytes.Length, Constants.DownlinkPayloadMinimumLength, Constants.DownlinkPayloadMaximumLength);
await context.DeviceClient.RejectAsync(message);
return;
}
// This shouldn't fail, but it could for a few reasons, mainly connectivity & message queuing etc.
_logger.LogInformation("Downlink- IoT Hub TerminalID:{TerminalId} LockToken:{LockToken} PayloadData:{payloadData} Length:{Length} sending", context.TerminalId, message.LockToken, Convert.ToHexString(payloadBytes), payloadBytes.Length);
// Finally send the message using Myriota API
string messageId = await _myriotaModuleAPI.SendAsync(context.TerminalId, payloadBytes);
_logger.LogInformation("Downlink- IoT Hub TerminalID:{TerminalId} LockToken:{LockToken} MessageID:{messageId} sent", context.TerminalId, message.LockToken, messageId);
await context.DeviceClient.CompleteAsync(message);
_logger.LogInformation("Downlink- IoT Hub TerminalID:{terminalId} LockToken:{LockToken} MessageID:{messageId} completed", context.TerminalId, message.LockToken, messageId);
}
catch (Exception ex)
{
await context.DeviceClient.RejectAsync(message);
_logger.LogError(ex, "Downlink- IoT Hub TerminalID:{terminalId} LockToken:{LockToken} failed", context.TerminalId, message.LockToken);
}
finally
{
// Mop up the unmanaged resources of message
message.Dispose();
}
}
As the code was being extended, I tested different failures to make sure the Application Insights logging messages were useful. The first failure mode tested was the Azure Storage Blob path being broken or the blob missing.
Visual Studio 2022 Debugger blob not found exception message
Application Insights blob not found exception logging
Then a series of “broken” payload formatters were created to test CS-Script compile time failures.
// Broken interface implementation
using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public class FormatterDownlink : PayloadFormatter.IFormatterDownlink
{
public byte[] Evaluate(IDictionary<string, string> properties, string terminalId, byte[] payloadBytes)
{
return payloadBytes;
}
}
Visual Studio 2022 Debugger interface implementation broken exception message
// Broken syntax
using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public class FormatterDownlink : PayloadFormatter.IFormatterDownlink
{
public byte[] Evaluate(IDictionary<string, string> properties, string terminalId, JObject payloadJson, byte[] payloadBytes)
{
return payloadBytes
}
}
Visual Studio 2022 Debugger syntax error exception message
The final test was sending a downlink message which was valid JSON, contained the correct information for the specified payload formatter and was successfully processed by the Myriota Cloud API.
Azure IoT Explorer with valid JSON payload and payload formatter name
Azure function output of successful downlink message
The Myriota Azure IoT Hub Cloud Identity Translation Gateway payload formatters use compiled C# code to convert uplink/downlink packet payloads to a JSON object/byte array. While trying out different formatters I had “compile” and “evaluation” errors which would have been a lot easier to debug if there had been more diagnostic information in the Azure Application Insights logging.
namespace PayloadFormatter // Additional namespace to shorten interface usage in formatter code
{
using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;
public interface IFormatterUplink
{
public JObject Evaluate(IDictionary<string, string> properties, string terminalId, DateTime timestamp, byte[] payloadBytes);
}
public interface IFormatterDownlink
{
public byte[] Evaluate(IDictionary<string, string> properties, string terminalId, JObject? payloadJson, byte[] payloadBytes);
}
}
// Process the payload with configured formatter
Dictionary<string, string> properties = new Dictionary<string, string>();
JObject telemetryEvent;
try
{
telemetryEvent = formatterUplink.Evaluate(properties, packet.TerminalId, packet.Timestamp, payloadBytes);
}
catch (Exception ex)
{
_logger.LogError(ex, "Uplink- PayloadId:{0} TerminalId:{1} Value:{2} Bytes:{3} payload formatter evaluate failed", payload.Id, packet.TerminalId, packet.Value, Convert.ToHexString(payloadBytes));
return payload;
}
if (telemetryEvent is null)
{
_logger.LogError("Uplink- PayloadId:{0} TerminalId:{1} Value:{2} Bytes:{3} payload formatter evaluate failed returned null", payload.Id, packet.TerminalId, packet.Value, Convert.ToHexString(payloadBytes));
return payload;
}
The Evaluate method can throw many different types of exception, so in the initial version only the “generic” exception is caught and logged.
using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public class FormatterUplink : PayloadFormatter.IFormatterUplink
{
public JObject Evaluate(IDictionary<string, string> properties, string terminalId, DateTime timestamp, byte[] payloadBytes)
{
JObject telemetryEvent = new JObject();
telemetryEvent.Add("Bytes", BitConverter.ToString(payloadBytes));
// Deliberately broken: adding a property with a duplicate name throws an ArgumentException at runtime
telemetryEvent.Add("Bytes", BitConverter.ToString(payloadBytes));
return telemetryEvent;
}
}
There are a number of test uplink/downlink payload formatters (which should grow over time) for testing different compile and execution failures.
Azure IoT Storage Explorer container with sample formatter blobs.
When writing payload formatters, the Visual Studio 2022 syntax highlighting is really useful for spotting syntax errors, and with the “Downlink Payload Formatter Test Harness” application payload formatters can be executed and debugged before being deployed with Azure Storage Explorer.
namespace PayloadFormatter
{
using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;
public interface IFormatterUplink
{
public JObject Evaluate(IDictionary<string, string> properties, string application, string terminalId, DateTime timestamp, JObject payloadJson, string payloadText, byte[] payloadBytes);
}
...
}
The Myriota uplink packet payload is only 20 bytes long, so it is very unlikely that the payloadText and payloadJson parameters would ever be populated, and I removed them from the interface. The uplink message handler interface has been updated, and the code that converted (where possible) the payload bytes to text and then to JSON has been deleted.
namespace PayloadFormatter
{
using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;
public interface IFormatterUplink
{
public JObject Evaluate(IDictionary<string, string> properties, string application, string terminalId, DateTime timestamp, byte[] payloadBytes);
}
...
}
All of the sample payload formatters have been updated to reflect the updated parameters. The sample Tracker.cs payload formatter unpacks a message from a Myriota Dev Kit running the Tracker sample and returns an Azure IoT Central compatible location telemetry payload.
/*
myriota tracker payload format
typedef struct {
uint16_t sequence_number;
int32_t latitude; // scaled by 1e7, e.g. -891234567 (south 89.1234567)
int32_t longitude; // scaled by 1e7, e.g. 1791234567 (east 179.1234567)
uint32_t time; // epoch timestamp of last fix
} __attribute__((packed)) tracker_message;
*/
using System;
using System.Collections.Generic;
using System.Globalization;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public class FormatterUplink : PayloadFormatter.IFormatterUplink
{
public JObject Evaluate(IDictionary<string, string> properties, string application, string terminalId, DateTime timestamp, byte[] payloadBytes)
{
JObject telemetryEvent = new JObject();
if (payloadBytes is null)
{
return telemetryEvent;
}
telemetryEvent.Add("SequenceNumber", BitConverter.ToUInt16(payloadBytes));
JObject location = new JObject();
double latitude = BitConverter.ToInt32(payloadBytes, 2) / 10000000.0;
location.Add("lat", latitude);
double longitude = BitConverter.ToInt32(payloadBytes, 6) / 10000000.0;
location.Add("lon", longitude);
location.Add("alt", 0);
telemetryEvent.Add("DeviceLocation", location);
UInt32 packetTimestamp = BitConverter.ToUInt32(payloadBytes, 10);
DateTime fixAtUtc = DateTime.UnixEpoch.AddSeconds(packetTimestamp);
telemetryEvent.Add("FixAtUtc", fixAtUtc);
properties.Add("iothub-creation-time-utc", fixAtUtc.ToString("s", CultureInfo.InvariantCulture));
return telemetryEvent;
}
}
If a message payload is text or JSON it can still be converted in the payload formatter.
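As a minimal sketch of what that conversion inside a formatter could look like (class and method names here are mine, and System.Text.Json is used for a self-contained example, whereas the gateway formatters themselves use Newtonsoft.Json's JObject.Parse, which behaves similarly):

```csharp
using System;
using System.Text;
using System.Text.Json;

public static class PayloadDecoder
{
    // Attempt to interpret the raw payload bytes as UTF-8 text, then as JSON.
    // Either step can fail, so both results are nullable.
    public static (string? Text, JsonDocument? Json) TryDecode(byte[] payloadBytes)
    {
        string? text = null;
        JsonDocument? json = null;

        try
        {
            // Strict UTF-8 decoder that throws on invalid byte sequences,
            // rather than silently substituting replacement characters
            text = new UTF8Encoding(false, throwOnInvalidBytes: true).GetString(payloadBytes);
        }
        catch (DecoderFallbackException)
        {
            // Not valid text, the formatter falls back to processing the raw bytes
        }

        if (text is not null)
        {
            try
            {
                json = JsonDocument.Parse(text);
            }
            catch (JsonException)
            {
                // Valid text but not valid JSON
            }
        }

        return (text, json);
    }
}
```

This keeps the "is it text, is it JSON" decision in the one formatter that cares, rather than doing the work for every message in the handler.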
The Digital Twin Definition Language (DTDL) configuration used when a device is provisioned or when it connects is determined by the payload application, which is based on the Myriota Destination endpoint.
The Azure Function Configuration of Application to DTDL Model ID
BEWARE – The application in ApplicationToDtdlModelIdMapping is case sensitive!
Azure IoT Central Device Template Configuration
I used Azure IoT Central Device Template functionality to create my Azure Digital Twin definitions.
Azure IoT Function with Azure IoT Hub Device Provisioning Service(DPS) configuration
Symmetric key attestation with the Azure IoT Hub Device Provisioning Service(DPS) is performed using the same security tokens supported by Azure IoT Hubs to securely connect devices. The symmetric key of an enrollment group isn’t used directly by devices in the provisioning process. Instead, devices that provision through an enrollment group do so using a derived device key.
private async Task<DeviceClient> AzureIoTHubDeviceProvisioningServiceConnectAsync(string terminalId, object context)
{
DeviceClient deviceClient;
string deviceKey;
using (var hmac = new HMACSHA256(Convert.FromBase64String(_azureIoTSettings.AzureIoTHub.DeviceProvisioningService.GroupEnrollmentKey)))
{
deviceKey = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(terminalId)));
}
using (var securityProvider = new SecurityProviderSymmetricKey(terminalId, deviceKey, null))
{
using (var transport = new ProvisioningTransportHandlerAmqp(TransportFallbackType.TcpOnly))
{
DeviceRegistrationResult result;
ProvisioningDeviceClient provClient = ProvisioningDeviceClient.Create(
_azureIoTSettings.AzureIoTHub.DeviceProvisioningService.GlobalDeviceEndpoint,
_azureIoTSettings.AzureIoTHub.DeviceProvisioningService.IdScope,
securityProvider,
transport);
result = await provClient.RegisterAsync();
if (result.Status != ProvisioningRegistrationStatusType.Assigned)
{
_logger.LogWarning("Uplink-DeviceID:{0} RegisterAsync status:{1} failed ", terminalId, result.Status);
throw new ApplicationException($"Uplink-DeviceID:{terminalId} RegisterAsync status:{result.Status} failed");
}
IAuthenticationMethod authentication = new DeviceAuthenticationWithRegistrySymmetricKey(result.DeviceId, (securityProvider as SecurityProviderSymmetricKey).GetPrimaryKey());
deviceClient = DeviceClient.Create(result.AssignedHub, authentication, TransportSettings);
}
}
await deviceClient.OpenAsync();
return deviceClient;
}
The derived device key is a hash of the device’s registration ID and is computed using the symmetric key of the enrollment group. The device can then use its derived device key to sign the SAS token it uses to register with DPS.
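The derivation itself is just an HMAC-SHA256 over the registration ID, keyed with the group enrollment key, as in the connector code above. Pulled out as a standalone sketch (class and method names are mine):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class DpsKeyDerivation
{
    // Derived device key: HMAC-SHA256 of the registration ID (the Myriota
    // terminal ID in the gateway), keyed with the Base64-encoded group
    // enrollment key, with the result Base64 encoded
    public static string DeriveDeviceKey(string groupEnrollmentKeyBase64, string registrationId)
    {
        using var hmac = new HMACSHA256(Convert.FromBase64String(groupEnrollmentKeyBase64));

        return Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(registrationId)));
    }
}
```

Because the derivation is deterministic, the gateway can compute each terminal's key on demand rather than storing a key per device.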
Azure Device Provisioning Service Adding Enrollment Group Attestation
Azure Device Provisioning Service Add Enrollment Group IoT Hub(s) selection.
Azure Device Provisioning Service Manager Enrollments
For initial development and testing I ran the function application in the desktop emulator and simulated Myriota Device Manager webhook calls with Azure Storage Explorer and modified sample payloads.
Azure Storage Explorer Storage Account Queued Messages
I then used Azure IoT Explorer to configure devices, view uplink traffic etc.
Azure IoT Explorer Devices
When I connected to my Azure IoT Hub shortly after starting the Myriota Azure IoT Connector Function, my test devices started connecting as messages arrived.
public static void Main(string[] args)
{
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
builder.Services.AddApplicationInsightsTelemetry();
builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));
builder.Services.AddControllers();
builder.Services.AddSingleton<IConnectionMultiplexer>(s => ConnectionMultiplexer.Connect(builder.Configuration.GetConnectionString("Redis")));
var app = builder.Build();
// Configure the HTTP request pipeline.
app.UseHttpsRedirection();
app.MapControllers();
app.Run();
}
I trialed the initial versions of my Redis project with Memurai on my development machine, then configured an Azure Cache for Redis instance. I then load tested the project with several Azure App Service clients and there was a significant improvement in response time.
[ApiController]
[Route("api/[controller]")]
public class StockItemsController : ControllerBase
{
private const int StockItemSearchMaximumRowsToReturn = 15;
private readonly TimeSpan StockItemListExpiration = new TimeSpan(0, 5, 0);
private const string sqlCommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]";
//private const string sqlCommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]; WAITFOR DELAY '00:00:02'";
private readonly ILogger<StockItemsController> logger;
private readonly IDbConnection dbConnection;
private readonly IDatabase redisCache;
public StockItemsController(ILogger<StockItemsController> logger, IDapperContext dapperContext, IConnectionMultiplexer connectionMultiplexer)
{
this.logger = logger;
this.dbConnection = dapperContext.ConnectionCreate();
this.redisCache = connectionMultiplexer.GetDatabase();
}
[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
var cached = await redisCache.StringGetAsync("StockItems");
if (cached.HasValue)
{
return Content(cached, "application/json");
}
var stockItems = await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
#if SERIALISER_SOURCE_GENERATION
string json = JsonSerializer.Serialize(stockItems, typeof(List<Model.StockItemListDtoV1>), Model.StockItemListDtoV1GenerationContext.Default);
#else
string json = JsonSerializer.Serialize(stockItems);
#endif
await redisCache.StringSetAsync("StockItems", json, expiry: StockItemListExpiration);
return Content(json, "application/json");
}
...
[HttpDelete()]
public async Task<ActionResult> ListCacheDelete()
{
await redisCache.KeyDeleteAsync("StockItems");
logger.LogInformation("StockItems list removed");
return this.Ok();
}
}
public class StockItemListDtoV1
{
public int Id { get; set; }
public string Name { get; set; }
public decimal RecommendedRetailPrice { get; set; }
public decimal TaxRate { get; set; }
}
[JsonSourceGenerationOptions(PropertyNamingPolicy = JsonKnownNamingPolicy.CamelCase)]
[JsonSerializable(typeof(List<StockItemListDtoV1>))]
public partial class StockItemListDtoV1GenerationContext : JsonSerializerContext
{
}
The cost of constructing the Serialiser may be higher, but the cost of performing serialisation with it is much smaller.
I used Telerik Fiddler to empty the cache, then loaded the StockItems list 10 times (more tests would improve the quality of the results). The first trial was with the “conventional” serialiser.
The average time for the conventional serialiser was 0.028562 seconds.
The average time for the generated version was 0.030546 seconds. But, if the initial compilation step was ignored, the average duration dropped to 0.000223 seconds, a significant improvement.
I have found that putting the C/C++ structure for the uplink payload at the top of the converter is really helpful.
/*
myriota tracker payload format
typedef struct {
uint16_t sequence_number;
int32_t latitude; // scaled by 1e7, e.g. -891234567 (south 89.1234567)
int32_t longitude; // scaled by 1e7, e.g. 1791234567 (east 179.1234567)
uint32_t time; // epoch timestamp of last fix
} __attribute__((packed)) tracker_message;
*/
using System;
using System.Collections.Generic;
using System.Globalization;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public class FormatterUplink : PayloadFormatter.IFormatterUplink
{
public JObject Evaluate(IDictionary<string, string> properties, string application, string terminalId, DateTime timestamp, JObject payloadJson, string payloadText, byte[] payloadBytes)
{
JObject telemetryEvent = new JObject();
telemetryEvent.Add("SequenceNumber", BitConverter.ToUInt16(payloadBytes));
double latitude = BitConverter.ToInt32(payloadBytes, 2) / 10000000.0;
telemetryEvent.Add("Latitude", latitude);
double longitude = BitConverter.ToInt32(payloadBytes, 6) / 10000000.0;
telemetryEvent.Add("Longitude", longitude);
UInt32 packetTimestamp = BitConverter.ToUInt32(payloadBytes, 10);
DateTime lastFix = DateTime.UnixEpoch.AddSeconds(packetTimestamp);
properties.Add("iothub-creation-time-utc", lastFix.ToString("s", CultureInfo.InvariantCulture));
return telemetryEvent;
}
}
The sample Tracker.cs payload formatter unpacks a message from Myriota Dev Kit running the Tracker sample and returns an Azure IoT Central compatible location telemetry payload.
BEWARE: I think the Azure IoT Central Position lat, lon + alt values might be case sensitive.
Azure IoT Explorer displaying Tracker.cs payload formatter output
Myriota Destination configuration application name URL configuration
namespace devMobile.IoT.MyriotaAzureIoTConnector.Connector.Models
{
using System;
using System.Collections.Generic;
public class UplinkPayloadQueueDto
{
public string Application { get; set; }
public string EndpointRef { get; set; }
public DateTime PayloadReceivedAtUtc { get; set; }
public DateTime PayloadArrivedAtUtc { get; set; }
public QueueData Data { get; set; }
public string Id { get; set; }
public Uri CertificateUrl { get; set; }
public string Signature { get; set; }
}
public class QueueData
{
public List<QueuePacket> Packets { get; set; }
}
public class QueuePacket
{
public string TerminalId { get; set; }
public DateTime Timestamp { get; set; }
public string Value { get; set; }
}
}
A pair of Azure Blob Storage containers are used to store the uplink/downlink (coming soon) formatter files. The compiled payload formatters are cached with Uplink/Downlink + Application (from the UplinkPayloadQueueDto) as the key.
Azure IoT Storage Explorer uplink payload formatters
The default uplink and downlink formatters used when there is no payload formatter for “Application” are configured in the application settings.
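For illustration only, a hypothetical appsettings.json fragment showing the shape such configuration might take (the section and key names here are invented for this sketch, not the connector's actual setting names):

```json
{
  "PayloadFormatters": {
    "UplinkDefault": "Uplink.cs",
    "DownlinkDefault": "Downlink.cs"
  }
}
```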
While exploring some of the functionality of MiniProfiler there were some 3rd party examples which caught my attention.
using (SqlConnection connection = new SqlConnection(@"Data Source=...; Initial Catalog=SyncDB; Trusted_Connection=Yes"))
{
using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection(connection, MiniProfiler.Current))
{
if (profiledDbConnection.State != System.Data.ConnectionState.Open)
profiledDbConnection.Open();
using (SqlCommand command = new SqlCommand("Select * From Authors", connection))
{
using (ProfiledDbCommand profiledDbCommand = new ProfiledDbCommand(command, connection, MiniProfiler.Current))
{
var data = profiledDbCommand.ExecuteReader();
//Write code here to populate the list of Authors
}
}
}
}
“Inspired” by code like this, my first attempt to retrieve a list of stock items didn’t look right.
[HttpGet("AdoProfiledOtt")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetAdoProfiledOtt()
{
List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();
using (SqlConnection connection = new SqlConnection(configuration.GetConnectionString("default")))
{
using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection(connection, MiniProfiler.Current))
{
await profiledDbConnection.OpenAsync();
using (SqlCommand command = new SqlCommand(sqlCommandText, connection))
{
using (ProfiledDbCommand profiledDbCommand = new ProfiledDbCommand(command, profiledDbConnection, MiniProfiler.Current))
{
using (SqlDataReader reader = await command.ExecuteReaderAsync())
{
using (ProfiledDbDataReader profiledDbDataReader = new ProfiledDbDataReader(reader, MiniProfiler.Current))
{
var rowParser = profiledDbDataReader.GetRowParser<Model.StockItemListDtoV1>();
while (await profiledDbDataReader.ReadAsync())
{
response.Add(rowParser(profiledDbDataReader));
}
}
}
}
}
await profiledDbConnection.CloseAsync();
}
}
return this.Ok(response);
}
/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class (with <see cref="CommandBehavior.Default"/>).
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, IDbProfiler profiler) : this(reader, CommandBehavior.Default, profiler) { }
/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class.
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="behavior">The behavior specified during command execution.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, CommandBehavior behavior, IDbProfiler? profiler)
{
WrappedReader = reader;
Behavior = behavior;
_profiler = profiler;
}
...
/// <summary>
/// The <see cref="DbDataReader"/> that is being used.
/// </summary>
public DbDataReader WrappedReader { get; }
/// <inheritdoc cref="DbDataReader.Dispose(bool)"/>
protected override void Dispose(bool disposing)
{
// reader can be null when we're not profiling, but we've inherited from ProfiledDbCommand and are returning a
// an unwrapped reader from the base command
WrappedReader?.Dispose();
base.Dispose(disposing);
}
Another “using” is not required as ProfiledDbDataReader “automagically” disposes the SqlDataReader. This was my final version of profiling the System.Data.SqlClient code to retrieve a list of stock items.
[HttpGet("AdoProfiled")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetProfiledAdo()
{
List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();
using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection((SqlConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
{
await profiledDbConnection.OpenAsync();
using (ProfiledDbCommand profiledDbCommand = new ProfiledDbCommand(new SqlCommand(sqlCommandText), profiledDbConnection, MiniProfiler.Current))
{
DbDataReader reader = await profiledDbCommand.ExecuteReaderAsync();
using (ProfiledDbDataReader profiledDbDataReader = new ProfiledDbDataReader(reader, MiniProfiler.Current))
{
var rowParser = profiledDbDataReader.GetRowParser<Model.StockItemListDtoV1>();
while (await profiledDbDataReader.ReadAsync())
{
response.Add(rowParser(profiledDbDataReader));
}
}
}
}
return this.Ok(response);
}
The ProfiledDbDataReader.cs implementation was “sparse”, and when loading a longer list of stock items there were some ReadAsync calls which took a bit longer.
/// <summary>
/// The profiled database data reader.
/// </summary>
public class ProfiledDbDataReader : DbDataReader
{
private readonly IDbProfiler? _profiler;
/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class (with <see cref="CommandBehavior.Default"/>).
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, IDbProfiler profiler) : this(reader, CommandBehavior.Default, profiler) { }
/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class.
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="behavior">The behavior specified during command execution.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, CommandBehavior behavior, IDbProfiler? profiler)
{
WrappedReader = reader;
Behavior = behavior;
_profiler = profiler;
}
/// <summary>Gets the behavior specified during command execution.</summary>
public CommandBehavior Behavior { get; }
/// <inheritdoc cref="DbDataReader.Depth"/>
public override int Depth => WrappedReader.Depth;
/// <inheritdoc cref="DbDataReader.FieldCount"/>
public override int FieldCount => WrappedReader.FieldCount;
/// <inheritdoc cref="DbDataReader.HasRows"/>
public override bool HasRows => WrappedReader.HasRows;
/// <inheritdoc cref="DbDataReader.IsClosed"/>
public override bool IsClosed => WrappedReader.IsClosed;
/// <inheritdoc cref="DbDataReader.RecordsAffected"/>
public override int RecordsAffected => WrappedReader.RecordsAffected;
/// <summary>
/// The <see cref="DbDataReader"/> that is being used.
/// </summary>
public DbDataReader WrappedReader { get; }
/// <inheritdoc cref="DbDataReader.this[string]"/>
public override object this[string name] => WrappedReader[name];
/// <inheritdoc cref="DbDataReader.this[int]"/>
public override object this[int ordinal] => WrappedReader[ordinal];
...
/// <inheritdoc cref="DbDataReader.GetString(int)"/>
public override string GetString(int ordinal) => WrappedReader.GetString(ordinal);
/// <inheritdoc cref="DbDataReader.GetValue(int)"/>
public override object GetValue(int ordinal) => WrappedReader.GetValue(ordinal);
/// <inheritdoc cref="DbDataReader.GetValues(object[])"/>
public override int GetValues(object[] values) => WrappedReader.GetValues(values);
/// <inheritdoc cref="DbDataReader.IsDBNull(int)"/>
public override bool IsDBNull(int ordinal) => WrappedReader.IsDBNull(ordinal);
/// <inheritdoc cref="DbDataReader.IsDBNullAsync(int, CancellationToken)"/>
public override Task<bool> IsDBNullAsync(int ordinal, CancellationToken cancellationToken) => WrappedReader.IsDBNullAsync(ordinal, cancellationToken);
/// <inheritdoc cref="DbDataReader.NextResult()"/>
public override bool NextResult() => WrappedReader.NextResult();
/// <inheritdoc cref="DbDataReader.NextResultAsync(CancellationToken)"/>
public override Task<bool> NextResultAsync(CancellationToken cancellationToken) => WrappedReader.NextResultAsync(cancellationToken);
/// <inheritdoc cref="DbDataReader.Read()"/>
public override bool Read() => WrappedReader.Read();
/// <inheritdoc cref="DbDataReader.ReadAsync(CancellationToken)"/>
public override Task<bool> ReadAsync(CancellationToken cancellationToken) => WrappedReader.ReadAsync(cancellationToken);
/// <inheritdoc cref="DbDataReader.Close()"/>
public override void Close()
{
// reader can be null when we're not profiling, but we've inherited from ProfiledDbCommand and are returning a
// an unwrapped reader from the base command
WrappedReader?.Close();
_profiler?.ReaderFinish(this);
}
/// <inheritdoc cref="DbDataReader.GetSchemaTable()"/>
public override DataTable? GetSchemaTable() => WrappedReader.GetSchemaTable();
/// <inheritdoc cref="DbDataReader.Dispose(bool)"/>
protected override void Dispose(bool disposing)
{
// reader can be null when we're not profiling, but we've inherited from ProfiledDbCommand and are returning a
// an unwrapped reader from the base command
WrappedReader?.Dispose();
base.Dispose(disposing);
}
}
In the [HttpGet(“DapperProfiledQueryMultipleStep”)] method I wrapped ReadAsync and could see in the profiling that every so often a call did take significantly longer.
using (MiniProfiler.Current.Step("invoiceSummaryLine.ReadAsync"))
{
response.InvoiceLines = await invoiceSummary.ReadAsync<Model.InvoiceLineSummaryListDtoV1>();
}
I did consider modifying ProfiledDbDataReader.cs to add some instrumentation to the Read… and Get… methods but, the authors of MiniProfiler are way smarter than me, so there must be a reason why they didn’t.
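Out of curiosity, this is roughly what such instrumentation might look like. This is a hypothetical simplified wrapper, not MiniProfiler code or a full DbDataReader subclass; one plausible reason it is not built in is that timing every Read call on a large result set adds overhead and would generate an enormous number of timings, so this sketch only counts the slow ones:

```csharp
using System.Data.Common;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical instrumented reader: times each ReadAsync call and counts
// the calls that exceed a threshold, rather than recording every timing
public class InstrumentedReader
{
    private readonly DbDataReader _wrappedReader;
    private readonly long _thresholdMilliseconds;

    public long SlowReadCount { get; private set; }
    public long TotalReadCount { get; private set; }

    public InstrumentedReader(DbDataReader wrappedReader, long thresholdMilliseconds = 100)
    {
        _wrappedReader = wrappedReader;
        _thresholdMilliseconds = thresholdMilliseconds;
    }

    public async Task<bool> ReadAsync(CancellationToken cancellationToken = default)
    {
        var stopwatch = Stopwatch.StartNew();

        bool result = await _wrappedReader.ReadAsync(cancellationToken);

        stopwatch.Stop();
        TotalReadCount++;
        if (stopwatch.ElapsedMilliseconds >= _thresholdMilliseconds)
        {
            SlowReadCount++;
        }

        return result;
    }
}
```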