Sometimes there is no easy way to build a “list of lists” using the contents of multiple database tables. I have run into this problem a few times, especially when building webby services which query the database of a “legacy” (aka production) system.
Retrieving a list of StockGroups and their StockItems from the Wide World Importers database was one of the better “real world” examples I could come up with.
SQL Server Management Studio Diagram showing relationships of tables
There is a fair bit of duplication (StockGroupID, StockGroupName) in the result set.
SQL Server Management Studio StockItems-StockItemStockGroups-StockGroups query and results
There were 442 rows in the result set and 227 StockItems in the database, so I ordered the query results by StockItemID and confirmed that many StockItems were in several StockGroups.
public class StockItemListDtoV1
{
public int Id { get; set; }
public string Name { get; set; }
public decimal RecommendedRetailPrice { get; set; }
public decimal TaxRate { get; set; }
}
public class StockGroupStockItemsListDto
{
public StockGroupStockItemsListDto()
{
StockItems = new List<StockItemListDtoV1>();
}
public int StockGroupID { get; set; }
public string StockGroupName { get; set; }
public List<StockItemListDtoV1> StockItems { get; set; }
}
My initial version uses a Generic List for a StockGroup’s StockItems which is most probably not a good idea.
[Route("api/[controller]")]
[ApiController]
public class InvoiceQuerySplitOnController : ControllerBase
{
private readonly string connectionString;
private readonly ILogger<InvoiceQuerySplitOnController> logger;
public InvoiceQuerySplitOnController(IConfiguration configuration, ILogger<InvoiceQuerySplitOnController> logger)
{
this.connectionString = configuration.GetConnectionString("WorldWideImportersDatabase");
this.logger = logger;
}
[HttpGet]
public async Task<ActionResult<IAsyncEnumerable<StockGroupStockItemsListDto>>> Get()
{
IEnumerable<StockGroupStockItemsListDto> response = null;
try
{
using (SqlConnection db = new SqlConnection(this.connectionString))
{
var stockGroups = await db.QueryAsync<StockGroupStockItemsListDto, StockItemListDtoV1, StockGroupStockItemsListDto>(
sql: @"SELECT [StockGroups].[StockGroupID] as 'StockGroupID'" +
",[StockGroups].[StockGroupName]" +
",[StockItems].StockItemID as 'ID'" +
",[StockItems].StockItemName as 'Name'" +
",[StockItems].TaxRate" +
",[StockItems].RecommendedRetailPrice " +
"FROM [Warehouse].[StockGroups] " +
"INNER JOIN[Warehouse].[StockItemStockGroups] ON ([StockGroups].[StockGroupID] = [StockItemStockGroups].[StockGroupID])" +
"INNER JOIN[Warehouse].[StockItems] ON ([Warehouse].[StockItemStockGroups].[StockItemID] = [StockItems].[StockItemID])",
(stockGroup, stockItem) =>
{
// Not certain I think using a List<> here is a good idea...
stockGroup.StockItems.Add(stockItem);
return stockGroup;
},
splitOn: "ID",
commandType: CommandType.Text);
response = stockGroups.GroupBy(p => p.StockGroupID).Select(g =>
{
var groupedStockGroup = g.First();
groupedStockGroup.StockItems = g.Select(p => p.StockItems.Single()).ToList();
return groupedStockGroup;
});
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Retrieving S, Invoice Lines or Stock Item Transactions");
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok(response);
}
}
The MultiMapper syntax always trips me up and it usually takes a couple of attempts to get it to work.
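The GroupBy pass can be avoided by accumulating the StockGroups in a Dictionary inside the map lambda, so each group is only materialised once. This is a minimal sketch of that approach as another action on the same controller (the route name is made up and the query is the same as above):

[HttpGet("Dictionary")]
public async Task<ActionResult<IEnumerable<StockGroupStockItemsListDto>>> GetDictionary()
{
    var stockGroups = new Dictionary<int, StockGroupStockItemsListDto>();

    try
    {
        using (SqlConnection db = new SqlConnection(this.connectionString))
        {
            await db.QueryAsync<StockGroupStockItemsListDto, StockItemListDtoV1, StockGroupStockItemsListDto>(
                sql: @"SELECT [StockGroups].[StockGroupID] as 'StockGroupID'
                        ,[StockGroups].[StockGroupName]
                        ,[StockItems].StockItemID as 'ID'
                        ,[StockItems].StockItemName as 'Name'
                        ,[StockItems].TaxRate
                        ,[StockItems].RecommendedRetailPrice
                    FROM [Warehouse].[StockGroups]
                        INNER JOIN [Warehouse].[StockItemStockGroups] ON ([StockGroups].[StockGroupID] = [StockItemStockGroups].[StockGroupID])
                        INNER JOIN [Warehouse].[StockItems] ON ([StockItemStockGroups].[StockItemID] = [StockItems].[StockItemID])",
                (stockGroup, stockItem) =>
                {
                    // Only keep the first copy of each StockGroup, then hang every StockItem off it
                    if (!stockGroups.TryGetValue(stockGroup.StockGroupID, out StockGroupStockItemsListDto existing))
                    {
                        existing = stockGroup;
                        stockGroups.Add(existing.StockGroupID, existing);
                    }

                    existing.StockItems.Add(stockItem);

                    return existing;
                },
                splitOn: "ID",
                commandType: CommandType.Text);
        }
    }
    catch (SqlException ex)
    {
        logger.LogError(ex, "Retrieving list of StockGroups with StockItems");
        return this.StatusCode(StatusCodes.Status500InternalServerError);
    }

    return this.Ok(stockGroups.Values);
}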
List of StockGroups with StockItems
I have extended my DapperTransient module adding WithRetry versions of the 14 MultiMapper methods.
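For illustration, this is a sketch of what one of those wrappers might look like for the two-type overload, assuming a Polly retry policy defined in the same static class (the policy settings shown are placeholders, not the actual DapperTransient configuration):

private static readonly AsyncRetryPolicy RetryPolicy = Policy
    .Handle<SqlException>()
    .WaitAndRetryAsync(3, attempt => TimeSpan.FromSeconds(Math.Pow(2, attempt)));

public static Task<IEnumerable<TReturn>> QueryWithRetryAsync<TFirst, TSecond, TReturn>(
    this IDbConnection connection,
    string sql,
    Func<TFirst, TSecond, TReturn> map,
    object param = null,
    IDbTransaction transaction = null,
    bool buffered = true,
    string splitOn = "Id",
    int? commandTimeout = null,
    CommandType? commandType = null)
{
    // Same signature as Dapper's QueryAsync MultiMapper, wrapped in the retry policy
    return RetryPolicy.ExecuteAsync(() => connection.QueryAsync(sql, map, param, transaction, buffered, splitOn, commandTimeout, commandType));
}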
My current “day job” is building applications for managing portfolios of foreign currency instruments. A portfolio can contain many different types of instrument (Forwards, Options, Swaps etc.). One of the “optimisations” we use is retrieving all the different types of instruments in a portfolio with one stored procedure call.
SQL Server Management Studio Dependency viewer
The closest scenario I could come up with using the Wide World Importers database was retrieving a summary of all the information associated with an Invoice for display on a single screen.
CREATE PROCEDURE [Sales].[InvoiceSummaryGetV1](@InvoiceID as int)
AS
BEGIN
SELECT [InvoiceID]
-- ,[CustomerID]
-- ,[BillToCustomerID]
,[OrderID]
,[Invoices].[DeliveryMethodID]
,[DeliveryMethodName]
-- ,[ContactPersonID]
-- ,[AccountsPersonID]
,[SalespersonPersonID] as SalesPersonID
,[SalesPerson].[PreferredName] as SalesPersonName
-- ,[PackedByPersonID]
,[InvoiceDate]
,[CustomerPurchaseOrderNumber]
,[IsCreditNote]
,[CreditNoteReason]
,[Comments]
,[DeliveryInstructions]
-- ,[InternalComments]
-- ,[TotalDryItems]
-- ,[TotalChillerItems]
,[DeliveryRun]
,[RunPosition] as DeliveryRunPosition
,[ReturnedDeliveryData] as DeliveryData
,[ConfirmedDeliveryTime] as DeliveredAt
,[ConfirmedReceivedBy] as DeliveredTo
-- ,[LastEditedBy]
-- ,[LastEditedWhen]
FROM [Sales].[Invoices]
INNER JOIN [Application].[People] as SalesPerson ON (Invoices.[SalespersonPersonID] = [SalesPerson].[PersonID])
INNER JOIN [Application].[DeliveryMethods] as DeliveryMethod ON (Invoices.[DeliveryMethodID] = DeliveryMethod.[DeliveryMethodID])
WHERE ([Invoices].[InvoiceID] = @InvoiceID)
SELECT [InvoiceLineID]
,[InvoiceID]
,[StockItemID]
,[Description] as StockItemDescription
,[InvoiceLines].[PackageTypeID]
,[PackageType].[PackageTypeName]
,[Quantity]
,[UnitPrice]
,[TaxRate]
,[TaxAmount]
-- ,[LineProfit]
,[ExtendedPrice]
-- ,[LastEditedBy]
-- ,[LastEditedWhen]
FROM [Sales].[InvoiceLines]
INNER JOIN [Warehouse].[PackageTypes] as PackageType ON ([PackageType].[PackageTypeID] = [InvoiceLines].[PackageTypeID])
WHERE ([InvoiceLines].[InvoiceID] = @InvoiceID)
SELECT [StockItemTransactionID]
,[StockItemTransactions].[StockItemID]
,StockItem.[StockItemName] as StockItemName
,[StockItemTransactions].[TransactionTypeID]
,[TransactionType].[TransactionTypeName]
-- ,[CustomerID]
-- ,[InvoiceID]
-- ,[SupplierID]
-- ,[PurchaseOrderID]
,[TransactionOccurredWhen] as TransactionAt
,[Quantity]
-- ,[LastEditedBy]
-- ,[LastEditedWhen]
FROM [Warehouse].[StockItemTransactions]
INNER JOIN [Warehouse].[StockItems] as StockItem ON ([StockItemTransactions].StockItemID = [StockItem].StockItemID)
INNER JOIN [Application].[TransactionTypes] as TransactionType ON ([StockItemTransactions].[TransactionTypeID] = TransactionType.[TransactionTypeID])
WHERE ([StockItemTransactions].[InvoiceID] = @InvoiceID)
END
The stored procedure returns three result sets: a “summary” of the Invoice, a summary of the associated InvoiceLines and a summary of the associated StockItemTransactions.
public async Task<ActionResult<Model.InvoiceSummaryGetDtoV1>> Get([Range(1, int.MaxValue, ErrorMessage = "Invoice id must be greater than 0")] int id)
{
Model.InvoiceSummaryGetDtoV1 response = null;
try
{
using (SqlConnection db = new SqlConnection(this.connectionString))
{
var invoiceSummary = await db.QueryMultipleWithRetryAsync("[Sales].[InvoiceSummaryGetV1]", param: new { InvoiceId = id }, commandType: CommandType.StoredProcedure);
response = await invoiceSummary.ReadSingleOrDefaultWithRetryAsync<Model.InvoiceSummaryGetDtoV1>();
if (response == default)
{
logger.LogInformation("Invoice:{0} not found", id);
return this.NotFound($"Invoice:{id} not found");
}
response.InvoiceLines = (await invoiceSummary.ReadWithRetryAsync<Model.InvoiceLineSummaryListDtoV1>()).ToArray();
response.StockItemTransactions = (await invoiceSummary.ReadWithRetryAsync<Model.StockItemTransactionSummaryListDtoV1>()).ToArray();
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Retrieving Invoice, Invoice Lines or Stock Item Transactions");
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok(response);
}
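For context, here is a rough sketch of the model classes the three result sets get mapped into. The property names are inferred from the column aliases in the stored procedure (only some of the invoice summary columns are shown), so treat them as assumptions rather than the actual DTOs:

public class InvoiceSummaryGetDtoV1
{
    public int InvoiceID { get; set; }
    public int OrderID { get; set; }
    public string DeliveryMethodName { get; set; }
    public string SalesPersonName { get; set; }
    public DateTime InvoiceDate { get; set; }
    public string CustomerPurchaseOrderNumber { get; set; }

    // Second and third result sets returned by [Sales].[InvoiceSummaryGetV1]
    public InvoiceLineSummaryListDtoV1[] InvoiceLines { get; set; }
    public StockItemTransactionSummaryListDtoV1[] StockItemTransactions { get; set; }
}

public class InvoiceLineSummaryListDtoV1
{
    public int InvoiceLineID { get; set; }
    public int StockItemID { get; set; }
    public string StockItemDescription { get; set; }
    public string PackageTypeName { get; set; }
    public int Quantity { get; set; }
    public decimal UnitPrice { get; set; }
    public decimal TaxRate { get; set; }
    public decimal TaxAmount { get; set; }
    public decimal ExtendedPrice { get; set; }
}

public class StockItemTransactionSummaryListDtoV1
{
    public int StockItemTransactionID { get; set; }
    public int StockItemID { get; set; }
    public string StockItemName { get; set; }
    public string TransactionTypeName { get; set; }
    public DateTime TransactionAt { get; set; }
    public decimal Quantity { get; set; }
}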
I started again, but kept the first section as it covers one of the simplest possible approaches to caching using the [ResponseCache] attribute and VaryByQueryKeys.
[HttpGet("Response")]
[ResponseCache(Duration = StockItemsListResponseCacheDuration)]
public async Task<ActionResult<IAsyncEnumerable<Model.StockItemListDtoV1>>> GetResponse()
{
IEnumerable<Model.StockItemListDtoV1> response = null;
logger.LogInformation("Response cache load");
try
{
response = await dapper.QueryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]", commandType: CommandType.Text);
}
catch (SqlException ex)
{
logger.LogError(ex, "Retrieving list of StockItems");
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok(response);
}
[HttpGet("ResponseVarying")]
[ResponseCache(Duration = StockItemsListResponseCacheDuration, VaryByQueryKeys = new string[] { "id" })]
public async Task<ActionResult<Model.StockItemGetDtoV1>> Get([FromQuery(Name = "id"), Range(1, int.MaxValue, ErrorMessage = "Stock item id must be greater than 0")] int id)
{
Model.StockItemGetDtoV1 response = null;
logger.LogInformation("Response cache varying load id:{0}", id);
try
{
response = await dapper.QuerySingleOrDefaultAsync<Model.StockItemGetDtoV1>(sql: "[Warehouse].[StockItemsStockItemLookupV1]", param: new { stockItemId = id }, commandType: CommandType.StoredProcedure);
if (response == default)
{
logger.LogInformation("StockItem:{0} not found", id);
return this.NotFound($"StockItem:{id} not found");
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Looking up StockItem with Id:{0}", id);
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok(response);
}
All the browsers appeared to respect the cache control headers, but Firefox was the only one which did not initiate a new request when I pressed return in the Uniform Resource Locator (URL) field.
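As an aside, the [ResponseCache] attribute relies on the response caching middleware being registered; in particular VaryByQueryKeys throws an InvalidOperationException at runtime if the middleware is missing. A minimal sketch of that plumbing in Startup.cs, with an illustrative value for the StockItemsListResponseCacheDuration constant used above:

public void ConfigureServices(IServiceCollection services)
{
    services.AddResponseCaching();

    services.AddControllers();
}

public void Configure(IApplicationBuilder app)
{
    app.UseRouting();

    // Server-side caching of responses; also required for the VaryByQueryKeys support used above
    app.UseResponseCaching();

    app.UseEndpoints(endpoints => endpoints.MapControllers());
}

// In the controller - the cache duration (in seconds) referenced by the [ResponseCache] attributes
private const int StockItemsListResponseCacheDuration = 30;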
[HttpGet("DapperMemory")]
public async Task<ActionResult<IAsyncEnumerable<Model.StockItemListDtoV1>>> GetDapper()
{
List<Model.StockItemListDtoV1> response;
logger.LogInformation("Dapper cache load");
try
{
response = await dapper.QueryAsync<Model.StockItemListDtoV1>(
sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]",
commandType: CommandType.Text,
enableCache: true,
cacheExpire: TimeSpan.Parse(this.Configuration.GetValue<string>("DapperCachingDuration"))
);
}
catch (SqlException ex)
{
logger.LogError(ex, "Retrieving list of StockItems");
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok(response);
}
[HttpGet("DapperMemoryVarying")]
public async Task<ActionResult<Model.StockItemGetDtoV1>> GetDapperVarying([FromQuery(Name = "id"), Range(1, int.MaxValue, ErrorMessage = "Stock item id must be greater than 0")] int id)
{
Model.StockItemGetDtoV1 response = null;
logger.LogInformation("Dapper cache varying load id:{0}", id);
try
{
response = await dapper.QuerySingleOrDefaultAsync<Model.StockItemGetDtoV1>(
sql: "[Warehouse].[StockItemsStockItemLookupV1]",
param: new { stockItemId = id },
commandType: CommandType.StoredProcedure,
cacheKey: $"StockItem:{id}",
enableCache: true,
cacheExpire: TimeSpan.Parse(this.Configuration.GetValue<string>("DapperCachingDuration"))
);
if (response == default)
{
logger.LogInformation("StockItem:{0} not found", id);
return this.NotFound($"StockItem:{id} not found");
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Looking up StockItem with Id:{0}", id);
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok(response);
}
Both the Dapper.Extensions in-memory and Redis caches reduced the number of database requests to the bare minimum. In a larger application the formatting of the cacheKey (cacheKey: “StockItems” & cacheKey: $”StockItem:{id}”) would be important to stop collisions between cached query results.
SQL Server Profiler displaying the list and single record requests.
Memurai running as a Windows Service on my development machine
When the Web API project was restarted the contents of the in-memory cache were lost. The Redis cache contents survive a restart and can be accessed from multiple clients.
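For completeness, this is roughly how the two caches get wired up in Startup.cs. The registration extension methods and configuration property names below come from the Dapper.Extensions caching packages as I remember them, so treat the exact names as assumptions that may differ between package versions:

public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();

    // SQL Server connection factory for the injected IDapper instances
    services.AddDapperForMSSQL();

    // In-memory variant (Dapper.Extensions.Caching.Memory)
    services.AddDapperCachingInMemory(new MemoryConfiguration
    {
        AllMethodsEnableCache = false // only queries that pass enableCache: true are cached
    });

    // Redis variant (Dapper.Extensions.Caching.Redis) - use instead of the in-memory registration
    //services.AddDapperCachingInRedis(new RedisConfiguration
    //{
    //    AllMethodsEnableCache = false,
    //    KeyPrefix = "WebAPIDapper",                                        // illustrative prefix
    //    ConnectionString = Configuration.GetConnectionString("RedisCache") // illustrative connection string name
    //});
}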
Uniview IPC3635SB-ADZK-I0 Security camera test rig with Raspberry Pi and PIR motion detector
I tried to keep the .NET 5 console applications as simple as possible: they download an image from the camera “snapshot” endpoint (in this case http://10.0.0.47:85/images/snapshot.jpg), save it to the local filesystem and then upload it.
The core of the two applications is the “upload image” method, which is called by a timer or a GPIO pin EventHandler.
private static async void ImageUpdateTimerCallback(object state)
{
CommandLineOptions options = (CommandLineOptions)state;
DateTime requestAtUtc = DateTime.UtcNow;
// Just in case - stop the method being re-entered while retrieval of the photo is already in progress
if (cameraBusy)
{
return;
}
cameraBusy = true;
Console.WriteLine($"{requestAtUtc:yy-MM-dd HH:mm:ss} Image up load start");
try
{
// First go and get the image file from the camera onto local file system
using (var client = new WebClient())
{
NetworkCredential networkCredential = new NetworkCredential()
{
UserName = options.UserName,
Password = options.Password
};
client.Credentials = networkCredential;
await client.DownloadFileTaskAsync(new Uri(options.CameraUrl), options.LocalFilename);
}
// Then open the file ready to stream it up to the storage account associated with the Azure IoT Hub
using (FileStream fileStreamSource = new FileStream(options.LocalFilename, FileMode.Open))
{
var fileUploadSasUriRequest = new FileUploadSasUriRequest
{
BlobName = string.Format("{0:yyMMdd}/{0:yyMMddHHmmss}.jpg", requestAtUtc)
};
// Get the plumbing sorted for where the file is going in Azure Storage
FileUploadSasUriResponse sasUri = await azureIoTCentralClient.GetFileUploadSasUriAsync(fileUploadSasUriRequest);
Uri uploadUri = sasUri.GetBlobUri();
try
{
var blockBlobClient = new BlockBlobClient(uploadUri);
var response = await blockBlobClient.UploadAsync(fileStreamSource, new BlobUploadOptions());
var successfulFileUploadCompletionNotification = new FileUploadCompletionNotification()
{
// Mandatory. Must be the same value as the correlation id returned in the sas uri response
CorrelationId = sasUri.CorrelationId,
// Mandatory. Will be present when service client receives this file upload notification
IsSuccess = true,
// Optional, user defined status code. Will be present when service client receives this file upload notification
StatusCode = 200,
// Optional, user-defined status description. Will be present when service client receives this file upload notification
StatusDescription = "Success"
};
await azureIoTCentralClient.CompleteFileUploadAsync(successfulFileUploadCompletionNotification);
}
catch (Exception ex)
{
Console.WriteLine($"Failed to upload file to Azure Storage using the Azure Storage SDK due to {ex}");
var failedFileUploadCompletionNotification = new FileUploadCompletionNotification
{
// Mandatory. Must be the same value as the correlation id returned in the sas uri response
CorrelationId = sasUri.CorrelationId,
// Mandatory. Will be present when service client receives this file upload notification
IsSuccess = false,
// Optional, user-defined status code. Will be present when service client receives this file upload notification
StatusCode = 500,
// Optional, user defined status description. Will be present when service client receives this file upload notification
StatusDescription = ex.Message
};
await azureIoTCentralClient.CompleteFileUploadAsync(failedFileUploadCompletionNotification);
}
}
TimeSpan uploadDuration = DateTime.UtcNow - requestAtUtc;
Console.WriteLine($"{requestAtUtc:yy-MM-dd HH:mm:ss} Image up load done. Duration:{uploadDuration.TotalMilliseconds:0.} mSec");
}
catch (Exception ex)
{
Console.WriteLine($"Camera image upload process failed {ex.Message}");
}
finally
{
cameraBusy = false;
}
}
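For the timer-driven version the callback above is wired up with a System.Threading.Timer along these lines; the CommandLineOptions property names for the due time and period are made up for the sketch:

private static Timer imageUpdateTimer;

private static void ImageUpdateTimerStart(CommandLineOptions options)
{
    // The options are passed through as the timer state so the callback can read the camera URL,
    // credentials and local filename
    imageUpdateTimer = new Timer(
        ImageUpdateTimerCallback,
        options,
        TimeSpan.FromSeconds(options.ImageTimerDueSeconds),     // assumed option name
        TimeSpan.FromSeconds(options.ImageTimerPeriodSeconds)); // assumed option name
}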
As part of a contract a customer sent me a Uniview IPC3635SB-ADZK-I0 security camera for a proof of concept (PoC) project. Before the PoC I wanted to explore the camera functionality in more depth, especially how to retrieve individual images from the camera and remotely control its zoom, focus, pan, tilt etc. I’m trying to source a couple of other vendors’ security cameras with remotely controllable pan and tilt for testing.
Uniview IPC3635SB-ADZK-I0 Security camera
It appears that many cameras support retrieving the latest image with a HyperText Transfer Protocol (HTTP) GET, so that looked like a good place to start. For the next couple of posts the camera will be sitting on the bookcase in my office looking through the window at the backyard.
My .NET 5 console application is as simple as possible: it just downloads an image from the camera “snapshot” endpoint (in this case http://10.0.0.47:85/images/snapshot.jpg) and saves it to the local filesystem.
Visual Studio 2019 Debug Output showing application download process
Once the application had finished running on the device I wanted to check that the file was on the local filesystem. I used PuTTY to connect to the Raspberry Pi and then searched for LatestImage.jpg.
Linux find utility displaying the location of the downloaded file
This post was about selecting the tooling I’m comfortable with and configuring my development environment so they work well together. The next step will be using Open Network Video Interface Forum (ONVIF) to discover, determine the capabilities of and then control the camera (for this device just zoom and focus).
On a couple of the systems I work on there are a number of queries (often complex spatial searches) which are very resource intensive but are quite readily cached. In these systems we have used HTTP GET and HEAD Request methods together so that the client only re-GETs the query results after a HEAD method indicates there have been updates.
I have been trying to keep the number of changes to my Microsoft SQL Azure Wide World Importers database to a minimum, but for this post I have added a rowversion column to the StockGroups table. The rowversion data type is an automatically generated, unique 8-byte binary number (12 characters when Base64 encoded) within a database.
StockGroups table with Version column
Adding a rowversion column to an existing system-versioned table in the SQL Server Management Studio designer is painful so I used…
ALTER TABLE [Warehouse].[StockGroups] ADD [Version] [timestamp] NULL
To reduce complexity the embedded SQL contains two commands (normally I wouldn’t do this): one for retrieving the list of StockGroups, the other for retrieving the maximum StockGroup rowversion. If a StockGroup is changed its rowversion will be “automagically” updated and the maximum value will change.
[HttpGet]
public async Task<ActionResult<IAsyncEnumerable<Model.StockGroupListDtoV1>>> Get()
{
IEnumerable<Model.StockGroupListDtoV1> response = null;
try
{
using (SqlConnection db = new SqlConnection(this.connectionString))
{
var parameters = new DynamicParameters();
parameters.Add("@RowVersion", dbType: DbType.Binary, direction: ParameterDirection.Output, size: ETagBytesLength);
response = await db.QueryAsync<Model.StockGroupListDtoV1>(sql: @"SELECT [StockGroupID] as ""ID"", [StockGroupName] as ""Name"" FROM [Warehouse].[StockGroups] ORDER BY Name; SELECT @RowVersion=MAX(Version) FROM [Warehouse].[StockGroups]", param: parameters, commandType: CommandType.Text);
if (response.Any())
{
byte[] rowVersion = parameters.Get<byte[]>("RowVersion");
this.HttpContext.Response.Headers.Add("ETag", Convert.ToBase64String(rowVersion));
}
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Retrieving list of StockGroups");
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok(response);
}
I used Telerik Fiddler to capture the GET response payload.
The HEAD method requests the maximum rowversion value from the StockGroups table and compares it to the ETag. In a more complex scenario this could be a call to a local cache to see if a query result has been refreshed.
[HttpHead]
public async Task<ActionResult> Head([Required][FromHeader(Name = "ETag")][MinLength(ETagBase64Length, ErrorMessage = "eTag length invalid too short")][MaxLength(ETagBase64Length, ErrorMessage = "eTag length {0} invalid too long")] string eTag)
{
byte[] headerVersion = new byte[ETagBytesLength];
if (!Convert.TryFromBase64String(eTag, headerVersion, out _))
{
logger.LogInformation("eTag invalid format");
return this.BadRequest("eTag invalid format");
}
try
{
using (SqlConnection db = new SqlConnection(this.connectionString))
{
byte[] databaseVersion = await db.ExecuteScalarAsync<byte[]>(sql: "SELECT MAX(Version) FROM [Warehouse].[StockGroups]", commandType: CommandType.Text);
if (headerVersion.SequenceEqual(databaseVersion))
{
return this.StatusCode(StatusCodes.Status304NotModified);
}
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Retrieving StockItem list");
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok();
}
I used Fiddler to capture a HEAD response payload, a 304 Not Modified.
HTTP/1.1 304 Not Modified
Server: Microsoft-IIS/10.0
X-Powered-By: ASP.NET
Date: Sat, 26 Jun 2021 22:09:02 GMT
I then modified the database and the response changed to 200 OK indicating the local cache should be updated with a GET.
HTTP/1.1 200 OK
Transfer-Encoding: chunked
Server: Microsoft-IIS/10.0
X-Powered-By: ASP.NET
Date: Sat, 26 Jun 2021 22:09:59 GMT
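On the consuming side a client can hold onto the ETag from its last GET and use a cheap HEAD request to decide whether its cached copy is still current. A rough HttpClient sketch; the base address, route and the way the cached payload is stored are all assumptions for illustration:

private static readonly HttpClient client = new HttpClient() { BaseAddress = new Uri("https://localhost:5001/") };

private static string stockGroupsETag;  // ETag header value from the last successful GET
private static string stockGroupsJson;  // locally cached response payload

private static async Task RefreshStockGroupsIfChangedAsync()
{
    if (stockGroupsETag != null)
    {
        // Cheap check first - HEAD with the stored ETag, a 304 means the cached copy is still current
        using (var headRequest = new HttpRequestMessage(HttpMethod.Head, "api/StockGroups"))
        {
            headRequest.Headers.TryAddWithoutValidation("ETag", stockGroupsETag);

            using (HttpResponseMessage headResponse = await client.SendAsync(headRequest))
            {
                if (headResponse.StatusCode == HttpStatusCode.NotModified)
                {
                    return;
                }
            }
        }
    }

    // No cached copy yet, or the server indicated the StockGroups have changed - do the full GET
    using (HttpResponseMessage getResponse = await client.GetAsync("api/StockGroups"))
    {
        getResponse.EnsureSuccessStatusCode();

        stockGroupsJson = await getResponse.Content.ReadAsStringAsync();

        if (getResponse.Headers.TryGetValues("ETag", out IEnumerable<string> eTagValues))
        {
            stockGroupsETag = eTagValues.First();
        }
    }
}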
This approach, combined with the use of the If-Match, If-Modified-Since, If-None-Match and If-Unmodified-Since headers, allows web and client-side caches to reuse previously requested results when there have been no changes. This can significantly reduce the amount of network traffic and the number of server requests. When the ETag header failed validation the HEAD method returned a 400 Bad Request with no useful diagnostic information.
HTTP/1.1 400 Bad Request
Content-Length: 240
Content-Type: application/problem+json; charset=utf-8
Server: Microsoft-IIS/10.0
X-Powered-By: ASP.NET
Date: Sat, 26 Jun 2021 06:28:11 GMT
This was unlike the helpful validation messages returned by the GET method of the StockItems pagination example code
{
"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1",
"title":"One or more validation errors occurred.",
"status":400,
"traceId":"00-bd68c94bf05f5c4ca8752011d6a60533-48e966211dec4847-00",
"errors":
{
"PageSize":["PageSize must be present and greater than 0"],
"PageNumber":["PageNumber must be present and greater than 0"]
}
}
The lack of diagnostic information was not helpful and I’ll explore this further in a future post. I often work on FinTech applications which are “insert only” (nothing is deleted, records are just marked as inactive/read-only), so this approach is viable.