Random wanderings through Microsoft Azure esp. the IoT bits, AI on Micro controllers, .NET nanoFramework, .NET Core on *nix, and GHI Electronics TinyCLR
As part of some scale testing of my WebAPIDapper and WebMinimalAPIDapper projects I have been “cleaning up” a portion of the Inside Airbnb London dataset. To make the scale-testing results more realistic I wanted at least one table with lots of rows.
CREATE TABLE [dbo].[CalendarRawDetailed](
[listing_id] [bigint] NOT NULL,
[Date] [date] NOT NULL,
[Xavailable] [bit] NULL,
[available] [nvarchar](5) NOT NULL,
[Xprice] [money] NULL,
[price] [nvarchar](30) NOT NULL,
[Xadjusted_price] [money] NULL,
[adjusted_price] [nvarchar](30) NOT NULL,
[Xminimum_nights] [smallint] NULL,
[minimum_nights] [nvarchar](30) NOT NULL,
[Xmaximum_nights] [smallint] NULL,
[maximum_nights] [nvarchar](30) NOT NULL
) ON [PRIMARY]
The CalendarRawDetailed table had some invalid values, most probably due to formatting inconsistencies on the Airbnb website.
SELECT COUNT(*) FROM CalendarRawDetailed WHERE Xminimum_nights IS NULL
SELECT * FROM CalendarRawDetailed WHERE Xminimum_nights IS NULL
SELECT COUNT(*) FROM CalendarRawDetailed WHERE Xmaximum_nights IS NULL
SELECT * FROM CalendarRawDetailed WHERE Xmaximum_nights IS NULL
SELECT COUNT(*) FROM CalendarRawDetailed WHERE Xadjusted_price IS NULL
SELECT * FROM CalendarRawDetailed WHERE Xadjusted_price IS NULL
SELECT COUNT(*) FROM CalendarRawDetailed WHERE Xprice IS NULL
SELECT * FROM CalendarRawDetailed WHERE Xprice IS NULL
Where possible I recovered the values with an “incorrect” format, but some rows had to be deleted.
UPDATE CalendarRawDetailed SET Xmaximum_nights = TRY_CONVERT(smallint, RTRIM(maximum_nights, '"')) WHERE Xmaximum_nights IS NULL
UPDATE CalendarRawDetailed SET Xminimum_nights = TRY_CONVERT(smallint, RTRIM(minimum_nights, '"')) WHERE Xminimum_nights IS NULL
UPDATE CalendarRawDetailed SET Xadjusted_price = TRY_CONVERT(money, LTRIM(adjusted_price, '$')) --WHERE Xadjusted_price IS NULL
SELECT *
FROM CalendarRawDetailed
WHERE Xadjusted_price IS NULL
DELETE FROM CalendarRawDetailed WHERE Xmaximum_nights IS NULL
UPDATE CalendarRawDetailed SET Xavailable = 1 WHERE available = 't'
The Calendar table has 365 rows for each listing, and I will update the Calendar dates so they are in the “future” (see the sketch after the table definition).
CREATE TABLE [dbo].[Calendar](
[listing_id] [bigint] NOT NULL,
[date] [date] NOT NULL,
[available] [bit] NOT NULL,
[price] [money] NOT NULL,
[adjusted_price] [money] NOT NULL,
[minimum_nights] [smallint] NOT NULL,
[maximum_nights] [smallint] NOT NULL
) ON [PRIMARY]
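Shifting the dates forward only needs a one-off script; below is a minimal sketch (my reconstruction rather than the exact code, reusing Dapper and a placeholder connection string) which bumps every date so the oldest becomes today.
using System.Data;
using System.Data.SqlClient;
using Dapper;

using (IDbConnection connection = new SqlConnection("This is not the connection string you are looking for"))
{
   // Shift every Calendar date forward so the oldest date in the table becomes today,
   // which puts all 365 rows per listing in the "future"
   connection.Execute(@"DECLARE @DaysToAdd AS INT = DATEDIFF(day, (SELECT MIN([date]) FROM Calendar), CAST(GETDATE() AS date));
      UPDATE Calendar SET [date] = DATEADD(day, @DaysToAdd, [date])");
}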
The Calendar table has approximately 31 million rows, which should be plenty for my scale testing.
CREATE PROCEDURE [dbo].[ListingsSpatialNearbyNTSLocation]
@Origin AS GEOGRAPHY,
@distance AS INTEGER
AS
BEGIN
DECLARE @Circle AS GEOGRAPHY = @Origin.STBuffer(@distance);
SELECT TOP(50) UID AS ListingUID
,[Name]
,listing_url as ListingUrl
,Listing.Location.STDistance(@Origin) as Distance
,Listing.Location
FROM Listing
WHERE (Listing.Location.STWithin(@Circle) = 1)
ORDER BY Distance
END
CREATE PROCEDURE [dbo].[ListingsSpatialNearbyNTSSerialize]
@Origin AS GEOGRAPHY,
@distance AS INTEGER
AS
BEGIN
DECLARE @Circle AS GEOGRAPHY = @Origin.STBuffer(@distance);
SELECT TOP(50) UID AS ListingUID
,[Name]
,listing_url as ListingUrl
,Listing.Location.STDistance(@Origin) as Distance
,Location.Serialize() as Location
FROM [listing]
WHERE (Location.STWithin(@Circle) = 1)
ORDER BY Distance
END
class PointHandlerSerialise : SqlMapper.TypeHandler<Point>
{
// Deserialise the SQL Server-native bytes produced by Location.Serialize() into a NetTopologySuite Point
public override Point Parse(object value)
{
var reader = new SqlServerBytesReader { IsGeography = true };
return (Point)reader.Read((byte[])value);
}
// Serialise the NetTopologySuite Point into GEOGRAPHY UDT bytes for the @Origin parameter
public override void SetValue(IDbDataParameter parameter, Point? value)
{
((SqlParameter)parameter).SqlDbType = SqlDbType.Udt;
((SqlParameter)parameter).UdtTypeName = "GEOGRAPHY";
var writer = new SqlServerBytesWriter { IsGeography = true };
parameter.Value = writer.Write(value);
}
}
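For the handler to be invoked it has to be registered with Dapper before the first query executes; a one-line sketch (where this lives in start-up is an assumption):
// Register the NetTopologySuite Point type handler with Dapper at application start-up
SqlMapper.AddTypeHandler(new PointHandlerSerialise());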
Once the location column serialisation was working (I could see a valid response in the debugger) the generation of the response was failing with “System.Text.Json.JsonException: A possible object cycle was detected. This can either be due to a cycle or if the object depth is larger than the maximum allowed depth of 64”.
After fixing that issue the response generation failed with “System.ArgumentException: .NET number values such as positive and negative infinity cannot be written as valid JSON.”
Fixing these two issues required adjusting two HttpJsonOptions settings.
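A minimal sketch of the two settings for a Minimal API application (the exact option values are my assumption of the standard fixes for these exceptions):
using System.Text.Json.Serialization;
...
builder.Services.ConfigureHttpJsonOptions(options =>
{
   // Ignore object cycles (NetTopologySuite geometries reference their factory etc.) instead of throwing
   options.SerializerOptions.ReferenceHandler = ReferenceHandler.IgnoreCycles;

   // Write NaN and positive/negative infinity as named literals instead of throwing
   options.SerializerOptions.NumberHandling = JsonNumberHandling.AllowNamedFloatingPointLiterals;
});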
CREATE PROCEDURE [dbo].[ListingsSpatialNearbyNTSWkb]
@Origin AS GEOGRAPHY,
@distance AS INTEGER
AS
BEGIN
DECLARE @Circle AS GEOGRAPHY = @Origin.STBuffer(@distance);
SELECT TOP(50) UID AS ListingUID
,[Name]
,listing_url as ListingUrl
,Listing.Location.STDistance(@Origin) as Distance
,Location.STAsBinary() as Location
FROM [listing]
WHERE (Location.STWithin(@Circle) = 1)
ORDER BY Distance
END
CREATE PROCEDURE [dbo].[ListingsSpatialNearbyNTSWkt]
@Origin AS GEOGRAPHY,
@distance AS INTEGER
AS
BEGIN
DECLARE @Circle AS GEOGRAPHY = @Origin.STBuffer(@distance);
SELECT TOP(50) UID AS ListingUID
,[Name]
,listing_url as ListingUrl
,Listing.Location.STDistance(@Origin) as Distance
,Location.STAsText() as Location
FROM [listing]
WHERE (Location.STWithin(@Circle) = 1)
ORDER BY Distance
END
class PointHandlerWkt : SqlMapper.TypeHandler<Point>
{
// Parse the Well Known Text (WKT) returned by Location.STAsText() into a NetTopologySuite Point
public override Point Parse(object value)
{
WKTReader wktReader = new WKTReader();
return (Point)wktReader.Read(value.ToString());
}
// Serialise the NetTopologySuite Point into GEOGRAPHY UDT bytes for the @Origin parameter
public override void SetValue(IDbDataParameter parameter, Point? value)
{
((SqlParameter)parameter).SqlDbType = SqlDbType.Udt;
((SqlParameter)parameter).UdtTypeName = "GEOGRAPHY";
parameter.Value = new SqlServerBytesWriter() { IsGeography = true }.Write(value);
}
}
I have focused on getting the spatial queries to work and will stress/performance test my implementations in a future post. I will also revisit the /Spatial/NearbyGeography method to see if I can get it to work without “Location.Serialize() as Location”.
CREATE TABLE [dbo].[Neighbourhood](
[id] [bigint] IDENTITY(1,1) NOT NULL,
[NeighbourhoodUID] [uniqueidentifier] NOT NULL,
[name] [nvarchar](50) NOT NULL,
[neighbourhood_url] [nvarchar](100) NOT NULL,
[boundary] [geography] NOT NULL,
CONSTRAINT [PK_Neighbourhood] PRIMARY KEY CLUSTERED
(
[id] ASC
)
) ON [PRIMARY]
-- Then create a spatial index on GEOGRAPHY which contains the boundary polygon(s)
CREATE SPATIAL INDEX [ISX_NeighbourhoodBoundary] ON [dbo].[Neighbourhood]
(
[boundary]
)
I added a GEOGRAPHY column to the Listing table, populated it using the Latitude and Longitudes of the Listings then added a spatial index.
-- Use latitude and longitude to populate Location GEOGRAPHY column
UPDATE listing
SET Listing.Location = geography::Point(latitude, longitude, 4326)
-- Then index Location column after changing to NOT NULL
CREATE SPATIAL INDEX [IXS_ListingByLocation] ON [dbo].[listing]
(
[Location]
)
The first spatial search uses the latitude and longitude (most probably extracted from image metadata) to get a Listing’s neighbourhood.
It uses the STContains method to find the neighbourhood polygon (if there is one) which the listing location is inside.
const string ListingInNeighbourhoodSQL = @"SELECT neighbourhoodUID, name, neighbourhood_url as neighbourhoodUrl FROM Neighbourhood WHERE Neighbourhood.Boundary.STContains(geography::Point(@Latitude, @Longitude, 4326)) = 1";
...
app.MapGet("/Spatial/Neighbourhood", async (double latitude, double longitude, [FromServices] IDapperContext dapperContext) =>
{
Model.NeighbourhoodSearchDto neighbourhood;
using (var connection = dapperContext.ConnectionCreate())
{
neighbourhood = await connection.QuerySingleOrDefaultWithRetryAsync<Model.NeighbourhoodSearchDto>(ListingInNeighbourhoodSQL, new { latitude, longitude });
}
if (neighbourhood is null)
{
return Results.Problem($"Neighbourhood for Latitude:{latitude} Longitude:{longitude} not found", statusCode: StatusCodes.Status404NotFound);
}
return Results.Ok(neighbourhood);
})
.Produces<Model.NeighbourhoodSearchDto>(StatusCodes.Status200OK)
.Produces(StatusCodes.Status404NotFound)
.WithOpenApi();
In a future blog post I will compare the performance of STContains vs. STWithin with a load testing application.
The second search simulates a customer looking for listing(s) within a specified distance of a point of interest.
const string ListingsNearbySQL = @"DECLARE @Origin AS GEOGRAPHY = geography::Point(@Latitude, @Longitude, 4326);
DECLARE @Circle AS GEOGRAPHY = @Origin.STBuffer(@distance);
--DECLARE @Circle AS GEOGRAPHY = @Origin.BufferWithTolerance(@distance, 0.09,true);
SELECT uid as ListingUID, Name, listing_url as ListingUrl,
@Origin.STDistance(Listing.Location) as Distance
FROM [listing]
WHERE Listing.Location.STWithin(@Circle) = 1 ORDER BY Distance";
...
app.MapGet("/Spatial/NearbyText", async (double latitude, double longitude, double distance, [FromServices] IDapperContext dapperContext) =>
{
using (var connection = dapperContext.ConnectionCreate())
{
return await connection.QueryWithRetryAsync<Model.ListingNearbyListDto>(ListingsNearbySQL, new { latitude, longitude, distance });
}
})
.Produces<IList<Model.ListingNearbyListDto>>(StatusCodes.Status200OK)
.WithOpenApi();
The STBuffer method returns a geography object that represents a circle centered on @Origin with a radius of @distance (metres for SRID 4326).
The third and final search simulates a customer looking for listing(s) within a specified distance of a point of interest with the latitude and longitude of the listing included in the results.
const string ListingsNearbyLatitudeLongitudeSQL = @"DECLARE @Location AS GEOGRAPHY = geography::Point(@Latitude, @longitude,4326)
DECLARE @Circle AS GEOGRAPHY = @Location.STBuffer(@distance);
SELECT UID as ListingUID
,[Name]
,listing_url as ListingUrl
,Listing.Location.STDistance(@Location) as Distance
,latitude
,longitude
FROM [listing]
WHERE Listing.Location.STWithin(@Circle) = 1
ORDER BY Distance";
app.MapGet("/Spatial/NearbyLatitudeLongitude", async (double latitude, double longitude, double distance, [FromServices] IDapperContext dapperContext) =>
{
using (var connection = dapperContext.ConnectionCreate())
{
return await connection.QueryWithRetryAsync<Model.ListingNearbyListLatitudeLongitudeDto>(ListingsNearbyLatitudeLongitudeSQL, new { latitude, longitude, distance });
}
})
.Produces<IList<Model.ListingNearbyListLatitudeLongitudeDto>>(StatusCodes.Status200OK)
.WithOpenApi();
The next couple of posts will use the third-party libraries Geo and NetTopologySuite.
using System.Text.Json;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Types;
using Dapper;
string jsonString = File.ReadAllText("your geoJSON file");
using (IDbConnection connection = new SqlConnection("This is not the connection string you are looking for"))
{
connection.Open();
var neighbourHoods = JsonSerializer.Deserialize<GeoJSON.Text.Feature.FeatureCollection>(jsonString)!;
Console.WriteLine($"Features:{neighbourHoods.Features.Count}");
foreach (var feature in neighbourHoods.Features)
{
string neighbourhood = feature.Properties["neighbourhood"].ToString();
Console.WriteLine($"Neightbourhood:{neighbourhood}");
var geometery = (GeoJSON.Text.Geometry.MultiPolygon)feature.Geometry;
var s = new SqlGeographyBuilder();
s.SetSrid(4326);
s.BeginGeography(OpenGisGeographyType.MultiPolygon);
s.BeginGeography(OpenGisGeographyType.Polygon); // A
Console.WriteLine($"Polygon cordinates:{geometery.Coordinates.Count}");
foreach (var coordinates in geometery.Coordinates)
{
//s.BeginGeography(OpenGisGeographyType.Polygon); // B
Console.WriteLine($"Linestring cordinates:{coordinates.Coordinates.Count}");
foreach (var c in coordinates.Coordinates)
{
Console.WriteLine($"Point cordinates:{c.Coordinates.Count}");
s.BeginFigure(c.Coordinates[0].Latitude, c.Coordinates[0].Longitude, null, null);
for (int i = 1; i < c.Coordinates.Count; i++)
{
s.AddLine(c.Coordinates[i].Latitude, c.Coordinates[i].Longitude);
Console.Write('.');
}
Console.WriteLine();
s.EndFigure();
}
//s.EndGeography(); //B
}
s.EndGeography(); //A
s.EndGeography(); // OpenGisGeographyType.MultiPolygon
connection.Execute("INSERT INTO Neighbourhood (Name, Boundary) VALUES( @Neighbourhood, geography::STMPolyFromText(@boundary, 4326))", new { neighbourhood, boundary = s.ConstructedGeography.ToString()});
Console.WriteLine();
}
}
Console.WriteLine("loaded press <enter> to exit");
Console.ReadLine();
The neighbourhood feature loader utility is pretty “nasty” and was built for my specific scenario.
CREATE TABLE [dbo].[Neighbourhood](
[NeighbourhoodUID] [uniqueidentifier] NOT NULL,
[Name] [nvarchar](50) NOT NULL,
[Boundary] [geography] NOT NULL,
CONSTRAINT [PK_Neighbourhood] PRIMARY KEY CLUSTERED
(
[NeighbourhoodUID] ASC
)
) ON [PRIMARY]
I used Dapper to insert rows into the Neighbourhood table.
DECLARE @PointGeography AS GEOGRAPHY
SET @PointGeography = geography::Point(51.512837,-0.2894983, 4326) --Flat
--SET @PointGeography = geography::Point(51.5053469,-0.0262693,4326) -- Canary Wharf
--SET @PointGeography = geography::Point(51.476853,0.0,4326) -- Greenwich
--SET @PointGeography = geography::Point(51.501476,-0.140634, 4326) -- Buckingham palace
--SET @PointGeography = geography::Point(51.533611, -0.318889, 4326) -- Hoover factory
--SET @PointGeography = geography::Point(51.5045, -0.0865, 4326) -- Shard
--SET @PointGeography = geography::Point(51.5145683288574,-0.0194199997931719, 4326) -- D0FD60C0-CC45-4517-91F6-00161E710F28 Tower Hamlets
--SET @PointGeography = geography::Point(51.5553092956543,0.00039999998989515, 4326) -- 80264AED-BC74-4150-B393-02D42711E2E6 Waltham Forest
--SET @PointGeography = geography::Point(51.4925193786621,-0.192310005426407, 4326) -- D36E4D1C-4A35-4B6E-B9A7-01E8D732FD3B Kensington and Chelsea
SET @PointGeography = geography::Point(51.5185317993164,-0.199739992618561, 4326) -- 2CAE3CAE-5E43-4F20-9550-01B86D7EF6FF Westminster
SELECT Name, @PointGeography.STWithin(Neighbourhood.Boundary) as 'STWithin', Neighbourhood.Boundary.STContains(@PointGeography) as 'STContains'
FROM Neighbourhood
ORDER BY Name
To test the neighbourhood geography I built a test harness with some “known” locations.
The query results were the inverse of what I was expecting.
public static void Main(string[] args)
{
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
builder.Services.AddApplicationInsightsTelemetry();
builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));
builder.Services.AddControllers();
builder.Services.AddSingleton<IConnectionMultiplexer>(s => ConnectionMultiplexer.Connect(builder.Configuration.GetConnectionString("Redis")));
var app = builder.Build();
// Configure the HTTP request pipeline.
app.UseHttpsRedirection();
app.MapControllers();
app.Run();
}
I trialled the initial versions of my Redis project with Memurai on my development machine, then configured an Azure Cache for Redis instance. I then load tested the project with several Azure AppService clients and there was a significant improvement in response time.
[ApiController]
[Route("api/[controller]")]
public class StockItemsController : ControllerBase
{
private const int StockItemSearchMaximumRowsToReturn = 15;
private readonly TimeSpan StockItemListExpiration = new TimeSpan(0, 5, 0);
private const string sqlCommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]";
//private const string sqlCommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]; WAITFOR DELAY '00:00:02'";
private readonly ILogger<StockItemsController> logger;
private readonly IDbConnection dbConnection;
private readonly IDatabase redisCache;
public StockItemsController(ILogger<StockItemsController> logger, IDapperContext dapperContext, IConnectionMultiplexer connectionMultiplexer)
{
this.logger = logger;
this.dbConnection = dapperContext.ConnectionCreate();
this.redisCache = connectionMultiplexer.GetDatabase();
}
[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
var cached = await redisCache.StringGetAsync("StockItems");
if (cached.HasValue)
{
return Content(cached, "application/json");
}
var stockItems = await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
#if SERIALISER_SOURCE_GENERATION
string json = JsonSerializer.Serialize(stockItems, typeof(List<Model.StockItemListDtoV1>), Model.StockItemListDtoV1GenerationContext.Default);
#else
string json = JsonSerializer.Serialize(stockItems);
#endif
await redisCache.StringSetAsync("StockItems", json, expiry: StockItemListExpiration);
return Content(json, "application/json");
}
...
[HttpDelete()]
public async Task<ActionResult> ListCacheDelete()
{
await redisCache.KeyDeleteAsync("StockItems");
logger.LogInformation("StockItems list removed");
return this.Ok();
}
}
public class StockItemListDtoV1
{
public int Id { get; set; }
public string Name { get; set; }
public decimal RecommendedRetailPrice { get; set; }
public decimal TaxRate { get; set; }
}
[JsonSourceGenerationOptions(PropertyNamingPolicy = JsonKnownNamingPolicy.CamelCase)]
[JsonSerializable(typeof(List<StockItemListDtoV1>))]
public partial class StockItemListDtoV1GenerationContext : JsonSerializerContext
{
}
The cost of constructing the Serialiser may be higher, but the cost of performing serialisation with it is much smaller.
I used Telerik Fiddler to empty the cache, then loaded the StockItems list 10 times (more tests would improve the quality of the results). The first trial was with the “conventional” serialiser.
The average time for the conventional serialiser was 0.028562 seconds.
The average time for the source-generated version was 0.030546 seconds. But, if the initial compilation step was ignored, the average duration dropped to 0.000223 seconds, a significant improvement.
ASP.NET Core Identity roles can also have individual claims, but with the authorisation model of the legacy application I work on this functionality hasn’t been useful. We use role-based authentication with a few user claims to minimise the size of our JSON Web Tokens (JWTs).
I tried to minimise the modifications to the application. I added EnableRetryOnFailure, made some changes to namespaces etc. I also added support for email address confirmation with SendGrid and an “authentication” link to the navbar in _Layout.cshtml.
While exploring some of the functionality of MiniProfiler, some third-party examples caught my attention.
using (SqlConnection connection = new SqlConnection(@"Data Source=...; Initial Catalog=SyncDB; Trusted_Connection=Yes"))
{
using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection(connection, MiniProfiler.Current))
{
if (profiledDbConnection.State != System.Data.ConnectionState.Open)
profiledDbConnection.Open();
using (SqlCommand command = new SqlCommand("Select * From Authors", connection))
{
using (ProfiledDbCommand profiledDbCommand = new ProfiledDbCommand(command, connection, MiniProfiler.Current))
{
var data = profiledDbCommand.ExecuteReader();
//Write code here to populate the list of Authors
}
}
}
}
“Inspired” by code like this, my first attempt to retrieve a list of stock items didn’t look right.
[HttpGet("AdoProfiledOtt")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetAdoProfiledOtt()
{
List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();
using (SqlConnection connection = new SqlConnection(configuration.GetConnectionString("default")))
{
using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection(connection, MiniProfiler.Current))
{
await profiledDbConnection.OpenAsync();
using (SqlCommand command = new SqlCommand(sqlCommandText, connection))
{
using (ProfiledDbCommand profiledDbCommand = new ProfiledDbCommand(command, profiledDbConnection, MiniProfiler.Current))
{
using (SqlDataReader reader = await command.ExecuteReaderAsync())
{
using (ProfiledDbDataReader profiledDbDataReader = new ProfiledDbDataReader(reader, MiniProfiler.Current))
{
var rowParser = profiledDbDataReader.GetRowParser<Model.StockItemListDtoV1>();
while (await profiledDbDataReader.ReadAsync())
{
response.Add(rowParser(profiledDbDataReader));
}
}
}
}
}
await profiledDbConnection.CloseAsync();
}
}
return this.Ok(response);
}
/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class (with <see cref="CommandBehavior.Default"/>).
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, IDbProfiler profiler) : this(reader, CommandBehavior.Default, profiler) { }
/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class.
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="behavior">The behavior specified during command execution.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, CommandBehavior behavior, IDbProfiler? profiler)
{
WrappedReader = reader;
Behavior = behavior;
_profiler = profiler;
}
...
/// <summary>
/// The <see cref="DbDataReader"/> that is being used.
/// </summary>
public DbDataReader WrappedReader { get; }
/// <inheritdoc cref="DbDataReader.Dispose(bool)"/>
protected override void Dispose(bool disposing)
{
// reader can be null when we're not profiling, but we've inherited from ProfiledDbCommand and are returning a
// an unwrapped reader from the base command
WrappedReader?.Dispose();
base.Dispose(disposing);
}
Another “using” is not required, as ProfiledDbDataReader “automagically” disposes the SqlDataReader. This was my final version of profiling the System.Data.SqlClient code to retrieve a list of stock items.
[HttpGet("AdoProfiled")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetProfiledAdo()
{
List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();
using (ProfiledDbConnection profiledDbConnection = new ProfiledDbConnection((SqlConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
{
await profiledDbConnection.OpenAsync();
using (ProfiledDbCommand profiledDbCommand = new ProfiledDbCommand(new SqlCommand(sqlCommandText), profiledDbConnection, MiniProfiler.Current))
{
DbDataReader reader = await profiledDbCommand.ExecuteReaderAsync();
using (ProfiledDbDataReader profiledDbDataReader = new ProfiledDbDataReader(reader, MiniProfiler.Current))
{
var rowParser = profiledDbDataReader.GetRowParser<Model.StockItemListDtoV1>();
while (await profiledDbDataReader.ReadAsync())
{
response.Add(rowParser(profiledDbDataReader));
}
}
}
}
return this.Ok(response);
}
The ProfiledDbDataReader.cs implementation was “sparse”, and when loading a longer list of stock items there were some ReadAsync calls which took a bit longer.
/// <summary>
/// The profiled database data reader.
/// </summary>
public class ProfiledDbDataReader : DbDataReader
{
private readonly IDbProfiler? _profiler;
/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class (with <see cref="CommandBehavior.Default"/>).
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, IDbProfiler profiler) : this(reader, CommandBehavior.Default, profiler) { }
/// <summary>
/// Initializes a new instance of the <see cref="ProfiledDbDataReader"/> class.
/// </summary>
/// <param name="reader">The reader.</param>
/// <param name="behavior">The behavior specified during command execution.</param>
/// <param name="profiler">The profiler.</param>
public ProfiledDbDataReader(DbDataReader reader, CommandBehavior behavior, IDbProfiler? profiler)
{
WrappedReader = reader;
Behavior = behavior;
_profiler = profiler;
}
/// <summary>Gets the behavior specified during command execution.</summary>
public CommandBehavior Behavior { get; }
/// <inheritdoc cref="DbDataReader.Depth"/>
public override int Depth => WrappedReader.Depth;
/// <inheritdoc cref="DbDataReader.FieldCount"/>
public override int FieldCount => WrappedReader.FieldCount;
/// <inheritdoc cref="DbDataReader.HasRows"/>
public override bool HasRows => WrappedReader.HasRows;
/// <inheritdoc cref="DbDataReader.IsClosed"/>
public override bool IsClosed => WrappedReader.IsClosed;
/// <inheritdoc cref="DbDataReader.RecordsAffected"/>
public override int RecordsAffected => WrappedReader.RecordsAffected;
/// <summary>
/// The <see cref="DbDataReader"/> that is being used.
/// </summary>
public DbDataReader WrappedReader { get; }
/// <inheritdoc cref="DbDataReader.this[string]"/>
public override object this[string name] => WrappedReader[name];
/// <inheritdoc cref="DbDataReader.this[int]"/>
public override object this[int ordinal] => WrappedReader[ordinal];
...
/// <inheritdoc cref="DbDataReader.GetString(int)"/>
public override string GetString(int ordinal) => WrappedReader.GetString(ordinal);
/// <inheritdoc cref="DbDataReader.GetValue(int)"/>
public override object GetValue(int ordinal) => WrappedReader.GetValue(ordinal);
/// <inheritdoc cref="DbDataReader.GetValues(object[])"/>
public override int GetValues(object[] values) => WrappedReader.GetValues(values);
/// <inheritdoc cref="DbDataReader.IsDBNull(int)"/>
public override bool IsDBNull(int ordinal) => WrappedReader.IsDBNull(ordinal);
/// <inheritdoc cref="DbDataReader.IsDBNullAsync(int, CancellationToken)"/>
public override Task<bool> IsDBNullAsync(int ordinal, CancellationToken cancellationToken) => WrappedReader.IsDBNullAsync(ordinal, cancellationToken);
/// <inheritdoc cref="DbDataReader.NextResult()"/>
public override bool NextResult() => WrappedReader.NextResult();
/// <inheritdoc cref="DbDataReader.NextResultAsync(CancellationToken)"/>
public override Task<bool> NextResultAsync(CancellationToken cancellationToken) => WrappedReader.NextResultAsync(cancellationToken);
/// <inheritdoc cref="DbDataReader.Read()"/>
public override bool Read() => WrappedReader.Read();
/// <inheritdoc cref="DbDataReader.ReadAsync(CancellationToken)"/>
public override Task<bool> ReadAsync(CancellationToken cancellationToken) => WrappedReader.ReadAsync(cancellationToken);
/// <inheritdoc cref="DbDataReader.Close()"/>
public override void Close()
{
// reader can be null when we're not profiling, but we've inherited from ProfiledDbCommand and are returning a
// an unwrapped reader from the base command
WrappedReader?.Close();
_profiler?.ReaderFinish(this);
}
/// <inheritdoc cref="DbDataReader.GetSchemaTable()"/>
public override DataTable? GetSchemaTable() => WrappedReader.GetSchemaTable();
/// <inheritdoc cref="DbDataReader.Dispose(bool)"/>
protected override void Dispose(bool disposing)
{
// reader can be null when we're not profiling, but we've inherited from ProfiledDbCommand and are returning a
// an unwrapped reader from the base command
WrappedReader?.Dispose();
base.Dispose(disposing);
}
}
In the [HttpGet("DapperProfiledQueryMultipleStep")] method I wrapped ReadAsync and could see in the profiling that every so often a call did take significantly longer.
using (MiniProfiler.Current.Step("invoiceSummaryLine.ReadAsync"))
{
response.InvoiceLines = await invoiceSummary.ReadAsync<Model.InvoiceLineSummaryListDtoV1>();
}
I did consider modifying ProfiledDbDataReader.cs to add some instrumentation to the Read… and Get… methods but the authors of MiniProfiler are way, way smarter than me, so there must be a reason why they didn’t.
The note on the wiki page, “For LazyCache v2+ users, you should consider switching away from LazyCache to IDistributedCache. More information at #59”, caught my attention.
I have written other posts about caching Dapper query results with the Dapper Extension Library which worked well but had some configuration limitations. I also have posts about off-loading read-only workloads with Azure Active geo-replication or SQL Data Sync for Azure, which worked well in some scenarios but had limitations (performance and operational costs).
I explored the in-memory implementation (AddDistributedMemoryCache) on my development machine and found “tinkering” with the configuration options had little impact on the performance of my trivial sample application.
CREATE TABLE [dbo].[StockItemsCache](
[Id] [nvarchar](449) NOT NULL,
[Value] [varbinary](max) NOT NULL,
[ExpiresAtTime] [datetimeoffset](7) NOT NULL,
[SlidingExpirationInSeconds] [bigint] NULL,
[AbsoluteExpiration] [datetimeoffset](7) NULL,
PRIMARY KEY CLUSTERED
(
[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
The table used to store the data wasn’t very complex, and I could view the data associated with a cache key in SQL Server Management Studio.
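The wiring-up is a single registration; a minimal sketch assuming the table above and a connection string named “default”:
// Register the SQL Server-backed IDistributedCache against the StockItemsCache table
builder.Services.AddDistributedSqlServerCache(options =>
{
   options.ConnectionString = builder.Configuration.GetConnectionString("default");
   options.SchemaName = "dbo";
   options.TableName = "StockItemsCache";
});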
One of the applications I work on uses a complex SQL Server stored procedure to load reference data (updated daily), and being able to purge the cache at the end of this process (see the sketch below) might be useful. For a geographically distributed application, putting the Azure SQL instance “closer” to the application’s users might be worth considering.
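A hedged sketch of such a purge endpoint (the “ReferenceData” cache key and route are hypothetical):
[HttpDelete("ReferenceDataPurge")]
public async Task<ActionResult> ReferenceDataCachePurge()
{
   // Drop the cached dataset so the next request repopulates it from the database
   await distributedCache.RemoveAsync("ReferenceData");

   return this.Ok();
}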
[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
var utcNow = DateTime.UtcNow;
var cached = await distributedCache.GetAsync("StockItems");
if (cached != null)
{
#if SERIALISATION_JSON
return this.Ok(JsonSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
#if SERIALISATION_MESSAGE_PACK
return this.Ok(MessagePackSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
}
var stockItems = await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
#if SERIALISATION_JSON
await distributedCache.SetAsync("StockItems", JsonSerializer.SerializeToUtf8Bytes(stockItems), new DistributedCacheEntryOptions()
#endif
#if SERIALISATION_MESSAGE_PACK
await distributedCache.SetAsync("StockItems", MessagePackSerializer.Serialize(stockItems), new DistributedCacheEntryOptions()
#endif
{
AbsoluteExpiration = new DateTime(utcNow.Year, utcNow.Month, DateTime.DaysInMonth(utcNow.Year, utcNow.Month), StockItemListAbsoluteExpiration.Hours, StockItemListAbsoluteExpiration.Minutes, StockItemListAbsoluteExpiration.Seconds)
});
return this.Ok(stockItems);
}
[HttpGet("NoLoad")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetNoLoad()
{
var cached = await distributedCache.GetAsync("StockItems");
if (cached == null)
{
return this.NoContent();
}
#if SERIALISATION_JSON
return this.Ok(JsonSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
#if SERIALISATION_MESSAGE_PACK
return this.Ok(MessagePackSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
}
In my test environment the JSON payload for a list of stock items was a bit “chunky” at 25K bytes, so I added compile-time configurable support for the MessagePack library. This significantly reduced the size of the payload, to 5K bytes with LZ4Block and 5.2K bytes with LZ4BlockArray compression, which should reduce network traffic.
Assuming the overheads of JSON and MessagePack serialisation are similar, given the much smaller payload I would most probably use MessagePack with LZ4BlockArray compression (for improved compatibility with other implementations).
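The compression is selected via serialiser options; one way to enable it application-wide is through the default options (an assumption — my compile-time configuration may differ):
// Enable LZ4BlockArray compression for every MessagePackSerializer call that uses the default options
MessagePackSerializer.DefaultOptions = MessagePackSerializerOptions.Standard.WithCompression(MessagePackCompression.Lz4BlockArray);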
Initially, I was running the WebAPI Dapper DistributedCache application on my development box, and it only ran on the third or fourth attempt after sorting out the Azure Cache for Redis connection string and firewall configuration.
There were many questions and tentative answers on Stack Overflow, some of which I tried and which didn’t work in my scenario.
No connection is active/available to service this operation: HMGET Dapper WebAPI InstanceStockItems;
UnableToConnect on xxxxxxxxxxxxxxxxxxxxxxxxxxx.redis.cache.windows.net:6380/Interactive,
Initializing/NotStarted,
last: NONE,
origin: BeginConnectAsync,
outstanding: 0,
last-read: 10s ago, last-write: 10s ago, keep-alive: 60s,
state: Connecting,
mgr: 10 of 10 available,
last-heartbeat: never, global: 10s ago,
v: 2.2.4.27433,
mc: 1/1/0, mgr: 10 of 10 available,
clientName: XXXXXXXXXXXX, <----------------------This was important
IOCP: (Busy=0,Free=1000,Min=1,Max=1000),
WORKER: (Busy=5,Free=1018,Min=1,Max=1023),
v: 2.2.4.27433
UnableToConnect on xxxxxxxxxxxxxxxxxxxxxxxxxxx.cache.windows.net:6380/Interactive,
Initializing/NotStarted,
last: NONE,
origin: BeginConnectAsync,
outstanding: 0,
last-read: 10s ago, last-write: 10s ago, keep-alive: 60s,
state: Connecting,
mgr: 10 of 10 available,
last-heartbeat: never, global: 10s ago,
v: 2.2.4.27433
I then recreated my configuration from scratch so none of my random changes based on Stack Overflow posts would mess up my next round of debugging.
The error appeared to be a networking issue, most probably the firewall blocking connections, so I added the Azure AppService Virtual IP Address, which didn’t work…
I then went back and re-read the error message and noticed “clientName: XXXXXXXXXXXX,”. The Azure AppService client name wasn’t the default domain…
public static void Main(string[] args)
{
var builder = WebApplication.CreateBuilder(args);
// Add services to the container.
builder.Services.AddApplicationInsightsTelemetry();
builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));
builder.Services.AddControllers();
// Register IAppCache as a singleton CachingService
builder.Services.AddLazyCache();
var app = builder.Build();
// Configure the HTTP request pipeline.
app.UseHttpsRedirection();
app.MapControllers();
app.Run();
}
One of the applications I work on uses a lot of “reference” data which is effectively static, e.g. national/regional holidays in countries where they have customers, and data which is updated on a schedule, e.g. exchange rates downloaded at a known time every day.
For the holiday information updates, the dataset could be deleted (“purged”) then refreshed the next time it is referenced, or deleted then re-loaded. For exchange rate updates the cache could automatically expire the dataset shortly after the scheduled download.
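A sketch of the scheduled-expiry approach, assuming the exchange rates arrive at 09:00 UTC daily (the cache key, payload variable and five-minute grace period are hypothetical):
var utcNow = DateTime.UtcNow;

// Expire the cached rates a few minutes after the next scheduled download so a stale dataset is never served for long
var nextDownload = utcNow.Date.AddHours(9);
if (nextDownload <= utcNow)
{
   nextDownload = nextDownload.AddDays(1);
}

await distributedCache.SetAsync("ExchangeRates", payload, new DistributedCacheEntryOptions()
{
   AbsoluteExpiration = nextDownload.AddMinutes(5)
});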