I was using Azure IoT Explorer to monitor the telemetry and found that the initial versions of the application would fail after 6 or 7 hours. After reviewing the code I added a couple of “using” statements which appear to have fixed the problem, as the soak test has been running for 12hrs, 24hrs, 36hrs, 48hrs, 96hrs…
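The offending code isn't included in this post; as a rough sketch of the kind of change involved (assuming the telemetry payload is a Microsoft.Azure.Devices.Client Message, which is IDisposable), each message gets wrapped in a using block so it is disposed after every send:
// Hypothetical sketch only - payload and deviceClient are stand-ins, not the application's actual code.
using (Message message = new Message(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(payload))))
{
    message.ContentType = "application/json";
    message.ContentEncoding = "utf-8";

    // Without the using block each Message hangs around until garbage collection, which can exhaust resources over a long soak test.
    await deviceClient.SendEventAsync(message);
}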
A customer with a large application containing a lot of ADO.Net code was comfortable with Dapper DynamicParameters. Hundreds of stored procedures with input (and some output) parameters were used to manage access to data. The main advantage of this approach was “familiarity”, and the use of DynamicParameters made the mapping of C# variables to stored procedure parameters (with different naming conventions) obvious.
[HttpGet("Dynamic")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDynamic(
[Required][MinLength(3, ErrorMessage = "The name search text must be at least {1} characters long"), MaxLength(20, ErrorMessage = "The name search text must be no more than {1} characters long")] string searchText,
[Required][Range(1, int.MaxValue, ErrorMessage = "MaximumRowsToReturn must be greater than or equal to {1}")] int maximumRowsToReturn)
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
DynamicParameters parameters = new DynamicParameters();
parameters.Add("MaximumRowsToReturn", maximumRowsToReturn);
parameters.Add("SearchText", searchText);
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: "[Warehouse].[StockItemsNameSearchV1]", param: parameters, commandType: CommandType.StoredProcedure);
}
return this.Ok(response);
}
Error message displayed when SearchText field missing
Error message displayed when SearchText is too short
Error message displayed when SearchText too long
Successful query of StockItems table
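The stored procedure output parameters mentioned above are where DynamicParameters earns its keep; a minimal sketch (the stored procedure name and RowCount parameter are hypothetical, reusing the same SqlConnection pattern as above):
DynamicParameters parameters = new DynamicParameters();
parameters.Add("SearchText", searchText);
// Output parameters are declared with a direction, something anonymous types can't express
parameters.Add("RowCount", dbType: DbType.Int32, direction: ParameterDirection.Output);

await db.ExecuteAsync("[Warehouse].[StockItemsNameSearchWithRowCountV1]", parameters, commandType: CommandType.StoredProcedure);

int rowCount = parameters.Get<int>("RowCount");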
The developers at another company used anonymous types everywhere. They also had similar C# and stored procedure parameter naming conventions, so minimal mapping was required (in the example code only maximumRowsToReturn vs. stockItemsMaximum). They found mapping stored procedure output parameters problematic. For longer parameter lists they struggled to format the code in a way which was readable.
[HttpGet("Anonymous")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetAnonymous(
[Required][MinLength(3, ErrorMessage = "The name search text must be at least {1} characters long"), MaxLength(20, ErrorMessage = "The name search text must be no more than {1} characters long")] string searchText,
[Required][Range(1, 100, ErrorMessage = "The maximum number of stock items to return must be greater than or equal to {1} and less than or equal to {2}")] int stockItemsMaximum)
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: "[Warehouse].[StockItemsNameSearchV1]", new { searchText, maximumRowsToReturn = stockItemsMaximum }, commandType: CommandType.StoredProcedure);
}
return this.Ok(response);
}
public class StockItemNameSearchDtoV1
{
[Required]
[MinLength(3, ErrorMessage = "The name search text must be at least {1} characters long"), MaxLength(20, ErrorMessage = "The name search text must be no more than {1} characters long")]
public string SearchText { get; set; }
[Required]
[Range(1, 100, ErrorMessage = "The maximum number of rows to return must be greater than or equal to {1} and less than or equal to {2}")]
public int MaximumRowsToReturn { get; set; }
}
[HttpGet("AutomagicDefault")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDefault([FromQuery] Model.StockItemNameSearchDtoV1 request)
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: "[Warehouse].[StockItemsNameSearchV1]", param: request, commandType: CommandType.StoredProcedure);
}
return this.Ok(response);
}
At another customer the developers used Data Transfer Objects (DTOs)/Plain Old CLR Objects (POCOs) to access a database which had several hundred stored procedures. They had no control over the stored procedure parameter names so they mapped query string parameters to the properties of their POCOs.
public class StockItemNameSearchDtoV2
{
[Required]
[FromQuery(Name = "SearchText")]
[MinLength(3, ErrorMessage = "The name search text must be at least {1} characters long"), MaxLength(20, ErrorMessage = "The name search text must be no more than {1} characters long")]
public string SearchText { get; set; }
[Required]
[FromQuery(Name = "StockItemsMaximum")]
[Range(1, 100, ErrorMessage = "The maximum number of stock items to return must be greater than or equal to {1} and less than or equal to {2}")]
public int MaximumRowsToReturn { get; set; }
}
[HttpGet("AutomagicMapped")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetMapperDecorated([FromQuery] Model.StockItemNameSearchDtoV2 request)
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: "[Warehouse].[StockItemsNameSearchV1]", param: request, commandType: CommandType.StoredProcedure);
}
return this.Ok(response);
}
I don’t think that [FromQuery] decorations on POCOs are a good idea. If the classes are only used for one method I would consider moving them into the controller file.
//
// https://localhost:5001/api/StockItemsParameter/Array?StockItemId=1&StockItemId=5&StockItemId=10
//
[HttpGet("Array")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetArray(
[FromQuery(Name = "stockItemID")][Required(), MinLength(1, ErrorMessage = "Minimum of {1} StockItem id(s)"), MaxLength(100, ErrorMessage = "Maximum {1} StockItem ids")] int[] stockItemIDs)
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems] WHERE StockItemID IN @StockItemIds ", new { StockItemIDs = stockItemIDs }, commandType: CommandType.Text);
}
return this.Ok(response);
}
A customer wanted users to be able to search for items selected in a multiple-selection list so a Dapper WHERE IN value array was used.
Dapper WHERE IN with no StockItemIds on the query string
Dapper WHERE IN with several StockItemIds on query string
To explore how this worked I downloaded the Dapper source code and referenced the project in my solution.
After single stepping through the Dapper source code I found where the array of StockItems was getting mapped into a “generated” parameterised SQL statement.
Dapper generated parameterised SQL Statement
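From memory the generated statement looked roughly like this; Dapper expands the IN @StockItemIds placeholder into one numbered parameter per array element (illustrative, not a capture of the actual generated text):
SELECT [StockItemID] as "ID", [StockItemName] as "Name", [RecommendedRetailPrice], [TaxRate]
FROM [Warehouse].[StockItems]
WHERE StockItemID IN (@StockItemIds1, @StockItemIds2, @StockItemIds3)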
Based on my customers’ experiences a “mix ’n’ match” approach to parameterising Dapper queries looks reasonable.
[HttpGet("IEnumerableSmall")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetIEnumerableSmall([FromQuery] bool buffered = false)
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
logger.LogInformation("IEnumerableSmall start Buffered:{buffered}", buffered);
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
sql: @"SELECT [SI1].[StockItemID] as ""ID"", [SI1].[StockItemName] as ""Name"", [SI1].[RecommendedRetailPrice], [SI1].[TaxRate]" +
"FROM [Warehouse].[StockItems] as SI1",
buffered,
commandType: CommandType.Text);
logger.LogInformation("IEnumerableSmall done");
}
return this.Ok(response);
}
The easiest way to increase the size of the returned recordset was with CROSS JOIN(s). This is the first (and most probably the last) time I have used a cross join in a “real” application.
[HttpGet("IEnumerableMedium")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetIEnumerableMedium([FromQuery] bool buffered = false)
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
logger.LogInformation("IEnumerableMedium start Buffered:{buffered}", buffered);
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
sql: @" SELECT [SI2].[StockItemID] as ""ID"", [SI2].[StockItemName] as ""Name"", [SI2].[RecommendedRetailPrice], [SI2].[TaxRate]" +
"FROM [Warehouse].[StockItems] as SI1" +
"CROSS JOIN[Warehouse].[StockItems] as SI2",
buffered,
commandType: CommandType.Text);
logger.LogInformation("IEnumerableMedium done");
}
return this.Ok(response);
}
The medium controller returns 51,529 (227 x 227) rows and the large controller up to 11,697,083 (227 x 227 x 227) rows.
[HttpGet("IEnumerableLarge")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetIEnumerableLarge()
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
logger.LogInformation("IEnumerableLarge start");
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
sql: $@"SELECT [SI3].[StockItemID] as ""ID"", [SI3].[StockItemName] as ""Name"", [SI3].[RecommendedRetailPrice], [SI3].[TaxRate]" +
"FROM [Warehouse].[StockItems] as SI1" +
" CROSS JOIN[Warehouse].[StockItems] as SI2" +
" CROSS JOIN[Warehouse].[StockItems] as SI3",
commandType: CommandType.Text);
logger.LogInformation("IEnumerableLarge done");
}
return this.Ok(response);
}
The relevant part of the Dapper QueryAsync implementation shows how the buffered flag changes behaviour:
if (command.Buffered)
{
var buffer = new List<T>();
var convertToType = Nullable.GetUnderlyingType(effectiveType) ?? effectiveType;
while (await reader.ReadAsync(cancel).ConfigureAwait(false))
{
object val = func(reader);
buffer.Add(GetValue<T>(reader, effectiveType, val));
}
while (await reader.NextResultAsync(cancel).ConfigureAwait(false))
{ /* ignore subsequent result sets */ }
command.OnCompleted();
return buffer;
}
else
{
// can't use ReadAsync / cancellation; but this will have to do
wasClosed = false; // don't close if handing back an open reader; rely on the command-behavior
var deferred = ExecuteReaderSync<T>(reader, func, command.Parameters);
reader = null; // to prevent it being disposed before the caller gets to see it
return deferred;
}
IEnumerableLarge method (buffered=false) response sizes and timings
IEnumerableLarge method (buffered=true) response sizes and timings
The unbuffered version had a slower Time To Last Byte (TTLB) and failed earlier, which I was expecting.
[HttpGet("IAsyncEnumerableLarge")]
public async Task<ActionResult<IAsyncEnumerable<Model.StockItemListDtoV1>>> GetAsyncEnumerableLarge([FromQuery] bool buffered = false, [FromQuery]int recordCount = 10)
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
logger.LogInformation("IAsyncEnumerableLarge start RecordCount:{recordCount} Buffered:{buffered}", recordCount, buffered);
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
sql: $@"SELECT TOP({recordCount}) [SI3].[StockItemID] as ""ID"", [SI3].[StockItemName] as ""Name"", [SI3].[RecommendedRetailPrice], [SI3].[TaxRate]" +
"FROM [Warehouse].[StockItems] as SI1" +
" CROSS JOIN[Warehouse].[StockItems] as SI2" +
" CROSS JOIN[Warehouse].[StockItems] as SI3",
buffered,
commandType: CommandType.Text);
logger.LogInformation("IAsyncEnumerableLarge done");
}
return this.Ok(response);
}
IAsyncEnumerableLarge method response sizes and timings
[HttpGet("IAsyncEnumerableLargeYield")]
public async IAsyncEnumerable<Model.StockItemListDtoV1> GetAsyncEnumerableLargeYield([FromQuery] int recordCount = 10)
{
int rowCount = 0;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
logger.LogInformation("IAsyncEnumerableLargeYield start RecordCount:{recordCount}", recordCount);
CommandDefinition commandDefinition = new CommandDefinition(
$@"SELECT TOP({recordCount}) [SI3].[StockItemID] as ""ID"", [SI3].[StockItemName] as ""Name"", [SI3].[RecommendedRetailPrice], [SI3].[TaxRate]" +
"FROM [Warehouse].[StockItems] as SI1" +
" CROSS JOIN[Warehouse].[StockItems] as SI2" +
" CROSS JOIN[Warehouse].[StockItems] as SI3",
//commandTimeout:
commandType: CommandType.Text
//flags: CommandFlags.Pipelined
);
using var reader = await db.ExecuteReaderWithRetryAsync(commandDefinition);
var rowParser = reader.GetRowParser<Model.StockItemListDtoV1>();
while (await reader.ReadAsync())
{
rowCount++;
if ((rowCount % 10000) == 0)
{
logger.LogInformation("Row count:{0}", rowCount);
}
yield return rowParser(reader);
}
logger.LogInformation("IAsyncEnumerableLargeYield done");
}
}
When this post was written (August 2022) my understanding of Dapper and IAsyncEnumerable was limited, so I trialled the approach suggested in a StackOverflow post.
IAsyncEnumerableLargeYield method response sizes and timings
The IAsyncEnumerableLargeYield method was faster to start responding, the overall duration was less, and it returned significantly more records (7,000,000 vs. 13,000,000). I assume this was because the response was streamed so there wasn’t a timeout.
Azure Application Insights displaying the IAsyncEnumerable with yield method executing
The results of my tests should be treated as “indicative” rather than “definitive”. In a future post I will compare the scalability of the different approaches. The number of records returned by IAsyncEnumerableLargeYield is not realistic, and in a “real-world” scenario paging or an alternative approach should be used.
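A paged version of the query is one obvious alternative; a sketch (not part of the sample, the page parameters are hypothetical) using OFFSET/FETCH so each request returns a bounded number of rows:
[HttpGet("Paged")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetPaged([FromQuery] int pageNumber = 1, [FromQuery] int pageSize = 100)
{
    IEnumerable<Model.StockItemListDtoV1> response = null;

    using (SqlConnection db = new SqlConnection(this.connectionString))
    {
        response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(
            sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] " +
                "FROM [Warehouse].[StockItems] " +
                "ORDER BY [StockItemID] " +
                "OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY",
            new { Offset = (pageNumber - 1) * pageSize, PageSize = pageSize },
            commandType: CommandType.Text);
    }

    return this.Ok(response);
}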
Read-only replicas of an Azure SQL Database with Active geo-replication are easy to set up but there are some disadvantages, e.g. bi-directional synchronisation is not supported, some tables (or selected columns of some tables) might not be needed or should not be accessible for reporting, and the overhead of replicating tables used for transaction processing might impact the performance of the solution. Azure SQL Data Sync is a service built on Azure SQL Database that can synchronise selected data bi-directionally across multiple databases, both on-premises and in the cloud.
StockItemsReadOnlyReplicas Controller JSON after first replication completed
Azure application Insights Dependencies showing usage of different synchronised databases
StockItems table in source database with updated RRP
StockItems table in destination database with updated RRP after next scheduled synchronisation
StockItems table in destination database after next scheduled synchronisation
The Azure SQL Database Data Sync was pretty easy to set up (the configuration of the hub database tripped me up initially). For a production scenario where only a portion of the database is required (e.g. shaped by Customer, Geography, security considerations, or a bi-directional requirement) it would be an effective solution, though for some applications the delay between synchronisations might be an issue.
namespace devMobile.WebAPIDapper.Lists.Controllers
{
[ApiController]
[Route("api/[controller]")]
public class StockItemsRetryADONetController : ControllerBase
{
private readonly string connectionString;
private readonly ILogger<StockItemsRetryADONetController> logger;
// This is a bit nasty but sufficient for PoC
private readonly int NumberOfRetries = 3;
private readonly TimeSpan TimeBeforeNextExecution = TimeSpan.Parse("00:00:01");
private readonly TimeSpan MaximumInterval = TimeSpan.Parse("00:00:30");
private readonly List<int> TransientErrors = new List<int>()
{
49920, // Cannot process request. Too many operations in progress for subscription
49919, // Cannot process create or update request. Too many create or update operations in progress for subscription
49918, // Cannot process request. Not enough resources to process request.
41839, // Transaction exceeded the maximum number of commit dependencies.
41325, // The current transaction failed to commit due to a serializable validation failure.
41305, // The current transaction failed to commit due to a repeatable read validation failure.
41302, // The current transaction attempted to update a record that has been updated since the transaction started.
41301, // Dependency failure: a dependency was taken on another transaction that later failed to commit.
40613, // Database XXXX on server YYYY is not currently available. Please retry the connection later.
40501, // The service is currently busy. Retry the request after 10 seconds
40197, // The service has encountered an error processing your request. Please try again
20041, // Transaction rolled back. Could not execute trigger. Retry your transaction.
17197, // Login failed due to timeout; the connection has been closed. This error may indicate heavy server load.
14355, // The MSSQLServerADHelper service is busy. Retry this operation later.
11001, // Connection attempt failed
10936, // The request limit for the elastic pool has been reached.
10929, // The server is currently too busy to support requests.
10928, // The limit for the database has been reached
10922, // Operation failed. Rerun the statement.
10060, // A network-related or instance-specific error occurred while establishing a connection to SQL Server.
10054, // A transport-level error has occurred when sending the request to the server.
10053, // A transport-level error has occurred when receiving results from the server.
9515, // An XML schema has been altered or dropped, and the query plan is no longer valid. Please rerun the query batch.
8651, // Could not perform the operation because the requested memory grant was not available in resource pool
8645, // A timeout occurred while waiting for memory resources to execute the query in resource pool, Rerun the query
8628, // A timeout occurred while waiting to optimize the query. Rerun the query.
4221, // Login to read-secondary failed due to long wait on 'HADR_DATABASE_WAIT_FOR_TRANSITION_TO_VERSIONING'. The replica is not available for login because row versions are missing for transactions that were in-flight when the replica was recycled
4060, // Cannot open database requested by the login. The login failed.
3966, // Transaction is rolled back when accessing version store. It was earlier marked as victim when the version store was shrunk due to insufficient space in tempdb. Retry the transaction.
3960, // Snapshot isolation transaction aborted due to update conflict. You cannot use snapshot isolation to access table directly or indirectly in database
3935, // A FILESTREAM transaction context could not be initialized. This might be caused by a resource shortage. Retry the operation.
1807, // Could not obtain exclusive lock on database 'model'. Retry the operation later.
1221, // The Database Engine is attempting to release a group of locks that are not currently held by the transaction. Retry the transaction.
1205, // Deadlock
1204, // The instance of the SQL Server Database Engine cannot obtain a LOCK resource at this time. Rerun your statement.
1203, // A process attempted to unlock a resource it does not own. Retry the transaction.
997, // A connection was successfully established with the server, but then an error occurred during the login process.
921, // Database has not been recovered yet. Wait and try again.
669, // The row object is inconsistent. Please rerun the query.
617, // Descriptor for object in database not found in the hash table during attempt to un-hash it. Rerun the query. If a cursor is involved, close and reopen the cursor.
601, // Could not continue scan with NOLOCK due to data movement.
233, // The client was unable to establish a connection because of an error during connection initialization process before login.
121, // The semaphore timeout period has expired.
64, // A connection was successfully established with the server, but then an error occurred during the login process.
20, // The instance of SQL Server you attempted to connect to does not support encryption.
};
...
}
After some experimentation, the most reliable way I could reproduce a transient failure (usually SQL Error 11001, “An error has occurred while establishing a connection to the server”) was by modifying the database connection string, or unplugging the network cable after a connection had been explicitly opened or a command executed.
namespace devMobile.WebAPIDapper.Lists.Controllers
{
[ApiController]
[Route("api/[controller]")]
public class StockItemsRetryADONetController : ControllerBase
{
...
[HttpGet("Dapper")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapper()
{
IEnumerable<Model.StockItemListDtoV1> response = null;
SqlRetryLogicOption sqlRetryLogicOption = new SqlRetryLogicOption()
{
NumberOfTries = NumberOfRetries,
DeltaTime = TimeBeforeNextExecution,
MaxTimeInterval = MaximumInterval,
TransientErrors = TransientErrors,
//AuthorizedSqlCondition = x => string.IsNullOrEmpty(x) || Regex.IsMatch(x, @"^SELECT", RegexOptions.IgnoreCase),
};
SqlRetryLogicBaseProvider sqlRetryLogicProvider = SqlConfigurableRetryFactory.CreateFixedRetryProvider(sqlRetryLogicOption);
using (SqlConnection db = new SqlConnection(this.connectionString))
{
db.RetryLogicProvider = sqlRetryLogicProvider;
db.RetryLogicProvider.Retrying += new EventHandler<SqlRetryingEventArgs>(OnDapperRetrying);
await db.OpenAsync(); // Did explicitly so I could yank out the LAN cable.
response = await db.QueryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]", commandType: CommandType.Text);
}
return this.Ok(response);
}
protected void OnDapperRetrying(object sender, SqlRetryingEventArgs args)
{
logger.LogInformation("Dapper retrying for {RetryCount} times for {args.Delay.TotalMilliseconds:0.} mSec - Error code: {Number}", args.RetryCount, args.Delay.TotalMilliseconds, (args.Exceptions[0] as SqlException).Number);
}
...
}
}
ADO.Net RetryLogicProvider retrying request 3 times
I then added an OpenAsync just before the Dapper query so I could open the database connection, pause the program with a breakpoint, unplug the LAN cable and then continue execution. The QueryAsync failed without any retries, and modifying the AuthorizedSqlCondition didn’t seem to change the way different SQL statement failures were handled.
There was limited documentation about how to use the ADO.Net retry functionality so I hacked up another method to try and figure out what I had done wrong. The method uses the same SqlRetryLogicOption configuration for retrying both connection and command failures.
namespace devMobile.WebAPIDapper.Lists.Controllers
{
[ApiController]
[Route("api/[controller]")]
public class StockItemsRetryADONetController : ControllerBase
{
...
[HttpGet("AdoNet")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetAdoNet()
{
List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();
// Both connection and command share same logic not really an issue for nasty demo
SqlRetryLogicOption sqlRetryLogicOption = new SqlRetryLogicOption()
{
NumberOfTries = NumberOfRetries,
DeltaTime = TimeBeforeNextExecution,
MaxTimeInterval = MaximumInterval,
TransientErrors = TransientErrors,
//AuthorizedSqlCondition = x => string.IsNullOrEmpty(x) || Regex.IsMatch(x, @"^SELECT", RegexOptions.IgnoreCase),
};
SqlRetryLogicBaseProvider sqlRetryLogicProvider = SqlConfigurableRetryFactory.CreateFixedRetryProvider(sqlRetryLogicOption);
// This ADO.Net code is a bit of overkill but I just wanted to highlight ADO.Net vs. Dapper
using (SqlConnection sqlConnection = new SqlConnection(this.connectionString))
{
sqlConnection.RetryLogicProvider = sqlRetryLogicProvider;
sqlConnection.RetryLogicProvider.Retrying += new EventHandler<SqlRetryingEventArgs>(OnConnectionRetrying);
await sqlConnection.OpenAsync(); // Did explicitly so I could yank out the LAN cable.
using (SqlCommand sqlCommand = new SqlCommand())
{
sqlCommand.Connection = sqlConnection;
sqlCommand.CommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]";
sqlCommand.CommandType = CommandType.Text;
sqlCommand.RetryLogicProvider = sqlRetryLogicProvider;
sqlCommand.RetryLogicProvider.Retrying += new EventHandler<SqlRetryingEventArgs>(OnCommandRetrying);
// Overkill but makes it really obvious
using (SqlDataReader sqlDataReader = await sqlCommand.ExecuteReaderAsync(CommandBehavior.CloseConnection))
{
while (await sqlDataReader.ReadAsync())
{
response.Add(new Model.StockItemListDtoV1()
{
Id = sqlDataReader.GetInt32("Id"),
Name = sqlDataReader.GetString("Name"),
RecommendedRetailPrice = sqlDataReader.GetDecimal("RecommendedRetailPrice"),
TaxRate = sqlDataReader.GetDecimal("TaxRate"),
});
}
}
};
}
return this.Ok(response);
}
protected void OnConnectionRetrying(object sender, SqlRetryingEventArgs args)
{
logger.LogInformation("Connection retrying for {RetryCount} times for {args.Delay.TotalMilliseconds:0.} mSec - Error code: {Number}", args.RetryCount, args.Delay.TotalMilliseconds, (args.Exceptions[0] as SqlException).Number);
}
protected void OnCommandRetrying(object sender, SqlRetryingEventArgs args)
{
logger.LogInformation("Command retrying for {RetryCount} times for {args.Delay.TotalMilliseconds:0.} mSec - Error code: {Number}", args.RetryCount, args.Delay.TotalMilliseconds, (args.Exceptions[0] as SqlException).Number);
}
}
}
One of the easiest ways to create read-only replicas of an Azure SQL Database is with Active geo-replication (it’s also useful for disaster recovery, with geo-failover to a geo-secondary in a different Azure Region).
I then created replicas in the same region for the read-only queries (if the application had a global customer base, creating read-only geo-replicas in regions close to users might be worth considering).
Azure SQL Database no replicas configured
Azure Portal Create Geo Replica
I created four replicas which is the maximum number supported. If more replicas were required a secondary of a secondary (a process known as chaining) could be used to create additional geo-replicas.
Azure Portal Primary Database and four Geo-replicas
Azure Application Insights showing multiple Geo-Replicas being used.
The Azure SQL Database geo-replication was pretty easy to set up. For a production scenario where only a portion of the database is required (e.g. shaped by Customer or Geography) it might not be the “right hammer”.
WebAPI Dapper Azure Resource Group
The other limitation I encountered was the resources used by the replication of “transaction processing” tables (in the World Wide Importers database tables like the Sales.OrderLines, Sales.CustomerTransactions etc.) which often wouldn’t be required for read-only applications.
The company builds a Software as a Service (SaaS) product for managing portfolios of foreign currency forwards, options, swaps etc. Part of the solution is an application which customers use to get an “aggregated” view of their purchases.
The database queries that look up reference data (forward curves etc.) and return a shaped dataset for each supported instrument type, and the C# code that then “aggregates” the information, consume significant database and processing resources.
The connection strings of the read-only replicas are loaded as the application starts.
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
services.AddControllers();
var errorHandlerSettings = Configuration.GetSection(nameof(ErrorHandlerSettings));
services.Configure<ErrorHandlerSettings>(errorHandlerSettings);
var readonlyReplicaServersConnectionStringSettings = Configuration.GetSection("ReadonlyReplicaServersConnectionStringSettings");
services.Configure<List<string>>(readonlyReplicaServersConnectionStringSettings);
services.AddResponseCaching();
services.AddDapperForMSSQL();
#if DAPPER_EXTENSIONS_CACHE_MEMORY
services.AddDapperCachingInMemory(new MemoryConfiguration
{
AllMethodsEnableCache = false
});
#endif
#if DAPPER_EXTENSIONS_CACHE_REDIS
services.AddDapperCachingInRedis(new RedisConfiguration
{
AllMethodsEnableCache = false,
KeyPrefix = Configuration.GetValue<string>("RedisKeyPrefix"),
ConnectionString = Configuration.GetConnectionString("RedisConnection")
});
#endif
services.AddApplicationInsightsTelemetry();
}
Then code was added to the controller to randomly select which read-only replica to use. More complex approaches were considered but not implemented for the initial version.
[ApiController]
[Route("api/[controller]")]
public class StockItemsReadonlyReplicasController : ControllerBase
{
private readonly ILogger<StockItemsReadonlyReplicasController> logger;
private readonly List<string> readonlyReplicasConnectionStrings;
public StockItemsReadonlyReplicasController(ILogger<StockItemsReadonlyReplicasController> logger, IOptions<List<string>> readonlyReplicasServerConnectionStrings)
{
this.logger = logger;
this.readonlyReplicasConnectionStrings = readonlyReplicasServerConnectionStrings.Value;
}
[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
IEnumerable<Model.StockItemListDtoV1> response = null;
if (readonlyReplicasConnectionStrings.Count == 0)
{
logger.LogError("No readonly replica server Connection strings configured");
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
Random random = new Random(); // maybe this shouldn't be instantiated every call, but a shared Random has thread-safety issues
string connectionString = readonlyReplicasConnectionStrings[random.Next(0, readonlyReplicasConnectionStrings.Count)];
logger.LogTrace("Connection string {connectionString}", connectionString);
using (SqlConnection db = new SqlConnection(connectionString))
{
response = await db.QueryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]", commandType: CommandType.Text);
}
return this.Ok(response);
}
}
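One of the “more complex approaches” considered (but not implemented) was round-robin selection; a sketch of what that might look like, assuming a static counter shared across requests (requires System.Threading):
// Hypothetical round-robin alternative to Random - not part of the sample code
private static int readonlyReplicaIndex = -1;

private string NextReadonlyReplicaConnectionString()
{
    // Interlocked.Increment avoids the thread-safety concerns of a shared Random instance
    int index = Interlocked.Increment(ref readonlyReplicaIndex) & int.MaxValue; // mask keeps the value non-negative after wrap-around

    return readonlyReplicasConnectionStrings[index % readonlyReplicasConnectionStrings.Count];
}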
The Read-only replica server connection string setup template in appsettings.Development.json.
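A minimal sketch of that template (placeholder entries only, the real connection strings were kept out of source control):
"ReadonlyReplicaServersConnectionStringSettings": [
    "",
    "",
    "",
    ""
]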
The Manage UserSecrets (Secrets.json) functionality was used for testing on my development machine. In the production Azure App Service the array of connection strings was configured with the ReadonlyReplicaServersConnectionStringSettings:0, ReadonlyReplicaServersConnectionStringSettings:1 etc. syntax.
Sample application Azure App Service Configuration
Azure Application Insights with connections to different read-only replicas highlighted
I had incorrectly configured the firewall on one of the read-only replica database servers so roughly one in four connection attempts failed.
For some historical reason I can’t remember, my controllers often had an outer try/catch and associated logging. I think it may have been to ensure no “sensitive” information was returned to the caller even if the application was incorrectly deployed. So I could revisit my approach I added a controller with two methods: one which returns an HTTP 500 error and another which has an uncaught exception.
[Route("api/[controller]")]
[ApiController]
public class StockItemsNok500Controller : ControllerBase
{
private readonly string connectionString;
private readonly ILogger<StockItemsNok500Controller> logger;
public StockItemsNok500Controller(IConfiguration configuration, ILogger<StockItemsNok500Controller> logger)
{
this.connectionString = configuration.GetConnectionString("WorldWideImportersDatabase");
this.logger = logger;
}
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get500()
{
IEnumerable<Model.StockItemListDtoV1> response = null;
try
{
using (SqlConnection db = new SqlConnection(this.connectionString))
{
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItem500]", commandType: CommandType.Text);
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Retrieving list of StockItems");
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok(response);
}
}
The information returned to a caller was generic and the only useful information was the “traceId”.
StockItemsNok500Controller error page
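The payload is the standard ASP.NET Core ProblemDetails response, roughly the following (values illustrative, the traceId is the only part that varies per request):
{
    "type": "https://tools.ietf.org/html/rfc7231#section-6.6.1",
    "title": "An error occurred while processing your request.",
    "status": 500,
    "traceId": "00-<trace-id>-<span-id>-00"
}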
[Route("api/[controller]")]
[ApiController]
public class StockItemsNokExceptionController : ControllerBase
{
private readonly string connectionString;
public StockItemsNokExceptionController(IConfiguration configuration)
{
this.connectionString = configuration.GetConnectionString("WorldWideImportersDatabase");
}
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetException()
{
IEnumerable<Model.StockItemListDtoV1> response = null;
using (SqlConnection db = new SqlConnection(this.connectionString))
{
response = await db.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItemsException]", commandType: CommandType.Text);
}
return this.Ok(response);
}
}
In “Development” mode the information returned to the caller contains a detailed stack trace that reveals implementation details which are useful for debugging but would also be useful to an attacker.
Developer StockItemsNok Controller Exception page
When not in “Development” mode no additional information is returned (not even a TraceId).
Production StockItemsNok500Controller Exception
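The difference in behaviour comes from the exception-handling middleware: the developer exception page is only enabled in Development, and until an exception handler is wired up the production pipeline returns nothing. The ErrorController introduced below is hooked in along these lines (a sketch, not the sample’s actual Startup.Configure):
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage(); // detailed stack traces for developers only
    }
    else
    {
        app.UseExceptionHandler("/error"); // re-executes the request against the ErrorController's /error route
    }
    ...
}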
The diagnostic stack trace information logged by the two different controllers was essentially the same.
System.Data.SqlClient.SqlException:
at System.Data.SqlClient.SqlCommand+<>c.<ExecuteDbDataReaderAsync>b__126_0 (System.Data.SqlClient, Version=4.6.1.3, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a)
at System.Threading.Tasks.ContinuationResultTaskFromResultTask`2.InnerInvoke (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Threading.Tasks.Task+<>c.<.cctor>b__272_0 (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Threading.ExecutionContext.RunInternal (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Threading.ExecutionContext.RunInternal (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Threading.Tasks.Task.ExecuteWithThreadLocal (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at Dapper.SqlMapper+<QueryAsync>d__33`1.MoveNext (Dapper, Version=2.0.0.0, Culture=neutral, PublicKeyToken=null: /_/Dapper/SqlMapper.Async.cs:418)
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at Polly.Retry.AsyncRetryEngine+<ImplementationAsync>d__0`1.MoveNext (Polly, Version=7.0.0.0, Culture=neutral, PublicKeyToken=c8a3ffc3f8f825cc)
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1+ConfiguredTaskAwaiter.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at Polly.AsyncPolicy+<ExecuteAsync>d__21`1.MoveNext (Polly, Version=7.0.0.0, Culture=neutral, PublicKeyToken=c8a3ffc3f8f825cc)
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult (System.Private.CoreLib, Version=6.0.0.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e)
at devMobile.WebAPIDapper.Lists.Controllers.StockItemsNokController+<Get500>d__4.MoveNext (ListsClassic, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null: C:\Users\BrynLewis\source\repos\WebAPIDapper\Lists\Controllers\14.StockItemsNokController.cs:70)
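The stack trace also shows what QueryWithRetryAsync is doing under the covers, Dapper’s QueryAsync wrapped in a Polly retry policy. A minimal sketch of such an extension method (the policy values are illustrative, not the sample’s actual settings):
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;

using Dapper;
using Polly;
using Polly.Retry;

public static class DapperRetryExtensions
{
    // Illustrative policy - the sample's real policy and transient-error filtering are not shown in the post
    private static readonly AsyncRetryPolicy retryPolicy = Policy
        .Handle<SqlException>()
        .WaitAndRetryAsync(3, retryAttempt => TimeSpan.FromSeconds(retryAttempt));

    public static Task<IEnumerable<T>> QueryWithRetryAsync<T>(this SqlConnection connection, string sql, object param = null, CommandType? commandType = null)
    {
        return retryPolicy.ExecuteAsync(() => connection.QueryAsync<T>(sql, param, commandType: commandType));
    }
}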
One customer wanted their client application to display a corporate help desk number for staff to call for support. This information was made configurable.
namespace devMobile.WebAPIDapper.Lists
{
public class ErrorHandlerSettings
{
public string Detail { get; set; } = "devMobile Lists Classic API failure";
public string Title { get; set; } = "System Error";
}
}
{
...
},
"ErrorHandlerSettings": {
"Title": "Webpage has died",
"Detail": "Something has gone wrong call the help desk on 0800-RebootIt"
},
...
}
namespace devMobile.WebAPIDapper.Lists.Controllers
{
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Options;
[ApiController]
public class ErrorController : Controller
{
private readonly ErrorHandlerSettings errorHandlerSettings;
public ErrorController(IOptions<ErrorHandlerSettings> errorHandlerSettings)
{
this.errorHandlerSettings = errorHandlerSettings.Value;
}
[Route("/error")]
public IActionResult HandleError([FromServices] IHostEnvironment hostEnvironment)
{
return Problem(detail: errorHandlerSettings.Detail, title: errorHandlerSettings.Title);
}
}
}
StockItemsNok Controller Error page with configurable title and details
Another customer wanted their client application to display a corporate help desk number based on the source hostname.
ClientA.SaasApplicationProvider.co.nz
ClientB.SaasApplicationProvider.co.nz
ClientC.SaasApplicationProvider.co.nz
SaasApplication.ClientD.co.nz
This information was also made configurable.
namespace devMobile.WebAPIDapper.Lists
{
using System.Collections.Generic;
public class UrlSpecificSetting
{
public string Title { get; set; } = "";
public string Detail { get; set; } = "";
public UrlSpecificSetting()
{
}
public UrlSpecificSetting(string title, string detail)
{
this.Title = title;
this.Detail = detail;
}
}
public class ErrorHandlerSettings
{
public string Title { get; set; } = "System Error";
public string Detail { get; set; } = "devMobile Lists Classic API failure";
public Dictionary<string, UrlSpecificSetting> UrlSpecificSettings { get; set; }
public ErrorHandlerSettings()
{
}
public ErrorHandlerSettings(string title, string detail, Dictionary<string, UrlSpecificSetting> urlSpecificSettings )
{
Title = title;
Detail = detail;
UrlSpecificSettings = urlSpecificSettings;
}
}
}
We considered storing the title and details message in the database but that approach was discounted as we wanted to minimise dependencies.
{
...
"ErrorHandlerSettings": {
"Detail": "Default detail",
"Title": "Default title",
"UrlSpecificSettings": {
"localhost": {
"Title": "Title for localhost",
"Detail": "Detail for localhost"
},
"127.0.0.1": {
"Title": "Title for 127.0.0.1",
"Detail": "Detail for 127.0.0.1"
}
}
}
}
namespace devMobile.WebAPIDapper.Lists.Controllers
{
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Options;
[ApiController]
public class ErrorController : Controller
{
private readonly ErrorHandlerSettings errorHandlerSettings;
public ErrorController(IOptions<ErrorHandlerSettings> errorHandlerSettings)
{
this.errorHandlerSettings = errorHandlerSettings.Value;
}
[Route("/error")]
public IActionResult HandleError([FromServices] IHostEnvironment hostEnvironment)
{
if (!this.errorHandlerSettings.UrlSpecificSettings.ContainsKey(this.Request.Host.Host))
{
return Problem(detail: errorHandlerSettings.Detail, title: errorHandlerSettings.Title);
}
return Problem(detail: errorHandlerSettings.UrlSpecificSettings[this.Request.Host.Host].Detail, title: errorHandlerSettings.UrlSpecificSettings[this.Request.Host.Host].Title);
}
}
}
The sample configuration has custom title and details text for localhost and 127.0.0.1 with a default title and details text for all other hostnames.
StockItemsNok Controller Error page with 127.0.0.1 specific title and details
StockItemsNok Controller Error page with localhost specific title and details
One customer had a staff member who would take a photo of the client application error page with their mobile and email it to us, which made it really easy to track down issues. This was especially useful as they were in an awkward timezone.
Application Insights TraceId search
Application Insights TraceId search result with exception details
With a customisable error page my approach with the outer try/catch has limited benefit and just adds complexity.
This post builds on my Smartish Edge Camera - Azure IoT Direct Methods post, adding two updateable properties, the due and period values, for the image capture and processing timer. The two properties can be updated together or independently but the values are not persisted.
When I was searching for answers I found this code in many posts and articles but it didn’t really cover my scenario.
private static async Task OnDesiredPropertyChanged(TwinCollection desiredProperties,
object userContext)
{
Console.WriteLine("desired property chPleange:");
Console.WriteLine(JsonConvert.SerializeObject(desiredProperties));
Console.WriteLine("Sending current time as reported property");
TwinCollection reportedProperties = new TwinCollection
{
["DateTimeLastDesiredPropertyChangeReceived"] = DateTime.Now
};
await Client.UpdateReportedPropertiesAsync(reportedProperties).ConfigureAwait(false);
}
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
_logger.LogInformation("Azure IoT Smart Edge Camera Service starting");
try
{
#if AZURE_IOT_HUB_CONNECTION
_deviceClient = await AzureIoTHubConnection();
#endif
#if AZURE_IOT_HUB_DPS_CONNECTION
_deviceClient = await AzureIoTHubDpsConnection();
#endif
#if AZURE_DEVICE_PROPERTIES
_logger.LogTrace("ReportedPropeties upload start");
TwinCollection reportedProperties = new TwinCollection();
reportedProperties["OSVersion"] = Environment.OSVersion.VersionString;
reportedProperties["MachineName"] = Environment.MachineName;
reportedProperties["ApplicationVersion"] = Assembly.GetAssembly(typeof(Program)).GetName().Version;
reportedProperties["ImageTimerDue"] = _applicationSettings.ImageTimerDue;
reportedProperties["ImageTimerPeriod"] = _applicationSettings.ImageTimerPeriod;
reportedProperties["YoloV5ModelPath"] = _applicationSettings.YoloV5ModelPath;
reportedProperties["PredictionScoreThreshold"] = _applicationSettings.PredictionScoreThreshold;
reportedProperties["PredictionLabelsOfInterest"] = _applicationSettings.PredictionLabelsOfInterest;
reportedProperties["PredictionLabelsMinimum"] = _applicationSettings.PredictionLabelsMinimum;
await _deviceClient.UpdateReportedPropertiesAsync(reportedProperties, stoppingToken);
_logger.LogTrace("ReportedPropeties upload done");
#endif
_logger.LogTrace("YoloV5 model setup start");
_scorer = new YoloScorer<YoloCocoP5Model>(_applicationSettings.YoloV5ModelPath);
_logger.LogTrace("YoloV5 model setup done");
_ImageUpdatetimer = new Timer(ImageUpdateTimerCallback, null, _applicationSettings.ImageTimerDue, _applicationSettings.ImageTimerPeriod);
await _deviceClient.SetMethodHandlerAsync("ImageTimerStart", ImageTimerStartHandler, null);
await _deviceClient.SetMethodHandlerAsync("ImageTimerStop", ImageTimerStopHandler, null);
await _deviceClient.SetMethodDefaultHandlerAsync(DefaultHandler, null);
await _deviceClient.SetDesiredPropertyUpdateCallbackAsync(OnDesiredPropertyChangedAsync, null);
try
{
await Task.Delay(Timeout.Infinite, stoppingToken);
}
catch (TaskCanceledException)
{
_logger.LogInformation("Application shutown requested");
}
}
catch (Exception ex)
{
_logger.LogError(ex, "Application startup failure");
}
finally
{
_deviceClient?.Dispose();
}
_logger.LogInformation("Azure IoT Smart Edge Camera Service shutdown");
}
// Lots of other code here
private async Task OnDesiredPropertyChangedAsync(TwinCollection desiredProperties, object userContext)
{
TwinCollection reportedProperties = new TwinCollection();
_logger.LogInformation("OnDesiredPropertyChanged handler");
// NB - This approach does not persist ImageTimerDue or ImageTimerPeriod, so a stop/start returns to the appsettings.json configuration values. If only
// one parameter is specified the other defaults to the appsettings.json value. If the timer settings are changed I think they won't take
// effect until the next time the Timer fires.
try
{
// Check to see if either of ImageTimerDue or ImageTimerPeriod has changed
if (!desiredProperties.Contains("ImageTimerDue") && !desiredProperties.Contains("ImageTimerPeriod"))
{
_logger.LogInformation("OnDesiredPropertyChanged neither ImageTimerDue or ImageTimerPeriod present");
return;
}
TimeSpan imageTimerDue = _applicationSettings.ImageTimerDue;
// Check that format of ImageTimerDue valid if present
if (desiredProperties.Contains("ImageTimerDue"))
{
if (TimeSpan.TryParse(desiredProperties["ImageTimerDue"].Value, out imageTimerDue))
{
reportedProperties["ImageTimerDue"] = imageTimerDue;
}
else
{
_logger.LogInformation("OnDesiredPropertyChanged ImageTimerDue invalid");
return;
}
}
TimeSpan imageTimerPeriod = _applicationSettings.ImageTimerPeriod;
// Check that format of ImageTimerPeriod valid if present
if (desiredProperties.Contains("ImageTimerPeriod"))
{
if (TimeSpan.TryParse(desiredProperties["ImageTimerPeriod"].Value, out imageTimerPeriod))
{
reportedProperties["ImageTimerPeriod"] = imageTimerPeriod;
}
else
{
_logger.LogInformation("OnDesiredPropertyChanged ImageTimerPeriod invalid");
return;
}
}
_logger.LogInformation("Desired Due:{0} Period:{1}", imageTimerDue, imageTimerPeriod);
if (!_ImageUpdatetimer.Change(imageTimerDue, imageTimerPeriod))
{
_logger.LogInformation("Desired Due:{0} Period:{1} failed", imageTimerDue, imageTimerPeriod);
}
await _deviceClient.UpdateReportedPropertiesAsync(reportedProperties);
}
catch (Exception ex)
{
_logger.LogError(ex, "OnDesiredPropertyChangedAsync handler failed");
}
}
The TwinCollection desiredProperties is checked for the ImageTimerDue and ImageTimerPeriod properties and, if either is present and valid, the Timer.Change method is called.
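For reference, the desired properties update the device receives is just a fragment of the device twin, roughly the following (the TimeSpan values are illustrative):
"properties": {
    "desired": {
        "ImageTimerDue": "00:00:30",
        "ImageTimerPeriod": "00:05:00"
    }
}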
Azure IoT Central SmartEdgeCamera Device template capabilities
I added a View to the template so the two properties could be changed (I didn’t configure either as required)
Azure IoT Central SmartEdgeCamera Device Default view designer
In the “Device Properties” “Operation” tab, when I changed the ImageTimerDue and/or ImageTimerPeriod there was visual feedback that an update was in progress.
Azure IoT Central SmartEdgeCamera Device Properties update start