Swarm Space – FromDevice with webhooks

I modified my TTI V3 Connector Azure Storage Queues project, which uses Azure Functions HTTP Triggers to put messages into Azure Storage Queues, to process Swarm FromDevice webhook messages.

First step was to configure a webhook with the Swarm dashboard

Swarm dashboard webhooks configuration

I configured the webhook to “acknowledge messages on successful delivery”, then checked my configuration with a couple of “Test” messages.

Swarm dashboard webhook configuration

The Swagger API documentation has methods for configuring endpoints which can be called by an application.

Swagger API Documentation for managing endpoints

I queued a couple of messages on my Satellite Transceiver Breakout and, shortly after the next satellite passed overhead, they were visible in the Swarm Dashboard Messages tab.

Swarm Dashboard with test and live fromdevice messages

The messages were also delivered to an Azure Storage Queue, and I could view them with Azure Storage Explorer.

Azure Storage Explorer displaying a webhook message payload
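
For reference, below is a minimal sketch of an HTTP triggered Azure Function that writes the webhook payload to an Azure Storage Queue, using the in-process Azure Functions programming model. The function name, route, queue name and authorisation level are placeholders rather than the values used in my TTI V3 Connector derived project.

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class SwarmFromDeviceWebhook
{
    [FunctionName("SwarmFromDevice")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "swarm/fromdevice")] HttpRequest request,
        [Queue("swarm-fromdevice", Connection = "AzureWebJobsStorage")] IAsyncCollector<string> fromDeviceQueue,
        ILogger log)
    {
        // Read the raw webhook payload and put it onto the storage queue for later processing
        string payload = await new StreamReader(request.Body).ReadToEndAsync();

        await fromDeviceQueue.AddAsync(payload);

        log.LogInformation("Swarm FromDevice webhook message queued");

        // Returning a success status code lets the Swarm dashboard “acknowledge messages on successful delivery”
        return new OkResult();
    }
}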

Swarm Space – Azure IoT Basic Client

To figure out how to poll the Swarm Hive API I have built yet another “nasty” Proof of Concept (PoC) which gets ToDevice and FromDevice messages. Initially I have focused on polling as the volume of messages from my single device is pretty low (WebHooks will be covered in a future post).

Like my Azure IoT The Things Industries connector, I use Alastair Crabtree’s LazyCache to store Azure IoT Hub DeviceClient instances.

NOTE: Swarm Space technical support clarified the parameter values required to get FromDevice and ToDevice messages using the Bumblebee Hive API.

Swarm API Docs messages functionality

The Messages Get method has a lot of parameters for filtering and paging the response message lists. Many of the parameters have default values so can be null or left blank.

Swarm API Get User Message filters

I started off by seeing if I could duplicate the functionality of the user interface and get a list of all ToDevice and FromDevice messages.

Swarm Dashboard messages list

I first called the Messages Get method with the direction set to “fromdevice” (oddly this is a string rather than an enumeration) and the messages I had sent from my SparkFun Satellite Transceiver Breakout – Swarm M138 were displayed.

Swarm API Docs displaying “fromdevice” messages

I then called the Messages Get method with the direction set to “all” and only the FromDevice messages were displayed, which I wasn’t expecting.

Swarm API Docs displaying ToDevice and FromDevices messages

I then called the Messages Get method with the direction set to “todevice” and no messages were displayed, which I wasn’t expecting.

Swarm API Docs displaying “todevice” messages

I then called the Message Get method with the messageId of a ToDevice message and the detailed message information was displayed.

Swarm API Docs displaying the details of a specific inbound message

For testing I configured 5 devices (one real device, the others simulated) in my Azure IoT Hub with the Swarm Device ID used as the Azure IoT Hub device ID.

Devices configured in Azure IoT Hub

My console application calls the Swarm Bumblebee Hive API Login method, then uses the Azure IoT Hub DeviceClient SendEventAsync method to upload device telemetry.

Nasty console application processing the three “fromdevice” messages which have not been acknowledged.

The console application stores the Swarm Hive API username, password and the Azure IoT Hub connection string locally using the UserSecretsConfigurationExtension.

internal class Program
{
    private static string AzureIoTHubConnectionString = "";
    private readonly static IAppCache _DeviceClients = new CachingService();

    static async Task Main(string[] args)
    {
        Debug.WriteLine("devMobile.SwarmSpace.Hive.AzureIoTHubBasicClient starting");

        IConfiguration configuration = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("appsettings.json")
            .AddUserSecrets("b4073481-67e9-41bd-bf98-7d2029a0b391").Build();

        AzureIoTHubConnectionString = configuration.GetConnectionString("AzureIoTHub");

        using (HttpClient httpClient = new HttpClient())
        {
            BumblebeeHiveClient.Client client = new BumblebeeHiveClient.Client(httpClient);

            client.BaseUrl = configuration.GetRequiredSection("SwarmConnection").GetRequiredSection("BaseURL").Value;

            BumblebeeHiveClient.LoginForm loginForm = new BumblebeeHiveClient.LoginForm();

            loginForm.Username = configuration.GetRequiredSection("SwarmConnection").GetRequiredSection("UserName").Value;
            loginForm.Password = configuration.GetRequiredSection("SwarmConnection").GetRequiredSection("Password").Value;

            BumblebeeHiveClient.Response response = await client.PostLoginAsync(loginForm);

            Debug.WriteLine($"Token :{response.Token[..5]}.....{response.Token[^5..]}");

            string apiKey = "bearer " + response.Token;
            httpClient.DefaultRequestHeaders.Add("Authorization", apiKey);

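            // Get the list of Swarm devices visible to this account so a DeviceClient can be opened (and cached) for each one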
            var devices = await client.GetDevicesAsync(null, null, null, null, null, null, null, null, null);

            foreach (BumblebeeHiveClient.Device device in devices)
            {
                Debug.WriteLine($" Id:{device.DeviceId} Name:{device.DeviceName} Type:{device.DeviceType} Organisation:{device.OrganizationId}");

                DeviceClient deviceClient = await _DeviceClients.GetOrAddAsync<DeviceClient>(device.DeviceId.ToString(), (ICacheEntry x) => IoTHubConnectAsync(device.DeviceId.ToString()), memoryCacheEntryOptions);
            }

            foreach (BumblebeeHiveClient.Device device in devices)
            {
                DeviceClient deviceClient = await _DeviceClients.GetAsync<DeviceClient>(device.DeviceId.ToString());

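                // Retrieve this device's messages then upload each one to the Azure IoT Hub as telemetry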
                var messages = await client.GetMessagesAsync(null, null, null, device.DeviceId.ToString(), null, null, null, null, null, null, "all", null, null);
                foreach (var message in messages)
                {
                    Debug.WriteLine($" PacketId:{message.PacketId} Status:{message.Status} Direction:{message.Direction} Length:{message.Len} Data: {BitConverter.ToString(message.Data)}");

                    JObject telemetryEvent = new JObject
                    {
                        { "DeviceID", device.DeviceId },
                        { "ReceivedAtUtc", DateTime.UtcNow.ToString("s", CultureInfo.InvariantCulture) },
                    };

                    telemetryEvent.Add("Payload",BitConverter.ToString(message.Data));

                    using (Message telemetryMessage = new Message(Encoding.ASCII.GetBytes(JsonConvert.SerializeObject(telemetryEvent))))
                    {
                        telemetryMessage.Properties.Add("iothub-creation-time-utc", message.HiveRxTime.ToString("s", CultureInfo.InvariantCulture));

                        await deviceClient.SendEventAsync(telemetryMessage);
                    };

                    //BumblebeeHiveClient.PacketPostReturn packetPostReturn = await client.AckRxMessageAsync(message.PacketId, null);
                }
            }

            foreach (BumblebeeHiveClient.Device device in devices)
            {
                DeviceClient deviceClient = await _DeviceClients.GetAsync<DeviceClient>(device.DeviceId.ToString());

                await deviceClient.CloseAsync();
            }
        }
    }

    private static async Task<DeviceClient> IoTHubConnectAsync(string deviceId)
    {
        DeviceClient deviceClient;

        deviceClient = DeviceClient.CreateFromConnectionString(AzureIoTHubConnectionString, deviceId, TransportSettings);

        await deviceClient.OpenAsync();

        return deviceClient;
    }

    private static readonly MemoryCacheEntryOptions memoryCacheEntryOptions = new MemoryCacheEntryOptions()
    {
        Priority = CacheItemPriority.NeverRemove
    };

    private static readonly ITransportSettings[] TransportSettings = new ITransportSettings[]
    {
        new AmqpTransportSettings(TransportType.Amqp_Tcp_Only)
        {
            AmqpConnectionPoolSettings = new AmqpConnectionPoolSettings()
            {
                Pooling = true,
            }
        }
    };
}
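
The configuration the console application reads is not shown above; with the UserSecretsConfigurationExtension the secrets file (and/or appsettings.json) needs roughly the following shape. The values are placeholders; the key names are taken from the configuration lookups in the code above.

{
  "ConnectionStrings": {
    "AzureIoTHub": "..."
  },
  "SwarmConnection": {
    "BaseURL": "...",
    "UserName": "...",
    "Password": "..."
  }
}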

While testing I disabled the message RxAck functionality so I could repeatedly call the Messages Get method without having to send new messages and burn through my 50 free messages.

Azure IoT Explorer telemetry displaying the three messages processed by my console application.

Updated parameters based on feedback from Swarm technical support: the status parameter needs to be set to -1.

.NET Core web API + Dapper – Authorisation of Data Access

The theme of this post is controlling users’ ability to read and write rows in a table. The best scenario I could come up with using the World Wide Importers database was around controlling access to Customer information.

This would be a representative set of “project requirements”…

  • Salespeople tend to look after categories of Customers
    • Kayla – Novelty Shops
    • Hudson – Supermarkets
    • Issabella – Computer Stores
    • Sophia – Gift Stores, Novelty Shops
    • Amy – Corporates
    • Anthony – Novelty Stores
    • Alica – Corporates
    • Stella – Supermarkets
  • But some Salespeople have direct relationships with Customers
    • Kayla – Corporate customers Eric Torres & Cosmina
    • Hudson – Tailspin Toys Head Office
    • Issabella – Tailspin Toys (Sylvanite, MT), Tailspin Toys (Sun River, MT), Tailspin Toys (Sylvanite, MT)
  • No changes to the database which could break the existing solution

In a previous engagement we added tables like CustomerCategoryPerson and CustomerPerson to the database to control read/write access to Customers’ information.

The CustomerCategoryPerson table links the CustomerCategory and Person tables with a flag (IsWritable) which indicates whether the Person can read/write Customer information for all the Customers in a CustomerCategory.

CREATE TABLE [Sales].[CustomerCategoryPerson](
	[CustomerCategoryPersonID] [int] IDENTITY(1,1) NOT NULL,
	[CustomerCategoryId] [int] NOT NULL,
	[PersonId] [int] NOT NULL,
	[IsWritable] [bit] NOT NULL,
	[LastUpdatedBy] [int] NOT NULL,
 CONSTRAINT [PK_CustomerCategoryPerson] PRIMARY KEY CLUSTERED 
(
	[CustomerCategoryPersonID] ASC
)...

The CustomerPerson table links the Customer and Person tables with a flag (IsWritable) which indicates whether a Person can read/write a Customer’s information.

CREATE TABLE [Sales].[CustomerPerson](
	[CustomerPersonId] [int] IDENTITY(1,1) NOT NULL,
	[CustomerID] [int] NOT NULL,
	[PersonId] [int] NOT NULL,
	[IsWritable] [bit] NOT NULL,
	[LastEditedBy] [int] NOT NULL,
 CONSTRAINT [PK_CustomerPerson] PRIMARY KEY CLUSTERED 
(
	[CustomerPersonId] ASC
)...

Users can do “wildcard” searches for Customers, and the result set has to be limited to “their” customers and the customers in the customer categories they are assigned to.

ALTER PROCEDURE [Sales].[CustomersNameSearchUnionV1]
@UserId as int,
@SearchText nvarchar(20),
@MaximumRowsToReturn int
AS
BEGIN
	-- Individual assignment
    SELECT TOP(@MaximumRowsToReturn) [Customers].[CustomerID] as "ID", [Customers].[CustomerName] as "Name", [Customers].[IsOnCreditHold] as "IsOnCreditHold"
    FROM Sales.Customers
	INNER JOIN [Sales].[CustomerPerson] ON ([Sales].[Customers].[CustomerId] = [Sales].[CustomerPerson].[CustomerId])
    WHERE ((CustomerName LIKE N'%' + @SearchText + N'%')
		AND ([Sales].[CustomerPerson].PersonId = @UserId))
    --ORDER BY [CustomerName]

	UNION 
	
	-- group assignment
   SELECT TOP(@MaximumRowsToReturn) [Customers].[CustomerID] as "ID", [Customers].[CustomerName] as "Name", [Customers].[IsOnCreditHold] as "IsOnCreditHold"
   FROM [Sales].[Customers]
      INNER JOIN [Sales].[CustomerCategories] ON ([Sales].[Customers].[CustomerCategoryID] = [Sales].[CustomerCategories].[CustomerCategoryID])
      INNER JOIN [Sales].[CustomerCategoryPerson] ON ([Sales].[Customers].[CustomerCategoryID] = [CustomerCategoryPerson].[CustomerCategoryID])
    WHERE ((CustomerName LIKE N'%' + @SearchText + N'%')
		AND ([Sales].[CustomerCategoryPerson].PersonId = @UserId))

END;
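
A controller action calling this search stored procedure with Dapper might look like the following sketch. It follows the same conventions as the Customer controller methods later in this post; the route, role names and maximum row count are assumptions, not the project’s actual values.

[HttpGet("search"), Authorize(Roles = "SalesPerson,SalesAdministrator")]
[ProducesResponseType(StatusCodes.Status200OK, Type = typeof(List<Models.CustomerListDtoV1>))]
public async Task<ActionResult<IEnumerable<Models.CustomerListDtoV1>>> Search([FromQuery] string searchText)
{
    IEnumerable<Models.CustomerListDtoV1> response;

    using (SqlConnection db = new SqlConnection(this.connectionString))
    {
        // Parameter names match the stored procedure; the UserId comes from the bearer token
        response = await db.QueryWithRetryAsync<Models.CustomerListDtoV1>(
            sql: "[Sales].[CustomersNameSearchUnionV1]",
            param: new { UserId = HttpContext.PersonId(), SearchText = searchText, MaximumRowsToReturn = 100 },
            commandType: CommandType.StoredProcedure);
    }

    return this.Ok(response);
}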

This approach increases the complexity and reduces the maintainability of stored procedures which have to control the reading/writing of Customer information. Several times I have extracted the customer information read/write controls into a couple of database views, one for controlling read access.

CREATE VIEW [Sales].[CustomerPersonReadV1]
AS
-- Individual assignment
   SELECT [Sales].[Customers].[CustomerID], [Sales].[CustomerPerson].[PersonID], [Sales].[Customers].[CustomerCategoryID]
   FROM [Sales].[Customers]
      INNER JOIN [Sales].[CustomerPerson] ON ( [Sales].[Customers].[CustomerID] = CustomerPerson.CustomerID)

UNION -- Takes care of duplicates

-- Group assignment
   SELECT [Sales].[Customers].[CustomerID], [Sales].[CustomerCategoryPerson].[PersonID], [Sales].[Customers].[CustomerCategoryID]
   FROM [Sales].[Customers]
      --INNER JOIN [Sales].[CustomerCategories] ON ([Sales].[Customers].[CustomerCategoryID] = [Sales].[CustomerCategories].[CustomerCategoryID])
      INNER JOIN [Sales].[CustomerCategoryPerson] ON ([Sales].[Customers].[CustomerCategoryID] = [CustomerCategoryPerson].[CustomerCategoryID])

The other database view controls write access.

CREATE VIEW [Sales].[CustomerPersonWriteV1]
AS
-- Individual assignment
   SELECT [Sales].[Customers].[CustomerID], [Sales].[CustomerPerson].[PersonID], [Sales].[Customers].[CustomerCategoryID]
   FROM [Sales].[Customers]
      INNER JOIN [Sales].[CustomerPerson] ON (([Sales].[Customers].[CustomerID] = [CustomerPerson].[CustomerID]) AND ([Sales].[CustomerPerson].[IsWritable] = 1))

UNION -- Takes care of duplicates

-- Group assignment
   SELECT [Sales].[Customers].[CustomerID], [Sales].[CustomerCategoryPerson].[PersonID], [Sales].[Customers].[CustomerCategoryID]
   FROM [Sales].[Customers]
      INNER JOIN [Sales].[CustomerCategories] ON ([Sales].[Customers].[CustomerCategoryID] = [Sales].[CustomerCategories].[CustomerCategoryID])
      INNER JOIN [Sales].[CustomerCategoryPerson] ON ([Sales].[Customers].[CustomerCategoryID] = [CustomerCategoryPerson].[CustomerCategoryID] AND ([Sales].[CustomerCategoryPerson].[IsWritable] = 1))

The versioning of database views uses the same approach as stored procedures. When a view is updated (the columns returned change, constraints are updated etc.) the version number is incremented. Then we work through the dependencies list, checking and updating the view version used and re-testing.

SQL Server Management Studio displaying objects which depend on the view

These two views are the UNION of the user’s individual and group access permissions (if a user has Write access they also have Read access). This reduces the complexity of stored procedures used for reading from and writing to the Customer table.

CREATE PROCEDURE [Sales].[CustomersListV1]
@UserId as int
AS
BEGIN
SELECT [Customers].[CustomerID] as "ID", [Customers].[CustomerName] as "Name", [Customers].[IsOnCreditHold] as "IsOnCreditHold"
	FROM [Sales].[Customers]
		INNER JOIN [Sales].[CustomerPersonReadV1] ON ([Sales].[Customers].[CustomerID] = [Sales].[CustomerPersonReadV1].CustomerID)
    WHERE ([Sales].[CustomerPersonReadV1].PersonId = @UserId)
	ORDER BY Name
END

The GET method of the Customer controller returns a list of all the Customers the current user has read access to via their individual and group assignments.

[HttpGet(), Authorize(Roles = "SalesPerson,SalesAdministrator")]
[ProducesResponseType(StatusCodes.Status200OK, Type = typeof(List<Models.CustomerListDtoV1>))]
public async Task<ActionResult<IEnumerable<Models.CustomerListDtoV1>>> Get()
{
      IEnumerable<Models.CustomerListDtoV1> response;

      using (SqlConnection db = new SqlConnection(this.connectionString))
      {
         response = await db.QueryWithRetryAsync<Models.CustomerListDtoV1>(sql: "[Sales].[CustomersListV1]", param: new { userId = HttpContext.PersonId() }, commandType: CommandType.StoredProcedure);
      }

   return this.Ok(response);
}
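
The HttpContext.PersonId() helper used here (and in the other controller methods) is not shown in this post. A minimal sketch, assuming the PersonID is carried in the PrimarySid claim the way the authentication post in this series issues it, would be:

public static class HttpContextExtensions
{
    // Returns the PersonID stored in the PrimarySid claim when the bearer token was issued
    public static int PersonId(this HttpContext context)
    {
        Claim claim = context.User.FindFirst(ClaimTypes.PrimarySid);

        return int.Parse(claim.Value, CultureInfo.InvariantCulture);
    }
}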

The CustomerPersonWriteV1 view is used to stop users without IsWritable set (individually or via a group) from updating a Customer’s IsOnCreditHold flag.

CREATE PROCEDURE [Sales].[CustomerCreditHoldStatusUpdateV1]
@UserID int,
@CustomerId int,
@IsOnCreditHold Bit
AS
BEGIN
    UPDATE [Sales].[Customers]
    SET IsOnCreditHold = @IsOnCreditHold, LastEditedBy = @UserID
	FROM [Sales].[Customers]
		INNER JOIN [Sales].[CustomerPersonWriteV1] ON ([Sales].[Customers].[CustomerID] = [Sales].[CustomerPersonWriteV1].CustomerID)
    WHERE (([Sales].[CustomerPersonWriteV1].PersonId = @UserId) 
		AND ([Sales].[Customers].[CustomerID] = @CustomerId )
		AND (IsOnCreditHold <> @IsOnCreditHold))
	
END

The PUT CreditHold method uses a combination of roles (Administrator, SalesAdministrator, SalesPerson) and database views (CustomerPersonWriteV1) to control the updating of customer data.

[HttpPut("{customerId}/CreditStatus", Name ="CreditHold")]
[Authorize(Roles = "Administrator,SalesAdministrator,SalesPerson")]
[ProducesResponseType(StatusCodes.Status200OK)]
[ProducesResponseType(StatusCodes.Status409Conflict)]
public async Task<IActionResult> CustomerCreditHold(int customerId, [FromBody] Models.CustomerCreditHoldUpdateV1 request)
{
    request.UserId = HttpContext.PersonId();
    request.CustomerId = customerId;

    using (SqlConnection db = new SqlConnection(connectionString))
    {
        if (await db.ExecuteWithRetryAsync("[Sales].[CustomerCreditHoldStatusUpdateV1]", param: request, commandType: CommandType.StoredProcedure) != 1)
        {
            logger.LogWarning("Person {UserId} Customer {CustomerId} IsOnCreditHold {IsOnCreditHold} update failed", request.UserId, request.CustomerId, request.IsOnCreditHold);

            return this.Conflict();
        }
    }

    return this.Ok();
}
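
The Models.CustomerCreditHoldUpdateV1 DTO is not shown in this post; based on the stored procedure parameters and the way the controller populates it, a sketch would be something like:

public class CustomerCreditHoldUpdateV1
{
    // Populated from the bearer token by the controller, not by the caller
    [JsonIgnore]
    public int UserId { get; set; }

    // Populated from the route by the controller, not by the caller
    [JsonIgnore]
    public int CustomerId { get; set; }

    [JsonRequired]
    public bool IsOnCreditHold { get; set; }
}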

My customers usually don’t have a lot of automated testing so minimising the impact of changes across the database and codebase is critical. Sometimes we duplicate code (definitely not DRY) so that the amount of functionality that has to be retested is reduced. We ensure there is time allocated for revisiting these decisions and remediating as required.

.NET Core web API + Dapper – Authorisation Permissions

The permissions required for an on-premises system running in a trusted environment are often minimalist. The World Wide Importers database People table has IsSystemUser, IsEmployee, IsSalesperson which is representative of the granularity of permissions I have encountered in Windows Forms .NET, ASP.NET Web Forms and other “legacy” applications.

CREATE TABLE [Application].[People](
    [PersonID] [int] NOT NULL,
    [FullName] [nvarchar](50) NOT NULL,
    [PreferredName] [nvarchar](50) NOT NULL,
    [SearchName] AS (concat([PreferredName],N' ',[FullName])) PERSISTED NOT NULL,
    [IsPermittedToLogon] [bit] NOT NULL,
    [LogonName] [nvarchar](50) NULL,
    [IsExternalLogonProvider] [bit] NOT NULL,
    [HashedPassword] [varbinary](max) NULL,
    [IsSystemUser] [bit] NOT NULL,
    [IsEmployee] [bit] NOT NULL,
    [IsSalesperson] [bit] NOT NULL,
    [UserPreferences] [nvarchar](max) NULL,
    [PhoneNumber] [nvarchar](20) NULL,
    [FaxNumber] [nvarchar](20) NULL,
    [EmailAddress] [nvarchar](256) NULL,
    [Photo] [varbinary](max) NULL,
    [CustomFields] [nvarchar](max) NULL,
    [OtherLanguages] AS (json_query([CustomFields],N'$.OtherLanguages')),
    [LastEditedBy] [int] NOT NULL,
    [ValidFrom] [datetime2](7) GENERATED ALWAYS AS ROW START NOT NULL,
    [ValidTo] [datetime2](7) GENERATED ALWAYS AS ROW END NOT NULL,
CONSTRAINT [PK_Application_People] PRIMARY KEY CLUSTERED
(
    [PersonID] ASC
)

The existing application appears to have a method for a Person to change their password which calls the [Website].[ChangePassword] stored procedure (I was surprised that the stored procedure didn’t set the LastEditedBy value).

CREATE PROCEDURE [Website].[ChangePassword]
@PersonID int,
@OldPassword nvarchar(40),
@NewPassword nvarchar(40)
WITH EXECUTE AS OWNER
AS
BEGIN
    SET NOCOUNT ON;
    SET XACT_ABORT ON;

    UPDATE [Application].People
    SET IsPermittedToLogon = 1,
        HashedPassword = HASHBYTES(N'SHA2_256', @NewPassword + FullName)
    WHERE PersonID = @PersonID
    AND PersonID <> 1
    AND HashedPassword = HASHBYTES(N'SHA2_256', @OldPassword + FullName);

    IF @@ROWCOUNT = 0
    BEGIN
        PRINT N'The PersonID must be valid, and the old password must be valid.';
        PRINT N'If the user has also changed name, please contact the IT staff to assist.';
        THROW 51000, N'Invalid Password Change', 1;
        RETURN -1;
    END;
END;

The new version removes the PersonID special case (it is assumed that PersonID 1 can’t logon) and the use of THROW. I think the use of THROW can add significant complexity to the exception handling of the Web API controller that calls the stored procedure.

ALTER PROCEDURE [Website].[PersonPasswordChangeV1]
	@UserID int,
	@PasswordOld nvarchar(40),
	@PasswordNew nvarchar(40)
WITH EXECUTE AS OWNER
AS
BEGIN
    UPDATE [Application].People
    SET IsPermittedToLogon = 1
        ,HashedPassword = HASHBYTES(N'SHA2_256', @PasswordNew + FullName)
		,LastEditedBy = @UserID
    WHERE ((PersonID = @UserID )
		AND (HashedPassword = HASHBYTES(N'SHA2_256', @PasswordOld + FullName)))
END;

The PasswordChange method of the Person Controller only requires the caller to be authenticated.

/// <summary>
/// Changes current user's password.
/// </summary>
/// <param name="request">Current password and new password</param>
/// <response code="200">Password changed.</response>
/// <response code="401">Unauthorised, bearer token missing or expired.</response>
/// <response code="409">Previous password invalid or User name has changed.</response>
[Authorize()]
[HttpPut(Name = "PasswordChange")]
public async Task<ActionResult> PasswordChange([FromBody] Models.PersonPasswordChangeRequest request)
{
    request.UserID = HttpContext.PersonId();

    using (SqlConnection db = new SqlConnection(connectionString))
    {
        if (await db.ExecuteWithRetryAsync("[WebSite].[PersonPasswordChangeV1]", param: request, commandType: CommandType.StoredProcedure) != 1)
        {
            logger.LogWarning("Person {0} password change failed", request.UserID);

            return this.Conflict();
        }
    }

    return this.Ok();
}
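
The Models.PersonPasswordChangeRequest DTO is not shown; based on the stored procedure parameters a sketch would be roughly the following (the validation attribute constants are assumptions):

public class PersonPasswordChangeRequest
{
    // Set from the bearer token by the controller, not supplied by the caller
    [JsonIgnore]
    public int UserID { get; set; }

    [JsonRequired]
    [MinLength(Constants.PasswordLengthMinimum)]
    [MaxLength(Constants.PasswordLengthMaximum)]
    public string PasswordOld { get; set; }

    [JsonRequired]
    [MinLength(Constants.PasswordLengthMinimum)]
    [MaxLength(Constants.PasswordLengthMaximum)]
    public string PasswordNew { get; set; }
}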

The new application will have functionality for resetting a Person’s password. Access to this functionality will be restricted to people with the “Administrator” and “PasswordReset” roles.

CREATE PROCEDURE [Website].[PersonPasswordResetV1]
@UserID int,
@PersonID int,
@Password nvarchar(40)
WITH EXECUTE AS OWNER
AS
BEGIN
    UPDATE [Application].People
    SET IsPermittedToLogon = 1
        ,HashedPassword = HASHBYTES(N'SHA2_256', @Password + FullName)
		,LastEditedBy = @UserID
    WHERE PersonID = @PersonID
END;

One of the conventions we often use is that the first parameter of any stored procedure called once a User has logged on is their unique identifier, which is used for data access permissions and change tracking.

[Authorize(Roles = "Administrator")]
[HttpPut("{personId:int}", Name = "PasswordReset")]
public async Task<ActionResult> PasswordReset([Range(1, int.MaxValue, ErrorMessage = "Person id must greater than 1")] int personId, [FromBody] Models.PersonPasswordResetRequest request)
{
    request.UserId = HttpContext.PersonId();
    request.PersonID = personId;

    using (SqlConnection db = new SqlConnection(connectionString))
    {
        if (await db.ExecuteWithRetryAsync("[WebSite].[PersonPasswordResetV1]", param: request, commandType: CommandType.StoredProcedure) != 1)
        {
            logger.LogWarning("Person {0} password change failed", request.PersonID);

            return this.Conflict();
        }
    }

    return this.Ok();
}

For a couple of applications, we have added “Permissions” and “PersonPermissions” tables alongside the existing authorisation functionality to reduce the likelihood of any unintended side effects.

CREATE TABLE [Application].[Permissions](
	[PermissionID] [int] IDENTITY(1,1) NOT NULL,
	[Name] [nvarchar](20) NOT NULL,
	[Description] [nvarchar](50) NOT NULL,
	[LastEditedBy] [int] NOT NULL,
	[ValidFrom] [datetime2](7) GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
	[ValidUntil] [datetime2](7) GENERATED ALWAYS AS ROW END HIDDEN NOT NULL,
 CONSTRAINT [PK_Permissions] PRIMARY KEY CLUSTERED 
(
	[PermissionID] ASC
)...

We try and keep the names of the permissions short, so the token doesn’t get too large.

CREATE TABLE [Application].[PersonPermissions](
	[PersonPermissionId] [int] IDENTITY(1,1) NOT NULL,
	[PersonId] [int] NOT NULL,
	[PermissionId] [int] NOT NULL,
	[Active] [bit] NOT NULL,
	[ValidFrom] [datetime2](7) GENERATED ALWAYS AS ROW START HIDDEN NOT NULL,
	[ValidUntil] [datetime2](7) GENERATED ALWAYS AS ROW END HIDDEN NOT NULL,
 CONSTRAINT [PK_PersonPermissions] PRIMARY KEY CLUSTERED 
(
	[PersonPermissionId] ASC
)...

Permissions, PersonPermissions and People with Foreign Keys

The additional permissions (from the Person record) and the PersonPermissions table required some modifications to the PersonAuthenticateLookupByLogonNameV1 stored procedure and the addition of the PersonPermissionsByPersonIdV1 stored procedure.

ALTER PROCEDURE [Website].[PersonAuthenticateLookupByLogonNameV2]
@LogonName nvarchar(50),
@Password nvarchar(40)
AS
BEGIN
	SELECT PersonID
		,FullName
		,EmailAddress
		,IsSystemUser
		,IsEmployee
		,IsSalesPerson
	FROM [Application].[People]
	WHERE (( LogonName = @LogonName)
		AND (IsPermittedToLogon = 1)
		AND (HASHBYTES(N'SHA2_256', @Password + FullName) = HashedPassword))
END

The IsSystemUser, IsEmployee and IsSalesPerson bit flags were added to the stored procedure and the Data Transfer Object (DTO).

private class PersonAuthenticateLogonDetailsDto
{
    public int PersonID { get; set; }    

    public string FullName { get; set; }

    public string EmailAddress { get; set; }

    public bool IsSystemUser { get; set; }

    public bool IsEmployee { get; set; }

    public bool IsSalesPerson { get; set; }
}

The PersonPermissionsByPersonIdV1 retrieves a list of the permissions of the User who has been authenticated.

ALTER PROCEDURE [Website].[PersonPermissionsByPersonIdV1]
	@PersonId AS int
AS
BEGIN

	SELECT [Application].[Permissions].[Name]
	FROM [Application].[Permissions]
		INNER JOIN [Application].[PersonPermissions] ON ([Application].[Permissions].PermissionID = [Application].[PersonPermissions].[PermissionId] )
	WHERE [Application].[PersonPermissions].[PersonId] = @PersonId
	ORDER BY [Application].[Permissions].[Name]

END

The Person’s permissions (effectively roles) are added as claims; the IsSystemUser, IsEmployee and IsSalesPerson flags are also added to the list of claims so they can be used in the new application.

[HttpPost("logon")]
public async Task<ActionResult> Logon([FromBody] Models.LogonRequest request )
{
    PersonAuthenticateLogonDetailsDto userLogonUserDetails;
    IEnumerable<string> permissions;
    var claims = new List<Claim>();

    using (SqlConnection db = new SqlConnection(configuration.GetConnectionString("WorldWideImportersDatabase")))
    {
        userLogonUserDetails = await db.QuerySingleOrDefaultWithRetryAsync<PersonAuthenticateLogonDetailsDto>("[Website].[PersonAuthenticateLookupByLogonNameV2]", param: request, commandType: CommandType.StoredProcedure);
        if (userLogonUserDetails == null)
        {
            logger.LogWarning("Login attempt by user {0} failed", request.LogonName);

           return this.Unauthorized();
        }

        // Lookup the Person's permissions
        permissions = await db.QueryWithRetryAsync<string>("[Website].[PersonPermissionsByPersonIdV1]", new { userLogonUserDetails.PersonID }, commandType: CommandType.StoredProcedure);
    }

    // Setup the primary SID + name info
    claims.Add(new Claim(ClaimTypes.PrimarySid, userLogonUserDetails.PersonID.ToString()));
    if (userLogonUserDetails.IsSystemUser)
    {
       claims.Add(new Claim(ClaimTypes.Role, "SystemUser"));
    }
    if (userLogonUserDetails.IsEmployee)
    {
       claims.Add(new Claim(ClaimTypes.Role, "Employee"));
    }
    if (userLogonUserDetails.IsSalesPerson)
    {
        claims.Add(new Claim(ClaimTypes.Role, "SalesPerson"));
    }

    foreach(string permission in permissions)
    {
        claims.Add(new Claim(ClaimTypes.Role, permission));
    }

    var authSigningKey = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(jwtIssuerOptions.SecretKey));

    var token = new JwtSecurityToken(
         issuer: jwtIssuerOptions.Issuer,
          audience: jwtIssuerOptions.Audience, 
          expires: DateTime.UtcNow.Add(jwtIssuerOptions.TokenExpiresAfter),
          claims: claims,
          signingCredentials: new SigningCredentials(authSigningKey, SecurityAlgorithms.HmacSha256));
                
    return this.Ok(new
   {
        token = new JwtSecurityTokenHandler().WriteToken(token),
        expiration = token.ValidTo,
    });
}

We try to reduce the number of roles a User requires by having core roles (Administrator, Sales consultant, Warehouse administrator etc.) with additional roles for each task added as required (ResetPassword, CustomerIsOnCreditHold Set/Clear etc.).

The Get method of the Authorisation controller returns a list of the User’s Roles which can be used to enable/disable functionality in the user interface.

/// <summary>
/// Gets a list of the current User's roles.
/// </summary>
/// <response code="200">List of claims returned.</response>
/// <response code="401">Unauthorised, bearer token missing or expired.</response>
/// <returns>list of claims.</returns>
[HttpGet]
[ProducesResponseType(StatusCodes.Status200OK, Type = typeof(List<string>))]
public List<string> Get()
{
    List<string> claimNames = new List<string>();

    foreach (var claim in this.User.Claims.Where(c => c.Type == ClaimTypes.Role))
    {
        claimNames.Add(claim.Value);
    }

    return claimNames;
}

We have found this approach to be a robust way to add granular authorisation for new functionality to a “legacy” application without breaking the existing solution. I have ignored a user being disabled after a number of failed logins, password complexity rules etc. as these tend to be application specific and not really related to the use of Dapper.

Most blog posts talk about building “green fields” applications; I have found hardly any cover “muddy fields” development where you have to deal with “legacy” code.

Not all “legacy” code is bad; I work on one code base which is nearly 20 years old. It started as a spreadsheet plug-in and has grown over time into a SaaS application. There is very little of the original code left; it has just been carefully refactored over the years with time allocated to chip away at technical debt.

.NET Core web API + Dapper – Authentication

A couple of recent contracts have been maintaining and “remediating” legacy codebases which have been in production for up to a decade. The applications are delivering business value (they can’t stop working) and the customers’ budgets are limited (they can only afford incremental change). As a result we end up making tactical decisions to keep the application working and longer-term ones to improve the “ilities” (taking into account the customer’s priorities).

It is rare to have a “green fields” project, so my plan was to use the next couple of WebAPI + Dapper posts to illustrate the sort of challenges we have encountered.

Customer: “we want to put a nice webby frontend on our existing bespoke solution” (“putting lipstick on a pig”).

Customer: The user’s login details are stored in the database and we can’t change the login process as the help desk won’t cope.

The remediation of Authentication and Authorisation (A&A) functionality can be particularly painful and is often driven by compliance issues e.g. EU GDPR, the Privacy Act 2020 etc.

The World Wide Importers database would be a representative example of databases we have worked with.

CREATE TABLE [Application].[People](
	[PersonID] [int] NOT NULL,
	[FullName] [nvarchar](50) NOT NULL,
	[PreferredName] [nvarchar](50) NOT NULL,
	[SearchName]  AS (concat([PreferredName],N' ',[FullName])) PERSISTED NOT NULL,
	[IsPermittedToLogon] [bit] NOT NULL,
	[LogonName] [nvarchar](50) NULL,
	[IsExternalLogonProvider] [bit] NOT NULL,
	[HashedPassword] [varbinary](max) NULL,
	[IsSystemUser] [bit] NOT NULL,
	[IsEmployee] [bit] NOT NULL,
	[IsSalesperson] [bit] NOT NULL,
	[UserPreferences] [nvarchar](max) NULL,
	[PhoneNumber] [nvarchar](20) NULL,
	[FaxNumber] [nvarchar](20) NULL,
	[EmailAddress] [nvarchar](256) NULL,
	[Photo] [varbinary](max) NULL,
	[CustomFields] [nvarchar](max) NULL,
	[OtherLanguages]  AS (json_query([CustomFields],N'$.OtherLanguages')),
	[LastEditedBy] [int] NOT NULL,
	[ValidFrom] [datetime2](7) GENERATED ALWAYS AS ROW START NOT NULL,
	[ValidTo] [datetime2](7) GENERATED ALWAYS AS ROW END NOT NULL,
 CONSTRAINT [PK_Application_People] PRIMARY KEY CLUSTERED 
(
	[PersonID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [USERDATA],
	PERIOD FOR SYSTEM_TIME ([ValidFrom], [ValidTo])
) ON [USERDATA] TEXTIMAGE_ON [USERDATA]
WITH
(
SYSTEM_VERSIONING = ON (HISTORY_TABLE = [Application].[People_Archive])
)

Initial observations

  • The LogonName column doesn’t have an index and uniqueness is not enforced which is a bit odd.
  • The table has SYSTEM_VERSIONING enabled so any structural changes are going to be hard work.
  • There are a couple of computed columns, so we need to be careful with any changes.
  • A password hash is a varbinary max column, so we need to figure out how this is generated and updated
  • Surprising number of nullable columns
  • The code associated with IsExternalLogonProvider needs to be investigated.
  • Looks like the granularity of permissions i.e. IsSystemUser, IsEmployee, IsSalesperson is low.
  • The database must be old; there is a FaxNumber column.
  • Looks like there are internal and external (maybe IsSystemUser, IsEmployee, IsSalesperson are all false) people.
  • The PersonId is a Sequence rather than an Identity column which is unusual.
  • IsPermittedToLogon indicates that the logon process might be a bit more complex than expected.
  • The terms Login and Logon appear to be used interchangeably.
  • No lockout after several failed logon attempts, lockout until etc. functionality.
  • No concurrency control (optimistic or pessimistic) for updates (with TimeStamp or Version column) so last update wins.

The next step would be to have a look at the contents of the People table with SQL Server Management Studio(SSMS)

SQL Server Management Studio [Application].[Person] table

Initial observations

  • Looks like the system administrators are in the first couple of rows. (makes Indirect Object Reference Attack easier).
  • Lots of NULL values which often makes application code more complex
  • Duplicates in LoginName column e.g. “NO LOGON”
  • Some “magic” values e.g. “NO LOGON”
  • Why does the “Data Conversion Only” person have a photo?
  • The IsExternalLogonProvider is always false.
  • The UserPreferences, CustomFields and OtherLanguages columns contain JavaScript Object Notation (JSON); need to see how these are generated, updated and used.

The application must have an existing external user provisioning process. It looks like a Person record is created by an Administrator, then the User sets their password on first Logon (I’m not certain if there are any password complexity rules enforced).

An application I worked on didn’t enforce password complexity or minimum length in earlier versions. This caused issues when a number of clients couldn’t logon to the new application because their existing passwords were too short. We updated the logon field validation rules but retained the minimum complexity and length rules on the change and reset password fields. We then forced 5-10% of users per month (so the helpdesk wasn’t overwhelmed by support calls) to update their passwords.

ALTER PROCEDURE [Website].[ActivateWebsiteLogon]
@PersonID int,
@LogonName nvarchar(50),
@InitialPassword nvarchar(40)
WITH EXECUTE AS OWNER
AS
BEGIN
    SET NOCOUNT ON;
    SET XACT_ABORT ON;

    UPDATE [Application].People
    SET IsPermittedToLogon = 1,
        LogonName = @LogonName,
        HashedPassword = HASHBYTES(N'SHA2_256', @InitialPassword + FullName),
        UserPreferences = (SELECT UserPreferences FROM [Application].People WHERE PersonID = 1) -- Person 1 has User Preferences template
    WHERE PersonID = @PersonID
    AND PersonID <> 1
    AND IsPermittedToLogon = 0;

    IF @@ROWCOUNT = 0
    BEGIN
        PRINT N'The PersonID must be valid, must not be person 1, and must not already be enabled';
        THROW 51000, N'Invalid PersonID', 1;
        RETURN -1;
    END;
END;

Initial observations

  • The password hashing uses SHA2_256 which is good
  • The password hash is seeded with the person’s FullName; this could be a problem if a user changes their name.
  • The user selects their LogonName; this needs further investigation as duplicates could be an issue.
  • XACT_ABORT ON + THROW for validation and state management is odd; need to check how SqlExceptions are handled in application code.
  • PersonID magic number handling adds complexity and needs further investigation.
  • EXECUTE AS OWNER caught my attention; I checked and there is only one database user for the application.
  • The LastEditedBy isn’t set to the PersonID which seems a bit odd.

Based on [Website].[ActivateWebsiteLogon] (I couldn’t find a Logon stored procedure, so I also reviewed ChangePassword) this is my first attempt at a stored procedure which validates a user’s LogonName and Password.

CREATE PROCEDURE [Application].[PersonAuthenticateLookupByLogonNameV1]
@LogonName nvarchar(50),
@Password nvarchar(40)
AS
BEGIN
	SELECT PersonID, FullName, EmailAddress
	FROM [Application].[People]
	WHERE (( LogonName = @LogonName)
		AND (IsPermittedToLogon = 1)
		AND (HASHBYTES(N'SHA2_256', @Password + FullName) = HashedPassword))
END
GO

We use the ..VX approach to reduce issues when doing canary and rolling deployments. If the parameter list or return dataset of a stored procedure changes (even if we think it is backwards compatible) the version number is incremented so that different versions of the application can be run concurrently and backing out application updates is less fraught.

The next step was to create a Data Transfer Object (DTO) for the Logon request payload.

public class LogonRequest
{
    [JsonRequired]
    [MinLength(Constants.LogonNameLengthMinimum)]
    [MaxLength(Constants.LogonNameLengthMaximum)]
    public string LogonName { get; set; }

    [JsonRequired]
    [MinLength(Constants.PasswordLengthMinimum)]
    [MaxLength(Constants.PasswordLengthMaximum)]
    public string Password { get; set; }
}

Then a Proof of Concept (PoC) AuthenticationController to process a login request.

Swagger Docs Authentication controller Login

The Logon method calls the PersonAuthenticateLookupByLogonNameV1 stored procedure to validate the LogonName and Password. In this iteration the only claim added to the JSON Web Token (JWT) is the PersonId. We try and keep JWTs as small as possible, as one customer’s application failed randomly because a couple of users’ JWTs were so large (lots of roles) that some versions of browsers choked.

public AuthenticationController(IConfiguration configuration, ILogger<AuthenticationController> logger, IOptions<Model.JwtIssuerOptions> jwtIssuerOptions)
{
    this.configuration = configuration;
    this.logger = logger;
    this.jwtIssuerOptions = jwtIssuerOptions.Value;
 }

[HttpPost("login")]
public async Task<ActionResult> Login([FromBody] Model.LoginRequest request )
{
    var claims = new List<Claim>();

    using (SqlConnection db = new SqlConnection(configuration.GetConnectionString("WorldWideImportersDatabase")))
    {
        UserLogonUserDetailsDto userLogonUserDetails = await db.QuerySingleOrDefaultWithRetryAsync<UserLogonUserDetailsDto>("[Application].[PersonAuthenticateLookupByLogonNameV1]", param: request, commandType: CommandType.StoredProcedure);
        if (userLogonUserDetails == null)
        {
             logger.LogWarning("Login attempt by user {0} failed", request.LogonName);

             return this.Unauthorized();
        }

        // Setup the primary SID + name info
        claims.Add(new Claim(ClaimTypes.PrimarySid, userLogonUserDetails.PersonID.ToString()));
    }

    var authSigningKey = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(jwtIssuerOptions.SecretKey));

    var token = new JwtSecurityToken(
              issuer: jwtIssuerOptions.Issuer,
              audience: jwtIssuerOptions.Audience, 
              expires: DateTime.UtcNow.Add(jwtIssuerOptions.TokenExpiresAfter),
              claims: claims,
              signingCredentials: new SigningCredentials(authSigningKey, SecurityAlgorithms.HmacSha256));
                
    return this.Ok(new
    {
        token = new JwtSecurityTokenHandler().WriteToken(token),
        expiration = token.ValidTo,
    });
}

After a successful logon the Token has to be copied (I regularly miss the first or the last character) from the response payload to the Authorisation form.

The Swagger UI Authentication method after a successful Logon with the bearer token highlighted

I decorated the SystemController DeploymentVersion method with the [Authorize] attribute to force a check that the user is authenticated.

/// <summary>
/// WebAPI controller for handling System Dapper functionality.
/// </summary>
[Route("api/[controller]")]
[ApiController]
public class SystemController : ControllerBase
{
    /// <summary>
    /// Returns the Application version in [Major].[Minor].[Build].Revision] format.
    /// </summary>
    /// <response code="200">List of claims returned.</response>
    /// <response code="401">Unauthorised, bearer token missing or expired.</response>
    /// <returns>Returns the Application version in [Major].[Minor].[Build].Revision] format.</returns>
    [HttpGet("DeploymentVersion"), Authorize]
    public string DeploymentVersion()
    {
        return Assembly.GetExecutingAssembly().GetName().Version.ToString();
    }
}

If the bearer token is missing, invalid (I accidentally didn’t copy either the first or last character) or expired the method call will fail with a 401 Unauthorized error.

Swagger UI System Controller DeploymentVersion method failing because the JWT is missing, invalid or expired

Controlling access to controllers and controller methods is probably not granular enough for most applications, so adding “coarse” and “fine grained” authorisation to an existing application, and the configuration of Swashbuckle and the application request processing middleware to support JWTs, will be covered in a couple of future posts.

For remediation projects we try to keep the code as simple as possible (but no simpler), by minimising the plumbing and only using advanced language features etc. where it adds value.

I’m leaning towards using Dependency Injection for configuration information, so the way connection strings and the jwtIssuerOptions are loaded is going to be harmonised. In a future version of the application the jwtIssuerOptions will be migrated to an Azure Key Vault.
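
The Model.JwtIssuerOptions class injected into the controller is not shown; based on how it is used (SecretKey, Issuer, Audience, TokenExpiresAfter) a sketch bound from configuration might look like this. The property and configuration section names are assumptions.

public class JwtIssuerOptions
{
    public string Issuer { get; set; }

    public string Audience { get; set; }

    // Symmetric key used to sign the tokens; currently user secrets, longer term Azure Key Vault
    public string SecretKey { get; set; }

    public TimeSpan TokenExpiresAfter { get; set; }
}

// In Startup.ConfigureServices
services.Configure<JwtIssuerOptions>(configuration.GetSection("JwtIssuerOptions"));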

.NET nanoFramework RAK11200 – Azure IoT Hub HTTP Power conservation

My test setup was a RAK11200 WisBlock WiFi Module, RAK19007 WisBlock Base Board, RAK1901 WisBlock Temperature and Humidity Sensor and Keweisi KWS-MX19 USB Tester DC 4V-30V 0-5A Current Voltage Detector to measure the power consumption of my test setup.

RAK11200 + RAK19007 +RAK1901+Keweisi KWS-MX19 test setup

The baseline version of the RAK11200 WisBlock WiFi Module software had no power conservation functionality.

public static void Main()
{
    DateTime sasTokenValidUntilUtc = DateTime.UtcNow;

    Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} devMobile.IoT.RAK.Wisblock.AzureIoTHub.RAK11200.PowerConservation starting");

    Configuration.SetPinFunction(Gpio.IO04, DeviceFunction.I2C1_DATA);
    Configuration.SetPinFunction(Gpio.IO05, DeviceFunction.I2C1_CLOCK);

    if (!WifiNetworkHelper.ConnectDhcp(Config.Ssid, Config.Password, requiresDateTime: true))
    {
        if (NetworkHelper.HelperException != null)
        {
             Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} WifiNetworkHelper.ConnectDhcp failed {NetworkHelper.HelperException}");
        }

        Thread.Sleep(Timeout.Infinite);
    }

    string uri = $"{Config.AzureIoTHubHostName}.azure-devices.net/devices/{Config.DeviceID}";

    // not setting Authorization here as it will change as SAS Token refreshed
    HttpClient httpClient = new HttpClient
    {
        SslProtocols = System.Net.Security.SslProtocols.Tls12,
        HttpsAuthentCert = new X509Certificate(Config.DigiCertBaltimoreCyberTrustRoot),
        BaseAddress = new Uri($"https://{uri}/messages/events?api-version=2020-03-13"),
    };

    I2cConnectionSettings settings = new(I2cDeviceBusID, Shtc3.DefaultI2cAddress);
    I2cDevice device = I2cDevice.Create(settings);
    Shtc3 shtc3 = new(device);

    AdcController adcController = new AdcController();
    AdcChannel batteryChargeAdcChannel = adcController.OpenChannel(AdcControllerChannel);

    string sasToken = "";

    while (true)
    {
        DateTime standardisedUtcNow = DateTime.UtcNow;

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub device {Config.DeviceID} telemetry update start");

        if (sasTokenValidUntilUtc <= standardisedUtcNow)
        {
            sasTokenValidUntilUtc = standardisedUtcNow.Add(Config.SasTokenRenewEvery);

            sasToken = SasTokenGenerate(uri, Config.Key, sasTokenValidUntilUtc);

            Debug.WriteLine($" Renewing SAS token for {Config.SasTokenRenewFor} valid until {sasTokenValidUntilUtc:HH:mm:ss dd-MM-yy}");
        }

        if (!shtc3.TryGetTemperatureAndHumidity(out var temperature, out var relativeHumidity))
        {
            Debug.WriteLine($" Temperature and Humidity read failed");

            continue;
        }

        double batteryCharge = batteryChargeAdcChannel.ReadRatio() * 100.0;

        Debug.WriteLine($" Temperature {temperature.DegreesCelsius:F1}°C Humidity {relativeHumidity.Value:F0}% BatteryCharge {batteryCharge:F1}%");

        string payload = $"{{\"RelativeHumidity\":{relativeHumidity.Value:F0},\"Temperature\":{temperature.DegreesCelsius.ToString("F1")}, \"BatteryCharge\":{batteryCharge:F1}}}";

        try
        {
            using (HttpContent content = new StringContent(payload))
            {
                content.Headers.Add("Authorization", sasToken);

                using (HttpResponseMessage response = httpClient.Post("", content))
                {
                    Console.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Response code:{response.StatusCode}");

                    response.EnsureSuccessStatusCode();
                 }
            }
        }
        catch (Exception ex)
        {
            Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub POST failed:{ex.Message} {ex?.InnerException?.Message}");
        }

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub telemetry update done");

        Thread.Sleep(Config.TelemetryUploadInterval);
    }
}
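
The SasTokenGenerate helper called above is not shown; below is a minimal sketch of the standard Azure IoT Hub device SAS token construction. It is written as plain .NET for clarity; the nanoFramework implementation uses the equivalent HMAC-SHA256 and URL-encoding APIs, so treat this as an illustration rather than the project’s actual code.

private static string SasTokenGenerate(string resourceUri, string key, DateTime validUntilUtc)
{
    // Token expiry (se) is seconds since the Unix epoch
    long expiry = (long)(validUntilUtc - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalSeconds;

    string encodedResourceUri = Uri.EscapeDataString(resourceUri);

    // Sign "<url-encoded resource URI>\n<expiry>" with the device's symmetric key
    string stringToSign = encodedResourceUri + "\n" + expiry;

    using (var hmac = new System.Security.Cryptography.HMACSHA256(Convert.FromBase64String(key)))
    {
        string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

        return $"SharedAccessSignature sr={encodedResourceUri}&sig={Uri.EscapeDataString(signature)}&se={expiry}";
    }
}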

When the program was “idle” the current varied between 0.067A and 0.074A, with “spikes” when transmitting.

The second version of the application could be configured to “sleep” the RAK11200 WisBlock WiFi Module and RAK1901 WisBlock Temperature and Humidity Sensor. The RAK11200 WisBlock WiFi Module can be put into a LightSleep or DeepSleep.

public static void Main()
{
    Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} devMobile.IoT.RAK.Wisblock.AzureIoTHub.RAK11200.PowerSleep starting");

    Thread.Sleep(5000);

    try
    {
        Configuration.SetPinFunction(Gpio.IO04, DeviceFunction.I2C1_DATA);
        Configuration.SetPinFunction(Gpio.IO05, DeviceFunction.I2C1_CLOCK);

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Wifi connecting");

        if (!WifiNetworkHelper.ConnectDhcp(Config.Ssid, Config.Password, requiresDateTime: true))
       {
            if (NetworkHelper.HelperException != null)
            {
                 Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} WifiNetworkHelper.ConnectDhcp failed {NetworkHelper.HelperException}");
            }

            Sleep.EnableWakeupByTimer(Config.FailureRetryInterval);
            Sleep.StartDeepSleep();
        }

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Wifi connected");

        // Configure the SHTC3 
        I2cConnectionSettings settings = new(I2cDeviceBusID, Shtc3.DefaultI2cAddress);
        I2cDevice device = I2cDevice.Create(settings);
        Shtc3 shtc3 = new(device);

        // Assuming that if TryGetTemperatureAndHumidity fails accessing temperature or relativeHumidity will cause an exception
        shtc3.TryGetTemperatureAndHumidity(out var temperature, out var relativeHumidity);

#if SLEEP_SHT3C
        shtc3.Sleep();
#endif

        // Configure Analog input (AIN0) port then read the "battery charge"
        AdcController adcController = new AdcController();
        AdcChannel batteryChargeAdcChannel = adcController.OpenChannel(AdcControllerChannel);

        double batteryCharge = batteryChargeAdcChannel.ReadRatio() * 100.0;

        Debug.WriteLine($" Temperature {temperature.DegreesCelsius:F1}°C Humidity {relativeHumidity.Value:F0}% BatteryCharge {batteryCharge:F1}");

        // Assemble the JSON payload, should use nanoFramework.Json
        string payload = $"{{\"RelativeHumidity\":{relativeHumidity.Value:F0},\"Temperature\":{temperature.DegreesCelsius.ToString("F1")}, \"BatteryCharge\":{batteryCharge:F1}}}";

        // Configure the HttpClient uri, certificate, and authorization
        string uri = $"{Config.AzureIoTHubHostName}.azure-devices.net/devices/{Config.DeviceID}";

        HttpClient httpClient = new HttpClient()
        {
            SslProtocols = System.Net.Security.SslProtocols.Tls12,
            HttpsAuthentCert = new X509Certificate(Config.DigiCertBaltimoreCyberTrustRoot),
            BaseAddress = new Uri($"https://{uri}/messages/events?api-version=2020-03-13"),
        };
        httpClient.DefaultRequestHeaders.Add("Authorization", SasTokenGenerate(uri, Config.Key, DateTime.UtcNow.Add(Config.SasTokenRenewFor)));

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub device {Config.DeviceID} telemetry update start");

        HttpResponseMessage response = httpClient.Post("", new StringContent(payload));

        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Response code:{response.StatusCode}");

        response.EnsureSuccessStatusCode();
    }
    catch (Exception ex)
    {
        Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub telemetry update failed:{ex.Message} {ex?.InnerException?.Message}");

        Sleep.EnableWakeupByTimer(Config.FailureRetryInterval);
        Sleep.StartDeepSleep();
    }

    Sleep.EnableWakeupByTimer(Config.TelemetryUploadInterval);
#if SLEEP_LIGHT
    Sleep.StartLightSleep();
#endif
#if SLEEP_DEEP
    Sleep.StartDeepSleep();
#endif
 }

The LightSleep or DeepSleep based code is significantly less complex because the allocation and deallocation of resources does not have to be managed; the application is restarted when the wake-up timer triggers.

Both LightSleep and DeepSleep reduced the idle current to 0.000A. The Keweisi KWS-MX19 USB Tester is not a precision laboratory instrument, so I couldn’t detect whether sleeping the RAK1901 WisBlock Temperature and Humidity Sensor, or LightSleep vs. DeepSleep, made any difference. But it did show that the power consumption of my setup could be significantly reduced by using the ESP32 LightSleep and DeepSleep functionality.

.NET nanoFramework RAK11200 – Azure IoT Hub HTTP battery charge monitoring

The first step was to check that I could get a “battery charge” value for the RAKWireless RAK11200 WisBlock WiFi Module on a RAK19007 WisBlock Base Board to send to an Azure IoT Hub.

RAK1702 Schematic with voltage divider to ADC_VBAT connection highlighted
RAK1701 Schematic with ADC_VBAT to CPU slot connection highlighted

The RAK19007 WisBlock Base Board has a voltage divider (R3&R4 with output ADC_VBAT) which is connected (via R7) to pin 21(AIN0) on the CPU slot connector.

RAK11200 schematic with CPU Slot to ESP32-WROVER-B connection highlighted

The AIN0(pin 21) of the RAK11200 WisBlock WiFi Module is connected to SENSOR_VP(pin4) of the Espressif ESP32-WROVER-B so I could measure the battery charge.

RAK11200+RAK19007+RAK1901+ LiPo battery test rig

My test setup was a RAK11200 WisBlock WiFi Module, RAK19007 WisBlock Base Board, RAK1901 WisBlock Temperature and Humidity Sensor and 1200mAH Lithium Polymer (LiPo) battery which uploads temperature, humidity and battery charge telemetry to an Azure IoT Hub every 10 minutes.

I used AdcController + AdcChannel to read the AIN0 value, which was then inserted in the JavaScript Object Notation (JSON) telemetry payload.

 public class Program
 {
     private const int I2cDeviceBusID = 1;
     private const int AdcControllerChannel = 0;

     public static void Main()
     {
         DateTime sasTokenValidUntilUtc = DateTime.UtcNow;

         Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} devMobile.IoT.RAK.Wisblock.AzureIoTHub.RAK11200.PowerBaseline starting");

         Configuration.SetPinFunction(Gpio.IO04, DeviceFunction.I2C1_DATA);
         Configuration.SetPinFunction(Gpio.IO05, DeviceFunction.I2C1_CLOCK);

         if (!WifiNetworkHelper.ConnectDhcp(Config.Ssid, Config.Password, requiresDateTime: true))
         {
             if (NetworkHelper.HelperException != null)
             {
                 Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} WifiNetworkHelper.ConnectDhcp failed {NetworkHelper.HelperException}");
             }

             Thread.Sleep(Timeout.Infinite);
         }

         string uri = $"{Config.AzureIoTHubHostName}.azure-devices.net/devices/{Config.DeviceID}";

         // not setting Authorization here as it will change as SAS Token refreshed
         HttpClient httpClient = new HttpClient
         {
             SslProtocols = System.Net.Security.SslProtocols.Tls12,
             HttpsAuthentCert = new X509Certificate(Config.DigiCertBaltimoreCyberTrustRoot),
             BaseAddress = new Uri($"https://{uri}/messages/events?api-version=2020-03-13"),
         };

         I2cConnectionSettings settings = new(I2cDeviceBusID, Shtc3.DefaultI2cAddress);
         I2cDevice device = I2cDevice.Create(settings);
         Shtc3 shtc3 = new(device);

         AdcController adcController = new AdcController();
         AdcChannel batteryChargeAdcChannel = adcController.OpenChannel(AdcControllerChannel);

         string sasToken = "";

         while (true)
         {
             DateTime standardisedUtcNow = DateTime.UtcNow;

             Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub device {Config.DeviceID} telemetry update start");

             if (sasTokenValidUntilUtc <= standardisedUtcNow)
             {
                 sasTokenValidUntilUtc = standardisedUtcNow.Add(Config.SasTokenRenewEvery);

                 sasToken = SasTokenGenerate(uri, Config.Key, sasTokenValidUntilUtc);

                 Debug.WriteLine($" Renewing SAS token for {Config.SasTokenRenewFor} valid until {sasTokenValidUntilUtc:HH:mm:ss dd-MM-yy}");
             }

             if (!shtc3.TryGetTemperatureAndHumidity(out var temperature, out var relativeHumidity))
             {
                 Debug.WriteLine($" Temperature and Humidity read failed");

                 continue;
             }

             double batteryCharge = batteryChargeAdcChannel.ReadRatio() * 100.0;

             Debug.WriteLine($" Temperature {temperature.DegreesCelsius:F1}°C Humidity {relativeHumidity.Value:F0}% BatteryCharge {batteryCharge:F1}%");

             string payload = $"{{\"RelativeHumidity\":{relativeHumidity.Value:F0},\"Temperature\":{temperature.DegreesCelsius.ToString("F1")}, \"BatteryCharge\":{batteryCharge:F1}}}";

             try
             {
                 using (HttpContent content = new StringContent(payload))
                 {
                     content.Headers.Add("Authorization", sasToken);

                     using (HttpResponseMessage response = httpClient.Post("", content))
                     {
                         Console.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Response code:{response.StatusCode}");

                         response.EnsureSuccessStatusCode();
                     }
                 }
             }
             catch (Exception ex)
             {
                 Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub POST failed:{ex.Message} {ex?.InnerException?.Message}");
             }

             Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub telemetry update done");

             Thread.Sleep(Config.TelemetryUploadInterval);
         }
     }
...
}

I used Azure IoT Explorer to monitor the Azure IoT Hub device telemetry to see how the BatteryCharge value decreased to a level where the device wouldn’t transmit.

Azure IoT Explorer telemetry – device connected to a USB charger (11:01:19) then un-plugged (11:02:02)
Azure IoT Explorer telemetry – Last two messages sent by the device

With no use of the ESP32-WROVER-B “power conservation” functionality, the device powered by a 1200mAh battery ran for approximately 11hrs (11:00am – 10:00pm).

RAK2305 Wisblock AIN0 pin highlighted

I think the RAK2305 will not be able to measure “battery charge” as the SENSOR_VP pin on the Espressif ESP32-WROVER-B is not connected to AIN0.

.NET nanoFramework RAK11200 – Azure IoT Hub HTTP SAS Tokens – Revisited

Several times my client apps inspired by Azure IoT Hub HTTP Basic have not worked because I have failed to correctly trim the Azure IoT Hub Shared Access Signature(SAS) generated with tools like Azure Command Line az iot hub generate-sas-token, Azure IoT Tools for Visual Studio Code or Azure IoT Explorer.

The tokens are quite long but the only “important” parts are the resource(sr), signature(sig) and expiry(se) values. For example, if the generated connection string is

HostName=01234567890123456789.azure-devices.net;DeviceId=RAK11200-RAK19001;SharedAccessSignature=SharedAccessSignature sr=01234567890123456789.azure-devices.net%2Fdevices%2FRAK11200-RAK19001&sig=ABCDEFGHIJLMNOPQRSTUVWXYZ1234567890abcdefghijklmnopqrs&se=1663810576
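
then only the part after the "SharedAccessSignature=" key (i.e. starting at "SharedAccessSignature sr=") should be used as the Authorization header value. A minimal sketch of that trimming, using a hypothetical helper which is not part of my application:

public static string SasTokenTrim(string connectionString)
{
   // Everything after the "SharedAccessSignature=" key of the connection string is
   // the token itself i.e. "SharedAccessSignature sr=...&sig=...&se=..."
   const string marker = "SharedAccessSignature=";

   int start = connectionString.IndexOf(marker);
   if (start < 0)
   {
      throw new ArgumentException("No SharedAccessSignature section in connection string");
   }

   return connectionString.Substring(start + marker.Length);
}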

The final version of the application constructs the Azure IoT Hub Shared Access Signature(SAS) with the AzureIoTHubHostName, DeviceID, signature(sig) & expiry(se) values in the config.cs file.

public class Config
{
   public const string DeviceID = "RAK11200-RAK19001";
   public const string SasSignature = "..."; // sig
   public const string SasExpiryTime = "..."; // se

   public const string AzureIoTHubHostName = "..";
   public const string Ssid = "...";
   public const string Password = "..";
   ...
}
 _httpClient = new HttpClient
{
   SslProtocols = System.Net.Security.SslProtocols.Tls12,
   HttpsAuthentCert = new X509Certificate(Config.DigiCertBaltimoreCyberTrustRoot),
   BaseAddress = new Uri($"https://{Config.AzureIoTHubHostName}.azure-devices.net/devices/{Config.DeviceID}/messages/events?api-version=2020-03-13"),
};

string sasKey = $"SharedAccessSignature sr={Config.AzureIoTHubHostName}.azure-devices.net%2Fdevices%2F{Config.DeviceID}&sig={Config.SasSignature}&se={Config.SasExpiryTime}";

_httpClient.DefaultRequestHeaders.Add("Authorization", sasKey);
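
With the Authorization header added to DefaultRequestHeaders, each telemetry upload is just a POST of the JSON payload to the BaseAddress. A sketch with a hard-coded payload and the error handling trimmed:

// Sketch only: the Authorization header is already on the HttpClient so each
// telemetry upload is a plain POST of the JSON payload to the BaseAddress.
string payload = "{\"Temperature\":22.5,\"RelativeHumidity\":45}";

using (HttpContent content = new StringContent(payload))
using (HttpResponseMessage response = _httpClient.Post("", content))
{
   Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Response code:{response.StatusCode}");

   response.EnsureSuccessStatusCode();
}

Because the signature(sig) and expiry(se) values are fixed in the config.cs file, this client only works until the token expires.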

.NET nanoFramework RAK11200 – Azure IoT Hub HTTP SAS Keys

This is a significantly improved .NET nanoFramework Azure IoT Hub client (inspired by this nanoFramework sample) which “automatically” generates and then renews the SAS Token connection string used for authorisation.

RAK11200 + RAK19001 + RAK1901 test hardware

My test setup was a RAKwireless RAK11200 WisBlock WiFi Module, a RAK19001 WisBlock Dual IO Base Board and a RAK1901 WisBlock Temperature and Humidity Sensor.

public static void Main()
{
   DateTime sasTokenValidUntilUtc = DateTime.UtcNow;

   Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} devMobile.IoT.RAK.Wisblock.AzureIoHub.RAK1901.SasKey starting");

...         
   string uri = $"{Config.AzureIoTHubHostName}.azure-devices.net/devices/{Config.DeviceID}";

   // not setting Authorization here as it will change as SAS Token refreshed
   _httpClient = new HttpClient
   {
      SslProtocols = System.Net.Security.SslProtocols.Tls12,
      HttpsAuthentCert = new X509Certificate(Config.DigiCertBaltimoreCyberTrustRoot),
      BaseAddress = new Uri($"https://{uri}/messages/events?api-version=2020-03-13"),
   };

   I2cConnectionSettings settings = new(I2cDeviceBusID, Shtc3.DefaultI2cAddress);
   I2cDevice device = I2cDevice.Create(settings);
   Shtc3 shtc3 = new(device);

   string sasToken = "";

   while (true)
   {
      DateTime standardisedUtcNow = DateTime.UtcNow;

      Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub device {Config.DeviceID} telemetry update start");

      if (sasTokenValidUntilUtc <= standardisedUtcNow)
      {
         sasTokenValidUntilUtc = standardisedUtcNow.Add(Config.SasTokenRenewEvery);

         sasToken = SasTokenGenerate(uri, Config.Key, sasTokenValidUntilUtc);

         Debug.WriteLine($" Renewing SAS token for {Config.SasTokenRenewFor} valid until {sasTokenValidUntilUtc:HH:mm:ss dd-MM-yy}");
      }

      if (!shtc3.TryGetTemperatureAndHumidity(out var temperature, out var relativeHumidity))
      {
         Debug.WriteLine($" Temperature and Humidity read failed");

         continue;
      }

      Debug.WriteLine($" Temperature {temperature.DegreesCelsius:F1}°C Humidity {relativeHumidity.Value:F0}%");

      string payload = $"{{\"RelativeHumidity\":{relativeHumidity.Value:F0},\"Temperature\":{temperature.DegreesCelsius.ToString("F1")}}}";

      try
      {
         using (HttpContent content = new StringContent(payload))
         {
            content.Headers.Add("Authorization", sasToken);

            using (HttpResponseMessage response = _httpClient.Post("", content))
            {
               Console.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Response code:{response.StatusCode}");

               response.EnsureSuccessStatusCode();
            }
         }
      }
      catch (Exception ex)
      {
         Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub POST failed:{ex.Message} {ex?.InnerException?.Message}");
      }

      Debug.WriteLine($"{DateTime.UtcNow:HH:mm:ss} Azure IoT Hub telemetry update done");

      Thread.Sleep(Config.TelemetryUploadInterval);
   }
}

How long a SAS Token is valid for and how often it has to be renewed are specified in the config.cs file.

public class Config
{
   public const string DeviceID = "RAK11200-RAK19001";
   public const string AzureIoTHubHostName = "...";
   public const string Key = "...";
   public readonly static TimeSpan SasTokenRenewFor = new TimeSpan(24, 0, 0);
   public readonly static TimeSpan SasTokenRenewEvery = new TimeSpan(0, 30, 0);
   public readonly static TimeSpan TelemetryUploadInterval = new TimeSpan(0, 10, 0);

   public const string Ssid = "...";
   public const string Password = "...";
...
}

The SasTokenGenerate method is based on code from an old blog post, “Azure IoT Hub SAS Tokens revisited again”, from late 2019.

public static string SasTokenGenerate(string resourceUri, string key, DateTime sasKeyTokenUntilUtc)
{
   long sasKeyvalidUntilUtcUnix = sasKeyTokenUntilUtc.ToUnixTimeSeconds();

   string stringToSign = $"{HttpUtility.UrlEncode(resourceUri)}\n{sasKeyvalidUntilUtcUnix}";

   var hmac = SHA.computeHMAC_SHA256(Convert.FromBase64String(key), Encoding.UTF8.GetBytes(stringToSign));

   string signature = Convert.ToBase64String(hmac);

   return $"SharedAccessSignature sr={HttpUtility.UrlEncode(resourceUri)}&sig={HttpUtility.UrlEncode(signature)}&se={sasKeyvalidUntilUtcUnix}";
}

I use Azure IoT Explorer to monitor the telemetry and the application appears to run reliably for weeks.

Azure IoT Explorer displaying test rig telemetry(22/09)
Azure IoT Explorer displaying test rig telemetry(03/10)

.NET nanoFramework BME680 Library Debugging Part 2

Reading the RAK1906 WisBlock Environment Sensor/BME680 GasResistance was failing randomly so I decided to dig a bit deeper. I checked the termination resistors, made sure the sensor was firmly seated on the RAK5005, and tried another Inter-Integrated Circuit(I²C) device on the same physical port.

I then used the Visual Studio 2022 Debugger to “single step” further into the BME680 code and the first thing that looked a bit odd was that the TryReadTemperatureCore, TryReadPressureCore, TryReadHumidityCore and TryReadGasResistanceCore return values were ignored.

/// <summary>
/// Performs a synchronous reading.
/// </summary>
/// <returns><see cref="Bme680ReadResult"/></returns>
public Bme680ReadResult Read()
{
   SetPowerMode(Bme680PowerMode.Forced);
   Thread.Sleep((int)GetMeasurementDuration(HeaterProfile).Milliseconds);

    TryReadTemperatureCore(out Temperature temperature);
    TryReadPressureCore(out Pressure pressure, skipTempFineRead: true);
    TryReadHumidityCore(out RelativeHumidity humidity, skipTempFineRead: true);
    TryReadGasResistanceCore(out ElectricResistance gasResistance);

    return new Bme680ReadResult(temperature, pressure, humidity, gasResistance);
}

I then single stepped into TryReadTemperatureCore, which returns a boolean indicating whether the read was successful.

private bool TryReadTemperatureCore(out Temperature temperature)
{
    if (TemperatureSampling == Sampling.Skipped)
    {
        temperature = default;
        return false;
    }

    var temp = (int)Read24BitsFromRegister((byte)Bme680Register.TEMPDATA, Endianness.BigEndian);

    temperature = CompensateTemperature(temp >> 4);
    return true;
}

This library was based on the dotnet/iot Bmxx80 code. It looked similar, but I had missed an important detail: the dotnet/iot sample code has lots more ?’s…

Console.WriteLine("Hello BME680!");

// The I2C bus ID on the Raspberry Pi 3.
const int busId = 1;
// set this to the current sea level pressure in the area for correct altitude readings
Pressure defaultSeaLevelPressure = WeatherHelper.MeanSeaLevel;

I2cConnectionSettings i2cSettings = new(busId, Bme680.DefaultI2cAddress);
I2cDevice i2cDevice = I2cDevice.Create(i2cSettings);

using Bme680 bme680 = new Bme680(i2cDevice, Temperature.FromDegreesCelsius(20.0));

while (true)
{
    // reset will change settings back to default
    bme680.Reset();

    // 10 consecutive measurement with default settings
    for (var i = 0; i < 10; i++)
    {
        // Perform a synchronous measurement
        var readResult = bme680.Read();

        // Print out the measured data
        Console.WriteLine($"Gas resistance: {readResult.GasResistance?.Ohms:0.##}Ohm");
        Console.WriteLine($"Temperature: {readResult.Temperature?.DegreesCelsius:0.#}\u00B0C");
        Console.WriteLine($"Pressure: {readResult.Pressure?.Hectopascals:0.##}hPa");
        Console.WriteLine($"Relative humidity: {readResult.Humidity?.Percent:0.#}%");

        if (readResult.Temperature.HasValue && readResult.Pressure.HasValue)
        {
            var altValue = WeatherHelper.CalculateAltitude(readResult.Pressure.Value, defaultSeaLevelPressure, readResult.Temperature.Value);
            Console.WriteLine($"Altitude: {altValue.Meters:0.##}m");
        }

        if (readResult.Temperature.HasValue && readResult.Humidity.HasValue)
        {
            // WeatherHelper supports more calculations, such as saturated vapor pressure, actual vapor pressure and absolute humidity.
            Console.WriteLine($"Heat index: {WeatherHelper.CalculateHeatIndex(readResult.Temperature.Value, readResult.Humidity.Value).DegreesCelsius:0.#}\u00B0C");
            Console.WriteLine($"Dew point: {WeatherHelper.CalculateDewPoint(readResult.Temperature.Value, readResult.Humidity.Value).DegreesCelsius:0.#}\u00B0C");
        }

        // when measuring the gas resistance on each cycle it is important to wait a certain interval
        // because a heating plate is activated which will heat up the sensor without sleep, this can
        // falsify all readings coming from the sensor
        Thread.Sleep(1000);
    }
    ...
}

The dotnet/iot Bme680 Read() method checks the TryReadTemperatureCore, TryReadPressureCore, TryReadHumidityCore & TryReadGasResistanceCore return values.

/// <summary>
/// Performs a synchronous reading.
/// </summary>
/// <returns><see cref="Bme680ReadResult"/></returns>
public Bme680ReadResult Read()
{
    SetPowerMode(Bme680PowerMode.Forced);
    Thread.Sleep((int)GetMeasurementDuration(HeaterProfile).Milliseconds);

    var tempSuccess = TryReadTemperatureCore(out var temperature);
    var pressSuccess = TryReadPressureCore(out var pressure, skipTempFineRead: true);
    var humiditySuccess = TryReadHumidityCore(out var humidity, skipTempFineRead: true);
    var gasSuccess = TryReadGasResistanceCore(out var gasResistance);

    return new Bme680ReadResult(tempSuccess ? temperature : null, pressSuccess ? pressure : null, humiditySuccess ? humidity : null, gasSuccess ? gasResistance : null);
}

The dotnet/iot Bmxx80 library uses nullable types (Temperature?, Pressure? etc.) which are not supported by the nanoFramework(Sept 2022), and this was overlooked when the library was ported.
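
Until the library is updated, one possible shape for a fix is to check the return values and expose per-reading success flags instead of nullable values. This is only a sketch and assumes a modified Bme680ReadResult constructor, which is not the library’s current API and not necessarily the approach that will be taken:

public Bme680ReadResult Read()
{
    SetPowerMode(Bme680PowerMode.Forced);
    Thread.Sleep((int)GetMeasurementDuration(HeaterProfile).Milliseconds);

    // Check each TryRead...Core return value rather than ignoring it
    bool temperatureOk = TryReadTemperatureCore(out Temperature temperature);
    bool pressureOk = TryReadPressureCore(out Pressure pressure, skipTempFineRead: true);
    bool humidityOk = TryReadHumidityCore(out RelativeHumidity humidity, skipTempFineRead: true);
    bool gasResistanceOk = TryReadGasResistanceCore(out ElectricResistance gasResistance);

    // Hypothetical constructor overload that records which readings are valid so
    // the caller can ignore the ones that failed
    return new Bme680ReadResult(temperatureOk, temperature, pressureOk, pressure, humidityOk, humidity, gasResistanceOk, gasResistance);
}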

I have created a GitHub issue.