Azure IoT Hub SAS Tokens revisited yet again

Based on my previous post on SAS Token Expiry I wrote a test harness to better understand DateTimeOffset.

using System;

namespace UnixEpochTester
{
   class Program
   {
      static void Main(string[] args)
      {
         Console.WriteLine($"DIY                {new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)}");
         Console.WriteLine($"DateTime.UnixEpoch {DateTime.UnixEpoch} {DateTime.UnixEpoch.Kind}");
         Console.WriteLine();

         // Seconds since the Unix epoch for the current UTC time
         TimeSpan fromUnixEpochNow = DateTime.UtcNow - DateTime.UnixEpoch;
         Console.WriteLine($"Epoch now {fromUnixEpochNow} {fromUnixEpochNow.TotalSeconds.ToString("f0")} sec");
         Console.WriteLine();

         // Seconds since the Unix epoch for a fixed UTC date, to compare with the online calculators
         TimeSpan fromUnixEpochFixed = new DateTime(2019, 11, 30, 2, 0, 0, DateTimeKind.Utc) - DateTime.UnixEpoch;
         Console.WriteLine($"Epoch {fromUnixEpochFixed} {fromUnixEpochFixed.TotalSeconds.ToString("f0")} sec");
         Console.WriteLine();

         // The same fixed UTC date via DateTimeOffset, which should report the same number of seconds
         DateTimeOffset dateTimeOffset = new DateTimeOffset(new DateTime(2019, 11, 30, 2, 0, 0, DateTimeKind.Utc));
         Console.WriteLine($"Epoch DateTimeOffset {fromUnixEpochFixed} {dateTimeOffset.ToUnixTimeSeconds()}");
         Console.WriteLine();

         // The TimeSpan calculation again, this time formatted with "F0"
         TimeSpan fromEpochStart = new DateTime(2019, 11, 30, 2, 0, 0, DateTimeKind.Utc) - DateTime.UnixEpoch;
         Console.WriteLine($"Epoch TimeSpan      {fromEpochStart} {fromEpochStart.TotalSeconds.ToString("F0")}");
         Console.WriteLine();


         // https://www.epochconverter.com/ matches
         // https://www.unixtimestamp.com/index.php matches

         Console.WriteLine("Press ENTER to exit");
         Console.ReadLine();
      }
   }
}

I validated my numbers against a couple of online calculators and they matched, which was a good start.

DateTimeOffset test harness
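
As an extra cross-check of the fixed date used in the harness (not part of the original code), the value can be round-tripped through DateTimeOffset; 1575079200 is the number the online calculators report for 2019-11-30 02:00:00 UTC.

// These lines could be pasted into the Main of the harness above
DateTimeOffset fixedUtc = new DateTimeOffset(2019, 11, 30, 2, 0, 0, TimeSpan.Zero);
Console.WriteLine(fixedUtc.ToUnixTimeSeconds());                   // 1575079200
Console.WriteLine(DateTimeOffset.FromUnixTimeSeconds(1575079200)); // 30/11/2019 2:00:00 AM +00:00 (display format depends on locale)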

As I was testing my Azure MQTT Test Client I noticed some oddness with MQTT connection timeouts.

string token = generateSasToken($"{server}/devices/{clientId}", password, "", new TimeSpan(0,5,0));
1/12/2019 1:29:52 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.391","OfficeHumidity":"93"}]
1/12/2019 1:30:22 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.29","OfficeHumidity":"64"}]
...
1/12/2019 1:43:56 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.591","OfficeHumidity":"98"}]
1/12/2019 1:44:26 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.754","OfficeHumidity":"68"}]


string token = generateSasToken($"{server}/devices/{clientId}", password, "", new TimeSpan(0,5,0));
1/12/2019 1:29:52 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.391","OfficeHumidity":"93"}]
1/12/2019 1:30:22 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.29","OfficeHumidity":"64"}]
...
1/12/2019 2:01:37 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.334","OfficeHumidity":"79"}]
1/12/2019 2:02:07 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.503","OfficeHumidity":"49"}]


string token = generateSasToken($"{server}/devices/{clientId}", password, "", new TimeSpan(0,5,0));
2/12/2019 9:27:21 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.196","OfficeHumidity":"61"}]
2/12/2019 9:27:51 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.788","OfficeHumidity":"91"}]
...
2/12/2019 9:36:24 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.670","OfficeHumidity":"64"}]
2/12/2019 9:36:54 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.836","OfficeHumidity":"94"}]


string token = generateSasToken($"{server}/devices/{clientId}", password, "", new TimeSpan(0,5,0));
2/12/2019 9:40:52 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.46","OfficeHumidity":"92"}]
2/12/2019 9:41:22 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.443","OfficeHumidity":"62"}]
...
2/12/2019 9:50:55 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.742","OfficeHumidity":"95"}]


string token = generateSasToken($"{server}/devices/{clientId}", password, "", new TimeSpan(0,10,0));
approx. 15 min as telemetry is only sent every 30 sec
1/12/2019 12:50:23 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.630","OfficeHumidity":"65"}]
1/12/2019 12:50:53 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.798","OfficeHumidity":"95"}]
...
1/12/2019 1:03:59 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.677","OfficeHumidity":"41"}]
1/12/2019 1:04:30 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.26","OfficeHumidity":"72"}]


string token = generateSasToken($"{server}/devices/{clientId}", password, "", new TimeSpan(0,10,0));
approx. 15 min as telemetry is only sent every 30 sec
1/12/2019 1:09:30 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.106","OfficeHumidity":"72"}]
1/12/2019 1:10:00 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.463","OfficeHumidity":"42"}]
...
1/12/2019 1:23:35 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.366","OfficeHumidity":"77"}]
1/12/2019 1:24:05 PM> Device: [MQTTLoRa915MHz], Data:[{"OfficeTemperature":"22.537","OfficeHumidity":"47"}]

The dataset with the 5 minute expiry that remained connected for approximately 30 minutes was hopefully due to a configuration issue.
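
For context, this is not the full test client, just a sketch of where the generated token ends up: it is passed as the MQTT password, with the Azure IoT Hub username format. The snippet below assumes an MQTTnet v3 style client and the same server, clientId and password variables as the generateSasToken calls above; the api-version value is illustrative.

// Requires the MQTTnet NuGet package (v3) and an async method; namespaces vary a little between versions
var mqttFactory = new MqttFactory();
IMqttClient mqttClient = mqttFactory.CreateMqttClient();

string token = generateSasToken($"{server}/devices/{clientId}", password, "", new TimeSpan(0, 5, 0));

var mqttOptions = new MqttClientOptionsBuilder()
   .WithTcpServer(server, 8883)                                            // Azure IoT Hub MQTT over TLS
   .WithCredentials($"{server}/{clientId}/?api-version=2018-06-30", token) // SAS token is the password
   .WithClientId(clientId)
   .WithTls()
   .Build();

await mqttClient.ConnectAsync(mqttOptions, CancellationToken.None);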

The updated SAS Token code now uses ToUnixTimeSeconds to eliminate the scope for local vs. UTC time issues.

      public static string generateSasToken(string resourceUri, string key, string policyName, TimeSpan timeToLive)
      {
         DateTimeOffset expiryDateTimeOffset = new DateTimeOffset(DateTime.UtcNow.Add(timeToLive));

         string expiryEpoch = expiryDateTimeOffset.ToUnixTimeSeconds().ToString();
         string stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiryEpoch;

         HMACSHA256 hmac = new HMACSHA256(Convert.FromBase64String(key));
         string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

         string token = $"SharedAccessSignature sr={WebUtility.UrlEncode(resourceUri)}&sig={WebUtility.UrlEncode(signature)}&se={expiryEpoch}";

         if (!String.IsNullOrEmpty(policyName))
         {
            token += "&skn=" + policyName;
         }

         return token;
      }
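
Calling the method looks the same as before; the hub name and key below are placeholders for illustration, not values from my setup.

// Hypothetical values - not a real hub or device key
string server = "myiothub.azure-devices.net";
string clientId = "MQTTLoRa915MHz";
string password = "c2lsbHlzYW1wbGVrZXk=";

string token = generateSasToken($"{server}/devices/{clientId}", password, "", new TimeSpan(0, 5, 0));
// Produces something like
// SharedAccessSignature sr=myiothub.azure-devices.net%2Fdevices%2FMQTTLoRa915MHz&sig=...&se=1575079200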

I need to test the expiry of my SAS Tokens some more, especially with the client running on my development machine (NZT, which is currently UTC+13) and in Azure (UTC timezone).

Azure IoT Hub SAS Tokens revisited again

This post has been edited (2019-11-24): my original assumptions about how DateTime.Kind Unspecified was handled were incorrect.

As I was testing my Azure MQTT Test Client I noticed some oddness with MQTT connection timeouts, which got me wondering about token expiry times. So, I went searching again and found this Azure IoT Hub specific sample code.

public static string generateSasToken(string resourceUri, string key, string policyName, int expiryInSeconds = 3600)
{
    TimeSpan fromEpochStart = DateTime.UtcNow - new DateTime(1970, 1, 1);
    string expiry = Convert.ToString((int)fromEpochStart.TotalSeconds + expiryInSeconds);

    string stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry;

    HMACSHA256 hmac = new HMACSHA256(Convert.FromBase64String(key));
    string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

    string token = String.Format(CultureInfo.InvariantCulture, "SharedAccessSignature sr={0}&sig={1}&se={2}", WebUtility.UrlEncode(resourceUri), WebUtility.UrlEncode(signature), expiry);

    if (!String.IsNullOrEmpty(policyName))
    {
        token += "&skn=" + policyName;
    }

    return token;
}

This code worked the first time and was more flexible than mine, which was a bonus. However, while running my MQTTNet based client I noticed the connection would drop after approximately 10 minutes (EDIT: this was probably an unrelated networking issue).

A long time ago (25 years) I had issues sharing a Unix time value between applications written with Borland C and Microsoft Visual C, which made me wonder about Unix epoch base offsets.

So, to test my theory, I built a Unix epoch test harness console application.

using System;

namespace UnixEpocTest
{
   class Program
   {
      static void Main(string[] args)
      {
         TimeSpan ttl = new TimeSpan(0, 0, 0);

         Console.WriteLine("Current time");
         Console.WriteLine($"Local     {DateTime.Now} {DateTime.Now.Kind}");
         Console.WriteLine($"UTC       {DateTime.UtcNow} {DateTime.UtcNow.Kind}");
         Console.WriteLine($"Unix DIY  {new DateTime(1970, 1, 1)} {new DateTime(1970, 1, 1).Kind}");
         Console.WriteLine($"Unix DIY+ {new DateTime(1970, 1, 1).ToUniversalTime()} {new DateTime(1970, 1, 1).ToUniversalTime().Kind}");
         Console.WriteLine($"Unix DIY  {new DateTime(1970, 1, 1, 0,0,0, DateTimeKind.Utc)}");
         Console.WriteLine($"Unix      {DateTime.UnixEpoch} {DateTime.UnixEpoch.Kind}");
         Console.WriteLine();

         TimeSpan fromEpochStart = DateTime.UtcNow - new DateTime(1970, 1, 1);
         TimeSpan fromEpochStartUtc = DateTime.UtcNow - new DateTime(1970, 1, 1,0,0,0, DateTimeKind.Utc);
         TimeSpan fromEpochStartUnixEpoch = DateTime.UtcNow - DateTime.UnixEpoch;

         Console.WriteLine("Epoch comparison");
         Console.WriteLine($"Local {fromEpochStart} {fromEpochStart.TotalSeconds.ToString("f0")} sec");
         Console.WriteLine($"UTC   {fromEpochStartUtc} {fromEpochStartUtc.TotalSeconds.ToString("f0")} sec");
         Console.WriteLine($"Epoc  {fromEpochStartUnixEpoch} {fromEpochStartUnixEpoch.TotalSeconds.ToString("f0")} sec");
         Console.WriteLine();

         TimeSpan afterEpoch = DateTime.UtcNow.Add(ttl) - new DateTime(1970, 1, 1);
         TimeSpan afterEpochUtC = DateTime.UtcNow.Add(ttl) - new DateTime(1970, 1, 1).ToUniversalTime();
         TimeSpan afterEpochEpoch = DateTime.UtcNow.Add(ttl) - DateTime.UnixEpoch;

         Console.WriteLine("Epoch calculation");
         Console.WriteLine($"Local {afterEpoch}");
         Console.WriteLine($"UTC   {afterEpochUtC}");
         Console.WriteLine($"Epoch {afterEpochEpoch}");
         Console.WriteLine();

         Console.WriteLine("Epoch DateTime");
         Console.WriteLine($"Local :{new DateTime(1970, 1, 1)}");
         Console.WriteLine($"UTC   :{ new DateTime(1970, 1, 1).ToUniversalTime()}");

         Console.WriteLine("Press ENTER to exit");
         Console.ReadLine();
      }
   }
}

EDIT: I now think the UtcNow minus “Unspecified” kind DateTime arithmetic was being handled correctly. I have updated the code to use the DateTime.UnixEpoch constant so the code is more readable.

      public static string generateSasToken(string resourceUri, string key, string policyName, int expiryInSeconds = 900)
      {
         TimeSpan fromEpochStart = DateTime.UtcNow - DateTime.UnixEpoch;
         string expiry = Convert.ToString((int)fromEpochStart.TotalSeconds + expiryInSeconds);

         string stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry;

         HMACSHA256 hmac = new HMACSHA256(Convert.FromBase64String(key));
         string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

         string token = String.Format(CultureInfo.InvariantCulture, "SharedAccessSignature sr={0}&sig={1}&se={2}", WebUtility.UrlEncode(resourceUri), WebUtility.UrlEncode(signature), expiry);

         if (!String.IsNullOrEmpty(policyName))
         {
            token += "&skn=" + policyName;
         }

         return token;
      }

I need to test the expiry of my SAS Tokens some more, especially with the client running on my development machine (NZT, which is currently UTC+13) and in Azure (UTC timezone).
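
A quick check that should print the same numbers whether it runs on the NZT development machine or in Azure; this is just a sketch that could be dropped into the epoch test harness above.

// Both values are derived from UTC "now", so the machine's local timezone should not matter
// and the two approaches should agree to within a second wherever this is run.
long seManual = (long)(DateTime.UtcNow - DateTime.UnixEpoch).TotalSeconds;
long seOffset = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
Console.WriteLine($"Manual {seManual} DateTimeOffset {seOffset} difference {seOffset - seManual}");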

Microsoft Sync Framework timezones

Over the last few months I have been working with the Microsoft Sync Framework, and time zone issues have been a problem.

New Zealand is 12 hours (standard time) or 13 hours (daylight saving time) ahead of Coordinated Universal Time (UTC), so at a glance our customer data could look OK whether treated as local time or UTC.
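
The offsets are easy to confirm in code; this is just a quick sketch ("New Zealand Standard Time" is the Windows timezone id).

// UTC offset for New Zealand either side of the daylight saving transition
TimeZoneInfo nz = TimeZoneInfo.FindSystemTimeZoneById("New Zealand Standard Time");
Console.WriteLine(nz.GetUtcOffset(new DateTime(2019, 7, 1)));  // 12:00:00 - standard time
Console.WriteLine(nz.GetUtcOffset(new DateTime(2019, 12, 1))); // 13:00:00 - daylight saving time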

After some experimentation I found that it was due to Windows Communication Foundation (WCF) serialisation issues (the proposed solution looks like it might have some limitations, especially across daylight saving time transitions).

For the initial synchronisation the DateTime values in the database were unchanged, but for any later incremental synchronisations the DateTime values were adjusted to the timezone of the server (our Azure Cloud Services run in the UTC timezone, though I don’t understand why Microsoft by default has them set to a US locale with MM/DD/YY date formats).

In our scenario having all of the DateTime values in the cloud treated as local time looked like a reasonable option, and this article provided some useful insights.

In the end I found that setting the DateTimeMode of every DateTime column in each DataTable in the synchronisation DataSet to DataSetDateTime.Unspecified in the ProcessChangeBatch method (our code was based on the samples) meant that no adjustment was applied to the incremental updates.

public override void ProcessChangeBatch(ConflictResolutionPolicy resolutionPolicy, ChangeBatch sourceChanges, object changeDataRetriever, SyncCallbacks syncCallbacks, SyncSessionStatistics sessionStatistics)
{
   try
   {
      DbSyncContext context = changeDataRetriever as DbSyncContext;

      if (context != null)
      {
         foreach (DataTable table in context.DataSet.Tables)
         {
            foreach (DataColumn column in table.Columns)
            {
               // Switching from UnspecifiedLocal to Unspecified is allowed even after the DataSet has rows.
               if ((column.DataType == typeof(DateTime)) && (column.DateTimeMode == DataSetDateTime.UnspecifiedLocal))
               {
                  column.DateTimeMode = DataSetDateTime.Unspecified;
               }
            }
         }
      ...
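
The DateTimeMode behaviour the fix relies on can be seen outside the Sync Framework with a plain DataSet; this is a standalone sketch (requires System.Data), not code from our synchronisation provider, and the serialised values shown in the comments are examples for a machine in the NZ timezone.

// With the default DateTimeMode (UnspecifiedLocal) the value is serialised with the
// local offset appended, which is what gets "corrected" at the other end.
DataSet dataSet = new DataSet("Telemetry");
DataTable table = dataSet.Tables.Add("Readings");
DataColumn sampledAt = table.Columns.Add("SampledAt", typeof(DateTime));
table.Rows.Add(new DateTime(2019, 11, 30, 2, 0, 0));

dataSet.WriteXml(Console.Out);  // e.g. <SampledAt>2019-11-30T02:00:00+13:00</SampledAt>

// Switching to Unspecified (allowed even after rows exist) drops the offset so the
// value passes through unchanged.
sampledAt.DateTimeMode = DataSetDateTime.Unspecified;
dataSet.WriteXml(Console.Out);  // e.g. <SampledAt>2019-11-30T02:00:00</SampledAt>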

Hope this helps someone else