.Net Core & WCF TransportWithMessageCredential

In one of my day jobs I look after a system which has been around since 2010 (an early adopter of Microsoft Azure; development started on .Net 3.5). The product has a number of Windows Communication Foundation (WCF) services hosted in an Azure Cloud Service.

A client built with .Net Core wanted to be able to call one of the services, which was implemented using wsHttpBinding and TransportWithMessageCredential, and this proved a bit more painful than expected…

I first tried the Visual Studio 2017 Microsoft WCF Web Service Reference Provider from the WCF Core Team.

The “Add Connected Service” extension dialog allowed me to select an endpoint.

Configure WCF Web Service Reference dialog

But the code generation process failed

WCF Web Service Reference error dialog

The error message wasn’t particularly helpful so I used the command-line utility svcutil to generate client classes, which I then used to build a .Net Core client with the associated .Net Core WCF NuGet packages.

The console application failed when I called the service with a “PlatformNotSupportedException”. After some searching I found that the .Net Core WCF libraries didn’t support TransportWithMessageCredential (as at September 2017).
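
For context, a minimal sketch of the sort of client setup that fails. WSHttpBinding itself isn’t available in the .Net Core packages, so this assumes the closest equivalent, BasicHttpBinding with the same security mode; the contract name comes from the generated classes and the endpoint is anonymised as elsewhere in this post.

// using System.ServiceModel; (from the System.ServiceModel.Http NuGet package)
BasicHttpBinding binding = new BasicHttpBinding(BasicHttpSecurityMode.TransportWithMessageCredential);
binding.Security.Message.ClientCredentialType = BasicHttpMessageCredentialType.UserName;

ChannelFactory<IRedemptionProxyServiceV1> factory = new ChannelFactory<IRedemptionProxyServiceV1>(
   binding,
   new EndpointAddress("https://qwertyuiop.com/RedemptionProxy.svc"));
factory.Credentials.UserName.UserName = ".....";
factory.Credentials.UserName.Password = ".....";

// On .Net Core the PlatformNotSupportedException surfaced from here on (as at September 2017)
IRedemptionProxyServiceV1 channel = factory.CreateChannel();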

Some more searching led to a Stack Overflow question where an answer suggested using the SimpleSOAPClient NuGet package. I then created a new client using the generated classes as the basis for the ones used in my SimpleSOAPClient proof of concept (PoC).

[System.Diagnostics.DebuggerStepThroughAttribute()]
[System.CodeDom.Compiler.GeneratedCodeAttribute("System.ServiceModel", "4.0.0.0")]
[System.ServiceModel.MessageContractAttribute(WrapperName="Redeem", WrapperNamespace="http://qwertyuiop.com/services2011/08", IsWrapped=true)]
public partial class RedeemRequest
{
    [System.ServiceModel.MessageBodyMemberAttribute(Namespace="http://qwertyuiop.com/services2011/08", Order=1)]
    public string voucherCode;

    [System.ServiceModel.MessageBodyMemberAttribute(Namespace="http://qwertyuiop.com/services2011/08", Order=2)]
    public string merchantId;

    [System.ServiceModel.MessageBodyMemberAttribute(Namespace="http://qwertyuiop.com/services2011/08", Order=3)]
    public string merchantReference;

    [System.ServiceModel.MessageBodyMemberAttribute(Namespace="http://qwertyuiop.com/services2011/08", Order=4)]
    public string terminalId;

    public RedeemRequest()
    {
    }

    public RedeemRequest(string voucherCode, string merchantId, string merchantReference, string terminalId)
    {
        this.voucherCode = voucherCode;
        this.merchantId = merchantId;
        this.merchantReference = merchantReference;
        this.terminalId = terminalId;
    }
}

became

[XmlRoot("Redeem", Namespace = "http://qwertyuiop.com/services2011/08")]
public partial class RedeemRequest
{
   [XmlElement("voucherCode")]
   public string voucherCode;
   [XmlElement("transactionAmount")]
   public decimal transactionAmount;
   [XmlElement("merchantId")]
   public string merchantId;
   [XmlElement("merchantReference")]
   public string merchantReference;
   [XmlElement("terminalId")]
   public string terminalId;
}

This client failed with a SOAPAction-related exception so I fired up Telerik Fiddler and found that the SOAPAction header was missing. When I manually added the header in the request composer (after dragging one of my failed requests onto the composer tab) the request worked.

I had a look at the code in the SimpleSOAPClient repository to see how to add a custom HTTP Header to a request.

RedeemRequest redeemRequest = new RedeemRequest()
{
   merchantId = "......",
   merchantReference = "......",
   terminalId = "......",
   voucherCode = "......",
};

CancellationToken ct = CancellationToken.None; // or a real token in production code

using (var client = SoapClient.Prepare())
{
   client.HttpClient.DefaultRequestHeaders.Add("SOAPAction", "http://qwertyuiop.com/services2011/08/IRedemptionProxyServiceV1/Redeem");

   var responseEnvelope = await client.SendAsync(
      "https://qwertyuiop.com/RedemptionProxy.svc",
      "https://qwertyuiop.com/services2011/08/IRedemptionProxyServiceV1/Redeem",
      SoapEnvelope.Prepare()
         .WithHeaders(KnownHeader.Oasis.Security.UsernameTokenAndPasswordText(".....", "......"))
         .Body(redeemRequest), ct);

   var response = responseEnvelope.Body<RedeemResponse>();

   Console.WriteLine("Redeem Result:{0}  Message:{1}", response.Result, response.messageText);
}
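
For what it’s worth, the UsernameTokenAndPasswordText header appears to match what the wsHttpBinding TransportWithMessageCredential endpoint expects: a WS-Security UsernameToken with a plain-text password, carried over the HTTPS transport.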

After sorting out a few typos my request worked as expected. Only a couple of hours lost from my life; hopefully this post will help someone else.

nRF24L01 Raspberry Pi Gateway Hardware

For those who came to my MS Ignite AU Intelligent Cloud booth session

Building Wireless Field Gateways

Connecting wireless sensor nodes to the cloud is not the mission it used to be, because the Azure team (and many open source projects) have developed tooling which can help hobbyist and professional developers build solutions. How could you build a home-scale robust, reliable and secure solution with off-the-shelf kit without blowing the budget?

Sparkfun nRF24L01 module & Adafruit Perma-Proto HAT

nRF24L01 Raspberry Pi DIY Gateway Hardware

BoM (all prices as at Feb 2016)

You will also need some short lengths of wire and a soldering iron.

For those who want an “off the shelf” solution (it still requires a minor modification for interrupt support) I have used the Raspberry Pi to NRF24L01+ Shield USD9.90.


Instructions for modifications and software to follow.

Microsoft Sync Framework timezones

Over the last few months I have been working with the Microsoft Sync Framework, and time zone issues have been a problem.

New Zealand has a 12 hr standard time or 13 hr daylight saving time offset from Coordinated Universal Time (UTC), so at a glance our customer data could look OK if treated as either local or UTC.

After some experimentation I found that it was due to Windows Communication Foundation (WCF) serialisation issues (the proposed solution looks like it might have some limitations, especially across daylight saving time transitions).
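
A small illustration of the underlying behaviour (the dates and offsets here are just examples): the serialiser includes the client’s UTC offset for DateTime values whose Kind is Local, and the receiving end then re-localises them, whereas Unspecified values round-trip untouched.

// Example only: how DateTimeKind affects WCF/DataContract serialisation
DateTime local = new DateTime(2016, 1, 7, 9, 0, 0, DateTimeKind.Local);
// Serialises as 2016-01-07T09:00:00+13:00 on a New Zealand client during daylight saving,
// and is adjusted to the server's time zone on deserialisation.

DateTime unspecified = DateTime.SpecifyKind(local, DateTimeKind.Unspecified);
// Serialises as 2016-01-07T09:00:00 with no offset, so no adjustment is applied.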

For the initial synchronisation the DateTime values in the database were unchanged, but for any later incremental synchronisations the DateTime values were adjusted to the time zone of the server (our Azure Cloud Services are in the UTC time zone, though I don’t understand why Microsoft by default has them set to a US locale with MM/DD/YY date formats).

In our scenario keeping all of the DateTime values in the cloud in local time looked like a reasonable option, and this article provided some useful insights.

In the end I found that setting the DataSetDateTime mode of every DateTime column in each DataTable in the synchronisation DataSet to Unspecified in the ProcessChangeBatch method (our code was based on the samples) meant that no adjustment was applied to the incremental updates.

public override void ProcessChangeBatch(ConflictResolutionPolicy resolutionPolicy, ChangeBatch sourceChanges, object changeDataRetriever, SyncCallbacks syncCallbacks, SyncSessionStatistics sessionStatistics)
{
   try
   {
      DbSyncContext context = changeDataRetriever as DbSyncContext;

      if (context != null)
      {
         foreach (DataTable table in context.DataSet.Tables)
         {
            foreach (DataColumn column in table.Columns)
            {
               // Switching from UnspecifiedLocal to Unspecified is allowed even after the DataSet has rows.
               if ((column.DataType == typeof(DateTime)) && (column.DateTimeMode == DataSetDateTime.UnspecifiedLocal))
               {
                  column.DateTimeMode = DataSetDateTime.Unspecified;
               }
            }
         }
         ...

Hope this helps someone else.

Enterprise Library V6 Logging with Azure SDK 2.8 and Azure Diagnostics 1.3

In a previous post I wrote about configuring Enterprise Library V6 to work with Azure Diagnostics. There have been significant changes (detailed in this very helpful post) to the way the Azure Diagnostics infrastructure works as of Azure SDK versions 2.4/2.5. If the diagnostics infrastructure is not properly configured there will be no WADLogsTable created and/or no trace information logged.

The following steps provision diagnostics for an Azure web role or worker role. This “cheat sheet” assumes you already have the Azure Service Management Cmdlets installed.

Add-AzureAccount – prompts for Azure credentials

Get-AzureSubscription – displays details about your subscription(s)

SubscriptionId : 15daec19-f6e9-403c-8652-1234567890123
SubscriptionName : MyCompany
Environment : AzureCloud
DefaultAccount : me@mycompany.co.nz
IsDefault : False
IsCurrent : False
TenantId : e07af3b3-10c2-49a5-97cc-123456789012
CurrentStorageAccountName :

SubscriptionId : eba7ed1c-5503-4349-bcc7-123456789012
SubscriptionName : YourCompany
Environment : AzureCloud
DefaultAccount : you@yourcompany.co.nz
IsDefault : True
IsCurrent : True
TenantId : e07af3b3-10c2-49a5-97cc-1234567890
CurrentStorageAccountName :

If you have more than one Azure subscription you will need to select the one you want to use.

Select-AzureSubscription -Current -SubscriptionName "MyCompany" (beware names are case sensitive)

Get-AzureService – displays a list of your Azure services

ServiceName : myDemoApp
Url : https://management.core.windows.net/eba7ed1c-5503-4349-bcc7-123456789012/services/hostedservices/myDemoApp
Label : myDemoApp
Description :
Location : Australia Southeast
AffinityGroup :
Status : Created
ExtendedProperties : {[ResourceGroup, myDemoApp], [ResourceLocation, Australia Southeast]}
DateModified : 7/01/2016 7:02:44 p.m.
DateCreated : 28/12/2015 6:23:44 p.m.
ReverseDnsFqdn :
WebWorkerRoleSizes : {A5, A6, A7, ExtraLarge, ExtraSmall, Large, Medium, Small, Standard_D1, Standard_D1_v2, Standard_D11, Standard_D11_v2, Standard_D12, Standard_D12_v2, Standard_D13,
Standard_D13_v2, Standard_D14, Standard_D14_v2, Standard_D2, Standard_D2_v2, Standard_D3, Standard_D3_v2, Standard_D4, Standard_D4_v2, Standard_D5_v2}
VirtualMachineRoleSizes : {A5, A6, A7, Basic_A0, Basic_A1, Basic_A2, Basic_A3, Basic_A4, ExtraLarge, ExtraSmall, Large, Medium, Small, Standard_D1, Standard_D1_v2, Standard_D11,
Standard_D11_v2, Standard_D12, Standard_D12_v2, Standard_D13, Standard_D13_v2, Standard_D14, Standard_D14_v2, Standard_D2, Standard_D2_v2, Standard_D3,
Standard_D3_v2, Standard_D4, Standard_D4_v2, Standard_D5_v2}
OperationDescription : Get-AzureService
OperationId : 73d37e69-d3d8-6769-94a2-123456789012
OperationStatus : Succeeded

Get-AzureRole -ServiceName "myDemoApp"

RoleName : WebRole
InstanceCount : 1
DeploymentID : cb4e439907774090be8d123456789012
ServiceName : myDemoApp
OperationDescription : Get-AzureRole
OperationId : 85233c60-f39a-6c01-b51a-123456789012
OperationStatus : Succeeded

RoleName : WorkerRole
InstanceCount : 1
DeploymentID : cb4e439907774090be8d123456789012
ServiceName : myDemoApp
OperationDescription : Get-AzureRole
OperationId : 85233c60-f39a-6c01-b51a-123456789012
OperationStatus : Succeeded

I then modified the role diagnostics config file (diagnostics.wadcfgx) by removing the outer DiagnosticsConfiguration wrapper element

<DiagnosticsConfiguration xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration">
...
</DiagnosticsConfiguration>

plus the PrivateConfig and IsEnabled elements

<PrivateConfig xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration">
   <StorageAccount endpoint="" />
</PrivateConfig>
<IsEnabled>true</IsEnabled> – not certain about this last one

I then uploaded it with the following PowerShell script

$storage_name = "entlib"
$key = "Storage key goes here=="
$config_path = "C:\..\diagnostics.xml"
$service_name = "myDemoApp"

$storageContext = New-AzureStorageContext -StorageAccountName $storage_name -StorageAccountKey $key

Set-AzureServiceDiagnosticsExtension -StorageContext $storageContext -DiagnosticsConfigurationPath $config_path -ServiceName $service_name -Slot Production -Role WebRole

Repeat for the WorkerRole, changing the -Role parameter.

Enterprise Library V6 Data, Exception and Logging with Azure SDK 2.8

I have used the Enterprise Library blocks (which in different forms have been around since 2005) in quite a few projects. Individually the components are pretty good (not always best of breed), but they are well integrated and, when used in the way they were intended to be used, work well.

I have just upgraded a client application to Visual Studio 2015 + .Net 4.5 + Enterprise Library V6 and some of the steps were not immediately obvious so hopefully this saves someone else some time. I have sample code for Azure Cloud Service Web and Worker roles.

For both web and worker roles I added the Azure Diagnostics listener to the listener config section of the enterprise library logging settings.

<loggingConfiguration name="" tracingEnabled="true" defaultCategory="General">
   <listeners>
      <add listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.SystemDiagnosticsTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging, Version=6.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=2.8.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
           name="AzureDiagnosticTraceListener"/>
   </listeners>
...
</loggingConfiguration>
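
With this listener in place (and the log writer set as shown further down), anything written via the Logging block flows into the Azure Diagnostics trace infrastructure. A minimal usage sketch (the category name is assumed to match your configuration):

// using Microsoft.Practices.EnterpriseLibrary.Logging; using System.Diagnostics;
LogEntry logEntry = new LogEntry()
{
   Message = "Worker role starting",
   Severity = TraceEventType.Information,
};
logEntry.Categories.Add("General");
Logger.Write(logEntry);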

I then enabled diagnostics on the role and configured the transfer of logs.

Azure Diagnostics configuration dialog

Azure Diagnostic Configuration

This replaces the DiagnosticMonitorConfiguration-based approach

DiagnosticMonitorConfiguration diagConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();
diagConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

// Enable scheduled transfer
diagConfig.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
diagConfig.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

...
DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagConfig);

For the web role I configured the exception and logging blocks in the Global.asax.cs file


protected void Application_Start()
{
   AreaRegistration.RegisterAllAreas();
   GlobalConfiguration.Configure(WebApiConfig.Register);
   FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
   RouteConfig.RegisterRoutes(RouteTable.Routes);
   BundleConfig.RegisterBundles(BundleTable.Bundles);

   // Load the Entlib logging block configuration
   LogWriterFactory logWriterFactory = new LogWriterFactory();
   LogWriter logWriter = logWriterFactory.Create();
   Logger.SetLogWriter(logWriter);

   // Load the Entlib Exception block configuration
   ExceptionPolicyFactory policyFactory = new ExceptionPolicyFactory();
   exManager = policyFactory.CreateManager();
}

For the worker role I configured the exception and logging blocks in the worker role startup

public override bool OnStart()
{
   // Set the maximum number of concurrent connections
   ServicePointManager.DefaultConnectionLimit = 12;
   ...
   LogWriterFactory logWriterFactory = new LogWriterFactory();
   LogWriter logWriter = logWriterFactory.Create();
   Logger.SetLogWriter(logWriter);
   ...
   return result;
}
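
The exception block setup isn’t shown in the snippet above; a minimal sketch, mirroring the Global.asax.cs configuration:

// Load the Entlib Exception block configuration (mirrors the web role code above)
ExceptionPolicyFactory policyFactory = new ExceptionPolicyFactory();
ExceptionManager exManager = policyFactory.CreateManager();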

Then in the web role’s WebAPI 2 controllers you can use embedded SQL or call stored procedures with retries. (This sample code uses the Northwind database and the default retry configuration.)

public IEnumerable<ProductDto> Get()
{
   var products = new List<ProductDto>();

   WebApiApplication.exManager.Process(() =>
   {
      Database db = new DatabaseProviderFactory().Create("NorthwindInstance");

      RetryPolicy retry = new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(RetryStrategy.DefaultExponential);

      var productAccessor = db.CreateSqlStringAccessor(
         "SELECT [ProductID],[ProductName],[QuantityPerUnit],[UnitPrice],[UnitsInStock],[Discontinued] FROM Products",
         MapBuilder<ProductDto>
            .MapAllProperties()
            .Map(p => p.ID).ToColumn("ProductID")
            .Map(p => p.Name).ToColumn("ProductName")
            .Map(p => p.QuantityPerUnit).ToColumn("QuantityPerUnit")
            .Map(p => p.UnitPrice).ToColumn("UnitPrice")
            .Map(p => p.UnitsInStock).ToColumn("UnitsInStock")
            .Map(p => p.Discontinued).ToColumn("Discontinued")
            .Build());

      products = retry.ExecuteAction(() =>
      {
         return productAccessor.Execute().ToList();
      });
   }, "ProductService");

   return products;
}

public ProductDto Get(int id)
{
   ProductDto productDto = null;

   WebApiApplication.exManager.Process(() =>
   {
      Database db = new DatabaseProviderFactory().Create("NorthwindInstance");

      var productAccessor = db.CreateSqlStringAccessor(
         "SELECT [ProductID],[ProductName],[QuantityPerUnit],[UnitPrice],[UnitsInStock],[Discontinued] FROM Products WHERE [ProductID]=@ProductID",
         new ProdductGetByProductIdParameterMapper(db),
         MapBuilder<ProductDto>
            .MapAllProperties()
            .Map(p => p.ID).ToColumn("ProductID")
            .Map(p => p.Name).ToColumn("ProductName")
            .Map(p => p.QuantityPerUnit).ToColumn("QuantityPerUnit")
            .Map(p => p.UnitPrice).ToColumn("UnitPrice")
            .Map(p => p.UnitsInStock).ToColumn("UnitsInStock")
            .Map(p => p.Discontinued).ToColumn("Discontinued")
            .Build());

      productDto = productAccessor.Execute(id).SingleOrDefault();
   }, "ProductService");

   return productDto;
}
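
The ProdductGetByProductIdParameterMapper class isn’t shown above; a minimal sketch of what it might look like, assuming the standard IParameterMapper interface from the Data block:

// using Microsoft.Practices.EnterpriseLibrary.Data; using System.Data; using System.Data.Common;
class ProdductGetByProductIdParameterMapper : IParameterMapper
{
   readonly Database database;

   public ProdductGetByProductIdParameterMapper(Database database)
   {
      this.database = database;
   }

   public void AssignParameters(DbCommand command, object[] parameterValues)
   {
      // Map the accessor's single argument onto the @ProductID parameter
      database.AddInParameter(command, "@ProductID", DbType.Int32, parameterValues[0]);
   }
}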

Netduino 3 Wifi Queued Azure Event Hub Field Gateway V1.0

My ADSL connection had been a bit flaky, which meant I had lost some sensor data with my initial Azure Event Hub gateway. In an attempt to make the solution more robust, this version of the gateway queues unsent messages using the on-board MicroSD card support.

The code assumes that a file move is an “atomic operation”, so it streams the events received from the devices into a temporary directory (configurable) then moves them to the upload directory (configurable).

This code is a proof of concept and needs to be soak tested; it also needs improved error handling and some additional multi-threading locking, plus the magic constants refactored.

This code is called in the nRF24 receive message handler.

private void OnReceive(byte[] data)
{
   activityLed.Write(!activityLed.Read());

   // Ensure that we have a payload
   if (data.Length < 1 )
   {
      Debug.Print( "ERROR - Message has no payload" ) ;
      return ;
   }

   string message = new String(Encoding.UTF8.GetChars(data));
   Debug.Print("+" + DateTime.UtcNow.ToString("HH:mm:ss") + " L=" + data.Length + " M=" + message);

   string filename = DateTime.UtcNow.ToString("yyyyMMddhhmmssff") + ".txt";

   string tempDirectory = Path.Combine("\\sd", "temp");
   string tempFilePath = Path.Combine(tempDirectory, filename);

   string queueDirectory = Path.Combine("\\sd", "data");
   string queueFilePath = Path.Combine(queueDirectory, filename);

   File.WriteAllBytes(tempFilePath, data);

   File.Move(tempFilePath, queueFilePath);

   new Microsoft.SPOT.IO.VolumeInfo("\\sd").FlushAll();
}

A timer initiates the upload process, which uses the AMQPNetLite library.

bool UploadInProgress = false;

      
void uploaderCallback(object state)
{
   Debug.Print("uploaderCallback - start");

   if (UploadInProgress)
   {
      return;
   }
   UploadInProgress = true;

   string[] eventFilesToSend = Directory.GetFiles(Path.Combine("\\sd", "data")) ;

   if ( eventFilesToSend.Length == 0 )
   {
      Debug.Print("uploaderCallback - no files");
      UploadInProgress = false;
      return ;
   }

   try
   {
      Debug.Print("uploaderCallback - Connect");
      Connection connection = new Connection(new Address(serviceBusHost, serviceBusPort, serviceBusSasKeyName, serviceBusSasKey));

      Session session = new Session(connection);

      SenderLink sender = new SenderLink(session, "send-link", eventHubName);

      for (int index = 0; index < System.Math.Min(eventUploadBatchSizeMaximum, eventFilesToSend.Length); index++)
      {
         string eventFile = eventFilesToSend[ index ] ;

         Debug.Print("-" + DateTime.UtcNow.ToString("HH:mm:ss") + " " + eventFile ); ;

         Message message = new Message()
         {
            BodySection = new Data()
            {
               Binary = File.ReadAllBytes(eventFile),
            },
         ApplicationProperties = new Amqp.Framing.ApplicationProperties(),
         };

         FileInfo fileInfo = new FileInfo(eventFile);

         message.ApplicationProperties["AcquiredAtUtc"] = fileInfo.CreationTimeUtc;
         message.ApplicationProperties["UploadedAtUtc"] = DateTime.UtcNow;
         message.ApplicationProperties["GatewayId"] = gatewayId;
         message.ApplicationProperties["DeviceId"] = deviceId;
         message.ApplicationProperties["EventId"] = Guid.NewGuid();

         sender.Send(message);

         File.Delete(eventFile);

         new Microsoft.SPOT.IO.VolumeInfo("\\sd").FlushAll();
      }

      sender.Close();
      session.Close();
      connection.Close();
   }
   catch (Exception ex)
   {
      Debug.Print("ERROR: Upload failed with error: " + ex.Message);
   }
   finally
   {
      Debug.Print("uploaderCallback - finally");
      UploadInProgress = false;
   }
}
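
The timer setup isn’t shown above; a minimal sketch of how the callback might be wired up (the due time and period here are just examples, in the real code they come from the gateway configuration):

// using System.Threading;
Timer uploaderTimer = new Timer(uploaderCallback, null, TimeSpan.FromSeconds(30), TimeSpan.FromSeconds(30));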

The timer period and the number of files uploaded in each batch are configurable. I need to test the application to see how it handles power outages and MicroSD card corruption. The source is Netduino NRF24L01 AMQPNetLite Queued Azure EventHub Gateway, with all the usual caveats.

This project wouldn’t have been possible without

Netduino 3 Wifi Pollution Sensor Part 1

I am working on a Netduino 3 Wifi based version of my original concept as a STEM project for high school students. I wanted to be able to upload data to a Microsoft Azure Event Hub or other HTTPS-secured RESTful endpoint (e.g. xivelyIOT) to show how to build a securable solution. This meant a Netduino 3 Wifi device was necessary, as its TI CC3100 does all the crypto processing.

The aim was to (over a number of blog posts) build a plug ‘n play box, initially for measuring airborne particulates, then over time add more sensors, e.g. atmospheric gas concentrations (Grove multichannel gas sensor), an accelerometer for earthquake early warning/monitoring (Grove 3-Axis Digital Accelerometer), etc.

Netduino 3 Wifi based pollution sensor

Bill of materials for the prototype (all prices as at October 2015)

  • Netduino 3 Wifi USD69.95
  • Seeedstudio Grove base shield V2 USD8.90
  • Seeedstudio Grove smart dust sensor USD16.95
  • Seeedstudio Grove Temperature & Humidity Sensor pro USD14.90
  • Seeedstudio ABS outdoor waterproof case USD1.65
  • Seeedstudio Grove 4 pin female to Grove 4 pin conversion cable USD3.90
  • Seeedstudio Grove 4 pin buckled 5CM cable USD1.90

After the first assembly I realised the box is a bit small. There is not a lot of clearance around the Netduino board (largely due to the go!bus connectors on the end making it a bit larger than a standard *duino board) and the space for additional sensors is limited, so I will need to source a larger enclosure.

The dust sensor doesn’t come with a cable so I used the conversion cable instead. NOTE – the pins on the sensor are numbered right-to-left rather than left-to-right.

The first step is to get the temperature and humidity sensor working with my driver code, then adapt the Seeedstudio Grove-Dust sensor code for the dual outputs of the SM-PWM-01 device.

According to the SM-PWM-01A device datasheet, the P1 output is for small particles < 1µm (smoke) and the P2 output is for large particles > 2µm (dust). The temperature & humidity sensor is included in the first iteration as other researchers have indicated that humidity levels can affect the accuracy of optical particle counters.
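
As a starting point, a rough sketch (pin assignment assumed) of timing the low pulses on one of the outputs with a .NET Micro Framework interrupt port; the Grove-Dust sample uses the ratio of this low-pulse occupancy to the sample period to estimate the particle concentration.

// Rough sketch only - pin assignment assumed, concentration calculation not shown
InterruptPort p1SmallParticle = new InterruptPort(Pins.GPIO_PIN_D2, false,
   Port.ResistorMode.Disabled, Port.InterruptMode.InterruptEdgeBoth);

DateTime p1PulseStartedAtUtc = DateTime.MinValue;
TimeSpan p1LowPulseOccupancy = TimeSpan.Zero;

p1SmallParticle.OnInterrupt += (data1, data2, time) =>
{
   if (data2 == 0) // falling edge - low pulse starting
   {
      p1PulseStartedAtUtc = time;
   }
   else // rising edge - low pulse finished, accumulate the occupancy
   {
      p1LowPulseOccupancy += time - p1PulseStartedAtUtc;
   }
};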

Then, once the sensors are working as expected, I will integrate a cut-back version of the AMQPNetLite code and the configuration storage code I wrote for my Netduino 3 Wifi Azure EventHub Field Gateway.