HTTP Headers request reduction

The HTTP protocol is pretty lightweight, but in some cases the overhead can impact performance, operational costs, scalability, etc. So I fired up Fiddler to have a look at what was going on.

Request – Bytes Sent: 305
POST http://gpstrackerhttpheaders.cloudapp.net/posV1.aspx HTTP/1.1
x-DeviceMacAddress: 5C-86-4A-00-3F-63
x-3DFix: True
x-GPSTime: 2011 06 01 01:52:05
x-Latitude: -43.XXXXX
x-Longitude: 172.XXXXX
x-HDoP: 0.83
x-Altitude: 24.1
x-Speed: 0
x-Heading: 0
Content-Length: 0
Connection: Keep-Alive
Host: gpstrackerhttpheaders.cloudapp.net

Response – Bytes Received: 278
HTTP/1.1 200 OK
Cache-Control: private
Server: Microsoft-IIS/7.0
Set-Cookie: ASP.NET_SessionId=2giugnxvke4iv3vtekln1n0k; path=/; HttpOnly
x-UpdateIntervalSecs: 30
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Sat, 19 Jan 2013 10:00:49 GMT
Content-Length: 0

The first step was to shorten the header names (this could be taken to extremes with very short names and a limited number of headers), which reduced the size of the request, and to remove the ASP.NET session state information from the response.
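The V2 source isn't shown in this post, but the request side is just the V1 code with shorter header names; a minimal sketch, assuming the same Gps properties as the V1 client:

```csharp
using (HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(CloudServerUrl))
{
   // V2: identical to V1 except for the shortened header names
   request.Method = "POST";
   request.Headers.Add("x-ID", DeviceMacAddress());
   request.Headers.Add("x-3D", Gps.Fix3D.ToString());
   request.Headers.Add("x-Time", DateTime.Now.ToString("yyyy MM dd hh:mm:ss"));
   request.Headers.Add("x-Lat", Gps.Latitude.ToString("F5"));
   request.Headers.Add("x-Lon", Gps.Longitude.ToString("F5"));
   request.Headers.Add("x-HDoP", Gps.HDoP.ToString("F2"));
   request.Headers.Add("x-Alt", Gps.Altitude.ToString("F1"));
   request.Headers.Add("x-Spd", Gps.Kmh.ToString("F0"));
   request.Headers.Add("x-Hdg", Gps.TrackAngle.ToString("F0"));
   request.ContentLength = 0;
   // response handling as per V1
}
```

On the server side the Set-Cookie header disappears once session state is switched off, for example with `<sessionState mode="Off"/>` in web.config or `EnableSessionState="False"` in the page directive.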

Request – Bytes Sent: 263
POST http://gpstrackerhttpheaders.cloudapp.net/posV2.aspx HTTP/1.1
x-ID: 5C-86-4A-00-3F-63
x-3D: True
x-Time: 2011 06 01 02:16:26
x-Lat: -43.XXXXX
x-Lon: 172.XXXXX
x-HDoP: 0.92
x-Alt: 25.0
x-Spd: 1
x-Hdg: 0
Content-Length: 0
Connection: Keep-Alive
Host: gpstrackerhttpheaders.cloudapp.net

Response – Bytes Received: 217
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/html
Server: Microsoft-IIS/7.0
x-UpdMin: 30
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Sat, 19 Jan 2013 10:25:12 GMT
Content-Length: 0

With minimal effort the combined payloads were reduced by roughly 90 bytes (V1: 573 bytes, V2: 480 bytes).

HTTP Headers baseline

Using the HttpWebRequest functionality in System.Http and the HTTP headers to upload the GPS position data is a pretty simple, low-code approach. For this initial version I’m uploading the following:

  • Device Mac Address
  • 2D vs 3D fix
  • Latitude
  • Longitude
  • Horizontal dilution of position (HDoP)
  • Altitude
  • Speed
  • Heading

The server responds with

  • Minimum time between position reports

This approach does have some disadvantages.

Adding System.Http increases the size of the download by roughly 40K, which on the Netduino Plus could be a problem. The HTTP requests and responses can also be a bit chunky. The HttpWebRequest.GetResponse call is synchronous, which could cause issues with the processing of the GPS NMEA data stream (particularly when there are connectivity problems).

I’ll be looking at solutions to these issues in future posts.
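One mitigation for the synchronous GetResponse call, sketched here as a hypothetical (UploadPosition is a made-up wrapper around the HttpWebRequest code in the next section), is to run the upload on its own thread so the NMEA processing isn’t blocked:

```csharp
// Hypothetical sketch: run the blocking upload on a worker thread so a slow
// or failed request doesn't stall the GPS NMEA processing loop.
Thread uploadThread = new Thread(delegate()
{
   try
   {
      UploadPosition(); // made-up wrapper around the HttpWebRequest code
   }
   catch (WebException ex)
   {
      Debug.Print("Upload failed: " + ex.Message);
   }
});
uploadThread.Start();
```

The trade-off is that position reports can now queue up or arrive out of order when connectivity is poor, so some throttling would be needed in a real client.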

Client application

using (HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(CloudServerUrl))
{
   request.Method = "POST";
   request.Headers.Add("x-DeviceMacAddress", DeviceMacAddress());
   request.Headers.Add("x-3DFix", Gps.Fix3D.ToString());
   // Note: "hh" is a 12-hour value; "HH" (24-hour) would be unambiguous,
   // but the format must match the server's ParseExact format
   request.Headers.Add("x-GPSTime", DateTime.Now.ToString("yyyy MM dd hh:mm:ss"));
   request.Headers.Add("x-Latitude", Gps.Latitude.ToString("F5"));
   request.Headers.Add("x-Longitude", Gps.Longitude.ToString("F5"));
   request.Headers.Add("x-HDoP", Gps.HDoP.ToString("F2"));
   request.Headers.Add("x-Altitude", Gps.Altitude.ToString("F1"));
   request.Headers.Add("x-Speed", Gps.Kmh.ToString("F0"));
   request.Headers.Add("x-Heading", Gps.TrackAngle.ToString("F0"));
   request.ContentLength = 0;

   using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
   {
      Debug.Print(" HTTP Status:" + response.StatusDescription);

      if (response.StatusCode == HttpStatusCode.OK)
      {
         if (response.Headers["x-UpdateIntervalSecs"] != null)
         {
            positionUpdateIntervalMinimum = new TimeSpan(0, 0, int.Parse(response.Headers["x-UpdateIntervalSecs"]));
         }
      }
   }
}

On the server side

protected void Page_Load(object sender, EventArgs e)
{
   if (this.Request.Headers["x-DeviceMacAddress"] == null)
   {
      return;
   }
   string deviceMacAddress = this.Request.Headers["x-DeviceMacAddress"];

   if (this.Request.Headers["x-GPSTime"] == null)
   {
      return;
   }
   // "hh" matches the 12-hour value the client sends; "HH" on both ends would be unambiguous
   DateTime gpsTime = DateTime.ParseExact(this.Request.Headers["x-GPSTime"], "yyyy MM dd hh:mm:ss", CultureInfo.InvariantCulture);

   if (this.Request.Headers["x-3DFix"] == null)
   {
      return;
   }
   bool is3DFix = bool.Parse(this.Request.Headers["x-3DFix"]);

   if (this.Request.Headers["x-Latitude"] == null)
   {
      return;
   }
   double latitude = double.Parse(this.Request.Headers["x-Latitude"]);

   if (this.Request.Headers["x-Longitude"] == null)
   {
      return;
   }
   double longitude = double.Parse(this.Request.Headers["x-Longitude"]);

   if (this.Request.Headers["x-HDoP"] == null)
   {
      return;
   }
   double hDoP = double.Parse(this.Request.Headers["x-HDoP"]);

   if (this.Request.Headers["x-Altitude"] == null)
   {
      return;
   }
   double altitude = double.Parse(this.Request.Headers["x-Altitude"]);

   if (this.Request.Headers["x-Speed"] == null)
   {
      return;
   }
   int speed = int.Parse(this.Request.Headers["x-Speed"]);

   if (this.Request.Headers["x-Heading"] == null)
   {
      return;
   }
   int heading = int.Parse(this.Request.Headers["x-Heading"]);

   // Writing to Response.Headers requires the IIS 7 integrated pipeline;
   // this.Response.AddHeader(...) also works in classic mode
   this.Response.Headers.Add("x-UpdateIntervalSecs", "30");
}

Source for HTTP Headers Client V1 and HTTP Headers Service V1

NetMF cloud connectivity options

I do most of my dev work with NetMF devices and Windows Azure, so I thought it would be useful and interesting to explore the different cloud connectivity options.

I’ll start with the simplest possible HTTP based approach, look at how to reduce the device memory footprint, reduce traffic on the wire, secure the data, and take the solution mobile.

Then, as time allows I’ll build clients which use

The initial scenario is a GPS-equipped NetMF device reporting position information to an application running in Windows Azure. Rather than reinventing the wheel I have used the NetMF Toolbox NMEA GPS module, but have added HDoP reporting.

The samples will run on a Netduino, Netduino Plus or FEZ Spider.

For the Netduino-based examples the BoM is

Writing to an Azure Storage Queue from a Micro Framework device

In my TechEd presentation I demoed writing messages into an Azure Storage Queue from a Netduino Plus. I have knocked up a couple of basic Netduino Plus samples to show how this works. The code was inspired by these articles:

http://azurestoragesamples.codeplex.com/
http://soumya.wordpress.com/2010/05/21/azure-simplified-part-5-using-rest-api-for-azure-queue-storage/
http://convective.wordpress.com/2010/08/18/examples-of-the-windows-azure-storage-services-rest-api/
http://brentdacodemonkey.wordpress.com/2009/04/16/azure-storage-%E2%80%93-hands-on-with-queues-part-1/
http://msdn.microsoft.com/en-us/magazine/jj190807.aspx

Both versions have a dependency on a third-party SHA implementation (I really didn’t want to implement that myself), of which there are several available for the .Net MF. I have used the ones from inControl and ElzeKool successfully.

There are two versions: one which has a dependency on the .Net MF Common Extensions for basic string manipulation functionality, and a low-dependency version which uses only what is available in the .Net MF framework.
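For context, the part of the request that needs the SHA implementation is the authentication signature. A rough sketch of Shared Key Lite style signing for the queue service follows; the account name, key, queue path and the Hmacsha256 helper are all placeholders, and on the .Net MF the HMAC-SHA256 comes from whichever third-party library you pick:

```csharp
// Sketch of Shared Key Lite signing for an Azure Storage Queue REST request.
string account = "myaccount";                 // placeholder account name
string accessKey = "base64key==";             // placeholder access key
string date = DateTime.UtcNow.ToString("R");  // RFC1123 date

string canonicalizedHeaders = "x-ms-date:" + date + "\nx-ms-version:2009-09-19\n";
string canonicalizedResource = "/" + account + "/myqueue/messages"; // placeholder queue

// Shared Key Lite string-to-sign: VERB, Content-MD5, Content-Type,
// Date (empty because x-ms-date is supplied), headers, resource
string stringToSign = "POST\n\ntext/plain\n\n" + canonicalizedHeaders + canonicalizedResource;

byte[] key = Convert.FromBase64String(accessKey);
byte[] mac = Hmacsha256(key, Encoding.UTF8.GetBytes(stringToSign)); // third-party HMAC-SHA256
string authorization = "SharedKeyLite " + account + ":" + Convert.ToBase64String(mac);
```

The resulting value goes into the request’s Authorization header, alongside the x-ms-date and x-ms-version headers that were included in the string-to-sign.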

Code is available from here

EntLib Logging in Azure Pt 3 – Working

After some more reading (starting here) I added a counter to the Run method so I could see if log entries were getting dropped. My gut feeling was correct: they were.

public override void Run()
{
   int index = 0;
   Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("Run Start " + DateTime.UtcNow.ToLongTimeString());

   while (true)
   {
      Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("Run Loop " + index.ToString() + " " + DateTime.UtcNow.ToLongTimeString());
      index++;
      Thread.Sleep(10000);
   }
}

So after some trial and error I settled on this initialisation approach as the most robust:

public override bool OnStart()
{
   CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
   {
      configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
   });

   CloudStorageAccount storageAccount = CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"));
   RoleInstanceDiagnosticManager roleInstanceDiagnosticManager = storageAccount.CreateRoleInstanceDiagnosticManager(RoleEnvironment.DeploymentId, RoleEnvironment.CurrentRoleInstance.Role.Name, RoleEnvironment.CurrentRoleInstance.Id);

   var configuration = roleInstanceDiagnosticManager.GetCurrentConfiguration();
   configuration.Logs.BufferQuotaInMB = 4;
   configuration.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
   configuration.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

   roleInstanceDiagnosticManager.SetCurrentConfiguration(configuration);

   EnterpriseLibraryContainer.Current = EnterpriseLibraryContainer.CreateDefaultContainer(new FileConfigurationSource("web.config", false));

   Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("OnStart " + DateTime.UtcNow.ToLongTimeString());

   return base.OnStart();
}
This sample application seems to work and reliably logs all the information I expect to the trace logs.

EntLib Logging in Azure Pt 2 – All is not what it seems

After confirming my environment was good, I modified the sample code and added logging to the OnStart; I also added a Run override with some logging and a simple while loop.

public override bool OnStart()
{
   CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
   {
      configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
   });

   var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();
   configuration.Logs.BufferQuotaInMB = 4;
   configuration.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
   configuration.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
   DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", configuration);

   EnterpriseLibraryContainer.Current = EnterpriseLibraryContainer.CreateDefaultContainer(new FileConfigurationSource("web.config", false));

   Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("OnStart " + DateTime.UtcNow.ToLongTimeString());

   return base.OnStart();
}

public override void Run()
{
   Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("Run Start " + DateTime.UtcNow.ToLongTimeString());

   while (true)
   {
      Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("Run Loop " + DateTime.UtcNow.ToLongTimeString());
      Thread.Sleep(10000);
   }

   // unreachable while the loop above runs forever
   base.Run();
}

Initially the webrole wouldn’t start; after re-reading the full IIS vs. core post I realised that the EntLib couldn’t find its config file, so I added

EnterpriseLibraryContainer.Current = EnterpriseLibraryContainer.CreateDefaultContainer(new FileConfigurationSource("web.config", false));

I could then see some log entries (note the process name…)

<TraceSource>General</TraceSource>

But my OnStart logging wasn’t working, and when I looked at the timestamps on the log entries it felt like some were missing. Time for some more reading…

Entlib Logging in Azure Pt 1 – The baseline

Back when the Azure SDK V1.3 was released, several of my cloud services broke due to the change from core to full IIS. I use the Enterprise Library, which was also broken by this update. The Azure team talked about the changes in detail here and SMarx talked about how to fix your configuration problems here. My customer was in a hurry so I just put in the quick ‘n’ dirty hack and moved on. Every so often the warnings in the error list tab would annoy me, so I would go back and have another look.

warning CloudServices078: The web role XXX is configured using a legacy syntax that specifies that it runs in Hostable Web Core.

One of these webroles was an SMS gateway which polled an Azure Storage queue in its Run method for messages to send, and had two simple ASP.NET pages which inserted inbound messages and delivery notifications into Azure Storage queues for processing by a queue-handler worker role.
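The queue-polling side of that gateway boils down to a Run override along these lines. This is a sketch against the StorageClient library of that era, not the actual gateway code; the connection-string name, queue name and SendSms call are all made up:

```csharp
public override void Run()
{
   // Hypothetical sketch: poll the outbound SMS queue and send each message
   CloudStorageAccount account = CloudStorageAccount.Parse(
      RoleEnvironment.GetConfigurationSettingValue("DataConnectionString")); // placeholder setting name
   CloudQueue queue = account.CreateCloudQueueClient().GetQueueReference("smsoutbound"); // placeholder queue
   queue.CreateIfNotExist();

   while (true)
   {
      CloudQueueMessage message = queue.GetMessage();
      if (message != null)
      {
         SendSms(message.AsString); // placeholder SMS provider call
         queue.DeleteMessage(message);
      }
      else
      {
         Thread.Sleep(TimeSpan.FromSeconds(10));
      }
   }
}
```

Getting a message makes it invisible rather than removing it, so the explicit DeleteMessage after a successful send is what stops a failed instance from losing messages.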

The problem was that at 5 to 15 minutes for a deploy it was easy to get distracted, and with the release of the Azure Integration Pack nearly 2 years later I figured it was time to sort this out properly. I wanted to be able to use the EntLib logging etc. in the OnStart, Run and ASP.NET pages of a webrole. There was some useful reading here and here.

To reduce the application to a manageable size I started with the Azure Integration Pack logging hands-on lab; this is the key section of WebRole.cs:

public override bool OnStart()
{
   CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
   {
      configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
   });

   var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();
   configuration.Logs.BufferQuotaInMB = 4;
   configuration.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
   configuration.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
   DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", configuration);

   // For information on handling configuration changes
   // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.

   return base.OnStart();
}

I built the application, deployed it into the cloud, and everything worked as expected when I went to the default.aspx page and clicked on the button. I use Cerebrata Diagnostics Manager, and with this I could see log entries appearing.

Microsoft TechEd NZ 2012

I spoke at TechEd this year. My topic was the Internet of Things and Windows Azure.

(http://channel9.msdn.com/Events/TechEd/NewZealand/TechEd-New-Zealand-2012/AZR302)

Presentation abstract: Ericsson CEO Hans Vestberg estimates 50 billion devices will be connected to the Web by 2020. That’s seven devices for each human on the planet. Writing embedded software which has to interface to data acquisition devices, and building scalable Internet of Things applications, has historically been hard. The .Net Micro Framework makes embedded development easy for .Net developers. Coupled with Windows Azure, developing a scalable Internet of Things application is much easier. Using Windows Azure and the .Net Micro Framework I will give a practical demonstration of how easily you can use the Microsoft stack to construct an Internet of Things application. The application will acquire data from a selection of sensors, upload it to Azure for processing, and keep users updated in close to real-time. Come and see what you can build with a GPS, an Accelerometer, a Netduino Plus, SignalR and Windows Azure.

The scenario was QuakeZure – Commodity Hardware and cloud computing for earthquake early warning.

My feedback scores out of 4:

AZR302 Azure, the Platform for Internet of Things Applications
06/9/2012 16:30, Epsom Room. Speaker(s): Bryn Lewis

Num. Submitted: 34
Session Content: 3.44
Session Presenter: 3.82
Overall Session: 3.59

Technical level of this session:
Just Right: 91.18%
Not Technical Enough: 5.88%
Too Technical: 2.94%