MEMS Accelerometers – Issues

The BMA180 looked ideal, with good sensitivity etc., but I was wondering why the Bosch accelerometer page only had a flyer and no data sheet.

The device has been discontinued, so I need to do further research. I think the Kionix KXCJ9 or Freescale MMA8451 are worth a look.

I am also considering an integrated gyroscope + accelerometer device like the InvenSense MPU-6000 series and only using the accelerometers.

MEMS Accelerometers – Options

I have ordered an Analog Devices ADXL345 Seeedstudio twig and the specs look positive.

The next range of sensors to look at is from Bosch (most probably the BMA180). I'll have to see what I can get on a breakout board, but I will most probably need a logic level converter as well.

Seeedstudio have a wish list; I might see if I can get them interested…

I need to do this without blowing the toys budget, as I paid provisional tax tonight.

MEMS Accelerometers – Trade-offs

I am spending a lot of time looking at consumer, industrial & military grade accelerometer specification sheets to see what is available/suitable. As soon as I go beyond strong motion & early warning objectives the accelerometer device cost gets prohibitive. Though I might buy a couple of the more expensive sensors and look at providing optional driver support for them.

I have been looking mainly at I2C-connected devices so that I can avoid dealing with Analog to Digital converters etc. I might have to buy an analog device like the ADXL335 and see how well it works with the processor boards I am planning to use. The I2C interface may have some limitations for maximum sampling speed, but the onboard queues in some of the devices look useful.
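As a rough sanity check on the sampling-speed concern, here is a back-of-envelope sketch (the bus clock, byte counts and overhead figures are illustrative assumptions, not measured values for any particular device):

```csharp
using System;

static class I2cBudget
{
    // Rough upper bound on samples/second for a polled I2C accelerometer.
    // Each byte on the bus costs about 9 clock cycles (8 data bits + ACK),
    // and each sample read carries some addressing/register-pointer overhead.
    public static int MaxSampleRateHz(int busClockHz, int dataBytesPerSample, int overheadBytesPerSample)
    {
        int clocksPerSample = (dataBytesPerSample + overheadBytesPerSample) * 9;
        return busClockHz / clocksPerSample;
    }
}
```

With a 400 kHz fast-mode bus, 6 data bytes per 3-axis sample and ~3 bytes of addressing overhead, this gives roughly 400000 / 81 ≈ 4,900 samples/s, well above the 100–200 Hz typical of strong-motion recording, which suggests the onboard queues matter more for smoothing out processing latency than for raw throughput.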

I have been using a SeeedStudio Grove twig based on the Freescale MMA7660FC, but they also have another twig based on the Analog Devices ADXL345 and I will include a couple in my next order. I am looking for a daughter board based on the ADXL103 single-axis or ADXL203 dual-axis device.

Other wildcard ideas include a DIY seismometer project.

QuakeZure – Elevator pitch

Over the last couple of years my home town of Christchurch has been badly damaged by a series of earthquakes. The Alpine Fault, which runs along the Southern Alps, is due to rupture at any time. The impact of the Alpine Fault rupturing on the Otago region is discussed here. Based on the Christchurch experience, even tens of seconds of warning could save many lives.

An earthquake generates P-Waves & S-Waves which travel at roughly 2–8 km/s (S-Waves travel at roughly 60% of the speed of P-Waves), so by using complex event analysis software running in the cloud and a dense array of commodity sensors it should be possible to provide tens of seconds of warning, depending on how far away the epicentre is located.
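To make the arithmetic concrete, here is a minimal sketch of the travel-time difference (the distance and wave speeds below are illustrative assumptions, and detection and notification latency are ignored):

```csharp
using System;

static class QuakeWarning
{
    // Crude warning-time estimate: the damaging S-Wave arrives later than the
    // P-Wave, so the available warning is the difference in travel times from
    // the epicentre.
    public static double WarningSeconds(double distanceKm, double pWaveKmPerSec, double sWaveKmPerSec)
    {
        return distanceKm / sWaveKmPerSec - distanceKm / pWaveKmPerSec;
    }
}
```

For an epicentre 150 km away with a P-Wave speed of 8 km/s and an S-Wave speed of 4.8 km/s (60%), this gives 150/4.8 − 150/8 = 12.5 seconds of warning, and the margin grows linearly with distance.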

My TechEd 2012 presentation demonstrated a proof of concept of how you could use commodity embedded hardware & cloud computing resources to provide early warning of earthquakes.

In my presentation I talk about how a dense network of sensors could be deployed by using ADSL cabinets, cell sites, police stations, telephone exchanges etc. (basically anywhere with robust power and communications). In addition, private individuals could contribute by hosting devices in their homes, like the Quake Catcher Network.

There are already systems available for providing early warning, including at least one locally developed one. But in a country like New Zealand, where population density is low, getting a sufficiently dense network of conventional seismic grade sensors would be expensive.

I’m not a Seismologist and I’m struggling to find one who has the time to listen, but I think the problem is solvable by using existing technologies in new and innovative ways.

It wouldn’t take a lot of cash to build a working prototype and deploy some units into the field. Other developers and Microsoft have offered development and cloud computing resources. I just need a seismologist to help with validating the algorithms and approach.

So if you know a seismologist who could help…

QuakeZure device

This is the prototype QuakeZure device for detecting P-Waves and notifying the Azure back end.

The kit cost about USD 120 and would get cheaper in quantity. Organisations like Seeedstudio can take a concept built with the Grove prototyping kit I used and organise production engineering of a real product if volumes are sufficient.

Netduino Plus USD 57.95
Grove Base shield USD 9.90
Grove Accelerometer USD 12.90
Grove GPS USD 39.90

I have been looking at other MEMS devices from Freescale, ST Microelectronics, Analog Devices, Memsic plus a few others. The key issue is that as soon as you go beyond consumer grade accelerometers the price rapidly rises to several times the cost of the rest of the kit.

For a production system you would most probably use something like the GHI G120 Module (USD 37.39) on a custom board with the necessary power supply, connectivity & sensors mounted on it.

PoC QuakeZure Client on my desk

Writing to an Azure Storage Queue from a Micro Framework device

In my TechEd presentation I demoed writing messages into an Azure Storage Queue from a Netduino Plus. I have knocked up a couple of basic Netduino Plus samples to show how this works. The code was inspired by these articles:

http://azurestoragesamples.codeplex.com/
http://soumya.wordpress.com/2010/05/21/azure-simplified-part-5-using-rest-api-for-azure-queue-storage/
http://convective.wordpress.com/2010/08/18/examples-of-the-windows-azure-storage-services-rest-api/
http://brentdacodemonkey.wordpress.com/2009/04/16/azure-storage-%E2%80%93-hands-on-with-queues-part-1/
http://msdn.microsoft.com/en-us/magazine/jj190807.aspx

There are two versions: one which has a dependency on the .Net MF Common extensions for basic string manipulation functionality, and a low-dependency version which uses only what is available in the .Net MF framework.

Both versions have a dependency on a third-party SHA implementation (I really didn’t want to implement that myself), of which there are several available for the .Net MF. I have used the ones from inControl & ElzeKool successfully.
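The heart of both samples is signing the REST request. As a minimal sketch (written against the full .NET Framework’s built-in HMACSHA256 for illustration; on the Micro Framework the HMAC would come from one of the third-party SHA implementations mentioned above, and the exact canonicalisation rules are defined by the Storage REST documentation), the Shared Key Lite scheme looks roughly like this:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

static class QueueAuth
{
    // Builds a Shared Key Lite Authorization header value for the queue REST API.
    // The string-to-sign concatenates the HTTP verb, optional content headers,
    // the date, and the canonicalized headers/resource, then is HMAC-SHA256
    // signed with the base64-decoded storage account key.
    public static string SharedKeyLite(string account, string base64Key,
        string verb, string date, string canonicalizedHeaders, string canonicalizedResource,
        string contentType = "", string contentMd5 = "")
    {
        string stringToSign = verb + "\n" + contentMd5 + "\n" + contentType + "\n" + date + "\n"
            + canonicalizedHeaders + canonicalizedResource;

        using (var hmac = new HMACSHA256(Convert.FromBase64String(base64Key)))
        {
            string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
            return "SharedKeyLite " + account + ":" + signature;
        }
    }
}
```

The resulting value goes into the HTTP Authorization header of the queue request, and the date passed in must match the x-ms-date header actually sent on the wire.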

Code is available from here

EntLib Logging in Azure Pt 3 – Working

After some more reading, starting here, I added a counter to the Run loop so I could see if log entries were getting dropped (my gut feel was correct, they were):

public override void Run()
{
    int index = 0;
    Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("Run Start " + DateTime.UtcNow.ToLongTimeString());

    while (true)
    {
        Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("Run Loop " + index.ToString() + " " + DateTime.UtcNow.ToLongTimeString());
        index++;
        Thread.Sleep(10000);
    }
}

So after some trial and error I settled on this initialisation approach as the most robust:

public override bool OnStart()
{
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    });

    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"));
    RoleInstanceDiagnosticManager roleInstanceDiagnosticManager = storageAccount.CreateRoleInstanceDiagnosticManager(RoleEnvironment.DeploymentId, RoleEnvironment.CurrentRoleInstance.Role.Name, RoleEnvironment.CurrentRoleInstance.Id);

    var configuration = roleInstanceDiagnosticManager.GetCurrentConfiguration();
    configuration.Logs.BufferQuotaInMB = 4;
    configuration.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
    configuration.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

    roleInstanceDiagnosticManager.SetCurrentConfiguration(configuration);

    EnterpriseLibraryContainer.Current = EnterpriseLibraryContainer.CreateDefaultContainer(new FileConfigurationSource("web.config", false));

    Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("OnStart " + DateTime.UtcNow.ToLongTimeString());

    return base.OnStart();
}

This sample application seems to work and reliably logs all the information I expect to the trace logs.

EntLib Logging in Azure Pt 2 – All is not what it seems

After confirming my environment was good I modified the sample code and added logging to the OnStart; I also added a Run method with some logging and a simple while loop.

public override bool OnStart()
{
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    });

    var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();
    configuration.Logs.BufferQuotaInMB = 4;
    configuration.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
    configuration.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", configuration);

    EnterpriseLibraryContainer.Current = EnterpriseLibraryContainer.CreateDefaultContainer(new FileConfigurationSource("web.config", false));

    Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("OnStart " + DateTime.UtcNow.ToLongTimeString());

    return base.OnStart();
}

public override void Run()
{
    Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("Run Start " + DateTime.UtcNow.ToLongTimeString());

    while (true)
    {
        Microsoft.Practices.EnterpriseLibrary.Logging.Logger.Write("Run Loop " + DateTime.UtcNow.ToLongTimeString());
        Thread.Sleep(10000);
    }

    base.Run();
}

Initially the webrole wouldn’t start, and after re-reading the full IIS vs. core post I realised that the EntLib couldn’t find its config file, so I added:

EnterpriseLibraryContainer.Current = EnterpriseLibraryContainer.CreateDefaultContainer(new FileConfigurationSource("web.config", false));

I could then see some log entries (note the process name…)

<TraceSource>General</TraceSource>

But my OnStart logging wasn’t working, and when I looked at the timestamps on the log entries it felt like some were missing. Time for some more reading…

Entlib Logging in Azure Pt 1 – The baseline

Back when the Azure SDK V1.3 was released several of my cloud services broke due to the change from Hosted Web Core to full IIS. I use the Enterprise Library, which was also broken by this update. The Azure team talked about the changes in detail here and SMarx talked about how to fix your configuration problems here. My customer was in a hurry, so I just put in the quick ’n’ dirty hack and moved on. Every so often the warnings in the error list tab would annoy me, so I would go back and have another look.

warning CloudServices078: The web role XXX is configured using a legacy syntax that specifies that it runs in Hostable Web Core.

One of these webroles was an SMS gateway which polled an Azure Storage queue for messages to send in its Run method, and had two simple ASP.NET pages which inserted inbound messages and delivery notifications into Azure Storage queues for processing by a queue handler worker role.

The problem was that at 5–15 minutes for a deploy it was easy to get distracted, and with the release of the Azure Integration Pack nearly 2 years later I figured it was time to sort this properly. I wanted to be able to use the EntLib logging etc. in the OnStart, Run and ASP.NET pages of a webrole. There was some useful reading here and here.

To reduce the application to a manageable size I started with the Azure Integration Pack logging hands-on lab, and this is the key section of webrole.cs:

public override bool OnStart()
{
    CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
    {
        configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
    });

    var configuration = DiagnosticMonitor.GetDefaultInitialConfiguration();
    configuration.Logs.BufferQuotaInMB = 4;
    configuration.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
    configuration.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", configuration);

    // For information on handling configuration changes
    // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.

    return base.OnStart();
}

I built the application and deployed it into the cloud, and everything worked as expected when I went to the default.aspx page and clicked on the button. I use Cerebrata Diagnostics Manager, and with this I could see log entries appearing.