Fiddler Composer with the image field name and upload file button highlighted
The current implementation only supports the uploading of one image at a time, in a field called “image”.
Fiddler console after successful upload
This implementation supports a “Content-Type” of “application/octet-stream” or “image/jpeg”.
[HttpPost("{id}/image")]
public async Task<ActionResult> Upload([FromRoute(Name = "id")][Range(1, int.MaxValue, ErrorMessage = "StockItem id must greater than 0")] int id, [FromForm] IFormFile image)
{
if (image == null)
{
return this.BadRequest("Image image file missing");
}
if (image.Length == 0)
{
return this.BadRequest("Image image file is empty");
}
if ((string.Compare(image.ContentType, "application/octet-stream",true) != 0) && (string.Compare(image.ContentType, "image/jpeg", true) != 0))
{
return this.BadRequest("Image image file content-type is not application/octet-stream or image/jpeg");
}
try
{
using (MemoryStream ms = new MemoryStream())
{
await image.CopyToAsync(ms);
ms.Seek(0, SeekOrigin.Begin);
using (SqlConnection db = new SqlConnection(this.connectionString))
{
DynamicParameters parameters = new DynamicParameters();
parameters.Add("StockItemId", id);
parameters.Add("photo", ms, DbType.Binary, ParameterDirection.Input);
await db.ExecuteAsync(sql: @"UPDATE [WareHouse].[StockItems] SET [Photo]=@Photo WHERE StockItemID=@StockItemId", param: parameters, commandType: CommandType.Text);
}
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Updating photo of StockItem with ID:{0}", id);
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return this.Ok();
}
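Fiddler Composer is the manual way of exercising the endpoint; for repeatable testing a minimal HttpClient sketch like the one below also works. The base address, route prefix, StockItem id and file path are assumptions for illustration rather than values from this project.

// Minimal sketch of uploading an image from code rather than Fiddler Composer.
// The base address, route, StockItem id and file path are placeholders.
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class UploadSample
{
    static async Task Main()
    {
        using (HttpClient client = new HttpClient { BaseAddress = new Uri("https://localhost:5001/") })
        using (FileStream fileStream = File.OpenRead(@"C:\Users\BrynLewis\Pictures\TapeDispenserBlue.jpg"))
        using (MultipartFormDataContent form = new MultipartFormDataContent())
        {
            StreamContent imageContent = new StreamContent(fileStream);
            imageContent.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");

            // The form field must be called "image" to bind to the [FromForm] IFormFile image parameter
            form.Add(imageContent, "image", "TapeDispenserBlue.jpg");

            HttpResponseMessage response = await client.PostAsync("api/StockItems/205/image", form);
            Console.WriteLine($"Upload status:{response.StatusCode}");
        }
    }
}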
After uploading the image I could download it as either a stream of bytes (displayed in Fiddler) or Base64 encoded (which had to be converted back to an image).
Fiddler displaying downloaded jpeg image
This implementation doesn’t support the uploading of multiple images or the streaming of larger images but would be sufficient for uploading thumbnails etc.
I needed to add some code using Dapper to retrieve images stored in a database for a webby client. The StockItems table has a column for a photo, but the values were all null…
CREATE TABLE [Warehouse].[StockItems](
[StockItemID] [int] NOT NULL,
[StockItemName] [nvarchar](100) NOT NULL,
[SupplierID] [int] NOT NULL,
[ColorID] [int] NULL,
[UnitPackageID] [int] NOT NULL,
[OuterPackageID] [int] NOT NULL,
[Brand] [nvarchar](50) NULL,
[Size] [nvarchar](20) NULL,
[LeadTimeDays] [int] NOT NULL,
[QuantityPerOuter] [int] NOT NULL,
[IsChillerStock] [bit] NOT NULL,
[Barcode] [nvarchar](50) NULL,
[TaxRate] [decimal](18, 3) NOT NULL,
[UnitPrice] [decimal](18, 2) NOT NULL,
[RecommendedRetailPrice] [decimal](18, 2) NULL,
[TypicalWeightPerUnit] [decimal](18, 3) NOT NULL,
[MarketingComments] [nvarchar](max) NULL,
[InternalComments] [nvarchar](max) NULL,
[Photo] [varbinary](max) NULL,
[CustomFields] [nvarchar](max) NULL,
[Tags] AS (json_query([CustomFields],N'$.Tags')),
[SearchDetails] AS (concat([StockItemName],N' ',[MarketingComments])),
[LastEditedBy] [int] NOT NULL,
[ValidFrom] [datetime2](7) GENERATED ALWAYS AS ROW START NOT NULL,
[ValidTo] [datetime2](7) GENERATED ALWAYS AS ROW END NOT NULL,
CONSTRAINT [PK_Warehouse_StockItems] PRIMARY KEY CLUSTERED
(
[StockItemID] ASC
)
I uploaded images of three different colours of sellotape dispensers with the following SQL
UPDATE Warehouse.StockItems
SET [Photo] = (SELECT * FROM OPENROWSET(BULK 'C:\Users\BrynLewis\Pictures\TapeDispenserBlue.jpg', SINGLE_BLOB) AS MyImage) WHERE StockItemID =
-- 203 Tape dispenser (Black)
-- 204 Tape dispenser (Red)
-- 205 Tape dispenser (Blue)
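The OPENROWSET approach needs the image files to be readable by the SQL Server instance. The same seeding could also be done from a small console application that reads the file and updates the Photo column with Dapper; the connection string, file path and StockItemID below are placeholders, a sketch rather than code from this project.

// Seeding the Photo column from a console application with Dapper - a sketch, not code from this project.
using System.Data.SqlClient;
using System.IO;
using System.Threading.Tasks;
using Dapper;

class PhotoSeeder
{
    static async Task Main()
    {
        const string connectionString = @"Server=.;Database=WideWorldImporters;Integrated Security=True";

        using (SqlConnection db = new SqlConnection(connectionString))
        {
            byte[] photo = File.ReadAllBytes(@"C:\Users\BrynLewis\Pictures\TapeDispenserBlue.jpg");

            await db.ExecuteAsync(
                sql: "UPDATE [Warehouse].[StockItems] SET [Photo]=@Photo WHERE StockItemID=@StockItemId",
                param: new { Photo = photo, StockItemId = 205 }); // 205 Tape dispenser (Blue)
        }
    }
}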
There are two options for downloading the image. The first is as a stream of bytes
[HttpGet("{id}/image")]
public async Task<ActionResult> GetImage([Range(1, int.MaxValue, ErrorMessage = "StockItem id must greater than 0")] int id)
{
Byte[] response;
try
{
using (SqlConnection db = new SqlConnection(this.connectionString))
{
response = await db.ExecuteScalarAsync<byte[]>(sql: @"SELECT [Photo] as ""photo"" FROM [Warehouse].[StockItems] WHERE StockItemID=@StockItemId", param: new { StockItemId = id }, commandType: CommandType.Text);
}
if (response == default)
{
logger.LogInformation("StockItem:{0} image not found", id);
return this.NotFound($"StockItem:{id} image not found");
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Looking up a StockItem:{0} image", id);
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return File(response, "image/jpeg");
}
[HttpGet("{id}/base64")]
public async Task<ActionResult> GetBase64([Range(1, int.MaxValue, ErrorMessage = "Stock item id must greater than 0")] int id)
{
Byte[] response;
try
{
using (SqlConnection db = new SqlConnection(this.connectionString))
{
response = await db.ExecuteScalarAsync<byte[]>(sql: @"SELECT [Photo] as ""photo"" FROM [Warehouse].[StockItems] WHERE StockItemID=@StockItemId", param: new { StockItemId = id }, commandType: CommandType.Text);
}
if (response == default)
{
logger.LogInformation("StockItem:{0} Base64 not found", id);
return this.NotFound($"StockItem:{id} image not found");
}
}
catch (SqlException ex)
{
logger.LogError(ex, "Looking up a StockItem withID:{0} base64", id);
return this.StatusCode(StatusCodes.Status500InternalServerError);
}
return Ok("data:image/jpeg;base64," + Convert.ToBase64String(response));
}
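On the client side the “data:image/jpeg;base64,” prefix has to be stripped before the payload can be decoded back into a jpeg. A minimal sketch of doing that in C# follows; the route and output path are assumptions, and it assumes the endpoint’s body comes back as plain text.

// Converting the Base64 endpoint's response back into a jpeg file - a sketch only.
// Assumes the response body is the plain text data URI and the route matches the controller above.
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class Base64ImageClient
{
    public static async Task SaveBase64ImageAsync(HttpClient client, int stockItemId, string path)
    {
        const string prefix = "data:image/jpeg;base64,";

        string dataUri = await client.GetStringAsync($"api/StockItems/{stockItemId}/base64");

        byte[] jpegBytes = Convert.FromBase64String(dataUri.Substring(prefix.Length));
        File.WriteAllBytes(path, jpegBytes);
    }
}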
I lost an hour of my life I will never get back figuring out that correctly formatted/spelt content types, “image/jpeg” and “data:image/jpeg;base64”, were key to getting the webby client to render the image.
After some research I found references to the underlying problem in TTN and OpenAPI forums. The Dev_addr and Dev_eui fields are Base16 (hexadecimal) encoded binary but are being processed as if they were Base64 (MIME) encoded.
The TTI connector only displays the Device EUI, so I changed the Dev_eui property to a string.
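Roughly what the modified property ends up looking like; the class name and attribute details below are illustrative rather than a copy of the NSwag generated code, the only change that matters is the property type.

// Illustrative only - in the NSwag generated class the dev_eui property was emitted as a
// Base64-decoded byte[]; declaring it as a string leaves the Base16 (hexadecimal) text untouched.
public partial class V3EndDeviceIds
{
    [Newtonsoft.Json.JsonProperty("dev_eui", NullValueHandling = Newtonsoft.Json.NullValueHandling.Ignore)]
    public string Dev_eui { get; set; }
}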
Now the DeviceEUI values are displayed correctly and searching for EndDevices in Azure Application Insights is easier.
TTI V3 Connector application running as a console application showing correct DeviceEUIs
Modifying the NSwag generated classes is a really nasty way of fixing the problem, but I think this approach is okay as it’s only one field and any other solution I could find was significantly more complex.
In part 1 & part 2 I had been ignoring the payload_fields property of the Payload class. The documentation indicates that payload_fields property is populated when an uplink message is Decoded.
I used JSON2Csharp to generate C# classes which would deserialise the above uplink message.
// Third version of classes for unpacking HTTP payload
public class Gps1V3
{
    public int altitude { get; set; }
    public double latitude { get; set; }
    public double longitude { get; set; }
}

public class PayloadFieldsV3
{
    public double analog_in_1 { get; set; }
    public int digital_in_1 { get; set; }
    public Gps1V3 gps_1 { get; set; }
    public int luminosity_1 { get; set; }
    public double temperature_1 { get; set; }
}

public class GatewayV3
{
    public string gtw_id { get; set; }
    public ulong timestamp { get; set; }
    public DateTime time { get; set; }
    public int channel { get; set; }
    public int rssi { get; set; }
    public double snr { get; set; }
    public int rf_chain { get; set; }
    public double latitude { get; set; }
    public double longitude { get; set; }
    public int altitude { get; set; }
}

public class MetadataV3
{
    public string time { get; set; }
    public double frequency { get; set; }
    public string modulation { get; set; }
    public string data_rate { get; set; }
    public string coding_rate { get; set; }
    public List<GatewayV3> gateways { get; set; }
}

public class PayloadV3
{
    public string app_id { get; set; }
    public string dev_id { get; set; }
    public string hardware_serial { get; set; }
    public int port { get; set; }
    public int counter { get; set; }
    public bool is_retry { get; set; }
    public string payload_raw { get; set; }
    public PayloadFieldsV3 payload_fields { get; set; }
    public MetadataV3 metadata { get; set; }
    public string downlink_url { get; set; }
}
I added a new controller to my application which used the generated classes to deserialise the body of the POST from the TTN Application Integration.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV3Fields : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV3 payload)
{
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV3Fields validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
log.Info($"DevEUI:{payload.hardware_serial} Payload Base64:{payload.payload_raw} analog_in_1:{payload.payload_fields.analog_in_1} digital_in_1:{payload.payload_fields.digital_in_1} gps_1:{payload.payload_fields.gps_1.latitude},{payload.payload_fields.gps_1.longitude},{payload.payload_fields.gps_1.altitude} luminosity_1:{payload.payload_fields.luminosity_1} temperature_1:{payload.payload_fields.temperature_1}");
return this.Ok();
}
}
I then updated the TTN application integration to send messages to my new endpoint. In the body of the Application Insights events I could see the devEUI, port, and the payload fields had been extracted from the message.
This arrangement was pretty nasty and sort of worked, but in the “real world” it would not have been viable. I would need to generate lots of custom classes for each application, taking into account the channel numbers (e.g. analog_in_1, analog_in_2) and datatypes used.
I also explored which datatypes were supported by the TTN decoder; after some experimentation (Aug 2019) it looks like only the LPPV1 ones are:
AnalogInput
AnalogOutput
DigitalInput
DigitalOutput
GPS
Accelerometer
Gyrometer
Luminosity
Presence
BarometricPressure
RelativeHumidity
Temperature
What I need is a more flexible way to store and decode the payload_fields property.
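One option (a sketch rather than what I implemented) would be to declare payload_fields as a System.Text.Json JsonElement so any combination of channels and datatypes deserialises without a bespoke class per application; the PayloadV4 name is just for illustration.

// Sketch of a more flexible payload class - payload_fields is left as a JsonElement so the
// channel numbers and datatypes don't need to be known at compile time. Illustrative only.
public class PayloadV4
{
    public string app_id { get; set; }
    public string dev_id { get; set; }
    public string hardware_serial { get; set; }
    public int port { get; set; }
    public int counter { get; set; }
    public string payload_raw { get; set; }
    public System.Text.Json.JsonElement payload_fields { get; set; }
    public MetadataV3 metadata { get; set; }
}

The individual fields could then be walked with payload.payload_fields.EnumerateObject() and handled by channel name at runtime.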
I used JSON2Csharp and a payload I downloaded in Part 1 to generate C# classes which would deserialise my minimalist messages.
// First version of classes for unpacking HTTP payload https://json2csharp.com/
public class GatewayV1
{
    public string gtw_id { get; set; }
    public int timestamp { get; set; }
    public DateTime time { get; set; }
    public int channel { get; set; }
    public int rssi { get; set; }
    public double snr { get; set; }
    public int rf_chain { get; set; }
    public double latitude { get; set; }
    public double longitude { get; set; }
    public int altitude { get; set; }
}

public class MetadataV1
{
    public string time { get; set; }
    public double frequency { get; set; }
    public string modulation { get; set; }
    public string data_rate { get; set; }
    public string coding_rate { get; set; }
    public List<GatewayV1> gateways { get; set; }
}

public class PayloadV1
{
    public string app_id { get; set; }
    public string dev_id { get; set; }
    public string hardware_serial { get; set; }
    public int port { get; set; }
    public int counter { get; set; }
    public bool confirmed { get; set; }
    public string payload_raw { get; set; }
    public MetadataV1 metadata { get; set; }
    public string downlink_url { get; set; }
}
I added a new controller to my application which used the generated classes to deserialise the body of the POST from the TTN Application Integration.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV1 : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV1 payload)
{
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV1 validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
log.Info($"DevEUI:{payload.hardware_serial} Payload Base64:{payload.payload_raw}");
return this.Ok();
}
}
I then updated the TTN application integration to send messages to my new endpoint.
TTN Application configuration overview
In the body of the Application Insights events I could see the devEUI, port, and the raw payload had been extracted from the message.
I then added another controller which decoded the Base64 encoded payload_raw.
[Route("[controller]")]
[ApiController]
public class ClassSerialisationV2Base64Decoded : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public IActionResult Post([FromBody] PayloadV2 payload)
{
// Check that the post data is good
if (!this.ModelState.IsValid)
{
log.WarnFormat("ClassSerialisationV2BCDDecoded validation failed {0}", this.ModelState.Messages());
return this.BadRequest(this.ModelState);
}
log.Info($"DevEUI:{payload.hardware_serial} Port:{payload.port} Payload:{ Encoding.UTF8.GetString(Convert.FromBase64String(payload.payload_raw))}");
return this.Ok();
}
}
Then after a while the deserialisation started to fail with an HTTP 400 Bad Request. When I ran the same request with Telerik Fiddler on my desktop the raw response was:
HTTP/1.1 400 Bad Request
Transfer-Encoding: chunked
Content-Type: application/problem+json; charset=utf-8
Server: Microsoft-IIS/10.0
Request-Context: appId=cid-v1:f4f72f2e-1144-4578-923f-d3ebdcfb7766
X-Powered-By: ASP.NET
Date: Mon, 31 Aug 2020 09:07:30 GMT
17a
{"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1",
"title":"One or more validation errors occurred.",
"status":400,
"traceId":"00-45033ec030b63d4ebb82b95b67cb8142-9fc52a18be202848-00",
"errors":{
"$.metadata.gateways[0].timestamp":["The JSON value could not be converted to System.Int32.
Path: $.metadata.gateways[0].timestamp | LineNumber: 21 | BytePositionInLine: 35."]}}
0
The problem line in the payload was the gateway timestamp. The value was 2,426,973,100, which is larger than 2,147,483,647, the maximum value that can be stored in a signed 32 bit integer. The JSON2CSharp generator had made a reasonable choice of datatype but in this case the range was not sufficient.
public class GatewayV2
{
    public string gtw_id { get; set; }
    public ulong timestamp { get; set; }
    public DateTime time { get; set; }
    public int channel { get; set; }
    public int rssi { get; set; }
    public double snr { get; set; }
    public int rf_chain { get; set; }
    public double latitude { get; set; }
    public double longitude { get; set; }
    public int altitude { get; set; }
}
I checked the TTN code where the variable was declared as an unsigned 64 bit integer.
This issue could occur for other variables so I need to manually check all the generated classes.
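One quick, rough way to find other candidates for a range check is to list the Int32 properties across the generated classes with reflection. This is a development-time aid only, using the class names from this post.

// Development-time aid (a sketch) - list the Int32 properties of the generated classes so each
// one can be checked against the TTN source for possible overflow.
using System;
using System.Linq;
using System.Reflection;

class GeneratedClassAudit
{
    static void Main()
    {
        foreach (Type type in new[] { typeof(PayloadV1), typeof(MetadataV1), typeof(GatewayV2) })
        {
            foreach (PropertyInfo property in type.GetProperties().Where(p => p.PropertyType == typeof(int)))
            {
                Console.WriteLine($"{type.Name}.{property.Name} is Int32 - check the range in the TTN source");
            }
        }
    }
}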
This is the first in a series of posts about building an HTTP Integration for a The Things Network (TTN) application. I have assumed that readers are familiar with the configuration and operation of a TTN instance so I’m not going to cover that in detail.
#include <LoRaWan.h>
unsigned char data[] = {0x53, 0x65, 0x65, 0x65, 0x64, 0x75, 0x69, 0x6E, 0x6F, 0x20, 0x4C, 0x6F, 0x52, 0x61, 0x57, 0x41, 0x4E};
char buffer[256];
void setup(void)
{
SerialUSB.begin(9600);
while (!SerialUSB);
lora.init();
memset(buffer, 0, 256);
lora.getVersion(buffer, 256, 1);
SerialUSB.print("Ver:");
SerialUSB.print(buffer);
memset(buffer, 0, 256);
lora.getId(buffer, 256, 1);
SerialUSB.print(buffer);
SerialUSB.print("ID:");
lora.setKey(NULL, NULL, "12345678901234567890123456789012");
lora.setId(NULL, "1234567890123456", "1234567890123456");
lora.setPort(10);
lora.setDeciveMode(LWOTAA);
lora.setDataRate(DR0, AS923);
lora.setDutyCycle(false);
lora.setJoinDutyCycle(false);
lora.setPower(14);
while (!lora.setOTAAJoin(JOIN, 10))
{
SerialUSB.println("");
}
SerialUSB.println( "Joined");
}
void loop(void)
{
bool result = false;
//result = lora.transferPacket("Hello World!", 10);
result = lora.transferPacket(data, sizeof(data));
if (result)
{
short length;
short rssi;
memset(buffer, 0, 256);
length = lora.receivePacket(buffer, 256, &rssi);
if (length)
{
SerialUSB.print("Length is: ");
SerialUSB.println(length);
SerialUSB.print("RSSI is: ");
SerialUSB.println(rssi);
SerialUSB.print("Data is: ");
for (unsigned char i = 0; i < length; i ++)
{
SerialUSB.print("0x");
SerialUSB.print(buffer[i], HEX);
SerialUSB.print(" ");
}
SerialUSB.println();
}
}
delay( 10000);
}
The SetKey and SetId parameters are not obvious and it would be much easier if there were two methods, one for Over The Air Activation (OTAA) and the other for Activation By Personalisation (ABP). I then built a .NET Core 3.1 Web API application which had a single controller to receive messages from TTN.
[Route("[controller]")]
[ApiController]
public class Raw : ControllerBase
{
private static readonly ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);
[HttpGet]
public string Index()
{
return "move along nothing to see";
}
[HttpPost]
public void PostRaw([FromBody]JsonElement body)
{
string json = JsonSerializer.Serialize(body);
log.Info(json);
}
}
I then configured my TTN application integration to send messages to my shiny new endpoint.
TTN Application configuration overview
My controller logged events to Azure Application Insights so I could see if there were any errors and inspect message payloads. The TTN developers provide sample payloads to illustrate the message format but they were a bit chunky for my initial testing.
Application Insights event list
I could then inspect individual events and payloads
Application Insights event display
At this point I could download message payloads and save them locally.
These were useful because I could then use a tool like Telerik Fiddler to submit messages to my application when it was running locally in the Visual Studio 2019 debugger.
Application Insights logging with message unpacking
Application Insights logging message payload
Then in the last log entry the decoded message payload
/*
Copyright ® 2020 Feb devMobile Software, All Rights Reserved
MIT License
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE
Default URL for triggering event grid function in the local environment.
http://localhost:7071/runtime/webhooks/EventGrid?functionName=functionname
*/
namespace EventGridProcessorAzureIotHub
{
    using System;
    using System.IO;
    using System.Reflection;

    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.EventGrid.Models;
    using Microsoft.Azure.WebJobs.Extensions.EventGrid;

    using log4net;
    using log4net.Config;
    using Newtonsoft.Json;

    public static class Telemetry
    {
        [FunctionName("Telemetry")]
        public static void Run([EventGridTrigger] Microsoft.Azure.EventGrid.Models.EventGridEvent eventGridEvent, ExecutionContext executionContext) //, TelemetryClient telemetryClient)
        {
            ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

            var logRepository = LogManager.GetRepository(Assembly.GetEntryAssembly());
            XmlConfigurator.Configure(logRepository, new FileInfo(Path.Combine(executionContext.FunctionAppDirectory, "log4net.config")));

            log.Info($"eventGridEvent.Data-{eventGridEvent}");
            log.Info($"eventGridEvent.Data.ToString()-{eventGridEvent.Data.ToString()}");

            IotHubDeviceTelemetryEventData iOThubDeviceTelemetryEventData = (IotHubDeviceTelemetryEventData)JsonConvert.DeserializeObject(eventGridEvent.Data.ToString(), typeof(IotHubDeviceTelemetryEventData));

            log.Info($"iOThubDeviceTelemetryEventData.Body.ToString()-{iOThubDeviceTelemetryEventData.Body.ToString()}");

            byte[] base64EncodedBytes = System.Convert.FromBase64String(iOThubDeviceTelemetryEventData.Body.ToString());

            log.Info($"System.Text.Encoding.UTF8.GetString(-{System.Text.Encoding.UTF8.GetString(base64EncodedBytes)}");
        }
    }
}
Overall it took roughly half a page of code (mainly generated by a tool) to unpack and log the contents of an Azure IoT Hub EventGrid payload to Application Insights.
I have an Azure IoT Hub LoRa Telemetry Field Gateway running in my office and I wanted to process the data collected by the sensors around my property without using a Software as a Service (SaaS) Internet of Things (IoT) package.
Rather than lots of screen grabs of my configuration steps I figured people reading this series of posts would be able to figure the details out themselves.
I downloaded the JSON configuration file template (which is created on first start-up after installation) from my Windows 10 device and configured the Azure IoT Hub connection string.
I then uploaded this to my Windows 10 IoT Core device and restarted the Azure IoT Hub Field gateway so it picked up the new settings.
I could then see on the device messages from sensor nodes being unpacked and uploaded to my Azure IoT Hub.
ETW logging on device
In the Azure IoT Hub metrics I graphed the number of devices connected and the number of telemetry messages sent and could see my device connect then start uploading telemetry.
Azure IoT Hub metrics
One of my customers uses Azure Event Grid for application integration and I wanted to explore using it in an IoT solution. The first step was to create an Event Grid Domain.
To confirm my event subscriptions were successful I previously found the “simplest” approach was to use an Azure Storage Queue endpoint. I had to create an Azure Storage Account with two Azure Storage Queues, one for device connectivity (.DeviceConnected & .DeviceDisconnected) events and the other for device telemetry (.DeviceTelemetry) events.
I created a couple of other subscriptions so I could compare the different Event schemas (Event Grid Schema & Cloud Event Schema v1.0). At this stage I didn’t configure any Filters or Additional Features.
Azure IoT Hub Telemetry Event Metrics
I use Cerebrata Cerulean for monitoring and managing a couple of other customer projects so I used it to inspect the messages in the storage queues.
Without writing any code (I will script the configuration) I could upload sensor data to an Azure IoT Hub, subscribe to a selection of events the Azure IoT Hub publishes and then inspect them in an Azure Storage Queue.
I did notice that the .DeviceConnected and .DeviceDisconnected events took a while to arrive. When I started the field gateway application on the device I would get several DeviceTelemetry events before the DeviceConnected event arrived.