IoT – Integration of Things: Connecting Raspberry Pi 2 to Azure

This is the second post in my series on Integration of Things. In my previous post I explained the scenario and architecture, so today I will show how you can use a Raspberry Pi 2 to act as an IoT fieldhub. The RPi2 is a low-power, ARM-based single-board computer the size of a credit card, which can run Windows 10 IoT Core, a Windows version created specifically for these kinds of boards. This also means we can use the .NET Framework to set up our solution. The code for this post can be downloaded here.


First we will have to flash the RPi2 with Windows 10. There are some great walkthroughs out there, so I will not explain this here, but instead link you to this site, which explains all the steps to be taken.

After we have flashed our RPi2, it’s time to set up Visual Studio. To develop on Windows IoT Core, we will need to install the project templates. This can be done from Visual Studio by going to Tools, Extensions and Updates, and searching for Windows IoT Core Project Templates. Install these templates, and restart Visual Studio to start creating your own IoT solutions. Don’t forget to enable developer mode on your machine by following these instructions, as this is needed to publish your solution to your device.

Windows IoT Core Project Templates

Creating Azure Artifacts

Now that we have installed everything that is needed, we will start creating the Azure artifacts we will need for our fieldhub. As discussed in my previous post about the architecture for this solution, we will be using Event Hubs and Service Bus queues and topics to get data out of and into the fieldhub.


First of all we will create a queue, which will be used to send errors and warnings from the fieldhub to the custom administration console. This can be technical information like exceptions thrown in the code, but also functional information like warnings generated by the engines (e.g. overheating, low oil level, etc.). Queues are a common pattern which most integration developers already know in some form (for example MSMQ). A queue has one sender and one or more receivers, providing automatic load balancing over its receivers. Each message placed on the queue will be processed by only one receiver, which means you can easily scale out your solution by adding more clients.

Azure Queue

To create a new queue in Azure, open up the portal, go to Service Bus and click Create. Here we can define a new namespace, which is a container for all your Service Bus entities (queues, topics, event hubs, etc.). Provide a name for your namespace, select a pricing tier and region, and make sure you have set the type to Messaging.

ServiceBus Namespace Creation

Once created, open your namespace, and go to the Queues tab. We will now create a queue for the errors and warnings by clicking Create a new queue.

Queue Creation


Next we will create a topic, which will be used to send settings (e.g. the maximum engine temperature) from the administration console back to the fieldhub. Because we can have multiple fieldhubs for different ships, each with different engines and therefore different settings, we need a mechanism with which we can route each message to the correct fieldhub. This is what topics are used for. A topic is in many ways the same as a queue, the main difference being that you can have multiple subscriptions within a topic. A subscription is a filtered output of the incoming messages, where the subscription filters can be based on the message's properties. In our scenario, we will set a message property with the name of the ship to which the message should be routed, and the client in the fieldhub will read from the subscription to which these messages are routed. The filters you can set can vary from very fine-grained to catch-all.

Azure Topic

To create the topic, we go back to our namespace we previously created, and open the Topics tab. Here we will create the topic for the engine administration.

Azure Topic Creation

Once the topic is created, we will add the subscriptions. This can be done in various ways, like using PowerShell, programmatically from your application, or by using the portal, but we will use a great tool by Paolo Salvatori called Service Bus Explorer, which is a must-have for anyone working with the Service Bus stack. Start by opening the tool and selecting Enter connection string. The connection string to be used can be found in the Azure portal by going to your Service Bus namespace and clicking Connection Information. You should use the RootManageSharedAccessKey to be able to create, update and remove Service Bus entities from the tool.

Service Bus Explorer

All entities we have created so far will now be shown, including the topic. Right-click on the topic, and select Create Subscription. Provide a name for the subscription, for example the name of the ship. We're also going to set the filter which will make sure that only messages for this fieldhub are sent to this subscription. These filters are specified in a SQL-like syntax (ship='Hydra'). Create three subscriptions in total: two for ships, and one for logging, which will capture all messages (with a filter of 1=1).
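For reference, the same subscriptions could also be created programmatically, one of the other options mentioned above. A minimal sketch using the NamespaceManager from the full WindowsAzure.ServiceBus library could look like this; the connection string is a placeholder for your namespace's RootManageSharedAccessKey connection string, and the second ship name is just an example:

```csharp
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

// Placeholder connection string; use your namespace's RootManageSharedAccessKey
var namespaceManager = NamespaceManager.CreateFromConnectionString("Endpoint=sb://...");

// One subscription per ship, each only receiving messages for that ship
namespaceManager.CreateSubscription("topicengineadministration", "Hydra", new SqlFilter("ship='Hydra'"));

// A catch-all subscription for logging, receiving a copy of every message
namespaceManager.CreateSubscription("topicengineadministration", "Logging", new SqlFilter("1=1"));
```

This runs against a live namespace, so it is meant as a sketch of the API rather than something to drop in as-is.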

Service Bus Explorer Create Subscription

Event Hubs

Finally, we will create the event hub which will be used to capture the sensor data collected by the fieldhubs. Event hubs are highly scalable and can process up to a million messages per second, which makes them ideal for this kind of scenario, where you will be collecting readings from many devices. To send the data to the event hub we will make use of AMQP, an open standard application layer protocol for message-oriented middleware, which focuses on message orientation, queuing, routing (including point-to-point and pub-sub), reliability and security.

Once inside the event hub, the data is distributed over partitions. You can specify how many partitions your event hub has in the portal, where you can select from 4 to 32 partitions. In case you want even more partitions, you can contact Microsoft, at which point you can get up to 1024 partitions. Optionally you can set a partition key when sending data to the event hub, which will make sure all data with the same partition key is placed in the same partition. This is important if data has to be processed sequentially, because the data from the event hub is processed in a streaming manner, and partitions are processed independently of each other.

The data is read through consumer groups, from which our receivers will get the data for further processing. Each consumer group gets a copy of all the messages, and each message will be received by one client in a consumer group. Therefore you should create a consumer group for each application group (in our case one for the administration portal and another for Stream Analytics).

Azure Event Hubs

To create the event hub, we go back to our namespace and open the Event Hubs tab. Now we create the event hub which will be used to receive the data from our fieldhubs.

Azure Event Hubs Creation

As already mentioned, we will need to create two consumer groups as well, so once the event hub has been created, open it, and click Create Consumer Group. Add a consumer group for the custom administration console and one for Stream Analytics.
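As with the subscriptions, the consumer groups could also be created from code instead of through the portal. A sketch using the NamespaceManager; the connection string is a placeholder, and the consumer group names are my own choices for this example:

```csharp
using Microsoft.ServiceBus;

// Placeholder connection string; use your namespace's RootManageSharedAccessKey
var namespaceManager = NamespaceManager.CreateFromConnectionString("Endpoint=sb://...");

// One consumer group per consuming application, so each gets its own copy of the stream
namespaceManager.CreateConsumerGroup("eventhubfieldhubs", "administrationconsole");
namespaceManager.CreateConsumerGroup("eventhubfieldhubs", "streamanalytics");
```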

Azure Event Hubs Consumer Group Creation


To connect to the Azure entities we just created we can use Shared Access Signature (SAS) keys. These keys can be set at different levels and with different permissions, to provide better security on our shared entities. The permissions that can be set on these keys are Send, Listen (receive) and Manage. At the top level, we have SAS keys for our namespace; anyone with these keys has access to all the entities in the namespace. You can also set SAS keys on the entities themselves, which provides much more fine-grained access control. Let's create a policy on our event hub which will make sure the fieldhubs can only send data to it. Open the event hub we created earlier, and go to the Configure tab. Add a new policy under Shared Access Policies, give this new policy Send permissions, and save your changes. You will now see the Shared Access Key Generator, with which you can create the keys we will later use to connect to the event hub. Do the same for the queue (Send permissions) and topic (Listen permissions) we created earlier.
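The keys generated this way end up in a connection string with the following general shape; the namespace, policy name and key here are placeholders for your own values:

```
Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy-name>;SharedAccessKey=<generated-key>
```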


Creating the Solution

Now that we have our entities in Azure ready, we can create a new project. Notice we have a new category in our Visual Studio projects called Windows IoT Core. We will create a new IoT Background Application project which will act as our fieldhub.

IoT Background Application

A project will now be created with a class StartupTask, which is the entry point of our application. The first thing we will want to do is make sure our application keeps running after it has started up. If we do not do this, the application will close once it has finished its logic in the Run method.

public sealed class StartupTask : IBackgroundTask
{
    // Needed to make sure the application keeps running in the background
    private BackgroundTaskDeferral _backgroundTaskDeferral;

    public void Run(IBackgroundTaskInstance taskInstance)
    {
        // Do not close the application after startup
        _backgroundTaskDeferral = taskInstance.GetDeferral();
    }
}

Next we will connect our application with the Azure entities we created. Currently the WindowsAzure.ServiceBus package does not yet work with Windows Runtime, but fortunately Paolo Patierno has created a package called Azure SB Lite, which we can use with WinRT (as well as .NET Compact Framework, .NET Micro Framework and Mono). This package is published as a NuGet package, so we can very easily add this to our project.

Azure SB Lite

One major advantage of Azure SB Lite is that it implements almost all methods from the full WindowsAzure.ServiceBus library, which means that if you already have some code you would like to re-use, it can most probably just be copied over and will just work. Let's start with creating a client which will be used to send the data from our fieldhub into Azure Event Hubs, by declaring an EventHubClient under the BackgroundTaskDeferral we created earlier. The connection string can be retrieved in the Azure portal by going to your event hub, opening Connection Information, and selecting the connection string for the SAS policy you created earlier.

private readonly EventHubClient _eventHub = EventHubClient.CreateFromConnectionString("Endpoint=sb://;SharedAccessKeyName=fieldhubs;SharedAccessKey=3SkPy94xxxxxxxxxxxl76sBk=", "eventhubfieldhubs");

We will also create a subscription client which will be used to get the messages from the topic subscription for this fieldhub. Once again, be sure to use the SAS policy you created on the topic earlier. Use the ship's name for the name of the subscription; this will make sure you only get the messages which are meant for this fieldhub.

private readonly SubscriptionClient _topic = SubscriptionClient.CreateFromConnectionString("Endpoint=sb://;SharedAccessKeyName=fieldhubs;SharedAccessKey=Ex17Aixxxxxxxxx+AM=", "topicengineadministration", "Hydra");

Now we need to generate some data to be sent to the event hub. In real life these readings would be gathered by the fieldhub from the various sensors on the engines; for this example, however, a SHT15 sensor will be used, which we connect to the Raspberry Pi 2's GPIO pins. In case you don't have a sensor, you can of course also generate some dummy data from code. Connect your SHT15 in the following way:

  • VCC to PIN1 (3.3V DC power)
  • GND to PIN9 (Ground)
  • DATA to PIN18 (GPIO24)
  • SCK to PIN16 (GPIO23)

Currently, one of the downsides of using Windows 10 IoT Core is the lack of support from sensor manufacturers when it comes to drivers and libraries for their sensors. Fortunately there is a library available for the SHT15, written by Krishnaraj Varma. We start by importing the SHT15.cs file into our project. As we are going to work with the GPIO pins of the RPi2, we also need a reference to the Windows IoT Extensions for the UWP, which contains the libraries to work with these boards' hardware.

Windows IoT Extensions for the UWP

Now that we have all needed libraries, we can go and create an instance of the SHT15.

private readonly SHT15 _sht15 = new SHT15(24, 23);

Next we are going to create a method which will read the temperature from the SHT15, and calculate the temperature in degrees Celsius. With this temperature we will create an EngineInformation object, which is the object we will be sending to the event hub. Create the class for the EngineInformation object.

/// <summary>
/// Class representing the information an engine would send out.
/// To be able to serialize it, we have to annotate the class and its members.
/// </summary>
[DataContract]
internal class EngineInformation
{
    [DataMember]
    internal Guid Identifier;
    [DataMember]
    internal string ShipName;
    [DataMember]
    internal string EngineName;
    [DataMember]
    internal double Temperature;
    [DataMember]
    internal double RPM;
    [DataMember]
    internal bool Warning;
    [DataMember]
    internal int EngineWarning;
    [DataMember]
    internal DateTime CreatedDateTime;
}

To be able to work with our object later on in Stream Analytics, it should be serialized as a UTF-8 encoded JSON string. There are some libraries out there which we can use to serialize to JSON, with Newtonsoft's serializer probably being the most well-known; however, this library currently does not work with WinRT, so we'll create our own serializing extension method.

/// <summary>
/// Class used to serialize engine information to JSON.
/// Created this due to issues with the Newtonsoft JSON serializer, probably will work with a future version.
/// </summary>
internal static class EngineInformationSerialization
{
    internal static string Serialize(this EngineInformation engineInformation)
    {
        // Create a stream to serialize the object to
        using (var memoryStream = new MemoryStream())
        {
            // Serialize the object to the stream
            var jsonSerializer = new DataContractJsonSerializer(typeof(EngineInformation));
            jsonSerializer.WriteObject(memoryStream, engineInformation);
            var json = memoryStream.ToArray();
            return Encoding.UTF8.GetString(json, 0, json.Length);
        }
    }
}

Now we can create the method which will get our sensor readings, and send them out to our event hub. For this example we will simulate some of the input, which in real life would be gathered from the engines. Also note we use the name of the ship as the partition key; this will make sure that messages end up grouped by ship in the partitions.

/// <summary>
/// Get readings from the sensors, and send them to the event hub.
/// </summary>
private void GetSensorReadings()
{
    try
    {
        // Get temperature from the SHT15
        // To simulate more realistic engine temperatures, we multiply it
        var temperature = _sht15.CalculateTemperatureC(_sht15.ReadRawTemperature()) * 20;
        Debug.WriteLine($"Temperature: {temperature}");
        // Check if a warning should be generated
        var warning = temperature > _maximumTemperature;
        // Create engine information object, simulate some of the input
        var engineInformation = new EngineInformation
        {
            Identifier = Guid.NewGuid(),
            ShipName = "Hydra",
            EngineName = "Main Engine Port",
            CreatedDateTime = DateTime.UtcNow,
            RPM = new Random().Next(400, 1000),
            Temperature = temperature,
            Warning = warning,
            EngineWarning = !_warningGenerated && new Random().Next(0, 10) > 8 ? new Random().Next(1, 3) : 0
        };
        // Check if a warning was generated
        if (engineInformation.EngineWarning > 0)
        {
            _warningGenerated = true;
            Debug.WriteLine($"EngineWarning sent: {engineInformation.EngineWarning}");
        }
        // Serialize to JSON
        // With the current version (7.0.1) the Newtonsoft JSON serializer does not work, so we created our own serializer
        var serializedString = engineInformation.Serialize();
        // Create the message
        // Send with the ship name as partition key, to make sure messages for one ship are processed in the correct order
        var message = new EventData(Encoding.UTF8.GetBytes(serializedString));
        message.Properties.Add("haswarning", engineInformation.Warning);
        message.PartitionKey = engineInformation.ShipName;
        // Send to the event hub
        _eventHub.Send(message);
    }
    catch (Exception exception)
    {
        exception.Log();
    }
}

As you can see, there’s an extension method on the exception to log it. For local debugging you can use the debug console in Visual Studio, which will capture the exceptions from your RPi2; however, once the device goes into the field this will no longer be an option, as these devices will most probably be working headless, and even if a monitor were attached, the exception information is not something the crew of a ship would make much sense of. We will therefore send the exception information to an Azure queue, which the custom administration console will retrieve and display at the engine’s supplier. To do this, we create a new static class, and define a queue client. Once again, remember to use the SAS policy we created earlier on the queue for your connection string.

internal static class Exceptions
{
    private static QueueClient _queue;

    // Lazily created queue client, so the connection is only set up when first needed
    private static QueueClient Queue => _queue ?? (_queue = QueueClient.CreateFromConnectionString("Endpoint=sb://;SharedAccessKeyName=fieldhubs;SharedAccessKey=EDfKxxxxxxxxxxLZ2cA=", "queueerrorsandwarnings"));
}

Next we are going to add two methods to this class: one which will write the exception to the queue, and the Log extension method.

public static void WriteToServiceBusQueue(this Exception exception)
{
    var message = new BrokeredMessage();
    message.Properties["ship"] = "Hydra";
    message.Properties["time"] = DateTime.UtcNow;
    message.Properties["exceptionmessage"] = exception.ToString();
    Queue.Send(message);
}

public static void Log(this Exception exception)
{
    // Send the exception information to the Service Bus queue
    exception.WriteToServiceBusQueue();
}

The next step will be to make the RPi2 get the sensor readings at a regular interval, and send them out to our event hub. To do this, we will create a timer which calls GetSensorReadings every 5 seconds. As you remember, we made sure the application keeps running after startup, so this will send the sensor readings indefinitely. We’ll add the following code for this in the Run method.

// Set the sensor readings to be gathered on a timer in the background
ThreadPoolTimer.CreatePeriodicTimer(timer =>
{
    GetSensorReadings();
}, new TimeSpan(0, 0, 5));

Now the final step for this part is to add a listener on the subscription we created on the topic, which will listen for incoming messages. On this subscription we will be receiving the maximum temperature as set from the administration console. Add the following code in the Run method for this.

// Trigger for receiving messages from the topic subscription
_topic.OnMessage(message =>
{
    var newMaximumTemperature = message.Properties["maximumtemperature"];
    _maximumTemperature = Convert.ToInt32(newMaximumTemperature);
    Debug.WriteLine($"Maximum temperature has been set to {newMaximumTemperature}");
});

As we have finished our code, we can deploy the solution to the Raspberry Pi 2. Make sure the device is connected and can be reached from the computer where you are running Visual Studio. We now open the properties of our project, and change the Platform to ARM. Go to the Debug tab, and set the Target device to Remote Machine. This will allow us to specify the DNS name or IP address of our RPi2.

ARM Properties

Uncheck the authentication option. Now that we have set everything up, we can deploy and debug the application using F5 (make sure you have switched to the ARM profile). Once the application has been built and deployed, you will see output appearing in the debug console with the temperature readings from your device.

Debug Output

Now that we have sent some data to our event hub, let’s have a look at it using Service Bus Explorer. We connect to our namespace, open the event hub, right-click on one of the consumer groups, and select Create Consumer Group Listener. This will create a client on the consumer group and retrieve its messages, which can be viewed in the Events tab.


In my next blog post I’ll show how you can retrieve this data using your own applications.

