Last year I did an IoT session at the Dutch BizTalk User Group, and since then I have had several requests for more information on this topic. After a couple of very busy months at my client, I finally decided to write a series of blog posts about it. IoT is a fast-growing industry, and it offers great opportunities for us integration specialists. All the code and projects I create will be provided along with the posts, and can be downloaded from here. Coming from a nautical background myself, I see more and more scenarios where IoT might be a real game changer, so this will be the setting I use throughout this series.
In the upcoming weeks my blog posts will build on the following scenario. An engine supplier for ships, like Caterpillar or ABC, wants to get information about the health and status of its engines. A ship typically has several engines: main engines and bow thrusters for propulsion, generators, pumps on tankers, and so on. These engines already produce a lot of data about their health, such as temperatures, oil and filter conditions, or issues that occur during operations. Currently most of this information is only displayed to the crew of the ship, who have to interpret it themselves, which often means problems go unnoticed until they become serious and the ship has to go in for repairs or adjustments. Every day a ship is not out working can cost a lot of money, especially if it is operating under contract.
What if all this data could be shared with the engine supplier in near real-time, so they could determine the health of their engines and even predict failures? Shipping companies could then be contacted before any problems arise, and repairs and maintenance could be planned at the most convenient times for both the engine supplier and the shipping company. This would mean lower costs and better planning for dispatching technicians. It could also help the shipping companies in other ways: a real-time overview of the materials and stock being used would allow the ships to keep less stock on board, and ordering materials could become more automated.
As we would not want to connect all the engines straight to the internet, we are going to use a field hub in this scenario, which will gather the data from the various sensors and act as our gateway to Azure. For this gateway we will be using a Raspberry Pi 2 running Windows 10 IoT Core, a Windows version created specifically for these kinds of IoT scenarios.
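To give an idea of what the field hub will be doing, here is a minimal Python sketch of how it might package raw sensor readings into a telemetry message before forwarding them to Azure. The engine names, field names, and sensor values are all made up for illustration; reading the actual sensors depends on the hardware we hook up to the Raspberry Pi, which we will get to later in the series.

```python
import json
import time
import uuid

def build_telemetry(engine_id: str, readings: dict) -> str:
    """Package raw sensor readings into a JSON telemetry message
    that the field hub can forward to the cloud gateway."""
    message = {
        "messageId": str(uuid.uuid4()),  # unique id per message
        "engineId": engine_id,
        "timestamp": time.time(),        # seconds since epoch
        "readings": readings,            # e.g. temperatures, oil pressure
    }
    return json.dumps(message)

# Example: one (hypothetical) reading from the main engine
payload = build_telemetry("main-engine-1", {
    "coolantTempC": 82.5,
    "oilPressureBar": 4.1,
    "rpm": 750,
})
```

The hub would build a message like this for every reading it collects and then send the JSON string on to Azure over a secure channel.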
As a large engine supplier can have millions of engines in the field on any given day, it's important to have a highly scalable service to gather all this data. We will be using Azure Event Hubs for this, a highly scalable publish-subscribe service that can process millions of events per second. Once the data is loaded into Event Hubs, it will flow to two destinations. The first is a custom administration console, where the data can be monitored. The data will also be processed by Azure Stream Analytics, and from there it will feed into Power BI for visualization, into Azure Machine Learning for the analysis with which the engine supplier can set up predictive maintenance, and into Logic Apps for task automation. Finally, we will use Azure Service Bus to show alerts in our administration console, as well as to communicate back to the field hub in a secure manner. The architecture might change or be expanded over the course of this series, but this is how I currently imagine it.
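Stream Analytics jobs express their processing in a SQL-like query language, which we will look at properly later in the series. As a plain-Python illustration of the kind of rule such a job might apply, here is a sketch that picks out overheating engines from a batch of telemetry events; the field names and the 95 °C threshold are assumptions for the example, not part of any real job.

```python
# Threshold above which we would want to raise an alert (assumed value).
THRESHOLD_C = 95.0

def find_alerts(events):
    """Return the events whose coolant temperature exceeds the limit,
    mimicking a simple filter a Stream Analytics query could perform."""
    return [e for e in events if e.get("coolantTempC", 0.0) > THRESHOLD_C]

# A small batch of (hypothetical) telemetry events from Event Hubs.
stream = [
    {"engineId": "main-engine-1", "coolantTempC": 82.5},
    {"engineId": "bow-thruster-1", "coolantTempC": 97.2},
]

alerts = find_alerts(stream)  # only the overheating bow thruster remains
```

In the real architecture, the matching events would be routed onward, for example to Service Bus so the alert shows up in the administration console.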