A while ago I wrote a post on using the BizTalk Deployment Framework for automated build and deployment. Since then I have reworked this to be easier and more maintainable using PowerShell, which I will show in this post. The BizTalk Deployment Framework is one of those pearls for BizTalk developers, allowing complex BizTalk solutions to be deployed easily, with all our artifacts and dependencies together in one MSI. The code for this post can be downloaded from here.
Using PowerShell we will create scripts which handle all steps of the build and deployment process for us. This ensures our applications are always deployed in the correct order, using the right versions, and with minimal effort. We have some general helper functions, which help us clear log files, wait for user input, iterate through directories, etc. We assume you are using some of the BTDF best practices for these scripts when it comes to naming conventions and folder structure. Of course, in case anything differs in your environment, you can easily adjust the scripts to meet your requirements.
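The core idea of the deployment script can be sketched as follows. This is an illustrative Python sketch of the ordering logic only (the actual scripts are PowerShell); the manifest format and application names are hypothetical assumptions, not BTDF conventions:

```python
# Illustrative sketch (the real scripts are PowerShell): deploy applications
# in a fixed order read from a manifest, so dependencies always come first.
# The manifest format and application names here are hypothetical.

def read_manifest(lines):
    """Each line: '<ApplicationName>,<Version>' in deployment order."""
    apps = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, version = line.split(",")
        apps.append((name.strip(), version.strip()))
    return apps

def deploy_all(manifest_lines, deploy):
    """Deploy every application in manifest order; stop on the first failure."""
    for name, version in read_manifest(manifest_lines):
        if not deploy(name, version):
            raise RuntimeError(f"Deployment of {name} {version} failed")

manifest = [
    "# dependencies first",
    "Contoso.Shared, 1.0.3",
    "Contoso.Orders, 2.1.0",
]
deployed = []
deploy_all(manifest, lambda n, v: deployed.append((n, v)) or True)
print(deployed)
```

In the real scripts the `deploy` callback would invoke the BTDF-generated MSI; keeping the order in one manifest is what guarantees dependencies are deployed before the applications that need them.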
Everyone who has been working with BizTalk knows how powerful this product can be. It allows you to tackle a lot of integration scenarios out of the box, but sometimes you will run into a requirement which cannot be handled using just the standard BizTalk components. Luckily BizTalk can be extended at many points, giving you the power to handle all your scenarios. Some of these extensibility points are:
- Ports (Custom behaviors and adapters)
- Pipelines (Pipeline components)
- Mappings (XSLT, Functoids, XPATH)
- Orchestration (XPATH, Helper classes)
- Configuration (SSO Helper)
- Deployment (Deployment Framework)
- Testing (BizUnit, Visual Studio Test, Custom clients)
- Monitoring (BAM, BizTalk assemblies)
- Rules (BRE)
This is the third post in my series on Integration of Things. In my previous post I explained how you could send and receive data on a Raspberry Pi 2 to Azure. Today I will explain how you can use an Azure cloud service as a worker role for retrieving the data from Event Hubs using the Event Processor Host library. We will save the retrieved data in Azure Table Storage, a great service for working with large amounts of structured, non-relational data. Azure Table Storage is very fast and cost-efficient, especially when working with lots of data, which makes it ideal for our scenario. The code for this blogpost can be found here.
The Event Processor Host library will be used to retrieve the data from our event hub and load it into Azure Table Storage. This library distributes Event Hubs partitions across our instances of the worker role, keeping track of leases and snapshots. It really makes working with Event Hubs from .NET code a breeze. We will need a storage account for the table and for the library to store its lease data, so let’s start by setting one up via the Azure Portal.
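Conceptually, the library gives each worker instance a set of partitions and records a checkpoint per partition, so a restarted instance resumes where it left off. The sketch below illustrates just that checkpointing idea in Python; it is not the real .NET Event Processor Host API, and the partition contents and event shape are made up:

```python
# Conceptual sketch of Event Processor Host-style checkpointing (not the
# real .NET API): each partition has an independent offset, and a restarted
# reader resumes from its last stored checkpoint.

partitions = {
    "0": ["temp:20", "temp:21", "temp:22"],
    "1": ["rpm:900", "rpm:950"],
}
checkpoints = {"0": 0, "1": 0}  # next offset to read, per partition

def process_batch(partition_id, handler, max_events=2):
    """Read up to max_events from a partition, then store a checkpoint."""
    start = checkpoints[partition_id]
    batch = partitions[partition_id][start:start + max_events]
    for event in batch:
        handler(event)
    checkpoints[partition_id] = start + len(batch)  # snapshot of progress

seen = []
process_batch("0", seen.append)   # first run reads two events
process_batch("0", seen.append)   # a "restarted" reader resumes at offset 2
print(seen, checkpoints["0"])
```

In the real library the checkpoints live in blob storage, which is why the worker role needs access to a storage account.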
Custom widgets are one of the new features in BizTalk360. These widgets are small web snippets, which can be used to show dynamic data on a BizTalk360 dashboard. In this post I will show you how we can use these widgets together with Power BI to create a dashboard with information about our BizTalk environment. The data for this dashboard can be retrieved from tracking databases, BAM, etc. Tord Glad Nordahl did a session at the BizTalk Summit 2015 on how BAM can be exposed via Power BI.
This is the second post in my series on Integration of Things. In my previous post I explained the scenario and architecture, so today I will explain how you can use a Raspberry Pi 2 to act as an IoT fieldhub. The RPi2 is a low-power ARM-based single-board computer the size of a credit card, which can run Windows 10 IoT Core, a Windows version created specifically for these kinds of boards. This also means we can use the .NET Framework to set up our solution. The code for this post can be downloaded here.
First you will have to flash your RPi2 with Windows 10. There are some great walkthroughs out there, so I will not explain this here, but instead link you to this site, which explains all the steps to be taken.
After we have flashed our RPi2, it’s time to set up Visual Studio. To develop on Windows IoT Core, we will need to install the project templates. This can be done from Visual Studio by going to Tools, Extensions and Updates, and searching for Windows IoT Core Project Templates. Install these templates, and restart Visual Studio to start creating your own IoT solutions. Don’t forget to enable developer mode on your machine by following these instructions, as this is needed to publish your solution to your device.
Last year I did an IoT session at the Dutch BizTalk User Group, and since then I have had several requests for more information on this topic. After a couple of very busy months at my client, I finally decided to make a series of blogposts on this topic. IoT is a rapidly growing industry, and offers a lot of great opportunities for us integration specialists. All the code and projects I will be creating will be provided along with the posts, and can be downloaded from here. Coming from a nautical background myself, I see more and more scenarios where IoT might be a real game changer, so this will be the scenario I will be using throughout this series.
In the upcoming weeks I will be building my blogposts around the following scenario. An engine supplier for ships, like Caterpillar or ABC, might want to get information about the health and status of their engines. In general, a ship has several engines: for propulsion (main engine and bow thrusters), generators, pumps on tankers, etc. All these engines already produce a lot of data about their health, like temperatures, oil and filter conditions, or issues that might occur during operations. Currently most of this information is displayed to the crew of the ship, who have to interpret it themselves. This often means issues are not noticed until they become a real problem, and the ship has to go in for repairs or adjustments, where every day not spent working can cost a lot of money, especially if the ship is working under contract.
Often you will have to get some content from your messages and use it to set the filename of your outgoing files; in our case we needed to use a sequence number. I will show a way to do this using a pipeline component in a streaming manner.
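The streaming idea is to capture the value while the message flows through the pipeline, without ever loading the full document into memory. Here is a sketch of that idea in Python using incremental parsing; the actual component is a .NET pipeline component, and the `SequenceNumber` element name and message shape are hypothetical:

```python
import io
import xml.etree.ElementTree as ET

# Streaming sketch (the real component is .NET): pull the sequence number
# out of the message while reading it as a stream, without building a full
# DOM. The element name and message shape here are hypothetical.

message = io.BytesIO(
    b"<Order><SequenceNumber>42</SequenceNumber><Lines>...</Lines></Order>"
)

def extract_sequence_number(stream):
    # iterparse reads the stream incrementally and fires events per element
    for _, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == "SequenceNumber":
            return elem.text
        elem.clear()  # free already-processed elements as we go
    return None

seq = extract_sequence_number(message)
filename = f"order_{seq}.xml"  # the value is promoted to the outgoing file name
print(filename)
```

In BizTalk the extracted value would be written to the `FILE.ReceivedFileName` context property rather than a local variable, but the streaming principle is the same.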
XPath is a very nice way to retrieve values from BizTalk messages, especially when you cannot use distinguished fields, for example in looping records. It can, however, be quite a complicated task to find out how to retrieve a certain value. To that end, I have created a list of XPath filter expressions I commonly use. In these examples I will be using the following XML.
<contents>Surface Pro 2</contents>
- Filter on index
Get the third delivery node.
Get the deliverytype node of the second delivery.
- Filter on subnode text
Get all delivery nodes, which have a deliverytype of Home.
Get the deliverytype node of the delivery for BoxID 87.
Get the deliverytype node of the box which contains the stickers.
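The filters above can be tried out quickly outside BizTalk. The sketch below evaluates a few of them with Python's ElementTree XPath subset; the sample XML is a simplified, hypothetical stand-in for the post's message (the original sample is not reproduced here):

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified stand-in for the post's delivery message,
# used to demonstrate the XPath filters described above.
xml = """
<deliveries>
  <delivery><deliverytype>Home</deliverytype><boxid>85</boxid>
    <contents>Laptop</contents></delivery>
  <delivery><deliverytype>Office</deliverytype><boxid>87</boxid>
    <contents>Surface Pro 2</contents></delivery>
  <delivery><deliverytype>Home</deliverytype><boxid>90</boxid>
    <contents>Stickers</contents></delivery>
</deliveries>
"""
root = ET.fromstring(xml)

# Filter on index: the third delivery node (XPath positions are 1-based)
third = root.find("./delivery[3]")

# Subnode of an indexed node: the deliverytype of the second delivery
second_type = root.find("./delivery[2]/deliverytype").text

# Filter on subnode text: all deliveries with a deliverytype of Home
home = root.findall("./delivery[deliverytype='Home']")

# Combined: the deliverytype of the delivery whose boxid is 87
type_87 = root.find("./delivery[boxid='87']/deliverytype").text

print(second_type, len(home), type_87)
```

In a BizTalk orchestration the same expressions would be passed to the `xpath()` function against a message part; the filter syntax itself is identical.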
Often when you want to create a message in an orchestration, the option of using an XmlDocument is chosen. However, this option can be a serious performance hit, as your entire message will be loaded into memory and can become as large as 10 times its original size. To avoid these performance hits, I have created two small helper methods, which allow you to use BizTalk's native messages instead of an XmlDocument, and create the message in a streaming way. Using these, you can simply create a message from a string (and the other way around).
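The gist of the helpers is to expose the string through a stream, so consumers read it incrementally instead of materializing a full document. A rough Python analogue of that idea (the real helpers use BizTalk's native streaming classes in .NET; the message content here is made up):

```python
import io
import xml.etree.ElementTree as ET

# Rough Python analogue of the string <-> message helpers (the real ones use
# BizTalk's native streaming classes in .NET): hand consumers a stream over
# the string instead of a fully loaded DOM.

def message_from_string(s):
    """Wrap a string as a read-only byte stream."""
    return io.BytesIO(s.encode("utf-8"))

def string_from_message(stream):
    """Read a message stream back into a string (the other direction)."""
    return stream.read().decode("utf-8")

msg = message_from_string("<Ack><Status>OK</Status></Ack>")
# A streaming consumer can parse the stream without a full in-memory document:
status = next(
    elem.text
    for _, elem in ET.iterparse(msg, events=("end",))
    if elem.tag == "Status"
)
round_trip = string_from_message(message_from_string("<Empty/>"))
print(status, round_trip)
```

The point is that only the parts of the message being read at any moment are in memory, instead of the whole (potentially 10x inflated) DOM.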
In one of our projects, we had to write an XPath expression in an orchestration to find a value in a node's subnode, where another subnode has a specific value.
So our input is like this:
<Contents>Surface Pro 2</Contents>
Now what we wanted to do is get the IDs of the boxes where the delivery type was Home.
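An expression of the form `delivery[deliverytype='Home']/box/id` does exactly this: filter the deliveries on one subnode, then navigate to the other. A quick Python check of that pattern; the sample XML is a hypothetical stand-in following the description (the post's full input is not reproduced here):

```python
import xml.etree.ElementTree as ET

# Hypothetical stand-in for the input message: each delivery has a
# deliverytype and a box with an id, per the description above.
xml = """
<deliveries>
  <delivery><deliverytype>Home</deliverytype>
    <box><id>85</id><Contents>Laptop</Contents></box></delivery>
  <delivery><deliverytype>Office</deliverytype>
    <box><id>87</id><Contents>Surface Pro 2</Contents></box></delivery>
  <delivery><deliverytype>Home</deliverytype>
    <box><id>90</id><Contents>Stickers</Contents></box></delivery>
</deliveries>
"""
root = ET.fromstring(xml)

# Filter on one subnode (deliverytype), then select the other (box/id)
ids = [e.text for e in root.findall("./delivery[deliverytype='Home']/box/id")]
print(ids)  # ['85', '90']
```

In the orchestration the same expression would be fed to BizTalk's `xpath()` function; the predicate-then-navigate pattern is what solves the "value of one subnode where another subnode matches" requirement.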