Looking back on another great year…

Looking back on 2016

The first month of 2017 is almost over, and I have been thinking back on my experiences over the last year. Looking at the integration space, 2016 was the year in which Azure really matured. Of course, we have had the Service Bus stack for quite some time, but last year Logic Apps also went GA, allowing us to create flows in Azure and easily connect cloud services. Later in the year, Azure Functions went live as well, giving us the ability to write small pieces of code which can also be used from Logic Apps, closing the gap for custom code. And this was also the year we got a new BizTalk Server release, BizTalk 2016, which brings us even better integration with Azure, allowing us to focus even more on hybrid integration scenarios. For me personally, this was a year where I had a lot of fun speaking, writing and visiting conferences.

Continue reading

Writing Azure Functions for Logic Apps using Visual Studio

In this post, I will show how we can use Visual Studio to write Azure Functions and use these in our Logic Apps. Azure Functions, which went GA on November 15th, are a great way to write small pieces of code which can then be used from various places, for example triggered by an HTTP request or a message on Service Bus, and can easily integrate with other Azure services like Storage, Service Bus, DocumentDB and more. We can also use our Azure Functions from Logic Apps, which gives us powerful integrations and workflows, using the out-of-the-box Logic Apps connectors and actions while placing our custom code in reusable Functions.
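
As a quick illustration, a minimal HTTP-triggered C# script function, of the kind a Logic App can call through its Azure Functions action, could look roughly like this; the name property and the response shape are illustrative:

```csharp
using System.Net;

// run.csx - a minimal sketch of an HTTP-triggered function which a
// Logic App can call. The "name" property is an illustrative assumption.
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("Function called, for example from a Logic App");

    // Read the JSON body posted by the caller.
    dynamic data = await req.Content.ReadAsAsync<object>();
    string name = data?.name;

    // Return a JSON response which the Logic App can use in later actions.
    return req.CreateResponse(HttpStatusCode.OK, new { greeting = $"Hello {name}" });
}
```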

Writing Azure Functions from Visual Studio

Previously, our main option for writing Azure Functions was the online editor, which can be found in the portal.
[Screenshot: Azure Functions online editor]

However, most developers are used to developing in Visual Studio, with its great debugging capabilities and easy integration with source control. Luckily, earlier this month the preview of Visual Studio Tools for Azure Functions was announced, giving us the ability to write Functions from our beloved IDE. For this post I used a machine with Visual Studio 2015 installed, along with Microsoft Web Developer Tools and the Azure 2.9.6 .NET SDK.
[Screenshot: Visual Studio installation options]
Continue reading

Using Common SettingsFileGenerator File With BizTalk Deployment Framework

One of the great features of the BizTalk Deployment Framework is the ability to use a SettingsFileGenerator file to define your environment-specific settings in an Excel file and use them in your other files. This way you can have generic files, like port bindings and BRE rules, be updated with the correct settings for the environment you are working on, like DEV, TEST or PROD. If you are like me, you will probably also have placed a lot of common settings which are used across all your applications in this file, like SSO user groups, host instance names, common endpoints, web service users, etc. This means we end up with a lot of duplicate settings across our environment settings files, which becomes cumbersome to maintain. Fortunately, there is a way to work around this.

Common SettingsFileGenerator

The BTDF has a nice option we can use to have a single SettingsFileGenerator file for all our applications. In this example we have two applications, with a couple of common settings as well as some application-specific settings. The applications were already set up with BTDF, so we already have all the necessary placeholders in the PortBindingsMaster file. Let's start by creating a CommonSettingsFileGenerator file which has all these settings in one place. To do this, copy the SettingsFileGenerator from one of our projects to a general Tools directory, rename it, and update it with all the common and application-specific settings.

[Screenshot: Common SettingsFileGenerator file]
Continue reading

Integrate 2016 – My take-aways

Last week I attended Integrate 2016 in London, the biggest Microsoft integration event this year, organized by BizTalk360. The turnout was almost 400 attendees, and there were sessions from the Microsoft product group, industry leaders and MVPs.

Vision

The major take-away I have from the event is that Microsoft now has a clear vision of the future of integration, which I felt had been missing the last couple of years. They now recognize that even though the cloud is a great asset, on-premises is not going away for a long time. Microsoft has also officially announced that BizTalk, a product which had not been getting a lot of love lately, will be their on-premises integration solution.

1. Cloud and On-Premises

Continue reading

IoT – Integration of Things: Processing Service Bus Queues Using Azure Functions

In my previous post, I showed how we can use a WebJob to process a Service Bus queue and store the data in an Azure SQL database. This was pretty simple to set up, but it did require a good understanding of how to connect with these services and process the data. Sometimes, however, we just want to do a quick integration without needing to set up all this plumbing. Recently Microsoft announced a new feature called Azure Functions, which now makes this possible. Azure Functions can be used to create a small function which can run stand-alone or be called from other applications, for example from a Logic App, as has been described here by Sandro Pereira. Azure Functions provide out-of-the-box connections for triggers, inputs and outputs to a lot of other Azure features, including Event Hubs, Service Bus, Azure Storage and DocumentDB. In this post I will show how we can process our message from the queue we created in an earlier blog post, and store it in an Azure Storage table. We will start by creating a new Function App in the portal.
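
To sketch where we are heading, a C# script function with a Service Bus queue trigger and an Azure Table output binding could look roughly like this; the binding names (myQueueItem, outputTable) and the SensorReading shape are illustrative assumptions, configured in the function's bindings:

```csharp
// run.csx - a sketch of a queue-triggered function writing to Table Storage.
// The entity needs PartitionKey and RowKey properties for the table binding.
public class SensorReading
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Payload { get; set; }
}

public static void Run(string myQueueItem, ICollector<SensorReading> outputTable, TraceWriter log)
{
    log.Info($"Processing message: {myQueueItem}");

    // Store the raw message; a real implementation would map its fields.
    outputTable.Add(new SensorReading
    {
        PartitionKey = "sensordata",
        RowKey = Guid.NewGuid().ToString(),
        Payload = myQueueItem
    });
}
```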

[Screenshot: Create Azure Function]

Continue reading

IoT – Integration of Things: Processing Service Bus Queue Using WebJobs

This is the fifth post in my series on Integration of Things. In an earlier post I showed how you can send messages from a Raspberry Pi 2 into a Service Bus queue, and in the previous blog post we set up a library for connecting to an Azure SQL database. Today I will explain how we can use a WebJob to retrieve the messages from the queue and send them to our database. The code for this blog post can be found here.

A WebJob is a simple way to set up a background job, which can run continuously or on a schedule. WebJobs differ from a cloud service (which we discussed in an earlier blog post) in that they give you less fine-grained control over your processing environment, making them a truer PaaS service.
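
A rough sketch of such a WebJob, built on the WebJobs SDK with its Service Bus extension; the queue name "sensordata" is an assumption, and connection strings come from the standard AzureWebJobs* app settings:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;

class Program
{
    static void Main()
    {
        // Wire up the Service Bus extension and start listening.
        var config = new JobHostConfiguration();
        config.UseServiceBus();
        new JobHost(config).RunAndBlock();
    }
}

public class Functions
{
    // Invoked for every message on the queue; "sensordata" is an assumed name.
    public static void ProcessQueueMessage([ServiceBusTrigger("sensordata")] string message, TextWriter log)
    {
        log.WriteLine($"Received: {message}");
        // Here we would map the message to an entity and save it to the database.
    }
}
```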

We will need a Web App to host our WebJob, so let's create one in the Azure Portal. You can create a new Web App by going to App Services and selecting New.

[Screenshot: Azure Web App]

Continue reading

IoT – Integration of Things: Entity Framework Code First And Azure SQL

This is the fourth post in my series on Integration of Things. In this post, we will use Entity Framework Code First to set up an Azure SQL database, which will later be filled with the data we receive from our Service Bus queue. As we will want to access this database from multiple projects, we will add it to the DataTypes class library we created in the previous blog post. The code for this blog post can be found here.
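
As a preview of the Code First approach, a context and entity could look roughly like this; the class names and the connection string name are illustrative, not the exact types from the series:

```csharp
using System;
using System.Data.Entity;

// Illustrative entity; Code First will generate a matching table for it.
public class Measurement
{
    public int Id { get; set; }
    public string Device { get; set; }
    public double Temperature { get; set; }
    public DateTime Time { get; set; }
}

public class MeasurementContext : DbContext
{
    // "AzureSqlDatabase" is an assumed connection string name in app.config,
    // pointing at the Azure SQL database we are about to create.
    public MeasurementContext() : base("AzureSqlDatabase") { }

    public DbSet<Measurement> Measurements { get; set; }
}
```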

First we will create an empty database in Azure.

[Screenshot: Azure SQL Database]

Continue reading

Automated Build and Deployment With BizTalk Deployment Framework

A while ago I wrote a post on using the BizTalk Deployment Framework for automated build and deployment. Since then, I have reworked this to be easier and more maintainable using PowerShell, which I will show in this post. The BizTalk Deployment Framework is one of those pearls for BizTalk developers, allowing complex BizTalk solutions to be deployed easily, with all our artifacts and dependencies together in one MSI. The code for this post can be downloaded from here.

Description

Using PowerShell, we will create scripts which handle all steps of the build and deployment process for us. This will make sure our applications are always deployed in the correct order, using the right versions, and with minimal effort. We have some general helper functions, which will help us clear log files, wait for user input, iterate through directories, etc. We assume you are using some of the BTDF best practices for these scripts when it comes to naming conventions and folder structure. Of course, in case anything differs in your environment, you can easily adjust the scripts to meet your requirements.

Continue reading

BizTalk Server Extensibility E-Book

Everyone who has been working with BizTalk knows how powerful this product can be. It allows you to tackle a lot of integration scenarios out of the box, but sometimes you will run into a requirement which cannot be handled using just the standard BizTalk components. Luckily, BizTalk can be extended at many points, giving you the power to handle all your scenarios. Some of these extensibility points are listed below; a small pipeline component sketch follows the list.

  • Ports (Custom behaviors and adapters)
  • Pipelines (Pipeline components)
  • Mappings (XSLT, Functoids, XPath)
  • Orchestration (XPath, Helper classes)
  • Configuration (SSO Helper)
  • Deployment (Deployment Framework)
  • Testing (BizUnit, Visual Studio Test, Custom clients)
  • Monitoring (BAM, BizTalk assemblies)
  • Rules (BRE)
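
To make one of these concrete, here is a minimal sketch of a custom pipeline component. It only implements the runtime interfaces; a real component would also implement IComponentUI and IPersistPropertyBag for design-time support, and all names here are illustrative:

```csharp
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

// A pass-through pipeline component which traces each message it sees.
[ComponentCategory(CategoryTypes.CATID_PipelineComponent)]
public class TracingComponent : IBaseComponent, IComponent
{
    public string Name => "TracingComponent";
    public string Version => "1.0";
    public string Description => "Traces every message passing through the pipeline.";

    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Custom processing would go here; this sketch only writes a trace line.
        System.Diagnostics.Trace.WriteLine($"Processing message {pInMsg.MessageID}");
        return pInMsg;
    }
}
```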

Continue reading

IoT – Integration of Things: Processing Event Hubs From Azure Cloud Service

This is the third post in my series on Integration of Things. In my previous post I explained how you can send and receive data between a Raspberry Pi 2 and Azure. Today I will explain how you can use an Azure cloud service as a worker role for retrieving the data from Event Hubs using the Event Processor Host library. We will save the retrieved data in Azure Table Storage, which is a great service for working with large amounts of structured, non-relational data. Azure Table Storage is very fast and cost-efficient, especially when working with lots of data, which makes it ideal for our scenario. The code for this blog post can be found here.

The Event Processor Host library will be used to retrieve the data from our Event Hub and load it into Azure Table Storage. This library distributes the Event Hub's partitions across the instances of our worker role, keeping track of leases and snapshots, and really makes working with Event Hubs from .NET code a breeze. We will need a storage account for the table and for the library to store its lease data, so let's start by setting one up via the Azure Portal.
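
For reference, an event processor built on this library could look roughly like this; all names are illustrative, and the Table Storage insert is left as a comment:

```csharp
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

// Minimal IEventProcessor sketch; the host creates one instance per partition.
public class TelemetryProcessor : IEventProcessor
{
    public Task OpenAsync(PartitionContext context) => Task.FromResult(0);

    public async Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
    {
        foreach (var eventData in messages)
        {
            var payload = Encoding.UTF8.GetString(eventData.GetBytes());
            // Map the payload to a TableEntity and insert it into Table Storage here.
        }

        // Checkpoint so another instance can resume from here after a failover.
        await context.CheckpointAsync();
    }

    public Task CloseAsync(PartitionContext context, CloseReason reason) => Task.FromResult(0);
}
```

The worker role would then register this processor with an EventProcessorHost, pointing it at the Event Hub and the storage account we create next.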

[Screenshot: Azure Storage]

Continue reading