Category Archives: Event Hubs

Last week during Build, Microsoft announced that Azure Event Hubs now supports the Kafka protocol, version 1.0 and later. This allows us to connect our Kafka clients, whether producers or consumers, to Event Hubs and take advantage of all the features Event Hubs gives us: easy integration with Azure services like Stream Analytics, Functions and Logic Apps, Capture and auto-inflate, and core Azure features like MSI, RBAC and Virtual Networks.
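To give an idea of what this looks like from the client side, here is a minimal sketch of a Kafka producer pointed at an Event Hubs namespace, using the Confluent.Kafka .NET client; the namespace, event hub name and connection string are placeholders for illustration, not taken from the original post.

```csharp
using System;
using System.Threading.Tasks;
using Confluent.Kafka;

class Program
{
    static async Task Main()
    {
        // Placeholder namespace and connection string; replace with your own.
        var config = new ProducerConfig
        {
            // Event Hubs exposes its Kafka endpoint on port 9093.
            BootstrapServers = "mynamespace.servicebus.windows.net:9093",
            SecurityProtocol = SecurityProtocol.SaslSsl,
            SaslMechanism = SaslMechanism.Plain,
            // The literal string "$ConnectionString" is used as the SASL username,
            // with the Event Hubs connection string as the password.
            SaslUsername = "$ConnectionString",
            SaslPassword = "Endpoint=sb://mynamespace.servicebus.windows.net/;..."
        };

        using (var producer = new ProducerBuilder<Null, string>(config).Build())
        {
            // The event hub is addressed as if it were a Kafka topic.
            var result = await producer.ProduceAsync("myeventhub",
                new Message<Null, string> { Value = "Hello from a Kafka client" });
            Console.WriteLine($"Delivered to partition {result.Partition}, offset {result.Offset}");
        }
    }
}
```

The only changes compared to talking to a regular Kafka broker are the SASL settings and the endpoint; the rest of the client code stays exactly the same.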
Creating event driven integrations using Azure Event Grid
Yesterday Microsoft announced their newest service on Azure, called Azure Event Grid. With this new service we now have event-based serverless routing, from any source to any destination. Of course we all love integration, and Azure Event Grid opens up a whole new world of possibilities. The service is currently in public preview, and we already have various publishers and event handlers at our disposal, with more rolling out over the coming months. In the end, expect every service within Azure to have a connection to Event Grid.
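To give a feel for what an event handler receives, the sketch below shows the general shape of an Event Grid event as a C# class; the GridEvent class name and the deserialization snippet are assumptions for illustration only.

```csharp
using System;
using Newtonsoft.Json;

// Shape of a single event as delivered by Event Grid to an event handler,
// for example an HTTP-triggered Azure Function or a custom webhook.
public class GridEvent
{
    [JsonProperty("id")] public string Id { get; set; }
    [JsonProperty("topic")] public string Topic { get; set; }
    [JsonProperty("subject")] public string Subject { get; set; }
    [JsonProperty("eventType")] public string EventType { get; set; }
    [JsonProperty("eventTime")] public DateTime EventTime { get; set; }
    [JsonProperty("dataVersion")] public string DataVersion { get; set; }
    // The data payload differs per publisher, so it is left untyped here.
    [JsonProperty("data")] public object Data { get; set; }
}

// Event Grid posts events as a JSON array, so a handler deserializes a batch:
// var events = JsonConvert.DeserializeObject<GridEvent[]>(requestBody);
```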
Integrate 2016 – My take-aways
Last week I attended Integrate 2016 in London, the biggest Microsoft integration event of the year, organized by BizTalk360. The turnout was almost 400 attendees, and there were sessions from the Microsoft product group, industry leaders and MVPs.
Vision
The major take-away I have from the event is that Microsoft now has a clear vision for the future of integration, which I felt was missing the last couple of years. They now recognize that even though the cloud is a great asset, on-premises is not going away for a long time. Microsoft has also officially announced that BizTalk, which has not been getting a lot of love lately, will be their on-premises integration solution.
IoT – Integration of Things: Processing Event Hubs From Azure Cloud Service
This is the third post in my series on Integration of Things. In my previous post I explained how you can send and receive data between a Raspberry Pi 2 and Azure. Today I will explain how you can use an Azure Cloud Service worker role to retrieve the data from Event Hubs using the Event Processor Host library. We will save the retrieved data in Azure Table Storage, a great service for working with large amounts of structured, non-relational data. Azure Table Storage is fast and cost-efficient, especially when working with lots of data, which makes it ideal for our scenario. The code for this blog post can be found here.
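As a rough sketch of what storing such data could look like with the WindowsAzure.Storage library, the example below defines a table entity and inserts it; the entity, its properties and the table name are assumptions for illustration, the real implementation is in the linked code.

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Illustrative entity for readings coming in from the device;
// PartitionKey and RowKey together form the unique key of a row.
public class SensorReadingEntity : TableEntity
{
    public SensorReadingEntity() { }

    public SensorReadingEntity(string deviceId, string readingId)
    {
        PartitionKey = deviceId;
        RowKey = readingId;
    }

    public double Temperature { get; set; }
}

public static class TableWriter
{
    public static void Save(string storageConnectionString, SensorReadingEntity entity)
    {
        // Parse the storage account connection string and get a reference to the table.
        var account = CloudStorageAccount.Parse(storageConnectionString);
        var tableClient = account.CreateCloudTableClient();
        var table = tableClient.GetTableReference("sensorreadings");
        table.CreateIfNotExists();

        // Insert the entity, overwriting any existing row with the same keys.
        table.Execute(TableOperation.InsertOrReplace(entity));
    }
}
```

Choosing the device id as PartitionKey keeps all readings of one device together, which makes querying per device cheap.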
The Event Processor Host library will be used to retrieve the data from our event hub and load it into Azure Table Storage. This library distributes the Event Hubs partitions across our instances of the worker role, keeping track of leases and checkpoints, and it really makes working with Event Hubs from .NET code a breeze. We will need a storage account for the table and for the library to store its data, so let's start by setting one up via the Azure Portal.
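To give an impression of how the library is used, here is a minimal sketch of an event processor and how it would be registered from the worker role; the class name and the mapping to Table Storage are placeholders, the full implementation follows in the rest of the post.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

// One instance of the processor is created for every partition the host owns a lease on.
public class TableStorageEventProcessor : IEventProcessor
{
    public Task OpenAsync(PartitionContext context)
    {
        // Called when the host takes a lease on a partition.
        return Task.FromResult(0);
    }

    public async Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
    {
        foreach (var eventData in messages)
        {
            var payload = System.Text.Encoding.UTF8.GetString(eventData.GetBytes());
            // TODO: map the payload to a table entity and save it to Table Storage.
        }

        // Store a checkpoint in blob storage so processing resumes here after a restart.
        await context.CheckpointAsync();
    }

    public async Task CloseAsync(PartitionContext context, CloseReason reason)
    {
        // Checkpoint once more when the lease is released gracefully.
        if (reason == CloseReason.Shutdown)
        {
            await context.CheckpointAsync();
        }
    }
}

// Registering the processor from the worker role, using placeholder names:
// var host = new EventProcessorHost(hostName, eventHubPath, consumerGroupName,
//     eventHubConnectionString, storageConnectionString);
// await host.RegisterEventProcessorAsync<TableStorageEventProcessor>();
```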