API Management CI/CD using ARM Templates – Unversioned API

This is the third post in my series around setting up CI/CD for Azure API Management using Azure Resource Manager templates. In the first post we created our API Management instance and set up our build and release pipelines, while in the second post we added the products, users and groups for Contoso. In this post we will create an unversioned API, and expose it through the product from the previous post.

The posts in this series are the following; this list will be updated as the posts are published.

Continue reading

Azure Event Hubs and Apache Kafka, a match made in messaging heaven

Last week during Build, Microsoft announced that Azure Event Hubs now supports the Kafka protocol (version 1.0 and onward). This allows us to connect our Kafka clients, be they producers or consumers, to Event Hubs and take advantage of everything Event Hubs gives us, like easy integration with Azure services including Stream Analytics, Functions and Logic Apps, features like Capture and auto-inflate, and core Azure capabilities like MSI, RBAC and Virtual Networks.

Azure Event Hubs loves Apache Kafka

So why is this important?

This was the first question that came up when I discussed this with my clients and colleagues: why is it so important that we can now use the Kafka protocol with Event Hubs? When working with Kafka, we have different options for hosting our Kafka cluster.

As always, these options have an impact on the amount of work you have to do yourself, and on the control you have over your environment.

Work vs. responsibility

At our clients, we almost always encounter a SaaS over PaaS over IaaS over on-premises policy, as they want to focus on delivering value, and not have to worry about keeping things running.

So how does Event Hubs fit in here? Event Hubs is a fully managed PaaS solution, meaning you don't have to worry about any of the maintenance, as this is all handled by Microsoft; instead we can just focus on delivering value, placing it in the serverless space. For Kafka this means we don't have to worry about the cluster either: we can just point our Kafka application to our Event Hubs endpoint, and everything will be handled for us. So if you don't need any special configuration, but just need a way to handle your data, Event Hubs is the perfect solution.

Similarities and differences

When looking at Event Hubs and Kafka, we see a lot of similarities. They are both designed to handle large streams of messages, allowing for real-time data streaming. They both make sure messages are handled reliably, and both can scale extremely well even under high load.

Event Hubs architecture

But of course, there are also differences. The biggest difference was already explained: with Event Hubs you don't have to worry about configuring and managing your brokers, nor about the servers and network. But the hosting options are not the only big difference; we also see a lot of variation in the features and tools provided by both platforms.

Event Hubs gives us many great features

By taking advantage of Event Hubs' very interesting features, and combining these with Kafka's ecosystem and tools, we can truly have the best of both worlds. Using Event Hubs with Kafka support, you can keep using your existing tools to get insights into and work with your existing applications, including using MirrorMaker to replicate your Kafka messages between Kafka and Event Hubs, while gaining all the features and possibilities Event Hubs provides us.
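
For example, mirroring into Event Hubs boils down to pointing MirrorMaker's target (producer) configuration at the Event Hubs endpoint. A minimal sketch, assuming MirrorMaker 1 and using the same SASL settings shown later in this post:

# producer.config - the target cluster is our Event Hubs namespace
bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";

# Mirror all topics from the source Kafka cluster into Event Hubs
bin/kafka-mirror-maker.sh --consumer.config consumer.config --producer.config producer.config --whitelist ".*"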

Kafka has a great ecosystem and tools

And you can mix and match this: you could, for example, have producers sending messages with the Kafka protocol to Azure Event Hubs, and consumers using the Event Hubs protocol to process these messages.
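
To illustrate that last point, here is a minimal sketch of such a consumer reading the stream over the native Event Hubs protocol, using the azure-messaging-eventhubs Java client (the event hub name and partition are illustrative assumptions):

import com.azure.messaging.eventhubs.EventHubClientBuilder;
import com.azure.messaging.eventhubs.EventHubConsumerAsyncClient;
import com.azure.messaging.eventhubs.models.EventPosition;

public class EventHubsNativeConsumer {
    public static void main(String[] args) throws InterruptedException {
        // Connect over the native Event Hubs (AMQP) protocol, not Kafka
        EventHubConsumerAsyncClient consumer = new EventHubClientBuilder()
            .connectionString("{YOUR.EVENTHUBS.CONNECTION.STRING}", "mytopic")
            .consumerGroup(EventHubClientBuilder.DEFAULT_CONSUMER_GROUP_NAME)
            .buildAsyncConsumerClient();

        // Read events from partition 0, starting at the beginning of the stream
        consumer.receiveFromPartition("0", EventPosition.earliest())
            .subscribe(event -> System.out.println(event.getData().getBodyAsString()));

        Thread.sleep(30_000); // keep the process alive while events stream in
        consumer.close();
    }
}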

So how do we do this?

Getting started with Event Hubs for Kafka is extremely easy, and only requires two changes to the configuration file of the Kafka client. The first change is to switch the endpoint to which the client connects to our Azure Event Hubs instance, and the second is to update the security protocol to SASL PLAIN, using the connection string from our Event Hubs instance as the password.

bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
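
With that configuration in place, the application code itself does not need to change. As a minimal sketch (the topic name is an assumption; it maps to the event hub inside the namespace), a standard Java producer would look like this:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class EventHubsKafkaProducer {
    public static void main(String[] args) {
        // Same settings as in the configuration file above
        Properties props = new Properties();
        props.put("bootstrap.servers", "{YOUR.EVENTHUBS.FQDN}:9093");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"$ConnectionString\" password=\"{YOUR.EVENTHUBS.CONNECTION.STRING}\";");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // The Kafka topic name maps to the event hub name inside the namespace
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("mytopic", "Hello Event Hubs!"));
        }
    }
}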

On the Event Hubs side, we can easily enable Kafka support when creating an Event Hubs namespace, simply by ticking a checkbox.

Simply tick the box to enable Kafka support on Event Hubs
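
If you prefer to deploy through ARM templates instead of the portal, the namespace can be created with Kafka enabled as well; a minimal sketch (the kafkaEnabled property is an assumption based on the preview API version, and Kafka requires the Standard tier):

{
  "type": "Microsoft.EventHub/namespaces",
  "apiVersion": "2018-01-01-preview",
  "name": "[parameters('namespaceName')]",
  "location": "[resourceGroup().location]",
  "sku": { "name": "Standard", "tier": "Standard" },
  "properties": { "kafkaEnabled": true }
}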

To see this in action, just follow the instructions on the documentation page.

Conclusion

As we have seen, with just a few minor configuration changes we can now connect our Kafka clients to Azure Event Hubs, giving us the best of both worlds. We can keep working with our existing Kafka applications, managed with their own tools and ecosystem, while leveraging the ease of use and many great features of Event Hubs.

API Management CI/CD using ARM Templates – Products, users and groups

This is the second post in my series around setting up CI/CD for Azure API Management using Azure Resource Manager templates. In the previous post we created our API Management instance and set up our build and release pipelines. In this post we will add custom products, users and groups to our API Management instance, which will be used to set up our policies and control access to our APIs.

API Management products, users and groups

The posts in this series are the following; this list will be updated as the posts are published.

Continue reading

Working with CloudEvents in Azure Event Grid

Recently Microsoft announced Azure Event Grid, a highly scalable serverless event-driven offering allowing us to implement publish and subscribe patterns. Event-driven scenarios are becoming more common by the day, which means we see these types of integrations increasing a lot as well. Often applications define their own message formats for their events; however, with the recent announcement of native support in Azure Event Grid for CloudEvents, our lives should become a lot easier. CloudEvents is a standard for working with events across platforms, and gives us a specification for describing event data in a common way. This allows any platform or application which works with events to implement a common format, enabling easy integration and interoperability, for example between Azure, AWS and Oracle. The specification is still under active development, and Microsoft is one of the big contributors, especially Clemens Vasters, Lead Architect on Azure Messaging Services.
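
To give an idea of what such a common format looks like, here is an illustrative event in the CloudEvents JSON shape (attribute names follow the later 1.0 version of the spec and may differ in earlier drafts; the values are made up):

{
  "specversion": "1.0",
  "type": "Contoso.Orders.OrderCreated",
  "source": "/orders",
  "id": "A234-1234-1234",
  "time": "2018-05-10T12:00:00Z",
  "datacontenttype": "application/json",
  "data": { "orderId": 42 }
}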

Continue reading

API Management CI/CD using ARM Templates – API Management Instance

This is the first post in a series of blog posts around setting up CI/CD for Azure API Management using Azure Resource Manager templates. We will be using Visual Studio Team Services to host our repositories and to set up our build and release pipeline. By using CI/CD, our API Management instance will be updated any time we check in changes to our ARM templates.
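
To give a first impression of what we will be checking in, a minimal sketch of the API Management resource in an ARM template could look like the following (names and values are illustrative assumptions):

{
  "type": "Microsoft.ApiManagement/service",
  "apiVersion": "2018-01-01",
  "name": "[parameters('apimServiceName')]",
  "location": "[resourceGroup().location]",
  "sku": { "name": "Developer", "capacity": 1 },
  "properties": {
    "publisherEmail": "admin@contoso.com",
    "publisherName": "Contoso"
  }
}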

The posts in this series are the following; this list will be updated as the posts are published.

Continue reading

Microsoft Azure becomes Magic Quadrant leader in Enterprise iPaaS

Last week the new Gartner Magic Quadrant for Enterprise Integration Platform as a Service (EiPaaS) was published, listing Microsoft in the coveted leader space. Having worked with Azure's iPaaS products for a long time now, I wholeheartedly agree with this decision, and congratulate all the teams within Microsoft who have been working so hard to get to where we are today. All requirements and results can be found in the complete report.

Source: Gartner (April 2018)

Continue reading

Correlating messages over Logic Apps using Service Bus

When working with Azure Logic Apps, I like to have each Logic App do a single piece of work, as this allows us to mix and match these Logic Apps in various flows. For this demo, we will use a very simple representation of this: one Logic App which receives the message and sends back a response to the original caller, another Logic App which transforms the message, and finally a Logic App which calls a backend system. To decouple these Logic Apps we will use Azure Service Bus topics, providing us with routing capabilities and allowing us to handle downtime more easily.

Architecture

Now the challenge we ran into is that we needed to return the response we received from the backend to the original caller.

Requested architecture

Of course, since we have implemented the communication between the Logic Apps asynchronously, decoupled through Service Bus, we don't have a return channel on which we can send the response. In this post, I will show how we can solve this by using Service Bus sessions.
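
As a preview of the mechanism (the post itself implements this with Logic Apps and the Service Bus connector), here is a minimal sketch in Java using the azure-messaging-servicebus client, with hypothetical topic and queue names: the request carries a ReplyToSessionId, the responding side copies it into the SessionId of its response, and the waiting side accepts exactly that session:

import com.azure.messaging.servicebus.*;
import java.util.UUID;

public class SessionCorrelation {
    public static void main(String[] args) {
        String connectionString = "{YOUR.SERVICEBUS.CONNECTION.STRING}";
        String replySessionId = UUID.randomUUID().toString();

        // Send the request, telling the processing side which session to reply on
        ServiceBusSenderClient sender = new ServiceBusClientBuilder()
            .connectionString(connectionString)
            .sender().topicName("requests").buildClient();
        ServiceBusMessage request = new ServiceBusMessage("{ \"orderId\": 42 }");
        request.setReplyToSessionId(replySessionId);
        sender.sendMessage(request);

        // Wait for the response by accepting exactly our session
        // on the session-enabled response queue
        ServiceBusSessionReceiverClient sessionReceiver = new ServiceBusClientBuilder()
            .connectionString(connectionString)
            .sessionReceiver().queueName("responses").buildClient();
        try (ServiceBusReceiverClient receiver = sessionReceiver.acceptSession(replySessionId)) {
            receiver.receiveMessages(1)
                .forEach(response -> System.out.println(response.getBody().toString()));
        }
    }
}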

Continue reading

Global Integration Bootcamp 2018

Last Saturday was the second edition of the Global Integration Bootcamp, and we can certainly say it was another big hit! In total we had 15 locations in 12 countries running the Bootcamp, and about 600 participants including the speakers.

Locations all over the world

This is an amazing achievement, and I would like to thank all the local organizers, and of course my fellow global organizers.

The global organizers

Continue reading

Integration Patterns In Azure – Message Router Using Service Bus

In the previous post, we saw how we can implement the Message Router pattern using Logic Apps. As discussed, Logic Apps are a great fit if you have a limited set of endpoints to which you want to route the message, and if you have a need for the various connectors. In this post we will look into another technology to implement this pattern: Azure Service Bus topics. Topics are a great solution if we want to implement a publish/subscribe mechanism, giving us the capabilities below (a small code sketch of the routing follows the list).

  • Send our messages to one or more subscriptions in our topic.
  • Each subscription represents a virtual queue, from where subscribers can pull their messages, allowing receiving systems to process messages at their own speed.
  • Receiver and sender are completely decoupled, so systems can work independently from each other.
  • Topics have dead-lettering capabilities built in, so messages are not lost even in case of issues.
  • Easily add new subscriptions, so we can quickly on-board new systems.
Azure Service Bus Topics
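
As a small sketch of how the routing works (hypothetical names, using the azure-messaging-servicebus Java client): the sender stamps a property on the message, and each subscription uses a SQL filter, for example Destination = 'WarehouseSystem', so it only receives the messages meant for it:

import com.azure.messaging.servicebus.ServiceBusClientBuilder;
import com.azure.messaging.servicebus.ServiceBusMessage;
import com.azure.messaging.servicebus.ServiceBusSenderClient;

public class MessageRouter {
    public static void main(String[] args) {
        ServiceBusSenderClient sender = new ServiceBusClientBuilder()
            .connectionString("{YOUR.SERVICEBUS.CONNECTION.STRING}")
            .sender().topicName("orders").buildClient();

        // Subscriptions filter on this property, e.g. with the SQL filter
        // Destination = 'WarehouseSystem', so only matching subscribers see the message
        ServiceBusMessage message = new ServiceBusMessage("{ \"orderId\": 42 }");
        message.getApplicationProperties().put("Destination", "WarehouseSystem");
        sender.sendMessage(message);
        sender.close();
    }
}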

Continue reading