API Management CI/CD using ARM Templates – Unversioned API

This is the third post in my series around setting up CI/CD for Azure API Management using Azure Resource Manager templates. In the first post we created our API Management instance and set up our build and release pipelines, while in the second post we added the products, users and groups for Contoso. In this post we will create an unversioned API, and expose it through the product from the previous post.

The posts in this series are the following; this list will be updated as the posts are published.

Continue reading

Azure Event Hubs and Apache Kafka, a match made in messaging heaven

Last week during Build, Microsoft announced that Azure Event Hubs now supports the Kafka protocol (version 1.0 and onward). This allows us to connect our Kafka clients, whether producers or consumers, to Event Hubs and take advantage of everything Event Hubs gives us: easy integration with Azure services like Stream Analytics, Functions and Logic Apps, features like Capture and auto-inflate, and core Azure capabilities like MSI, RBAC and Virtual Networks.

Azure Event Hubs loves Apache Kafka

So why is this important?

This was the first question when I discussed this with my clients and colleagues: why is it so important that we can now use the Kafka protocol with Event Hubs? When working with Kafka, we have different options for hosting our Kafka cluster.

As always, these options impact the amount of work you have to do yourself, and the control you have over your environment.

Work vs. responsibility

At our clients, we almost always encounter a SaaS over PaaS over IaaS over On-Premises policy, as they want to focus on delivering value, and not have to worry about keeping things running.

So how does Event Hubs fit in here? Event Hubs is a fully managed PaaS solution, meaning you don’t have to worry about any of the maintenance, as this is all handled by Microsoft; instead we can just focus on delivering value, placing it in the serverless space. For Kafka this means we don’t have to worry about the cluster either: we can just point our Kafka application to our Event Hubs endpoint, and everything will be handled for us. So if you don’t need any special configurations, but just need a way to handle your data, Event Hubs is the perfect solution.

Similarities and differences

When looking at Event Hubs and Kafka, we will see a lot of similarities. They are both designed to handle large streams of messages, allowing for real-time data streaming. They both make sure messages are handled in a reliable fashion, and can scale extremely well even under high load.

Event Hubs architecture

But of course, there are also differences. The biggest one was already explained: with Event Hubs you don’t have to worry about configuring and managing your brokers, nor about the servers and network. But hosting is not the only big difference; we also see a lot of variation in the features and tools provided by the two platforms.

Event Hubs gives us many great features

By taking advantage of Event Hubs’ very interesting features, and combining these with Kafka’s ecosystem and tools, we can truly have the best of both worlds. Using Event Hubs with Kafka support, you can keep using your existing tools to get insights into and work with your existing applications, including things like using MirrorMaker to replicate your Kafka messages between Kafka and Event Hubs, while gaining all the features and possibilities which Event Hubs provides us.
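As a rough sketch of that MirrorMaker scenario (the config file names here are illustrative, not from any official sample), you would point MirrorMaker’s consumer at your existing Kafka cluster, and give its producer a config file containing the Event Hubs connection settings shown later in this post:

bin/kafka-mirror-maker.sh --consumer.config source-kafka.config --producer.config mirror-eventhub.config --whitelist ".*"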

Kafka has some great tools and ecosystem

And you can mix and match: you could, for example, have producers sending messages with the Kafka protocol to Azure Event Hubs, and have consumers using the native Event Hubs protocol (AMQP) to process these messages.
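To make this concrete, here is a minimal sketch of such a consumer, using the Azure Event Hubs SDK for Python (the azure-eventhub package); the event hub name and connection string are placeholders you would replace with your own values.

from azure.eventhub import EventHubConsumerClient

# Placeholder connection details for this sketch; use your own namespace values.
CONNECTION_STR = "{YOUR.EVENTHUBS.CONNECTION.STRING}"
EVENTHUB_NAME = "myeventhub"  # a Kafka topic maps to an event hub with the same name

def on_event(partition_context, event):
    # Events produced over the Kafka protocol arrive here as regular Event Hubs events.
    print(f"Partition {partition_context.partition_id}: {event.body_as_str()}")

client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)
with client:
    client.receive(on_event=on_event, starting_position="-1")  # read from the beginning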

So how do we do this?

Getting started with Event Hubs for Kafka is extremely easy, and only needs two changes in the configuration file of the Kafka client. The first change is to switch the endpoint to which the client connects to our Azure Event Hubs namespace, and the second is to set the security protocol to SASL_SSL with the PLAIN mechanism, using the connection string from our Event Hubs instance as the password.

bootstrap.servers={YOUR.EVENTHUBS.FQDN}:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="{YOUR.EVENTHUBS.CONNECTION.STRING}";
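
As an illustration, a minimal Python producer using these same settings could look like the sketch below, written with the confluent-kafka client. Note that this client (built on librdkafka) takes the username and password as separate settings instead of the JAAS config used by Java clients; the topic name is a placeholder for this sketch.

from confluent_kafka import Producer

# Placeholder values for this sketch; use your own Event Hubs namespace and connection string.
conf = {
    "bootstrap.servers": "{YOUR.EVENTHUBS.FQDN}:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": "{YOUR.EVENTHUBS.CONNECTION.STRING}",
}

producer = Producer(conf)
# The Kafka topic name maps to the event hub name inside the namespace.
producer.produce("myeventhub", key="device-1", value="temperature=21.3")
producer.flush()  # block until delivery is confirmed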

On the Event Hubs side, we can easily enable Kafka support when creating an Event Hubs namespace, simply by ticking a checkbox.

Simply tick the box to enable Kafka support on Event Hubs

To see this in action, just follow the instructions on the documentation page.

Conclusion

As we have seen, with just a few minor configuration changes we can now connect our Kafka clients to Azure Event Hubs, allowing us to have the best of both worlds. We can keep working with our existing Kafka applications, managing them with their own tools and ecosystem, while leveraging the ease of use and many great features of Event Hubs.

API Management CI/CD using ARM Templates – Products, users and groups

This is the second post in my series around setting up CI/CD for Azure API Management using Azure Resource Manager templates. In the previous post we created our API Management instance and set up our build and release pipelines. In this post we will add custom products, users and groups to our API Management instance, which will be used to set up our policies and access to our APIs.

API Management products, users and groups

The posts in this series are the following; this list will be updated as the posts are published.

Continue reading

Working with CloudEvents in Azure Event Grid

Recently Microsoft announced Azure Event Grid, a highly scalable serverless event-driven offering allowing us to implement publish and subscribe patterns. Event-driven scenarios are becoming more common by the day, which means that we see these types of integrations increasing a lot as well. Often applications will define their own message formats for their events; however, with the recent announcement of native support in Azure Event Grid for CloudEvents, our lives should be made a lot easier. CloudEvents is a standard for working with events across platforms, and gives us a specification for describing event data in a common way. This allows any platform or application which works with events to implement a common format, enabling easy integration and interoperability, for example between Azure, AWS and Oracle. The specification is still under active development, and Microsoft is one of the big contributors, especially Clemens Vasters, Lead Architect on Azure Messaging Services.

Continue reading

API Management CI/CD using ARM Templates – API Management Instance

This is the first in a series of blog posts around setting up CI/CD for Azure API Management using Azure Resource Manager templates. We will be using Visual Studio Team Services to host our repositories and set up our build and release pipelines. By using CI/CD, our API Management instance will be updated any time we check in changes to our ARM templates.

The posts in this series are the following; this list will be updated as the posts are published.

Continue reading