SAP Integration Suite comes with SAP Event Mesh, an event broker that allows applications to trigger and react to asynchronous business events.
However, you might need to use other event brokers in your landscape.
Microsoft Azure offers the Event Hubs service, a simple, trusted, and scalable ingestion service for streaming millions of events per second from any source into pipelines that react to business events.
In this blog post we will see how to send events from Cloud Integration to Event Hubs and how to consume them. Event Hubs supports this by means of the AMQP, Apache Kafka, and HTTP protocols. Each of these needs a different configuration and even different credentials in Event Hubs. More information on the supported protocols can be found in Exchange events between consumers and producers.
You can get an Azure free account at the following URL: https://azure.microsoft.com/en-us/free/. It is enough for testing the following scenarios.
Pushing events through AMQP Adapter
To send events to Azure Event Hubs with the Cloud Integration AMQP adapter, you need a namespace and an event hub. The namespace can use any pricing tier, even Basic (more information on Azure Event Hubs pricing tiers can be found on Event Hubs pricing).
The page Create an event hub using Azure portal describes how to create a resource group, a namespace, and an event hub.
To get the credentials, go to the shared access policies of the namespace. There you can use the existing policy or create additional ones. In the policy, copy the connection string–primary key or the connection string–secondary key.
Taking the following connection string as an example, you get the user (SharedAccessKeyName) and the password (SharedAccessKey) as follows:
- Endpoint=sb://nsehamqp.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=**********************************************
- User: RootManageSharedAccessKey
- Password: **********************************************
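The same extraction can be scripted. A minimal Python sketch that splits a connection string into its key=value parts (the key below is a placeholder, not a real secret):

```python
def parse_connection_string(conn_str: str) -> dict:
    # Split the connection string into key=value pairs; maxsplit=1 keeps
    # values intact even if they contain '=' (e.g. base64 key padding).
    parts = dict(p.split("=", 1) for p in conn_str.rstrip(";").split(";") if p)
    # User is the SharedAccessKeyName, password is the SharedAccessKey.
    return {"user": parts["SharedAccessKeyName"],
            "password": parts["SharedAccessKey"]}

conn = ("Endpoint=sb://nsehamqp.servicebus.windows.net/;"
        "SharedAccessKeyName=RootManageSharedAccessKey;"
        "SharedAccessKey=<your-key>")  # placeholder key
creds = parse_connection_string(conn)
print(creds["user"])  # RootManageSharedAccessKey
```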
With this information you can create a User Credentials artifact in Security Material in Cloud Integration.
Next comes the integration flow that sends the events from Cloud Integration to Azure Event Hubs. For the sake of simplicity, I show a simple integration flow triggered by a timer, with a sample JSON payload hardcoded in a content modifier. The content modifier also sets the header Content-Type = application/json.
In the AMQP adapter you must set as Host the host name shown in the overview of the namespace (also found in the connection string). The port must be 5671, Connect with TLS must be true, and Authentication must be SASL. As Credential Name, use the artifact created above.
As Destination Type select Queue, and enter the name of the event hub as Destination Name (also found in the namespace overview).
Once you deploy the integration flow, you should see a completed message.
In the namespace or in the event hub logs you will see the incoming messages or events.
Consuming events through AMQP Adapter
In this chapter, you will see how to consume the events sent to the event hub in the previous chapter. For that, use an integration flow with a sender AMQP adapter. As Host, set the host name shown in the overview of the namespace. The port must be 5671, Connect with TLS must be true, and Authentication must be SASL. As Credential Name, use the artifact created in the previous chapter.
As queue name use the following pattern:
- <event_hub_name>/ConsumerGroups/<consumer_group_name>/Partitions/<Partition_number>
In the event hub overview, you will find the partition count (2 in the example) and the consumer group name ($Default is the default consumer group).
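Since each receiver addresses one partition, you need one such queue name per partition. A Python sketch of the pattern, assuming a hypothetical event hub named eh-amqp with two partitions and the $Default consumer group:

```python
def amqp_queue_names(event_hub: str, consumer_group: str, partition_count: int):
    # One queue name per partition, following the pattern
    # <event_hub_name>/ConsumerGroups/<consumer_group_name>/Partitions/<n>
    return [f"{event_hub}/ConsumerGroups/{consumer_group}/Partitions/{p}"
            for p in range(partition_count)]

for name in amqp_queue_names("eh-amqp", "$Default", 2):
    print(name)
# eh-amqp/ConsumerGroups/$Default/Partitions/0
# eh-amqp/ConsumerGroups/$Default/Partitions/1
```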
After deploying the integration flow, you should be able to see the messages generated for the consumed events.
In the event hub logs you can also see the outgoing events, colored red. The incoming events generated in the previous chapter are colored blue.
Pushing events through Kafka Adapter
To send messages to Azure Event Hubs with the Kafka adapter, you need a namespace with at least the Standard pricing tier, so that the Kafka endpoint is enabled (more information on Azure Event Hubs pricing tiers can be found on Event Hubs pricing). You can create an event hub in the namespace, or it will be created automatically when events are sent to a particular event hub that does not exist yet.
Here, too, you need to get the shared access policies from the namespace.
With the following connection string, the user is the constant $ConnectionString and the password is the whole connection string:
- Endpoint=sb://nsehkafka.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=**********************************************
- User: $ConnectionString
- Password: Endpoint=sb://nsehkafka.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=**********************************************
With this information you can create a User Credentials artifact in Security Material in Cloud Integration.
As in the previous scenario, you see here a sample integration flow. In the Kafka adapter, use the host name from the namespace overview as Host and 9093 as the port. Authentication must be SASL, Connect with TLS true, and SASL Mechanism PLAIN. Use the credential you created before.
As topic, enter the name of the event hub to which you want to send your events.
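For comparison, the same adapter settings expressed as a Kafka client configuration. This is only a sketch: the parameter names follow the kafka-python library, and the namespace and key are placeholders.

```python
# Placeholder connection string (the whole string serves as the password).
connection_string = ("Endpoint=sb://nsehkafka.servicebus.windows.net/;"
                     "SharedAccessKeyName=RootManageSharedAccessKey;"
                     "SharedAccessKey=<your-key>")

producer_config = {
    "bootstrap_servers": "nsehkafka.servicebus.windows.net:9093",  # Host, port 9093
    "security_protocol": "SASL_SSL",             # Connect with TLS -> true
    "sasl_mechanism": "PLAIN",                   # SASL Mechanism PLAIN
    "sasl_plain_username": "$ConnectionString",  # constant user
    "sasl_plain_password": connection_string,    # whole connection string
}

# With kafka-python installed, the event hub name is the topic, e.g.:
# producer = kafka.KafkaProducer(**producer_config)
# producer.send("<event-hub-name>", b'{"id": 1}')
```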
Once deployed, you will see a completed message in the message monitor.
In the namespace and event hub logs you should be able to see your incoming events.
Consuming events through Kafka Adapter
To consume the events sent to the event hub in the previous chapter, use an integration flow with a sender Kafka adapter. As Host, set the host name shown in the overview of the namespace. The port must be 9093, Connect with TLS must be true, Authentication must be SASL, and the SASL Mechanism must be PLAIN. As Credential Name, use the artifact created in the previous chapter.
In the processing tab enter the event hub name as topic.
Once the integration flow is deployed, you will be able to see the messages generated from the consumed events.
You will also see the outgoing events, colored red, in the event hub logs.
Pushing events through HTTP Adapter
Sending events to Azure Event Hubs with the Cloud Integration HTTP adapter again requires a namespace and an event hub. The namespace can use any pricing tier, even Basic.
As an additional step, you need to register an application, which will act as the OAuth2 client. For that, go to Azure Active Directory -> App registrations and add a new application. From the overview, write down the application ID and the tenant ID, which will be used in the security material.
In the Authentication section of the application, you also need to provide a Web redirect URI. Use the following URL:
- https://<your cloud integration tenant management node>/itspaces/odata/api/v1/OAuthTokenFromCode
In the Certificates & secrets section, add a new client secret and write it down, as it will also be used in the security material.
In the API permissions section, add the API Microsoft.EventHubs to grant access to it.
As the last step in Azure, you need to assign the role for sending events to Event Hubs to the application acting as OAuth2 client and to the user account whose resources the application wants to access (the user account where the namespace was created). This is done under Event Hubs -> <your namespace> -> Access control (IAM) -> Role assignments. Assign the role Azure Event Hubs Data Sender there.
Next, you can create the security material in Cloud Integration. In this case you need an OAuth2 Authorization Code artifact. You obtained the needed information in the previous steps.
- Tenant id: 8e5f260f-fc31-4fa6-9251-55bcfc0938b7
- Client id: b0e99ea7-8311-4a73-b9ec-87ce0a1e512c
- Client secret: *************************************
- Authorization URL: https://login.microsoftonline.com/<Tenant id>/oauth2/v2.0/authorize
- Token Service URL: https://login.microsoftonline.com/<Tenant id>/oauth2/v2.0/token
- User Name: user account where the namespace was created
- Scope: https://eventhubs.azure.net/.default
Once saved, you must still authorize the artifact.
You will get a success message and your artifact will be shown with the status Deployed.
Again, you can use the following integration flow to send events through the HTTP adapter. In the HTTP adapter, use the host name from the namespace overview as Address. As Authentication, select None, since authentication is handled in a previous step.
Before the adapter call, add a Groovy script that gets the access token from the OAuth2 Authorization Code artifact you created above and sets it as the Authorization header.
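A minimal sketch of such a Groovy script, assuming the OAuth2 Authorization Code artifact is named EventHubsOAuth2 and using the SecureStoreService API of the Cloud Integration script SDK (check the method name against your SDK version; the "Acces" spelling below matches the SDK):

```groovy
import com.sap.gateway.ip.core.customdev.util.Message
import com.sap.it.api.ITApiFactory
import com.sap.it.api.securestore.SecureStoreService

def Message processData(Message message) {
    // Look up the current access token of the deployed
    // OAuth2 Authorization Code artifact ("EventHubsOAuth2" is an example name).
    def secureStoreService = ITApiFactory.getService(SecureStoreService.class, null)
    def tokenAndUser = secureStoreService
        .getAccesTokenForOauth2AuthorizationCodeCredential("EventHubsOAuth2")
    // Pass the token to the HTTP adapter as a bearer token.
    message.setHeader("Authorization", "Bearer " + tokenAndUser.getAccessToken())
    return message
}
```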
Once the integration flow is deployed, you should see a successful message in the monitor.
Also, you should be able to see the incoming messages in the Event Hub.
Consuming events through HTTP Adapter
Consumption of events from Azure Event Hubs is not supported through the HTTP adapter.
Summary
In this blog post you have seen three ways to connect your Cloud Integration tenant with Azure Event Hubs to send and consume events or messages.