SAP Event Objects – What and Why?

Event-driven architecture is an approach to software design in which the components of a system respond to events or messages in a decoupled manner. In this architecture, events are sent and received by the system’s components, triggered by user actions, system occurrences, or external factors.

This design allows for more flexibility, scalability, and responsiveness in system design, as well as improved handling of large volumes of data and traffic. Additionally, it can enable greater collaboration and innovation by allowing organizations to integrate different systems and technologies.

SAP has developed event frameworks for its line-of-business solutions. In recent years, an impressive set of event objects has become available for S/4HANA (Cloud and On-Premise), which you can find documented on the SAP API Business Hub. Events are sent following the CloudEvents specification.

What does this blog address?

This blog serves as a practical how-to guide for setting up SAP event objects, compiled from various sources of information and personal insights. The aim is to help you save time and effort in setting up your system by providing clear and concise steps for implementation. While not addressing a specific use case, this guide will explain the mechanics and setup of SAP event objects, so you can get started with this powerful tool and take full advantage of its capabilities.

By following this guide, you can benefit from SAP’s event objects in many ways, including faster data processing, better integration with other systems, and improved system agility. Whether you’re new to SAP or an experienced user, this guide can help you unlock the full potential of event-driven architecture in your business processes.

Concept of external cloud events on SAP BTP Kyma

One of the most challenging aspects of working with SAP S/4HANA Cloud Events on SAP Event Mesh on SAP BTP Kyma is understanding how external events are processed. Initially, I was caught up in the paradigm of how Kyma and NATS eventing works, which turned out to be a different approach altogether.

To shed some light on the process, let’s take a look at the image below, which illustrates the flow of event information from source to consumer.


Image 1: Event information flow from source to consumer.

While this is not a perfect technical explanation, the basic idea is that an event raised in S/4HANA Cloud goes through the event framework in S/4HANA Cloud, which identifies a suitable communication scenario (in this case, SAP_COM_0092) where the event was previously set as an outbound topic through the Enterprise Event Enablement Fiori application.

From there it travels to the SAP Event Mesh on BTP, into the client that is generated by the S/4HANA Cloud Extensibility messaging service. In that client you can also see the subscribed event (which was an outbound topic in S/4HANA Cloud), but you cannot change it; think of it as a read-only client.

In the Event Mesh client, however, an event queue is set up with a subscription to the event provided by the aforementioned read-only client. Here all incoming external events are collected for consumption. To deliver them, a webhook is defined that points to a URL on the Kyma cluster, secured through OAuth2 client credentials.

Finally, the event is received by the API rule on Kyma and forwarded through the service to the program running in a pod’s container. That can be a Kyma serverless function or any other container. The image above also shows how such a cloud event is structured; the content of the data section is specific to the business object, or rather, to the event object.

Configure Event consumption from S/4HANA Cloud

Let’s now walk through the same setup for events.

A. Define and deploy the Event Mesh for S/4HANA Cloud

This service instance provides the extensibility for events to and from S/4HANA Cloud. Create a yaml file similar to the one below.

apiVersion: services.cloud.sap.com/v1
kind: ServiceInstance
metadata:
  name: s4hc-ext-em-service
  labels:
    app.kubernetes.io/name: s4hc-ext-service
  namespace: s4hc-extensibility
spec:
  externalName: ''
  serviceOfferingName: s4-hana-cloud
  servicePlanName: messaging
  parameters:
    systemName: S4HANA-CLOUD-APJREGION
    emClientId: apj0

It’s important to have the entitlement for S/4HANA Cloud Extensibility with the plan messaging set (we did this before already if you followed all steps). The systemName is the same name under which we registered the S/4HANA Cloud tenant. The emClientId must not be longer than 4 characters.

Creating this service takes a while (approx. 5 minutes) since it also creates an instance of SAP Event Mesh in BTP. Once you log in to the Event Mesh UI, you should see the emClientId as a client, like so:


Image 2: SAP Event Mesh instance created through Kyma yaml

If you check the communication arrangements in S/4HANA Cloud, you will also find that some magic happened: the scenario SAP_COM_0092 was created.


Image 3: S/4HANA Cloud – generated Enterprise Eventing Integration.

Next, search for the Enterprise Event Enablement application in S/4HANA Cloud. You need the role SAP_BR_ADMINISTRATOR assigned to your user for that.

Now add the event you want to work with, e.g. sap/s4/beh/salesorder/v1/SalesOrder/Changed/*, as an outbound topic. Once added, you will see it as an event in the Event Mesh UI under Events.

Create the SAP Event Mesh service on BTP

Wait! Didn’t we just do that? I found this slightly confusing since we already navigated in the SAP Event Mesh UI. However, what we set up before was the service s4-hana-cloud with the plan messaging. What we’re now setting up is the service enterprise-messaging with its plan default.

For this you can refer to the yaml below; of course, adjust the naming to your needs as before.

---
apiVersion: services.cloud.sap.com/v1
kind: ServiceInstance
metadata:
  name: sap-event-mesh-service
  labels:
    app.kubernetes.io/name: sap-event-mesh-service
  namespace: s4hc-extensibility
spec:
  externalName: ''
  serviceOfferingName: enterprise-messaging
  servicePlanName: default
  parameters:
    emname: apj1-event-mesh
    version: 1.1.0
    namespace: sap/eventmeshazure/apj1
    options:
      management: true
      messagingrest: true
      messaging: true
    rules:
      queueRules:
        subscribeFilter:
        - "${namespace}/*"
        - sap/S4HANAOD/apj0/*
      topicRules:
        subscribeFilter:
        - "${namespace}/*"
        - sap/S4HANAOD/apj0/*

You need to make sure that at least one of the subscribeFilter entries (the list items prefixed with a dash) matches the namespace of your S/4HANA Cloud Extensibility messaging client. That’s why I have apj0 in there. If you miss this, you can’t change the service descriptor later (or at least I don’t know how), so you would need to delete the instance and deploy it again with the updated filter.

Consume the events in the test tool

Now back to the UI of SAP Event Mesh. Open the message client we defined before (apj1) and create a new queue under Queues, e.g. salesOrderChangedQueue. Add an action to the new queue: Queue Subscription. Here, add the event that we activated in the apj0 client. If we now change any sales order in S/4HANA Cloud, an event is raised and we can consume it in the test tool.


Image 4: Test tool in the SAP Event Mesh UI – Consume an event.

Note the structure of an event:

{
	"type": "sap.s4.beh.salesorder.v1.SalesOrder.Changed.v1",
	"specversion": "1.0",
	"source": "/default/sap.s4.beh/740755835",
	"id": "9b382d0f-5a6e-1eed-b593-8fe27f15b41b",
	"time": "2023-04-06T17:00:14Z",
	"datacontenttype": "application/json",
	"data": {
		"SalesOrder": "210655",
		"EventRaisedDateTime": "2023-04-06T17:00:14.131574Z",
		"SalesOrderType": "TA",
		"SalesOrganization": "1710",
		"DistributionChannel": "10",
		"OrganizationDivision": "00",
		"SoldToParty": "1000291"
	}
}

Once consumed, the event disappears from the message queue.

Bind the service and activate SAP Event Mesh as eventing framework on Kyma

To understand how to use events in your own application, let’s create the setup for consumption.

First, we have to create a service binding with the SAP Event Mesh. You can do it in the Kyma UI or as part of a yaml file.

---
apiVersion: services.cloud.sap.com/v1
kind: ServiceBinding
metadata:
  name: sap-event-mesh-binding
  namespace: s4hc-extensibility
spec:
  externalName: sap-event-mesh-binding
  secretName: sap-event-mesh-binding
  serviceInstanceName: sap-event-mesh-service

You should then see the binding in status Provisioned after a few seconds. Move to Configuration → Secrets of your namespace to find a newly created secret called sap-event-mesh-binding. You can consider this the equivalent of a service key in Cloud Foundry.

Kyma uses the NATS eventing framework by default, so we have to tell the cluster to switch to SAP Event Mesh. It’s a cluster-wide setting, so you can’t run NATS and SAP Event Mesh in parallel.


Image 5: Binding secret: Adding activation of SAP Event Mesh over NATS.

Add the two lines shown above to the secret and save it.
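
In case the screenshot is hard to read, here is a minimal sketch of what the patched secret looks like. The label key and value are an assumption based on how the Kyma eventing controller selected its backend at the time (kyma-project.io/eventing-backend set to beb for SAP Event Mesh):

apiVersion: v1
kind: Secret
metadata:
  name: sap-event-mesh-binding
  namespace: s4hc-extensibility
  labels:
    # Assumption: this label tells the Kyma eventing controller to use
    # SAP Event Mesh (BEB) instead of the default NATS backend.
    kyma-project.io/eventing-backend: beb
# The data keys created by the service binding remain unchanged and are omitted here.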

Create an OAuth client in Kyma and a webhook in the Event Mesh

Create an OAuth Client so that SAP Event Mesh can call into Kyma.

---
apiVersion: hydra.ory.sh/v1alpha1
kind: OAuth2Client
metadata:
  name: s4hc-sochg-oauthclnt
  namespace: s4hc-extensibility
spec:
  grantTypes:
    - client_credentials
  scope: s4hc-sochg-scope
  secretName: s4hc-sochg-secret

Go to SAP Event Mesh, into the client apj1-event-mesh, and choose the Webhooks menu.

Enter a subscription name like kyma-so-change-consumption. Change Quality of Service to 1. Set Exempt Handshake to Yes (for On-Premise, keep it at No).

Set the Webhook URL to https://s4-so-change-event.<Cluster ID>.kyma.shoot.live.k8s-hana.ondemand.com.

Set the Default Content Type to application/json.

Set Authentication to OAuth2ClientCredentials. Now enter the Client ID and Client Secret from the OAuth2Client created earlier (you find them in the secret s4hc-sochg-secret). The Token URL is https://oauth2.<Cluster ID>.kyma.shoot.live.k8s-hana.ondemand.com/oauth2/token.

Enter the scope s4hc-sochg-scope and press Create. The webhook might be in the state Not Initiated; in that case, go to Actions and choose Trigger Handshake. It should then change to subscription status Active and handshake status Exempted.
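
For completeness: the webhook URL above assumes an API rule on the Kyma cluster that exposes the consumer workload under that host and checks the OAuth2 scope. A minimal sketch, assuming the gateway.kyma-project.io/v1beta1 API and a service named s4-so-change-event (for example, the service of the function shown in the next section) listening on port 80:

apiVersion: gateway.kyma-project.io/v1beta1
kind: APIRule
metadata:
  name: s4-so-change-event
  namespace: s4hc-extensibility
spec:
  gateway: kyma-system/kyma-gateway
  # Host part of the webhook URL; the cluster domain is appended automatically.
  host: s4-so-change-event
  service:
    # Assumption: the Kubernetes service in front of the consumer workload.
    name: s4-so-change-event
    port: 80
  rules:
    - path: /.*
      methods: ["POST"]
      accessStrategies:
        # Validate the OAuth2 client-credentials token and its scope.
        - handler: oauth2_introspection
          config:
            required_scope: ["s4hc-sochg-scope"]

With this in place, the Event Mesh webhook call only reaches the workload with a valid token carrying the configured scope.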

Consume the event with a simple serverless function

The simplest consumer function might look like the code below.

module.exports = {
  main: async function (event, context) {
    const axios = require('axios'); // axios needs to be declared as a dependency of the function

    // Collect the relevant parts of the incoming cloud event into a message
    const message = {
      greeting: `Hi there, welcome from Kyma Function ${context['function-name']}`,
      context: context,
      eventdata: event.data,
      eventtype: event.type,
      eventcontent: event.datacontenttype,
    };
    console.log(message);

    // Forward the message to some other receiver (replace with your own URL)
    const url = 'https://<some other receiver url>';
    try {
      const response = await axios.post(url, message, {
        headers: { 'Content-Type': 'application/json' },
      });
      console.log('Response:', response.data);
    } catch (error) {
      console.error('Error:', error.message);
    }

    return message;
  },
};

A change to a sales order would then make the log look like this:


Image 6: SAP BTP Kyma serverless function log – event received.
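
If you prefer to deploy the function together with the rest of the resources, it can also be described as an inline Kyma Function. A minimal sketch, assuming the serverless.kyma-project.io/v1alpha2 API, the Node.js 16 runtime and axios as the only dependency:

apiVersion: serverless.kyma-project.io/v1alpha2
kind: Function
metadata:
  name: s4-so-change-event
  namespace: s4hc-extensibility
spec:
  runtime: nodejs16
  source:
    inline:
      # axios must be declared so the runtime can install it.
      dependencies: |
        {
          "dependencies": {
            "axios": "^1.3.0"
          }
        }
      source: |
        module.exports = {
          main: async function (event, context) {
            // handler code from the snippet above
          }
        }

The service created for the function then serves as the target of the API rule sketched in the previous section.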

Closing

And there we are: setup and end-to-end test of external cloud event consumption on SAP BTP Kyma using S/4HANA Cloud and SAP Event Mesh. Let me know your feedback in the comments!
