Introduction
In the SAP Analytics Cloud Q2 2023 release, the new Data Import Service API was made available. This API makes it possible to load data into SAC models from external applications. This new feature opens up a lot of opportunities to integrate with and feed models in SAC in a scheduled, fully automated way.
SAP Data Intelligence offers a wide variety of functionalities. One of them is using API Client operators to configure API calls through a user interface. It was already possible to load data from SAP Data Intelligence into SAP Analytics Cloud, but this process could not be automated with the standard operators due to authorization restrictions. With this development, we want to build a SAC model, extended with a dashboard, to visualize our full SAP DI runtime history. In SAP Data Intelligence, the monitoring application does not give us the full overview, because runtime history is automatically removed after five days or after 50 runs (excluding archived ones). Automating the authorization and the data load into our SAC model gives added value to our near real-time reporting solution.
SAP Analytics Cloud prerequisites
In SAC, we need to set up some basic elements. First of all, we create an OAuth client so that SAP Data Intelligence can interact with our SAC tenant. Make sure to select the Interactive Usage and API Access purposes.
A graph runtime has some typical characteristics. For this application, we are using the following ones (a sample record is sketched right after this list):
- Handle ID: unique identifier of a graph runtime
- Name: the name of the graph
- Message: the output message of a graph runtime
- Status: whether a graph ended up completed or dead
- Start time
- Stop time
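For illustration, a single runtime record could look like the Python dictionary below. The field names are assumptions for this sketch; the exact keys depend on the response of the SAP DI runtime API.

```python
# Illustrative runtime record - the field names are assumptions, not the exact API schema.
sample_runtime = {
    "handle": "6f2a1b3c-0d4e-4f5a-9b6c-7d8e9f0a1b2c",  # Handle ID: unique identifier of the runtime
    "name": "com.example.runtime.history",             # graph name (hypothetical)
    "message": "Graph execution completed",            # output message of the runtime
    "status": "completed",                             # "completed" or "dead"
    "started": 1687427100,                             # start time (epoch seconds, assumed)
    "stopped": 1687427190,                             # stop time (epoch seconds, assumed)
}
```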
Next, we set up a basic model for data gathering. We added one private dimension, named Runtime, because we can only import data into a private dimension. One measure is created for the calculation of our total runtime duration, next to the two mandatory public dimensions: Version and Date. We also add the basic characteristics of our runtimes to the model as properties. After doing this, we should be good to go. Be aware that importing data into public dimensions is also on the roadmap!
Data Intelligence Cloud
In SAP Data Intelligence, there are already operators available for pushing data into SAC. Unfortunately, these operators do not handle the authorization needed to retrieve an access token automatically, which gave us the perfect occasion to test the new Data Import Service API in an automated way. Below, you can find a video that visualizes the authorization process using an access token and cookies.
In order to create this automation, we want to do a “recursive” API call to our own SAP Data Intelligence tenant. This can be done with basic authentication.
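As a minimal sketch, such a call could be tested with the Python requests library. The tenant URL, the user and the runtime-history endpoint path below are placeholders, not the documented SAP DI endpoint:

```python
import requests
from requests.auth import HTTPBasicAuth

# Placeholder values - replace with your own tenant, user and endpoint.
DI_TENANT = "https://my-di-tenant.example.com"                        # assumption
RUNTIME_ENDPOINT = "/app/pipeline-modeler/service/v1/runtime/graphs"  # placeholder path

response = requests.get(
    DI_TENANT + RUNTIME_ENDPOINT,
    auth=HTTPBasicAuth("defaulttenant\\myuser", "mypassword"),  # basic authentication
    timeout=30,
)
response.raise_for_status()
runtimes = response.json()  # list of runtime records returned by the tenant
```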
Now, we can go ahead and start modeling our solution in the modeler of SAP Data Intelligence.
Start by adding two starting points to our SAP Data Intelligence graph: one for retrieving our authentication elements and one for gathering our SAP DI runtime history. To make sure a CSRF token is included in our GET API call, we should add a custom header to our OpenAPI operator. This can be done by creating a message that includes the header parameter “openapi.header_params.x-csrf-token”, with “fetch” as its value.
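A minimal sketch of such a starting point, assuming the Gen1 Python3 operator API and an outport named “output”, could look like this:

```python
# Sketch of a Python3 operator (Gen1 operator API assumed) that triggers the GET call.
# The message attributes add the custom header so the OpenAPI operator sends
# "x-csrf-token: fetch" with its request.
def gen():
    attributes = {"openapi.header_params.x-csrf-token": "fetch"}
    api.send("output", api.Message(body="", attributes=attributes))  # "output" is the assumed outport name

api.add_generator(gen)
```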
For the first OpenAPI operator, we include the following settings to retrieve the needed values from its response. It does not really matter which endpoint you use for this call, as long as it is a GET request and the header above is added. The cookies are wrapped in the Set-Cookie response header; the token is in the X-CSRF-Token response header.
Call for retrieving the token from SAC Data Import Service
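Outside SAP Data Intelligence, the same token call can be reproduced in a few lines of Python, which is handy for testing. The tenant URL and endpoint path are placeholders, and the access token is assumed to have been retrieved via the OAuth client created earlier:

```python
import requests

SAC_TENANT = "https://my-sac-tenant.eu10.sapanalytics.cloud"  # placeholder
TOKEN_ENDPOINT = SAC_TENANT + "/api/v1/csrf"                  # placeholder GET endpoint of the Data Import Service
access_token = "<OAuth access token>"                         # assumed to be fetched with the OAuth client

resp = requests.get(
    TOKEN_ENDPOINT,
    headers={
        "x-csrf-token": "fetch",                    # ask the service for a CSRF token
        "Authorization": "Bearer " + access_token,  # 2-legged OAuth access token
    },
    timeout=30,
)
csrf_token = resp.headers["x-csrf-token"]     # token returned in the X-CSRF-Token response header
session_cookies = resp.headers["Set-Cookie"]  # cookies needed for the subsequent POST call
```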
Every runtime in SAP Data Intelligence gets its own Handle ID. Because SAP Data Intelligence always stores the last 50 runtimes, duplicates could appear. This is not a problem, because writing to an existing member ID in SAP Analytics Cloud simply overwrites the data.
Python operator to create the POST call
Now, we want to use Python to create our POST call. This call should include all the authorization elements needed to perform a POST to SAC, knowing that 2-legged OAuth is applied here. Because just adding a CSRF token is not sufficient to get the necessary authorization, we also want to add the cookies from the response to our message. Once again, we can define them as custom header_params. When performing the call in Postman, we can see two cookies appear: JSESSION and VCAP.
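A minimal sketch of that Python operator, assuming the Gen1 operator API and assumed attribute names for the incoming token and cookies, could look like this:

```python
# Sketch of a Python3 operator (Gen1 operator API assumed) that forwards the
# CSRF token and cookies as custom header_params for the final OpenAPI operator.
def on_input(msg):
    token = msg.attributes.get("x-csrf-token")  # assumed attribute name coming from the first call
    cookies = msg.attributes.get("set-cookie")  # assumed attribute name holding the JSESSION / VCAP cookies

    attributes = {
        "openapi.header_params.x-csrf-token": token,  # CSRF token for the POST call
        "openapi.header_params.Cookie": cookies,      # pass the session cookies along
    }
    api.send("output", api.Message(body=msg.body, attributes=attributes))

api.set_port_callback("input", on_input)
```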
Our data wrangling can be done with Python, Pandas and NumPy. Below is just an example of which elements can be fetched from the system; all elements received from the SAP DI API call can be pushed to SAC.
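As an example of such a wrangling step, assuming the runtime records and field names from the earlier sketch, the duration measure and the mandatory dimension members could be derived like this:

```python
import pandas as pd

# Illustrative wrangling step - field names follow the earlier sketch and are assumptions.
df = pd.DataFrame(runtimes)

df["started"] = pd.to_datetime(df["started"], unit="s")
df["stopped"] = pd.to_datetime(df["stopped"], unit="s")
df["Duration"] = (df["stopped"] - df["started"]).dt.total_seconds()  # measure: total runtime duration
df["Date"] = df["started"].dt.strftime("%Y%m")                       # member of the public Date dimension
df["Version"] = "public.Actual"                                      # member of the public Version dimension (assumed)
df["Runtime"] = df["handle"]                                         # member ID of the private Runtime dimension
```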
The last part is to bring everything together into one message and send it to the final OpenAPI operator. Notice that every property of our private dimension, as well as the measure, can be appended to the model. Values for the public dimensions Version and Date have to be defined; it is mandatory that they are available in the model. Because both master data and fact data are added to our model, we use the masterFactData endpoint that SAP provides in this import API. To gather more information about the endpoints of this Import Service, please refer to this webpage.
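As an illustration, and assuming the DataFrame from the wrangling step above, the rows pushed to the masterFactData endpoint could look like this; the exact payload structure and job flow are described on the referenced documentation page:

```python
# Illustrative rows for the masterFactData import - the surrounding payload
# structure is an assumption, check the Data Import Service documentation.
rows = df[["Version", "Date", "Runtime", "Duration"]].to_dict(orient="records")
# e.g. [{"Version": "public.Actual", "Date": "202306",
#        "Runtime": "6f2a1b3c-...", "Duration": 90.0}, ...]
```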
SAP Analytics Cloud
After running the DI Graph, we can see the data appended to our private dimension. Now it’s up to you to get creative with the SAC Dashboarding. 😊
A big shoutout to my Flexso colleagues for helping me write this blog post!