In this installment of using SAP Ariba with BTP, I’ll be changing topics and focusing on how to extract SAP Ariba analytical data with SAP Integration Suite. Before we dive into the more technical components, let’s look at why this is a valuable use case for SAP Ariba customers.

Currently, SAP Ariba customers can leverage pre-built content on SAP Analytics Cloud to gain greater visibility and insight into their spend data. Why is this important? It allows customers to improve savings, drive compliance, and increase operational efficiency. For more information on this topic, please check the linked blogs below:

How Enterprise Analytics for Procurement Improves the Bottom Line

Let’s Play: Make winning procurement moves with SAP Business Technology Platform and Enterprise Analytics

The benefit of using SAP Analytics Cloud is this pre-built content for displaying stories and insights. But how do we extract the data from SAP Ariba in the first place? That’s where our data extractor comes into play.

The SAP Ariba Analytical Data Extractor

What we’ve done with Integration Suite is construct an iFlow that extracts data from the Analytical Reporting API and transforms it from JSON into zipped XML files for consumption. The files can then be fed into a data storage solution, such as SAP Data Warehouse Cloud. In this blog post, we’ll focus on how to extract one fact table, InvoiceLineItemFact.
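To make the transformation concrete, here is a minimal Python sketch of the kind of JSON-to-zipped-XML conversion the iFlow performs. In the actual iFlow this is handled by Integration Suite’s converter steps; the record structure below is a simplified assumption rather than the real API payload.

```python
import json
import zipfile
import xml.etree.ElementTree as ET

def records_to_xml(records, root_name="InvoiceLineItemFact"):
    """Convert a list of JSON records (dicts) into a flat XML document."""
    root = ET.Element(root_name)
    for record in records:
        row = ET.SubElement(root, "Record")
        for field, value in record.items():
            ET.SubElement(row, field).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Simplified example payload; Analytical Reporting responses carry
# their rows under a "Records" array.
payload = json.loads('{"Records": [{"InvoiceId": "INV-1", "Amount": 100}]}')
xml_doc = records_to_xml(payload["Records"])

# Package the XML into a zip file, as the extractor does for consumption.
with zipfile.ZipFile("InvoiceLineItemFact.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("InvoiceLineItemFact.xml", xml_doc)
```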

What is SAP Integration Suite?

SAP Integration Suite is SAP’s integration platform-as-a-service on SAP BTP. Its Cloud Integration capability (formerly SAP Cloud Platform Integration, which is why you’ll see “CPI” below) is where the iFlows in this post are built and run.

Architecture

Creating the SAP Ariba Analytical Reporting API Application

To get this enabled, you will first need to create an application for the SAP Ariba Analytical Reporting API in the developer portal, if you don’t already have one.

  1. Log on to developer.ariba.com, following the prompts to create an account if you need one. Then follow the video below to create the API application:
  2. Once that is done, copy the OAuth Client ID, OAuth Secret, and API Key. You will need them later.
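For reference, the Analytical Reporting API uses the OAuth client-credentials grant, so the credentials you just copied are exchanged for an access token. Here is a minimal Python sketch of that token request; the token URL shown is the standard SAP Ariba OAuth endpoint, but verify the exact URL for your data center in the developer portal.

```python
import base64
import requests

OAUTH_CLIENT_ID = "your-oauth-client-id"   # from the developer portal
OAUTH_SECRET = "your-oauth-secret"         # from the developer portal
TOKEN_URL = "https://api.ariba.com/v2/oauth/token"  # verify for your region

# Basic auth header built from the OAuth Client ID and Secret.
basic = base64.b64encode(f"{OAUTH_CLIENT_ID}:{OAUTH_SECRET}".encode()).decode()

response = requests.post(
    TOKEN_URL,
    headers={"Authorization": f"Basic {basic}"},
    data={"grant_type": "client_credentials"},
)
response.raise_for_status()
access_token = response.json()["access_token"]
```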

Using SAP Integration Suite

Now you’ll need to log on to your BTP cockpit and access Integration Suite. From there, click on Design, Develop, and Operate Integration Scenarios:

From here, go to the Design tab on the top left, indicated by a pencil icon. You will need to create five iFlows to perform the whole data extraction process (steps 2 through 4 are illustrated in the sketch after the list):

  1. Establish the connection between SAP Ariba and CPI.
  2. CPI submits the view template request to SAP Ariba.
  3. CPI checks whether pagination is needed.
  4. CPI retrieves the data from SAP Ariba.
  5. CPI converts the data so it can be sent to a database solution.
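The sketch below walks through steps 2 through 4 in plain Python: the API returns a PageToken whenever more data is available, and that token is passed back on the next call until the last page is reached. The endpoint path and parameter names follow the public Analytical Reporting API documentation; confirm them against your own realm before relying on them.

```python
import requests

API_KEY = "your-api-key"   # from the developer portal
REALM = "your-realm"       # use the parent realm for parent-child sites
BASE_URL = ("https://openapi.ariba.com/api/"
            "analytics-reporting-details/v1/prod/views/InvoiceLineItemFact")

def fetch_all_records(access_token):
    """Page through the view data until no PageToken is returned."""
    headers = {"Authorization": f"Bearer {access_token}", "apiKey": API_KEY}
    params = {"realm": REALM}
    records = []
    while True:
        resp = requests.get(BASE_URL, headers=headers, params=params)
        resp.raise_for_status()
        body = resp.json()
        records.extend(body.get("Records", []))
        token = body.get("PageToken")
        if not token:  # no token means the last page has been fetched
            break
        params["pageToken"] = token  # request the next page
    return records
```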

A collection of iFlows will be available shortly that you can simply download and import into your CPI tenant. This will be delivered via a Discovery Center mission.

Before we get connected, you will need to create two security credentials under Security Material, designated by the eye icon on the left.

Security materials are the credential artifacts that allow data to flow from the source system. You will need to create a User Credentials artifact (for the access token) and an OAuth2 Client Credentials artifact to call the data.

These credentials come from the API application you created in the developer portal; you’ll need the API Key, OAuth Client ID, and OAuth Secret. Once the artifacts have been created, click Deploy to activate them.
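As a rough illustration, this is how the developer-portal values map onto the two security material artifacts. The artifact names here are placeholders, not prescribed names; use whatever naming convention your tenant follows.

```python
# Illustrative mapping only; these values are entered in the Security
# Material UI, not in code. Artifact names are placeholders.
security_materials = {
    "Ariba_OAuth_Client": {   # OAuth2 Client Credentials artifact
        "token_service_url": "https://api.ariba.com/v2/oauth/token",
        "client_id": "<OAuth Client ID from the developer portal>",
        "client_secret": "<OAuth Secret from the developer portal>",
    },
    "Ariba_Access_Token": {   # User Credentials artifact
        "password": "<access token used to call the API>",
    },
}
```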

Importing and Running the Job

Once those have been created, we will need to download and import our iFlows. One benefit of SAP Integration Suite is how user-friendly it is to work with prebuilt iFlows. We are currently working on making these accessible to customers, but for educational purposes the screenshots below show how simple it is to download and upload prebuilt iFlows.

To upload an iFlow, go to the Design tab and open the package you’ve created. Then click Edit and add an integration flow for individual flows you’ve downloaded, or upload an entire script collection if you prefer.

Once the iFlow has been imported, you will need to add your company-specific information in the configuration section. In the Query field, add your SAP Ariba realm name. If you have a parent-child site configuration, use the parent realm, since that is where the reporting data lives.

For the next section, you’ll want to add your API Key, the API URL, and the date from which data will be queried.
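For orientation, the configured values end up driving a request along these lines. The filter parameter name follows the public API documentation (updatedDateFrom), but treat the exact URL and parameter names as assumptions to confirm for your landscape.

```python
import json

# Hypothetical values entered on the iFlow's Configure screen.
config = {
    "realm": "yourcompany",   # parent realm name
    "apiKey": "your-api-key",
    "apiUrl": ("https://openapi.ariba.com/api/"
               "analytics-reporting-details/v1/prod/views/InvoiceLineItemFact"),
    "dateFrom": "2023-01-01T00:00:00Z",   # query data from this date onward
}

# The date restriction is passed as a JSON "filters" query parameter.
params = {
    "realm": config["realm"],
    "filters": json.dumps({"updatedDateFrom": config["dateFrom"]}),
}
```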


Repeat these steps for the five other iFlows associated with this fact table. Once that is done, save and deploy. Once the jobs have run successfully, you’ll have a zip file containing the API data in XML format that can be used with reporting and data storage tools.

We will have a Discovery Center mission aligned with this use case coming this summer.

There will also be a second part to this blog coming out this summer. It will focus on how to send this data into SAP Data Warehouse Cloud so it can be used with the pre-built content on SAP Analytics Cloud in an automated end-to-end process.
