Hello,

In this blog I am going to present a sample scenario in which SAP Cloud Integration pulls several message types from SAP Data Warehouse Cloud (SAP DWC) and sends them to Anaplan via delete and import jobs.

I had to process multiple message types from SAP DWC (Sales extract, Materials, BOM, Customers, etc.), and to better organize the process I decided to create multiple iFlows that act only as triggers and as the source of fixed parameters.

Anaplan APIs used and API Authentication

Before going through the logic implemented in SAP Cloud Integration, I would like to highlight the steps we need to follow in order to send a file to Anaplan via APIs, because the difficulty comes from the number of calls that are needed:

 

Perform a delete action

  •   Get a list of delete actions, start the delete action, check the status of the delete action.

Perform an import action

  •   Upload a CSV file: split the file into chunks, get a list of import actions, get the list of import files, set the chunk count, upload the chunks, mark the upload as complete.

  •   Initiate the import action: start the import action, check the status of the import action, check the dump file for imports.

 

Authentication

In order to call any of the Anaplan APIs, I had to get an AuthToken via the Authenticate API.

The authentication call response contains a tokenValue field. In the Create AuthToken Groovy script, the tokenValue is concatenated with the 'AnaplanAuthToken' string, which results in the 'Authorization' header that is mandatory for all subsequent calls.
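As an illustration, a minimal version of such a Groovy script could look like the one below. Treat it as a sketch: the exact JSON path to tokenValue depends on the Authenticate API response in your tenant.

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonSlurper

def Message processData(Message message) {
    // Parse the Authenticate API response
    def response = new JsonSlurper().parseText(message.getBody(String) as String)
    // tokenValue may sit at the root or under tokenInfo, depending on the response structure
    def tokenValue = response?.tokenInfo?.tokenValue ?: response?.tokenValue
    // Build the Authorization header required by every subsequent Anaplan call
    message.setHeader('Authorization', 'AnaplanAuthToken ' + tokenValue)
    return message
}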

Besides the token, in order to connect to a specific Anaplan system, it is mandatory to have the following data: workspaceID, modelID and exportID.

SAP Cloud Integration Architecture

For a better overview of the whole process, I have created a diagram that will help you understand the sequence of the iFlows, which parameters are being sent and where they are used:


Schema for iFlow chaining.

The process has the following steps:

Step 1

The TRIGGER_*_EXTRACT iFlow is used to store all the necessary parameters for processing a specific message type, and it can also be scheduled to run at a specific time if necessary. This iFlow can be duplicated as many times as there are items we need to extract from SAP DWC. In my example, I have 7 iFlows of this type, and all of them point to the same iFlow described in step 2.
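To make this concrete, the Content Modifier in such a trigger iFlow typically sets values along these lines (whether each one travels as a header or as a property is an implementation choice; the list only mirrors the parameters used later in this blog):

  •   workspaceID – the Anaplan workspace to address
  •   modelID – the Anaplan model to address
  •   DeleteFileName – used in step 4 to find the right delete action
  •   ImportFileName – used in step 5 to find the right import file and action
  •   the SELECT statement for SAP DWC, placed in the message body for the JDBC call in step 2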


Trigger SAP DWC Extract for a specific table

Step 2 

The EXTRACT_DWC_TABLE iFlow was created to use externalized parameters for the SAP DWC extracts, and it is called via ProcessDirect by all the step 1 iFlows.


Extract DWC Table iFlow

The connection to SAP DWC is made via the JDBC Receiver Adapter, using the statement sent in step 1 as the message body and a predefined JDBC Data Source Alias. The response of this query is an XML document with multiple rows, which is used for further processing. In the end, the payload is sent via the JMS adapter to a JMS queue for asynchronous processing, which allows better error handling.
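For illustration only, imagine a trigger iFlow sends a statement such as SELECT MATERIAL_ID, MATERIAL_DESC, PLANT FROM a SAP DWC view (names are made up). The JDBC adapter then returns the result set as XML rows roughly like this (the exact root element name depends on the adapter and query, so treat this as a sketch):

<root>
  <row>
    <MATERIAL_ID>100001</MATERIAL_ID>
    <MATERIAL_DESC>Example material</MATERIAL_DESC>
    <PLANT>0001</PLANT>
  </row>
</root>

These <row> elements are what the General Splitter later splits on in the import iFlow.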

Step 3 

The InboundAnaplanJMS iFlow is used to store the message in a message queue and re-process it in case of any retry-worthy error (like a temporary connection failure). I won't insist too much on this step, because it's optional and was only part of our general system architecture.

One thing to mention here: it was necessary to add all the headers sent in step 1 to the Allowed Headers section; otherwise, they would not get through the JMS adapter.

Step 4

The TRIGGER_DELETE_JOB iFlow is used to trigger the delete job before importing a new file. This behavior is well documented in Anaplan's API documentation and consists of multiple steps.


Trigger Delete Job

From the SAP Cloud Integration perspective, we use the 'DeleteFileName' parameter sent in step 1 to filter out all the files that do not contain the parameter value, extract that file ID, and use it to start the delete action:

https://api.anaplan.com/2/0/workspaces/${header.workspaceID}/models/${header.modelID}/actions/${header.DeleteFileID}/tasks
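A minimal sketch of the step that extracts that ID could look like the Groovy script below. It assumes the "get a list" response is JSON with an array of entries carrying 'id' and 'name' fields; adjust the field names to the actual Anaplan response you receive.

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonSlurper

def Message processData(Message message) {
    def deleteFileName = message.getHeaders().get('DeleteFileName') as String
    def response = new JsonSlurper().parseText(message.getBody(String) as String)
    // Keep only the entry whose name contains the DeleteFileName sent in step 1 (assumed structure)
    def match = response?.actions?.find { it?.name?.contains(deleteFileName) }
    if (match != null) {
        // DeleteFileID is then used in the .../actions/${header.DeleteFileID}/tasks call above
        message.setHeader('DeleteFileID', match.id as String)
    }
    return message
}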

Step 5

The TRIGGER_IMPORT_JOB iFlow is the most complex part of the whole process. Here we first have to check that the delete job succeeded, then split the file extracted from SAP DWC into chunks, upload the chunks to the Anaplan server, check that job's status, and finally trigger the actual import job, which processes the uploaded file.


iFlow that triggers the import job

Don't worry, we'll take the steps one by one and explain them:

Set the chunk count local process 


Local Process that sets the chunk count

Before sending any chunk, we need to check that the delete job succeeded and then set the chunk count to -1 (as per the documentation). The message body for setting the chunk count is:

{
  "headerRow": "1",
  "firstDataRow": "2",
  "chunkCount": "-1"
}

where "headerRow": "1" means the first line of the CSV contains the header, and "firstDataRow": "2" means the actual data starts on the second line.

and the API call used for setting the chunk count is: https://api.anaplan.com/2/0/workspaces/${header.workspaceID}/models/${header.modelID}/files/${property.importfileid}

 

Split and upload chunks


Local Process that splits the payload and uploads it

To split the payload from SAP DWC, we use a General Splitter that splits it based on the XPath //row. When sending the file to Anaplan, the header line must be sent only with the first chunk. If we send the header line with every chunk, the extra header lines end up as additional data rows in the payload and the whole process will fail.

How do we determine the first chunk generated by the General Splitter? We use the property CamelSplitIndex, which has the value 0 for the first chunk. In the router we check whether that property has the value 0 and route to the proper XML to CSV Converter (one includes the header line, one does not).
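As a small sketch, the router condition for the "first chunk" branch can be a Non-XML expression along these lines (the exact syntax may differ slightly between tenant versions):

${property.CamelSplitIndex} = '0'

All other chunks then take the route that uses the XML to CSV Converter without headers.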

The API call used for uploading the chunks is: https://api.anaplan.com/2/0/workspaces/${header.workspaceID}/models/${header.modelID}/files/${property.importfileid}/chunks/${property.ChunkCount}

 

Start the import job


Local Process that starts the import job

This Local Process starts the import job, which is again determined based on the ImportFileName property configured in step 1. Here you can also see a Looping Process Call that polls the status of the import job with the condition "/root/task[(taskState != 'COMPLETE')]", meaning the loop keeps running until the task state becomes COMPLETE.
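For reference, that condition assumes the task-status response has been converted to XML (for example via a JSON to XML converter) into a structure roughly like the one below, with an illustrative value; the loop keeps polling the status endpoint while taskState is not yet COMPLETE.

<root>
  <task>
    <taskState>IN_PROGRESS</taskState>
  </task>
</root>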

The API call used for starting the import job is: https://api.anaplan.com/2/0/workspaces/${header.workspaceID}/models/${header.modelID}/imports/${property.uploadfileid}/tasks

In the end, based on the response from Anaplan, we can end the message successfully or, in case of connectivity or other retry-worthy errors, keep it in the JMS queue and re-process it.

Conclusions

Connecting to SAP DWC was not that hard; the difficulties came from the fact that the system allowed connections only based on IP whitelisting, and from the SELECT statements, which have to be aligned with the column order and column names stored in the SAP DWC tables.

The mechanism of deleting, uploading and importing the SAP DWC payload into Anaplan is a tedious one, with many steps, each conditioned by a parameter obtained in the previous call, so in the end the process results in a lot of iFlows, steps and local processes.

It was a nice experience, because I learned about CamelSplitIndex, generating an Auth Token, Looping Process Calls, JDBC connections to SAP DWC, and of course parameter handling and externalization.

I hope you liked my article. Feel free to reply with any suggestions here or in the SAP Cloud Integration Questions section, as I am still updating the flow.
