This post guides you through the steps to connect your SAP Cloud for Real Estate API (published on the SAP API Business Hub) to the SAP Cloud for Real Estate standard content available in SAP Data Warehouse Cloud, using SAP Data Intelligence Cloud.
******************************************************************************************************
As long as SAP Data Warehouse Cloud does not support REST APIs directly (support will probably arrive in the near future), I decided to realize this in my real-life customer scenario with SAP Data Intelligence Cloud, which is what this blog post focuses on. In my view, this is currently a very convenient way to do it.
I am also fully aware that there are many more options to push your SAP Cloud for Real Estate data to SAP Data Warehouse Cloud.
******************************************************************************************************
Prerequisites are:
- You have already enabled the SAP Cloud for Real Estate API on SAP Business Technology Platform (SAP BTP) and created a service key to obtain your individual OAuth2 token.
- You have a running instance of SAP Data Intelligence Cloud on SAP BTP to build and run pipelines that initially create the tables in SAP Data Warehouse Cloud and push data from your SAP Cloud for Real Estate API.
- You have a running instance of SAP Data Warehouse Cloud.
- You have created a database user in the SAP_CONTENT space of your SAP Data Warehouse Cloud system. This is the data entry layer to which you connect your SAP Data Intelligence pipelines and write data into the corresponding tables.
- You have imported and activated the Business Content for SAP Cloud for Real Estate in SAP Data Warehouse Cloud via Content Network.
- You have basic experience with SAP Data Intelligence Cloud, Python, JSON, and consuming RESTful APIs.
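To illustrate the OAuth2 prerequisite: obtaining the token from your service key is a standard client-credentials request. A minimal sketch, assuming your service key provides a token URL, client ID, and client secret (the values below are placeholders, not real endpoints):

```python
# Sketch: fetch an OAuth2 access token via the client-credentials flow.
# token_url, client_id and client_secret come from the service key you
# created on SAP BTP -- all values used here are illustrative placeholders.
import base64
import requests


def build_token_request(token_url: str, client_id: str, client_secret: str) -> dict:
    """Assemble URL, headers and form data for a client-credentials token request."""
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "url": f"{token_url}/oauth/token",
        "headers": {"Authorization": f"Basic {basic}"},
        "data": {"grant_type": "client_credentials"},
    }


def fetch_token(token_url: str, client_id: str, client_secret: str) -> str:
    """POST the token request and return the bearer token from the response."""
    req = build_token_request(token_url, client_id, client_secret)
    resp = requests.post(req["url"], headers=req["headers"], data=req["data"])
    resp.raise_for_status()
    return resp.json()["access_token"]
```

The returned bearer token is what the pipelines later send in the `Authorization` header when calling the API.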
Steps are:
- Create an inbound connection in SAP Data Intelligence Cloud from your SAP Cloud for Real Estate API.
- Create an outbound connection in SAP Data Intelligence Cloud to your SAP Data Warehouse Cloud instance.
- Create the tables in the entry layer of your SAP Data Warehouse Cloud instance. You can do this via SAP Data Intelligence Cloud pipelines or directly on your database schema via the Database Explorer.
- Create a Python custom operator that gets and parses data from your API and pushes it to your SAP Data Warehouse Cloud instance, specifically into the SAP HANA Cloud schema of the database user created before.
- Create a pipeline for each entity you want to extract and push to the tables in your entry layer, using the Python custom operator created in step 4.
- Manage the inbound and outbound connections of your custom operator in each pipeline.
- Run your pipelines and enjoy your data!
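The core of step 4 can be sketched as follows: fetch one entity from the API, unwrap the result list from the JSON body, and write the rows into the entry-layer table. The `value` wrapper, the table and column names, and the shape of the connection object (e.g. an `hdbcli` dbapi connection to SAP HANA Cloud) are assumptions here; adapt them to the payload your API actually returns.

```python
# Sketch of the custom operator's core logic: pull one entity from the
# SAP Cloud for Real Estate API and push its records into the entry-layer
# table. All names (result wrapper "value", table, columns) are assumptions.
import requests


def extract_records(payload: dict) -> list:
    """Unwrap the result list from the API response (assumed under 'value')."""
    return payload.get("value", [])


def build_insert(table: str, columns: list) -> str:
    """Build a parameterized INSERT statement for executemany()."""
    placeholders = ", ".join(["?"] * len(columns))
    return f'INSERT INTO "{table}" ({", ".join(columns)}) VALUES ({placeholders})'


def push_entity(api_url: str, token: str, conn, table: str, columns: list) -> int:
    """GET the entity, map records to column tuples, insert them, return row count."""
    resp = requests.get(api_url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    rows = [tuple(rec.get(col) for col in columns) for rec in extract_records(resp.json())]
    cursor = conn.cursor()  # e.g. an hdbcli dbapi connection to your HANA Cloud schema
    cursor.executemany(build_insert(table, columns), rows)
    conn.commit()
    return len(rows)
```

Using `executemany()` with a parameterized INSERT keeps the operator free of SQL string concatenation and lets the database client handle type conversion for each row.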
Git repo: I have already prepared these SAP Data Intelligence Cloud pipelines to manage the full ETL process in detail and want to share my solution (the pipelines and a Python custom operator) as an exported SAP Data Intelligence Cloud “solution” in my personal Git repo – but please be aware that it comes without any further support! There you will find all further steps and guidance to import and configure the pipelines.
Enjoy your data!