I am excited to share our new mission, published this week in the SAP Discovery Center: “Access, Share and Monetize Data with SAP Data Warehouse Cloud”. The SAP BTP startup program “Data-to-Value Track”, an initiative from SAP.iO, supports startups in becoming data providers by leveraging the Data Marketplace for SAP Data Warehouse Cloud. This mission guides startups through a detailed step-by-step process for replicating or federating data from cloud storage. As adoption grows and we learn which data sources are used to ingest data into the SAP Data Warehouse Cloud Data Marketplace, we will update the mission accordingly.
The relevance of external data to enterprises has grown significantly, especially with the rise of the COVID-19 crisis. Be it external COVID-19 metrics that help Human Resources teams understand the impact on employee productivity, geospatial data for advanced analytics in logistics and real estate, or industry-specific external data for identifying trends and forecasting, data providers have been a catalyst for enterprises to plan proactively and take the necessary actions in this dynamic environment.
Although this discovery mission focuses on onboarding startups as data providers to SAP Data Warehouse Cloud, we encourage cloud enthusiasts and technical consultants to sign up for the mission to learn the following:
- Establishing connections to cloud storage (Amazon S3, Azure Blob Storage, Google Cloud Storage) and creating data flows to replicate the data using the embedded data pipelines.
- Creating Database Access, which provides an Open SQL schema, and replicating the data from cloud storage using Python scripts. You can enable read-write access to the Open SQL schema and use the Python scripts provided as part of the discovery mission (a minimal sketch follows this list).
- Creating a data profile and data products as a data provider.
- Publishing data with the necessary licenses in the Data Marketplace.
- Validating the data as a data consumer from the Data Marketplace.
- Finally, loading the external data into your space, blending it with your internal data, and visualizing it with SAP Analytics Cloud.
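To give a flavor of the replication step above, here is a minimal sketch of what such a script could look like, reading a CSV file from Amazon S3 and inserting it into a table in the Open SQL schema. It assumes the hdbcli and boto3 packages; the host, credentials, bucket, key, and table name are all illustrative placeholders, and the scripts shipped with the mission may differ.

```python
import csv
import io

import boto3
from hdbcli import dbapi

# Hypothetical connection details for the Open SQL schema -- replace with
# the host, port, and Database Access user credentials of your own space.
conn = dbapi.connect(
    address="<your-dwc-host>.hanacloud.ondemand.com",
    port=443,
    user="MYSPACE#PYTHON_USER",
    password="<password>",
    encrypt=True,
)

# Read a CSV file from Amazon S3 (bucket and key are placeholders).
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="my-data-bucket", Key="sales/2021/sales.csv")
reader = csv.reader(io.StringIO(obj["Body"].read().decode("utf-8")))
next(reader)  # skip the header row

# Insert the rows into a (hypothetical) three-column table in the schema.
cursor = conn.cursor()
cursor.executemany(
    'INSERT INTO "SALES" VALUES (?, ?, ?)',
    [tuple(row) for row in reader],
)
conn.commit()
cursor.close()
conn.close()
```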
Additionally, we will provide Python scripts focusing on parallel inserts into Open SQL schemas and will update the mission in the next few weeks. You can also refer to the GitHub repository shared as part of the mission.
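Until those scripts are published, here is one way such parallel inserts might be structured, as a minimal sketch: the rows are split into chunks and inserted over several threads, each with its own hdbcli connection, since connections should not be shared across threads. The table name, credentials, chunk size, and worker count are assumptions for illustration, not the mission's actual code.

```python
from concurrent.futures import ThreadPoolExecutor

from hdbcli import dbapi

# Placeholder connection settings for the Open SQL schema.
DB = dict(
    address="<your-dwc-host>.hanacloud.ondemand.com",
    port=443,
    user="MYSPACE#PYTHON_USER",
    password="<password>",
    encrypt=True,
)

def insert_chunk(rows):
    # One dedicated connection per worker thread.
    conn = dbapi.connect(**DB)
    try:
        cursor = conn.cursor()
        cursor.executemany('INSERT INTO "SALES" VALUES (?, ?, ?)', rows)
        conn.commit()
        cursor.close()
    finally:
        conn.close()

def parallel_insert(rows, chunk_size=10_000, workers=4):
    # Split the rows into chunks and insert the chunks concurrently.
    chunks = [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Consuming the map iterator re-raises any worker exception.
        list(pool.map(insert_chunk, chunks))
```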
Based on feedback and consumption, we will add more data source connections for federation and replication. Here is the link to the mission.
We are excited to hear your feedback. Please feel free to comment on the discovery mission here, or reach out to Alina Novosolova or me if you have issues accessing the mission.