In this blog post, I will share some insights on the basic steps for a Datahub upgrade and a Datahub migration to set up the repository in accordance with CCV2.
The reader is expected to know the Datahub fundamentals and processes.
Datahub Upgrade
For this documentation, consider upgrading Datahub from source version X to target version Y.
Pre-requisites for Local Setup
- Java JDK 11.0.8 (minimum version)
- Maven 3.3.9
- Datahub zip of the target version (CXDATAHUB-Y.zip). Sample Download link.
- Integration pack of the target version (CXDHBINTPK-YY). Sample Download link.
Refer to the link for the compatibility matrix of the integration pack.
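A quick check that the local toolchain matches these prerequisites:
java -version   # expect 11.0.8 or later
mvn -version    # expect 3.3.9 or later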
Code Adaptations
- Extract all the extensions from CXDATAHUB-Y and CXDHBINTPK-YY under hybris/bin/ext-integration/datahub (see the sketch after this list).
Below are some sample jars that need to be replaced with the upgraded versions:
- datahub-extension-archetype-X.jar → datahub-extension-archetype-Y.jar [picked from CXDATAHUB-Y]
- datahub-extension-sdk-X.jar → datahub-extension-sdk-Y.jar [picked from CXDATAHUB-Y]
- datahub-webapp-X.war → datahub-webapp-Y.war [picked from CXDATAHUB-Y]
- party-canonical-X.jar → party-canonical-YY.jar [picked from CXDHBINTPK-YY]
- product-canonical-X.jar → product-canonical-YY.jar [picked from CXDHBINTPK-YY]
- sapcoreconfiguration-X.jar → sapcoreconfiguration-YY.jar [picked from CXDHBINTPK-YY]
- To upgrade Datahub to a newer version, the old libraries need to be replaced with the latest jars. The jars are checked in under the Datahub custom folder and, after the build, are moved to the lib folder of Tomcat for the local setup.
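A sketch of the extraction step, assuming the downloaded archives sit in the current directory:
unzip CXDATAHUB-Y.zip -d hybris/bin/ext-integration/datahub
unzip CXDHBINTPK-YY.zip -d hybris/bin/ext-integration/datahub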
POM Changes
Under pom.xml, update the versions of the properties as mentioned below.
Old value → new value:
<archetype.version>X</archetype.version> → <archetype.version>Y</archetype.version>
<datahub.extension.sdk.version>X</datahub.extension.sdk.version> → <datahub.extension.sdk.version>Y</datahub.extension.sdk.version>
<datahub.webapp.version>X</datahub.webapp.version> → <datahub.webapp.version>Y</datahub.webapp.version>
<sap.integration.version>X</sap.integration.version> → <sap.integration.version>YY</sap.integration.version> [picked from CXDHBINTPK-YY]

The resulting properties section:
<properties>
    <archetype.version>Y</archetype.version>
    <datahub.extension.sdk.version>Y</datahub.extension.sdk.version>
    <datahub.webapp.version>Y</datahub.webapp.version>
    <sap.integration.version>YY</sap.integration.version>
</properties>
Note: The pom.xml should be updated with the upgraded versions of the artifacts. Issues in this configuration can lead to a Datahub build failure complaining about missing artifacts.
Also, if there are version upgrades for Java, Spring, or Tomcat, those need to be updated in pom.xml as well.
Add the javax.xml.bind dependency if the Java version is upgraded to Java 11, as this module is no longer present in the JDK.
<dependency>
    <groupId>javax.xml.bind</groupId>
    <artifactId>jaxb-api</artifactId>
    <version>2.3.0</version>
</dependency>
Release-Specific Upgrade Steps
Once the above changes are done, follow the links below to check whether there are any release-specific upgrade steps for the versions involved.
Validation
When the upgrade is complete, execute the following commands locally for validation:
mvn validate
mvn clean
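To smoke-test locally, one option is to drop the upgraded webapp into a standalone Tomcat; a minimal sketch, assuming Tomcat is installed at $TOMCAT_HOME (a local convenience, not a CCV2 step):
cp datahub-webapp-Y.war $TOMCAT_HOME/webapps/datahub-webapp.war   # war picked from CXDATAHUB-Y
$TOMCAT_HOME/bin/catalina.sh run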
Create a DatahubInstance for the Datahub Backoffice server configuration as below:
INSERT_UPDATE DataHubInstanceModel;instanceName[unique=true];instanceLocation
;localhost;https://localhost:8443/datahub-webapp/v1
For CCV2, the URL has to be replaced with https://datahub:8080/datahub-webapp/v1, as in the sketch below.
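For reference, a sketch of the CCV2 variant of the same ImpEx; the instance name datahub is an assumption:
INSERT_UPDATE DataHubInstanceModel;instanceName[unique=true];instanceLocation
;datahub;https://datahub:8080/datahub-webapp/v1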
On successful execution of the above steps, Datahub should be up and running and accessible.
Datahub Migration
Pre-requisites
- The subscription you are using should have the commerce-cloud-datahub application type enabled.
- Before you start, have Datahub set up with a compatible version of SAP Commerce on CCV2.
Repository setup
The new Datahub structure contains a datahub directory with the Datahub customisation and a manifest.json (${ROOT_DIR}/datahub/manifest.json).
The datahub folder comprises the following:
manifest.json
- Update the dataHubVersion and the corresponding extension pack.
- List the extensions with the source extensions followed by the dependent extensions, as the order specified here is considered during the build.
{ "datahub_version" : "current datahub version", "extensionPacks": [ { "name": "hybris-datahub-integration-suite", "version": "current datahub integration pack version" } ], "extensions" : [ "sapcoreconfiguration", "sapidocintegration", "sapidocoutboundadapter", "party-canonical", "product-canonical", "sapcustomer-raw", "sapcustomer-canonical", "sapcustomer-target", "saporder-raw", .... ] }
datahub/config directory
- Define the configuration with environment-specific properties in HOCON-style *.conf files. The file name format is datahub-environment-<environment code>.conf, and configuration common to all environments goes in datahub-environment.conf.
Note: Property values containing a URL, file path, or string should be enclosed in double quotes, as in the sketch below.
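A minimal sketch of a datahub-environment-d1.conf, assuming an environment code of d1; note that in HOCON an unquoted // starts a comment, which is one reason URLs must be quoted:
datahub.server.url="http://datahub:8080/datahub-webapp/v1"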
- The encryption-key is required if the extensions have any secure attributes. Follow the steps (here) to create the encryption key.
- A lib folder is optional and is required only if you are using pre-compiled extensions.
- logback.xml defines the logging configuration; it is not environment-specific.
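Since logback.xml is checked in alongside the configuration, here is a minimal sketch; the console appender and INFO root level are assumptions, not CCV2 defaults:
<configuration>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>
    <!-- root logger at INFO; raise individual packages to DEBUG when troubleshooting -->
    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
    </root>
</configuration>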
Custom Extension
Move the custom extension configuration into the folder you created under ${ROOT_DIR}/datahub, mirroring the old Datahub structure.
In the pom.xml file, specify the scope as system in each Maven dependency, provide a systemPath attribute pointing to the jar file, and commit the jar files into the Git repository.
<dependency>
    <groupId>com.hybris.datahub</groupId>
    <artifactId>sapidocoutboundadapter</artifactId>
    <scope>system</scope>
    <systemPath>${basedir}/../ext/sapidocoutboundadapter-<current_version>.jar</systemPath>
</dependency>
The general format for the systemPath element is <systemPath>${basedir}/../jar_filename</systemPath>.
Note: If the specified path is incorrect, a missing artifact error shows up during the build.
Configuration Details for Datahub
- Datahub credentials for the two roles are defined as below:
datahub.security.basic.admin.user=<adminuser>
datahub.security.basic.admin.password=<adminpassword>
datahub.security.basic.read_only.user=<rouser>
datahub.security.basic.read_only.password=<ropassword>
Note: The usernames for datahub.security.basic.admin.user and datahub.security.basic.read_only.user should be different; otherwise the deployment may fail with the error Cannot resolve reference to bean 'userDetailsService' while setting bean property.
- Credentials for the Data Hub Adapter, which provides connectivity between SAP Commerce Cloud Data Hub and SAP Commerce Cloud:
datahubadapter.datahuboutbound.user=<adminuser>
datahubadapter.datahuboutbound.password=<adminpassword>
datahubadapter.datahuboutbound.url=https://<env-specific>/datahub-webapp/v1
- autoInitMode has three accepted values:
  - create-drop drops and creates a new schema each time you start the server; hence it is usually set for the first start of Datahub.
  - ignore skips the initialization on every start of the server.
  - create creates a schema if none exists.
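A sketch of the corresponding property line; the key kernel.autoInitMode is an assumption based on standard Data Hub property naming, so verify it against the documentation for your version:
kernel.autoInitMode=ignore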
- Datahub address for communication between the hybris application server(s) and the Data Hub server(s):
datahub.server.url=http://datahub:8080/datahub-webapp/v1
- Set the Commerce Cloud Azure SQL dialect:
hibernate.dialect=org.hibernate.dialect.SQLServerDialect
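Putting the above together, a minimal sketch of a Datahub environment *.conf; the credential values are placeholders, and kernel.autoInitMode is the assumed property key from the sketch above:
datahub.security.basic.admin.user="<adminuser>"
datahub.security.basic.admin.password="<adminpassword>"
datahub.security.basic.read_only.user="<rouser>"
datahub.security.basic.read_only.password="<ropassword>"
kernel.autoInitMode="ignore"
hibernate.dialect="org.hibernate.dialect.SQLServerDialect"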
Validation of the Datahub Configuration:
A GET to the https://<environment-specific>/datahub-webapp/v1/status endpoint validates the username/password from datahub.security.basic.admin.user and datahub.security.basic.admin.password.
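A quick check with curl; expect an HTTP 200 and a status payload when the credentials are valid:
curl -u <adminuser>:<adminpassword> https://<environment-specific>/datahub-webapp/v1/status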
Datahub cockpit in the Backoffice
At least one Datahub server needs to be configured to see the Datahub perspective.
The Datahub dashboard provides the status of raw, canonical, and published items.
Setup in commerce
The Commerce manifest should have the datahubadapter context path enabled for the backoffice aspect.
"aspects": [ { "name": "backoffice", "properties": [ ], "webapps": [ { "name": "datahubadapter", "contextPath": "/datahubadapter" } ... ] }]
Datahub Settings
Set the four properties below if they are other than the default (admin/nimda); they should match the Datahub security users.
datahub.backoffice.rest.client.username.admin=<adminuser>
datahub.backoffice.rest.client.password.admin=<adminpassword>
datahubadapter.datahuboutbound.user=<adminuser>
datahubadapter.datahuboutbound.password=<adminpassword>
Note: We do not migrate the Datahub database, as only transient data is stored in it.
Issues & Resolution
- If, during testing, you find publications in pending status, there is likely a running publication in that pool (a publication with IN PROGRESS status). This indicates that hybris Commerce never reported the publication status back to Datahub. It could be due to configuration issues in Commerce for the datahubadapter.
- During the composition phase, if you observe a raw item being filtered or ignored, look at the grouping/composition handlers in the extension.xml of the custom extensions. There may be conditions in a handler that are not satisfied, which causes the item to be excluded from further processing.
References:
https://help.sap.com/doc/b468c7d3110f4dffad5ad2ffc32c55a7/6.5.0.0/en-US/DataHubAPI/index.html
Kindly share your thoughts and feedback in the comments section below.