The SAP Cloud Application Programming Model (CAP) is a framework of languages, libraries, and tools for building enterprise-grade services and applications. To develop a CAP application, you can choose either Node.js or Java as the implementation language. A major part of a CAP application is the domain model, in which all domain entities are defined via Core Data Services (CDS) and connected to external services or databases. CAP has out-of-the-box support for SAP HANA (Cloud) and, for development scenarios, SQLite.
PostgreSQL is a powerful open-source database that can also be used for CAP application development. SAP HANA may be the better choice for CAP projects closely tied to other SAP systems (SAP S/4HANA, etc.), while PostgreSQL can be a capable and cheaper alternative in other scenarios. There are already blogs covering CAP on PostgreSQL with Node.js. In this blog, we will deploy a CAP Java project to SAP Business Technology Platform using PostgreSQL, hyperscaler option.
Prerequisites
The following preparations are required to follow the steps and deploy the project by yourself:
– SAP BTP PAYGO or CPEA agreement
– SAP BTP subaccount
– Entitlements for PostgreSQL, hyperscaler option
– Entitlements for SAP BTP, Cloud Foundry runtime
Setting Up Windows Local Environment for CAP Java Development
Note: For a preconfigured environment, use SAP Business Application Studio, which comes with all required tools preinstalled.
Install Visual Studio Code on your local machine and perform the following steps to make VS Code ready for CAP Java development.
1. Install Node.js
- Download the latest version of Node.js and follow the installer prompts to install it on your machine.
- To verify the installation, run the command node -v in Command Prompt. It should print the Node.js version. Also run npm -v to verify that npm (the package manager) is up and running.
2. Install CDS Tools for CAP Development
Run the following command to install @sap/cds-dk globally.
npm i -g @sap/cds-dk
3. Install cf CLI (Command Line Interface) for Cloud Foundry operations
Refer to the repository https://github.com/cloudfoundry/cli and install the cf CLI for your OS.
4. Install MTA Build Tool (MBT)
Run the following command to install the MTA Build Tool globally:
npm install -g mbt
5. Install a Java VM
Install a Java VM. At least Java 8 is required; for example, download and install SapMachine 11.
6. Install Apache Maven (at least version 3.5.0 is required).
7. Execute the following commands on the command line to check whether the installed tools are set up correctly:
cds --version
java --version
mvn --version
mbt -v
8. VS Code Extensions for CAP Java
Which extensions to install is up to the developer; the following is my favourite list.
- SAP Fiori Tools – Extension Pack
- SAP CDS Language Support
- Core data services graphical modeler for VS Code
- Maven for Java
- Language Support for Java(TM) by Red Hat
- Lombok
- Java Extension Pack
- Remote Development Extension Pack
- YAML
- Git Graph
- SQLite
Starting a New Project
1. Use the CAP Java Maven archetype to bootstrap a new CAP Java project:
mvn archetype:generate -DarchetypeArtifactId="cds-services-archetype" -DarchetypeGroupId="com.sap.cds" -DarchetypeVersion="RELEASE"
When prompted, specify the group ID and artifact ID of your application. The artifact ID also determines the name of the project's root folder, which is generated in your current working directory. For the other prompted values, it's enough to simply confirm the defaults.
2. Use the CDS Maven plugin to add a sample CDS model after creating your project. Navigate to the root folder of your CAP Java project and execute the following Maven command:
cd <project root>
mvn com.sap.cds:cds-maven-plugin:addSample
3. To enhance your project with the dependencies required for Cloud Foundry, execute the goal addTargetPlatform of the CDS Maven plugin using the following command:
mvn com.sap.cds:cds-maven-plugin:addTargetPlatform -DtargetPlatform=cloudfoundry
4. Open the project in VS Code and remove the @readonly annotation from srv/cat-service.cds, so that we can test the CRUD operations:
service CatalogService {
  entity Books as projection on my.Books;
}
5. Build and run the generated project from the command line using the following Maven command:
mvn spring-boot:run
6. Test the application using the URL http://localhost:8080
Click on the Books entity, enter “system” as the username, leave the password empty, and click Sign in. This displays the Books entity sample data.
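Instead of the browser, you can also build the same request programmatically. The sketch below constructs (but does not send) the HTTP GET request with Basic authentication for the mock user "system" and an empty password; the OData service path is an assumption based on the generated sample model:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.util.Base64;

public class LocalGetRequest {
    public static void main(String[] args) {
        // Hypothetical local OData endpoint; the exact path depends on the
        // generated sample model (here assumed: CatalogService/Books).
        String url = "http://localhost:8080/odata/v4/CatalogService/Books";

        // Mock-user login: user "system" with an empty password,
        // Base64-encoded for HTTP Basic authentication.
        String basicAuth = Base64.getEncoder().encodeToString("system:".getBytes());

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Authorization", "Basic " + basicAuth)
                .GET()
                .build();

        System.out.println(request.method() + " " + request.uri());
    }
}
```

Sending this request with `HttpClient.newHttpClient().send(...)` should return the same sample data as the browser.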
Note: This application currently runs locally with an in-memory database (H2). Let us see how to use a PostgreSQL database and deploy the application to SAP BTP.
Using PostgreSQL database
1. We first need to add the build configuration, so that the cds build task prepares everything for Cloud Foundry deployment. SAP CAP comes with a default configuration, but we need to explicitly add the following dependencies and cds configuration for PostgreSQL.
Insert the following code snippet into package.json. The final package.json file can be viewed here .
"engines": {
  "node": "^16.0.0"
},
"dependencies": {
  "@sap/cds": "6.1.3",
  "cds-pg": "0.1.30",
  "cds-dbm": "^0.0.36",
  "express": "^4"
},
"devDependencies": {
  "mbt": "~1.2.1",
  "cds-dbm": "^0.0.36"
},
"cds": {
  "build": {
    "tasks": [
      {
        "for": "java",
        "src": "srv",
        "options": {
          "model": ["db", "srv", "app"]
        }
      },
      {
        "use": "cds-dbm/dist/build/postgres-cf",
        "for": "postgres-cf",
        "src": "db",
        "options": {
          "deployCmd": "npx cds-dbm deploy --load-via delta --auto-undeploy"
        }
      }
    ]
  },
  "requires": {
    "db": {
      "kind": "database"
    },
    "database": {
      "dialect": "plain",
      "impl": "cds-pg",
      "model": ["srv"]
    }
  },
  "migrations": {
    "db": {
      "schema": {
        "default": "public",
        "clone": "_cdsdbm_clone",
        "reference": "_cdsdbm_ref"
      },
      "deploy": {
        "tmpFile": "tmp/_autodeploy.json",
        "undeployFile": "db/undeploy.json"
      }
    }
  }
}
2. Trigger a manual build by executing the following commands:
npm i
cds build
3. Next, you need to generate an mta.yml file, which contains the module and resource definitions.
The following command generates an mta.yml file in the project root folder:
cds add mta
4. Update the existing build command in the mta.yml file.
build-parameters:
  before-all:
    - builder: custom
      commands:
        - npm install
        - npx -p @sap/cds-dk cds build --production
5. Add the following resources definition to create PostgreSQL database service in SAP BTP:
resources:
  - name: sample-db
    type: org.cloudfoundry.managed-service
    parameters:
      service: PostgreSQL
      service-plan: dev
      skip-service-updates:
        parameters: true
Note: Make sure you have the entitlement to create the PostgreSQL service. The resource name (sample-db) must match the name required by the modules below.
6. Create the PostgreSQL deployer module by adding the following definition in modules section:
  - name: db-deployer
    type: custom
    path: .
    parameters:
      buildpacks: [https://github.com/cloudfoundry/apt-buildpack#v0.2.11, nodejs_buildpack]
      no-route: true
      no-start: true
      disk-quota: 2GB
      memory: 512MB
      tasks:
        - name: deploy_to_postgresql
          command: chmod 755 db/deploy.sh && ./db/deploy.sh
          disk-quota: 2GB
          memory: 512MB
    build-parameters:
      ignore: ["node_modules/", "mta_archives/", "tmp/"]
    requires:
      - name: sample-db
7. Add the PostgreSQL service to the server module by simply adding the requires statement:
  - name: sample-srv
    type: java
    path: srv
    parameters:
      memory: 512M
      disk-quota: 256M
      buildpack: java_buildpack
    properties:
      SPRING_PROFILES_ACTIVE: cloud
    build-parameters:
      builder: custom
      commands: [ mvn clean package ]
      build-result: "target/*-exec.[wj]ar"
    provides:
      - name: srv-api # required by consumers of CAP services (e.g. approuter)
        properties:
          srv-url: ${default-url}
    requires:
      - name: sample-db
The final mta.yml file can be viewed here.
8. The following files are required to create the resources and run the cds deployment.
Create an apt.yml file in your project root and copy the following:
---
keys:
- https://dist.sapmachine.io/debian/sapmachine.old.key
- https://dist.sapmachine.io/debian/sapmachine.key
repos:
- deb http://dist.sapmachine.io/debian/amd64/ ./
packages:
- sapmachine-11-jdk
Create pg-options.json in project root and copy the following:
{
  "engine_version": "11"
}
9. Add the PostgreSQL JDBC driver dependency in srv/pom.xml:
<dependency>
  <groupId>org.postgresql</groupId>
  <artifactId>postgresql</artifactId>
  <version>42.3.3</version>
</dependency>
10. Add the following cloud profile in your application.yaml:
---
spring:
  config.activate.on-profile: cloud
  datasource:
    driver-class-name: org.postgresql.Driver
    url: jdbc:postgresql://${vcap.services.sample-db.credentials.hostname}:${vcap.services.sample-db.credentials.port}/${vcap.services.sample-db.credentials.dbname}
    username: ${vcap.services.sample-db.credentials.username}
    password: ${vcap.services.sample-db.credentials.password}
    initialization-mode: never
    hikari:
      maximum-pool-size: 10
Note: Replace sample-db with your DB service instance name.
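The Spring placeholders above simply splice the service-binding credentials from VCAP_SERVICES into a standard PostgreSQL JDBC URL. As a rough illustration of the resolved result (the credential values below are purely hypothetical stand-ins for what `cf env` would show):

```java
public class JdbcUrlFromBinding {
    // Hypothetical values as they would appear in the "credentials" block
    // of the bound PostgreSQL service instance.
    static String hostname = "postgres.example.internal";
    static String port = "5432";
    static String dbname = "sampledb";

    public static void main(String[] args) {
        // Mirrors: jdbc:postgresql://${...hostname}:${...port}/${...dbname}
        String url = String.format("jdbc:postgresql://%s:%s/%s", hostname, port, dbname);
        System.out.println(url);
    }
}
```

At runtime, Spring performs exactly this substitution using the values injected by Cloud Foundry, so no credentials need to be hard-coded.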
We completed all the configurations and are now ready for deployment.
Deploy to Cloud Foundry and Test
1. Use the mbt build tool to assemble everything into a single MTA archive (.mtar):
mbt build
2. Finally, we can deploy the generated archive to Cloud Foundry:
cf deploy mta_archives/sample_1.0.0-SNAPSHOT.mtar
3. Execute the following commands to enable SSH for accessing the PostgreSQL DB locally:
cf allow-space-ssh <space>
cf enable-ssh sample
cf restart sample
4. Execute the following command and copy the environment variables. It contains the PostgreSQL credentials and endpoint details, which will be used in upcoming steps:
cf env sample
5. Establish the SSH tunnel using the following command, so that we can connect the database tool to check the schema.
# Replace 63306 with any free port on your local computer
# Replace <hostname> with the hostname value from the environment variables
# Replace <port> with the port value from the environment variables
# Replace sample with your application name
cf ssh -L 63306:<hostname>:<port> sample
6. Download and install the community edition of DBeaver and connect to PostgreSQL service via SSH tunnel.
HOST: localhost
PORT: Local computer port from step 5
DATABASE: “dbname” tag value from environment variable
USERNAME: “username” tag value from environment variable
PASSWORD: “password” tag value from environment variable
7. Click OK and connect to validate the tables in the public schema.
8. Modify db/data-model.cds, so that we can redeploy the CDS model and verify the change.
Here we added a description field:
namespace my.bookshop;

entity Books {
  key ID : Integer;
  title : String;
  description : String;
  stock : Integer;
}
9. As we have modified our initial data model, we need to redeploy the corresponding db artifacts to the PostgreSQL database.
To deploy the changes, use the following commands to build and deploy only the db module:
mbt build
cf deploy mta_archives/sample_1.0.0-SNAPSHOT.mtar -m db-deployer
10. Validate the updated data model and sample data using DBeaver. Next, we will check the CRUD operations using Postman.
11. If you haven’t already done so, install Postman.
12. Create a new HTTP Request and enter the deployed Application URL.
To fetch the application URL, execute the following command and copy the routes URL.
cf app <app name>
13. In Postman, click on send to check the GET request response.
Note: Execute the EntitySet URL, so that we can validate the sample data and insert operation.
14. In Postman, change the GET request to PUT and select the body tab.
15. Select the raw option, choose JSON format, and enter a PUT request body like the following:
{
  "ID": 10,
  "title": "Test title",
  "description": "Test desc",
  "stock": 100
}
16. Click Send and validate the result. This should insert a new record in the Books table.
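The same PUT request can be sketched in code. The snippet below only constructs the request (it does not send it); the application route is a hypothetical placeholder for the URL you copied from `cf app <app name>`:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class PutBookRequest {
    public static void main(String[] args) {
        // Hypothetical application route; replace with the route shown by `cf app`.
        String appUrl = "https://sample.cfapps.example.com/odata/v4/CatalogService/Books";

        // Same JSON body as entered in Postman.
        String body = "{ \"ID\": 10, \"title\": \"Test title\", "
                + "\"description\": \"Test desc\", \"stock\": 100 }";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(appUrl))
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(body))
                .build();

        System.out.println(request.method() + " " + request.uri());
    }
}
```

Sending the request with `java.net.http.HttpClient` (plus the appropriate authentication header for your deployment) should produce the same insert as the Postman call.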
Note: The following section is relevant for a Windows landscape; if you have a different landscape, adjust the paths for your environment. This will not work in SAP Business Application Studio (BAS) due to a host-permission issue.
If you are planning to execute the cds-dbm commands from your local machine, a few more steps are required. With ‘PostgreSQL, hyperscaler option’, SSL is enabled by default and enforced: if a client or application tries to establish a non-SSL connection to the DB, it will fail. The default sslmode differs between clients; if your connection parameters specify sslmode=verify-ca or sslmode=verify-full, your client requires the CA certificate to be in the trust store or referenced in the connection URL. cds-dbm internally uses Liquibase, which expects this certificate. Let us start by getting the certificate.
– Copy the sslrootcert from the application environment variables using the following command:
cf env sample
– This prints all environment variables. Copy only the sslrootcert content and save it as C:\Users\<username>\AppData\Roaming\postgresql\root.crt.
– We have to connect with the proper host name; otherwise, the local deployment throws an error.
Copy the hostname from the environment variables and add the following entry to
C:\Windows\System32\drivers\etc\hosts:
127.0.0.1 <hostname copied from the environment variables>
– Create a default-env.json file in project root using the application environment variable.
Example:
{
"VCAP_SERVICES": {
[...]
}
}
That’s all; now we can test the cds-dbm deployment locally.
– Establish the SSH tunnel using the following command, so that we can connect the database tool to check the schema.
# Replace 63306 with any free port on your local computer
# Replace <hostname> with the hostname value from the environment variables
# Replace <port> with the port value from the environment variables
# Replace sample with your application name
cf ssh -L 63306:<hostname>:<port> sample
– Deploy the changes using the following commands (refer to the cds-dbm documentation for the different options):
npx cds-dbm deploy
npx cds-dbm deploy --load-via delta --auto-undeploy
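Putting the SSL steps together: the local client connects to the real hostname (mapped to 127.0.0.1 via the hosts file) on the tunneled port, with sslmode and the saved root certificate in the JDBC URL. The sketch below only assembles such a URL; all values are hypothetical placeholders for the ones in your environment variables:

```java
public class SslJdbcUrl {
    public static void main(String[] args) {
        // Hypothetical values; hostname/port/dbname come from `cf env sample`.
        // The hostname resolves to 127.0.0.1 via the hosts-file entry, and
        // 63306 is the local end of the SSH tunnel.
        String hostname = "postgres.example.internal";
        String port = "63306";
        String dbname = "sampledb";
        // Certificate saved earlier under %APPDATA%\postgresql\root.crt
        String sslRootCert = "C:/Users/demo/AppData/Roaming/postgresql/root.crt";

        String url = "jdbc:postgresql://" + hostname + ":" + port + "/" + dbname
                + "?sslmode=verify-full&sslrootcert=" + sslRootCert;
        System.out.println(url);
    }
}
```

With verify-full, the driver checks both the certificate chain and that the certificate matches the hostname, which is why the hosts-file mapping (rather than plain localhost) is needed.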
Related Blogposts
Getting started with CAP on PostgreSQL (Node.js) | SAP Blogs
Architecting solutions on SAP BTP for High Availability | SAP Blogs
Conclusion
Well, I hope this gives you a brief idea of how to run and deploy an SAP CAP Java application with PostgreSQL on SAP BTP Cloud Foundry. In addition, we explored the local environment setup, database tools, Postman, and more. This approach can be extended to other PostgreSQL-compatible data sources such as Azure Cosmos DB for PostgreSQL, Amazon Aurora PostgreSQL-Compatible Edition, and Google Cloud AlloyDB.
Special thanks to Gautam Kumar & Manish Nandamudi for contributing to this blog post.
Please, feel free to provide feedback. We are always happy and open to your questions.