More from the Series related to the UI5 Excel Upload Control

Simplifying Excel Upload in Fiori Elements: The Open Source and Easy-to-Use UI5 Custom Control
Create a UI5 custom library with versioning using a multi version namespace
How to test multiple scenarios and UI5 versions with wdi5 and GitHub Actions

In a previous blog post, I introduced a UI5 reuse component that simplifies the process of uploading Excel files in UI5. The main objective was to support all current maintenance versions of UI5 and as many usage scenarios as possible. To automate testing for these scenarios, I used wdi5 in conjunction with GitHub Actions. You can find all the currently supported scenarios in the documentation.

Setup

The basic setup involves a monorepo on GitHub with a folder that contains all supported scenarios and a CAP server that provides the data. The individual UI5 apps consume the latest version of the Excel Upload control directly from the repository. Dependencies are installed with pnpm, and these apps are used both for development and for testing.

To keep an overview of which apps are run with which UI5 version and which tests, the file testapps.json was created; the scripts described later rely on it.


Each scenario is described as a parent attribute.
The attributes assigned to it are:

  • appTitel -> title displayed in the app header
  • port -> port on which the app runs; in theory, all apps could run side by side
  • versionMinor -> minor version of UI5
  • testMapping -> all wdi5 test files this app should run
  • copyVersions -> the UI5 versions into which the scenario should be copied

Within “copyVersions”, the same attributes are used again to create the copied apps; in addition, the exact UI5 version is added.
Theoretically, each scenario and UI5 version could have its own tests.
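Since the file itself is not shown here, the following is a hypothetical sketch of what one scenario entry in testapps.json could look like, written as a JavaScript object. Only the attribute names from the list above come from the post; the concrete values, the nesting, and the name of the exact-version attribute are assumptions for illustration.

```js
// Hypothetical shape of one scenario entry in testapps.json (illustrative only)
const ordersv4fe = {
  appTitel: "Orders V4 Fiori Elements 1.108", // title shown in the app header
  port: 8080,                                 // port the app runs on
  versionMinor: 108,                          // UI5 minor version of the checked-in app
  testMapping: ["OrdersFEV4.test.js"],        // wdi5 spec files this app should run (invented name)
  copyVersions: [
    {
      appTitel: "Orders V4 Fiori Elements 1.96",
      port: 8081,
      versionMinor: 96,
      version: "1.96.27",                     // exact UI5 version of the copy (attribute name assumed)
      testMapping: ["OrdersFEV4.test.js"]
    }
  ]
};
```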

Setup Apps

As you can see in the packages folder, there is always only one version of each scenario (the latest maintenance version, 1.108).
So that the other versions can also be used locally and in GitHub Actions, the script copy-example-apps.js copies these scenarios into all other maintenance versions.

As described above, the data from testapps.json is used to copy the apps.

The result will then look like this:

As you can also see from the grayed-out folders, the copies are added to .gitignore.
All apps are designed to work with maintenance versions 1.71 to 1.108, which is why they can be copied easily. Only for versions 1.71 and 1.84 did the theme have to be changed to “sap_fiori_3”.

The attributes that are changed are:

  • UI5 version
  • Port number
  • package.json attribute “name”
  • App title

The name in the package.json of the respective app is simply changed, for example from “ordersv4fe108” to “ordersv4fe96”.
This has the particular advantage that the app can be started directly with pnpm:

pnpm --filter ordersv4fe96 start
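To illustrate the idea, here is a minimal sketch of such a copy script, assuming the testapps.json shape sketched above and Node 16+ with fs.cpSync; the actual copy-example-apps.js in the repository differs in its details.

```js
// copy-example-apps.js – illustrative sketch only, not the actual script from the repo.
const fs = require("fs");
const path = require("path");
const testApps = require("./testapps.json"); // path and structure are assumptions

for (const [scenario, config] of Object.entries(testApps)) {
  // the checked-in app, e.g. packages/ordersv4fe108 (folder layout assumed)
  const source = path.join("packages", `${scenario}${config.versionMinor}`);

  for (const copy of config.copyVersions) {
    const target = path.join("packages", `${scenario}${copy.versionMinor}`);

    // copy the latest-version app into a version-specific folder (these copies are git-ignored)
    fs.cpSync(source, target, { recursive: true });

    // adjust package.json "name", e.g. "ordersv4fe108" -> "ordersv4fe96"
    const pkgPath = path.join(target, "package.json");
    const pkg = JSON.parse(fs.readFileSync(pkgPath, "utf8"));
    pkg.name = `${scenario}${copy.versionMinor}`;
    fs.writeFileSync(pkgPath, JSON.stringify(pkg, null, 2));

    // the UI5 version, port, app title (and the theme for 1.71/1.84) would be
    // replaced in the same way in the copied ui5.yaml / manifest.json
  }
}
```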

Setup wdi5

The configuration file wdio-base.conf.js for wdi5/wdio is normally geared towards testing only a single app or URL.
But since it is a JavaScript file and the config is a simple object inside it, it is easy to make it more dynamic with JavaScript.
A good example are the config files in the official wdi5 GitHub repository:
https://github.com/ui5-community/wdi5/tree/main/examples/ui5-js-app/e2e-test-config
Now we can use the data from testapps.json to dynamically run only the tests of the respective app via wdio-base.conf.js.
The code extracts the specs and the port from the file and puts them into the config object that wdio picks up.
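The relevant part of such a dynamic config could look roughly like the following sketch. The lookup logic and file paths are assumptions, and the real wdio-base.conf.js contains the full wdio/wdi5 options.

```js
// wdio-base.conf.js (excerpt) – illustrative sketch, not the actual repo code.
const testApps = require("../testapps.json"); // path is an assumption

// scenario name and UI5 version are passed as the last two CLI arguments,
// e.g. `wdio run ./test/wdio-base.conf.js -- -- ordersv4fe 108`
const [scenario, version] = process.argv.slice(-2);

// pick the matching entry from testapps.json (structure as sketched above)
const root = testApps[scenario];
const app =
  String(root.versionMinor) === version
    ? root
    : root.copyVersions.find((v) => String(v.versionMinor) === version);

exports.config = {
  baseUrl: `http://localhost:${app.port}`,                 // port from testapps.json
  specs: app.testMapping.map((file) => `./specs/${file}`), // only this app's spec files
  // ...the remaining wdio/wdi5 options (capabilities, services, etc.) stay as in a static config
};
```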
Now it is possible to run the tests for exactly one specific version of an app by passing parameters:
wdio run ./test/wdio-base.conf.js -- -- ordersv4fe 108
The “examples” folder again contains a package.json with the name “ui5-cc-excelupload-sample”, which contains the script “test”.
So the test can be called with pnpm from the root folder:
pnpm --filter ui5-cc-excelupload-sample test -- -- ordersv4fe 108

wdi5 Tests

For me personally, it is still hard to write wdi5 tests.
At least the initial setup has been simplified a lot by the UI5 Journey Recorder, which allows you to quickly build a basic skeleton.
For the tests, similar to the apps, I wanted to have only one spec file rather than a separate spec file for each scenario.
Unfortunately, this only worked to a limited extent, since there are differences between V2 and V4; I could at least reuse the test scripts across the Fiori Elements apps.
Depending on the scenario, different IDs or methods are used, but the basic structure remains the same (a skeleton is sketched after the list):
  1. Open the List Report page
  2. Navigate to the Object Page
  3. Enter edit mode
  4. Open the Excel Upload dialog
  5. Upload the Excel file
  6. Send the draft to the backend
  7. Save the draft
  8. Navigate to the Sub Object Page
  9. Validate that the data was uploaded correctly
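As a rough idea of how these steps translate into a wdi5 spec, here is a skeleton. All control selectors, IDs, and file paths are invented for illustration; the actual spec files in the repository differ per scenario.

```js
// Illustrative wdi5 spec skeleton – selectors, IDs and paths are made up.
const path = require("path");

describe("ordersv4fe – Excel Upload", () => {
  it("uploads an Excel file and validates the data", async () => {
    // 1./2. open the List Report and navigate to the Object Page
    const firstRow = await browser.asControl({
      selector: { controlType: "sap.m.ColumnListItem" } // assumed selector
    });
    await firstRow.press();

    // 3./4. enter edit mode and open the Excel Upload dialog
    const editButton = await browser.asControl({
      selector: { id: /.*::StandardAction::Edit$/ } // ID differs between V2 and V4
    });
    await editButton.press();
    const uploadButton = await browser.asControl({
      selector: { id: /.*excelUploadButton$/ } // invented ID
    });
    await uploadButton.press();

    // 5. upload the Excel file via the dialog's native file input
    // (setValue on the input works with a local driver; remote setups may need uploadFile)
    const fileInput = await $("input[type=file]");
    await fileInput.setValue(path.resolve(__dirname, "../testfiles/orders.xlsx"));

    // 6.-9. send the draft, save it, navigate to the Sub Object Page
    // and validate the uploaded rows (omitted here for brevity)
  });
});
```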

GitHub Actions

We have now achieved that many different scenarios can be called individually.
Next, we want to execute them automatically on every pull request.
Again, we do not want to create a separate workflow for each test.
For this purpose, GitHub Actions offers the matrix feature.
It makes it possible to run a workflow repeatedly with different parameters; a classic example would be testing something with Node versions 14, 16, and 18.
Here, we use it to test the different scenarios with each UI5 version, and it helps that each app can be targeted specifically with pnpm.
In the workflow file for the wdi5 tests, the matrix attributes are defined as follows:
matrix:
  scenario: ["ordersv2fe", "ordersv4fe", "ordersv2fenondraft", "ordersv2freestylenondraftopenui5", "ordersv2freestylenondraft"]
  ui5version: [108, 96, 84, 71]
  exclude:
    - scenario: ordersv4fe
      ui5version: 71

Only the Fiori Elements V4 scenario with version 71 is currently excluded from the test.
Apart from this exclusion, all five scenarios are executed with all four UI5 versions.
This results in 5 scenarios × 4 UI5 versions − 1 exclusion = 19 test runs.

In the rest of the file, the entire environment for the test is set up:

  1. Install pnpm
  2. Run pnpm install
  3. Copy the test apps
  4. Build the UI5 Excel Upload component
  5. Start the CAP server

Now we can start the app with the attributes from the matrix:

pnpm --filter ${{ matrix.scenario }}${{ matrix.ui5version }} start:silent&

which will result in

pnpm --filter ordersv4fe108 start:silent&

After that, we can start the test like this:

pnpm --filter ui5-cc-excelupload-sample test -- -- --headless ${{ matrix.scenario }} ${{ matrix.ui5version }}

which results in

pnpm --filter ui5-cc-excelupload-sample test -- -- --headless ordersv4fe 108

A successful run with all 19 tests can then look like this:
As you can see from the runtime, all tests in the matrix run in parallel, so a complete run takes under 5 minutes.

More Information

You can find more information in the documentation or directly in the source code in the GitHub repo.
