From 4d34e87677d8b883461787969b0130ae67887356 Mon Sep 17 00:00:00 2001
From: Phil Davis
Date: Thu, 15 Oct 2020 13:45:28 +0545
Subject: [PATCH] Adjust dev testing doc for mono-repo changes

---
 docs/ocis/development/testing.md | 11 +++++------
 1 file changed, 5 insertions(+), 6 deletions(-)

diff --git a/docs/ocis/development/testing.md b/docs/ocis/development/testing.md
index 89e9d6b210..554ac8fdda 100644
--- a/docs/ocis/development/testing.md
+++ b/docs/ocis/development/testing.md
@@ -59,13 +59,14 @@ To run a single test add `BEHAT_FEATURE=`
 
 ### use existing tests for BDD
 As a lot of scenarios are written for oC10, we can use those tests for Behaviour driven development in ocis.
-Every scenario that does not work in OCIS with OC storage, is listed in `tests/acceptance/expected-failures-on-OC-storage.txt` with a link to the related issue.
+Every scenario that does not work in OCIS with "owncloud" storage is listed in `ocis/tests/acceptance/expected-failures-on-OWNCLOUD-storage.txt` with a link to the related issue.
+Every scenario that does not work in OCIS with "ocis" storage is listed in `ocis/tests/acceptance/expected-failures-on-OCIS-storage.txt` with a link to the related issue.
 Those scenarios are run in the ordinary acceptance test pipeline in CI.
 The scenarios that fail are checked against the expected failures. If there are any differences then the CI pipeline fails.
-Similarly, scenarios that do not work in OCIS with EOS storage are listed in `tests/acceptance/expected-failures-on-EOS-storage.txt`.
+Similarly, scenarios that do not work in OCIS with EOS storage are listed in `ocis/tests/acceptance/expected-failures-on-EOS-storage.txt`.
 Additionally, some issues have scenarios that demonstrate the current buggy behaviour in ocis(reva).
-Those scenarios are in this ocis repository in `tests/acceptance/features/apiOcisSpecific`.
+Those scenarios are in this ocis repository in `ocis/tests/acceptance/features/apiOcisSpecific`.
 Have a look into the [documentation](https://doc.owncloud.com/server/developer_manual/testing/acceptance-tests.html#writing-scenarios-for-bugs) to understand why we are writing those tests.
 
 If you want to work on a specific issue
@@ -77,7 +78,7 @@ If you want to work on a specific issue
        'apiTests': {
            'coreBranch': 'master',
            'coreCommit': 'a06b1bd5ba8e5244bfaf7fa04f441961e6fb0daa',
-           'numberOfParts': 2
+           'numberOfParts': 6
        }
    }
@@ -100,8 +101,6 @@ If you want to work on a specific issue
 8. delete each of the local tests that were demonstrating the **buggy** behavior.
 9. make a PR that has the fixed code, relevant lines removed from the expected failures file and bug demonstration tests deleted.
 
-   If the changes also affect the `ocis-reva` repository make sure the changes get ported over there.
-
 ### Notes
 - in a normal case the test-code cleans up users after the test-run, but if a test-run is interrupted (e.g. by CTRL+C) users might have been left on the LDAP server. In that case rerunning the tests requires wiping the users in the LDAP server, otherwise the tests will fail when trying to populate the users.
 - the tests usually create users in the OU `TestUsers` with usernames specified in the feature file. If not defined in the feature file, most users have the password `123456`, defined by `regularUserPassword` in `behat.yml`, but other passwords are also used, see [`\FeatureContext::getPasswordForUser()`](https://github.com/owncloud/core/blob/master/tests/acceptance/features/bootstrap/FeatureContext.php#L386) for mapping and [`\FeatureContext::__construct`](https://github.com/owncloud/core/blob/master/tests/acceptance/features/bootstrap/FeatureContext.php#L1668) for the password definitions.
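The doc changed by this patch says the CI pipeline checks the scenarios that actually fail against the expected-failures file and fails if there is any difference. As a rough illustration of that comparison (this is a hedged, self-contained sketch with made-up file names and scenario lines, not the actual pipeline script):

```shell
#!/bin/sh
# Sketch of the expected-failures check described in the doc.
# File names and scenario entries below are invented for illustration;
# the real pipeline reads e.g. ocis/tests/acceptance/expected-failures-on-OCIS-storage.txt.

# Scenarios we expect to fail (one "feature:line" entry per line):
printf '%s\n' 'apiMain/a.feature:10' 'apiMain/b.feature:22' > expected-failures.txt

# Scenarios that actually failed in this run:
printf '%s\n' 'apiMain/a.feature:10' 'apiMain/c.feature:5'  > actual-failures.txt

# Any difference in either direction means the pipeline should fail:
if diff -u expected-failures.txt actual-failures.txt; then
  echo "PASS: failures match the expected-failures file"
else
  echo "FAIL: unexpected difference, the CI pipeline would fail"
fi
```

Note that the comparison is symmetric: a scenario that unexpectedly *passes* (listed in the expected-failures file but not actually failing) is flagged just like a new unexpected failure, which is what forces contributors to remove fixed scenarios from the file.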