Compare commits

..

1 Commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Viktor Scharf | 53d0bb467b | fix issue-2146 | 2026-01-15 16:57:26 +01:00 |
88 changed files with 2103 additions and 22639 deletions

View File

@@ -3,83 +3,79 @@
### Prerequisites
* [ ] DEV/QA: bump web version
* [ ] DEV/QA: bump reva version
* [ ] DEV: Create rc tag `vx.y.z-rc.x`
* [ ] DEV: update introductionVersion
* [ ] DEV: add new production version
* [ ] DEV/QA: [Kickoff meeting](https://???)
* [ ] DEV/QA: Define client versions and provide list of breaking changes for desktop/mobile team
* [ ] DEV/QA: Check new strings and align with clients
* [ ] DEV/DOCS: Create list of pending docs tasks
* [ ] DEV: Create branch `release-x.x.x-rc.x` -> CODEFREEZE
* [ ] DEV: bump opencloud version in necessary files
* [ ] DEV: `changelog/CHANGELOG.tmpl`
* [ ] DEV: `pkg/version/version.go`
* [ ] DEV: `sonar-project.properties`
* [ ] DEV: prepare changelog folder in `changelog/x.x.x_????_??_??`
* [ ] DEV: Check successful CI run on release branch
* [ ] DEV: Create signed tag `vx.y.z-rc.x`
* [ ] DEV: Check successful CI run on `vx.y.z-rc.x` tag / BLOCKING for all further activity
* [ ] DEV: Merge back release branch
* [ ] DEV: bump released deployments to `vx.y.z-rc.x`
* [ ] DEV: https://cloud.opencloud.eu/
* [ ] DEV: needs snapshot and migration
### QA Phase
* [ ] QA: Compatibility test with posix fs
* [ ] QA: Compatibility test with decomposed fs
* [ ] DEV/QA: Performance test
* [ ] STORAGE_USERS_DRIVER=posix
* [ ] 75 VUs, 60m
* [ ] 75 VUs, 60m
* [ ] STORAGE_USERS_DRIVER=decomposed
* [ ] 75 VUs, 60m
* [ ] 75 VUs, 60m
* [ ] QA: documentation test
* [ ] QA: Review documentation
* [ ] QA: Verify all new features documented
* [ ] QA: Create upgrade documentation
* [ ] QA: Check installation guides
* [ ] QA: Confirmatory testing (if needed)
* [ ] QA: [Compatibility test](???)
* [ ] QA: [Performance test](https://github.com/opencloud-eu/cdperf/tree/main/packages/k6-tests/src)
* [ ] QA: Documentation test:
* [ ] QA: Single binary - setup
* [ ] QA: Docker - setup
* [ ] QA: Docker-compose - setup
* [ ] QA: helm/k8s - setup
* [ ] QA: e2e with different deployments:
* [ ] QA: [wopi](???.works)
* [ ] QA: [traefik](???.works)
* [ ] QA: [ldap](???.works)
* [ ] QA: e2e with different storage:
* [ ] QA: decomposed
* [ ] QA: decomposeds3
* [ ] QA: posix
* [ ] QA: posix with enabled watch_fs
* [ ] QA: e2e with different deployments:
* [ ] QA: e2e tests against opencloud-charts
* [ ] QA: binary
* [ ] QA: multitenancy
* [ ] QA: docker using [docker-compose_test_plan](https://github.com/opencloud-eu/qa/blob/main/.github/ISSUE_TEMPLATE/docker-compose_test_plan_template.md)
* [ ] QA: local
* [ ] QA: nfs
* [ ] QA: s3
* [ ] QA: Different clients:
* [ ] QA: desktop (define version) https://github.com/opencloud-eu/client/releases
* [ ] QA: against mac - exploratory testing
* [ ] QA: against windows - exploratory testing
* [ ] QA: against mac - smoke test
* [ ] QA: against windows - smoke test
* [ ] QA: against linux (use auto tests)
* [ ] QA: android (define version) https://github.com/opencloud-eu/android/releases
* [ ] QA: ios (define version)
* [ ] QA: check German translation of the docs
* [ ] QA: German translations for desktop at 100%
* [ ] QA: exploratory testing
* [ ] QA: [Smoke test](???) on Web Office (Collabora, OnlyOffice, Microsoft Office)
* [ ] QA: Smoke test Hello extension
* [ ] QA: [Smoke test](???) ldap
* [ ] QA: Collecting errors found
### Collected bugs
* [ ] Please place all bugs found here
### After QA Phase
### After QA Phase (IT related)
* [ ] Brief company-wide heads up via mail @tbsbdr
* [ ] Create list of changed ENV vars and send to release-coordination@opencloud.eu
* [ ] Variable Name
* [ ] Introduced in version
* [ ] Default Value
* [ ] Description
* [ ] Dependencies on other components
* [ ] DEV: Create branch `release-x.x.x`
* [ ] DEV: bump OpenCloud version in necessary files
* [ ] DEV: `pkg/version/version.go`
* [ ] DEV: `sonar-project.properties`
* [ ] DEV: released deployment versions
* [ ] DEV: prepare changelog folder in `changelog/x.x.x_???`
* [ ] Release Notes + Breaking Changes @tbsbdr
* [ ] Migration + Breaking Changes Admin Doc @???
* [ ] DEV: Create final signed tag
* [ ] DEV: Check successful CI run on `vx.y.z` tag / BLOCKING for all further activity
* [ ] Merge release notes
* [ ] QA: bump version in `pkg/version.go`
* [ ] QA: Run CI
* [ ] DEV/QA: create final tag
* [ ] QA: observe CI Run on tag
* [ ] DEV/QA: Create a new `stable-*` branch
* [ ] [opencloud](https://github.com/opencloud-eu/opencloud/branches)
* [ ] [web](https://github.com/opencloud-eu/web/branches)
* [ ] [reva](https://github.com/opencloud-eu/reva/branches)
* [ ] [opencloud-compose](https://github.com/opencloud-eu/opencloud-compose/branches)
* [ ] DEV/QA: publish release notes to the docs
* [ ] DEV/QA: update [demo.opencloud.eu](https://demo.opencloud.eu/)
### After QA Phase (Marketing / Product / Sales related)
* [ ] notify marketing that the release is ready @tbsbdr
* [ ] announce in the public [matrix channel](https://matrix.to/#/#opencloud:matrix.org)
* [ ] press information @AnneGo137
* [ ] press information @AnneGo137
* [ ] Blogentry @AnneGo137
* [ ] Internal meeting (Groupe Pre-Webinar) @db-ot
* [ ] Partner briefing (partners should be informed about new features) @matthias
* [ ] Webinar DE & EN @AnneGo137
* [ ] Presentation DE @tbsbdr / @db-ot
* [ ] Presentation EN @tbsbdr / @db-ot
* [ ] Update website @AnneGo137
* [ ] Features @AnneGo137
* [ ] Service & Support - New Enterprise Features @tbsbdr
* [ ] OpenCloud_Benefits.pdf updates @AnneGo137
* [ ] Welcome Files: Features as media @tbsbdr
* [ ] Flyer update @AnneGo137
* [ ] Sales presentation @matthias
### Post-release communication
* [ ] DEV: Create a `docs-stable-x.y` branch based on the docs folder in the OpenCloud repo @micbar
* [ ] DEV/QA: Ping documentation in RC about the new release tag (for opencloud/helm chart version bump in docs)
* [ ] DEV/QA: Ping marketing to update all download links (download mirrors are updated at the full hour, wait with ping until download is actually available)
* [ ] DEV/QA: Ping @??? once the demo instances are running this release
* [ ] DEV: Merge back release branch
* [ ] DEV: Create stable-x.y branch in the OpenCloud repo from final tag

View File

@@ -201,6 +201,13 @@ config = {
],
"skip": False,
},
"accountsHashDifficulty": {
"skip": False,
"suites": [
"apiAccountsHashDifficulty",
],
"accounts_hash_difficulty": "default",
},
"notification": {
"suites": [
"apiNotification",
@@ -227,6 +234,7 @@ config = {
],
"skip": False,
"antivirusNeeded": True,
"generateVirusFiles": True,
"extraServerEnvironment": {
"ANTIVIRUS_SCANNER_TYPE": "clamav",
"ANTIVIRUS_CLAMAV_SOCKET": "tcp://clamav:3310",
@@ -293,6 +301,7 @@ config = {
"skip": False,
"withRemotePhp": [True],
"antivirusNeeded": True,
"generateVirusFiles": True,
"extraServerEnvironment": {
"ANTIVIRUS_SCANNER_TYPE": "clamav",
"ANTIVIRUS_CLAMAV_SOCKET": "tcp://clamav:3310",
@@ -658,10 +667,10 @@ def testPipelines(ctx):
storage = "decomposed"
if "skip" not in config["cs3ApiTests"] or not config["cs3ApiTests"]["skip"]:
pipelines += cs3ApiTests(ctx, storage)
pipelines += cs3ApiTests(ctx, storage, "default")
if "skip" not in config["wopiValidatorTests"] or not config["wopiValidatorTests"]["skip"]:
pipelines += wopiValidatorTests(ctx, storage, "builtin")
pipelines += wopiValidatorTests(ctx, storage, "cs3")
pipelines += wopiValidatorTests(ctx, storage, "builtin", "default")
pipelines += wopiValidatorTests(ctx, storage, "cs3", "default")
pipelines += localApiTestPipeline(ctx)
pipelines += coreApiTestPipeline(ctx)
@@ -1050,12 +1059,12 @@ def codestyle(ctx):
return pipelines
def cs3ApiTests(ctx, storage):
def cs3ApiTests(ctx, storage, accounts_hash_difficulty = 4):
pipeline = {
"name": "test-cs3-API-%s" % storage,
"steps": evaluateWorkflowStep() +
restoreBuildArtifactCache(ctx, dirs["opencloudBinArtifact"], dirs["opencloudBinPath"]) +
opencloudServer(storage, deploy_type = "cs3api_validator") +
opencloudServer(storage, accounts_hash_difficulty, deploy_type = "cs3api_validator") +
[
{
"name": "cs3ApiTests",
@@ -1086,7 +1095,7 @@ def cs3ApiTests(ctx, storage):
])
return [pipeline]
def wopiValidatorTests(ctx, storage, wopiServerType):
def wopiValidatorTests(ctx, storage, wopiServerType, accounts_hash_difficulty = 4):
testgroups = [
"BaseWopiViewing",
"CheckFileInfoSchema",
@@ -1164,7 +1173,7 @@ def wopiValidatorTests(ctx, storage, wopiServerType):
"steps": evaluateWorkflowStep() +
restoreBuildArtifactCache(ctx, dirs["opencloudBinArtifact"], dirs["opencloudBinPath"]) +
waitForServices("fake-office", ["fakeoffice:8080"]) +
opencloudServer(storage, deploy_type = "wopi_validator", extra_server_environment = extra_server_environment) +
opencloudServer(storage, accounts_hash_difficulty, deploy_type = "wopi_validator", extra_server_environment = extra_server_environment) +
wopiServer +
waitForServices("wopi-fakeoffice", ["wopi-fakeoffice:9300"]) +
[
@@ -1209,21 +1218,29 @@ def wopiValidatorTests(ctx, storage, wopiServerType):
def localApiTestPipeline(ctx):
pipelines = []
with_remote_php = [True]
enable_watch_fs = [False]
if ctx.build.event == "cron":
with_remote_php.append(False)
enable_watch_fs.append(True)
defaults = {
"suites": {},
"skip": False,
"extraTestEnvironment": {},
"extraServerEnvironment": {},
"storages": ["posix"],
"accounts_hash_difficulty": 4,
"emailNeeded": False,
"antivirusNeeded": False,
"tikaNeeded": False,
"federationServer": False,
"collaborationServiceNeeded": False,
"extraCollaborationEnvironment": {},
"withRemotePhp": [True],
"enableWatchFs": [False],
"withRemotePhp": with_remote_php,
"enableWatchFs": enable_watch_fs,
"ldapNeeded": False,
"generateVirusFiles": False,
}
if "localApiTests" in config:
@@ -1238,14 +1255,6 @@ def localApiTestPipeline(ctx):
if "[decomposed]" in ctx.build.title.lower() or name.startswith("cli"):
params["storages"] = ["decomposed"]
if ctx.build.event == "cron":
params["withRemotePhp"] = [True, False]
params["enableWatchFs"] = [True, False]
# override withRemotePhp if specified in the suite config
if "withRemotePhp" in matrix:
params["withRemotePhp"] = matrix["withRemotePhp"]
for storage in params["storages"]:
for run_with_remote_php in params["withRemotePhp"]:
for run_with_watch_fs_enabled in params["enableWatchFs"]:
@@ -1270,15 +1279,16 @@ def localApiTestPipeline(ctx):
(waitForLdapService() if params["ldapNeeded"] else []) +
opencloudServer(
storage,
params["accounts_hash_difficulty"],
extra_server_environment = params["extraServerEnvironment"],
with_wrapper = True,
tika_enabled = params["tikaNeeded"],
watch_fs_enabled = run_with_watch_fs_enabled,
) +
(opencloudServer(storage, deploy_type = "federation", extra_server_environment = params["extraServerEnvironment"], watch_fs_enabled = run_with_watch_fs_enabled) if params["federationServer"] else []) +
(opencloudServer(storage, params["accounts_hash_difficulty"], deploy_type = "federation", extra_server_environment = params["extraServerEnvironment"], watch_fs_enabled = run_with_watch_fs_enabled) if params["federationServer"] else []) +
((wopiCollaborationService("fakeoffice") + wopiCollaborationService("collabora") + wopiCollaborationService("onlyoffice")) if params["collaborationServiceNeeded"] else []) +
(openCloudHealthCheck("wopi", ["wopi-collabora:9304", "wopi-onlyoffice:9304", "wopi-fakeoffice:9304"]) if params["collaborationServiceNeeded"] else []) +
localApiTest(params["suites"], storage, params["extraTestEnvironment"], run_with_remote_php) +
localApiTest(params["suites"], storage, params["extraTestEnvironment"], run_with_remote_php, params["generateVirusFiles"]) +
logRequests(),
"services": (emailService() if params["emailNeeded"] else []) +
(clamavService() if params["antivirusNeeded"] else []) +
@@ -1302,9 +1312,9 @@ def localApiTestPipeline(ctx):
pipelines.append(pipeline)
return pipelines
def localApiTest(suites, storage = "decomposed", extra_environment = {}, with_remote_php = False):
def localApiTest(suites, storage = "decomposed", extra_environment = {}, with_remote_php = False, generate_virus_files = False):
test_dir = "%s/tests/acceptance" % dirs["base"]
expected_failures_file = "%s/expected-failures-%s-storage.md" % (test_dir, storage)
expected_failures_file = "%s/expected-failures-localAPI-on-%s-storage.md" % (test_dir, storage)
environment = {
"TEST_SERVER_URL": OC_URL,
@@ -1327,6 +1337,11 @@ def localApiTest(suites, storage = "decomposed", extra_environment = {}, with_re
commands = []
# Generate EICAR virus test files if needed
if generate_virus_files:
commands.append("chmod +x %s/tests/acceptance/scripts/generate-virus-files.sh" % dirs["base"])
commands.append("bash %s/tests/acceptance/scripts/generate-virus-files.sh" % dirs["base"])
# Merge expected failures
if not with_remote_php:
commands.append("cat %s/expected-failures-without-remotephp.md >> %s" % (test_dir, expected_failures_file))
@@ -1349,6 +1364,7 @@ def coreApiTestPipeline(ctx):
"numberOfParts": 7,
"skipExceptParts": [],
"skip": False,
"accounts_hash_difficulty": 4,
}
pipelines = []
@@ -1369,10 +1385,6 @@ def coreApiTestPipeline(ctx):
params["withRemotePhp"] = [True, False]
params["enableWatchFs"] = [True, False]
# override withRemotePhp if specified in the suite config
if "withRemotePhp" in matrix:
params["withRemotePhp"] = matrix["withRemotePhp"]
debugParts = params["skipExceptParts"]
debugPartsEnabled = (len(debugParts) != 0)
@@ -1394,6 +1406,7 @@ def coreApiTestPipeline(ctx):
restoreBuildArtifactCache(ctx, dirs["opencloudBinArtifact"], dirs["opencloudBinPath"]) +
opencloudServer(
storage,
params["accounts_hash_difficulty"],
with_wrapper = True,
watch_fs_enabled = run_with_watch_fs_enabled,
) +
@@ -1427,7 +1440,7 @@ def coreApiTestPipeline(ctx):
def coreApiTest(part_number = 1, number_of_parts = 1, with_remote_php = False, storage = "posix"):
filter_tags = "~@skipOnOpencloud-%s-Storage" % storage
test_dir = "%s/tests/acceptance" % dirs["base"]
expected_failures_file = "%s/expected-failures-%s-storage.md" % (test_dir, storage)
expected_failures_file = "%s/expected-failures-API-on-%s-storage.md" % (test_dir, storage)
return [{
"name": "api-tests",
@@ -2311,7 +2324,7 @@ def notifyMatrix(ctx):
return result
def opencloudServer(storage = "decomposed", depends_on = [], deploy_type = "", extra_server_environment = {}, with_wrapper = False, tika_enabled = False, watch_fs_enabled = False):
def opencloudServer(storage = "decomposed", accounts_hash_difficulty = 4, depends_on = [], deploy_type = "", extra_server_environment = {}, with_wrapper = False, tika_enabled = False, watch_fs_enabled = False):
user = "0:0"
container_name = OC_SERVER_NAME
environment = {
@@ -2407,6 +2420,13 @@ def opencloudServer(storage = "decomposed", depends_on = [], deploy_type = "", e
if watch_fs_enabled:
environment["STORAGE_USERS_POSIX_WATCH_FS"] = True
# Pass in "default" accounts_hash_difficulty to not set this environment variable.
# That will allow OpenCloud to use whatever its built-in default is.
# Otherwise pass in a value from 4 to about 11 or 12 (default 4, for making regular tests fast)
# The high values cause lots of CPU to be used when hashing passwords, and really slow down the tests.
if accounts_hash_difficulty != "default":
environment["ACCOUNTS_HASH_DIFFICULTY"] = accounts_hash_difficulty
for item in extra_server_environment:
environment[item] = extra_server_environment[item]
@@ -3273,7 +3293,7 @@ def wopiCollaborationService(name):
environment["COLLABORATION_APP_ADDR"] = "https://onlyoffice"
environment["COLLABORATION_APP_ICON"] = "https://onlyoffice/web-apps/apps/documenteditor/main/resources/img/favicon.ico"
elif name == "fakeoffice":
environment["COLLABORATION_SERVICE_NAME"] = "collaboration-fakeoffice"
environment["COLLABORATION_SERVICE_NAME"] = "collboration-fakeoficce"
environment["COLLABORATION_APP_NAME"] = "FakeOffice"
environment["COLLABORATION_APP_PRODUCT"] = "Microsoft"
environment["COLLABORATION_APP_ADDR"] = "http://fakeoffice:8080"

10
go.mod
View File

@@ -41,13 +41,13 @@ require (
github.com/google/uuid v1.6.0
github.com/gookit/config/v2 v2.2.7
github.com/gorilla/mux v1.8.1
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.4
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.3
github.com/invopop/validation v0.8.0
github.com/jellydator/ttlcache/v2 v2.11.1
github.com/jellydator/ttlcache/v3 v3.4.0
github.com/jinzhu/now v1.1.5
github.com/justinas/alice v1.2.0
github.com/kovidgoyal/imaging v1.8.19
github.com/kovidgoyal/imaging v1.8.18
github.com/leonelquinteros/gotext v1.7.2
github.com/libregraph/idm v0.5.0
github.com/libregraph/lico v0.66.0
@@ -61,7 +61,7 @@ require (
github.com/onsi/ginkgo v1.16.5
github.com/onsi/ginkgo/v2 v2.27.5
github.com/onsi/gomega v1.39.0
github.com/open-policy-agent/opa v1.12.3
github.com/open-policy-agent/opa v1.11.1
github.com/opencloud-eu/icap-client v0.0.0-20250930132611-28a2afe62d89
github.com/opencloud-eu/libre-graph-api-go v1.0.8-0.20250724122329-41ba6b191e76
github.com/opencloud-eu/reva/v2 v2.41.1-0.20260107152322-93760b632993
@@ -110,7 +110,7 @@ require (
golang.org/x/sync v0.19.0
golang.org/x/term v0.39.0
golang.org/x/text v0.33.0
google.golang.org/genproto/googleapis/api v0.0.0-20251222181119-0a764e51fe1b
google.golang.org/genproto/googleapis/api v0.0.0-20251202230838-ff82c1b0f217
google.golang.org/grpc v1.78.0
google.golang.org/protobuf v1.36.11
gopkg.in/yaml.v2 v2.4.0
@@ -395,7 +395,7 @@ require (
golang.org/x/time v0.14.0 // indirect
golang.org/x/tools v0.40.0 // indirect
google.golang.org/genproto v0.0.0-20250303144028-a0af3efb3deb // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20251222181119-0a764e51fe1b // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20251202230838-ff82c1b0f217 // indirect
gopkg.in/cenkalti/backoff.v1 v1.1.0 // indirect
gopkg.in/tomb.v1 v1.0.0-20141024135613-dd632973f1e7 // indirect
gopkg.in/warnings.v0 v0.1.2 // indirect

20
go.sum
View File

@@ -624,8 +624,8 @@ github.com/grpc-ecosystem/go-grpc-middleware v1.4.0/go.mod h1:g5qyo/la0ALbONm6Vb
github.com/grpc-ecosystem/go-grpc-prometheus v1.2.0/go.mod h1:8NvIoxWQoOIhqOTXgfV/d3M/q6VIi02HzZEHgUlZvzk=
github.com/grpc-ecosystem/grpc-gateway v1.8.5/go.mod h1:vNeuVxBJEsws4ogUvrchl83t/GYV9WGTSLVdBhOQFDY=
github.com/grpc-ecosystem/grpc-gateway v1.9.0/go.mod h1:vNeuVxBJEsws4ogUvrchl83t/GYV9WGTSLVdBhOQFDY=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.4 h1:kEISI/Gx67NzH3nJxAmY/dGac80kKZgZt134u7Y/k1s=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.4/go.mod h1:6Nz966r3vQYCqIzWsuEl9d7cf7mRhtDmm++sOxlnfxI=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.3 h1:NmZ1PKzSTQbuGHw9DGPFomqkkLWMC+vZCkfs+FHv1Vg=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.3/go.mod h1:zQrxl1YP88HQlA6i9c63DSVPFklWpGX4OWAc9bFuaH4=
github.com/h2non/parth v0.0.0-20190131123155-b4df798d6542/go.mod h1:Ow0tF8D4Kplbc8s8sSb3V2oUCygFHVp8gC3Dn6U4MNI=
github.com/hashicorp/consul/api v1.1.0/go.mod h1:VmuI/Lkw1nC05EYQWNKwWGbkg+FbDBtguAZLlVdkD9Q=
github.com/hashicorp/consul/sdk v0.1.1/go.mod h1:VKf9jXwCTEY1QZP2MOLRhb5i/I/ssyNV1vwHyQBF0x8=
@@ -745,8 +745,8 @@ github.com/kovidgoyal/go-parallel v1.1.1 h1:1OzpNjtrUkBPq3UaqrnvOoB2F9RttSt811ui
github.com/kovidgoyal/go-parallel v1.1.1/go.mod h1:BJNIbe6+hxyFWv7n6oEDPj3PA5qSw5OCtf0hcVxWJiw=
github.com/kovidgoyal/go-shm v1.0.0 h1:HJEel9D1F9YhULvClEHJLawoRSj/1u/EDV7MJbBPgQo=
github.com/kovidgoyal/go-shm v1.0.0/go.mod h1:Yzb80Xf9L3kaoB2RGok9hHwMIt7Oif61kT6t3+VnZds=
github.com/kovidgoyal/imaging v1.8.19 h1:zWJdQqF2tfSKjvoB7XpLRhVGbYsze++M0iaqZ4ZkhNk=
github.com/kovidgoyal/imaging v1.8.19/go.mod h1:I0q8RdoEuyc4G8GFOF9CaluTUHQSf68d6TmsqpvfRI8=
github.com/kovidgoyal/imaging v1.8.18 h1:42JCqJnQBzBo0hGllLEJVYDARWXPP9MT3HgiTno9Chc=
github.com/kovidgoyal/imaging v1.8.18/go.mod h1:bqjHpeAxSuTLvKob6HuqAr9td2wP9G54Snbgd+1QLoU=
github.com/kr/fs v0.1.0/go.mod h1:FFnZGqtBN9Gxj7eW1uZ42v5BccTP0vu6NEaFoC2HwRg=
github.com/kr/logfmt v0.0.0-20140226030751-b84e30acd515/go.mod h1:+0opPa2QZZtGFBFZlji/RkVcI2GknAs/DXo4wKdlNEc=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
@@ -957,8 +957,8 @@ github.com/onsi/gomega v1.7.1/go.mod h1:XdKZgCCFLUoM/7CFJVPcG8C1xQ1AJ0vpAezJrB7J
github.com/onsi/gomega v1.10.1/go.mod h1:iN09h71vgCQne3DLsj+A5owkum+a2tYe+TOCB1ybHNo=
github.com/onsi/gomega v1.39.0 h1:y2ROC3hKFmQZJNFeGAMeHZKkjBL65mIZcvrLQBF9k6Q=
github.com/onsi/gomega v1.39.0/go.mod h1:ZCU1pkQcXDO5Sl9/VVEGlDyp+zm0m1cmeG5TOzLgdh4=
github.com/open-policy-agent/opa v1.12.3 h1:qe3m/w52baKC/HJtippw+hYBUKCzuBCPjB+D5P9knfc=
github.com/open-policy-agent/opa v1.12.3/go.mod h1:RnDgm04GA1RjEXJvrsG9uNT/+FyBNmozcPvA2qz60M4=
github.com/open-policy-agent/opa v1.11.1 h1:4bMlG6DjRZTRAswRyF+KUCgxHu1Gsk0h9EbZ4W9REvM=
github.com/open-policy-agent/opa v1.11.1/go.mod h1:QimuJO4T3KYxWzrmAymqlFvsIanCjKrGjmmC8GgAdgE=
github.com/opencloud-eu/go-micro-plugins/v4/store/nats-js-kv v0.0.0-20250512152754-23325793059a h1:Sakl76blJAaM6NxylVkgSzktjo2dS504iDotEFJsh3M=
github.com/opencloud-eu/go-micro-plugins/v4/store/nats-js-kv v0.0.0-20250512152754-23325793059a/go.mod h1:pjcozWijkNPbEtX5SIQaxEW/h8VAVZYTLx+70bmB3LY=
github.com/opencloud-eu/icap-client v0.0.0-20250930132611-28a2afe62d89 h1:W1ms+lP5lUUIzjRGDg93WrQfZJZCaV1ZP3KeyXi8bzY=
@@ -1744,10 +1744,10 @@ google.golang.org/genproto v0.0.0-20200804131852-c06518451d9c/go.mod h1:FWY/as6D
google.golang.org/genproto v0.0.0-20200825200019-8632dd797987/go.mod h1:FWY/as6DDZQgahTzZj3fqbO1CbirC29ZNUFHwi0/+no=
google.golang.org/genproto v0.0.0-20250303144028-a0af3efb3deb h1:ITgPrl429bc6+2ZraNSzMDk3I95nmQln2fuPstKwFDE=
google.golang.org/genproto v0.0.0-20250303144028-a0af3efb3deb/go.mod h1:sAo5UzpjUwgFBCzupwhcLcxHVDK7vG5IqI30YnwX2eE=
google.golang.org/genproto/googleapis/api v0.0.0-20251222181119-0a764e51fe1b h1:uA40e2M6fYRBf0+8uN5mLlqUtV192iiksiICIBkYJ1E=
google.golang.org/genproto/googleapis/api v0.0.0-20251222181119-0a764e51fe1b/go.mod h1:Xa7le7qx2vmqB/SzWUBa7KdMjpdpAHlh5QCSnjessQk=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251222181119-0a764e51fe1b h1:Mv8VFug0MP9e5vUxfBcE3vUkV6CImK3cMNMIDFjmzxU=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251222181119-0a764e51fe1b/go.mod h1:j9x/tPzZkyxcgEFkiKEEGxfvyumM01BEtsW8xzOahRQ=
google.golang.org/genproto/googleapis/api v0.0.0-20251202230838-ff82c1b0f217 h1:fCvbg86sFXwdrl5LgVcTEvNC+2txB5mgROGmRL5mrls=
google.golang.org/genproto/googleapis/api v0.0.0-20251202230838-ff82c1b0f217/go.mod h1:+rXWjjaukWZun3mLfjmVnQi18E1AsFbDN9QdJ5YXLto=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251202230838-ff82c1b0f217 h1:gRkg/vSppuSQoDjxyiGfN4Upv/h/DQmIR10ZU8dh4Ww=
google.golang.org/genproto/googleapis/rpc v0.0.0-20251202230838-ff82c1b0f217/go.mod h1:7i2o+ce6H/6BluujYR+kqX3GKH+dChPTQU19wjRPiGk=
google.golang.org/grpc v1.17.0/go.mod h1:6QZJwpn2B+Zp71q/5VxRsJ6NXXVCE5NRUHRo+f3cWCs=
google.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=
google.golang.org/grpc v1.19.1/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=

View File

@@ -1,11 +1,6 @@
# Webfinger
The webfinger service provides an RFC7033 WebFinger lookup of OpenCloud resources, relevant for a given user account, at the /.well-known/webfinger endpoint.
1. An [OpenID Connect Discovery](#openid-connect-discovery) for the IdP, based on the OpenCloud URL.
2. An [Authenticated Instance Discovery](#authenticated-instance-discovery), based on the user account.
These two requests are only needed for discovery.
The webfinger service provides an RFC7033 WebFinger lookup of OpenCloud instances relevant for a given user account via the /.well-known/webfinger endpoint.
## OpenID Connect Discovery
@@ -23,7 +18,7 @@ Clients can make an unauthenticated `GET https://drive.opencloud.test/.well-know
}
```
Here, the `resource` takes the instance domain URI, but an `acct:` URI works as well.
Here, the `resource` takes the instance domain URI, but an `acct:` URI works as well.
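
For illustration, a lookup against this endpoint might look like the following minimal sketch; the host is the example instance from the request above, the account is the example address used in the configuration below, and `-k` is only needed for self-signed certificates:

```bash
curl -k 'https://drive.opencloud.test/.well-known/webfinger?resource=acct:alan@example.org'
```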
## Authenticated Instance Discovery
@@ -63,14 +58,14 @@ webfinger:
- claim: email
regex: alan@example\.org
href: "https://{{.preferred_username}}.cloud.opencloud.test"
title:
title:
"en": "OpenCloud Instance for Alan"
"de": "OpenCloud Instanz für Alan"
break: true
- claim: "email"
regex: mary@example\.org
href: "https://{{.preferred_username}}.cloud.opencloud.test"
title:
title:
"en": "OpenCloud Instance for Mary"
"de": "OpenCloud Instanz für Mary"
break: false

View File

@@ -4,129 +4,166 @@ To run tests in the test suite you have two options. You may go the easy way and
Both ways to run tests with the test suites are described here.
## Table of Contents
- [Running Test Suite in Docker](#running-test-suite-in-docker)
- [Running API Tests](#running-api-tests)
- [Run Tests With Required Services](#run-tests-with-required-services)
- [Run Tests Only](#run-tests-only)
- [Skip Local Image Build While Running Tests](#skip-local-image-build-while-running-tests)
- [Check Test Logs](#check-test-logs)
- [Cleanup the Setup](#cleanup-the-setup)
- [Running WOPI Validator Tests](#running-wopi-validator-tests)
- [Running Test Suite in Local Environment](#running-test-suite-in-local-environment)
- [Running Tests With And Without `remote.php`](#running-tests-with-and-without-remotephp)
- [Running ENV Config Tests (@env-Config)](#running-env-config-tests-env-config)
- [Running Test Suite With Email Service (@email)](#running-test-suite-with-email-service-email)
- [Running Test Suite With Tika Service (@tikaServiceNeeded)](#running-test-suite-with-tika-service-tikaserviceneeded)
- [Running Test Suite With Antivirus Service (@antivirus)](#running-test-suite-with-antivirus-service-antivirus)
- [Running Test Suite With Federated Sharing (@ocm)](#running-test-suite-with-federated-sharing-ocm)
- [Running Text Preview Tests Containing Unicode Characters](#running-text-preview-tests-containing-unicode-characters)
- [Running All API Tests Locally](#running-all-api-tests-locally)
## Running Test Suite in Docker
Check the available commands and environment variables with:
Let's see what is available. Invoke the following command from within the root of the OpenCloud repository.
```bash
make -C tests/acceptance/docker help
```
### Running API Tests
Basically we have two sources for feature tests and test suites:
#### Run Tests With Required Services
- [OpenCloud feature test and test suites](https://github.com/opencloud-eu/opencloud/tree/main/tests/acceptance/features)
- [tests and test suites transferred from core, they have prefix coreApi](https://github.com/opencloud-eu/opencloud/tree/main/tests/acceptance/features)
We can run a single feature or a single test suite with different storage drivers.
At the moment, both can be applied to OpenCloud.
1. Run a specific feature file:
As a storage backend, we support the OpenCloud native storage, also called `decomposed`. This stores files directly on disk. Along with that, we also provide the `decomposeds3` and `posix` storage drivers.
```bash
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature' \
make -C tests/acceptance/docker run-api-tests
```
You can invoke two types of test suite runs:
or a single scenario in a feature:
- run a full test suite, which consists of multiple feature tests
- run a single feature or single scenario in a feature
```bash
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature:24' \
make -C tests/acceptance/docker run-api-tests
```
### Run Full Test Suite
2. Run a specific test suite:
#### Local OpenCloud Tests (prefix `api`)
```bash
BEHAT_SUITE='apiGraphUserGroup' \
make -C tests/acceptance/docker run-api-tests
```
The names of the full test suite make targets have the same naming as in the CI pipeline. See the available local OpenCloud-specific test suites [here](https://github.com/opencloud-eu/opencloud/tree/main/tests/acceptance/features). They can be run with `decomposed`, `decomposeds3` and `posix` storage.
3. Run with different storage driver (default is `posix`):
```bash
STORAGE_DRIVER='posix' \
BEHAT_SUITE='apiGraphUserGroup' \
make -C tests/acceptance/docker run-api-tests
```
4. Run the tests that require an email server (tests tagged with `@email`). Provide `START_EMAIL=true` while running the tests:
```bash
START_EMAIL=true \
BEHAT_FEATURE='tests/acceptance/features/apiNotification/emailNotification.feature' \
make -C tests/acceptance/docker run-api-tests
```
5. Run the tests that require tika service (tests tagged with `@tikaServiceNeeded`). Provide `START_TIKA=true` while running the tests:
```bash
START_TIKA=true \
BEHAT_FEATURE='tests/acceptance/features/apiSearchContent/contentSearch.feature' \
make -C tests/acceptance/docker run-api-tests
```
6. Run the tests that require an antivirus service (tests tagged with `@antivirus`). Provide `START_ANTIVIRUS=true` while running the tests:
```bash
START_ANTIVIRUS=true \
BEHAT_FEATURE='tests/acceptance/features/apiAntivirus/antivirus.feature' \
make -C tests/acceptance/docker run-api-tests
```
7. Run the wopi tests. Provide `ENABLE_WOPI=true` while running the tests:
```bash
ENABLE_WOPI=true \
BEHAT_FEATURE='tests/acceptance/features/apiCollaboration/checkFileInfo.feature' \
make -C tests/acceptance/docker run-api-tests
```
#### Run Tests Only
If you want to re-run the tests because of some failures or any other reason, you can use the following command to run only the tests without starting the services again.
Also, this command can be used to run the tests against the already hosted OpenCloud server by providing the `TEST_SERVER_URL` and `USE_BEARER_TOKEN` environment variables.
> [!NOTE]
> You can utilize the following environment variables:
>
> - `BEHAT_FEATURE`
> - `BEHAT_SUITE`
> - `USE_BEARER_TOKEN`
> - `TEST_SERVER_URL`
For example, command:
```bash
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature:24' \
make -C tests/acceptance/docker run-test-only
make -C tests/acceptance/docker localApiTests-apiGraph-decomposed
```
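
This target can also point at an already running server; a minimal sketch combining the variables from the note above (the URL and the choice of suite are placeholders, and `USE_BEARER_TOKEN=true` is an assumed value):

```bash
TEST_SERVER_URL='https://localhost:9200' \
USE_BEARER_TOKEN=true \
BEHAT_SUITE='apiGraphUserGroup' \
make -C tests/acceptance/docker run-test-only
```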
#### Skip Local Image Build While Running Tests
runs the same tests as the `localApiTests-apiGraph-decomposed` CI pipeline, which runs the OpenCloud test suite "apiGraph" against the OpenCloud server with `decomposed` storage.
While running the tests, the opencloud docker image is built with the `opencloudeu/opencloud:dev` tag. If you want to skip building the local image, you can use the `OC_IMAGE_TAG` env variable, which must contain an available docker tag of the [opencloudeu/opencloud registry on Docker Hub](https://hub.docker.com/r/opencloudeu/opencloud) (e.g. 'latest').
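
For example, a run that skips the local build might look like this (a sketch; the feature file is chosen arbitrarily and `latest` assumes that tag exists on Docker Hub):

```bash
OC_IMAGE_TAG=latest \
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature' \
make -C tests/acceptance/docker run-api-tests
```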
command:
```bash
make -C tests/acceptance/docker localApiTests-apiGraph-decomposeds3
```
runs the OpenCloud test suite `apiGraph` against the OpenCloud server with `decomposeds3` storage.
And command:
```bash
make -C tests/acceptance/docker localApiTests-apiGraph-posix
```
runs the OpenCloud test suite `apiGraph` against the OpenCloud server with `posix` storage.
Note:
While running the tests, the OpenCloud server is started with [ocwrapper](https://github.com/opencloud-eu/opencloud/blob/main/tests/ocwrapper/README.md) (i.e. `WITH_WRAPPER=true`) by default. In order to run the tests without ocwrapper, provide `WITH_WRAPPER=false` when running the tests. For example:
```bash
WITH_WRAPPER=false \
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature:26' \
make -C tests/acceptance/docker test-opencloud-feature-decomposed-storage
```
But some test suites that are tagged with `@env-config` require the OpenCloud server to be run with ocwrapper. So, running those tests requires `WITH_WRAPPER=true` (the default setting).
Note:
To run the tests that require an email server (tests tagged with `@email`), you need to provide `START_EMAIL=true` while running the tests.
```bash
START_EMAIL=true \
BEHAT_FEATURE='tests/acceptance/features/apiNotification/emailNotification.feature' \
make -C tests/acceptance/docker test-opencloud-feature-decomposed-storage
```
Note:
To run the tests that require tika service (tests tagged with `@tikaServiceNeeded`), you need to provide `START_TIKA=true` while running the tests.
```bash
START_TIKA=true \
BEHAT_FEATURE='tests/acceptance/features/apiSearchContent/contentSearch.feature' \
make -C tests/acceptance/docker test-opencloud-feature-decomposed-storage
```
Note:
To run the tests that require an antivirus service (tests tagged with `@antivirus`), you need to provide the following environment variables while running the tests.
```bash
START_ANTIVIRUS=true \
OC_ASYNC_UPLOADS=true \
OC_ADD_RUN_SERVICES=antivirus \
POSTPROCESSING_STEPS=virusscan \
BEHAT_FEATURE='tests/acceptance/features/apiAntivirus/antivirus.feature' \
make -C tests/acceptance/docker test-opencloud-feature-decomposed-storage
```
#### Tests Transferred From Core (prefix `coreApi`)
Command `make -C tests/acceptance/docker Core-API-Tests-decomposed-storage-3` runs the same tests as the `Core-API-Tests-decomposed-storage-3` CI pipeline, which runs the third (out of ten) group of test suites transferred from core against the OpenCloud server with `decomposed` storage.
And `make -C tests/acceptance/docker Core-API-Tests-decomposeds3-storage-3` runs the third (out of ten) group of test suites transferred from core against the OpenCloud server with `decomposeds3` storage.
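
Collected as runnable commands (the same targets as named above):

```bash
# third (of ten) group of core test suites, decomposed storage
make -C tests/acceptance/docker Core-API-Tests-decomposed-storage-3

# the same group against decomposeds3 storage
make -C tests/acceptance/docker Core-API-Tests-decomposeds3-storage-3
```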
### Run Single Feature Test
The tests for a single feature (a feature file) can also be run against the different storage backends. To do that, multiple make targets with the schema **test-_\<test-source\>_-feature-_\<storage-backend\>_** are available. To select a single feature you have to add an additional `BEHAT_FEATURE=<path-to-feature-file>` parameter when invoking the make command.
For example:
```bash
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature' \
make -C tests/acceptance/docker test-opencloud-feature-decomposed-storage
```
Note:
`BEHAT_FEATURE` must point to a valid feature file.
And to run a single scenario in a feature, you can do:
Note:
A specific scenario from a feature can be run by adding `:<line-number>` at the end of the feature file path. For example, to run the scenario at line 26 of the feature file `apiGraphUserGroup/createUser.feature`, simply add the line number like this: `apiGraphUserGroup/createUser.feature:26`. Note that the line numbers mentioned in the examples might not always point to a scenario, so always check the line numbers before running the test.
```bash
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature:26' \
make -C tests/acceptance/docker test-opencloud-feature-decomposed-storage
```
Similarly, with `decomposeds3` storage:
```bash
# run a whole feature
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature' \
make -C tests/acceptance/docker test-opencloud-feature-decomposeds3-storage
# run a single scenario
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature:26' \
make -C tests/acceptance/docker test-opencloud-feature-decomposeds3-storage
```
In the same way, tests transferred from core can be run as:
```bash
# run a whole feature
BEHAT_FEATURE='tests/acceptance/features/coreApiAuth/webDavAuth.feature' \
make -C tests/acceptance/docker test-core-feature-decomposed-storage
# run a single scenario
BEHAT_FEATURE='tests/acceptance/features/coreApiAuth/webDavAuth.feature:15' \
make -C tests/acceptance/docker test-core-feature-decomposed-storage
```
Note:
The test suites transferred from core are prefixed with `coreApi`.
### OpenCloud Image to Be Tested (Skip Local Image Build)
By default, the tests will be run against the docker image built from your current working state of the OpenCloud repository. For some purposes it might also be handy to use an OpenCloud image from Docker Hub. Therefore, you can provide the optional flag `OC_IMAGE_TAG=...` which must contain an available docker tag of the [opencloud-eu/opencloud registry on Docker Hub](https://hub.docker.com/r/opencloud-eu/opencloud) (e.g. 'latest').
```bash
OC_IMAGE_TAG=latest \
BEHAT_FEATURE='tests/acceptance/features/apiGraphUserGroup/createUser.feature' \
make -C tests/acceptance/docker run-api-tests
make -C tests/acceptance/docker localApiTests-apiGraph-opencloud
```
#### Check Test Logs
### Test Log Output
While a test is running or when it is finished, you can attach to the logs generated by the tests.
@@ -134,49 +171,15 @@ While a test is running or when it is finished, you can attach to the logs gener
make -C tests/acceptance/docker show-test-logs
```
> [!NOTE]
> The log output is opened in `less`. You can navigate up and down with your cursors. By pressing "F" you can follow the latest line of the output.
Note:
The log output is opened in `less`. You can navigate up and down with your cursors. By pressing "F" you can follow the latest line of the output.
#### Cleanup the Setup
### Cleanup
Run the following command to clean all the resources created while running the tests:
During testing we start a redis and an OpenCloud docker container. These are not stopped automatically. You can stop them with:
```bash
make -C tests/acceptance/docker clean-all
```
### Running WOPI Validator Tests
#### Available Test Groups
```text
BaseWopiViewing
CheckFileInfoSchema
EditFlows
Locks
AccessTokens
GetLock
ExtendedLockLength
FileVersion
Features
PutRelativeFile
RenameFileIfCreateChildFileIsNotSupported
```
#### Run Test
```bash
TEST_GROUP=BaseWopiViewing docker compose -f tests/acceptance/docker/src/wopi-validator-test.yml up -d
```
#### Run Test (macOS)
Use the arm image for macOS to run the validator tests.
```bash
WOPI_VALIDATOR_IMAGE=scharfvi/wopi-validator \
TEST_GROUP=BaseWopiViewing \
docker compose -f tests/acceptance/docker/src/wopi-validator-test.yml up -d
make -C tests/acceptance/docker clean
```
## Running Test Suite in Local Environment
@@ -242,7 +245,7 @@ A specific scenario from a feature can be run by adding `:<line-number>` at the
### Use Existing Tests for BDD
As a lot of scenarios are written for core, we can use those tests for Behaviour driven development in OpenCloud.
Every scenario that does not work in OpenCloud with `decomposed` storage is listed in `tests/acceptance/expected-failures-decomposed-storage.md` with a link to the related issue.
Every scenario that does not work in OpenCloud with `decomposed` storage is listed in `tests/acceptance/expected-failures-API-on-decomposed-storage.md` with a link to the related issue.
Those scenarios are run in the ordinary acceptance test pipeline in CI. The scenarios that fail are checked against the
expected failures. If there are any differences then the CI pipeline fails.
@@ -266,7 +269,7 @@ If you want to work on a specific issue
5. remove those tests from the expected failures file
6. make a PR that has the fixed code, and the relevant lines removed from the expected failures file.
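
A minimal sketch of step 5, assuming you already know which scenario now passes (the feature path here is just an example):

```bash
# locate the entry for the now-passing scenario ...
grep -n 'apiGraphUserGroup/createUser.feature' \
  tests/acceptance/expected-failures-API-on-decomposed-storage.md
# ... then delete those lines and commit them together with the code fix
```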
### Running Tests With And Without `remote.php`
## Running Tests With And Without `remote.php`
By default, the tests are run with `remote.php` enabled. If you want to run the tests without `remote.php`, you can disable it by setting the environment variable `WITH_REMOTE_PHP=false` while running the tests.
@@ -276,11 +279,11 @@ TEST_SERVER_URL="https://localhost:9200" \
make test-acceptance-api
```
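
A sketch of a full run with `remote.php` disabled, combining the variable above with the invocation shown in the snippet (the feature path is an arbitrary example):

```bash
WITH_REMOTE_PHP=false \
TEST_SERVER_URL="https://localhost:9200" \
BEHAT_FEATURE="tests/acceptance/features/coreApiAuth/webDavAuth.feature" \
make test-acceptance-api
```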
### Running ENV Config Tests (@env-Config)
## Running ENV Config Tests (@env-Config)
Test suites tagged with `@env-config` are used to test the environment variables that are used to configure OpenCloud. These tests are special tests that require the OpenCloud server to be run using [ocwrapper](https://github.com/opencloud-eu/opencloud/blob/main/tests/ocwrapper/README.md).
#### Run OpenCloud With ocwrapper
### Run OpenCloud With ocwrapper
```bash
# working dir: OpenCloud repo root dir
@@ -298,7 +301,7 @@ PROXY_ENABLE_BASIC_AUTH=true \
./bin/ocwrapper serve --bin=../../opencloud/bin/opencloud
```
#### Run the Tests
### Run the Tests
```bash
OC_WRAPPER_URL=http://localhost:5200 \
@@ -307,7 +310,7 @@ BEHAT_FEATURE=tests/acceptance/features/apiAsyncUpload/delayPostprocessing.featu
make test-acceptance-api
```
#### Writing New ENV Config Tests
### Writing New ENV Config Tests
While writing tests for a new OpenCloud ENV configuration, please make sure to follow these guidelines:
@@ -315,11 +318,11 @@ While writing tests for a new OpenCloud ENV configuration, please make sure to f
2. Use `OcConfigHelper.php` for helper functions - provides functions to reconfigure the running OpenCloud instance.
3. Recommended: add the new step implementations in `OcConfigContext.php`
### Running Test Suite With Email Service (@email)
## Running Test Suite With Email Service (@email)
Test suites that are tagged with `@email` require an email service. We use inbucket as the email service in our tests.
#### Setup Inbucket
### Setup Inbucket
Run the following command to set up inbucket
@@ -327,7 +330,7 @@ Run the following command to setup inbucket
docker run -d -p9000:9000 -p2500:2500 --name inbucket inbucket/inbucket
```
#### Run OpenCloud
### Run OpenCloud
Documentation for environment variables is available [here](https://docs.opencloud.eu/services/notifications/#environment-variables)
@@ -346,7 +349,7 @@ NOTIFICATIONS_SMTP_SENDER="OpenCloud <noreply@example.com>" \
opencloud/bin/opencloud server
```
#### Run the Acceptance Test
### Run the Acceptance Test
Run the acceptance test with the following command:
@@ -358,11 +361,11 @@ BEHAT_FEATURE="tests/acceptance/features/apiNotification/emailNotification.featu
make test-acceptance-api
```
### Running Test Suite With Tika Service (@tikaServiceNeeded)
## Running Test Suite With Tika Service (@tikaServiceNeeded)
Test suites that are tagged with `@tikaServiceNeeded` require tika service.
#### Setup Tika Service
### Setup Tika Service
Run the following docker command to set up the tika service
@@ -370,7 +373,7 @@ Run the following docker command to setup tika service
docker run -d -p 127.0.0.1:9998:9998 apache/tika
```
#### Run OpenCloud
### Run OpenCloud
TODO: Documentation related to the content based search and tika extractor will be added later.
@@ -388,7 +391,7 @@ SEARCH_EXTRACTOR_CS3SOURCE_INSECURE=true \
opencloud/bin/opencloud server
```
#### Run the Acceptance Test
### Run the Acceptance Test
Run the acceptance test with the following command:
@@ -398,15 +401,15 @@ BEHAT_FEATURE="tests/acceptance/features/apiSearchContent/contentSearch.feature"
make test-acceptance-api
```
### Running Test Suite With Antivirus Service (@antivirus)
## Running Test Suite With Antivirus Service (@antivirus)
Test suites that are tagged with `@antivirus` require an antivirus service. TODO: The available antivirus services and the configuration related to them will be added later. This documentation only uses `clamav` as the antivirus.
#### Setup clamAV
### Setup clamAV
**Option 1. Setup Locally**
#### 1. Setup Locally
Linux OS user:
##### Linux OS user
Run the following command to set up clamAV and the clamAV daemon
@@ -423,7 +426,7 @@ sudo service clamav-daemon status
Note:
The commands are Ubuntu-specific and may differ on your system. You can find information about installing clamAV in the official documentation [here](https://docs.clamav.net/manual/Installing/Packages.html).
Mac OS user:
##### Mac OS user
Install ClamAV following the guide [here](https://gist.github.com/mendozao/3ea393b91f23a813650baab9964425b9)
Start the ClamAV daemon
@@ -432,7 +435,7 @@ Start ClamAV daemon
/your/location/to/brew/Cellar/clamav/1.1.0/sbin/clamd
```
**Option 2. Setup clamAV With Docker**
#### 2. Setup clamAV With Docker
Run `clamAV` through docker
@@ -440,7 +443,7 @@ Run `clamAV` through docker
docker run -d -p 3310:3310 opencloudeu/clamav-ci:latest
```
#### Run OpenCloud
### Run OpenCloud
As the `antivirus` service is not enabled by default, we need to enable it when running the OpenCloud server. We also need to enable `async upload`, and since the virus scan is performed in the post-processing step, we need to configure that as well. Documentation for environment variables related to antivirus is available [here](https://docs.opencloud.eu/services/antivirus/#environment-variables)
@@ -466,6 +469,15 @@ For antivirus running localy on Linux OS, use `ANTIVIRUS_CLAMAV_SOCKET= "/var/ru
For antivirus running locally on Mac OS, use `ANTIVIRUS_CLAMAV_SOCKET= "/tmp/clamd.sock"`.
For antivirus running with docker, use `ANTIVIRUS_CLAMAV_SOCKET= "tcp://host.docker.internal:3310"`
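
Putting this together, a local server start for the antivirus tests might look like the following sketch. It mirrors the variables used in the CI config further up; the socket value shown is the docker variant from above, swap it for the Linux or Mac socket as needed:

```bash
# enable the antivirus service, async uploads and the virus scan post-processing step
OC_ADD_RUN_SERVICES=antivirus \
OC_ASYNC_UPLOADS=true \
POSTPROCESSING_STEPS=virusscan \
ANTIVIRUS_SCANNER_TYPE=clamav \
ANTIVIRUS_CLAMAV_SOCKET="tcp://host.docker.internal:3310" \
opencloud/bin/opencloud server
```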
### Create virus files
The antivirus tests require EICAR test files, which are not stored in the repository. They are generated dynamically when needed for testing.
```bash
tests/acceptance/scripts/generate-virus-files.sh
```
#### Run the Acceptance Test
Run the acceptance test with the following command:
@@ -476,11 +488,11 @@ BEHAT_FEATURE="tests/acceptance/features/apiAntivirus/antivirus.feature" \
make test-acceptance-api
```
### Running Test Suite With Federated Sharing (@ocm)
## Running Test Suite With Federated Sharing (@ocm)
Test suites that are tagged with `@ocm` require running two different OpenCloud instances. TODO More detailed information and configuration related to it will be added later.
#### Setup First OpenCloud Instance
### Setup First OpenCloud Instance
```bash
# init OpenCloud
@@ -502,15 +514,15 @@ opencloud/bin/opencloud server
The first OpenCloud instance should be available at: https://localhost:9200/
#### Setup Second OpenCloud Instance
### Setup Second OpenCloud Instance
You can run the second OpenCloud instance in two ways:
**Option 1. Using `.vscode/launch.json`**
#### Using `.vscode/launch.json`
From the `Run and Debug` panel of VSCode, select `Fed OpenCloud Server` and start the debugger.
**Option 2. Using env file**
#### Using env file
```bash
# init OpenCloud
@@ -538,7 +550,7 @@ BEHAT_FEATURE="tests/acceptance/features/apiOcm/ocm.feature" \
make test-acceptance-api
```
### Running Text Preview Tests Containing Unicode Characters
## Running Text Preview Tests Containing Unicode Characters
There are some tests that check the text preview of files containing Unicode characters. By default, the OpenCloud server cannot generate thumbnails of such files correctly, but it provides an environment variable to allow the use of custom fonts that support Unicode characters. So to run such tests successfully, we have to run the OpenCloud server with this environment variable.
@@ -561,15 +573,46 @@ The sample `fontsMap.json` file is located in `tests/config/drone/fontsMap.json`
### Build dev docker
```bash
make -C opencloud dev-docker
make -C opencloud dev-docker
```
### Choose STORAGE_DRIVER
By default, the system uses `posix` storage. However, you can override this by setting the `STORAGE_DRIVER` environment variable.
### Run a script that starts the OpenCloud server in docker and runs the API tests locally (for debugging purposes)
### Run a script that starts the OpenCloud server in docker and runs the API tests locally (for debugging purposes)
```bash
STORAGE_DRIVER=posix ./tests/acceptance/run_api_tests.sh
STORAGE_DRIVER=posix ./tests/acceptance/run_api_tests.sh
```
## Running WOPI Validator Tests
### Available Test Groups
```text
BaseWopiViewing
CheckFileInfoSchema
EditFlows
Locks
AccessTokens
GetLock
ExtendedLockLength
FileVersion
Features
PutRelativeFile
RenameFileIfCreateChildFileIsNotSupported
```
### Run Test
```bash
TEST_GROUP=BaseWopiViewing docker compose -f tests/acceptance/docker/src/wopi-validator-test.yml up -d
```
### For macOS, Use the ARM Image
```bash
WOPI_VALIDATOR_IMAGE=scharfvi/wopi-validator \
TEST_GROUP=BaseWopiViewing \
docker compose -f tests/acceptance/docker/src/wopi-validator-test.yml up -d
```

View File

@@ -62,7 +62,7 @@ class HttpRequestHelper {
/**
*
* @param string $url
* @param string|null $url
* @param string|null $xRequestId
* @param string|null $method
* @param string|null $user
@@ -80,8 +80,8 @@ class HttpRequestHelper {
* @throws GuzzleException
*/
public static function sendRequestOnce(
string $url,
?string $xRequestId = null,
?string $url,
?string $xRequestId,
?string $method = 'GET',
?string $user = null,
?string $password = null,
@@ -100,7 +100,7 @@ class HttpRequestHelper {
$parsedUrl = parse_url($url);
$baseUrl = $parsedUrl['scheme'] . '://' . $parsedUrl['host'];
$baseUrl .= isset($parsedUrl['port']) ? ':' . $parsedUrl['port'] : '';
$testUrl = $baseUrl . "/graph/v1.0/me";
$testUrl = $baseUrl . "/graph/v1.0/user/$user";
if (OcHelper::isTestingOnReva()) {
$url = $baseUrl . "/ocs/v2.php/cloud/users/$user";
}

View File

@@ -23,7 +23,7 @@ namespace TestHelpers;
use GuzzleHttp\Client;
use GuzzleHttp\Cookie\CookieJar;
use GuzzleHttp\Exception\GuzzleException;
use PHPUnit\Framework\Assert;
use Exception;
/**
* Helper for obtaining bearer tokens for users
@@ -143,15 +143,21 @@ class TokenHelper {
]
);
Assert::assertEquals(
200,
$response->getStatusCode(),
'Token refresh failed: Expected status code 200 but received ' . $response->getStatusCode()
);
if ($response->getStatusCode() !== 200) {
throw new Exception(
\sprintf(
'Token refresh failed: Expected status code 200 but received %d. Message: %s',
$response->getStatusCode(),
$response->getReasonPhrase()
)
);
}
$data = json_decode($response->getBody()->getContents(), true);
Assert::assertArrayHasKey('access_token', $data, 'Missing access_token in refresh response');
Assert::assertArrayHasKey('refresh_token', $data, 'Missing refresh_token in refresh response');
if (!isset($data['access_token']) || !isset($data['refresh_token'])) {
throw new Exception('Missing tokens in refresh response');
}
return [
'access_token' => $data['access_token'],
@@ -247,24 +253,21 @@ class TokenHelper {
): string {
$response = self::makeLoginRequest($username, $password, $baseUrl, $cookieJar);
Assert::assertEquals(
200,
$response->getStatusCode(),
'Logon failed: Expected status code 200 but received: ' . $response->getStatusCode()
);
if ($response->getStatusCode() !== 200) {
throw new Exception(
\sprintf(
'Logon failed: Expected status code 200 but received %d. Message: %s',
$response->getStatusCode(),
$response->getReasonPhrase()
)
);
}
$data = json_decode($response->getBody()->getContents(), true);
Assert::assertArrayHasKey(
'hello',
$data,
'Logon response does not contain "hello" object'
);
Assert::assertArrayHasKey(
'continue_uri',
$data['hello'],
'Missing continue_uri in logon response'
);
if (!isset($data['hello']['continue_uri'])) {
throw new Exception('Missing continue_uri in logon response');
}
return $data['hello']['continue_uri'];
}
@@ -306,17 +309,42 @@ class TokenHelper {
]
);
Assert::assertEquals(
302,
$response->getStatusCode(),
'Authorization request failed: Expected status code 302 but received: ' . $response->getStatusCode()
);
if ($response->getStatusCode() !== 302) {
// Include the response body to help debug unexpected status codes
$body = $response->getBody()->getContents();
throw new Exception(
\sprintf(
'Authorization failed: Expected status code 302 but received %d. Message: %s. Body: %s',
$response->getStatusCode(),
$response->getReasonPhrase(),
$body
)
);
}
$location = $response->getHeader('Location')[0] ?? '';
Assert::assertNotEmpty($location, 'Missing Location header in authorization response');
if (empty($location)) {
throw new Exception('Missing Location header in authorization response');
}
parse_str(parse_url($location, PHP_URL_QUERY), $queryParams);
Assert::assertArrayHasKey('code', $queryParams, 'Missing code parameter in redirect URL');
// Check for errors
if (isset($queryParams['error'])) {
throw new Exception(
\sprintf(
'Authorization error: %s - %s',
$queryParams['error'],
urldecode($queryParams['error_description'] ?? 'No description')
)
);
}
if (!isset($queryParams['code'])) {
throw new Exception('Missing auth code in redirect URL. Location: ' . $location);
}
return $queryParams['code'];
}
@@ -355,15 +383,21 @@ class TokenHelper {
]
);
Assert::assertEquals(
200,
$response->getStatusCode(),
'Token request failed: Expected status code 200 but received: ' . $response->getStatusCode()
);
if ($response->getStatusCode() !== 200) {
throw new Exception(
\sprintf(
'Token request failed: Expected status code 200 but received %d. Message: %s',
$response->getStatusCode(),
$response->getReasonPhrase()
)
);
}
$data = json_decode($response->getBody()->getContents(), true);
Assert::assertArrayHasKey('access_token', $data, 'Missing access_token in token response');
Assert::assertArrayHasKey('refresh_token', $data, 'Missing refresh_token in token response');
if (!isset($data['access_token']) || !isset($data['refresh_token'])) {
throw new Exception('Missing tokens in response');
}
return [
'access_token' => $data['access_token'],

View File

@@ -195,20 +195,13 @@ class UploadHelper extends Assert {
}
/**
* get the path of the acceptance tests directory
* get the path of a file from the filesForUpload directory
*
* @param string|null $name name of the file to upload
*
* @return string
*/
public static function getAcceptanceTestsDir(): string {
return \dirname(__FILE__) . "/../";
}
/**
* get the path of the filesForUpload directory
*
* @return string
*/
public static function getFilesForUploadDir(): string {
return \dirname(__FILE__) . "/../filesForUpload/";
public static function getUploadFilesDir(?string $name): string {
return \getenv("FILES_FOR_UPLOAD") . $name;
}
}

View File

@@ -25,7 +25,6 @@ use PHPUnit\Framework\Assert;
use Psr\Http\Message\ResponseInterface;
use TestHelpers\WebDavHelper;
use TestHelpers\BehatHelper;
use TestHelpers\UploadHelper;
require_once 'bootstrap.php';
@@ -50,7 +49,7 @@ class ChecksumContext implements Context {
string $checksum
): ResponseInterface {
$file = \file_get_contents(
UploadHelper::getAcceptanceTestsDir() . $source
$this->featureContext->acceptanceTestsDirLocation() . $source
);
return $this->featureContext->makeDavRequest(
$user,

View File

@@ -37,13 +37,11 @@ use TestHelpers\SetupHelper;
use TestHelpers\HttpRequestHelper;
use TestHelpers\HttpLogger;
use TestHelpers\OcHelper;
use TestHelpers\StorageDriver;
use TestHelpers\GraphHelper;
use TestHelpers\WebDavHelper;
use TestHelpers\SettingsHelper;
use TestHelpers\OcConfigHelper;
use TestHelpers\BehatHelper;
use TestHelpers\UploadHelper;
use Swaggest\JsonSchema\InvalidValue as JsonSchemaException;
use Swaggest\JsonSchema\Exception\ArrayException;
use Swaggest\JsonSchema\Exception\ConstException;
@@ -563,38 +561,6 @@ class FeatureContext extends BehatVariablesContext {
}
}
/**
* @BeforeScenario @antivirus
*
* @return void
* @throws Exception
*/
public function createTestVirusFiles(): void {
$uploadDir = UploadHelper::getFilesForUploadDir() . 'filesWithVirus/';
$virusFile = $uploadDir . 'eicar.com';
$virusZipFile = $uploadDir . 'eicar_com.zip';
if (file_exists($virusFile) && file_exists($virusZipFile)) {
return;
}
if (!is_dir($uploadDir)) {
mkdir($uploadDir, 0755);
}
$res1 = HttpRequestHelper::sendRequestOnce('https://secure.eicar.org/eicar.com');
if ($res1->getStatusCode() !== 200) {
throw new Exception("Could not download eicar.com test virus file");
}
file_put_contents($virusFile, $res1->getBody()->getContents());
$res2 = HttpRequestHelper::sendRequestOnce('https://secure.eicar.org/eicar_com.zip');
file_put_contents($virusZipFile, $res2->getBody()->getContents());
if ($res2->getStatusCode() !== 200) {
throw new Exception("Could not download eicar_com.zip test virus file");
}
}
/**
*
* @BeforeScenario
@@ -2629,11 +2595,18 @@ class FeatureContext extends BehatVariablesContext {
return "work_tmp";
}
/**
* @return string
*/
public function acceptanceTestsDirLocation(): string {
return \dirname(__FILE__) . "/../";
}
/**
* @return string
*/
public function workStorageDirLocation(): string {
return UploadHelper::getAcceptanceTestsDir() . $this->temporaryStorageSubfolderName() . "/";
return $this->acceptanceTestsDirLocation() . $this->temporaryStorageSubfolderName() . "/";
}
/**
@@ -2994,10 +2967,10 @@ class FeatureContext extends BehatVariablesContext {
public static function isExpectedToFail(string $scenarioLine): bool {
$expectedFailFile = \getenv('EXPECTED_FAILURES_FILE');
if (!$expectedFailFile) {
if (OcHelper::getStorageDriver() === StorageDriver::POSIX) {
$expectedFailFile = __DIR__ . '/../expected-failures-posix-storage.md';
$expectedFailFile = __DIR__ . '/../expected-failures-localAPI-on-decomposed-storage.md';
if (\strpos($scenarioLine, "coreApi") === 0) {
$expectedFailFile = __DIR__ . '/../expected-failures-API-on-decomposed-storage.md';
}
$expectedFailFile = __DIR__ . '/../expected-failures-decomposed-storage.md';
}
$reader = \fopen($expectedFailFile, 'r');

View File

@@ -26,7 +26,6 @@ use Psr\Http\Message\ResponseInterface;
use TestHelpers\HttpRequestHelper;
use TestHelpers\WebDavHelper;
use TestHelpers\BehatHelper;
use TestHelpers\UploadHelper;
require_once 'bootstrap.php';
@@ -863,7 +862,7 @@ class PublicWebDavContext implements Context {
string $destination,
): void {
$content = \file_get_contents(
UploadHelper::getAcceptanceTestsDir() . $source
$this->featureContext->acceptanceTestsDirLocation() . $source
);
$response = $this->publicUploadContent(
$destination,
@@ -889,7 +888,7 @@ class PublicWebDavContext implements Context {
string $password
): void {
$content = \file_get_contents(
UploadHelper::getAcceptanceTestsDir() . $source
$this->featureContext->acceptanceTestsDirLocation() . $source
);
$response = $this->publicUploadContent(
$destination,

View File

@@ -32,7 +32,6 @@ use Psr\Http\Message\ResponseInterface;
use TestHelpers\HttpRequestHelper;
use TestHelpers\WebDavHelper;
use TestHelpers\BehatHelper;
use TestHelpers\UploadHelper;
require_once 'bootstrap.php';
@@ -365,7 +364,7 @@ class TUSContext implements Context {
$client->setChecksumAlgorithm('sha1');
$client->setApiPath(WebDavHelper::getDavPath($davPathVersion, $suffixPath));
$client->setMetadata($uploadMetadata);
$sourceFile = UploadHelper::getAcceptanceTestsDir() . $source;
$sourceFile = $this->featureContext->acceptanceTestsDirLocation() . $source;
$client->setKey((string)rand())->file($sourceFile, $destination);
$this->featureContext->pauseUploadDelete();
@@ -519,7 +518,7 @@ class TUSContext implements Context {
*/
public function writeDataToTempFile(string $content): string {
$temporaryFileName = \tempnam(
UploadHelper::getAcceptanceTestsDir(),
$this->featureContext->acceptanceTestsDirLocation(),
"tus-upload-test-"
);
if ($temporaryFileName === false) {

View File

@@ -1325,6 +1325,20 @@ trait WebDav {
if ($statusCode === 404 || $statusCode === 405) {
return;
}
// After a MOVE, the source path might still be visible for a short time.
// We wait 1 second and retry once to avoid flaky failures.
if ($statusCode === 207) {
sleep(1);
$response = $this->listFolder(
$user,
$path,
'0',
null,
null,
$type
);
$statusCode = $response->getStatusCode();
}
if ($statusCode === 207) {
$responseXmlObject = HttpRequestHelper::getResponseXml(
$response,
@@ -1648,7 +1662,7 @@ trait WebDav {
?bool $isGivenStep = false
): ResponseInterface {
$user = $this->getActualUsername($user);
$file = \fopen(UploadHelper::getAcceptanceTestsDir() . $source, 'r');
$file = \fopen($this->acceptanceTestsDirLocation() . $source, 'r');
$this->pauseUploadDelete();
$response = $this->makeDavRequest(
$user,
@@ -1781,7 +1795,7 @@ trait WebDav {
}
return $this->uploadFileWithHeaders(
$user,
UploadHelper::getAcceptanceTestsDir() . $source,
$this->acceptanceTestsDirLocation() . $source,
$destination,
$headers,
$noOfChunks
@@ -2222,7 +2236,7 @@ trait WebDav {
$this->getBaseUrl(),
$user,
$this->getPasswordForUser($user),
UploadHelper::getAcceptanceTestsDir() . $source,
$this->acceptanceTestsDirLocation() . $source,
$destination,
$this->getStepLineRef(),
["X-OC-Mtime" => $mtime],
@@ -2257,7 +2271,7 @@ trait WebDav {
$this->getBaseUrl(),
$user,
$this->getPasswordForUser($user),
UploadHelper::getAcceptanceTestsDir() . $source,
$this->acceptanceTestsDirLocation() . $source,
$destination,
$this->getStepLineRef(),
["X-OC-Mtime" => $mtime],

View File

@@ -18,7 +18,7 @@ log_success() {
SCRIPT_PATH=$(dirname "$0")
PATH_TO_SUITES="${SCRIPT_PATH}/features"
EXPECTED_FAILURE_FILES=("expected-failures-decomposed-storage.md" "expected-failures-posix-storage.md" "expected-failures-without-remotephp.md")
EXPECTED_FAILURE_FILES=("expected-failures-localAPI-on-decomposed-storage.md" "expected-failures-API-on-decomposed-storage.md" "expected-failures-without-remotephp.md")
# contains all the suites names inside tests/acceptance/features
AVAILABLE_SUITES=($(ls -l "$PATH_TO_SUITES" | grep '^d' | awk '{print $NF}'))

View File

@@ -3,9 +3,9 @@ default:
"": "%paths.base%/../bootstrap"
suites:
apiSpaces:
apiAccountsHashDifficulty:
paths:
- "%paths.base%/../features/apiSpaces"
- "%paths.base%/../features/apiAccountsHashDifficulty"
context: &common_ldap_suite_context
parameters:
ldapAdminPassword: admin
@@ -18,6 +18,21 @@ default:
adminPassword: admin
regularUserPassword: 123456
- SettingsContext:
- GraphContext:
- SpacesContext:
- CapabilitiesContext:
- FilesVersionsContext:
- NotificationContext:
- OCSContext:
- PublicWebDavContext:
apiSpaces:
paths:
- "%paths.base%/../features/apiSpaces"
context: *common_ldap_suite_context
contexts:
- FeatureContext: *common_feature_context_params
- SettingsContext:
- SpacesContext:
- CapabilitiesContext:
- FilesVersionsContext:
@@ -427,7 +442,7 @@ default:
- AuthAppContext:
- CliContext:
- OcConfigContext:
apiTenancy:
paths:
- "%paths.base%/../features/apiTenancy"

View File

@@ -1,4 +1,3 @@
.ONESHELL:
SHELL := bash
# define standard colors
@@ -6,43 +5,47 @@ BLACK := $(shell tput -Txterm setaf 0)
RED := $(shell tput -Txterm setaf 1)
GREEN := $(shell tput -Txterm setaf 2)
YELLOW := $(shell tput -Txterm setaf 3)
BLUE := $(shell tput -Txterm setaf 4)
LIGHTPURPLE := $(shell tput -Txterm setaf 4)
PURPLE := $(shell tput -Txterm setaf 5)
CYAN := $(shell tput -Txterm setaf 6)
BLUE := $(shell tput -Txterm setaf 6)
WHITE := $(shell tput -Txterm setaf 7)
RESET := $(shell tput -Txterm sgr0)
COMPOSE_FILE := src/opencloud-base.yml
## default values only for sub-make calls
ifeq ($(LOCAL_TEST),true)
COMPOSE_FILE ?= src/opencloud-base.yml:src/tika.yml
ifeq ($(START_EMAIL),true)
COMPOSE_FILE := $(COMPOSE_FILE):src/email.yml
endif
else
COMPOSE_FILE ?= src/redis.yml:src/opencloud-base.yml:src/acceptance.yml
endif
## user input
BEHAT_FEATURE ?=
ifdef OC_IMAGE_TAG
BUILD_DEV_IMAGE := 0
else
BUILD_DEV_IMAGE := 1
endif
OC_IMAGE_TAG ?= dev
# run tests with ocwrapper by default
WITH_WRAPPER ?= true
OC_WRAPPER := ../../ocwrapper/bin/ocwrapper
# enable tika for full text extraction
ifeq ($(START_TIKA),true)
COMPOSE_FILE := src/tika.yml:$(COMPOSE_FILE)
export SEARCH_EXTRACTOR_TYPE := tika
ifdef START_TIKA
ifeq ($(START_TIKA),true)
COMPOSE_FILE := $(COMPOSE_FILE):src/tika.yml
SEARCH_EXTRACTOR_TYPE := tika
else
SEARCH_EXTRACTOR_TYPE := basic
endif
else
export SEARCH_EXTRACTOR_TYPE := basic
endif
# enable email server
ifeq ($(START_EMAIL),true)
COMPOSE_FILE := src/email.yml:$(COMPOSE_FILE)
export OC_ADD_RUN_SERVICES := notifications
endif
# enable antivirus
ifeq ($(START_ANTIVIRUS),true)
COMPOSE_FILE := src/antivirus.yml:$(COMPOSE_FILE)
export OC_ADD_RUN_SERVICES := $(OC_ADD_RUN_SERVICES) antivirus
export POSTPROCESSING_STEPS := virusscan
endif
# enable wopi services
ifeq ($(ENABLE_WOPI),true)
COMPOSE_FILE := $(COMPOSE_FILE):src/wopi.yml
SEARCH_EXTRACTOR_TYPE := basic
endif
# default to posix
@@ -50,92 +53,223 @@ STORAGE_DRIVER ?= posix
ifeq ($(STORAGE_DRIVER),posix)
# posix requires an additional driver config
COMPOSE_FILE := $(COMPOSE_FILE):src/posix.yml
else ifeq ($(STORAGE_DRIVER),decomposeds3)
COMPOSE_FILE := src/ceph.yml:$(COMPOSE_FILE)
endif
# use latest as default tag if OC_IMAGE is provided but no tag is set
ifneq ($(strip $(OC_IMAGE)),)
ifeq ($(strip $(OC_IMAGE_TAG)),)
OC_IMAGE_TAG := latest
endif
endif
# static
DIVIDE_INTO_NUM_PARTS := 10
PARTS = 1 2 3 4 5 6 7 8 9 10
LOCAL_API_SUITES = $(shell ls ../features | grep '^api')
COMPOSE_PROJECT_NAME := opencloud-acceptance-tests
# Export variables for sub-make calls
export COMPOSE_PROJECT_NAME
export COMPOSE_FILE
export OC_IMAGE
export OC_IMAGE_TAG
# test configurations
export STORAGE_DRIVER
export WITH_WRAPPER
export BEHAT_SUITE
export BEHAT_FEATURE
export TEST_SERVER_URL
export USE_BEARER_TOKEN
## make definition
.PHONY: help
help:
@echo -e "Test suites: ${CYAN}https://github.com/opencloud-eu/opencloud/tree/main/tests/acceptance/features${RESET}"
@echo -e "Testing docs: ${CYAN}https://github.com/opencloud-eu/opencloud/tree/main/tests/README.md${RESET}"
@echo "Please use 'make <target>' where <target> is one of the following:"
@echo
@echo "Available commands (targets):"
@echo -e " ${GREEN}run-api-tests\t\t${RESET}Build dev image, start services and run the tests."
@echo -e " ${GREEN}start-services\t${RESET}Start service containers."
@echo -e " ${GREEN}run-test-only\t\t${RESET}Run the tests only."
@echo -e " ${GREEN}show-test-logs\t${RESET}Show the test logs."
@echo -e " ${GREEN}ps\t\t\t${RESET}Show the running containers."
@echo -e "${PURPLE}docs: https://docs.opencloud.eu/opencloud/development/testing/#testing-with-test-suite-in-docker${RESET}\n"
@echo
@echo -e " ${YELLOW}clean-dev-image\t${RESET}Delete the docker image built during acceptance tests."
@echo -e " ${YELLOW}clean-containers\t${RESET}Delete the docker containers and volumes."
@echo -e " ${YELLOW}clean-tests\t\t${RESET}Delete test dependencies and results."
@echo -e " ${YELLOW}clean-all\t\t${RESET}Clean all resources: images, containers, volumes, test files."
@echo -e "OpenCloud feature tests and test suites can be found here:"
@echo -e "\thttps://github.com/opencloud-eu/opencloud/tree/main/tests/acceptance/features"
@echo
@echo "Available environment variables:"
@echo -e " ${PURPLE}OC_IMAGE\t\t${RESET}${CYAN}[image_repo]${RESET} OpenCloud image to use. If provided, the dev image build is skipped."
@echo -e " ${PURPLE}OC_IMAGE_TAG\t\t${RESET}${CYAN}[image_tag]${RESET} OpenCloud image tag to use. If provided, the dev image build is skipped."
@echo -e " ${PURPLE}WITH_WRAPPER\t\t${RESET}${CYAN}[true|false]${RESET} Start OpenCloud server using ocwrapper. Default: ${YELLOW}true${RESET}"
@echo -e " ${PURPLE}STORAGE_DRIVER\t${RESET}${CYAN}[posix|decomposed|decomposeds3]${RESET} Storage driver to use. Default: ${YELLOW}posix${RESET}"
@echo -e " ${PURPLE}BEHAT_FEATURE\t\t${RESET}${RESET}${CYAN}[path]${RESET} Path to a feature file. Example: ${YELLOW}tests/acceptance/features/apiGraph/changeRole.feature${RESET}"
@echo -e " ${PURPLE}BEHAT_SUITE\t\t${RESET}${RESET}${CYAN}[suite_name]${RESET} Test suite to run. Example: ${YELLOW}apiGraph${RESET}"
@echo -e " ${PURPLE}TEST_SERVER_URL\t${RESET}${CYAN}[url]${RESET} URL of the OpenCloud server to test against."
@echo -e " ${PURPLE}USE_BEARER_TOKEN\t${RESET}${CYAN}[true|false]${RESET} Use a bearer token for authentication. Default: ${YELLOW}false${RESET}"
@echo -e "test suites that test core compatibility are found here and they start with prefix coreApi-:"
@echo -e "\thttps://github.com/opencloud-eu/opencloud/tree/main/tests/acceptance/features"
@echo
@echo -e "Example usage:"
@echo -e " ${PURPLE}WITH_WRAPPER${RESET}=${YELLOW}false${RESET} \\"
@echo -e " ${PURPLE}STORAGE_DRIVER${RESET}=${YELLOW}posix${RESET} \\"
@echo -e " ${PURPLE}BEHAT_FEATURE${RESET}=${YELLOW}tests/acceptance/features/apiGraph/changeRole.feature${RESET} \\"
@echo -e " make ${GREEN}run-api-tests${RESET}"
@echo -e "The OpenCloud to be tested will be build from your current working state."
@echo -e "You also can select the OpenCloud Docker image for all tests by setting"
@echo -e "\tmake ... ${YELLOW}OC_IMAGE_TAG=latest${RESET}"
@echo -e "where ${YELLOW}latest${RESET} is an example for any valid Docker image tag from"
@echo -e "https://hub.docker.com/r/opencloud-eu/opencloud."
@echo
@echo -e "${GREEN}Run full OpenCloud test suites with decomposed storage:${RESET}\n"
@echo -e "\tmake localApiTests-apiAccountsHashDifficulty-decomposed\t\t${BLUE}run apiAccountsHashDifficulty test suite, where available test suite are apiAccountsHashDifficulty apiArchiver apiContract apiGraph apiSpaces apiSpacesShares apiAsyncUpload apiCors${RESET}"
@echo
@echo -e "${GREEN}Run full OpenCloud test suites with decomposeds3 storage:${RESET}\n"
@echo -e "\tmake localApiTests-apiAccountsHashDifficulty-decomposeds3\t\t${BLUE}run apiAccountsHashDifficulty test suite, where available test suite are apiAccountsHashDifficulty apiArchiver apiContract apiGraph apiSpaces apiSpacesShares apiAsyncUpload apiCors${RESET}"
@echo
@echo -e "${GREEN}Run full OpenCloud test suites with decomposed storage:${RESET}\n"
@echo -e "\tmake Core-API-Tests-decomposed-storage-${RED}X${RESET}\t\t${BLUE}run test suite number X, where ${RED}X = 1 .. 10${RESET}"
@echo
@echo -e "${GREEN}Run full OpenCloud test suites with decomposeds3 storage:${RESET}\n"
@echo -e "\tmake Core-API-Tests-decomposeds3-storage-${RED}X${RESET}\t\t${BLUE}run test suite number X, where ${RED}X = 1 .. 10${RESET}"
@echo
@echo -e "${GREEN}Run an OpenCloud feature test with decomposed storage:${RESET}\n"
@echo -e "\tmake test-opencloud-feature-decomposed-storage ${YELLOW}BEHAT_FEATURE='...'${RESET}\t${BLUE}run single feature test${RESET}"
@echo
@echo -e "\twhere ${YELLOW}BEHAT_FEATURE='...'${RESET} contains a relative path to the feature definition."
@echo -e "\texample: ${RED}tests/acceptance/features/apiAccountsHashDifficulty/addUser.feature${RESET}"
@echo
@echo -e "${GREEN}Run an OpenCloud feature test with decomposeds3 storage:${RESET}\n"
@echo -e "\tmake test-opencloud-feature-decomposeds3-storage ${YELLOW}BEHAT_FEATURE='...'${RESET}\t${BLUE}run single feature test${RESET}"
@echo
@echo -e "\twhere ${YELLOW}BEHAT_FEATURE='...'${RESET} contains a relative path to the feature definition."
@echo -e "\texample: ${RED}tests/acceptance/features/apiAccountsHashDifficulty/addUser.feature${RESET}"
@echo
@echo -e "\twhere ${YELLOW}BEHAT_FEATURE='...'${RESET} contains a relative path to the feature definition."
@echo -e "\texample: ${RED}tests/acceptance/features/apiAccountsHashDifficulty/addUser.feature${RESET}"
@echo
@echo -e "${GREEN}Run a core test against OpenCloud with decomposed storage:${RESET}\n"
@echo -e "\tmake test-core-feature-decomposed-storage ${YELLOW}BEHAT_FEATURE='...'${RESET}\t${BLUE}run single feature test${RESET}"
@echo
@echo -e "\twhere ${YELLOW}BEHAT_FEATURE='...'${RESET} contains a relative path to the feature definition."
@echo -e "\texample: ${RED}tests/acceptance/features/coreApiAuth/webDavAuth.feature${RESET}"
@echo
@echo -e "${GREEN}Run a core test against OpenCloud with decomposeds3 storage:${RESET}\n"
@echo -e "\tmake test-core-feature-decomposeds3-storage ${YELLOW}BEHAT_FEATURE='...'${RESET}\t${BLUE}run single feature test${RESET}"
@echo
@echo -e "\twhere ${YELLOW}BEHAT_FEATURE='...'${RESET} contains a relative path to the feature definition."
@echo -e "\texample: ${RED}tests/acceptance/features/coreApiAuth/webDavAuth.feature${RESET}"
@echo
@echo -e "\twhere ${YELLOW}BEHAT_FEATURE='...'${RESET} contains a relative path to the feature definition."
@echo -e "\texample: ${RED}tests/acceptance/features/coreApiAuth/webDavAuth.feature${RESET}"
@echo
@echo
@echo -e "${GREEN}Show output of tests:${RESET}\n"
@echo -e "\tmake show-test-logs\t\t${BLUE}show output of running or finished tests${RESET}"
@echo
@echo
@echo -e "${GREEN}Clean up after testing:${RESET}\n"
@echo -e "\tmake clean\t${BLUE}clean up all${RESET}"
@echo -e "\tmake clean-docker-container\t\t${BLUE}stops and removes used docker containers${RESET}"
@echo -e "\tmake clean-docker-volumes\t\t${BLUE}removes used docker volumes (used for caching)${RESET}"
@echo
.PHONY: test-opencloud-feature-decomposed-storage
test-opencloud-feature-decomposed-storage: ## test an OpenCloud feature with decomposed storage, usage: make ... BEHAT_FEATURE='tests/acceptance/features/apiAccountsHashDifficulty/addUser.feature:10'
@TEST_SOURCE=opencloud \
STORAGE_DRIVER=decomposed \
BEHAT_FEATURE=$(BEHAT_FEATURE) \
$(MAKE) --no-print-directory testSuite
.PHONY: run-api-tests
run-api-tests: $(OC_WRAPPER) build-dev-image clean-containers
@echo "${BLUE}[INFO]${RESET} Compose project: ${YELLOW}$(COMPOSE_PROJECT_NAME)${RESET}"
@echo "${BLUE}[INFO]${RESET} Compose file: ${YELLOW}$(COMPOSE_FILE)${RESET}"
@echo "${BLUE}[INFO]${RESET} Using storage driver: ${YELLOW}$(STORAGE_DRIVER)${RESET}"
# force use local server when using this command
export TEST_SERVER_URL=https://opencloud-server:9200
$(MAKE) --no-print-directory start-services
$(MAKE) --no-print-directory run-test-only
.PHONY: test-opencloud-feature-decomposeds3-storage
test-opencloud-feature-decomposeds3-storage: ## test an OpenCloud feature with decomposeds3 storage, usage: make ... BEHAT_FEATURE='tests/acceptance/features/apiAccountsHashDifficulty/addUser.feature:10'
@TEST_SOURCE=opencloud \
STORAGE_DRIVER=decomposeds3 \
BEHAT_FEATURE=$(BEHAT_FEATURE) \
START_CEPH=1 \
$(MAKE) --no-print-directory testSuite
.PHONY: start-services
start-services: $(OC_WRAPPER) ## start services
.PHONY: test-opencloud-feature-posix-storage
test-opencloud-feature-posix-storage: ## test an OpenCloud feature with posix storage, usage: make ... BEHAT_FEATURE='tests/acceptance/features/apiAccountsHashDifficulty/addUser.feature:10'
@TEST_SOURCE=opencloud \
STORAGE_DRIVER=posix \
BEHAT_FEATURE=$(BEHAT_FEATURE) \
$(MAKE) --no-print-directory testSuite
.PHONY: test-core-feature-decomposed-storage
test-core-feature-decomposed-storage: ## test a core feature with decomposed storage, usage: make ... BEHAT_FEATURE='tests/acceptance/features/coreApiAuth/webDavAuth.feature'
@TEST_SOURCE=core \
STORAGE_DRIVER=decomposed \
BEHAT_FEATURE=$(BEHAT_FEATURE) \
$(MAKE) --no-print-directory testSuite
.PHONY: test-core-feature-decomposeds3-storage
test-core-feature-decomposeds3-storage: ## test a core feature with decomposeds3 storage, usage: make ... BEHAT_FEATURE='tests/acceptance/features/coreApiAuth/webDavAuth.feature'
@TEST_SOURCE=core \
STORAGE_DRIVER=decomposeds3 \
BEHAT_FEATURE=$(BEHAT_FEATURE) \
START_CEPH=1 \
$(MAKE) --no-print-directory testSuite
.PHONY: test-core-feature-posix-storage
test-core-feature-posix-storage: ## test a core feature with posix storage, usage: make ... BEHAT_FEATURE='tests/acceptance/features/apiAccountsHashDifficulty/addUser.feature:10'
@TEST_SOURCE=core \
STORAGE_DRIVER=posix \
BEHAT_FEATURE=$(BEHAT_FEATURE) \
$(MAKE) --no-print-directory testSuite
localSuiteOpencloud = $(addprefix localApiTests-, $(addsuffix -decomposed,${LOCAL_API_SUITES}))
.PHONY: $(localSuiteOpencloud)
$(localSuiteOpencloud): ## run local api test suite with decomposed storage
@$(eval BEHAT_SUITE=$(shell echo "$@" | cut -d'-' -f2))
@TEST_SOURCE=opencloud \
STORAGE_DRIVER=decomposed \
BEHAT_SUITE=$(BEHAT_SUITE) \
$(MAKE) --no-print-directory testSuite
localSuiteDecomposedS3 = $(addprefix localApiTests-, $(addsuffix -decomposeds3,${LOCAL_API_SUITES}))
.PHONY: $(localSuiteDecomposedS3)
$(localSuiteDecomposedS3): ## run local api test suite with s3 storage
@$(eval BEHAT_SUITE=$(shell echo "$@" | cut -d'-' -f2))
@TEST_SOURCE=opencloud \
STORAGE_DRIVER=decomposeds3 \
BEHAT_SUITE=$(BEHAT_SUITE) \
$(MAKE) --no-print-directory testSuite
localSuitePosix = $(addprefix localApiTests-, $(addsuffix -posix,${LOCAL_API_SUITES}))
.PHONY: $(localSuitePosix)
$(localSuitePosix): ## run local api test suite with posix storage
@$(eval BEHAT_SUITE=$(shell echo "$@" | cut -d'-' -f2))
@TEST_SOURCE=opencloud \
STORAGE_DRIVER=posix \
BEHAT_SUITE=$(BEHAT_SUITE) \
$(MAKE) --no-print-directory testSuite
targetsOC = $(addprefix Core-API-Tests-decomposed-storage-,$(PARTS))
.PHONY: $(targetsOC)
$(targetsOC):
@$(eval RUN_PART=$(shell echo "$@" | tr -dc '0-9'))
@TEST_SOURCE=core \
STORAGE_DRIVER=decomposed \
RUN_PART=$(RUN_PART) \
$(MAKE) --no-print-directory testSuite
targetsDecomposedS3 = $(addprefix Core-API-Tests-decomposeds3-storage-,$(PARTS))
.PHONY: $(targetsDecomposedS3)
$(targetsDecomposedS3):
@$(eval RUN_PART=$(shell echo "$@" | tr -dc '0-9'))
@TEST_SOURCE=core \
STORAGE_DRIVER=decomposeds3 \
RUN_PART=$(RUN_PART) \
$(MAKE) --no-print-directory testSuite
.PHONY: testSuite
testSuite: $(OC_WRAPPER) build-dev-image clean-docker-container
@if [ -n "${START_CEPH}" ]; then \
COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
COMPOSE_FILE=src/ceph.yml \
docker compose run start_ceph; \
fi; \
if [ "${START_EMAIL}" == "true" ]; then \
COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
COMPOSE_FILE=src/email.yml \
docker compose run start_email; \
fi; \
if [ "${START_ANTIVIRUS}" == "true" ]; then \
COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
COMPOSE_FILE=src/antivirus.yml \
docker compose run start_antivirus; \
fi; \
if [ "${START_TIKA}" == "true" ]; then \
COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
COMPOSE_FILE=src/tika.yml \
docker compose run tika-service; \
fi; \
COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
COMPOSE_FILE=$(COMPOSE_FILE) \
STORAGE_DRIVER=$(STORAGE_DRIVER) \
TEST_SOURCE=$(TEST_SOURCE) \
WITH_WRAPPER=$(WITH_WRAPPER) \
OC_ASYNC_UPLOADS=$(OC_ASYNC_UPLOADS) \
OC_ADD_RUN_SERVICES=$(OC_ADD_RUN_SERVICES) \
POSTPROCESSING_STEPS=$(POSTPROCESSING_STEPS) \
SEARCH_EXTRACTOR_TYPE=$(SEARCH_EXTRACTOR_TYPE) \
OC_IMAGE_TAG=$(OC_IMAGE_TAG) \
BEHAT_SUITE=$(BEHAT_SUITE) \
BEHAT_FEATURE=$(BEHAT_FEATURE) \
DIVIDE_INTO_NUM_PARTS=$(DIVIDE_INTO_NUM_PARTS) \
RUN_PART=$(RUN_PART) \
docker compose up -d --build --force-recreate
.PHONY: run-test-only
run-test-only:
docker compose -f src/acceptance.yml up
.PHONY: show-test-logs
show-test-logs: ## show test logs
show-test-logs: ## show logs of test
@COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
COMPOSE_FILE=$(COMPOSE_FILE) \
docker compose logs --no-log-prefix -f acceptance-tests | less
.PHONY: ps
ps: ## show running containers
ps: ## show docker status
@COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
COMPOSE_FILE=$(COMPOSE_FILE) \
docker compose ps
$(OC_WRAPPER):
@@ -145,21 +279,55 @@ $(OC_WRAPPER):
.PHONY: build-dev-image
build-dev-image:
@if [ -z "$(OC_IMAGE)" ] && [ -z "$(OC_IMAGE_TAG)" ]; then \
@if [ $(BUILD_DEV_IMAGE) -eq 1 ]; then \
$(MAKE) --no-print-directory -C ../../../opencloud dev-docker \
; fi;
.PHONY: clean-dev-image
clean-dev-image: ## clean docker image built during acceptance tests
@docker image rm opencloudeu/opencloud:dev || true
.PHONY: clean-dev-docker-image
clean-dev-docker-image: ## clean docker image built during acceptance tests
@docker image rm opencloud-eu/opencloud:dev || true
.PHONY: clean-containers
clean-containers: ## clean docker containers created during acceptance tests
.PHONY: clean-docker-container
clean-docker-container: ## clean docker containers created during acceptance tests
@COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
COMPOSE_FILE=$(COMPOSE_FILE) \
BEHAT_SUITE="" \
DIVIDE_INTO_NUM_PARTS="" \
OC_IMAGE_TAG="" \
RUN_PART="" \
STORAGE_DRIVER="" \
TEST_SOURCE="" \
docker compose down --remove-orphans
.PHONY: clean-docker-volumes
clean-docker-volumes: ## clean docker volumes created during acceptance tests
@COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
COMPOSE_FILE=$(COMPOSE_FILE) \
BEHAT_SUITE="" \
DIVIDE_INTO_NUM_PARTS="" \
OC_IMAGE_TAG="" \
RUN_PART="" \
STORAGE_DRIVER="" \
TEST_SOURCE="" \
docker compose down --remove-orphans -v
.PHONY: clean-tests
clean-tests:
.PHONY: clean-files
clean-files:
@$(MAKE) --no-print-directory -C ../../../. clean-tests
.PHONY: clean
clean-all: clean-containers clean-dev-image clean-tests ## clean all
clean: clean-docker-container clean-docker-volumes clean-dev-docker-image clean-files ## clean all
.PHONY: start-server
start-server: $(OC_WRAPPER) ## build and start server
@echo "Build and start server..."
COMPOSE_FILE=$(COMPOSE_FILE) \
COMPOSE_PROJECT_NAME=$(COMPOSE_PROJECT_NAME) \
OC_IMAGE_TAG=dev \
WITH_WRAPPER=$(WITH_WRAPPER) \
TEST_SOURCE=opencloud \
STORAGE_DRIVER=$(STORAGE_DRIVER) \
OC_ASYNC_UPLOADS=true \
SEARCH_EXTRACTOR_TYPE=tika \
OC_ADD_RUN_SERVICES=notifications \
docker compose up -d --build --force-recreate

View File

@@ -5,12 +5,14 @@ services:
command: /bin/bash /test/run-tests.sh
environment:
OC_ROOT: /woodpecker/src/github.com/opencloud-eu/opencloud
TEST_SERVER_URL: ${TEST_SERVER_URL:-https://opencloud-server:9200}
TEST_SERVER_URL: https://opencloud-server:9200
OC_WRAPPER_URL: http://opencloud-server:5200
STORAGE_DRIVER: ${STORAGE_DRIVER:-posix}
STORAGE_DRIVER: $STORAGE_DRIVER
TEST_SOURCE: $TEST_SOURCE
BEHAT_SUITE: ${BEHAT_SUITE:-}
BEHAT_FEATURE: ${BEHAT_FEATURE:-}
USE_BEARER_TOKEN: ${USE_BEARER_TOKEN:-false}
DIVIDE_INTO_NUM_PARTS: $DIVIDE_INTO_NUM_PARTS
RUN_PART: $RUN_PART
# email
EMAIL_HOST: email
EMAIL_PORT: 9000

View File

@@ -1,10 +0,0 @@
#!/bin/bash
set -e
mkdir -p /var/www/onlyoffice/Data/certs
cd /var/www/onlyoffice/Data/certs
openssl req -x509 -newkey rsa:4096 -keyout onlyoffice.key -out onlyoffice.crt -sha256 -days 365 -batch -nodes
chmod 400 /var/www/onlyoffice/Data/certs/onlyoffice.key
/app/ds/run-document-server.sh

View File

@@ -1,12 +1,12 @@
services:
opencloud-server:
image: ${OC_IMAGE:-opencloudeu/opencloud}:${OC_IMAGE_TAG:-dev}
entrypoint: ["/bin/sh", "/usr/bin/serve-opencloud.sh"]
image: opencloudeu/opencloud:dev
entrypoint: [ "/bin/sh", "/usr/bin/serve-opencloud.sh" ]
user: root
environment:
WITH_WRAPPER: ${WITH_WRAPPER:-true}
WITH_WRAPPER: $WITH_WRAPPER
OC_URL: "https://opencloud-server:9200"
STORAGE_USERS_DRIVER: ${STORAGE_DRIVER:-posix}
STORAGE_USERS_DRIVER: $STORAGE_DRIVER
STORAGE_USERS_POSIX_WATCH_FS: "true"
STORAGE_USERS_DRIVER_LOCAL_ROOT: /srv/app/tmp/opencloud/local/root
STORAGE_USERS_DRIVER_OC_ROOT: /srv/app/tmp/opencloud/storage/users
@@ -14,12 +14,15 @@ services:
SHARING_USER_JSON_FILE: /srv/app/tmp/opencloud/shares.json
PROXY_ENABLE_BASIC_AUTH: "true"
WEB_UI_CONFIG_FILE: /woodpecker/src/github.com/opencloud-eu/opencloud/tests/config/woodpecker/opencloud-config.json
ACCOUNTS_HASH_DIFFICULTY: 4
OC_INSECURE: "true"
IDM_CREATE_DEMO_USERS: "true"
IDM_ADMIN_PASSWORD: "admin"
FRONTEND_SEARCH_MIN_LENGTH: "2"
OC_ADD_RUN_SERVICES: ${OC_ADD_RUN_SERVICES:-}
OC_ASYNC_UPLOADS: $OC_ASYNC_UPLOADS
OC_ADD_RUN_SERVICES: $OC_ADD_RUN_SERVICES
PROXY_HTTP_ADDR: "0.0.0.0:9200"
OC_JWT_SECRET: "some-random-jwt-secret"
# decomposeds3 specific settings
STORAGE_USERS_DECOMPOSEDS3_ENDPOINT: http://ceph:8080
@@ -39,19 +42,19 @@ services:
ANTIVIRUS_CLAMAV_SOCKET: tcp://clamav:3310
# postprocessing step
POSTPROCESSING_STEPS: ${POSTPROCESSING_STEPS:-}
POSTPROCESSING_STEPS: $POSTPROCESSING_STEPS
# tika
SEARCH_EXTRACTOR_TYPE: ${SEARCH_EXTRACTOR_TYPE:-basic}
SEARCH_EXTRACTOR_TYPE: $SEARCH_EXTRACTOR_TYPE
SEARCH_EXTRACTOR_TIKA_TIKA_URL: "http://tika:9998"
SEARCH_EXTRACTOR_CS3SOURCE_INSECURE: "true"
# fonts map for txt thumbnails (including unicode support)
THUMBNAILS_TXT_FONTMAP_FILE: "/woodpecker/src/github.com/opencloud-eu/opencloud/tests/config/drone/fontsMap.json"
ports:
- "9200:9200"
- "5200:5200" ## ocwrapper
- "9174:9174" ## notifications debug
- '9200:9200'
- '5200:5200' ## ocwrapper
- '9174:9174' ## notifications debug
volumes:
- ../../../config:/woodpecker/src/github.com/opencloud-eu/opencloud/tests/config
- ../../../ocwrapper/bin/ocwrapper:/usr/bin/ocwrapper

View File

@@ -2,6 +2,8 @@
services:
opencloud-server:
environment:
# activate posix storage driver for users
STORAGE_USERS_DRIVER: posix
# posix requires a shared cache store
STORAGE_USERS_ID_CACHE_STORE: "nats-js-kv"
STORAGE_USERS_POSIX_WATCH_FS: "true"

View File

@@ -0,0 +1,4 @@
services:
redis:
image: redis:6
command: ["--databases", "1"]

View File

@@ -1,42 +1,70 @@
#!/bin/bash
mkdir -p "${OC_ROOT}/vendor-bin/behat"
if [ ! -f "${OC_ROOT}/vendor-bin/behat/composer.json" ]; then
cp /tmp/vendor-bin/behat/composer.json "${OC_ROOT}/vendor-bin/behat/composer.json"
fi
#mkdir -p /drone/src/vendor-bin/behat
#cp /tmp/vendor-bin/behat/composer.json /drone/src/vendor-bin/behat/composer.json
git config --global advice.detachedHead false
## CONFIGURE TEST
BEHAT_FILTER_TAGS='~@skip'
EXPECTED_FAILURES_FILE=''
if [ "$STORAGE_DRIVER" = "posix" ]; then
BEHAT_FILTER_TAGS+='&&~@skipOnOpencloud-posix-Storage'
EXPECTED_FAILURES_FILE="${OC_ROOT}/tests/acceptance/expected-failures-posix-storage.md"
elif [ "$STORAGE_DRIVER" = "decomposed" ]; then
BEHAT_FILTER_TAGS+='&&~@skipOnOpencloud-decomposed-Storage'
EXPECTED_FAILURES_FILE="${OC_ROOT}/tests/acceptance/expected-failures-decomposed-storage.md"
if [ "$TEST_SOURCE" = "core" ]; then
export ACCEPTANCE_TEST_TYPE='core-api'
if [ "$STORAGE_DRIVER" = "decomposed" ]; then
export OC_REVA_DATA_ROOT=''
export BEHAT_FILTER_TAGS='~@skipOnOpencloud-decomposed-Storage'
export EXPECTED_FAILURES_FILE='/drone/src/tests/acceptance/expected-failures-API-on-decomposed-storage.md'
elif [ "$STORAGE_DRIVER" = "decomposeds3" ]; then
export BEHAT_FILTER_TAGS='~@skip&&~@skipOnOpencloud-decomposeds3-Storage'
export OC_REVA_DATA_ROOT=''
else
echo "non existing STORAGE selected"
exit 1
fi
unset BEHAT_SUITE
elif [ "$TEST_SOURCE" = "opencloud" ]; then
if [ "$STORAGE_DRIVER" = "decomposed" ]; then
export BEHAT_FILTER_TAGS='~@skip&&~@skipOnOpencloud-decomposed-Storage'
export OC_REVA_DATA_ROOT=''
elif [ "$STORAGE_DRIVER" = "decomposeds3" ]; then
export BEHAT_FILTER_TAGS='~@skip&&~@skipOnOpencloud-decomposeds3-Storage'
export OC_REVA_DATA_ROOT=''
elif [ "$STORAGE_DRIVER" = "posix" ]; then
export BEHAT_FILTER_TAGS='~@skip&&~@skipOnOpencloud-posix-Storage'
export OC_REVA_DATA_ROOT=''
else
echo "non existing storage selected"
exit 1
fi
unset DIVIDE_INTO_NUM_PARTS
unset RUN_PART
else
echo "non existing TEST_SOURCE selected"
exit 1
fi
export BEHAT_FILTER_TAGS
export EXPECTED_FAILURES_FILE
if [ -n "$BEHAT_FEATURE" ]; then
export BEHAT_FEATURE
echo "[INFO] Running feature: $BEHAT_FEATURE"
if [ ! -z "$BEHAT_FEATURE" ]; then
echo "feature selected: " + $BEHAT_FEATURE
# allow running without filters if its a feature
unset BEHAT_FILTER_TAGS
unset BEHAT_SUITE
unset DIVIDE_INTO_NUM_PARTS
unset RUN_PART
unset EXPECTED_FAILURES_FILE
elif [ -n "$BEHAT_SUITE" ]; then
export BEHAT_SUITE
echo "[INFO] Running suite: $BEHAT_SUITE"
else
unset BEHAT_FEATURE
fi
## RUN TEST
sleep 10
make -C "$OC_ROOT" test-acceptance-api
chmod -R 777 "${OC_ROOT}/vendor-bin/"*"/vendor" "${OC_ROOT}/vendor-bin/"*"/composer.lock" "${OC_ROOT}/tests/acceptance/output" 2>/dev/null || true
if [[ -z "$TEST_SOURCE" ]]; then
echo "non existing TEST_SOURCE selected"
exit 1
else
sleep 10
make -C $OC_ROOT test-acceptance-api
fi
chmod -R 777 vendor-bin/**/vendor vendor-bin/**/composer.lock tests/acceptance/output
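For orientation, a minimal sketch of how this runner script is driven (hedged: the `/test/run-tests.sh` path and the server URL are taken from the compose files in this change; invoking it directly outside the acceptance-tests container is an assumption):

```bash
# Hypothetical direct invocation of the runner script shown above.
# TEST_SOURCE and STORAGE_DRIVER pick the Behat tag filter and expected-failures file;
# OC_ROOT must point at an opencloud checkout containing the acceptance tests.
export OC_ROOT=/woodpecker/src/github.com/opencloud-eu/opencloud  # path used in acceptance.yml
export TEST_SERVER_URL=https://opencloud-server:9200
export TEST_SOURCE=opencloud      # or: core
export STORAGE_DRIVER=posix       # or: decomposed | decomposeds3
# optionally narrow the run to a single feature (this disables the tag filters, see above):
# export BEHAT_FEATURE=tests/acceptance/features/apiGraph/changeRole.feature
bash /test/run-tests.sh
```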

View File

@@ -1,105 +0,0 @@
x-common_config: &common_config
image: opencloudeu/opencloud:dev
restart: unless-stopped
entrypoint: /bin/sh
command: ["-c", "opencloud collaboration server"]
user: root
x-common_env: &common_env
OC_CONFIG_DIR: /etc/opencloud
MICRO_REGISTRY: nats-js-kv
MICRO_REGISTRY_ADDRESS: opencloud-server:9233
COLLABORATION_LOG_LEVEL: info
COLLABORATION_GRPC_ADDR: 0.0.0.0:9301
COLLABORATION_HTTP_ADDR: 0.0.0.0:9300
COLLABORATION_DEBUG_ADDR: 0.0.0.0:9304
COLLABORATION_APP_PROOF_DISABLE: true
COLLABORATION_APP_INSECURE: true
COLLABORATION_CS3API_DATAGATEWAY_INSECURE: true
COLLABORATION_WOPI_SECRET: some-wopi-secret
x-config_volume: &config_volume
- config:/etc/opencloud
x-depends_on: &depends_on
- opencloud-server
services:
opencloud-server:
environment:
OC_CONFIG_DIR: /etc/opencloud
GATEWAY_GRPC_ADDR: 0.0.0.0:9142
NATS_NATS_HOST: 0.0.0.0
NATS_NATS_PORT: 9233
volumes: *config_volume
fakeoffice:
image: alpine:latest
entrypoint: /bin/sh
command:
[
"-c",
"while true; do echo -e \"HTTP/1.1 200 OK\n\n$(cat /fakeoffice-discovery.xml)\" | nc -l -k -p 8080; done",
]
healthcheck:
test: ["CMD", "curl", "-f", "http://fakeoffice:8080"]
volumes:
- ./../../../config/woodpecker/hosting-discovery.xml:/fakeoffice-discovery.xml
collabora:
image: collabora/code:24.04.5.1.1
environment:
DONT_GEN_SSL_CERT: set
extra_params: --o:ssl.enable=true --o:ssl.termination=true --o:welcome.enable=false --o:net.frame_ancestors=https://opencloud-server:9200
entrypoint: /bin/sh
command: ["-c", "coolconfig generate-proof-key; /start-collabora-online.sh"]
onlyoffice:
image: onlyoffice/documentserver:7.5.1
environment:
WOPI_ENABLED: true
USE_UNAUTHORIZED_STORAGE: true
entrypoint: bash /entrypoint.sh
volumes:
- ./onlyoffice-entrypoint.sh:/entrypoint.sh
collaboration-fakeoffice:
<<: *common_config
environment:
<<: *common_env
COLLABORATION_SERVICE_NAME: collaboration-fakeoffice
COLLABORATION_APP_NAME: FakeOffice
COLLABORATION_APP_PRODUCT: Microsoft
COLLABORATION_APP_ADDR: http://fakeoffice:8080
COLLABORATION_WOPI_SRC: http://collaboration-fakeoffice:9300
volumes: *config_volume
depends_on: *depends_on
collaboration-collabora:
<<: *common_config
environment:
<<: *common_env
COLLABORATION_SERVICE_NAME: collaboration-collabora
COLLABORATION_APP_NAME: Collabora
COLLABORATION_APP_PRODUCT: Collabora
COLLABORATION_APP_ADDR: https://collabora:9980
COLLABORATION_APP_ICON: https://collabora:9980/favicon.ico
COLLABORATION_WOPI_SRC: http://collaboration-collabora:9300
volumes: *config_volume
depends_on: *depends_on
collaboration-onlyoffice:
<<: *common_config
environment:
<<: *common_env
COLLABORATION_SERVICE_NAME: collaboration-onlyoffice
COLLABORATION_APP_NAME: OnlyOffice
COLLABORATION_APP_PRODUCT: OnlyOffice
COLLABORATION_APP_ADDR: https://onlyoffice
COLLABORATION_APP_ICON: https://onlyoffice/web-apps/apps/documenteditor/main/resources/img/favicon.ico
COLLABORATION_WOPI_SRC: http://collaboration-onlyoffice:9300
volumes: *config_volume
depends_on: *depends_on
volumes:
config:

View File

@@ -0,0 +1,175 @@
## Scenarios from core API tests that are expected to fail with decomposed storage while running with the Graph API
### File
Basic file management like upload and download, move, copy, properties, trash, versions and chunking.
#### [Custom dav properties with namespaces are rendered incorrectly](https://github.com/owncloud/ocis/issues/2140)
_ocdav: double-check the webdav property parsing when custom namespaces are used_
- [coreApiWebdavProperties/setFileProperties.feature:128](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L128)
- [coreApiWebdavProperties/setFileProperties.feature:129](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L129)
- [coreApiWebdavProperties/setFileProperties.feature:130](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L130)
### Sync
Synchronization features like etag propagation, setting mtime and locking files
#### [Uploading an old method chunked file with checksum should fail using new DAV path](https://github.com/owncloud/ocis/issues/2323)
- [coreApiMain/checksums.feature:233](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L233)
- [coreApiMain/checksums.feature:234](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L234)
- [coreApiMain/checksums.feature:235](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L235)
### Share
#### [d:quota-available-bytes in dprop of PROPFIND give wrong response value](https://github.com/owncloud/ocis/issues/8197)
- [coreApiWebdavProperties/getQuota.feature:57](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L57)
- [coreApiWebdavProperties/getQuota.feature:58](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L58)
- [coreApiWebdavProperties/getQuota.feature:59](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L59)
- [coreApiWebdavProperties/getQuota.feature:73](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L73)
- [coreApiWebdavProperties/getQuota.feature:74](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L74)
- [coreApiWebdavProperties/getQuota.feature:75](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L75)
#### [deleting a file inside a received shared folder is moved to the trash-bin of the sharer not the receiver](https://github.com/owncloud/ocis/issues/1124)
- [coreApiTrashbin/trashbinSharingToShares.feature:54](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L54)
- [coreApiTrashbin/trashbinSharingToShares.feature:55](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L55)
- [coreApiTrashbin/trashbinSharingToShares.feature:56](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L56)
- [coreApiTrashbin/trashbinSharingToShares.feature:83](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L83)
- [coreApiTrashbin/trashbinSharingToShares.feature:84](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L84)
- [coreApiTrashbin/trashbinSharingToShares.feature:85](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L85)
- [coreApiTrashbin/trashbinSharingToShares.feature:142](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L142)
- [coreApiTrashbin/trashbinSharingToShares.feature:143](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L143)
- [coreApiTrashbin/trashbinSharingToShares.feature:144](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L144)
- [coreApiTrashbin/trashbinSharingToShares.feature:202](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L202)
- [coreApiTrashbin/trashbinSharingToShares.feature:203](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L203)
### Other
API, search, favorites, config, capabilities, non-existent endpoints, CORS and others
#### [sending MKCOL requests to another or non-existing user's webDav endpoints as normal user should return 404](https://github.com/owncloud/ocis/issues/5049)
_ocdav: api compatibility, return correct status code_
- [coreApiAuth/webDavMKCOLAuth.feature:42](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavMKCOLAuth.feature#L42)
- [coreApiAuth/webDavMKCOLAuth.feature:53](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavMKCOLAuth.feature#L53)
#### [trying to lock file of another user gives http 500](https://github.com/owncloud/ocis/issues/2176)
- [coreApiAuth/webDavLOCKAuth.feature:46](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavLOCKAuth.feature#L46)
- [coreApiAuth/webDavLOCKAuth.feature:58](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavLOCKAuth.feature#L58)
#### [Support for favorites](https://github.com/owncloud/ocis/issues/1228)
- [coreApiFavorites/favorites.feature:101](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L101)
- [coreApiFavorites/favorites.feature:102](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L102)
- [coreApiFavorites/favorites.feature:103](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L103)
- [coreApiFavorites/favorites.feature:124](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L124)
- [coreApiFavorites/favorites.feature:125](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L125)
- [coreApiFavorites/favorites.feature:126](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L126)
- [coreApiFavorites/favorites.feature:189](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L189)
- [coreApiFavorites/favorites.feature:190](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L190)
- [coreApiFavorites/favorites.feature:191](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L191)
- [coreApiFavorites/favorites.feature:145](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L145)
- [coreApiFavorites/favorites.feature:146](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L146)
- [coreApiFavorites/favorites.feature:147](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L147)
- [coreApiFavorites/favorites.feature:174](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L174)
- [coreApiFavorites/favorites.feature:175](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L175)
- [coreApiFavorites/favorites.feature:176](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L176)
- [coreApiFavorites/favoritesSharingToShares.feature:91](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L91)
- [coreApiFavorites/favoritesSharingToShares.feature:92](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L92)
- [coreApiFavorites/favoritesSharingToShares.feature:93](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L93)
#### [WWW-Authenticate header for unauthenticated requests is not clear](https://github.com/owncloud/ocis/issues/2285)
- [coreApiWebdavOperations/refuseAccess.feature:21](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavOperations/refuseAccess.feature#L21)
- [coreApiWebdavOperations/refuseAccess.feature:22](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavOperations/refuseAccess.feature#L22)
#### [PATCH request for TUS upload with wrong checksum gives incorrect response](https://github.com/owncloud/ocis/issues/1755)
- [coreApiWebdavUploadTUS/checksums.feature:74](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L74)
- [coreApiWebdavUploadTUS/checksums.feature:75](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L75)
- [coreApiWebdavUploadTUS/checksums.feature:76](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L76)
- [coreApiWebdavUploadTUS/checksums.feature:77](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L77)
- [coreApiWebdavUploadTUS/checksums.feature:79](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L79)
- [coreApiWebdavUploadTUS/checksums.feature:78](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L78)
- [coreApiWebdavUploadTUS/checksums.feature:147](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L147)
- [coreApiWebdavUploadTUS/checksums.feature:148](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L148)
- [coreApiWebdavUploadTUS/checksums.feature:149](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L149)
- [coreApiWebdavUploadTUS/checksums.feature:192](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L192)
- [coreApiWebdavUploadTUS/checksums.feature:193](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L193)
- [coreApiWebdavUploadTUS/checksums.feature:194](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L194)
- [coreApiWebdavUploadTUS/checksums.feature:195](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L195)
- [coreApiWebdavUploadTUS/checksums.feature:196](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L196)
- [coreApiWebdavUploadTUS/checksums.feature:197](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L197)
- [coreApiWebdavUploadTUS/checksums.feature:240](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L240)
- [coreApiWebdavUploadTUS/checksums.feature:241](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L241)
- [coreApiWebdavUploadTUS/checksums.feature:242](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L242)
- [coreApiWebdavUploadTUS/checksums.feature:243](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L243)
- [coreApiWebdavUploadTUS/checksums.feature:244](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L244)
- [coreApiWebdavUploadTUS/checksums.feature:245](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L245)
- [coreApiWebdavUploadTUS/uploadToShare.feature:255](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L255)
- [coreApiWebdavUploadTUS/uploadToShare.feature:256](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L256)
- [coreApiWebdavUploadTUS/uploadToShare.feature:279](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L279)
- [coreApiWebdavUploadTUS/uploadToShare.feature:280](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L280)
- [coreApiWebdavUploadTUS/uploadToShare.feature:376](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L376)
- [coreApiWebdavUploadTUS/uploadToShare.feature:377](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L377)
#### [Renaming resource to banned name is allowed in spaces webdav](https://github.com/owncloud/ocis/issues/3099)
- [coreApiWebdavMove2/moveFile.feature:143](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L143)
- [coreApiWebdavMove1/moveFolder.feature:36](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L36)
- [coreApiWebdavMove1/moveFolder.feature:50](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L50)
- [coreApiWebdavMove1/moveFolder.feature:64](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L64)
#### [Trying to delete other user's trashbin item returns 409 for spaces path instead of 404](https://github.com/owncloud/ocis/issues/9791)
- [coreApiTrashbin/trashbinDelete.feature:92](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinDelete.feature#L92)
#### [MOVE a file into same folder with same name returns 404 instead of 403](https://github.com/owncloud/ocis/issues/1976)
- [coreApiWebdavMove2/moveFile.feature:100](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L100)
- [coreApiWebdavMove2/moveFile.feature:101](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L101)
- [coreApiWebdavMove2/moveFile.feature:102](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L102)
- [coreApiWebdavMove1/moveFolder.feature:217](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L217)
- [coreApiWebdavMove1/moveFolder.feature:218](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L218)
- [coreApiWebdavMove1/moveFolder.feature:219](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L219)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:334](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L334)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:337](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L337)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:340](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L340)
#### [COPY file/folder to same name is possible (but 500 code error for folder with spaces path)](https://github.com/owncloud/ocis/issues/8711)
- [coreApiSharePublicLink2/copyFromPublicLink.feature:198](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiSharePublicLink2/copyFromPublicLink.feature#L198)
- [coreApiWebdavProperties/copyFile.feature:1094](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1094)
- [coreApiWebdavProperties/copyFile.feature:1095](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1095)
- [coreApiWebdavProperties/copyFile.feature:1096](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1096)
#### [Trying to restore personal file to file of share received folder returns 403 but the share file is deleted (new dav path)](https://github.com/owncloud/ocis/issues/10356)
- [coreApiTrashbin/trashbinSharingToShares.feature:277](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L277)
#### [Preview. UTF characters do not display on preview](https://github.com/opencloud-eu/opencloud/issues/1451)
- [coreApiWebdavPreviews/previews.feature:249](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L249)
- [coreApiWebdavPreviews/previews.feature:250](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L250)
- [coreApiWebdavPreviews/previews.feature:251](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L251)
#### [Preview of text file truncated](https://github.com/opencloud-eu/opencloud/issues/1452)
- [coreApiWebdavPreviews/previews.feature:263](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L263)
- [coreApiWebdavPreviews/previews.feature:264](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L264)
- [coreApiWebdavPreviews/previews.feature:265](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L265)
### Won't fix
Not everything needs to be implemented for opencloud.
- _Blacklisted ignored files are no longer required because opencloud can handle `.htaccess` files without the security implications introduced by serving user-provided files with Apache._
Note: always have an empty line at the end of this file.
The bash script that processes this file requires that the last line ends with a newline.
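To illustrate why the trailing newline matters (a sketch only, not the actual processing script; the file name is just an example): a plain `while read` loop returns non-zero at EOF when the final line lacks a newline, so the last expected-failure entry would be silently skipped.

```bash
# Sketch: iterate over expected-failure entries line by line.
# `read` fails on a final line without a trailing newline, so that entry
# would be dropped -- hence the rule that this file must end with a newline.
while IFS= read -r line; do
    # keep only scenario links such as "- [coreApiMain/checksums.feature:233](...)"
    if [[ "$line" == "- ["* ]]; then
        echo "expected failure: $line"
    fi
done < expected-failures-API-on-decomposed-storage.md
```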

View File

@@ -0,0 +1,175 @@
## Scenarios from core API tests that are expected to fail with decomposed storage while running with the Graph API
### File
Basic file management like upload and download, move, copy, properties, trash, versions and chunking.
#### [Custom dav properties with namespaces are rendered incorrectly](https://github.com/owncloud/ocis/issues/2140)
_ocdav: double-check the webdav property parsing when custom namespaces are used_
- [coreApiWebdavProperties/setFileProperties.feature:128](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L128)
- [coreApiWebdavProperties/setFileProperties.feature:129](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L129)
- [coreApiWebdavProperties/setFileProperties.feature:130](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L130)
### Sync
Synchronization features like etag propagation, setting mtime and locking files
#### [Uploading an old method chunked file with checksum should fail using new DAV path](https://github.com/owncloud/ocis/issues/2323)
- [coreApiMain/checksums.feature:233](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L233)
- [coreApiMain/checksums.feature:234](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L234)
- [coreApiMain/checksums.feature:235](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L235)
### Share
#### [d:quota-available-bytes in dprop of PROPFIND give wrong response value](https://github.com/owncloud/ocis/issues/8197)
- [coreApiWebdavProperties/getQuota.feature:57](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L57)
- [coreApiWebdavProperties/getQuota.feature:58](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L58)
- [coreApiWebdavProperties/getQuota.feature:59](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L59)
- [coreApiWebdavProperties/getQuota.feature:73](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L73)
- [coreApiWebdavProperties/getQuota.feature:74](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L74)
- [coreApiWebdavProperties/getQuota.feature:75](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L75)
#### [deleting a file inside a received shared folder is moved to the trash-bin of the sharer not the receiver](https://github.com/owncloud/ocis/issues/1124)
- [coreApiTrashbin/trashbinSharingToShares.feature:54](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L54)
- [coreApiTrashbin/trashbinSharingToShares.feature:55](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L55)
- [coreApiTrashbin/trashbinSharingToShares.feature:56](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L56)
- [coreApiTrashbin/trashbinSharingToShares.feature:83](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L83)
- [coreApiTrashbin/trashbinSharingToShares.feature:84](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L84)
- [coreApiTrashbin/trashbinSharingToShares.feature:85](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L85)
- [coreApiTrashbin/trashbinSharingToShares.feature:142](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L142)
- [coreApiTrashbin/trashbinSharingToShares.feature:143](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L143)
- [coreApiTrashbin/trashbinSharingToShares.feature:144](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L144)
- [coreApiTrashbin/trashbinSharingToShares.feature:202](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L202)
- [coreApiTrashbin/trashbinSharingToShares.feature:203](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L203)
### Other
API, search, favorites, config, capabilities, non-existent endpoints, CORS and others
#### [sending MKCOL requests to another or non-existing user's webDav endpoints as normal user should return 404](https://github.com/owncloud/ocis/issues/5049)
_ocdav: api compatibility, return correct status code; a request sketch follows the list below_
- [coreApiAuth/webDavMKCOLAuth.feature:42](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavMKCOLAuth.feature#L42)
- [coreApiAuth/webDavMKCOLAuth.feature:53](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavMKCOLAuth.feature#L53)
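The request shape behind these scenarios is a MKCOL against another (or a non-existent) user's WebDAV endpoint, which the issue says should be answered with 404. A hedged sketch with placeholder host and credentials:

```bash
# Sketch only: alice creates a collection under brian's WebDAV endpoint.
# Host, users and password are illustrative placeholders.
curl -i -u alice:secret -X MKCOL "https://cloud.example.org/remote.php/dav/files/brian/newFolder"
# The issue above expects a 404 here rather than another status code.
```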
#### [trying to lock file of another user gives http 500](https://github.com/owncloud/ocis/issues/2176)
- [coreApiAuth/webDavLOCKAuth.feature:46](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavLOCKAuth.feature#L46)
- [coreApiAuth/webDavLOCKAuth.feature:58](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavLOCKAuth.feature#L58)
#### [Support for favorites](https://github.com/owncloud/ocis/issues/1228)
- [coreApiFavorites/favorites.feature:101](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L101)
- [coreApiFavorites/favorites.feature:102](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L102)
- [coreApiFavorites/favorites.feature:103](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L103)
- [coreApiFavorites/favorites.feature:124](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L124)
- [coreApiFavorites/favorites.feature:125](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L125)
- [coreApiFavorites/favorites.feature:126](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L126)
- [coreApiFavorites/favorites.feature:145](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L145)
- [coreApiFavorites/favorites.feature:146](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L146)
- [coreApiFavorites/favorites.feature:147](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L147)
- [coreApiFavorites/favorites.feature:174](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L174)
- [coreApiFavorites/favorites.feature:175](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L175)
- [coreApiFavorites/favorites.feature:176](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L176)
- [coreApiFavorites/favorites.feature:189](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L189)
- [coreApiFavorites/favorites.feature:190](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L190)
- [coreApiFavorites/favorites.feature:191](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L191)
- [coreApiFavorites/favoritesSharingToShares.feature:91](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L91)
- [coreApiFavorites/favoritesSharingToShares.feature:92](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L92)
- [coreApiFavorites/favoritesSharingToShares.feature:93](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L93)
#### [WWW-Authenticate header for unauthenticated requests is not clear](https://github.com/owncloud/ocis/issues/2285)
- [coreApiWebdavOperations/refuseAccess.feature:21](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavOperations/refuseAccess.feature#L21)
- [coreApiWebdavOperations/refuseAccess.feature:22](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavOperations/refuseAccess.feature#L22)
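To see what these scenarios assert on, it is enough to send an unauthenticated request and look at the challenge header; the host below is a placeholder:

```bash
# Sketch only: show the WWW-Authenticate header returned for an unauthenticated request.
curl -sI "https://cloud.example.org/remote.php/dav/files/alice/" | grep -i '^www-authenticate'
```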
#### [PATCH request for TUS upload with wrong checksum gives incorrect response](https://github.com/owncloud/ocis/issues/1755)
- [coreApiWebdavUploadTUS/checksums.feature:74](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L74)
- [coreApiWebdavUploadTUS/checksums.feature:75](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L75)
- [coreApiWebdavUploadTUS/checksums.feature:76](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L76)
- [coreApiWebdavUploadTUS/checksums.feature:77](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L77)
- [coreApiWebdavUploadTUS/checksums.feature:78](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L78)
- [coreApiWebdavUploadTUS/checksums.feature:79](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L79)
- [coreApiWebdavUploadTUS/checksums.feature:147](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L147)
- [coreApiWebdavUploadTUS/checksums.feature:148](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L148)
- [coreApiWebdavUploadTUS/checksums.feature:149](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L149)
- [coreApiWebdavUploadTUS/checksums.feature:192](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L192)
- [coreApiWebdavUploadTUS/checksums.feature:193](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L193)
- [coreApiWebdavUploadTUS/checksums.feature:194](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L194)
- [coreApiWebdavUploadTUS/checksums.feature:195](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L195)
- [coreApiWebdavUploadTUS/checksums.feature:196](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L196)
- [coreApiWebdavUploadTUS/checksums.feature:197](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L197)
- [coreApiWebdavUploadTUS/checksums.feature:240](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L240)
- [coreApiWebdavUploadTUS/checksums.feature:241](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L241)
- [coreApiWebdavUploadTUS/checksums.feature:242](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L242)
- [coreApiWebdavUploadTUS/checksums.feature:243](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L243)
- [coreApiWebdavUploadTUS/checksums.feature:244](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L244)
- [coreApiWebdavUploadTUS/checksums.feature:245](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L245)
- [coreApiWebdavUploadTUS/uploadToShare.feature:255](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L255)
- [coreApiWebdavUploadTUS/uploadToShare.feature:256](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L256)
- [coreApiWebdavUploadTUS/uploadToShare.feature:279](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L279)
- [coreApiWebdavUploadTUS/uploadToShare.feature:280](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L280)
- [coreApiWebdavUploadTUS/uploadToShare.feature:376](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L376)
- [coreApiWebdavUploadTUS/uploadToShare.feature:377](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L377)
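The scenarios above exercise the tus checksum extension: the upload PATCH carries an `Upload-Checksum` header with the algorithm and a base64 digest, and a mismatching digest is supposed to be rejected. A rough sketch with a placeholder upload URL and credentials (a real tus upload would first be created with a POST):

```bash
# Sketch only: a tus PATCH carrying a checksum of the uploaded chunk.
# URL, credentials and data are placeholders; the real upload URL comes from
# the Location header of the preceding creation request.
DATA='hello world'
CHECKSUM="sha1 $(printf '%s' "$DATA" | openssl dgst -sha1 -binary | base64)"
curl -i -u alice:secret -X PATCH "https://cloud.example.org/tus-upload-url" \
  -H "Tus-Resumable: 1.0.0" \
  -H "Upload-Offset: 0" \
  -H "Content-Type: application/offset+octet-stream" \
  -H "Upload-Checksum: $CHECKSUM" \
  --data-binary "$DATA"
# Sending a deliberately wrong digest is the case the linked issue is about:
# the response returned for the mismatch is not the expected one.
```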
#### [Renaming resource to banned name is allowed in spaces webdav](https://github.com/owncloud/ocis/issues/3099)
- [coreApiWebdavMove2/moveFile.feature:143](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L143)
- [coreApiWebdavMove1/moveFolder.feature:36](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L36)
- [coreApiWebdavMove1/moveFolder.feature:50](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L50)
- [coreApiWebdavMove1/moveFolder.feature:64](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L64)
#### [Trying to delete other user's trashbin item returns 409 for spaces path instead of 404](https://github.com/owncloud/ocis/issues/9791)
- [coreApiTrashbin/trashbinDelete.feature:92](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinDelete.feature#L92)
#### [MOVE a file into same folder with same name returns 404 instead of 403](https://github.com/owncloud/ocis/issues/1976)
- [coreApiWebdavMove2/moveFile.feature:100](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L100)
- [coreApiWebdavMove2/moveFile.feature:101](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L101)
- [coreApiWebdavMove2/moveFile.feature:102](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L102)
- [coreApiWebdavMove1/moveFolder.feature:217](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L217)
- [coreApiWebdavMove1/moveFolder.feature:218](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L218)
- [coreApiWebdavMove1/moveFolder.feature:219](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L219)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:334](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L334)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:337](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L337)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:340](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L340)
#### [COPY file/folder to same name is possible (but 500 code error for folder with spaces path)](https://github.com/owncloud/ocis/issues/8711)
- [coreApiSharePublicLink2/copyFromPublicLink.feature:198](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiSharePublicLink2/copyFromPublicLink.feature#L198)
- [coreApiWebdavProperties/copyFile.feature:1094](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1094)
- [coreApiWebdavProperties/copyFile.feature:1095](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1095)
- [coreApiWebdavProperties/copyFile.feature:1096](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1096)
#### [Trying to restore a personal file onto a file in a received shared folder returns 403 but the shared file is deleted (new dav path)](https://github.com/owncloud/ocis/issues/10356)
- [coreApiTrashbin/trashbinSharingToShares.feature:277](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L277)
#### [Preview: UTF characters do not display in preview](https://github.com/opencloud-eu/opencloud/issues/1451)
- [coreApiWebdavPreviews/previews.feature:249](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L249)
- [coreApiWebdavPreviews/previews.feature:250](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L250)
- [coreApiWebdavPreviews/previews.feature:251](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L251)
#### [Preview of text file truncated](https://github.com/opencloud-eu/opencloud/issues/1452)
- [coreApiWebdavPreviews/previews.feature:263](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L263)
- [coreApiWebdavPreviews/previews.feature:264](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L264)
- [coreApiWebdavPreviews/previews.feature:265](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L265)
### Won't fix
Not everything needs to be implemented for OpenCloud.
- _Blacklisted ignored files are no longer required because OpenCloud can handle `.htaccess` files without the security implications introduced by serving user-provided files with Apache._
Note: always leave an empty line at the end of this file.
The bash script that processes this file requires the last line to end with a newline.
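A minimal sketch of the kind of guard the note above describes, assuming a file name for illustration (this is not the project's actual script):

```bash
# Sketch only: make sure the expected-failures file ends with a newline,
# so the last entry is not lost when the file is processed line by line.
file="expected-failures.md"   # placeholder name
if [ -n "$(tail -c 1 "$file")" ]; then
  echo "appending missing trailing newline to $file" >&2
  printf '\n' >> "$file"
fi
```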

View File

@@ -19,6 +19,8 @@
#### [Settings service user can list other peoples assignments](https://github.com/owncloud/ocis/issues/5032)
- [apiAccountsHashDifficulty/assignRole.feature:27](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAccountsHashDifficulty/assignRole.feature#L27)
- [apiAccountsHashDifficulty/assignRole.feature:28](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAccountsHashDifficulty/assignRole.feature#L28)
- [apiGraph/getAssignedRole.feature:31](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiGraph/getAssignedRole.feature#L31)
- [apiGraph/getAssignedRole.feature:32](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiGraph/getAssignedRole.feature#L32)
- [apiGraph/getAssignedRole.feature:33](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiGraph/getAssignedRole.feature#L33)
@@ -191,9 +193,9 @@
- [apiServiceAvailability/serviceAvailabilityCheck.feature:123](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiServiceAvailability/serviceAvailabilityCheck.feature#L123)
#### [Skip tests for different languages](https://github.com/opencloud-eu/opencloud/issues/183)
- [apiActivities/activities.feature:2598](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiActivities/activities.feature#L2598)
#### [Missing properties in REPORT response](https://github.com/owncloud/ocis/issues/9780), [d:getetag property has empty value in REPORT response](https://github.com/owncloud/ocis/issues/9783)
- [apiSearch1/search.feature:437](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiSearch1/search.feature#L437)
@@ -203,160 +205,6 @@
- [apiSearch1/search.feature:466](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiSearch1/search.feature#L466)
- [apiSearch1/search.feature:467](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiSearch1/search.feature#L467)
## Scenarios from core API tests that are expected to fail with decomposed storage
### File
Basic file management like upload and download, move, copy, properties, trash, versions and chunking.
#### [Custom dav properties with namespaces are rendered incorrectly](https://github.com/owncloud/ocis/issues/2140)
_ocdav: double-check the webdav property parsing when custom namespaces are used; a request sketch follows the list below_
- [coreApiWebdavProperties/setFileProperties.feature:128](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L128)
- [coreApiWebdavProperties/setFileProperties.feature:129](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L129)
- [coreApiWebdavProperties/setFileProperties.feature:130](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L130)
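What "custom dav properties with namespaces" refers to is a PROPPATCH that sets a property outside the `DAV:` namespace. The sketch below uses a made-up namespace, host, credentials and path purely for illustration:

```bash
# Sketch only: set a property in a custom namespace, the case the note above
# flags for the ocdav property parser. Host, credentials, path and the
# x: namespace are illustrative placeholders.
curl -s -u alice:secret -X PROPPATCH "https://cloud.example.org/remote.php/dav/files/alice/file.txt" \
  --data '<?xml version="1.0"?>
<d:propertyupdate xmlns:d="DAV:" xmlns:x="http://example.org/ns">
  <d:set>
    <d:prop>
      <x:reviewed>yes</x:reviewed>
    </d:prop>
  </d:set>
</d:propertyupdate>'
# A follow-up PROPFIND for x:reviewed is what the failing scenarios then compare against.
```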
### Sync
Synchronization features like etag propagation, setting mtime and locking files.
#### [Uploading an old method chunked file with checksum should fail using new DAV path](https://github.com/owncloud/ocis/issues/2323)
- [coreApiMain/checksums.feature:233](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L233)
- [coreApiMain/checksums.feature:234](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L234)
- [coreApiMain/checksums.feature:235](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L235)
### Share
#### [d:quota-available-bytes in d:prop of PROPFIND gives wrong response value](https://github.com/owncloud/ocis/issues/8197)
- [coreApiWebdavProperties/getQuota.feature:57](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L57)
- [coreApiWebdavProperties/getQuota.feature:58](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L58)
- [coreApiWebdavProperties/getQuota.feature:59](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L59)
- [coreApiWebdavProperties/getQuota.feature:73](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L73)
- [coreApiWebdavProperties/getQuota.feature:74](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L74)
- [coreApiWebdavProperties/getQuota.feature:75](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L75)
#### [deleting a file inside a received shared folder moves it to the trash-bin of the sharer, not the receiver](https://github.com/owncloud/ocis/issues/1124)
- [coreApiTrashbin/trashbinSharingToShares.feature:54](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L54)
- [coreApiTrashbin/trashbinSharingToShares.feature:55](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L55)
- [coreApiTrashbin/trashbinSharingToShares.feature:56](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L56)
- [coreApiTrashbin/trashbinSharingToShares.feature:83](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L83)
- [coreApiTrashbin/trashbinSharingToShares.feature:84](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L84)
- [coreApiTrashbin/trashbinSharingToShares.feature:85](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L85)
- [coreApiTrashbin/trashbinSharingToShares.feature:142](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L142)
- [coreApiTrashbin/trashbinSharingToShares.feature:143](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L143)
- [coreApiTrashbin/trashbinSharingToShares.feature:144](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L144)
- [coreApiTrashbin/trashbinSharingToShares.feature:202](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L202)
- [coreApiTrashbin/trashbinSharingToShares.feature:203](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L203)
### Other
API, search, favorites, config, capabilities, non-existent endpoints, CORS and others
#### [sending MKCOL requests to another or non-existing user's webDav endpoints as normal user should return 404](https://github.com/owncloud/ocis/issues/5049)
_ocdav: api compatibility, return correct status code_
- [coreApiAuth/webDavMKCOLAuth.feature:42](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavMKCOLAuth.feature#L42)
- [coreApiAuth/webDavMKCOLAuth.feature:53](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavMKCOLAuth.feature#L53)
#### [trying to lock file of another user gives http 500](https://github.com/owncloud/ocis/issues/2176)
- [coreApiAuth/webDavLOCKAuth.feature:46](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavLOCKAuth.feature#L46)
- [coreApiAuth/webDavLOCKAuth.feature:58](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavLOCKAuth.feature#L58)
#### [Support for favorites](https://github.com/owncloud/ocis/issues/1228)
- [coreApiFavorites/favorites.feature:101](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L101)
- [coreApiFavorites/favorites.feature:102](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L102)
- [coreApiFavorites/favorites.feature:103](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L103)
- [coreApiFavorites/favorites.feature:124](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L124)
- [coreApiFavorites/favorites.feature:125](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L125)
- [coreApiFavorites/favorites.feature:126](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L126)
- [coreApiFavorites/favorites.feature:145](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L145)
- [coreApiFavorites/favorites.feature:146](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L146)
- [coreApiFavorites/favorites.feature:147](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L147)
- [coreApiFavorites/favorites.feature:174](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L174)
- [coreApiFavorites/favorites.feature:175](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L175)
- [coreApiFavorites/favorites.feature:176](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L176)
- [coreApiFavorites/favorites.feature:189](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L189)
- [coreApiFavorites/favorites.feature:190](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L190)
- [coreApiFavorites/favorites.feature:191](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L191)
- [coreApiFavorites/favoritesSharingToShares.feature:91](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L91)
- [coreApiFavorites/favoritesSharingToShares.feature:92](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L92)
- [coreApiFavorites/favoritesSharingToShares.feature:93](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L93)
#### [WWW-Authenticate header for unauthenticated requests is not clear](https://github.com/owncloud/ocis/issues/2285)
- [coreApiWebdavOperations/refuseAccess.feature:21](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavOperations/refuseAccess.feature#L21)
- [coreApiWebdavOperations/refuseAccess.feature:22](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavOperations/refuseAccess.feature#L22)
#### [PATCH request for TUS upload with wrong checksum gives incorrect response](https://github.com/owncloud/ocis/issues/1755)
- [coreApiWebdavUploadTUS/checksums.feature:74](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L74)
- [coreApiWebdavUploadTUS/checksums.feature:75](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L75)
- [coreApiWebdavUploadTUS/checksums.feature:76](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L76)
- [coreApiWebdavUploadTUS/checksums.feature:77](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L77)
- [coreApiWebdavUploadTUS/checksums.feature:78](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L78)
- [coreApiWebdavUploadTUS/checksums.feature:79](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L79)
- [coreApiWebdavUploadTUS/checksums.feature:147](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L147)
- [coreApiWebdavUploadTUS/checksums.feature:148](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L148)
- [coreApiWebdavUploadTUS/checksums.feature:149](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L149)
- [coreApiWebdavUploadTUS/checksums.feature:192](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L192)
- [coreApiWebdavUploadTUS/checksums.feature:193](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L193)
- [coreApiWebdavUploadTUS/checksums.feature:194](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L194)
- [coreApiWebdavUploadTUS/checksums.feature:195](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L195)
- [coreApiWebdavUploadTUS/checksums.feature:196](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L196)
- [coreApiWebdavUploadTUS/checksums.feature:197](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L197)
- [coreApiWebdavUploadTUS/checksums.feature:240](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L240)
- [coreApiWebdavUploadTUS/checksums.feature:241](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L241)
- [coreApiWebdavUploadTUS/checksums.feature:242](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L242)
- [coreApiWebdavUploadTUS/checksums.feature:243](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L243)
- [coreApiWebdavUploadTUS/checksums.feature:244](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L244)
- [coreApiWebdavUploadTUS/checksums.feature:245](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L245)
- [coreApiWebdavUploadTUS/uploadToShare.feature:255](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L255)
- [coreApiWebdavUploadTUS/uploadToShare.feature:256](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L256)
- [coreApiWebdavUploadTUS/uploadToShare.feature:279](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L279)
- [coreApiWebdavUploadTUS/uploadToShare.feature:280](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L280)
- [coreApiWebdavUploadTUS/uploadToShare.feature:376](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L376)
- [coreApiWebdavUploadTUS/uploadToShare.feature:377](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L377)
#### [Renaming resource to banned name is allowed in spaces webdav](https://github.com/owncloud/ocis/issues/3099)
- [coreApiWebdavMove2/moveFile.feature:143](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L143)
- [coreApiWebdavMove1/moveFolder.feature:36](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L36)
- [coreApiWebdavMove1/moveFolder.feature:50](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L50)
- [coreApiWebdavMove1/moveFolder.feature:64](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L64)
#### [Trying to delete other user's trashbin item returns 409 for spaces path instead of 404](https://github.com/owncloud/ocis/issues/9791)
- [coreApiTrashbin/trashbinDelete.feature:92](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinDelete.feature#L92)
#### [MOVE a file into same folder with same name returns 404 instead of 403](https://github.com/owncloud/ocis/issues/1976)
- [coreApiWebdavMove2/moveFile.feature:100](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L100)
- [coreApiWebdavMove2/moveFile.feature:101](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L101)
- [coreApiWebdavMove2/moveFile.feature:102](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L102)
- [coreApiWebdavMove1/moveFolder.feature:217](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L217)
- [coreApiWebdavMove1/moveFolder.feature:218](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L218)
- [coreApiWebdavMove1/moveFolder.feature:219](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L219)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:334](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L334)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:337](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L337)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:340](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L340)
#### [COPY file/folder to same name is possible (but 500 code error for folder with spaces path)](https://github.com/owncloud/ocis/issues/8711)
- [coreApiSharePublicLink2/copyFromPublicLink.feature:198](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiSharePublicLink2/copyFromPublicLink.feature#L198)
- [coreApiWebdavProperties/copyFile.feature:1094](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1094)
- [coreApiWebdavProperties/copyFile.feature:1095](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1095)
- [coreApiWebdavProperties/copyFile.feature:1096](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1096)
#### [Trying to restore a personal file onto a file in a received shared folder returns 403 but the shared file is deleted (new dav path)](https://github.com/owncloud/ocis/issues/10356)
- [coreApiTrashbin/trashbinSharingToShares.feature:277](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L277)
#### [Preview: UTF characters do not display in preview](https://github.com/opencloud-eu/opencloud/issues/1451)
@@ -370,11 +218,5 @@ _ocdav: api compatibility, return correct status code_
- [coreApiWebdavPreviews/previews.feature:264](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L264)
- [coreApiWebdavPreviews/previews.feature:265](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L265)
### Won't fix
Not everything needs to be implemented for OpenCloud.
- _Blacklisted ignored files are no longer required because OpenCloud can handle `.htaccess` files without the security implications introduced by serving user-provided files with Apache._
Note: always leave an empty line at the end of this file.
The bash script that processes this file requires the last line to end with a newline.

View File

@@ -1,4 +1,4 @@
## Scenarios from OpenCloud API tests that are expected to fail with posix storage
## Scenarios from OpenCloud API tests that are expected to fail with decomposed storage
#### [Downloading the archive of the resource (files | folder) using resource path is not possible](https://github.com/owncloud/ocis/issues/4637)
@@ -19,6 +19,8 @@
#### [Settings service user can list other peoples assignments](https://github.com/owncloud/ocis/issues/5032)
- [apiAccountsHashDifficulty/assignRole.feature:27](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAccountsHashDifficulty/assignRole.feature#L27)
- [apiAccountsHashDifficulty/assignRole.feature:28](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAccountsHashDifficulty/assignRole.feature#L28)
- [apiGraph/getAssignedRole.feature:31](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiGraph/getAssignedRole.feature#L31)
- [apiGraph/getAssignedRole.feature:32](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiGraph/getAssignedRole.feature#L32)
- [apiGraph/getAssignedRole.feature:33](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiGraph/getAssignedRole.feature#L33)
@@ -191,9 +193,9 @@
- [apiServiceAvailability/serviceAvailabilityCheck.feature:123](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiServiceAvailability/serviceAvailabilityCheck.feature#L123)
#### [Skip tests for different languages](https://github.com/opencloud-eu/opencloud/issues/183)
- [apiActivities/activities.feature:2598](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiActivities/activities.feature#L2598)
#### [Missing properties in REPORT response](https://github.com/owncloud/ocis/issues/9780), [d:getetag property has empty value in REPORT response](https://github.com/owncloud/ocis/issues/9783)
- [apiSearch1/search.feature:437](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiSearch1/search.feature#L437)
@@ -203,178 +205,5 @@
- [apiSearch1/search.feature:466](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiSearch1/search.feature#L466)
- [apiSearch1/search.feature:467](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiSearch1/search.feature#L467)
## Scenarios from core API tests that are expected to fail with posix storage
### File
Basic file management like upload and download, move, copy, properties, trash, versions and chunking.
#### [Custom dav properties with namespaces are rendered incorrectly](https://github.com/owncloud/ocis/issues/2140)
_ocdav: double-check the webdav property parsing when custom namespaces are used_
- [coreApiWebdavProperties/setFileProperties.feature:128](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L128)
- [coreApiWebdavProperties/setFileProperties.feature:129](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L129)
- [coreApiWebdavProperties/setFileProperties.feature:130](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/setFileProperties.feature#L130)
### Sync
Synchronization features like etag propagation, setting mtime and locking files.
#### [Uploading an old method chunked file with checksum should fail using new DAV path](https://github.com/owncloud/ocis/issues/2323)
- [coreApiMain/checksums.feature:233](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L233)
- [coreApiMain/checksums.feature:234](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L234)
- [coreApiMain/checksums.feature:235](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiMain/checksums.feature#L235)
### Share
#### [d:quota-available-bytes in d:prop of PROPFIND gives wrong response value](https://github.com/owncloud/ocis/issues/8197)
- [coreApiWebdavProperties/getQuota.feature:57](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L57)
- [coreApiWebdavProperties/getQuota.feature:58](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L58)
- [coreApiWebdavProperties/getQuota.feature:59](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L59)
- [coreApiWebdavProperties/getQuota.feature:73](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L73)
- [coreApiWebdavProperties/getQuota.feature:74](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L74)
- [coreApiWebdavProperties/getQuota.feature:75](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/getQuota.feature#L75)
#### [deleting a file inside a received shared folder moves it to the trash-bin of the sharer, not the receiver](https://github.com/owncloud/ocis/issues/1124)
- [coreApiTrashbin/trashbinSharingToShares.feature:54](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L54)
- [coreApiTrashbin/trashbinSharingToShares.feature:55](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L55)
- [coreApiTrashbin/trashbinSharingToShares.feature:56](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L56)
- [coreApiTrashbin/trashbinSharingToShares.feature:83](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L83)
- [coreApiTrashbin/trashbinSharingToShares.feature:84](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L84)
- [coreApiTrashbin/trashbinSharingToShares.feature:85](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L85)
- [coreApiTrashbin/trashbinSharingToShares.feature:142](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L142)
- [coreApiTrashbin/trashbinSharingToShares.feature:143](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L143)
- [coreApiTrashbin/trashbinSharingToShares.feature:144](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L144)
- [coreApiTrashbin/trashbinSharingToShares.feature:202](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L202)
- [coreApiTrashbin/trashbinSharingToShares.feature:203](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L203)
### Other
API, search, favorites, config, capabilities, non-existent endpoints, CORS and others
#### [sending MKCOL requests to another or non-existing user's webDav endpoints as normal user should return 404](https://github.com/owncloud/ocis/issues/5049)
_ocdav: api compatibility, return correct status code_
- [coreApiAuth/webDavMKCOLAuth.feature:42](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavMKCOLAuth.feature#L42)
- [coreApiAuth/webDavMKCOLAuth.feature:53](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavMKCOLAuth.feature#L53)
#### [trying to lock file of another user gives http 500](https://github.com/owncloud/ocis/issues/2176)
- [coreApiAuth/webDavLOCKAuth.feature:46](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavLOCKAuth.feature#L46)
- [coreApiAuth/webDavLOCKAuth.feature:58](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiAuth/webDavLOCKAuth.feature#L58)
#### [Support for favorites](https://github.com/owncloud/ocis/issues/1228)
- [coreApiFavorites/favorites.feature:101](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L101)
- [coreApiFavorites/favorites.feature:102](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L102)
- [coreApiFavorites/favorites.feature:103](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L103)
- [coreApiFavorites/favorites.feature:124](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L124)
- [coreApiFavorites/favorites.feature:125](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L125)
- [coreApiFavorites/favorites.feature:126](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L126)
- [coreApiFavorites/favorites.feature:145](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L145)
- [coreApiFavorites/favorites.feature:146](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L146)
- [coreApiFavorites/favorites.feature:147](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L147)
- [coreApiFavorites/favorites.feature:174](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L174)
- [coreApiFavorites/favorites.feature:175](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L175)
- [coreApiFavorites/favorites.feature:176](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L176)
- [coreApiFavorites/favorites.feature:189](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L189)
- [coreApiFavorites/favorites.feature:190](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L190)
- [coreApiFavorites/favorites.feature:191](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favorites.feature#L191)
- [coreApiFavorites/favoritesSharingToShares.feature:91](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L91)
- [coreApiFavorites/favoritesSharingToShares.feature:92](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L92)
- [coreApiFavorites/favoritesSharingToShares.feature:93](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiFavorites/favoritesSharingToShares.feature#L93)
#### [WWW-Authenticate header for unauthenticated requests is not clear](https://github.com/owncloud/ocis/issues/2285)
- [coreApiWebdavOperations/refuseAccess.feature:21](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavOperations/refuseAccess.feature#L21)
- [coreApiWebdavOperations/refuseAccess.feature:22](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavOperations/refuseAccess.feature#L22)
#### [PATCH request for TUS upload with wrong checksum gives incorrect response](https://github.com/owncloud/ocis/issues/1755)
- [coreApiWebdavUploadTUS/checksums.feature:74](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L74)
- [coreApiWebdavUploadTUS/checksums.feature:75](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L75)
- [coreApiWebdavUploadTUS/checksums.feature:76](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L76)
- [coreApiWebdavUploadTUS/checksums.feature:77](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L77)
- [coreApiWebdavUploadTUS/checksums.feature:78](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L78)
- [coreApiWebdavUploadTUS/checksums.feature:79](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L79)
- [coreApiWebdavUploadTUS/checksums.feature:147](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L147)
- [coreApiWebdavUploadTUS/checksums.feature:148](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L148)
- [coreApiWebdavUploadTUS/checksums.feature:149](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L149)
- [coreApiWebdavUploadTUS/checksums.feature:192](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L192)
- [coreApiWebdavUploadTUS/checksums.feature:193](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L193)
- [coreApiWebdavUploadTUS/checksums.feature:194](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L194)
- [coreApiWebdavUploadTUS/checksums.feature:195](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L195)
- [coreApiWebdavUploadTUS/checksums.feature:196](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L196)
- [coreApiWebdavUploadTUS/checksums.feature:197](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L197)
- [coreApiWebdavUploadTUS/checksums.feature:240](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L240)
- [coreApiWebdavUploadTUS/checksums.feature:241](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L241)
- [coreApiWebdavUploadTUS/checksums.feature:242](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L242)
- [coreApiWebdavUploadTUS/checksums.feature:243](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L243)
- [coreApiWebdavUploadTUS/checksums.feature:244](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L244)
- [coreApiWebdavUploadTUS/checksums.feature:245](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/checksums.feature#L245)
- [coreApiWebdavUploadTUS/uploadToShare.feature:255](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L255)
- [coreApiWebdavUploadTUS/uploadToShare.feature:256](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L256)
- [coreApiWebdavUploadTUS/uploadToShare.feature:279](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L279)
- [coreApiWebdavUploadTUS/uploadToShare.feature:280](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L280)
- [coreApiWebdavUploadTUS/uploadToShare.feature:376](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L376)
- [coreApiWebdavUploadTUS/uploadToShare.feature:377](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadToShare.feature#L377)
#### [Renaming resource to banned name is allowed in spaces webdav](https://github.com/owncloud/ocis/issues/3099)
- [coreApiWebdavMove2/moveFile.feature:143](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L143)
- [coreApiWebdavMove1/moveFolder.feature:36](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L36)
- [coreApiWebdavMove1/moveFolder.feature:50](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L50)
- [coreApiWebdavMove1/moveFolder.feature:64](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L64)
#### [Trying to delete other user's trashbin item returns 409 for spaces path instead of 404](https://github.com/owncloud/ocis/issues/9791)
- [coreApiTrashbin/trashbinDelete.feature:92](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinDelete.feature#L92)
#### [MOVE a file into same folder with same name returns 404 instead of 403](https://github.com/owncloud/ocis/issues/1976)
- [coreApiWebdavMove2/moveFile.feature:100](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L100)
- [coreApiWebdavMove2/moveFile.feature:101](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L101)
- [coreApiWebdavMove2/moveFile.feature:102](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveFile.feature#L102)
- [coreApiWebdavMove1/moveFolder.feature:217](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L217)
- [coreApiWebdavMove1/moveFolder.feature:218](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L218)
- [coreApiWebdavMove1/moveFolder.feature:219](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove1/moveFolder.feature#L219)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:334](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L334)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:337](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L337)
- [coreApiWebdavMove2/moveShareOnOpencloud.feature:340](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavMove2/moveShareOnOpencloud.feature#L340)
#### [COPY file/folder to same name is possible (but 500 code error for folder with spaces path)](https://github.com/owncloud/ocis/issues/8711)
- [coreApiSharePublicLink2/copyFromPublicLink.feature:198](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiSharePublicLink2/copyFromPublicLink.feature#L198)
- [coreApiWebdavProperties/copyFile.feature:1094](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1094)
- [coreApiWebdavProperties/copyFile.feature:1095](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1095)
- [coreApiWebdavProperties/copyFile.feature:1096](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavProperties/copyFile.feature#L1096)
#### [Trying to restore personal file to file of share received folder returns 403 but the share file is deleted (new dav path)](https://github.com/owncloud/ocis/issues/10356)
- [coreApiTrashbin/trashbinSharingToShares.feature:277](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiTrashbin/trashbinSharingToShares.feature#L277)
#### [Preview. UTF characters do not display on preview](https://github.com/opencloud-eu/opencloud/issues/1451)
- [coreApiWebdavPreviews/previews.feature:249](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L249)
- [coreApiWebdavPreviews/previews.feature:250](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L250)
- [coreApiWebdavPreviews/previews.feature:251](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L251)
#### [Preview of text file truncated](https://github.com/opencloud-eu/opencloud/issues/1452)
- [coreApiWebdavPreviews/previews.feature:263](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L263)
- [coreApiWebdavPreviews/previews.feature:264](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L264)
- [coreApiWebdavPreviews/previews.feature:265](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavPreviews/previews.feature#L265)
### Won't fix
Not everything needs to be implemented for OpenCloud.
- _Blacklisted ignored files are no longer required because OpenCloud can handle `.htaccess` files without the security implications introduced by serving user-provided files with Apache._
Note: always keep an empty line at the end of this file.
The bash script that processes this file requires the last line to end with a newline; a minimal check is sketched below.
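A minimal sketch of how that trailing newline could be verified and appended before committing; the file path is only an example taken from the test scripts and may need to be adjusted for the expected-failures file being edited:

```bash
#!/usr/bin/env bash
# Append a final newline to an expected-failures file if the last byte is not one,
# so the processing script always sees a properly terminated last line.
FILE="tests/acceptance/expected-failures-API-on-decomposed-storage.md"
if [ -n "$(tail -c 1 "$FILE")" ]; then
  printf '\n' >> "$FILE"
fi
```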

View File

@@ -201,9 +201,9 @@
- [apiAntivirus/antivirus.feature:143](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAntivirus/antivirus.feature#L143)
- [apiAntivirus/antivirus.feature:144](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAntivirus/antivirus.feature#L144)
- [apiAntivirus/antivirus.feature:145](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAntivirus/antivirus.feature#L145)
- [apiAntivirus/antivirus.feature:356](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAntivirus/antivirus.feature#L356)
- [apiAntivirus/antivirus.feature:357](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAntivirus/antivirus.feature#L357)
- [apiAntivirus/antivirus.feature:358](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAntivirus/antivirus.feature#L358)
- [apiAntivirus/antivirus.feature:359](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiAntivirus/antivirus.feature#L359)
- [apiCollaboration/wopi.feature:956](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiCollaboration/wopi.feature#L956)
- [apiCollaboration/wopi.feature:957](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiCollaboration/wopi.feature#L957)
- [apiCollaboration/wopi.feature:958](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/apiCollaboration/wopi.feature#L958)
@@ -320,6 +320,7 @@
- [coreApiWebdavUploadTUS/uploadFile.feature:122](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadFile.feature#L122)
- [coreApiWebdavUploadTUS/uploadFile.feature:133](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadFile.feature#L133)
- [coreApiWebdavUploadTUS/uploadFile.feature:146](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadFile.feature#L146)
- [coreApiWebdavUploadTUS/uploadFile.feature:168](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadFile.feature#L168)
- [coreApiWebdavUploadTUS/uploadFile.feature:187](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadFile.feature#L187)
- [coreApiWebdavUploadTUS/uploadFile.feature:199](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadFile.feature#L199)
- [coreApiWebdavUploadTUS/uploadFile.feature:212](https://github.com/opencloud-eu/opencloud/blob/main/tests/acceptance/features/coreApiWebdavUploadTUS/uploadFile.feature#L212)

View File

@@ -0,0 +1,16 @@
@skipOnReva
Feature: add user
As an admin
I want to be able to add users and store their password with the full hash difficulty
So that I can give people controlled individual access to resources on the OpenCloud server
Scenario: admin creates a user
When the user "Admin" creates a new user with the following attributes using the Graph API:
| userName | brand-new-user |
| displayName | Brand New User |
| email | new@example.org |
| password | %alt1% |
Then the HTTP status code should be "201"
And user "brand-new-user" should exist
And user "brand-new-user" should be able to upload file "filesForUpload/lorem.txt" to "lorem.txt"

View File

@@ -0,0 +1,64 @@
Feature: assign role
As an admin,
I want to assign roles to users
So that I can provide them different authority
Scenario Outline: only admin user can see all existing roles
Given user "Alice" has been created with default attributes
And the administrator has given "Alice" the role "<user-role>" using the settings api
When user "Alice" tries to get all existing roles using the settings API
Then the HTTP status code should be "<http-status-code>"
Examples:
| user-role | http-status-code |
| Admin | 201 |
| Space Admin | 201 |
| User | 201 |
@issue-5032
Scenario Outline: only admin user can see assignments list
Given user "Alice" has been created with default attributes
And the administrator has given "Alice" the role "<user-role>" using the settings api
When user "Alice" tries to get list of assignment using the settings API
Then the HTTP status code should be "<http-status-code>"
Examples:
| user-role | http-status-code |
| Admin | 201 |
| Space Admin | 401 |
| User | 401 |
Scenario Outline: a user cannot change own role
Given user "Alice" has been created with default attributes
And the administrator has given "Alice" the role "<user-role>" using the settings api
When user "Alice" changes his own role to "<desired-role>"
Then the HTTP status code should be "400"
And user "Alice" should have the role "<user-role>"
Examples:
| user-role | desired-role |
| Admin | User |
| Admin | Space Admin |
| Space Admin | Admin |
| Space Admin | Space Admin |
| User | Admin |
| User | Space Admin |
Scenario Outline: only admin user can change the role for another user
Given these users have been created with default attributes:
| username |
| Alice |
| Brian |
And the administrator has given "Alice" the role "<user-role>" using the settings api
When user "Alice" changes the role "<desired-role>" for user "Brian"
Then the HTTP status code should be "<http-status-code>"
And user "Brian" should have the role "<expected-role>"
Examples:
| user-role | desired-role | http-status-code | expected-role |
| Admin | User | 201 | User |
| Admin | Space Admin | 201 | Space Admin |
| Admin | Admin | 201 | Admin |
| Space Admin | Admin | 400 | User |
| Space Admin | Space Admin | 400 | User |
| User | Admin | 400 | User |
| User | Space Admin | 400 | User |

View File

@@ -0,0 +1,18 @@
@skipOnReva
Feature: sharing
As a user
I want to be able to share files when passwords are stored with the full hash difficulty
So that I can give people secure controlled access to my data
Scenario Outline: creating a share of a file with a user
Given using OCS API version "<ocs-api-version>"
And user "Alice" has been created with default attributes
And user "Alice" has uploaded file with content "OpenCloud test text file 0" to "/textfile0.txt"
And user "Brian" has been created with default attributes
When user "Alice" shares file "textfile0.txt" with user "Brian" using the sharing API
And the content of file "/Shares/textfile0.txt" for user "Brian" should be "OpenCloud test text file 0"
Examples:
| ocs-api-version |
| 1 |
| 2 |

View File

@@ -0,0 +1,21 @@
@skipOnReva
Feature: upload file
As a user
I want to be able to upload files when passwords are stored with the full hash difficulty
So that I can store and share files securely between multiple client systems
Scenario Outline: upload a file and check download content
Given using OCS API version "<ocs-api-version>"
And user "Alice" has been created with default attributes
And using <dav-path-version> DAV path
When user "Alice" uploads file with content "uploaded content" to "/upload.txt" using the WebDAV API
Then the content of file "/upload.txt" for user "Alice" should be "uploaded content"
Examples:
| ocs-api-version | dav-path-version |
| 1 | old |
| 1 | new |
| 1 | spaces |
| 2 | old |
| 2 | new |
| 2 | spaces |

View File

@@ -0,0 +1,31 @@
@skipOnReva
Feature: attempt to PUT files with invalid password
As an admin
I want the system to be secure when passwords are stored with the full hash difficulty
So that unauthorised users do not have access to data
Background:
Given user "Alice" has been created with default attributes
And user "Alice" has created folder "/PARENT"
Scenario: send PUT requests to webDav endpoints as normal user with wrong password
When user "Alice" requests these endpoints with "PUT" including body "doesnotmatter" using password "invalid" about user "Alice"
| endpoint |
| /webdav/textfile0.txt |
| /dav/files/%username%/textfile0.txt |
| /webdav/PARENT |
| /dav/files/%username%/PARENT |
| /dav/files/%username%/PARENT/parent.txt |
Then the HTTP status code of responses on all endpoints should be "401"
Scenario: send PUT requests to webDav endpoints as normal user with no password
When user "Alice" requests these endpoints with "PUT" including body "doesnotmatter" using password "" about user "Alice"
| endpoint |
| /webdav/textfile0.txt |
| /dav/files/%username%/textfile0.txt |
| /webdav/PARENT |
| /dav/files/%username%/PARENT |
| /dav/files/%username%/PARENT/parent.txt |
Then the HTTP status code of responses on all endpoints should be "401"

View File

@@ -21,7 +21,7 @@ Feature: List upload sessions via CLI command
And the CLI response should not contain these entries:
| file0.txt |
@antivirus
Scenario: list all upload sessions that are currently in postprocessing
Given the following configs have been set:
| config | value |
@@ -39,7 +39,7 @@ Feature: List upload sessions via CLI command
And the CLI response should not contain these entries:
| virusFile.txt |
@antivirus
Scenario: list all upload sessions that are infected by virus
Given the following configs have been set:
| config | value |
@@ -109,7 +109,7 @@ Feature: List upload sessions via CLI command
And the CLI response should not contain these entries:
| file2.txt |
@antivirus
Scenario: clean all upload sessions that are not in post-processing
Given the following configs have been set:
| config | value |
@@ -126,7 +126,7 @@ Feature: List upload sessions via CLI command
And the CLI response should not contain these entries:
| file1.txt |
@antivirus
Scenario: clean upload sessions that are not in post-processing and is not virus infected
Given the following configs have been set:
| config | value |

View File

@@ -0,0 +1,19 @@
#!/bin/bash
# tests/acceptance/scripts/generate-virus-files.sh
set -e
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
TARGET_DIR="$SCRIPT_DIR/../filesForUpload/filesWithVirus"
echo "Generating EICAR test files..."
mkdir -p "$TARGET_DIR"
cd "$TARGET_DIR"
echo "Downloading eicar.com..."
curl -s -o eicar.com https://secure.eicar.org/eicar.com
echo "Downloading eicar_com.zip..."
curl -s -o eicar_com.zip https://secure.eicar.org/eicar_com.zip

View File

@@ -1,32 +1,4 @@
#!/usr/bin/env bash
# Terminal colors
if [ -n "${PLAIN_OUTPUT}" ]; then
# No colors
TC_GREEN=""
TC_RED=""
TC_CYAN=""
TC_RESET=""
else
# Colors
TC_GREEN="\033[32m"
TC_RED="\033[31m"
TC_CYAN="\033[36m"
TC_RESET="\033[0m"
fi
function log_failed(){
printf "${TC_RED}[FAILED] %s\n${TC_RESET}" "$1"
}
function log_info(){
printf "${TC_CYAN}[INFO] %s\n${TC_RESET}" "$1"
}
function log_error(){
printf "${TC_RED}[ERROR] %s\n${TC_RESET}" "$1"
}
[[ "${DEBUG}" == "true" ]] && set -x
# from http://stackoverflow.com/a/630387
@@ -59,8 +31,15 @@ if [ -n "${PLAIN_OUTPUT}" ]
then
# explicitly tell Behat to not do colored output
COLORS_OPTION="--no-colors"
# Use the Bash "null" command to do nothing, rather than use tput to set a color
RED_COLOR=":"
GREEN_COLOR=":"
YELLOW_COLOR=":"
else
COLORS_OPTION="--colors"
RED_COLOR="tput setaf 1"
GREEN_COLOR="tput setaf 2"
YELLOW_COLOR="tput setaf 3"
fi
# The following environment variables can be specified:
@@ -82,7 +61,7 @@ then
LINT_STATUS=$?
if [ ${LINT_STATUS} -ne 0 ]
then
log_error "Expected failures file ${EXPECTED_FAILURES_FILE} is invalid"
echo "Error: expected failures file ${EXPECTED_FAILURES_FILE} is invalid"
exit ${LINT_STATUS}
fi
fi
@@ -245,7 +224,7 @@ function run_behat_tests() {
# So exit the tests and do not lint expected failures when undefined steps exist.
if [[ ${SCENARIO_RESULTS} == *"undefined"* ]]
then
log_error "Undefined steps: There were some undefined steps found."
${RED_COLOR}; echo -e "Undefined steps: There were some undefined steps found."
exit 1
fi
# If there were no scenarios in the requested suite or feature that match
@@ -258,7 +237,7 @@ function run_behat_tests() {
MATCHING_COUNT=`grep -ca '^No scenarios$' ${TEST_LOG_FILE}`
if [ ${MATCHING_COUNT} -eq 1 ]
then
log_info "No matching scenarios were found."
echo "Information: no matching scenarios were found."
BEHAT_EXIT_STATUS=0
else
# Find the count of scenarios that passed and failed
@@ -301,9 +280,9 @@ function run_behat_tests() {
then
if [ -n "${BEHAT_SUITE_TO_RUN}" ]
then
log_info "Checking expected failures for suite: ${BEHAT_SUITE_TO_RUN}"
echo "Checking expected failures for suite ${BEHAT_SUITE_TO_RUN}"
else
log_info "Checking expected failures..."
echo "Checking expected failures"
fi
# Check that every failed scenario is in the list of expected failures
@@ -316,7 +295,7 @@ function run_behat_tests() {
grep "\[${SUITE_SCENARIO}\]" "${EXPECTED_FAILURES_FILE}" > /dev/null
if [ $? -ne 0 ]
then
log_error "Scenario ${SUITE_SCENARIO} failed but was not expected to fail."
echo "Error: Scenario ${SUITE_SCENARIO} failed but was not expected to fail."
UNEXPECTED_FAILED_SCENARIOS+=("${SUITE_SCENARIO}")
fi
done
@@ -357,7 +336,7 @@ function run_behat_tests() {
echo "${FAILED_SCENARIO_PATHS}" | grep ${SUITE_SCENARIO}$ > /dev/null
if [ $? -ne 0 ]
then
log_error "Scenario ${SUITE_SCENARIO} was expected to fail but did not fail."
echo "Info: Scenario ${SUITE_SCENARIO} was expected to fail but did not fail."
UNEXPECTED_PASSED_SCENARIOS+=("${SUITE_SCENARIO}")
fi
done < ${EXPECTED_FAILURES_FILE}
@@ -394,7 +373,7 @@ function run_behat_tests() {
:
else
echo ""
log_info "The following tests were skipped because they are tagged @skip:"
echo "The following tests were skipped because they are tagged @skip:"
cat "${DRY_RUN_FILE}" | tee -a ${TEST_LOG_FILE}
fi
rm -f "${DRY_RUN_FILE}"
@@ -547,6 +526,7 @@ fi
export IPV4_URL
export IPV6_URL
export FILES_FOR_UPLOAD="${SCRIPT_PATH}/filesForUpload/"
TEST_LOG_FILE=$(mktemp)
SCENARIOS_THAT_PASSED=0
@@ -588,6 +568,10 @@ for i in "${!BEHAT_SUITES[@]}"
done
done
TOTAL_SCENARIOS=$((SCENARIOS_THAT_PASSED + SCENARIOS_THAT_FAILED))
echo "runsh: Total ${TOTAL_SCENARIOS} scenarios (${SCENARIOS_THAT_PASSED} passed, ${SCENARIOS_THAT_FAILED} failed)"
# 3 types of things can have gone wrong:
# - some scenario failed (and it was not expected to fail)
# - some scenario passed (but it was expected to fail)
@@ -659,42 +643,37 @@ fi
if [ -n "${EXPECTED_FAILURES_FILE}" ]
then
log_info "Exit code after checking expected failures: ${FINAL_EXIT_STATUS}"
echo "runsh: Exit code after checking expected failures: ${FINAL_EXIT_STATUS}"
fi
if [ "${UNEXPECTED_FAILURE}" = true ]
then
log_failed "Total unexpected failed scenarios:"
printf "${TC_RED}- %s\n${TC_RESET}" "${UNEXPECTED_FAILED_SCENARIOS[@]}"
echo ""
${YELLOW_COLOR}; echo "runsh: Total unexpected failed scenarios throughout the test run:"
${RED_COLOR}; printf "%s\n" "${UNEXPECTED_FAILED_SCENARIOS[@]}"
else
log_info "There were no unexpected failures."
${GREEN_COLOR}; echo "runsh: There were no unexpected failures."
fi
if [ "${UNEXPECTED_SUCCESS}" = true ]
then
log_error "Total unexpected passed scenarios:"
printf "${TC_GREEN}- %s\n${TC_RESET}" "${ACTUAL_UNEXPECTED_PASS[@]}"
echo ""
${YELLOW_COLOR}; echo "runsh: Total unexpected passed scenarios throughout the test run:"
${RED_COLOR}; printf "%s\n" "${ACTUAL_UNEXPECTED_PASS[@]}"
else
log_info "There were no unexpected success."
${GREEN_COLOR}; echo "runsh: There were no unexpected success."
fi
if [ "${UNEXPECTED_BEHAT_EXIT_STATUS}" = true ]
then
log_error "The following Behat test runs exited with non-zero status:"
printf "${TC_RED}%s\n${TC_RESET}" "${UNEXPECTED_BEHAT_EXIT_STATUSES[@]}"
${YELLOW_COLOR}; echo "runsh: The following Behat test runs exited with non-zero status:"
${RED_COLOR}; printf "%s\n" "${UNEXPECTED_BEHAT_EXIT_STATUSES[@]}"
fi
TOTAL_SCENARIOS=$((SCENARIOS_THAT_PASSED + SCENARIOS_THAT_FAILED))
printf "Summary: %s scenarios (${TC_GREEN}%s passed${TC_RESET}, ${TC_RED}%s failed${TC_RESET})" "${TOTAL_SCENARIOS}" "${SCENARIOS_THAT_PASSED}" "${SCENARIOS_THAT_FAILED}"
echo ""
# # sync the file-system so all output will be flushed to storage.
# # In CI, we sometimes see that the last lines of output are missing.
# # In drone we sometimes see that the last lines of output are missing from the
# # drone log.
# sync
# # If we are running in CI, then sleep for a bit to (hopefully) let the
# # If we are running in drone CI, then sleep for a bit to (hopefully) let the
# # drone agent send all the output to the drone server.
# if [ -n "${CI_REPO}" ]
# then

View File

@@ -1,7 +1,7 @@
#!/bin/bash
# Set required environment variables
export START_TIKA=true
export LOCAL_TEST=true
export START_EMAIL=true
export WITH_WRAPPER=true
export STORAGE_DRIVER=${STORAGE_DRIVER:-posix}
@@ -10,15 +10,11 @@ export TEST_ROOT_PATH="/drone/src/tests"
# LOCAL TEST WITHOUT EXTRA ENVS
TEST_SERVER_URL="https://opencloud-server:9200"
OC_WRAPPER_URL="http://opencloud-server:5200"
if [ "$STORAGE_DRIVER" = "posix" ]; then
EXPECTED_FAILURES_FILE="tests/acceptance/expected-failures-posix-storage.md"
else
EXPECTED_FAILURES_FILE="tests/acceptance/expected-failures-decomposed-storage.md"
fi
EXPECTED_FAILURES_FILE="tests/acceptance/expected-failures-localAPI-on-decomposed-storage.md"
EXPECTED_FAILURES_FILE_FROM_CORE="tests/acceptance/expected-failures-API-on-decomposed-storage.md"
# Start server
make -C tests/acceptance/docker start-services
make -C tests/acceptance/docker start-server
# Wait until the server responds with HTTP 200
echo "Waiting for server to start..."
@@ -64,6 +60,7 @@ SUITES=(
"apiSharingNgShareInvitation"
"apiSharingNgLinkSharePermission"
"apiSharingNgLinkShareRoot"
"apiAccountsHashDifficulty"
"apiSearchContent"
"apiNotification"
)
@@ -142,7 +139,7 @@ for SUITE in "${CORE_SUITES[@]}"; do
LOG_FILE="$LOG_DIR/${SUITE}.log"
# Run suite
make test-acceptance-api TEST_SERVER_URL=$TEST_SERVER_URL EXPECTED_FAILURES_FILE=$EXPECTED_FAILURES_FILE BEHAT_SUITE=$SUITE SEND_SCENARIO_LINE_REFERENCES=true > "$LOG_FILE" 2>&1
make test-acceptance-api TEST_SERVER_URL=$TEST_SERVER_URL EXPECTED_FAILURES_FILE=$EXPECTED_FAILURES_FILE_FROM_CORE BEHAT_SUITE=$SUITE SEND_SCENARIO_LINE_REFERENCES=true > "$LOG_FILE" 2>&1
# Check if suite was successful
if [ $? -eq 0 ]; then

View File

@@ -1,13 +1,13 @@
#!/bin/bash
# Set required environment variables
export START_TIKA=true
export LOCAL_TEST=true
export WITH_WRAPPER=false
TEST_SERVER_URL="https://opencloud-server:9200"
# Start server
make -C tests/acceptance/docker start-services
make -C tests/acceptance/docker start-server
# Wait until the server responds with HTTP 200
echo "Waiting for server to start..."

View File

@@ -3476,9 +3476,6 @@ type JSONSchema_FieldConfiguration struct {
// parameter. Use this to avoid having auto generated path parameter names
// for overlapping paths.
PathParamName string `protobuf:"bytes,47,opt,name=path_param_name,json=pathParamName,proto3" json:"path_param_name,omitempty"`
// Declares this field to be deprecated. Allows for the generated OpenAPI
// parameter to be marked as deprecated without affecting the proto field.
Deprecated bool `protobuf:"varint,49,opt,name=deprecated,proto3" json:"deprecated,omitempty"`
unknownFields protoimpl.UnknownFields
sizeCache protoimpl.SizeCache
}
@@ -3515,21 +3512,10 @@ func (x *JSONSchema_FieldConfiguration) GetPathParamName() string {
return ""
}
func (x *JSONSchema_FieldConfiguration) GetDeprecated() bool {
if x != nil {
return x.Deprecated
}
return false
}
func (x *JSONSchema_FieldConfiguration) SetPathParamName(v string) {
x.PathParamName = v
}
func (x *JSONSchema_FieldConfiguration) SetDeprecated(v bool) {
x.Deprecated = v
}
type JSONSchema_FieldConfiguration_builder struct {
_ [0]func() // Prevents comparability and use of unkeyed literals for the builder.
@@ -3538,9 +3524,6 @@ type JSONSchema_FieldConfiguration_builder struct {
// parameter. Use this to avoid having auto generated path parameter names
// for overlapping paths.
PathParamName string
// Declares this field to be deprecated. Allows for the generated OpenAPI
// parameter to be marked as deprecated without affecting the proto field.
Deprecated bool
}
func (b0 JSONSchema_FieldConfiguration_builder) Build() *JSONSchema_FieldConfiguration {
@@ -3548,7 +3531,6 @@ func (b0 JSONSchema_FieldConfiguration_builder) Build() *JSONSchema_FieldConfigu
b, x := &b0, m0
_, _ = b, x
x.PathParamName = b.PathParamName
x.Deprecated = b.Deprecated
return m0
}
@@ -3922,7 +3904,7 @@ var file_protoc_gen_openapiv2_options_openapiv2_proto_rawDesc = []byte{
0x79, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x03, 0x6b, 0x65, 0x79, 0x12, 0x2c, 0x0a, 0x05,
0x76, 0x61, 0x6c, 0x75, 0x65, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x16, 0x2e, 0x67, 0x6f,
0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x56, 0x61,
0x6c, 0x75, 0x65, 0x52, 0x05, 0x76, 0x61, 0x6c, 0x75, 0x65, 0x3a, 0x02, 0x38, 0x01, 0x22, 0xf7,
0x6c, 0x75, 0x65, 0x52, 0x05, 0x76, 0x61, 0x6c, 0x75, 0x65, 0x3a, 0x02, 0x38, 0x01, 0x22, 0xd7,
0x0a, 0x0a, 0x0a, 0x4a, 0x53, 0x4f, 0x4e, 0x53, 0x63, 0x68, 0x65, 0x6d, 0x61, 0x12, 0x10, 0x0a,
0x03, 0x72, 0x65, 0x66, 0x18, 0x03, 0x20, 0x01, 0x28, 0x09, 0x52, 0x03, 0x72, 0x65, 0x66, 0x12,
0x14, 0x0a, 0x05, 0x74, 0x69, 0x74, 0x6c, 0x65, 0x18, 0x05, 0x20, 0x01, 0x28, 0x09, 0x52, 0x05,
@@ -3986,13 +3968,11 @@ var file_protoc_gen_openapiv2_options_openapiv2_proto_rawDesc = []byte{
0x67, 0x65, 0x6e, 0x5f, 0x6f, 0x70, 0x65, 0x6e, 0x61, 0x70, 0x69, 0x76, 0x32, 0x2e, 0x6f, 0x70,
0x74, 0x69, 0x6f, 0x6e, 0x73, 0x2e, 0x4a, 0x53, 0x4f, 0x4e, 0x53, 0x63, 0x68, 0x65, 0x6d, 0x61,
0x2e, 0x45, 0x78, 0x74, 0x65, 0x6e, 0x73, 0x69, 0x6f, 0x6e, 0x73, 0x45, 0x6e, 0x74, 0x72, 0x79,
0x52, 0x0a, 0x65, 0x78, 0x74, 0x65, 0x6e, 0x73, 0x69, 0x6f, 0x6e, 0x73, 0x1a, 0x5c, 0x0a, 0x12,
0x52, 0x0a, 0x65, 0x78, 0x74, 0x65, 0x6e, 0x73, 0x69, 0x6f, 0x6e, 0x73, 0x1a, 0x3c, 0x0a, 0x12,
0x46, 0x69, 0x65, 0x6c, 0x64, 0x43, 0x6f, 0x6e, 0x66, 0x69, 0x67, 0x75, 0x72, 0x61, 0x74, 0x69,
0x6f, 0x6e, 0x12, 0x26, 0x0a, 0x0f, 0x70, 0x61, 0x74, 0x68, 0x5f, 0x70, 0x61, 0x72, 0x61, 0x6d,
0x5f, 0x6e, 0x61, 0x6d, 0x65, 0x18, 0x2f, 0x20, 0x01, 0x28, 0x09, 0x52, 0x0d, 0x70, 0x61, 0x74,
0x68, 0x50, 0x61, 0x72, 0x61, 0x6d, 0x4e, 0x61, 0x6d, 0x65, 0x12, 0x1e, 0x0a, 0x0a, 0x64, 0x65,
0x70, 0x72, 0x65, 0x63, 0x61, 0x74, 0x65, 0x64, 0x18, 0x31, 0x20, 0x01, 0x28, 0x08, 0x52, 0x0a,
0x64, 0x65, 0x70, 0x72, 0x65, 0x63, 0x61, 0x74, 0x65, 0x64, 0x1a, 0x55, 0x0a, 0x0f, 0x45, 0x78,
0x68, 0x50, 0x61, 0x72, 0x61, 0x6d, 0x4e, 0x61, 0x6d, 0x65, 0x1a, 0x55, 0x0a, 0x0f, 0x45, 0x78,
0x74, 0x65, 0x6e, 0x73, 0x69, 0x6f, 0x6e, 0x73, 0x45, 0x6e, 0x74, 0x72, 0x79, 0x12, 0x10, 0x0a,
0x03, 0x6b, 0x65, 0x79, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x03, 0x6b, 0x65, 0x79, 0x12,
0x2c, 0x0a, 0x05, 0x76, 0x61, 0x6c, 0x75, 0x65, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x16,

View File

@@ -612,9 +612,6 @@ message JSONSchema {
// parameter. Use this to avoid having auto generated path parameter names
// for overlapping paths.
string path_param_name = 47;
// Declares this field to be deprecated. Allows for the generated OpenAPI
// parameter to be marked as deprecated without affecting the proto field.
bool deprecated = 49;
}
// Custom properties that start with "x-" such as "x-foo" used to describe
// extra functionality that is not covered by the standard OpenAPI Specification.

View File

@@ -3268,7 +3268,6 @@ func (b0 Scopes_builder) Build() *Scopes {
type JSONSchema_FieldConfiguration struct {
state protoimpl.MessageState `protogen:"opaque.v1"`
xxx_hidden_PathParamName string `protobuf:"bytes,47,opt,name=path_param_name,json=pathParamName,proto3" json:"path_param_name,omitempty"`
xxx_hidden_Deprecated bool `protobuf:"varint,49,opt,name=deprecated,proto3" json:"deprecated,omitempty"`
unknownFields protoimpl.UnknownFields
sizeCache protoimpl.SizeCache
}
@@ -3305,21 +3304,10 @@ func (x *JSONSchema_FieldConfiguration) GetPathParamName() string {
return ""
}
func (x *JSONSchema_FieldConfiguration) GetDeprecated() bool {
if x != nil {
return x.xxx_hidden_Deprecated
}
return false
}
func (x *JSONSchema_FieldConfiguration) SetPathParamName(v string) {
x.xxx_hidden_PathParamName = v
}
func (x *JSONSchema_FieldConfiguration) SetDeprecated(v bool) {
x.xxx_hidden_Deprecated = v
}
type JSONSchema_FieldConfiguration_builder struct {
_ [0]func() // Prevents comparability and use of unkeyed literals for the builder.
@@ -3328,9 +3316,6 @@ type JSONSchema_FieldConfiguration_builder struct {
// parameter. Use this to avoid having auto generated path parameter names
// for overlapping paths.
PathParamName string
// Declares this field to be deprecated. Allows for the generated OpenAPI
// parameter to be marked as deprecated without affecting the proto field.
Deprecated bool
}
func (b0 JSONSchema_FieldConfiguration_builder) Build() *JSONSchema_FieldConfiguration {
@@ -3338,7 +3323,6 @@ func (b0 JSONSchema_FieldConfiguration_builder) Build() *JSONSchema_FieldConfigu
b, x := &b0, m0
_, _ = b, x
x.xxx_hidden_PathParamName = b.PathParamName
x.xxx_hidden_Deprecated = b.Deprecated
return m0
}
@@ -3712,7 +3696,7 @@ var file_protoc_gen_openapiv2_options_openapiv2_proto_rawDesc = []byte{
0x79, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x03, 0x6b, 0x65, 0x79, 0x12, 0x2c, 0x0a, 0x05,
0x76, 0x61, 0x6c, 0x75, 0x65, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x16, 0x2e, 0x67, 0x6f,
0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x56, 0x61,
0x6c, 0x75, 0x65, 0x52, 0x05, 0x76, 0x61, 0x6c, 0x75, 0x65, 0x3a, 0x02, 0x38, 0x01, 0x22, 0xf7,
0x6c, 0x75, 0x65, 0x52, 0x05, 0x76, 0x61, 0x6c, 0x75, 0x65, 0x3a, 0x02, 0x38, 0x01, 0x22, 0xd7,
0x0a, 0x0a, 0x0a, 0x4a, 0x53, 0x4f, 0x4e, 0x53, 0x63, 0x68, 0x65, 0x6d, 0x61, 0x12, 0x10, 0x0a,
0x03, 0x72, 0x65, 0x66, 0x18, 0x03, 0x20, 0x01, 0x28, 0x09, 0x52, 0x03, 0x72, 0x65, 0x66, 0x12,
0x14, 0x0a, 0x05, 0x74, 0x69, 0x74, 0x6c, 0x65, 0x18, 0x05, 0x20, 0x01, 0x28, 0x09, 0x52, 0x05,
@@ -3776,13 +3760,11 @@ var file_protoc_gen_openapiv2_options_openapiv2_proto_rawDesc = []byte{
0x67, 0x65, 0x6e, 0x5f, 0x6f, 0x70, 0x65, 0x6e, 0x61, 0x70, 0x69, 0x76, 0x32, 0x2e, 0x6f, 0x70,
0x74, 0x69, 0x6f, 0x6e, 0x73, 0x2e, 0x4a, 0x53, 0x4f, 0x4e, 0x53, 0x63, 0x68, 0x65, 0x6d, 0x61,
0x2e, 0x45, 0x78, 0x74, 0x65, 0x6e, 0x73, 0x69, 0x6f, 0x6e, 0x73, 0x45, 0x6e, 0x74, 0x72, 0x79,
0x52, 0x0a, 0x65, 0x78, 0x74, 0x65, 0x6e, 0x73, 0x69, 0x6f, 0x6e, 0x73, 0x1a, 0x5c, 0x0a, 0x12,
0x52, 0x0a, 0x65, 0x78, 0x74, 0x65, 0x6e, 0x73, 0x69, 0x6f, 0x6e, 0x73, 0x1a, 0x3c, 0x0a, 0x12,
0x46, 0x69, 0x65, 0x6c, 0x64, 0x43, 0x6f, 0x6e, 0x66, 0x69, 0x67, 0x75, 0x72, 0x61, 0x74, 0x69,
0x6f, 0x6e, 0x12, 0x26, 0x0a, 0x0f, 0x70, 0x61, 0x74, 0x68, 0x5f, 0x70, 0x61, 0x72, 0x61, 0x6d,
0x5f, 0x6e, 0x61, 0x6d, 0x65, 0x18, 0x2f, 0x20, 0x01, 0x28, 0x09, 0x52, 0x0d, 0x70, 0x61, 0x74,
0x68, 0x50, 0x61, 0x72, 0x61, 0x6d, 0x4e, 0x61, 0x6d, 0x65, 0x12, 0x1e, 0x0a, 0x0a, 0x64, 0x65,
0x70, 0x72, 0x65, 0x63, 0x61, 0x74, 0x65, 0x64, 0x18, 0x31, 0x20, 0x01, 0x28, 0x08, 0x52, 0x0a,
0x64, 0x65, 0x70, 0x72, 0x65, 0x63, 0x61, 0x74, 0x65, 0x64, 0x1a, 0x55, 0x0a, 0x0f, 0x45, 0x78,
0x68, 0x50, 0x61, 0x72, 0x61, 0x6d, 0x4e, 0x61, 0x6d, 0x65, 0x1a, 0x55, 0x0a, 0x0f, 0x45, 0x78,
0x74, 0x65, 0x6e, 0x73, 0x69, 0x6f, 0x6e, 0x73, 0x45, 0x6e, 0x74, 0x72, 0x79, 0x12, 0x10, 0x0a,
0x03, 0x6b, 0x65, 0x79, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x03, 0x6b, 0x65, 0x79, 0x12,
0x2c, 0x0a, 0x05, 0x76, 0x61, 0x6c, 0x75, 0x65, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x16,

View File

@@ -163,20 +163,15 @@ func (self *Image) populate_from_gif(g *gif.GIF) {
Delay: gifmeta.CalculateFrameDelay(g.Delay[i], min_gap),
}
switch prev_disposal {
case gif.DisposalNone, 0: // 1
case gif.DisposalNone, 0:
frame.ComposeOnto = frame.Number - 1
case gif.DisposalPrevious: // 3
case gif.DisposalPrevious:
frame.ComposeOnto = prev_compose_onto
case gif.DisposalBackground: // 2
if i > 0 && g.Delay[i-1] == 0 {
// this is in contravention of the GIF spec but browsers and
// gif2apng both do this, so follow them. Test images for this
// are apple.gif and disposal-background-with-delay.gif
frame.ComposeOnto = frame.Number - 1
} else {
// delay present, frame visible, so clear to background as the spec requires
frame.ComposeOnto = 0
}
case gif.DisposalBackground:
// this is in contravention of the GIF spec but browsers and
// gif2apng both do this, so follow them. Test image for this
// is apple.gif
frame.ComposeOnto = frame.Number - 1
}
prev_disposal, prev_compose_onto = g.Disposal[i], frame.ComposeOnto
self.Frames = append(self.Frames, &frame)

View File

@@ -5,7 +5,7 @@ import os
import subprocess
VERSION = "1.8.19"
VERSION = "1.8.18"
def run(*args: str):

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

@@ -719,18 +719,6 @@ opa_strings_upper,opa_abort
opa_strings_upper,opa_unicode_to_upper
opa_strings_upper,opa_realloc
opa_strings_upper,opa_unicode_encode_utf8
to_string,opa_value_type
to_string,opa_string_terminated
to_string,opa_value_dump
to_string,opa_strlen
to_string,opa_string_allocated
opa_template_string,opa_value_type
opa_template_string,opa_array_with_cap
opa_template_string,to_string
opa_template_string,opa_array_append
opa_template_string,opa_malloc
opa_template_string,memcpy
opa_template_string,opa_string_allocated
opa_types_is_number,opa_value_type
opa_types_is_number,opa_boolean
opa_types_is_string,opa_value_type

View File

Binary file not shown.

View File

@@ -162,7 +162,6 @@ var builtinsFunctions = map[string]string{
ast.TrimRight.Name: "opa_strings_trim_right",
ast.TrimSuffix.Name: "opa_strings_trim_suffix",
ast.TrimSpace.Name: "opa_strings_trim_space",
ast.InternalTemplateString.Name: "opa_template_string",
ast.NumbersRange.Name: "opa_numbers_range",
ast.ToNumber.Name: "opa_to_number",
ast.WalkBuiltin.Name: "opa_value_transitive_closure",

View File

@@ -1768,7 +1768,7 @@ func (p *Planner) planRef(ref ast.Ref, iter planiter) error {
return errors.New("illegal ref: non-var head")
}
if head.Equal(ast.DefaultRootDocument.Value) {
if head.Compare(ast.DefaultRootDocument.Value) == 0 {
virtual := p.rules.Get(ref[0].Value)
base := &baseptr{local: p.vars.GetOrEmpty(ast.DefaultRootDocument.Value.(ast.Var))}
return p.planRefData(virtual, base, ref, 1, iter)

View File

@@ -83,7 +83,7 @@ func readModule(r io.Reader) (*module.Module, error) {
var m module.Module
if err := readSections(r, &m); err != io.EOF {
if err := readSections(r, &m); err != nil && err != io.EOF {
return nil, err
}

View File

@@ -433,7 +433,18 @@ func (a *Annotations) toObject() (*Object, *Error) {
}
if len(a.Scope) > 0 {
obj.Insert(InternedTerm("scope"), InternedTerm(a.Scope))
switch a.Scope {
case annotationScopeDocument:
obj.Insert(InternedTerm("scope"), InternedTerm("document"))
case annotationScopePackage:
obj.Insert(InternedTerm("scope"), InternedTerm("package"))
case annotationScopeRule:
obj.Insert(InternedTerm("scope"), InternedTerm("rule"))
case annotationScopeSubpackages:
obj.Insert(InternedTerm("scope"), InternedTerm("subpackages"))
default:
obj.Insert(InternedTerm("scope"), StringTerm(a.Scope))
}
}
if len(a.Title) > 0 {

View File

@@ -151,7 +151,6 @@ var DefaultBuiltins = [...]*Builtin{
Sprintf,
StringReverse,
RenderTemplate,
InternalTemplateString,
// Numbers
NumbersRange,
@@ -1110,7 +1109,7 @@ var Concat = &Builtin{
types.Named("output", types.S).Description("the joined string"),
),
Categories: stringsCat,
CanSkipBctx: false,
CanSkipBctx: true,
}
var FormatInt = &Builtin{
@@ -1278,7 +1277,7 @@ var Replace = &Builtin{
types.Named("y", types.S).Description("string with replaced substrings"),
),
Categories: stringsCat,
CanSkipBctx: false,
CanSkipBctx: true,
}
var ReplaceN = &Builtin{
@@ -1298,7 +1297,7 @@ The old string comparisons are done in argument order.`,
),
types.Named("output", types.S).Description("string with replaced substrings"),
),
CanSkipBctx: false,
CanSkipBctx: true,
}
var RegexReplace = &Builtin{
@@ -3389,11 +3388,6 @@ var InternalTestCase = &Builtin{
Decl: types.NewFunction([]types.Type{types.NewArray(nil, types.A)}, nil),
}
var InternalTemplateString = &Builtin{
Name: "internal.template_string",
Decl: types.NewFunction([]types.Type{types.NewArray(nil, types.A)}, types.S),
}
/**
* Deprecated built-ins.
*/

View File

@@ -58,14 +58,12 @@ const FeatureRefHeads = "rule_head_refs"
const FeatureRegoV1 = "rego_v1"
const FeatureRegoV1Import = "rego_v1_import"
const FeatureKeywordsInRefs = "keywords_in_refs"
const FeatureTemplateStrings = "template_strings"
// Features carries the default features supported by this version of OPA.
// Use RegisterFeatures to add to them.
var Features = []string{
FeatureRegoV1,
FeatureKeywordsInRefs,
FeatureTemplateStrings,
}
// RegisterFeatures lets applications wrapping OPA register features, to be
@@ -271,12 +269,6 @@ func (c *Capabilities) ContainsFeature(feature string) bool {
return slices.Contains(c.Features, feature)
}
func (c *Capabilities) ContainsBuiltin(name string) bool {
return slices.ContainsFunc(c.Builtins, func(builtin *Builtin) bool {
return builtin.Name == name
})
}
// addBuiltinSorted inserts a built-in into c in sorted order. An existing built-in with the same name
// will be overwritten.
func (c *Capabilities) addBuiltinSorted(bi *Builtin) {

View File

@@ -7,6 +7,7 @@ package ast
import (
"fmt"
"slices"
"sort"
"strings"
"github.com/open-policy-agent/opa/v1/types"
@@ -15,6 +16,11 @@ import (
type varRewriter func(Ref) Ref
// exprChecker defines the interface for executing type checking on a single
// expression. The exprChecker must update the provided TypeEnv with inferred
// types of vars.
type exprChecker func(*TypeEnv, *Expr) *Error
// typeChecker implements type checking on queries and rules. Errors are
// accumulated on the typeChecker so that a single run can report multiple
// issues.
@@ -22,6 +28,7 @@ type typeChecker struct {
builtins map[string]*Builtin
required *Capabilities
errs Errors
exprCheckers map[string]exprChecker
varRewriter varRewriter
ss *SchemaSet
allowNet []string
@@ -32,7 +39,11 @@ type typeChecker struct {
// newTypeChecker returns a new typeChecker object that has no errors.
func newTypeChecker() *typeChecker {
return &typeChecker{}
return &typeChecker{
exprCheckers: map[string]exprChecker{
"eq": checkExprEq,
},
}
}
func (tc *typeChecker) newEnv(exist *TypeEnv) *TypeEnv {
@@ -115,39 +126,43 @@ func (tc *typeChecker) Env(builtins map[string]*Builtin) *TypeEnv {
// are found. The resulting TypeEnv wraps the provided one. The resulting
// TypeEnv will be able to resolve types of vars contained in the body.
func (tc *typeChecker) CheckBody(env *TypeEnv, body Body) (*TypeEnv, Errors) {
var errors []*Error
errors := []*Error{}
env = tc.newEnv(env)
vis := newRefChecker(env, tc.varRewriter)
gv := NewGenericVisitor(vis.Visit)
for _, bexpr := range body {
WalkExprs(bexpr, func(expr *Expr) bool {
closureErrs := tc.checkClosures(env, expr)
errors = append(errors, closureErrs...)
WalkExprs(body, func(expr *Expr) bool {
// reset errors from previous iteration
vis.errs = nil
gv.Walk(expr)
errors = append(errors, vis.errs...)
closureErrs := tc.checkClosures(env, expr)
for _, err := range closureErrs {
errors = append(errors, err)
}
if err := tc.checkExpr(env, expr); err != nil {
hasClosureErrors := len(closureErrs) > 0
hasRefErrors := len(vis.errs) > 0
// Suppress this error if a more actionable one has occurred. In
// this case, if an error occurred in a ref or closure contained in
// this expression, and the error is due to a nil type, then it's
// likely to be the result of the more specific error.
skip := (hasClosureErrors || hasRefErrors) && causedByNilType(err)
if !skip {
errors = append(errors, err)
}
hasClosureErrors := len(closureErrs) > 0
// reset errors from previous iteration
vis.errs = nil
NewGenericVisitor(vis.Visit).Walk(expr)
for _, err := range vis.errs {
errors = append(errors, err)
}
hasRefErrors := len(vis.errs) > 0
if err := tc.checkExpr(env, expr); err != nil {
// Suppress this error if a more actionable one has occurred. In
// this case, if an error occurred in a ref or closure contained in
// this expression, and the error is due to a nil type, then it's
// likely to be the result of the more specific error.
skip := (hasClosureErrors || hasRefErrors) && causedByNilType(err)
if !skip {
errors = append(errors, err)
}
return true
})
}
}
return true
})
tc.err(errors...)
tc.err(errors)
return env, errors
}
@@ -228,7 +243,7 @@ func (tc *typeChecker) checkRule(env *TypeEnv, as *AnnotationSet, rule *Rule) {
for _, schemaAnnot := range schemaAnnots {
refType, err := tc.getSchemaType(schemaAnnot, rule)
if err != nil {
tc.err(err)
tc.err([]*Error{err})
continue
}
@@ -244,7 +259,7 @@ func (tc *typeChecker) checkRule(env *TypeEnv, as *AnnotationSet, rule *Rule) {
} else {
newType, err := override(ref[len(prefixRef):], t, refType, rule)
if err != nil {
tc.err(err)
tc.err([]*Error{err})
continue
}
env.tree.Put(prefixRef, newType)
@@ -266,25 +281,23 @@ func (tc *typeChecker) checkRule(env *TypeEnv, as *AnnotationSet, rule *Rule) {
var tpe types.Type
if len(rule.Head.Args) > 0 {
for _, arg := range rule.Head.Args {
// If args are not referred to in body, infer as any.
WalkTerms(arg, func(t *Term) bool {
if _, ok := t.Value.(Var); ok {
if cpy.GetByValue(t.Value) == nil {
cpy.tree.PutOne(t.Value, types.A)
}
}
return false
})
}
// If args are not referred to in body, infer as any.
WalkVars(rule.Head.Args, func(v Var) bool {
if cpy.GetByValue(v) == nil {
cpy.tree.PutOne(v, types.A)
}
return false
})
// Construct function type.
args := make([]types.Type, len(rule.Head.Args))
for i := range rule.Head.Args {
for i := range len(rule.Head.Args) {
args[i] = cpy.GetByValue(rule.Head.Args[i].Value)
}
tpe = types.NewFunction(args, cpy.GetByValue(rule.Head.Value.Value))
f := types.NewFunction(args, cpy.Get(rule.Head.Value))
tpe = f
} else {
switch rule.Head.RuleKind() {
case SingleValue:
@@ -297,7 +310,7 @@ func (tc *typeChecker) checkRule(env *TypeEnv, as *AnnotationSet, rule *Rule) {
var err error
tpe, err = nestedObject(cpy, objPath, typeV)
if err != nil {
tc.err(NewError(TypeErr, rule.Head.Location, "%s", err.Error()))
tc.err([]*Error{NewError(TypeErr, rule.Head.Location, "%s", err.Error())})
tpe = nil
}
} else if typeV != nil {
@@ -361,8 +374,9 @@ func (tc *typeChecker) checkExpr(env *TypeEnv, expr *Expr) *Error {
}
}
if operator == "eq" {
return checkExprEq(env, expr)
checker := tc.exprCheckers[operator]
if checker != nil {
return checker(env, expr)
}
return tc.checkExprBuiltin(env, expr)
@@ -585,7 +599,7 @@ func unify1(env *TypeEnv, term *Term, tpe types.Type, union bool) bool {
return unifies
}
return false
case *set:
case Set:
switch tpe := tpe.(type) {
case *types.Set:
return unify1Set(env, v, tpe, union)
@@ -660,14 +674,14 @@ func unify1Object(env *TypeEnv, val Object, tpe *types.Object, union bool) bool
return !stop
}
func unify1Set(env *TypeEnv, val *set, tpe *types.Set, union bool) bool {
func unify1Set(env *TypeEnv, val Set, tpe *types.Set, union bool) bool {
of := types.Values(tpe)
return !val.Until(func(elem *Term) bool {
return !unify1(env, elem, of, union)
})
}
func (tc *typeChecker) err(errors ...*Error) {
func (tc *typeChecker) err(errors []*Error) {
tc.errs = append(tc.errs, errors...)
}
@@ -688,6 +702,7 @@ func newRefChecker(env *TypeEnv, f varRewriter) *refChecker {
return &refChecker{
env: env,
errs: nil,
varRewriter: f,
}
}
@@ -699,9 +714,8 @@ func (rc *refChecker) Visit(x any) bool {
case *Expr:
switch terms := x.Terms.(type) {
case []*Term:
vis := NewGenericVisitor(rc.Visit)
for i := 1; i < len(terms); i++ {
vis.Walk(terms[i])
NewGenericVisitor(rc.Visit).Walk(terms[i])
}
return true
case *Term:
@@ -791,6 +805,7 @@ func (rc *refChecker) checkRef(curr *TypeEnv, node *typeTreeNode, ref Ref, idx i
}
func (rc *refChecker) checkRefLeaf(tpe types.Type, ref Ref, idx int) *Error {
if idx == len(ref) {
return nil
}
@@ -805,16 +820,16 @@ func (rc *refChecker) checkRefLeaf(tpe types.Type, ref Ref, idx int) *Error {
switch value := head.Value.(type) {
case Var:
if exist := rc.env.GetByValue(head.Value); exist != nil {
if exist := rc.env.GetByValue(value); exist != nil {
if !unifies(exist, keys) {
return newRefErrInvalid(ref[0].Location, rc.varRewriter(ref), idx, exist, keys, getOneOfForType(tpe))
}
} else {
rc.env.tree.PutOne(head.Value, types.Keys(tpe))
rc.env.tree.PutOne(value, types.Keys(tpe))
}
case Ref:
if exist := rc.env.GetByRef(value); exist != nil {
if exist := rc.env.Get(value); exist != nil {
if !unifies(exist, keys) {
return newRefErrInvalid(ref[0].Location, rc.varRewriter(ref), idx, exist, keys, getOneOfForType(tpe))
}
@@ -1115,7 +1130,7 @@ func getOneOfForNode(node *typeTreeNode) (result []Value) {
return false
})
slices.SortFunc(result, Value.Compare)
sortValueSlice(result)
return result
}
@@ -1138,10 +1153,16 @@ func getOneOfForType(tpe types.Type) (result []Value) {
}
result = removeDuplicate(result)
slices.SortFunc(result, Value.Compare)
sortValueSlice(result)
return result
}
func sortValueSlice(sl []Value) {
sort.Slice(sl, func(i, j int) bool {
return sl[i].Compare(sl[j]) < 0
})
}
func removeDuplicate(list []Value) []Value {
seen := make(map[Value]bool)
var newResult []Value
@@ -1165,13 +1186,13 @@ func getArgTypes(env *TypeEnv, args []*Term) []types.Type {
// getPrefix returns the shortest prefix of ref that exists in env
func getPrefix(env *TypeEnv, ref Ref) (Ref, types.Type) {
if len(ref) == 1 {
t := env.GetByRef(ref)
t := env.Get(ref)
if t != nil {
return ref, t
}
}
for i := 1; i < len(ref); i++ {
t := env.GetByRef(ref[:i])
t := env.Get(ref[:i])
if t != nil {
return ref[:i], t
}
@@ -1179,14 +1200,12 @@ func getPrefix(env *TypeEnv, ref Ref) (Ref, types.Type) {
return nil, nil
}
var dynamicAnyAny = types.NewDynamicProperty(types.A, types.A)
// override takes a type t and returns a type obtained from t where the path represented by ref within it has type o (overriding the original type of that path)
func override(ref Ref, t types.Type, o types.Type, rule *Rule) (types.Type, *Error) {
var newStaticProps []*types.StaticProperty
obj, ok := t.(*types.Object)
if !ok {
newType, err := getObjectType(ref, o, rule, dynamicAnyAny)
newType, err := getObjectType(ref, o, rule, types.NewDynamicProperty(types.A, types.A))
if err != nil {
return nil, err
}

View File

@@ -96,9 +96,6 @@ func Compare(a, b any) int {
return -1
}
return 1
case *TemplateString:
b := b.(*TemplateString)
return a.Compare(b)
case Var:
return VarCompare(a, b.(Var))
case Ref:
@@ -182,28 +179,26 @@ func sortOrder(x any) int {
return 2
case String:
return 3
case *TemplateString:
return 4
case Var:
return 5
return 4
case Ref:
return 6
return 5
case *Array:
return 7
return 6
case Object:
return 8
return 7
case Set:
return 9
return 8
case *ArrayComprehension:
return 10
return 9
case *ObjectComprehension:
return 11
return 10
case *SetComprehension:
return 12
return 11
case Call:
return 13
return 12
case Args:
return 14
return 13
case *Expr:
return 100
case *SomeDecl:
@@ -327,6 +322,14 @@ func TermValueEqual(a, b *Term) bool {
}
func ValueEqual(a, b Value) bool {
// TODO(ae): why doesn't this work the same?
//
// case interface{ Equal(Value) bool }:
// return v.Equal(b)
//
// When put on top, golangci-lint even flags the other cases as unreachable..
// but TestTopdownVirtualCache will have failing test cases when we replace
// the other cases with the above one.. 🤔
switch v := a.(type) {
case Null:
return v.Equal(b)
@@ -342,8 +345,6 @@ func ValueEqual(a, b Value) bool {
return v.Equal(b)
case *Array:
return v.Equal(b)
case *TemplateString:
return v.Equal(b)
}
return a.Compare(b) == 0

View File

File diff suppressed because it is too large

View File

@@ -54,14 +54,15 @@ func (env *TypeEnv) GetByValue(v Value) types.Type {
return types.B
case Number:
return types.N
case String, *TemplateString:
case String:
return types.S
// Composites.
case *Array:
static := make([]types.Type, x.Len())
for i := range static {
static[i] = env.GetByValue(x.Elem(i).Value)
tpe := env.GetByValue(x.Elem(i).Value)
static[i] = tpe
}
var dynamic types.Type
@@ -79,13 +80,17 @@ func (env *TypeEnv) GetByValue(v Value) types.Type {
x.Foreach(func(k, v *Term) {
if IsConstant(k.Value) {
if kjson, err := JSON(k.Value); err == nil {
static = append(static, types.NewStaticProperty(kjson, env.GetByValue(v.Value)))
kjson, err := JSON(k.Value)
if err == nil {
tpe := env.GetByValue(v.Value)
static = append(static, types.NewStaticProperty(kjson, tpe))
return
}
}
// Can't handle it as a static property, fallback to dynamic
dynamic = types.NewDynamicProperty(env.GetByValue(k.Value), env.GetByValue(v.Value))
typeK := env.GetByValue(k.Value)
typeV := env.GetByValue(v.Value)
dynamic = types.NewDynamicProperty(typeK, typeV)
})
if len(static) == 0 && dynamic == nil {
@@ -94,7 +99,7 @@ func (env *TypeEnv) GetByValue(v Value) types.Type {
return types.NewObject(static, dynamic)
case *set:
case Set:
var tpe types.Type
x.Foreach(func(elem *Term) {
tpe = types.Or(tpe, env.GetByValue(elem.Value))
@@ -157,6 +162,7 @@ func (env *TypeEnv) GetByRef(ref Ref) types.Type {
}
func (env *TypeEnv) getRefFallback(ref Ref) types.Type {
if env.next != nil {
return env.next.GetByRef(ref)
}
@@ -293,11 +299,15 @@ func (n *typeTreeNode) PutOne(key Value, tpe types.Type) {
func (n *typeTreeNode) Put(path Ref, tpe types.Type) {
curr := n
for _, term := range path {
child, ok := curr.children.Get(term.Value)
c, ok := curr.children.Get(term.Value)
var child *typeTreeNode
if !ok {
child = newTypeTree()
child.key = term.Value
curr.children.Put(child.key, child)
} else {
child = c
}
curr = child
@@ -311,18 +321,23 @@ func (n *typeTreeNode) Put(path Ref, tpe types.Type) {
func (n *typeTreeNode) Insert(path Ref, tpe types.Type, env *TypeEnv) {
curr := n
for i, term := range path {
child, ok := curr.children.Get(term.Value)
c, ok := curr.children.Get(term.Value)
var child *typeTreeNode
if !ok {
child = newTypeTree()
child.key = term.Value
curr.children.Put(child.key, child)
} else if child.value != nil && i+1 < len(path) {
// If child has an object value, merge the new value into it.
if o, ok := child.value.(*types.Object); ok {
var err error
child.value, err = insertIntoObject(o, path[i+1:], tpe, env)
if err != nil {
panic(fmt.Errorf("unreachable, insertIntoObject: %w", err))
} else {
child = c
if child.value != nil && i+1 < len(path) {
// If child has an object value, merge the new value into it.
if o, ok := child.value.(*types.Object); ok {
var err error
child.value, err = insertIntoObject(o, path[i+1:], tpe, env)
if err != nil {
panic(fmt.Errorf("unreachable, insertIntoObject: %w", err))
}
}
}
}
@@ -334,7 +349,8 @@ func (n *typeTreeNode) Insert(path Ref, tpe types.Type, env *TypeEnv) {
if _, ok := tpe.(*types.Object); ok && curr.children.Len() > 0 {
// merge all leafs into the inserted object
for p, t := range curr.Leafs() {
leafs := curr.Leafs()
for p, t := range leafs {
var err error
curr.value, err = insertIntoObject(curr.value.(*types.Object), *p, t, env)
if err != nil {
@@ -372,8 +388,7 @@ func mergeTypes(a, b types.Type) types.Type {
bDynProps := bObj.DynamicProperties()
dynProps := types.NewDynamicProperty(
types.Or(aDynProps.Key, bDynProps.Key),
mergeTypes(aDynProps.Value, bDynProps.Value),
)
mergeTypes(aDynProps.Value, bDynProps.Value))
return types.NewObject(nil, dynProps)
} else if bAny, ok := b.(types.Any); ok && len(a.StaticProperties()) == 0 {
// If a is an object type with no static components ...
@@ -402,14 +417,14 @@ func mergeTypes(a, b types.Type) types.Type {
}
func (n *typeTreeNode) String() string {
b := &strings.Builder{}
b := strings.Builder{}
key := "-"
if k := n.key; k != nil {
key = k.String()
b.WriteString(k.String())
} else {
b.WriteString("-")
}
b.WriteString(key)
if v := n.value; v != nil {
b.WriteString(": ")
b.WriteString(v.String())
@@ -417,7 +432,9 @@ func (n *typeTreeNode) String() string {
n.children.Iter(func(_ Value, child *typeTreeNode) bool {
b.WriteString("\n\t+ ")
b.WriteString(strings.ReplaceAll(child.String(), "\n", "\n\t"))
s := child.String()
s = strings.ReplaceAll(s, "\n", "\n\t")
b.WriteString(s)
return false
})
@@ -468,8 +485,7 @@ func (n *typeTreeNode) Leafs() map[*Ref]types.Type {
func collectLeafs(n *typeTreeNode, path Ref, leafs map[*Ref]types.Type) {
nPath := append(path, NewTerm(n.key))
if n.Leaf() {
npc := nPath // copy of else nPath escapes to heap even if !n.Leaf()
leafs[&npc] = n.Value()
leafs[&nPath] = n.Value()
return
}
n.children.Iter(func(_ Value, v *typeTreeNode) bool {
@@ -497,6 +513,7 @@ func selectConstant(tpe types.Type, term *Term) types.Type {
// contains vars or refs, then the returned type will be a union of the
// possible types.
func selectRef(tpe types.Type, ref Ref) types.Type {
if tpe == nil || len(ref) == 0 {
return tpe
}

View File

@@ -121,13 +121,9 @@ func (e *Error) Error() string {
// NewError returns a new Error object.
func NewError(code string, loc *Location, f string, a ...any) *Error {
return newErrorString(code, loc, fmt.Sprintf(f, a...))
}
func newErrorString(code string, loc *Location, m string) *Error {
return &Error{
Code: code,
Location: loc,
Message: m,
Message: fmt.Sprintf(f, a...),
}
}

View File

@@ -412,7 +412,7 @@ func (i *refindices) updateGlobMatch(rule *Rule, expr *Expr) {
if _, ok := match.Value.(Var); ok {
var ref Ref
for _, other := range i.rules[rule] {
if ov, ok := other.Value.(Var); ok && ov.Equal(match.Value) {
if _, ok := other.Value.(Var); ok && other.Value.Compare(match.Value) == 0 {
ref = other.Ref
}
}

View File

@@ -158,42 +158,18 @@ func (s *Scanner) WithoutKeywords(kws map[string]tokens.Token) (*Scanner, map[st
return &cpy, kw
}
type ScanOptions struct {
continueTemplateString bool
rawTemplateString bool
}
type ScanOption func(*ScanOptions)
// ContinueTemplateString will continue scanning a template string
func ContinueTemplateString(raw bool) ScanOption {
return func(opts *ScanOptions) {
opts.continueTemplateString = true
opts.rawTemplateString = raw
}
}
// Scan will increment the scanners position in the source
// code until the next token is found. The token, starting position
// of the token, string literal, and any errors encountered are
// returned. A token will always be returned, the caller must check
// for any errors before using the other values.
func (s *Scanner) Scan(opts ...ScanOption) (tokens.Token, Position, string, []Error) {
scanOpts := &ScanOptions{}
for _, opt := range opts {
opt(scanOpts)
}
func (s *Scanner) Scan() (tokens.Token, Position, string, []Error) {
pos := Position{Offset: s.offset - s.width, Row: s.row, Col: s.col, Tabs: s.tabs}
var tok tokens.Token
var lit string
if scanOpts.continueTemplateString {
if scanOpts.rawTemplateString {
lit, tok = s.scanRawTemplateString()
} else {
lit, tok = s.scanTemplateString()
}
} else if s.isWhitespace() {
if s.isWhitespace() {
// string(rune) is an unnecessary heap allocation in this case as we know all
// the possible whitespace values, and can simply translate to string ourselves
switch s.curr {
@@ -299,17 +275,6 @@ func (s *Scanner) Scan(opts ...ScanOption) (tokens.Token, Position, string, []Er
tok = tokens.Semicolon
case '.':
tok = tokens.Dot
case '$':
switch s.curr {
case '`':
s.next()
lit, tok = s.scanRawTemplateString()
case '"':
s.next()
lit, tok = s.scanTemplateString()
default:
s.error("illegal $ character")
}
}
}
@@ -430,116 +395,6 @@ func (s *Scanner) scanRawString() string {
return util.ByteSliceToString(s.bs[start : s.offset-1])
}
func (s *Scanner) scanTemplateString() (string, tokens.Token) {
tok := tokens.TemplateStringPart
start := s.literalStart()
var escapes []int
for {
ch := s.curr
if ch == '\n' || ch < 0 {
s.error("non-terminated string")
break
}
s.next()
if ch == '"' {
tok = tokens.TemplateStringEnd
break
}
if ch == '{' {
break
}
if ch == '\\' {
switch s.curr {
case '\\', '"', '/', 'b', 'f', 'n', 'r', 't':
s.next()
case '{':
escapes = append(escapes, s.offset-1)
s.next()
case 'u':
s.next()
s.next()
s.next()
s.next()
default:
s.error("illegal escape sequence")
}
}
}
// Lazily remove escapes to not unnecessarily allocate a new byte slice
if len(escapes) > 0 {
return util.ByteSliceToString(removeEscapes(s, escapes, start)), tok
}
return util.ByteSliceToString(s.bs[start : s.offset-1]), tok
}
func (s *Scanner) scanRawTemplateString() (string, tokens.Token) {
tok := tokens.RawTemplateStringPart
start := s.literalStart()
var escapes []int
for {
ch := s.curr
if ch < 0 {
s.error("non-terminated string")
break
}
s.next()
if ch == '`' {
tok = tokens.RawTemplateStringEnd
break
}
if ch == '{' {
break
}
if ch == '\\' {
switch s.curr {
case '{':
escapes = append(escapes, s.offset-1)
s.next()
}
}
}
// Lazily remove escapes to not unnecessarily allocate a new byte slice
if len(escapes) > 0 {
return util.ByteSliceToString(removeEscapes(s, escapes, start)), tok
}
return util.ByteSliceToString(s.bs[start : s.offset-1]), tok
}
func removeEscapes(s *Scanner, escapes []int, start int) []byte {
from := start
bs := make([]byte, 0, s.offset-start-len(escapes))
for _, escape := range escapes {
// Append the bytes before the escape sequence.
if escape > from {
bs = append(bs, s.bs[from:escape-1]...)
}
// Skip the escape character.
from = escape
}
// Append the remaining bytes after the last escape sequence.
if from < s.offset-1 {
bs = append(bs, s.bs[from:s.offset-1]...)
}
return bs
}
func (s *Scanner) scanComment() string {
start := s.literalStart()
for s.curr != '\n' && s.curr != -1 {

View File

@@ -39,10 +39,6 @@ const (
Number
String
TemplateStringPart
TemplateStringEnd
RawTemplateStringPart
RawTemplateStringEnd
LBrack
RBrack
@@ -71,7 +67,6 @@ const (
Lte
Dot
Semicolon
Dollar
Every
Contains
@@ -79,58 +74,53 @@ const (
)
var strings = [...]string{
Illegal: "illegal",
EOF: "eof",
Whitespace: "whitespace",
Comment: "comment",
Ident: "identifier",
Package: "package",
Import: "import",
As: "as",
Default: "default",
Else: "else",
Not: "not",
Some: "some",
With: "with",
Null: "null",
True: "true",
False: "false",
Number: "number",
String: "string",
TemplateStringPart: "template-string-part",
TemplateStringEnd: "template-string-end",
RawTemplateStringPart: "raw-template-string-part",
RawTemplateStringEnd: "raw-template-string-end",
LBrack: "[",
RBrack: "]",
LBrace: "{",
RBrace: "}",
LParen: "(",
RParen: ")",
Comma: ",",
Colon: ":",
Add: "plus",
Sub: "minus",
Mul: "mul",
Quo: "div",
Rem: "rem",
And: "and",
Or: "or",
Unify: "eq",
Equal: "equal",
Assign: "assign",
In: "in",
Neq: "neq",
Gt: "gt",
Lt: "lt",
Gte: "gte",
Lte: "lte",
Dot: ".",
Semicolon: ";",
Dollar: "dollar",
Every: "every",
Contains: "contains",
If: "if",
Illegal: "illegal",
EOF: "eof",
Whitespace: "whitespace",
Comment: "comment",
Ident: "identifier",
Package: "package",
Import: "import",
As: "as",
Default: "default",
Else: "else",
Not: "not",
Some: "some",
With: "with",
Null: "null",
True: "true",
False: "false",
Number: "number",
String: "string",
LBrack: "[",
RBrack: "]",
LBrace: "{",
RBrace: "}",
LParen: "(",
RParen: ")",
Comma: ",",
Colon: ":",
Add: "plus",
Sub: "minus",
Mul: "mul",
Quo: "div",
Rem: "rem",
And: "and",
Or: "or",
Unify: "eq",
Equal: "equal",
Assign: "assign",
In: "in",
Neq: "neq",
Gt: "gt",
Lt: "lt",
Gte: "gte",
Lte: "lte",
Dot: ".",
Semicolon: ";",
Every: "every",
Contains: "contains",
If: "if",
}
var keywords = map[string]Token{
@@ -157,7 +147,3 @@ func IsKeyword(tok Token) bool {
_, ok := keywords[strings[tok]]
return ok
}
func KeywordFor(tok Token) string {
return strings[tok]
}


@@ -42,17 +42,10 @@ var (
}
internedVarValues = map[string]Value{
"input": Var("input"),
"data": Var("data"),
"args": Var("args"),
"schema": Var("schema"),
"key": Var("key"),
"value": Var("value"),
"future": Var("future"),
"rego": Var("rego"),
"set": Var("set"),
"internal": Var("internal"),
"else": Var("else"),
"input": Var("input"),
"data": Var("data"),
"key": Var("key"),
"value": Var("value"),
"i": Var("i"), "j": Var("j"), "k": Var("k"), "v": Var("v"), "x": Var("x"), "y": Var("y"), "z": Var("z"),
}
@@ -197,13 +190,6 @@ func InternedTerm[T internable](v T) *Term {
}
}
// InternedItem works just like [Item] but returns interned terms for both
// key and value where possible. This is mostly useful for making tests less
// verbose.
func InternedItem[K, V internable](key K, value V) [2]*Term {
return [2]*Term{InternedTerm(key), InternedTerm(value)}
}
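// Hypothetical usage of the helper above in test code (ObjectTerm, Item and the
// literal key/value pairs are illustrative assumptions, not taken from the diff):
func exampleInternedItem() *Term {
	// Equivalent to ObjectTerm(Item(StringTerm("a"), IntNumberTerm(1)), ...) but
	// shorter, and backed by interned terms where possible.
	return ObjectTerm(InternedItem("a", 1), InternedItem("b", true))
}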
// InternedIntFromString returns a term with the given integer value if the string
// maps to an interned term. If the string does not map to an interned term, nil is
// returned.


@@ -1736,10 +1736,6 @@ func (p *Parser) parseTerm() *Term {
term = p.parseNumber()
case tokens.String:
term = p.parseString()
case tokens.TemplateStringPart, tokens.TemplateStringEnd:
term = p.parseTemplateString(false)
case tokens.RawTemplateStringPart, tokens.RawTemplateStringEnd:
term = p.parseTemplateString(true)
case tokens.Ident, tokens.Contains: // NOTE(sr): contains anywhere BUT in rule heads gets no special treatment
term = p.parseVar()
case tokens.LBrack:
@@ -1771,7 +1767,7 @@ func (p *Parser) parseTermFinish(head *Term, skipws bool) *Term {
return nil
}
offset := p.s.loc.Offset
p.doScan(skipws, noScanOptions...)
p.doScan(skipws)
switch p.s.tok {
case tokens.LParen, tokens.Dot, tokens.LBrack:
@@ -1792,7 +1788,7 @@ func (p *Parser) parseHeadFinish(head *Term, skipws bool) *Term {
return nil
}
offset := p.s.loc.Offset
p.scanWS()
p.doScan(false)
switch p.s.tok {
case tokens.Add, tokens.Sub, tokens.Mul, tokens.Quo, tokens.Rem,
@@ -1800,7 +1796,7 @@ func (p *Parser) parseHeadFinish(head *Term, skipws bool) *Term {
tokens.Equal, tokens.Neq, tokens.Gt, tokens.Gte, tokens.Lt, tokens.Lte:
p.illegalToken()
case tokens.Whitespace:
p.doScan(skipws, noScanOptions...)
p.doScan(skipws)
}
switch p.s.tok {
@@ -1890,11 +1886,6 @@ func (p *Parser) parseString() *Term {
return NewTerm(InternedEmptyString.Value).SetLocation(p.s.Loc())
}
inner := p.s.lit[1 : len(p.s.lit)-1]
if !strings.ContainsRune(inner, '\\') { // nothing to un-escape
return StringTerm(inner).SetLocation(p.s.Loc())
}
var s string
if err := json.Unmarshal([]byte(p.s.lit), &s); err != nil {
p.errorf(p.s.Loc(), "illegal string literal: %s", p.s.lit)
@@ -1912,120 +1903,6 @@ func (p *Parser) parseRawString() *Term {
return StringTerm(p.s.lit[1 : len(p.s.lit)-1]).SetLocation(p.s.Loc())
}
func templateStringPartToStringLiteral(tok tokens.Token, lit string) (string, error) {
switch tok {
case tokens.TemplateStringPart, tokens.TemplateStringEnd:
inner := lit[1 : len(lit)-1]
if !strings.ContainsRune(inner, '\\') { // nothing to un-escape
return inner, nil
}
buf := make([]byte, 0, len(inner)+2)
buf = append(buf, '"')
buf = append(buf, inner...)
buf = append(buf, '"')
var s string
if err := json.Unmarshal(buf, &s); err != nil {
return "", fmt.Errorf("illegal template-string part: %s", lit)
}
return s, nil
case tokens.RawTemplateStringPart, tokens.RawTemplateStringEnd:
return lit[1 : len(lit)-1], nil
default:
return "", errors.New("expected template-string part")
}
}
func (p *Parser) parseTemplateString(multiLine bool) *Term {
loc := p.s.Loc()
if !p.po.Capabilities.ContainsFeature(FeatureTemplateStrings) {
p.errorf(loc, "template strings are not supported by current capabilities")
return nil
}
var parts []Node
for {
s, err := templateStringPartToStringLiteral(p.s.tok, p.s.lit)
if err != nil {
p.error(p.s.Loc(), err.Error())
return nil
}
// Don't add empty strings
if len(s) > 0 {
parts = append(parts, StringTerm(s).SetLocation(p.s.Loc()))
}
if p.s.tok == tokens.TemplateStringEnd || p.s.tok == tokens.RawTemplateStringEnd {
break
}
numCommentsBefore := len(p.s.comments)
p.scan()
numCommentsAfter := len(p.s.comments)
expr := p.parseLiteral()
if expr == nil {
p.error(p.s.Loc(), "invalid template-string expression")
return nil
}
if expr.Negated {
p.errorf(expr.Loc(), "unexpected negation ('%s') in template-string expression", tokens.KeywordFor(tokens.Not))
return nil
}
// Note: Actually unification
if expr.IsEquality() {
p.errorf(expr.Loc(), "unexpected unification ('=') in template-string expression")
return nil
}
if expr.IsAssignment() {
p.errorf(expr.Loc(), "unexpected assignment (':=') in template-string expression")
return nil
}
if expr.IsEvery() {
p.errorf(expr.Loc(), "unexpected '%s' in template-string expression", tokens.KeywordFor(tokens.Every))
return nil
}
if expr.IsSome() {
p.errorf(expr.Loc(), "unexpected '%s' in template-string expression", tokens.KeywordFor(tokens.Some))
return nil
}
// FIXME: Can we optimize for collections and comprehensions too? To qualify, they must not contain refs or calls.
var nonOptional bool
if term, ok := expr.Terms.(*Term); ok && numCommentsAfter == numCommentsBefore {
switch term.Value.(type) {
case String, Number, Boolean, Null:
nonOptional = true
parts = append(parts, term)
}
}
if !nonOptional {
parts = append(parts, expr)
}
if p.s.tok != tokens.RBrace {
p.errorf(p.s.Loc(), "expected %s to end template string expression", tokens.RBrace)
return nil
}
p.doScan(false, scanner.ContinueTemplateString(multiLine))
}
// When there are template-expressions, the initial location will only contain the text up to the first expression
loc.Text = p.s.Text(loc.Offset, p.s.tokEnd)
return TemplateStringTerm(multiLine, parts...).SetLocation(loc)
}
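// For orientation, illustrative Rego snippets for the parsing rules enforced above;
// the variable names and values are assumptions chosen for this sketch:
//
//	greeting := $"hello {input.name}"  // ok: any ordinary expression between braces
//	version := $"v{1.12}"              // ok: a constant term is inlined directly as a part
//	bad1 := $"x is {x := 1}"           // parse error: assignment not allowed in the expression
//	bad2 := $"y is {not y}"            // parse error: negation not allowed in the expression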
func (p *Parser) parseCall(operator *Term, offset int) (term *Term) {
if !p.enter() {
return nil
@@ -2579,17 +2456,15 @@ func (p *Parser) illegalToken() {
p.illegal("")
}
var noScanOptions []scanner.ScanOption
func (p *Parser) scan() {
p.doScan(true, noScanOptions...)
p.doScan(true)
}
func (p *Parser) scanWS() {
p.doScan(false, noScanOptions...)
p.doScan(false)
}
func (p *Parser) doScan(skipws bool, scanOpts ...scanner.ScanOption) {
func (p *Parser) doScan(skipws bool) {
// NOTE(tsandall): the last position is used to compute the "text" field for
// complex AST nodes. Whitespace never affects the last position of an AST
@@ -2602,7 +2477,7 @@ func (p *Parser) doScan(skipws bool, scanOpts ...scanner.ScanOption) {
var errs []scanner.Error
for {
var pos scanner.Position
p.s.tok, pos, p.s.lit, errs = p.s.s.Scan(scanOpts...)
p.s.tok, pos, p.s.lit, errs = p.s.s.Scan()
p.s.tokEnd = pos.End
p.s.loc.Row = pos.Row
@@ -2657,10 +2532,12 @@ func (p *Parser) restore(s *state) {
}
func setLocRecursive(x any, loc *location.Location) {
WalkNodes(x, func(n Node) bool {
n.SetLoc(loc)
NewGenericVisitor(func(x any) bool {
if node, ok := x.(Node); ok {
node.SetLoc(loc)
}
return false
})
}).Walk(x)
}
func (p *Parser) setLoc(term *Term, loc *location.Location, offset, end int) *Term {


@@ -11,6 +11,7 @@
package ast
import (
"bytes"
"errors"
"fmt"
"slices"
@@ -624,9 +625,10 @@ func ParseStatements(filename, input string) ([]Statement, []*Comment, error) {
// ParseStatementsWithOpts returns a slice of parsed statements. This is the
// default return value from the parser.
func ParseStatementsWithOpts(filename, input string, popts ParserOptions) ([]Statement, []*Comment, error) {
parser := NewParser().
WithFilename(filename).
WithReader(strings.NewReader(input)).
WithReader(bytes.NewBufferString(input)).
WithProcessAnnotation(popts.ProcessAnnotation).
WithFutureKeywords(popts.FutureKeywords...).
WithAllFutureKeywords(popts.AllFutureKeywords).
@@ -636,6 +638,7 @@ func ParseStatementsWithOpts(filename, input string, popts ParserOptions) ([]Sta
withUnreleasedKeywords(popts.unreleasedKeywords)
stmts, comments, errs := parser.Parse()
if len(errs) > 0 {
return nil, nil, errs
}
@@ -644,6 +647,7 @@ func ParseStatementsWithOpts(filename, input string, popts ParserOptions) ([]Sta
}
func parseModule(filename string, stmts []Statement, comments []*Comment, regoCompatibilityMode RegoVersion) (*Module, error) {
if len(stmts) == 0 {
return nil, NewError(ParseErr, &Location{File: filename}, "empty module")
}
@@ -658,21 +662,23 @@ func parseModule(filename string, stmts []Statement, comments []*Comment, regoCo
mod := &Module{
Package: pkg,
// The comments slice only holds comments that were not their own statements.
Comments: comments,
stmts: stmts,
stmts: stmts,
}
mod.regoVersion = regoCompatibilityMode
// The comments slice only holds comments that were not their own statements.
mod.Comments = append(mod.Comments, comments...)
if regoCompatibilityMode == RegoUndefined {
mod.regoVersion = DefaultRegoVersion
} else {
mod.regoVersion = regoCompatibilityMode
}
for i, stmt := range stmts[1:] {
switch stmt := stmt.(type) {
case *Import:
mod.Imports = append(mod.Imports, stmt)
if mod.regoVersion == RegoV0 && RegoV1CompatibleRef.Equal(stmt.Path.Value) {
if mod.regoVersion == RegoV0 && Compare(stmt.Path.Value, RegoV1CompatibleRef) == 0 {
mod.regoVersion = RegoV0CompatV1
}
case *Rule:


@@ -621,7 +621,7 @@ func (imp *Import) SetLoc(loc *Location) {
// document. This is the alias if defined otherwise the last element in the
// path.
func (imp *Import) Name() Var {
if imp.Alias != "" {
if len(imp.Alias) != 0 {
return imp.Alias
}
switch v := imp.Path.Value.(type) {
@@ -988,7 +988,6 @@ func (head *Head) Copy() *Head {
cpy.Key = head.Key.Copy()
cpy.Value = head.Value.Copy()
cpy.keywords = nil
cpy.Assign = head.Assign
return &cpy
}


@@ -27,12 +27,13 @@ func checkRootDocumentOverrides(node any) Errors {
errors := Errors{}
WalkRules(node, func(rule *Rule) bool {
name := rule.Head.Name
var name string
if len(rule.Head.Reference) > 0 {
name = rule.Head.Reference[0].Value.(Var)
name = rule.Head.Reference[0].Value.(Var).String()
} else {
name = rule.Head.Name.String()
}
if ReservedVars.Contains(name) {
if RootDocumentRefs.Contains(RefTerm(VarTerm(name))) {
errors = append(errors, NewError(CompileErr, rule.Location, "rules must not shadow %v (use a different rule name)", name))
}
@@ -51,8 +52,8 @@ func checkRootDocumentOverrides(node any) Errors {
if expr.IsAssignment() {
// assign() can be called directly, so we need to assert its given first operand exists before checking its name.
if nameOp := expr.Operand(0); nameOp != nil {
name := Var(nameOp.String())
if ReservedVars.Contains(name) {
name := nameOp.String()
if RootDocumentRefs.Contains(RefTerm(VarTerm(name))) {
errors = append(errors, NewError(CompileErr, expr.Location, "variables must not shadow %v (use a different variable name)", name))
}
}
@@ -64,24 +65,26 @@ func checkRootDocumentOverrides(node any) Errors {
}
func walkCalls(node any, f func(any) bool) {
vis := NewGenericVisitor(func(x any) bool {
switch y := x.(type) {
vis := &GenericVisitor{func(x any) bool {
switch x := x.(type) {
case Call:
return f(x)
case *Expr:
if y.IsCall() {
if x.IsCall() {
return f(x)
}
case *Head:
// GenericVisitor doesn't walk the rule head ref
walkCalls(y.Reference, f)
walkCalls(x.Reference, f)
}
return false
})
}}
vis.Walk(node)
}
func checkDeprecatedBuiltins(deprecatedBuiltinsMap map[string]struct{}, node any) (errs Errors) {
func checkDeprecatedBuiltins(deprecatedBuiltinsMap map[string]struct{}, node any) Errors {
errs := make(Errors, 0)
walkCalls(node, func(x any) bool {
var operator string
var loc *Location


@@ -48,8 +48,6 @@ func ValueName(x Value) string {
return "objectcomprehension"
case *SetComprehension:
return "setcomprehension"
case *TemplateString:
return "templatestring"
}
return TypeName(x)


@@ -25,13 +25,7 @@ import (
"github.com/open-policy-agent/opa/v1/util"
)
var (
NullValue Value = Null{}
errFindNotFound = errors.New("find: not found")
varRegexp = regexp.MustCompile("^[[:alpha:]_][[:alpha:][:digit:]_]*$")
)
var errFindNotFound = errors.New("find: not found")
// Location records a position in source code.
type Location = location.Location
@@ -49,7 +43,6 @@ func NewLocation(text []byte, file string, row int, col int) *Location {
// - Variables, References
// - Array, Set, and Object Comprehensions
// - Calls
// - Template Strings
type Value interface {
Compare(other Value) int // Compare returns <0, 0, or >0 if this Value is less than, equal to, or greater than other, respectively.
Find(path Ref) (Value, error) // Find returns value referred to by path or an error if path is not found.
@@ -358,8 +351,6 @@ func (term *Term) Copy() *Term {
cpy.Value = v.Copy()
case *SetComprehension:
cpy.Value = v.Copy()
case *TemplateString:
cpy.Value = v.Copy()
case Call:
cpy.Value = v.Copy()
}
@@ -465,17 +456,7 @@ func (term *Term) Vars() VarSet {
}
// IsConstant returns true if the AST value is constant.
// Note that this is only a shallow check as we currently don't have a real
// notion of constant "vars" in the AST implementation. Meaning that while we could
// derive that a reference to a constant value is also constant, we currently don't.
func IsConstant(v Value) bool {
switch v.(type) {
case Null, Boolean, Number, String:
return true
case Var, Ref, *ArrayComprehension, *ObjectComprehension, *SetComprehension, Call:
return false
}
found := false
vis := GenericVisitor{
func(x any) bool {
@@ -550,6 +531,8 @@ func IsScalar(v Value) bool {
// Null represents the null value defined by JSON.
type Null struct{}
var NullValue Value = Null{}
// NullTerm creates a new Term with a Null value.
func NullTerm() *Term {
return &Term{Value: NullValue}
@@ -835,173 +818,6 @@ func (str String) Hash() int {
return int(xxhash.Sum64String(string(str)))
}
type TemplateString struct {
Parts []Node `json:"parts"`
MultiLine bool `json:"multi_line"`
}
func (ts *TemplateString) Copy() *TemplateString {
cpy := &TemplateString{MultiLine: ts.MultiLine, Parts: make([]Node, len(ts.Parts))}
for i, p := range ts.Parts {
switch v := p.(type) {
case *Expr:
cpy.Parts[i] = v.Copy()
case *Term:
cpy.Parts[i] = v.Copy()
}
}
return cpy
}
func (ts *TemplateString) Equal(other Value) bool {
if o, ok := other.(*TemplateString); ok && ts.MultiLine == o.MultiLine && len(ts.Parts) == len(o.Parts) {
for i, p := range ts.Parts {
switch v := p.(type) {
case *Expr:
if ope, ok := o.Parts[i].(*Expr); !ok || !v.Equal(ope) {
return false
}
case *Term:
if opt, ok := o.Parts[i].(*Term); !ok || !v.Equal(opt) {
return false
}
default:
return false
}
}
return true
}
return false
}
func (ts *TemplateString) Compare(other Value) int {
if ots, ok := other.(*TemplateString); ok {
if ts.MultiLine != ots.MultiLine {
if !ts.MultiLine {
return -1
}
return 1
}
if len(ts.Parts) != len(ots.Parts) {
return len(ts.Parts) - len(ots.Parts)
}
for i := range ts.Parts {
if cmp := Compare(ts.Parts[i], ots.Parts[i]); cmp != 0 {
return cmp
}
}
return 0
}
return Compare(ts, other)
}
func (ts *TemplateString) Find(path Ref) (Value, error) {
if len(path) == 0 {
return ts, nil
}
return nil, errFindNotFound
}
func (ts *TemplateString) Hash() int {
hash := 0
for _, p := range ts.Parts {
switch x := p.(type) {
case *Expr:
hash += x.Hash()
case *Term:
hash += x.Value.Hash()
default:
panic(fmt.Sprintf("invalid template part type %T", p))
}
}
return hash
}
func (*TemplateString) IsGround() bool {
return false
}
func (ts *TemplateString) String() string {
str := strings.Builder{}
str.WriteString("$\"")
for _, p := range ts.Parts {
switch x := p.(type) {
case *Expr:
str.WriteByte('{')
str.WriteString(p.String())
str.WriteByte('}')
case *Term:
s := p.String()
if _, ok := x.Value.(String); ok {
s = strings.TrimPrefix(s, "\"")
s = strings.TrimSuffix(s, "\"")
s = EscapeTemplateStringStringPart(s)
}
str.WriteString(s)
default:
str.WriteString("<invalid>")
}
}
str.WriteByte('"')
return str.String()
}
func TemplateStringTerm(multiLine bool, parts ...Node) *Term {
return &Term{Value: &TemplateString{MultiLine: multiLine, Parts: parts}}
}
// EscapeTemplateStringStringPart escapes unescaped left curly braces in s - i.e "{" becomes "\{".
// The internal representation of string terms within a template string does **NOT**
// treat '{' as special, but expects code dealing with template strings to escape them when
// required, such as when serializing the complete template string. Code that programmatically
// constructs template strings should not pre-escape left curly braces in string term parts.
//
// // TODO(anders): a future optimization would be to combine this with the other escaping done
// // for strings (e.g. escaping quotes, backslashes, and JSON control characters) in a single operation
// // to avoid multiple passes and allocations over the same string. That's currently done by
// // strconv.Quote, so we would need to re-implement that logic in code of our own.
// // NOTE(anders): I would love to come up with a better name for this component than
// // "TemplateStringStringPart"..
func EscapeTemplateStringStringPart(s string) string {
numUnescaped := countUnescapedLeftCurly(s)
if numUnescaped == 0 {
return s
}
l := len(s)
escaped := make([]byte, 0, l+numUnescaped)
if s[0] == '{' {
escaped = append(escaped, '\\', s[0])
} else {
escaped = append(escaped, s[0])
}
for i := 1; i < l; i++ {
if s[i] == '{' && s[i-1] != '\\' {
escaped = append(escaped, '\\', s[i])
} else {
escaped = append(escaped, s[i])
}
}
return util.ByteSliceToString(escaped)
}
func countUnescapedLeftCurly(s string) (n int) {
// Note(anders): while not the functions I'd intuitively reach for to solve this,
// they are hands down the fastest option here, as they're done in assembly, which
// performs about an order of magnitude better than a manual loop in Go.
if n = strings.Count(s, "{"); n > 0 {
n -= strings.Count(s, `\{`)
}
return n
}
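// Illustrative expectations for the two helpers above (inputs chosen for this sketch):
func exampleEscapeTemplateStringStringPart() {
	_ = EscapeTemplateStringStringPart("hello {name}")  // "hello \{name}": the bare '{' gets escaped
	_ = EscapeTemplateStringStringPart(`already \{ok}`) // unchanged: no unescaped '{' present
	_ = EscapeTemplateStringStringPart("plain text")    // returned as-is, without allocating
}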
// Var represents a variable as defined by the language.
type Var string
@@ -1135,14 +951,14 @@ func (ref Ref) Insert(x *Term, pos int) Ref {
// Extend returns a copy of ref with the terms from other appended. The head of
// other will be converted to a string.
func (ref Ref) Extend(other Ref) Ref {
offset := len(ref)
dst := make(Ref, offset+len(other))
dst := make(Ref, len(ref)+len(other))
copy(dst, ref)
head := other[0].Copy()
head.Value = String(head.Value.(Var))
offset := len(ref)
dst[offset] = head
copy(dst[offset+1:], other[1:])
return dst
}
@@ -1254,38 +1070,42 @@ func (ref Ref) HasPrefix(other Ref) bool {
func (ref Ref) ConstantPrefix() Ref {
i := ref.Dynamic()
if i < 0 {
return ref
return ref.Copy()
}
return ref[:i]
return ref[:i].Copy()
}
// StringPrefix returns the string portion of the ref starting from the head.
func (ref Ref) StringPrefix() Ref {
for i := 1; i < len(ref); i++ {
switch ref[i].Value.(type) {
case String: // pass
default: // cut off
return ref[:i]
return ref[:i].Copy()
}
}
return ref
return ref.Copy()
}
// GroundPrefix returns the ground portion of the ref starting from the head. By
// definition, the head of the reference is always ground.
func (ref Ref) GroundPrefix() Ref {
for i := range ref {
if i > 0 && !ref[i].IsGround() {
return ref[:i]
}
if ref.IsGround() {
return ref
}
return ref
prefix := make(Ref, 0, len(ref))
for i, x := range ref {
if i > 0 && !x.IsGround() {
break
}
prefix = append(prefix, x)
}
return prefix
}
// DynamicSuffix returns the dynamic portion of the ref.
// If the ref is not dynamic, nil is returned.
func (ref Ref) DynamicSuffix() Ref {
i := ref.Dynamic()
if i < 0 {
@@ -1296,7 +1116,7 @@ func (ref Ref) DynamicSuffix() Ref {
// IsGround returns true if all of the parts of the Ref are ground.
func (ref Ref) IsGround() bool {
if len(ref) < 2 {
if len(ref) == 0 {
return true
}
return termSliceIsGround(ref[1:])
@@ -1316,30 +1136,19 @@ func (ref Ref) IsNested() bool {
// contains non-string terms this function returns an error. Path
// components are escaped.
func (ref Ref) Ptr() (string, error) {
buf := &strings.Builder{}
tail := ref[1:]
l := max(len(tail)-1, 0) // number of '/' to add
for i := range tail {
str, ok := tail[i].Value.(String)
if !ok {
parts := make([]string, 0, len(ref)-1)
for _, term := range ref[1:] {
if str, ok := term.Value.(String); ok {
parts = append(parts, url.PathEscape(string(str)))
} else {
return "", errors.New("invalid path value type")
}
l += len(str)
}
buf.Grow(l)
for i := range tail {
if i > 0 {
buf.WriteByte('/')
}
str := string(tail[i].Value.(String))
// Sadly, the url package does not expose an appender for this.
buf.WriteString(url.PathEscape(str))
}
return buf.String(), nil
return strings.Join(parts, "/"), nil
}
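// A hedged usage sketch for Ptr (the ref literal and the expected output are assumptions):
func examplePtr() (string, error) {
	// The head ("data") is skipped; every remaining string element is path-escaped
	// and the elements are joined with "/".
	return MustParseRef(`data.servers["db 1"].ports`).Ptr() // "servers/db%201/ports", nil
}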
var varRegexp = regexp.MustCompile("^[[:alpha:]_][[:alpha:][:digit:]_]*$")
func IsVarCompatibleString(s string) bool {
return varRegexp.MatchString(s)
}
@@ -1454,12 +1263,13 @@ type Array struct {
// Copy returns a deep copy of arr.
func (arr *Array) Copy() *Array {
cpy := make([]int, len(arr.elems))
copy(cpy, arr.hashs)
return &Array{
elems: termSliceCopy(arr.elems),
hashs: slices.Clone(arr.hashs),
hashs: cpy,
hash: arr.hash,
ground: arr.ground,
}
ground: arr.IsGround()}
}
// Equal returns true if arr is equal to other.
@@ -1738,19 +1548,13 @@ type set struct {
// Copy returns a deep copy of s.
func (s *set) Copy() Set {
cpy := &set{
hash: s.hash,
ground: s.ground,
sortGuard: sync.Once{},
elems: make(map[int]*Term, len(s.elems)),
keys: make([]*Term, 0, len(s.keys)),
terms := make([]*Term, len(s.keys))
for i := range s.keys {
terms[i] = s.keys[i].Copy()
}
for hash := range s.elems {
cpy.elems[hash] = s.elems[hash].Copy()
cpy.keys = append(cpy.keys, cpy.elems[hash])
}
cpy := NewSet(terms...).(*set)
cpy.hash = s.hash
cpy.ground = s.ground
return cpy
}
@@ -2505,21 +2309,19 @@ func (obj *object) Merge(other Object) (Object, bool) {
// is called. The conflictResolver can return a merged value and a boolean
// indicating if the merge has failed and should stop.
func (obj *object) MergeWith(other Object, conflictResolver func(v1, v2 *Term) (*Term, bool)) (Object, bool) {
// Might overallocate assuming no conflicts is the common case,
// but that's typically faster than iterating over each object twice.
result := newobject(obj.Len() + other.Len())
result := NewObject()
stop := obj.Until(func(k, v *Term) bool {
v2 := other.Get(k)
// The key didn't exist in other, keep the original value
if v2 == nil {
result.insert(k, v, false)
result.Insert(k, v)
return false
}
// The key exists in both, resolve the conflict if possible
merged, stop := conflictResolver(v, v2)
if !stop {
result.insert(k, merged, false)
result.Insert(k, merged)
}
return stop
})
@@ -2531,7 +2333,7 @@ func (obj *object) MergeWith(other Object, conflictResolver func(v1, v2 *Term) (
// Copy in any values from other for keys that don't exist in obj
other.Foreach(func(k, v *Term) {
if v2 := obj.Get(k); v2 == nil {
result.insert(k, v, false)
result.Insert(k, v)
}
})
return result, true
@@ -2931,28 +2733,12 @@ func (c Call) IsGround() bool {
return termSliceIsGround(c)
}
// MakeExpr returns a new Expr from this call.
// MakeExpr returns a new Expr from this call.
func (c Call) MakeExpr(output *Term) *Expr {
terms := []*Term(c)
return NewExpr(append(terms, output))
}
func (c Call) Operator() Ref {
if len(c) == 0 {
return nil
}
return c[0].Value.(Ref)
}
func (c Call) Operands() []*Term {
if len(c) < 1 {
return nil
}
return c[1:]
}
func (c Call) String() string {
args := make([]string, len(c)-1)
for i := 1; i < len(c); i++ {


@@ -19,6 +19,7 @@ type Transformer interface {
// Transform iterates the AST and calls the Transform function on the
// Transformer t for x before recursing.
func Transform(t Transformer, x any) (any, error) {
if term, ok := x.(*Term); ok {
return Transform(t, term.Value)
}
@@ -283,19 +284,6 @@ func Transform(t Transformer, x any) (any, error) {
}
}
return y, nil
case *TemplateString:
for i := range y.Parts {
if expr, ok := y.Parts[i].(*Expr); ok {
transformed, err := Transform(t, expr)
if err != nil {
return nil, err
}
if y.Parts[i], ok = transformed.(*Expr); !ok {
return nil, fmt.Errorf("illegal transform: %T != %T", expr, transformed)
}
}
}
return y, nil
default:
return y, nil
}
@@ -303,29 +291,29 @@ func Transform(t Transformer, x any) (any, error) {
// TransformRefs calls the function f on all references under x.
func TransformRefs(x any, f func(Ref) (Value, error)) (any, error) {
t := NewGenericTransformer(func(x any) (any, error) {
t := &GenericTransformer{func(x any) (any, error) {
if r, ok := x.(Ref); ok {
return f(r)
}
return x, nil
})
}}
return Transform(t, x)
}
// TransformVars calls the function f on all vars under x.
func TransformVars(x any, f func(Var) (Value, error)) (any, error) {
t := NewGenericTransformer(func(x any) (any, error) {
t := &GenericTransformer{func(x any) (any, error) {
if v, ok := x.(Var); ok {
return f(v)
}
return x, nil
})
}}
return Transform(t, x)
}
// TransformComprehensions calls the function f on all comprehensions under x.
// TransformComprehensions calls the function f on all comprehensions under x.
func TransformComprehensions(x any, f func(any) (Value, error)) (any, error) {
t := NewGenericTransformer(func(x any) (any, error) {
t := &GenericTransformer{func(x any) (any, error) {
switch x := x.(type) {
case *ArrayComprehension:
return f(x)
@@ -335,7 +323,7 @@ func TransformComprehensions(x any, f func(any) (Value, error)) (any, error) {
return f(x)
}
return x, nil
})
}}
return Transform(t, x)
}
@@ -399,7 +387,11 @@ func transformTerm(t Transformer, term *Term) (*Term, error) {
if err != nil {
return nil, err
}
return &Term{Value: v, Location: term.Location}, nil
r := &Term{
Value: v,
Location: term.Location,
}
return r, nil
}
func transformValue(t Transformer, v Value) (Value, error) {
@@ -415,18 +407,13 @@ func transformValue(t Transformer, v Value) (Value, error) {
}
func transformVar(t Transformer, v Var) (Var, error) {
tv, err := t.Transform(v)
v1, err := Transform(t, v)
if err != nil {
return "", err
}
if tv == nil {
return "", nil
}
r, ok := tv.(Var)
r, ok := v1.(Var)
if !ok {
return "", fmt.Errorf("illegal transform: %T != %T", v, tv)
return "", fmt.Errorf("illegal transform: %T != %T", v, v1)
}
return r, nil
}
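// A hedged usage sketch for TransformVars (the prefixing scheme is an assumption,
// chosen only to show the call shape):
func prefixUserVars(body Body) (Body, error) {
	y, err := TransformVars(body, func(v Var) (Value, error) {
		if v.IsWildcard() || v.IsGenerated() {
			return v, nil // leave wildcards and compiler-generated vars alone
		}
		return Var("local_" + string(v)), nil
	})
	if err != nil {
		return nil, err
	}
	return y.(Body), nil
}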


@@ -11,11 +11,12 @@ func isRefSafe(ref Ref, safe VarSet) bool {
case Call:
return isCallSafe(head, safe)
default:
vis := varVisitorPool.Get().WithParams(SafetyCheckVisitorParams)
vis.Walk(ref[0])
isSafe := vis.Vars().DiffCount(safe) == 0
varVisitorPool.Put(vis)
return isSafe
for v := range ref[0].Vars() {
if !safe.Contains(v) {
return false
}
}
return true
}
}


@@ -358,11 +358,6 @@
"Minor": 34,
"Patch": 0
},
"internal.template_string": {
"Major": 1,
"Minor": 12,
"Patch": 0
},
"internal.test_case": {
"Major": 1,
"Minor": 2,
@@ -1042,11 +1037,6 @@
"Major": 0,
"Minor": 59,
"Patch": 0
},
"template_strings": {
"Major": 1,
"Minor": 12,
"Patch": 0
}
},
"keywords": {


@@ -4,108 +4,44 @@
package ast
var (
termTypeVisitor = newTypeVisitor[*Term]()
varTypeVisitor = newTypeVisitor[Var]()
exprTypeVisitor = newTypeVisitor[*Expr]()
ruleTypeVisitor = newTypeVisitor[*Rule]()
refTypeVisitor = newTypeVisitor[Ref]()
bodyTypeVisitor = newTypeVisitor[Body]()
withTypeVisitor = newTypeVisitor[*With]()
)
// Visitor defines the interface for iterating AST elements. The Visit function
// can return a Visitor w which will be used to visit the children of the AST
// element v. If the Visit function returns nil, the children will not be
// visited.
//
// Deprecated: use GenericVisitor or another visitor implementation
type Visitor interface {
Visit(v any) (w Visitor)
}
type (
// GenericVisitor provides a utility to walk over AST nodes using a
// closure. If the closure returns true, the visitor will not walk
// over AST nodes under x.
GenericVisitor struct {
f func(x any) bool
}
// BeforeAndAfterVisitor wraps Visitor to provide hooks for being called before
// and after the AST has been visited.
//
// Deprecated: use GenericVisitor or another visitor implementation
type BeforeAndAfterVisitor interface {
Visitor
Before(x any)
After(x any)
}
// BeforeAfterVisitor provides a utility to walk over AST nodes using
// closures. If the before closure returns true, the visitor will not
// walk over AST nodes under x. The after closure is invoked always
// after visiting a node.
BeforeAfterVisitor struct {
before func(x any) bool
after func(x any)
}
// VarVisitor walks AST nodes under a given node and collects all encountered
// variables. The collected variables can be controlled by specifying
// VarVisitorParams when creating the visitor.
VarVisitor struct {
params VarVisitorParams
vars VarSet
}
// VarVisitorParams contains settings for a VarVisitor.
VarVisitorParams struct {
SkipRefHead bool
SkipRefCallHead bool
SkipObjectKeys bool
SkipClosures bool
SkipWithTarget bool
SkipSets bool
}
// Visitor defines the interface for iterating AST elements. The Visit function
// can return a Visitor w which will be used to visit the children of the AST
// element v. If the Visit function returns nil, the children will not be
// visited.
//
// Deprecated: use [GenericVisitor] or another visitor implementation
Visitor interface {
Visit(v any) (w Visitor)
}
// BeforeAndAfterVisitor wraps Visitor to provide hooks for being called before
// and after the AST has been visited.
//
// Deprecated: use [GenericVisitor] or another visitor implementation
BeforeAndAfterVisitor interface {
Visitor
Before(x any)
After(x any)
}
// typeVisitor is a generic visitor for a specific type T (the "generic" name was
// however taken). Contrary to the [GenericVisitor], the typeVisitor only invokes
// the visit function for nodes of type T, saving both CPU cycles and type assertions.
// typeVisitor implementations carry no state, and can be shared freely across
// goroutines. Access is private for the time being, as there is already inflation
// in visitor types exposed in the AST package. The various WalkXXX functions however
// now leverage typeVisitor under the hood.
//
// While a typeVisitor is generally a more performant option over a GenericVisitor,
// it is not as flexible: a type visitor can only visit nodes of a single type T,
// whereas a GenericVisitor visits all nodes. Adding to that, a typeVisitor can only
// be instantiated for **concrete types** — not interfaces (e.g., [*Expr], not [Node]),
// as reflection would be required to determine the concrete type at runtime, thus
// nullifying the performance benefits of the typeVisitor in the first place.
typeVisitor[T any] struct {
typ any
}
)
// Walk iterates the AST by calling the Visit function on the [Visitor]
// Walk iterates the AST by calling the Visit function on the Visitor
// v for x before recursing.
//
// Deprecated: use [GenericVisitor.Walk]
// Deprecated: use GenericVisitor.Walk
func Walk(v Visitor, x any) {
if bav, ok := v.(BeforeAndAfterVisitor); !ok {
walk(v, x)
} else {
bav.Before(x)
defer bav.After(x)
walk(bav, x)
bav.After(x)
}
}
// WalkBeforeAndAfter iterates the AST by calling the Visit function on the
// Visitor v for x before recursing.
//
// Deprecated: use [GenericVisitor.Walk]
// Deprecated: use GenericVisitor.Walk
func WalkBeforeAndAfter(v BeforeAndAfterVisitor, x any) {
Walk(v, x)
}
@@ -217,258 +153,132 @@ func walk(v Visitor, x any) {
for i := range x.Symbols {
Walk(w, x.Symbols[i])
}
case *TemplateString:
for i := range x.Parts {
Walk(w, x.Parts[i])
}
}
}
// WalkVars calls the function f on all vars under x. If the function f
// returns true, AST nodes under the last node will not be visited.
func WalkVars(x any, f func(Var) bool) {
varTypeVisitor.walk(x, f)
vis := &GenericVisitor{func(x any) bool {
if v, ok := x.(Var); ok {
return f(v)
}
return false
}}
vis.Walk(x)
}
// WalkClosures calls the function f on all closures under x. If the function f
// returns true, AST nodes under the last node will not be visited.
func WalkClosures(x any, f func(any) bool) {
vis := NewGenericVisitor(func(x any) bool {
vis := &GenericVisitor{func(x any) bool {
switch x := x.(type) {
case *ArrayComprehension, *ObjectComprehension, *SetComprehension, *Every:
return f(x)
}
return false
})
}}
vis.Walk(x)
}
// WalkRefs calls the function f on all references under x. If the function f
// returns true, AST nodes under the last node will not be visited.
func WalkRefs(x any, f func(Ref) bool) {
refTypeVisitor.walk(x, f)
vis := &GenericVisitor{func(x any) bool {
if r, ok := x.(Ref); ok {
return f(r)
}
return false
}}
vis.Walk(x)
}
// WalkTerms calls the function f on all terms under x. If the function f
// returns true, AST nodes under the last node will not be visited.
func WalkTerms(x any, f func(*Term) bool) {
termTypeVisitor.walk(x, f)
vis := &GenericVisitor{func(x any) bool {
if term, ok := x.(*Term); ok {
return f(term)
}
return false
}}
vis.Walk(x)
}
// WalkWiths calls the function f on all with modifiers under x. If the function f
// returns true, AST nodes under the last node will not be visited.
func WalkWiths(x any, f func(*With) bool) {
withTypeVisitor.walk(x, f)
vis := &GenericVisitor{func(x any) bool {
if w, ok := x.(*With); ok {
return f(w)
}
return false
}}
vis.Walk(x)
}
// WalkExprs calls the function f on all expressions under x. If the function f
// returns true, AST nodes under the last node will not be visited.
func WalkExprs(x any, f func(*Expr) bool) {
exprTypeVisitor.walk(x, f)
vis := &GenericVisitor{func(x any) bool {
if r, ok := x.(*Expr); ok {
return f(r)
}
return false
}}
vis.Walk(x)
}
// WalkBodies calls the function f on all bodies under x. If the function f
// returns true, AST nodes under the last node will not be visited.
func WalkBodies(x any, f func(Body) bool) {
bodyTypeVisitor.walk(x, f)
vis := &GenericVisitor{func(x any) bool {
if b, ok := x.(Body); ok {
return f(b)
}
return false
}}
vis.Walk(x)
}
// WalkRules calls the function f on all rules under x. If the function f
// returns true, AST nodes under the last node will not be visited.
func WalkRules(x any, f func(*Rule) bool) {
switch x := x.(type) {
case *Module:
for i := range x.Rules {
if !f(x.Rules[i]) && x.Rules[i].Else != nil {
WalkRules(x.Rules[i].Else, f)
vis := &GenericVisitor{func(x any) bool {
if r, ok := x.(*Rule); ok {
stop := f(r)
// NOTE(tsandall): since rules cannot be embedded inside of queries
// we can stop early if there is no else block.
if stop || r.Else == nil {
return true
}
}
case *Rule:
if !f(x) && x.Else != nil {
WalkRules(x.Else, f)
}
default:
ruleTypeVisitor.walk(x, f)
}
return false
}}
vis.Walk(x)
}
// WalkNodes calls the function f on all nodes under x. If the function f
// returns true, AST nodes under the last node will not be visited.
func WalkNodes(x any, f func(Node) bool) {
vis := NewGenericVisitor(func(x any) bool {
vis := &GenericVisitor{func(x any) bool {
if n, ok := x.(Node); ok {
return f(n)
}
return false
})
}}
vis.Walk(x)
}
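// A hedged usage sketch for the Walk helpers above (the collection target is illustrative):
func collectStringValues(x any) []String {
	var out []String
	WalkTerms(x, func(t *Term) bool {
		if s, ok := t.Value.(String); ok {
			out = append(out, s)
		}
		return false // keep descending into child nodes
	})
	return out
}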
func newTypeVisitor[T any]() *typeVisitor[T] {
var t T
return &typeVisitor[T]{typ: any(t)}
}
func (tv *typeVisitor[T]) walkArgs(args Args, visit func(x T) bool) {
// If T is not Args, avoid allocation by inlining the walk.
if _, ok := tv.typ.(Args); !ok {
for i := range args {
tv.walk(args[i], visit)
}
} else {
tv.walk(args, visit) // allocates
}
}
func (tv *typeVisitor[T]) walkBody(body Body, visit func(x T) bool) {
if _, ok := tv.typ.(Body); !ok {
for i := range body {
tv.walk(body[i], visit)
}
} else {
tv.walk(body, visit) // allocates
}
}
func (tv *typeVisitor[T]) walkRef(ref Ref, visit func(x T) bool) {
if _, ok := tv.typ.(Ref); !ok {
for i := range ref {
tv.walk(ref[i], visit)
}
} else {
tv.walk(ref, visit) // allocates
}
}
func (tv *typeVisitor[T]) walk(x any, visit func(x T) bool) {
if v, ok := x.(T); ok && visit(v) {
return
}
switch x := x.(type) {
case *Module:
tv.walk(x.Package, visit)
for i := range x.Imports {
tv.walk(x.Imports[i], visit)
}
for i := range x.Rules {
tv.walk(x.Rules[i], visit)
}
for i := range x.Annotations {
tv.walk(x.Annotations[i], visit)
}
for i := range x.Comments {
tv.walk(x.Comments[i], visit)
}
case *Package:
tv.walkRef(x.Path, visit)
case *Import:
tv.walk(x.Path, visit)
if _, ok := tv.typ.(Var); ok {
tv.walk(x.Alias, visit)
}
case *Rule:
tv.walk(x.Head, visit)
tv.walkBody(x.Body, visit)
if x.Else != nil {
tv.walk(x.Else, visit)
}
case *Head:
if _, ok := tv.typ.(Var); ok {
tv.walk(x.Name, visit)
}
tv.walkArgs(x.Args, visit)
if x.Key != nil {
tv.walk(x.Key, visit)
}
if x.Value != nil {
tv.walk(x.Value, visit)
}
case Body:
for i := range x {
tv.walk(x[i], visit)
}
case Args:
for i := range x {
tv.walk(x[i], visit)
}
case *Expr:
switch ts := x.Terms.(type) {
case *Term, *SomeDecl, *Every:
tv.walk(ts, visit)
case []*Term:
for i := range ts {
tv.walk(ts[i], visit)
}
}
for i := range x.With {
tv.walk(x.With[i], visit)
}
case *With:
tv.walk(x.Target, visit)
tv.walk(x.Value, visit)
case *Term:
tv.walk(x.Value, visit)
case Ref:
for i := range x {
tv.walk(x[i], visit)
}
case *object:
x.Foreach(func(k, v *Term) {
tv.walk(k, visit)
tv.walk(v, visit)
})
case Object:
for _, k := range x.Keys() {
tv.walk(k, visit)
tv.walk(x.Get(k), visit)
}
case *Array:
for i := range x.Len() {
tv.walk(x.Elem(i), visit)
}
case Set:
xSlice := x.Slice()
for i := range xSlice {
tv.walk(xSlice[i], visit)
}
case *ArrayComprehension:
tv.walk(x.Term, visit)
tv.walkBody(x.Body, visit)
case *ObjectComprehension:
tv.walk(x.Key, visit)
tv.walk(x.Value, visit)
tv.walkBody(x.Body, visit)
case *SetComprehension:
tv.walk(x.Term, visit)
tv.walkBody(x.Body, visit)
case Call:
for i := range x {
tv.walk(x[i], visit)
}
case *Every:
if x.Key != nil {
tv.walk(x.Key, visit)
}
tv.walk(x.Value, visit)
tv.walk(x.Domain, visit)
tv.walkBody(x.Body, visit)
case *SomeDecl:
for i := range x.Symbols {
tv.walk(x.Symbols[i], visit)
}
case *TemplateString:
for i := range x.Parts {
tv.walk(x.Parts[i], visit)
}
}
// GenericVisitor provides a utility to walk over AST nodes using a
// closure. If the closure returns true, the visitor will not walk
// over AST nodes under x.
type GenericVisitor struct {
f func(x any) bool
}
// NewGenericVisitor returns a new GenericVisitor that will invoke the function
// f on AST nodes. Note that while it returns a pointer, creating a GenericVisitor
// doesn't commonly allocate it on the heap, as long as it doesn't escape the function
// in which it is created and used (as it's trivially inlined).
// f on AST nodes.
func NewGenericVisitor(f func(x any) bool) *GenericVisitor {
return &GenericVisitor{f}
}
@@ -500,9 +310,7 @@ func (vis *GenericVisitor) Walk(x any) {
vis.Walk(x.Path)
case *Import:
vis.Walk(x.Path)
if x.Alias != "" {
vis.f(x.Alias)
}
vis.Walk(x.Alias)
case *Rule:
vis.Walk(x.Head)
vis.Walk(x.Body)
@@ -510,12 +318,8 @@ func (vis *GenericVisitor) Walk(x any) {
vis.Walk(x.Else)
}
case *Head:
if x.Name != "" {
vis.f(x.Name)
}
if x.Args != nil {
vis.Walk(x.Args)
}
vis.Walk(x.Name)
vis.Walk(x.Args)
if x.Key != nil {
vis.Walk(x.Key)
}
@@ -595,13 +399,18 @@ func (vis *GenericVisitor) Walk(x any) {
for i := range x.Symbols {
vis.Walk(x.Symbols[i])
}
case *TemplateString:
for i := range x.Parts {
vis.Walk(x.Parts[i])
}
}
}
// BeforeAfterVisitor provides a utility to walk over AST nodes using
// closures. If the before closure returns true, the visitor will not
// walk over AST nodes under x. The after closure is invoked always
// after visiting a node.
type BeforeAfterVisitor struct {
before func(x any) bool
after func(x any)
}
// NewBeforeAfterVisitor returns a new BeforeAndAfterVisitor that
// will invoke the functions before and after AST nodes.
func NewBeforeAfterVisitor(before func(x any) bool, after func(x any)) *BeforeAfterVisitor {
@@ -733,29 +542,31 @@ func (vis *BeforeAfterVisitor) Walk(x any) {
}
}
// NewVarVisitor returns a new [VarVisitor] object.
// VarVisitor walks AST nodes under a given node and collects all encountered
// variables. The collected variables can be controlled by specifying
// VarVisitorParams when creating the visitor.
type VarVisitor struct {
params VarVisitorParams
vars VarSet
}
// VarVisitorParams contains settings for a VarVisitor.
type VarVisitorParams struct {
SkipRefHead bool
SkipRefCallHead bool
SkipObjectKeys bool
SkipClosures bool
SkipWithTarget bool
SkipSets bool
}
// NewVarVisitor returns a new VarVisitor object.
func NewVarVisitor() *VarVisitor {
return &VarVisitor{
vars: NewVarSet(),
}
}
// ClearOrNewVarVisitor clears a non-nil [VarVisitor] or returns a new one.
func ClearOrNewVarVisitor(vis *VarVisitor) *VarVisitor {
if vis == nil {
return NewVarVisitor()
}
return vis.Clear()
}
// ClearOrNew resets the visitor to its initial state, or returns a new one if nil.
//
// Deprecated: use [ClearOrNewVarVisitor] instead.
func (vis *VarVisitor) ClearOrNew() *VarVisitor {
return ClearOrNewVarVisitor(vis)
}
// Clear resets the visitor to its initial state, and returns it for chaining.
func (vis *VarVisitor) Clear() *VarVisitor {
vis.params = VarVisitorParams{}
@@ -764,6 +575,14 @@ func (vis *VarVisitor) Clear() *VarVisitor {
return vis
}
// ClearOrNew returns a new VarVisitor if vis is nil, or else a cleared VarVisitor.
func (vis *VarVisitor) ClearOrNew() *VarVisitor {
if vis == nil {
return NewVarVisitor()
}
return vis.Clear()
}
// WithParams sets the parameters in params on vis.
func (vis *VarVisitor) WithParams(params VarVisitorParams) *VarVisitor {
vis.params = params
@@ -779,7 +598,7 @@ func (vis *VarVisitor) Add(v Var) {
}
}
// Vars returns a [VarSet] that contains collected vars.
// Vars returns a VarSet that contains collected vars.
func (vis *VarVisitor) Vars() VarSet {
return vis.vars
}
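// A hedged usage sketch for VarVisitor with params (the parameter choice is an assumption):
func topLevelVars(body Body) VarSet {
	vis := NewVarVisitor().WithParams(VarVisitorParams{SkipClosures: true, SkipObjectKeys: true})
	vis.Walk(body)
	return vis.Vars()
}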
@@ -806,7 +625,7 @@ func (vis *VarVisitor) visit(v any) bool {
}
if vis.params.SkipClosures {
switch v := v.(type) {
case *ArrayComprehension, *ObjectComprehension, *SetComprehension, *TemplateString:
case *ArrayComprehension, *ObjectComprehension, *SetComprehension:
return true
case *Expr:
if ev, ok := v.Terms.(*Every); ok {
@@ -876,8 +695,9 @@ func (vis *VarVisitor) visit(v any) bool {
return false
}
// Walk iterates the AST by calling the function f on the [VarVisitor] before recursing.
// Contrary to the deprecated [Walk] function, this does not require allocating the visitor from heap.
// Walk iterates the AST by calling the function f on the
// GenericVisitor before recursing. Contrary to the generic Walk, this
// does not require allocating the visitor from heap.
func (vis *VarVisitor) Walk(x any) {
if vis.visit(x) {
return
@@ -885,9 +705,16 @@ func (vis *VarVisitor) Walk(x any) {
switch x := x.(type) {
case *Module:
vis.Walk(x.Package)
for i := range x.Imports {
vis.Walk(x.Imports[i])
}
for i := range x.Rules {
vis.Walk(x.Rules[i])
}
for i := range x.Comments {
vis.Walk(x.Comments[i])
}
case *Package:
vis.WalkRef(x.Path)
case *Import:
@@ -940,9 +767,9 @@ func (vis *VarVisitor) Walk(x any) {
vis.Walk(x[i].Value)
}
case *object:
x.Foreach(func(k, v *Term) {
x.Foreach(func(k, _ *Term) {
vis.Walk(k)
vis.Walk(v)
vis.Walk(x.Get(k))
})
case *Array:
x.Foreach(func(t *Term) {
@@ -978,10 +805,6 @@ func (vis *VarVisitor) Walk(x any) {
for i := range x.Symbols {
vis.Walk(x.Symbols[i])
}
case *TemplateString:
for i := range x.Parts {
vis.Walk(x.Parts[i])
}
}
}


@@ -970,7 +970,7 @@ func compileModules(compiler *ast.Compiler, m metrics.Metrics, bundles map[strin
m.Timer(metrics.RegoModuleCompile).Start()
defer m.Timer(metrics.RegoModuleCompile).Stop()
modules := make(map[string]*ast.Module, len(compiler.Modules)+len(extraModules)+len(bundles))
modules := map[string]*ast.Module{}
// preserve any modules already on the compiler
maps.Copy(modules, compiler.Modules)


@@ -27,6 +27,8 @@ import (
const defaultLocationFile = "__format_default__"
var (
elseVar ast.Value = ast.Var("else")
expandedConst = ast.NewBody(ast.NewExpr(ast.InternedTerm(true)))
commentsSlicePool = util.NewSlicePool[*ast.Comment](50)
varRegexp = regexp.MustCompile("^[[:alpha:]_][[:alpha:][:digit:]_]*$")
@@ -730,7 +732,7 @@ func (w *writer) writeElse(rule *ast.Rule, comments []*ast.Comment) ([]*ast.Comm
rule.Else.Head.Name = "else" // NOTE(sr): whaaat
elseHeadReference := ast.VarTerm("else") // construct a reference for the term
elseHeadReference := ast.NewTerm(elseVar) // construct a reference for the term
elseHeadReference.Location = rule.Else.Head.Location // and set the location to match the rule location
rule.Else.Head.Reference = ast.Ref{elseHeadReference}
@@ -1282,11 +1284,6 @@ func (w *writer) writeTermParens(parens bool, term *ast.Term, comments []*ast.Co
}
}
case *ast.TemplateString:
comments, err = w.writeTemplateString(x, comments)
if err != nil {
return nil, err
}
case ast.Var:
w.write(w.formatVar(x))
case ast.Call:
@@ -1304,91 +1301,6 @@ func (w *writer) writeTermParens(parens bool, term *ast.Term, comments []*ast.Co
return comments, nil
}
func (w *writer) writeTemplateString(ts *ast.TemplateString, comments []*ast.Comment) ([]*ast.Comment, error) {
w.write("$")
if ts.MultiLine {
w.write("`")
} else {
w.write(`"`)
}
for i, p := range ts.Parts {
switch x := p.(type) {
case *ast.Expr:
w.write("{")
w.up()
if w.beforeEnd != nil {
// We have a comment on the same line as the opening template-expression brace '{'
w.endLine()
w.startLine()
} else {
// We might have comments to write; the first of which should be on the same line as the opening template-expression brace '{'
before, _, _ := partitionComments(comments, x.Location)
if len(before) > 0 {
w.write(" ")
w.inline = true
if err := w.writeComments(before); err != nil {
return nil, err
}
comments = comments[len(before):]
}
}
var err error
comments, err = w.writeExpr(x, comments)
if err != nil {
return comments, err
}
// write trailing comments
if i+1 < len(ts.Parts) {
before, _, _ := partitionComments(comments, ts.Parts[i+1].Loc())
if len(before) > 0 {
w.endLine()
if err := w.writeComments(before); err != nil {
return nil, err
}
comments = comments[len(before):]
w.startLine()
}
}
w.write("}")
if err := w.down(); err != nil {
return nil, err
}
case *ast.Term:
if s, ok := x.Value.(ast.String); ok {
if ts.MultiLine {
w.write(ast.EscapeTemplateStringStringPart(string(s)))
} else {
str := ast.EscapeTemplateStringStringPart(s.String())
w.write(str[1 : len(str)-1])
}
} else {
s := x.String()
s = strings.TrimPrefix(s, "\"")
s = strings.TrimSuffix(s, "\"")
w.write(s)
}
default:
w.write("<invalid>")
}
}
if ts.MultiLine {
w.write("`")
} else {
w.write(`"`)
}
return comments, nil
}
func (w *writer) writeRef(x ast.Ref, comments []*ast.Comment) ([]*ast.Comment, error) {
if len(x) > 0 {
parens := false
@@ -2019,7 +1931,7 @@ func partitionComments(comments []*ast.Comment, l *ast.Location) ([]*ast.Comment
var at *ast.Comment
before := make([]*ast.Comment, 0, numBefore)
after := make([]*ast.Comment, 0, numAfter)
after := comments[0 : 0 : len(comments)-numBefore]
for _, c := range comments {
switch cmp := c.Location.Row - l.Row; {


@@ -2212,7 +2212,7 @@ func (r *Rego) compileQuery(query ast.Body, imports []*ast.Import, _ metrics.Met
if r.pkg != "" {
var err error
pkg, err = ast.ParsePackage("package " + r.pkg)
pkg, err = ast.ParsePackage(fmt.Sprintf("package %v", r.pkg))
if err != nil {
return nil, nil, err
}


@@ -216,25 +216,16 @@ func (vis namespacingVisitor) Visit(x any) bool {
switch x := x.(type) {
case *ast.ArrayComprehension:
x.Term = vis.namespaceTerm(x.Term)
vis := ast.NewGenericVisitor(vis.Visit)
for _, expr := range x.Body {
vis.Walk(expr)
}
ast.NewGenericVisitor(vis.Visit).Walk(x.Body)
return true
case *ast.SetComprehension:
x.Term = vis.namespaceTerm(x.Term)
vis := ast.NewGenericVisitor(vis.Visit)
for _, expr := range x.Body {
vis.Walk(expr)
}
ast.NewGenericVisitor(vis.Visit).Walk(x.Body)
return true
case *ast.ObjectComprehension:
x.Key = vis.namespaceTerm(x.Key)
x.Value = vis.namespaceTerm(x.Value)
vis := ast.NewGenericVisitor(vis.Visit)
for _, expr := range x.Body {
vis.Walk(expr)
}
ast.NewGenericVisitor(vis.Visit).Walk(x.Body)
return true
case *ast.Expr:
switch terms := x.Terms.(type) {


@@ -344,7 +344,7 @@ func (p *CopyPropagator) livevarRef(a *ast.Term) bool {
}
for _, v := range p.sorted {
if v.Equal(ref[0].Value) {
if ref[0].Value.Compare(v) == 0 {
return true
}
}
@@ -403,7 +403,7 @@ func containedIn(value ast.Value, x any) bool {
if v, ok := value.(ast.Ref); ok {
match = x.HasPrefix(v)
} else {
match = x.Equal(value)
match = x.Compare(value) == 0
}
if stop || match {
stop = true


@@ -28,6 +28,7 @@ func (h printHook) Print(_ print.Context, msg string) error {
}
func builtinPrint(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Term) error) error {
if bctx.PrintHook == nil {
return iter(nil)
}
@@ -39,7 +40,7 @@ func builtinPrint(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Term
buf := make([]string, arr.Len())
err = builtinPrintCrossProductOperands(bctx.Location, buf, arr, 0, func(buf []string) error {
err = builtinPrintCrossProductOperands(bctx, buf, arr, 0, func(buf []string) error {
pctx := print.Context{
Context: bctx.Context,
Location: bctx.Location,
@@ -53,32 +54,20 @@ func builtinPrint(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Term
return iter(nil)
}
func builtinPrintCrossProductOperands(loc *ast.Location, buf []string, operands *ast.Array, i int, f func([]string) error) error {
func builtinPrintCrossProductOperands(bctx BuiltinContext, buf []string, operands *ast.Array, i int, f func([]string) error) error {
if i >= operands.Len() {
return f(buf)
}
operand := operands.Elem(i)
// We allow primitives ...
switch x := operand.Value.(type) {
case ast.String:
buf[i] = string(x)
return builtinPrintCrossProductOperands(loc, buf, operands, i+1, f)
case ast.Number, ast.Boolean, ast.Null:
buf[i] = x.String()
return builtinPrintCrossProductOperands(loc, buf, operands, i+1, f)
}
// ... but all other operand types must be sets.
xs, ok := operand.Value.(ast.Set)
xs, ok := operands.Elem(i).Value.(ast.Set)
if !ok {
return Halt{Err: internalErr(loc, "illegal argument type: "+ast.ValueName(operand.Value))}
return Halt{Err: internalErr(bctx.Location, fmt.Sprintf("illegal argument type: %v", ast.ValueName(operands.Elem(i).Value)))}
}
if xs.Len() == 0 {
buf[i] = "<undefined>"
return builtinPrintCrossProductOperands(loc, buf, operands, i+1, f)
return builtinPrintCrossProductOperands(bctx, buf, operands, i+1, f)
}
return xs.Iter(func(x *ast.Term) error {
@@ -88,7 +77,7 @@ func builtinPrintCrossProductOperands(loc *ast.Location, buf []string, operands
default:
buf[i] = v.String()
}
return builtinPrintCrossProductOperands(loc, buf, operands, i+1, f)
return builtinPrintCrossProductOperands(bctx, buf, operands, i+1, f)
})
}


@@ -134,7 +134,7 @@ func (q *Query) WithTracer(tracer Tracer) *Query {
// WithQueryTracer adds a query tracer to use during evaluation. This is optional.
// Disabled QueryTracers will be ignored.
func (q *Query) WithQueryTracer(tracer QueryTracer) *Query {
if tracer == nil || !tracer.Enabled() {
if !tracer.Enabled() {
return q
}


@@ -1,73 +0,0 @@
package topdown
import (
"bytes"
"io"
)
var _ io.Writer = (*sinkW)(nil)
type sinkWriter interface {
io.Writer
String() string
Grow(int)
WriteByte(byte) error
WriteString(string) (int, error)
}
type sinkW struct {
buf *bytes.Buffer
cancel Cancel
err error
}
func newSink(name string, hint int, c Cancel) sinkWriter {
b := &bytes.Buffer{}
if hint > 0 {
b.Grow(hint)
}
if c == nil {
return b
}
return &sinkW{
cancel: c,
buf: b,
err: Halt{
Err: &Error{
Code: CancelErr,
Message: name + ": timed out before finishing",
},
},
}
}
func (sw *sinkW) Grow(n int) {
sw.buf.Grow(n)
}
func (sw *sinkW) Write(bs []byte) (int, error) {
if sw.cancel.Cancelled() {
return 0, sw.err
}
return sw.buf.Write(bs)
}
func (sw *sinkW) WriteByte(b byte) error {
if sw.cancel.Cancelled() {
return sw.err
}
return sw.buf.WriteByte(b)
}
func (sw *sinkW) WriteString(s string) (int, error) {
if sw.cancel.Cancelled() {
return 0, sw.err
}
return sw.buf.WriteString(s)
}
func (sw *sinkW) String() string {
return sw.buf.String()
}
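// A hedged sketch of how the removed sink was intended to be used by string builtins:
// writes stop early once evaluation has been cancelled. The function name, the
// "example.join" label and the inputs are illustrative assumptions.
func joinWithSink(parts []string, sep string, c Cancel) (string, error) {
	sb := newSink("example.join", 0, c)
	for i, p := range parts {
		if i > 0 {
			if _, err := sb.WriteString(sep); err != nil {
				return "", err
			}
		}
		if _, err := sb.WriteString(p); err != nil {
			return "", err
		}
	}
	return sb.String(), nil
}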


@@ -152,7 +152,7 @@ func builtinFormatInt(_ BuiltinContext, operands []*ast.Term, iter func(*ast.Ter
return iter(ast.InternedTerm(fmt.Sprintf(format, i)))
}
func builtinConcat(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Term) error) error {
func builtinConcat(_ BuiltinContext, operands []*ast.Term, iter func(*ast.Term) error) error {
join, err := builtins.StringOperand(operands[0].Value, 1)
if err != nil {
return err
@@ -163,13 +163,11 @@ func builtinConcat(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Ter
return iter(term)
}
sb := newSink(ast.Concat.Name, 0, bctx.Cancel)
// NOTE(anderseknert):
// More or less Go's strings.Join implementation, but where we avoid
// creating an intermediate []string slice to pass to that function,
// as that's expensive (3.5x more space allocated). Instead we build
// the string directly using the sink to concatenate the string
// the string directly using a strings.Builder to concatenate the string
// values from the array/set with the separator.
n := 0
switch b := operands[1].Value.(type) {
@@ -184,36 +182,25 @@ func builtinConcat(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Ter
}
sep := string(join)
n += len(sep) * (l - 1)
var sb strings.Builder
sb.Grow(n)
if _, err := sb.WriteString(string(b.Elem(0).Value.(ast.String))); err != nil {
return err
}
sb.WriteString(string(b.Elem(0).Value.(ast.String)))
if sep == "" {
for i := 1; i < l; i++ {
if _, err := sb.WriteString(string(b.Elem(i).Value.(ast.String))); err != nil {
return err
}
sb.WriteString(string(b.Elem(i).Value.(ast.String)))
}
} else if len(sep) == 1 {
// when the separator is a single byte, sb.WriteByte is substantially faster
bsep := sep[0]
for i := 1; i < l; i++ {
if err := sb.WriteByte(bsep); err != nil {
return err
}
if _, err := sb.WriteString(string(b.Elem(i).Value.(ast.String))); err != nil {
return err
}
sb.WriteByte(bsep)
sb.WriteString(string(b.Elem(i).Value.(ast.String)))
}
} else {
// for longer separators, there is no such difference between WriteString and Write
for i := 1; i < l; i++ {
if _, err := sb.WriteString(sep); err != nil {
return err
}
if _, err := sb.WriteString(string(b.Elem(i).Value.(ast.String))); err != nil {
return err
}
sb.WriteString(sep)
sb.WriteString(string(b.Elem(i).Value.(ast.String)))
}
}
return iter(ast.InternedTerm(sb.String()))
@@ -228,15 +215,12 @@ func builtinConcat(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Ter
sep := string(join)
l := b.Len()
n += len(sep) * (l - 1)
var sb strings.Builder
sb.Grow(n)
for i, v := range b.Slice() {
if _, err := sb.WriteString(string(v.Value.(ast.String))); err != nil {
return err
}
sb.WriteString(string(v.Value.(ast.String)))
if i < l-1 {
if _, err := sb.WriteString(sep); err != nil {
return err
}
sb.WriteString(sep)
}
}
return iter(ast.InternedTerm(sb.String()))
@@ -539,7 +523,7 @@ func builtinSplit(_ BuiltinContext, operands []*ast.Term, iter func(*ast.Term) e
return iter(ast.ArrayTerm(util.SplitMap(text, delim, ast.InternedTerm)...))
}
func builtinReplace(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Term) error) error {
func builtinReplace(_ BuiltinContext, operands []*ast.Term, iter func(*ast.Term) error) error {
s, err := builtins.StringOperand(operands[0].Value, 1)
if err != nil {
return err
@@ -555,12 +539,7 @@ func builtinReplace(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Te
return err
}
sink := newSink(ast.Replace.Name, len(s), bctx.Cancel)
replacer := strings.NewReplacer(string(old), string(n))
if _, err := replacer.WriteString(sink, string(s)); err != nil {
return err
}
replaced := sink.String()
replaced := strings.ReplaceAll(string(s), string(old), string(n))
if replaced == string(s) {
return iter(operands[0])
}
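
In the retained version, `builtinReplace` delegates to `strings.ReplaceAll` and, when the result is identical to the input, returns `operands[0]` so no new term is built. A hedged sketch of that fast path with plain strings (names are illustrative, not OPA API):

```go
package main

import (
	"fmt"
	"strings"
)

// replaceOrReuse mirrors the unchanged-input fast path: when nothing matches,
// the input is handed back as-is, which in the built-in means reusing the
// original operand term instead of interning a new one.
func replaceOrReuse(s, old, repl string) string {
	replaced := strings.ReplaceAll(s, old, repl)
	if replaced == s {
		return s
	}
	return replaced
}

func main() {
	fmt.Println(replaceOrReuse("hello", "x", "y")) // hello (input reused)
	fmt.Println(replaceOrReuse("hello", "l", "L")) // heLLo
}
```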
@@ -568,7 +547,7 @@ func builtinReplace(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Te
return iter(ast.InternedTerm(replaced))
}
func builtinReplaceN(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Term) error) error {
func builtinReplaceN(_ BuiltinContext, operands []*ast.Term, iter func(*ast.Term) error) error {
patterns, err := builtins.ObjectOperand(operands[0].Value, 1)
if err != nil {
return err
@@ -595,12 +574,7 @@ func builtinReplaceN(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.T
oldnewArr = append(oldnewArr, string(keyVal), string(strVal))
}
sink := newSink(ast.ReplaceN.Name, len(s), bctx.Cancel)
replacer := strings.NewReplacer(oldnewArr...)
if _, err := replacer.WriteString(sink, string(s)); err != nil {
return err
}
return iter(ast.InternedTerm(sink.String()))
return iter(ast.InternedTerm(strings.NewReplacer(oldnewArr...).Replace(string(s))))
}
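
`builtinReplaceN` flattens the pattern object into old/new pairs and hands them to a single `strings.NewReplacer`, which applies all pairs in one pass over the input. A rough sketch under the assumption of a plain `map[string]string` operand (names are illustrative; the real built-in iterates an `ast.Object`, so its pair order is not tied to Go map iteration):

```go
package main

import (
	"fmt"
	"strings"
)

// replaceN builds the flat old/new pair list and applies it with one Replacer.
// Note that Go map iteration order is nondeterministic; this only matters if
// patterns overlap.
func replaceN(patterns map[string]string, s string) string {
	oldnew := make([]string, 0, len(patterns)*2)
	for old, repl := range patterns {
		oldnew = append(oldnew, old, repl)
	}
	return strings.NewReplacer(oldnew...).Replace(s)
}

func main() {
	fmt.Println(replaceN(map[string]string{"a": "1", "b": "2"}, "abc")) // 12c
}
```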
func builtinTrim(_ BuiltinContext, operands []*ast.Term, iter func(*ast.Term) error) error {


@@ -1,45 +0,0 @@
// Copyright 2025 The OPA Authors. All rights reserved.
// Use of this source code is governed by an Apache2
// license that can be found in the LICENSE file.
package topdown
import (
"strings"
"github.com/open-policy-agent/opa/v1/ast"
"github.com/open-policy-agent/opa/v1/topdown/builtins"
)
func builtinTemplateString(bctx BuiltinContext, operands []*ast.Term, iter func(*ast.Term) error) error {
arr, err := builtins.ArrayOperand(operands[0].Value, 1)
if err != nil {
return err
}
buf := make([]string, arr.Len())
var count int
err = builtinPrintCrossProductOperands(bctx.Location, buf, arr, 0, func(buf []string) error {
count += 1
// Precautionary run-time assertion that template-strings can't produce multiple outputs; e.g. for custom relation type built-ins not known at compile-time.
if count > 1 {
return Halt{Err: &Error{
Code: ConflictErr,
Location: bctx.Location,
Message: "template-strings must not produce multiple outputs",
}}
}
return nil
})
if err != nil {
return err
}
return iter(ast.StringTerm(strings.Join(buf, "")))
}
func init() {
RegisterBuiltinFunc(ast.InternalTemplateString.Name, builtinTemplateString)
}


@@ -21,7 +21,6 @@ import (
"fmt"
"hash"
"math/big"
"strconv"
"strings"
"github.com/lestrrat-go/jwx/v3/jwk"
@@ -1132,8 +1131,8 @@ func builtinJWTDecodeVerify(bctx BuiltinContext, operands []*ast.Term, iter func
switch v := nbf.Value.(type) {
case ast.Number:
// constraints.time is in nanoseconds but nbf Value is in seconds
compareTime := ast.Number(strconv.FormatFloat(constraints.time/1000000000, 'g', -1, 64))
if compareTime.Compare(v) == -1 {
compareTime := ast.FloatNumberTerm(constraints.time / 1000000000)
if ast.Compare(compareTime, v) == -1 {
return iter(unverified)
}
default:
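
The `nbf` check above compares the verifier clock (tracked in nanoseconds) against the claim value (seconds), so the clock is divided by 1e9 first; the retained line builds that comparison value with `ast.FloatNumberTerm` instead of formatting the float through `strconv.FormatFloat`. A minimal sketch of just the unit handling, with illustrative names and values:

```go
package main

import "fmt"

// notYetValid reports whether "now" (in nanoseconds) is still before the
// token's nbf claim (in seconds, per RFC 7519).
func notYetValid(nowNanos, nbfSeconds float64) bool {
	return nowNanos/1e9 < nbfSeconds
}

func main() {
	fmt.Println(notYetValid(1.7e18, 1.8e9)) // true: token not yet valid
}
```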


@@ -10,7 +10,7 @@ import (
"runtime/debug"
)
var Version = "1.12.3"
var Version = "1.11.1"
// GoVersion is the version of Go this was built with
var GoVersion = runtime.Version()

vendor/modules.txt

@@ -805,7 +805,7 @@ github.com/gorilla/schema
## explicit; go 1.14
github.com/grpc-ecosystem/go-grpc-middleware
github.com/grpc-ecosystem/go-grpc-middleware/recovery
# github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.4
# github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.3
## explicit; go 1.24.0
github.com/grpc-ecosystem/grpc-gateway/v2/internal/httprule
github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-openapiv2/options
@@ -894,7 +894,7 @@ github.com/kovidgoyal/go-parallel
# github.com/kovidgoyal/go-shm v1.0.0
## explicit; go 1.24.0
github.com/kovidgoyal/go-shm
# github.com/kovidgoyal/imaging v1.8.19
# github.com/kovidgoyal/imaging v1.8.18
## explicit; go 1.24.0
github.com/kovidgoyal/imaging
github.com/kovidgoyal/imaging/apng
@@ -1277,7 +1277,7 @@ github.com/onsi/gomega/matchers/support/goraph/edge
github.com/onsi/gomega/matchers/support/goraph/node
github.com/onsi/gomega/matchers/support/goraph/util
github.com/onsi/gomega/types
# github.com/open-policy-agent/opa v1.12.3
# github.com/open-policy-agent/opa v1.11.1
## explicit; go 1.24.6
github.com/open-policy-agent/opa/ast
github.com/open-policy-agent/opa/ast/json
@@ -2587,12 +2587,12 @@ golang.org/x/tools/internal/versions
# google.golang.org/genproto v0.0.0-20250303144028-a0af3efb3deb
## explicit; go 1.23.0
google.golang.org/genproto/protobuf/field_mask
# google.golang.org/genproto/googleapis/api v0.0.0-20251222181119-0a764e51fe1b
# google.golang.org/genproto/googleapis/api v0.0.0-20251202230838-ff82c1b0f217
## explicit; go 1.24.0
google.golang.org/genproto/googleapis/api
google.golang.org/genproto/googleapis/api/annotations
google.golang.org/genproto/googleapis/api/httpbody
# google.golang.org/genproto/googleapis/rpc v0.0.0-20251222181119-0a764e51fe1b
# google.golang.org/genproto/googleapis/rpc v0.0.0-20251202230838-ff82c1b0f217
## explicit; go 1.24.0
google.golang.org/genproto/googleapis/rpc/errdetails
google.golang.org/genproto/googleapis/rpc/status