Patch: Certificate validation setting + Misc fixes (#642)

- Add certificate validation setting
- Fix some OIDC providers not linking emails to local users
- Reintroduce sort by peers option for prowlarr results
- Fix "All languages" search query reverting to default language
- Fix download/request dismissal with multiple admin users
- Fix download/request behavior on details modal
This commit is contained in:
Author: Alex
Date: 2026-02-22 23:07:55 +00:00
Committed by: GitHub
Parent: 014fc38b48
Commit: 0d271f1f69
52 changed files with 1292 additions and 892 deletions

Binary file changed (not shown): 2.1 MiB → 2.0 MiB

Binary file changed (not shown): 151 KiB → 148 KiB

Binary file changed (not shown): 854 KiB → 848 KiB

Binary file changed (not shown): 2.3 MiB → 2.1 MiB


@@ -1 +0,0 @@
../baseline-browser-mapping/dist/cli.js

node_modules/.package-lock.json (generated, vendored)

@@ -1,17 +0,0 @@
{
"name": "shelfmark",
"lockfileVersion": 3,
"requires": true,
"packages": {
"node_modules/baseline-browser-mapping": {
"version": "2.9.19",
"resolved": "https://registry.npmjs.org/baseline-browser-mapping/-/baseline-browser-mapping-2.9.19.tgz",
"integrity": "sha512-ipDqC8FrAl/76p2SSWKSI+H9tFwm7vYqXQrItCuiVPt26Km0jS+NzSsBWAaBusvSbQcfJG+JitdMm+wZAgTYqg==",
"dev": true,
"license": "Apache-2.0",
"bin": {
"baseline-browser-mapping": "dist/cli.js"
}
}
}
}


@@ -1,201 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@@ -1,463 +0,0 @@
# [`baseline-browser-mapping`](https://github.com/web-platform-dx/web-features/packages/baseline-browser-mapping)
By the [W3C WebDX Community Group](https://www.w3.org/community/webdx/) and contributors.
`baseline-browser-mapping` provides:
- An `Array` of browsers compatible with Baseline Widely available and Baseline year feature sets via the [`getCompatibleVersions()` function](#get-baseline-widely-available-browser-versions-or-baseline-year-browser-versions).
- An `Array`, `Object` or `CSV` as a string describing the Baseline feature set support of all browser versions included in the module's data set via the [`getAllVersions()` function](#get-data-for-all-browser-versions).
You can use `baseline-browser-mapping` to help you determine minimum browser version support for your chosen Baseline feature set; or to analyse the level of support for different Baseline feature sets in your site's traffic by joining the data with your analytics data.
## Install for local development
To install the package, run:
`npm install --save-dev baseline-browser-mapping`
`baseline-browser-mapping` depends on `web-features` and `@mdn/browser-compat-data` for core browser version selection, but the data is pre-packaged and minified. This package checks for updates to those modules and the supported [downstream browsers](#downstream-browsers) on a daily basis and is updated frequently. Consider adding a script to your `package.json` to update `baseline-browser-mapping` and using it as part of your build process to ensure your data is as up to date as possible:
```javascript
"scripts": {
  "refresh-baseline-browser-mapping": "npm i --save-dev baseline-browser-mapping@latest"
}
```
The minimum supported NodeJS version for `baseline-browser-mapping` is v8 in alignment with `browserslist`. For NodeJS versions earlier than v13.2, the [`require('baseline-browser-mapping')`](https://nodejs.org/api/modules.html#requireid) syntax should be used to import the module.
## Keeping `baseline-browser-mapping` up to date
If you are only using this module to generate minimum browser versions for Baseline Widely available or Baseline year feature sets, you don't need to update this module frequently, as the backward looking data is reasonably stable.
However, if you are targeting Newly available, using the [`getAllVersions()`](#get-data-for-all-browser-versions) function or heavily relying on the data for downstream browsers, you should update this module more frequently. If you target a feature cut off date within the last two months and your installed version of `baseline-browser-mapping` has data that is more than 2 months old, you will receive a console warning advising you to update to the latest version when you call `getCompatibleVersions()` or `getAllVersions()`.
If you want to suppress these warnings you can use the `suppressWarnings: true` option in the configuration object passed to `getCompatibleVersions()` or `getAllVersions()`. Alternatively, you can use the `BASELINE_BROWSER_MAPPING_IGNORE_OLD_DATA=true` environment variable when running your build process. This module also respects the `BROWSERSLIST_IGNORE_OLD_DATA=true` environment variable. Environment variables can also be provided in a `.env` file from Node 20 onwards; however, this module does not load .env files automatically to avoid conflicts with other libraries with different requirements. You will need to use `process.loadEnvFile()` or a library like `dotenv` to load .env files before `baseline-browser-mapping` is called.
If you want to ensure [reproducible builds](https://www.wikiwand.com/en/articles/Reproducible_builds), we strongly recommend using the `widelyAvailableOnDate` option to fix the Widely available date on a per build basis to ensure dependent tools provide the same output and you do not produce data staleness warnings. If you are using [`browserslist`](https://github.com/browserslist/browserslist) to target Baseline Widely available, consider automatically updating your `browserslist` configuration in `package.json` or `.browserslistrc` to `baseline widely available on {YYYY-MM-DD}` as part of your build process to ensure the same or sufficiently similar list of minimum browsers is reproduced for historical builds.
## Importing `baseline-browser-mapping`
This module exposes two functions: `getCompatibleVersions()` and `getAllVersions()`, both of which can be imported directly from `baseline-browser-mapping`:
```javascript
import {
getCompatibleVersions,
getAllVersions,
} from "baseline-browser-mapping";
```
If you want to load the script and data directly in a web page without hosting it yourself, consider using a CDN:
```html
<script type="module">
import {
getCompatibleVersions,
getAllVersions,
} from "https://cdn.jsdelivr.net/npm/baseline-browser-mapping";
</script>
```
## Get Baseline Widely available browser versions or Baseline year browser versions
To get the current list of minimum browser versions compatible with Baseline Widely available features from the core browser set, call the `getCompatibleVersions()` function:
```javascript
getCompatibleVersions();
```
Executed on 7th March 2025, the above code returns the following browser versions:
```javascript
[
{ browser: "chrome", version: "105", release_date: "2022-09-02" },
{
browser: "chrome_android",
version: "105",
release_date: "2022-09-02",
},
{ browser: "edge", version: "105", release_date: "2022-09-02" },
{ browser: "firefox", version: "104", release_date: "2022-08-23" },
{
browser: "firefox_android",
version: "104",
release_date: "2022-08-23",
},
{ browser: "safari", version: "15.6", release_date: "2022-09-02" },
{
browser: "safari_ios",
version: "15.6",
release_date: "2022-09-02",
},
];
```
> [!NOTE]
> The minimum versions of each browser are not necessarily the final release before the Widely available cutoff date of `TODAY - 30 MONTHS`. Some earlier versions will have supported the full Widely available feature set.
### `getCompatibleVersions()` configuration options
`getCompatibleVersions()` accepts an `Object` as an argument with configuration options. The defaults are as follows:
```javascript
{
targetYear: undefined,
widelyAvailableOnDate: undefined,
includeDownstreamBrowsers: false,
listAllCompatibleVersions: false,
suppressWarnings: false
}
```
#### `targetYear`
The `targetYear` option returns the minimum browser versions compatible with all **Baseline Newly available** features at the end of the specified calendar year. For example, calling:
```javascript
getCompatibleVersions({
targetYear: 2020,
});
```
Returns the following versions:
```javascript
[
{ browser: "chrome", version: "87", release_date: "2020-11-19" },
{
browser: "chrome_android",
version: "87",
release_date: "2020-11-19",
},
{ browser: "edge", version: "87", release_date: "2020-11-19" },
{ browser: "firefox", version: "83", release_date: "2020-11-17" },
{
browser: "firefox_android",
version: "83",
release_date: "2020-11-17",
},
{ browser: "safari", version: "14", release_date: "2020-09-16" },
{ browser: "safari_ios", version: "14", release_date: "2020-09-16" },
];
```
> [!NOTE]
> The minimum version of each browser is not necessarily the final version released in that calendar year. In the above example, Firefox 84 was the final version released in 2020; however Firefox 83 supported all of the features that were interoperable at the end of 2020.
> [!WARNING]
> You cannot use `targetYear` and `widelyAvailableOnDate` together. Please only use one of these options at a time.
#### `widelyAvailableOnDate`
The `widelyAvailableOnDate` option returns the minimum versions compatible with Baseline Widely available on a specified date in the format `YYYY-MM-DD`:
```javascript
getCompatibleVersions({
widelyAvailableOnDate: `2023-04-05`,
});
```
> [!TIP]
> This option is useful if you provide a versioned library that targets Baseline Widely available on each version's release date and you need to provide a statement on minimum supported browser versions in your documentation.
#### `includeDownstreamBrowsers`
Setting `includeDownstreamBrowsers` to `true` will include browsers outside of the Baseline core browser set where it is possible to map those browsers to an upstream Chromium or Gecko version:
```javascript
getCompatibleVersions({
includeDownstreamBrowsers: true,
});
```
For more information on downstream browsers, see [the section on downstream browsers](#downstream-browsers) below.
#### `includeKaiOS`
KaiOS is an operating system and app framework based on the Gecko engine from Firefox, so feature support can be derived from the upstream Gecko version that each KaiOS version implements. However, KaiOS requires considerations beyond feature compatibility to ensure a good user experience, because it runs on device types that lack the mouse-and-keyboard or touch-screen input that all the other browsers supported by this module assume.
```javascript
getCompatibleVersions({
includeDownstreamBrowsers: true,
includeKaiOS: true,
});
```
> [!NOTE]
> Including KaiOS requires you to include all downstream browsers using the `includeDownstreamBrowsers` option.
#### `listAllCompatibleVersions`
Setting `listAllCompatibleVersions` to `true` will include the minimum version of each compatible browser and all subsequent versions:
```javascript
getCompatibleVersions({
listAllCompatibleVersions: true,
});
```
#### `suppressWarnings`
Setting `suppressWarnings` to `true` will suppress the console warning about old data:
```javascript
getCompatibleVersions({
suppressWarnings: true,
});
```
## Get data for all browser versions
You may want to obtain data on all the browser versions available in this module for use in an analytics solution or dashboard. To get details of each browser version's level of Baseline support, call the `getAllVersions()` function:
```javascript
import { getAllVersions } from "baseline-browser-mapping";
getAllVersions();
```
By default, this function returns an `Array` of `Objects` and excludes downstream browsers:
```javascript
[
...
{
browser: "firefox_android", // Browser name
version: "125", // Browser version
release_date: "2024-04-16", // Release date
year: 2023, // Baseline year feature set the version supports
wa_compatible: true // Whether the browser version supports Widely available
},
...
]
```
For browser versions in `@mdn/browser-compat-data` that were released before Baseline can be defined, i.e. Baseline 2015, the `year` property is always the string: `"pre_baseline"`.
### Understanding which browsers support Newly available features
You may want to understand which recent browser versions support all Newly available features. You can replace the `wa_compatible` property with a `supports` property using the `useSupports` option:
```javascript
getAllVersions({
useSupports: true,
});
```
The `supports` property is optional and has two possible values:
- `widely` for browser versions that support all Widely available features.
- `newly` for browser versions that support all Newly available features.
Browser versions that do not support Widely or Newly available features will not include the `supports` property in the `array` or `object` outputs; in the CSV output, the `supports` column will contain an empty string. Browser versions that support all Newly available features also support all Widely available features.
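Putting those rules together, a minimal sketch of selecting versions that cover at least the Widely available feature set might look like this. The sample entries below are illustrative stand-ins shaped like the `getAllVersions({ useSupports: true })` output, not real module data:

```javascript
// Illustrative sample shaped like getAllVersions({ useSupports: true }) output.
const versions = [
  { browser: "firefox", version: "135", supports: "widely" },
  { browser: "firefox", version: "136", supports: "newly" },
  { browser: "chrome", version: "53" }, // no `supports`: neither feature set fully supported
];

// "newly" implies "widely", so both values qualify.
const widelyCapable = versions.filter(
  (v) => v.supports === "widely" || v.supports === "newly",
);

console.log(widelyCapable.length); // 2
```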
### `getAllVersions()` Configuration options
`getAllVersions()` accepts an `Object` as an argument with configuration options. The defaults are as follows:
```javascript
{
includeDownstreamBrowsers: false,
outputFormat: "array",
suppressWarnings: false
}
```
#### `includeDownstreamBrowsers` (in `getAllVersions()` output)
As with `getCompatibleVersions()`, you can set `includeDownstreamBrowsers` to `true` to include the Chromium and Gecko downstream browsers [listed below](#list-of-downstream-browsers).
```javascript
getAllVersions({
includeDownstreamBrowsers: true,
});
```
Downstream browsers include the same properties as core browsers, plus the `engine` they use and the `engine_version`, for example:
```javascript
[
...
{
browser: "samsunginternet_android",
version: "27.0",
release_date: "2024-11-06",
engine: "Blink",
engine_version: "125",
year: 2023,
supports: "widely"
},
...
]
```
#### `includeKaiOS` (in `getAllVersions()` output)
As with `getCompatibleVersions()`, you can include KaiOS in your output. The same requirement to set `includeDownstreamBrowsers: true` applies.
```javascript
getAllVersions({
includeDownstreamBrowsers: true,
includeKaiOS: true,
});
```
#### `suppressWarnings` (in `getAllVersions()` output)
As with `getCompatibleVersions()`, you can set `suppressWarnings` to `true` to suppress the console warning about old data:
```javascript
getAllVersions({
suppressWarnings: true,
});
```
#### `outputFormat`
By default, this function returns an `Array` of `Objects`, which can be manipulated in JavaScript or output to JSON.
To return an `Object` with nested keys instead, set `outputFormat` to `object`:
```javascript
getAllVersions({
outputFormat: "object",
});
```
In this case, `getAllVersions()` returns a nested object with the browser [IDs listed below](#list-of-downstream-browsers) as keys, and versions as keys within them:
```javascript
{
"chrome": {
"53": {
"year": 2016,
"release_date": "2016-09-07"
},
...
}
```
Downstream browsers will include extra fields for `engine` and `engine_version`:
```javascript
{
...
"webview_android": {
"53": {
"year": 2016,
"release_date": "2016-09-07",
"engine": "Blink",
"engine_version": "53"
},
...
}
}
```
To return a `String` in CSV format, set `outputFormat` to `csv`:
```javascript
getAllVersions({
outputFormat: "csv",
});
```
`getAllVersions` returns a `String` with a header row and comma-separated values for each browser version that you can write to a file or pass to another service. Core browsers will have "NULL" as the value for their `engine` and `engine_version`:
```csv
"browser","version","year","supports","release_date","engine","engine_version"
...
"chrome","24","pre_baseline","","2013-01-10","NULL","NULL"
...
"chrome","53","2016","","2016-09-07","NULL","NULL"
...
"firefox","135","2024","widely","2025-02-04","NULL","NULL"
"firefox","136","2024","newly","2025-03-04","NULL","NULL"
...
"ya_android","20.12","2020","year_only","2020-12-20","Blink","87"
...
```
> [!NOTE]
> The above example uses `"includeDownstreamBrowsers": true`
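Since the CSV output is a plain string, consuming it usually means parsing it back into records. A minimal sketch, using a hard-coded excerpt of the rows above in place of a real `getAllVersions({ outputFormat: "csv" })` call (the sample fields contain no embedded commas, so a naive split suffices):

```javascript
// Hard-coded excerpt standing in for getAllVersions({ outputFormat: "csv" }).
const csv = `"browser","version","year","supports","release_date","engine","engine_version"
"firefox","135","2024","widely","2025-02-04","NULL","NULL"
"firefox","136","2024","newly","2025-03-04","NULL","NULL"`;

const [header, ...rows] = csv.trim().split("\n");
const keys = header.split(",").map((s) => s.replace(/"/g, ""));

// Turn each CSV row into an object keyed by the header fields.
const records = rows.map((row) => {
  const values = row.split(",").map((s) => s.replace(/"/g, ""));
  return Object.fromEntries(keys.map((k, i) => [k, values[i]]));
});

console.log(records[0].browser, records[1].supports); // firefox newly
```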
### Static resources
The outputs of `getAllVersions()` are available as JSON or CSV files generated on a daily basis and hosted on GitHub pages:
- Core browsers only
- [Array](https://web-platform-dx.github.io/baseline-browser-mapping/all_versions_array.json)
- [Object](https://web-platform-dx.github.io/baseline-browser-mapping/all_versions_object.json)
- [CSV](https://web-platform-dx.github.io/baseline-browser-mapping/all_versions.csv)
- Core browsers only, with `supports` property
- [Array](https://web-platform-dx.github.io/baseline-browser-mapping/all_versions_array_with_supports.json)
- [Object](https://web-platform-dx.github.io/baseline-browser-mapping/all_versions_object_with_supports.json)
- [CSV](https://web-platform-dx.github.io/baseline-browser-mapping/all_versions_with_supports.csv)
- Including downstream browsers
- [Array](https://web-platform-dx.github.io/baseline-browser-mapping/with_downstream/all_versions_array.json)
- [Object](https://web-platform-dx.github.io/baseline-browser-mapping/with_downstream/all_versions_object.json)
- [CSV](https://web-platform-dx.github.io/baseline-browser-mapping/with_downstream/all_versions.csv)
- Including downstream browsers with `supports` property
- [Array](https://web-platform-dx.github.io/baseline-browser-mapping/with_downstream/all_versions_array_with_supports.json)
- [Object](https://web-platform-dx.github.io/baseline-browser-mapping/with_downstream/all_versions_object_with_supports.json)
- [CSV](https://web-platform-dx.github.io/baseline-browser-mapping/with_downstream/all_versions_with_supports.csv)
## CLI
`baseline-browser-mapping` includes a command line interface that exposes the same data and options as the `getCompatibleVersions()` function. To learn more about using the CLI, run:
```sh
npx baseline-browser-mapping --help
```
## Downstream browsers
### Limitations
The browser versions in this module come from two different sources:
- MDN's `browser-compat-data` module.
- Parsed user agent strings provided by [useragents.io](https://useragents.io/)
MDN `browser-compat-data` is an authoritative source of information for the browsers it contains. The release dates for the Baseline core browser set and the mapping of downstream browsers to Chromium versions should be considered accurate.
Browser mappings from useragents.io are provided on a best-effort basis. They assume that browser vendors accurately state the Chromium version they have implemented. The initial set of version mappings was derived from a bulk export in November 2024, which was iterated over with a regex match looking for a major Chrome version and a corresponding version of the browser in question, e.g.:
`Mozilla/5.0 (Linux; U; Android 10; en-US; STK-L21 Build/HUAWEISTK-L21) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/100.0.4896.58 UCBrowser/13.8.2.1324 Mobile Safari/537.36`
Shows UC Browser Mobile 13.8 implementing Chromium 100, and:
`Mozilla/5.0 (Linux; arm_64; Android 11; Redmi Note 8 Pro) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.6613.123 YaBrowser/24.10.2.123.00 SA/3 Mobile Safari/537.36`
Shows Yandex Browser Mobile 24.10 implementing Chromium 128. The Chromium version from this string is mapped to the corresponding Chrome version from MDN `browser-compat-data`.
> [!NOTE]
> Where possible, approximate release dates have been included based on useragents.io "first seen" data. useragents.io does not have "first seen" dates prior to June 2020. However, these browsers' Baseline compatibility is determined by their Chromium or Gecko version, so their release dates are more informative than critical.
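The matching described above can be sketched as follows. This is a hypothetical illustration of the regex approach, not the project's actual refresh script:

```javascript
// Hypothetical sketch: extract the Chromium major version and the downstream
// browser version from a user agent string (Yandex Browser Mobile example).
const ua =
  "Mozilla/5.0 (Linux; arm_64; Android 11; Redmi Note 8 Pro) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/128.0.6613.123 YaBrowser/24.10.2.123.00 SA/3 Mobile Safari/537.36";

// Chromium major version, e.g. "Chrome/128." -> "128".
const chromiumMajor = ua.match(/Chrome\/(\d+)\./)?.[1];
// Downstream browser major.minor, e.g. "YaBrowser/24.10." -> "24.10".
const yandexVersion = ua.match(/YaBrowser\/(\d+\.\d+)\./)?.[1];

console.log(chromiumMajor, yandexVersion); // 128 24.10
```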
This data is updated on a daily basis using a [script](https://github.com/web-platform-dx/web-features/tree/main/scripts/refresh-downstream.ts) triggered by a GitHub [action](https://github.com/web-platform-dx/web-features/tree/main/.github/workflows/refresh_downstream.yml). Useragents.io provides a private API for this module which exposes the last 7 days of newly seen user agents for the currently tracked browsers. If a new major version of one of the tracked browsers is encountered with a Chromium version that meets or exceeds the previous latest version of that browser, it is added to the [src/data/downstream-browsers.json](src/data/downstream-browsers.json) file with the date it was first seen by useragents.io as its release date.
KaiOS is an exception - its upstream version mappings are handled separately from the other browsers because they happen very infrequently.
### List of downstream browsers
| Browser | ID | Core | Source |
| --------------------- | ------------------------- | ------- | ------------------------- |
| Chrome | `chrome` | `true` | MDN `browser-compat-data` |
| Chrome for Android | `chrome_android` | `true` | MDN `browser-compat-data` |
| Edge | `edge` | `true` | MDN `browser-compat-data` |
| Firefox | `firefox` | `true` | MDN `browser-compat-data` |
| Firefox for Android | `firefox_android` | `true` | MDN `browser-compat-data` |
| Safari | `safari` | `true` | MDN `browser-compat-data` |
| Safari on iOS | `safari_ios` | `true` | MDN `browser-compat-data` |
| Opera | `opera` | `false` | MDN `browser-compat-data` |
| Opera Android | `opera_android` | `false` | MDN `browser-compat-data` |
| Samsung Internet | `samsunginternet_android` | `false` | MDN `browser-compat-data` |
| WebView Android | `webview_android` | `false` | MDN `browser-compat-data` |
| QQ Browser Mobile | `qq_android` | `false` | useragents.io |
| UC Browser Mobile | `uc_android` | `false` | useragents.io |
| Yandex Browser Mobile | `ya_android` | `false` | useragents.io |
| KaiOS | `kai_os` | `false` | Manual |
| Facebook for Android | `facebook_android` | `false` | useragents.io |
| Instagram for Android | `instagram_android` | `false` | useragents.io |
> [!NOTE]
> All the non-core browsers currently included implement Chromium or Gecko. Their inclusion in any of the above methods is based on the Baseline feature set supported by the Chromium or Gecko version they implement, not their release date.
View File
@@ -1,64 +0,0 @@
{
"name": "baseline-browser-mapping",
"main": "./dist/index.cjs",
"version": "2.9.19",
"description": "A library for obtaining browser versions with their maximum supported Baseline feature set and Widely Available status.",
"exports": {
".": {
"require": "./dist/index.cjs",
"types": "./dist/index.d.ts",
"default": "./dist/index.js"
},
"./legacy": {
"require": "./dist/index.cjs",
"types": "./dist/index.d.ts"
}
},
"jsdelivr": "./dist/index.js",
"files": [
"dist/*",
"!dist/scripts/*",
"LICENSE.txt",
"README.md"
],
"types": "./dist/index.d.ts",
"type": "module",
"bin": {
"baseline-browser-mapping": "dist/cli.js"
},
"scripts": {
"fix-cli-permissions": "output=$(npx baseline-browser-mapping 2>&1); path=$(printf '%s\n' \"$output\" | sed -n 's/^.*: \\(.*\\): Permission denied$/\\1/p; t; s/^\\(.*\\): Permission denied$/\\1/p'); if [ -n \"$path\" ]; then echo \"Permission denied for: $path\"; echo \"Removing $path ...\"; rm -rf \"$path\"; else echo \"$output\"; fi",
"test:format": "npx prettier --check .",
"test:lint": "npx eslint .",
"test:jasmine": "npx jasmine",
"test:jasmine-browser": "npx jasmine-browser-runner runSpecs --config ./spec/support/jasmine-browser.js",
"test": "npm run build && npm run fix-cli-permissions && npm run test:format && npm run test:lint && npm run test:jasmine && npm run test:jasmine-browser",
"build": "rm -rf dist; npx prettier . --write; rollup -c; rm -rf ./dist/scripts/expose-data.d.ts ./dist/cli.d.ts",
"refresh-downstream": "npx tsx scripts/refresh-downstream.ts",
"refresh-static": "npx tsx scripts/refresh-static.ts",
"update-data-file": "npx tsx scripts/update-data-file.ts; npx prettier ./src/data/data.js --write",
"update-data-dependencies": "npm i @mdn/browser-compat-data@latest web-features@latest -D",
"check-data-changes": "git diff --name-only | grep -q '^src/data/data.js$' && echo 'changes-available=TRUE' || echo 'changes-available=FALSE'"
},
"license": "Apache-2.0",
"devDependencies": {
"@mdn/browser-compat-data": "^7.2.5",
"@rollup/plugin-terser": "^0.4.4",
"@rollup/plugin-typescript": "^12.1.3",
"@types/node": "^22.15.17",
"eslint-plugin-new-with-error": "^5.0.0",
"jasmine": "^5.8.0",
"jasmine-browser-runner": "^3.0.0",
"jasmine-spec-reporter": "^7.0.0",
"prettier": "^3.5.3",
"rollup": "^4.44.0",
"tslib": "^2.8.1",
"typescript": "^5.7.2",
"typescript-eslint": "^8.35.0",
"web-features": "^3.14.0"
},
"repository": {
"type": "git",
"url": "git+https://github.com/web-platform-dx/baseline-browser-mapping.git"
}
}
View File
@@ -11,6 +11,7 @@ from shelfmark.bypass import BypassCancelledException
from shelfmark.core.config import config
from shelfmark.core.logger import setup_logger
from shelfmark.core.utils import normalize_http_url
from shelfmark.download.network import get_ssl_verify
if TYPE_CHECKING:
from shelfmark.download import network
@@ -46,7 +47,8 @@ def _fetch_via_bypasser(target_url: str) -> Optional[str]:
f"{bypasser_url}{bypasser_path}",
headers={"Content-Type": "application/json"},
json={"cmd": "request.get", "url": target_url, "maxTimeout": bypasser_timeout},
-timeout=(CONNECT_TIMEOUT, read_timeout)
+timeout=(CONNECT_TIMEOUT, read_timeout),
+verify=get_ssl_verify(bypasser_url),
)
response.raise_for_status()
result = response.json()
View File
@@ -23,7 +23,7 @@ from shelfmark.config.settings import RECORDING_DIR
from shelfmark.core.config import config as app_config
from shelfmark.core.logger import setup_logger
from shelfmark.download import network
-from shelfmark.download.network import get_proxies
+from shelfmark.download.network import get_proxies, get_ssl_verify
logger = setup_logger(__name__)
@@ -931,7 +931,7 @@ def _try_with_cached_cookies(url: str, hostname: str) -> Optional[str]:
headers['User-Agent'] = stored_ua
logger.debug(f"Trying request with cached cookies: {url}")
-response = requests.get(url, cookies=cookies, headers=headers, proxies=get_proxies(url), timeout=(5, 10))
+response = requests.get(url, cookies=cookies, headers=headers, proxies=get_proxies(url), timeout=(5, 10), verify=get_ssl_verify(url))
if response.status_code == 200:
logger.debug("Cached cookies worked, skipped Chrome bypass")
return response.text
View File
@@ -5,6 +5,7 @@ from typing import Any, Callable
from shelfmark.core.utils import normalize_http_url
from shelfmark.core.user_db import UserDB
from shelfmark.download.network import get_ssl_verify
_OIDC_LOCKOUT_MESSAGE = "A local admin account with a password is required before enabling OIDC. Use the 'Go to Users' button above to create one. This ensures you can still sign in if your identity provider is unavailable."
@@ -59,7 +60,7 @@ def test_oidc_connection(
if not discovery_url:
return {"success": False, "message": "Discovery URL is not configured."}
-response = requests.get(discovery_url, timeout=10)
+response = requests.get(discovery_url, timeout=10, verify=get_ssl_verify(discovery_url))
response.raise_for_status()
document = response.json()
View File
@@ -473,6 +473,17 @@ def network_settings():
tor_overrides_network = tor_enabled # Only override when Tor is actually active
return [
SelectField(
key="CERTIFICATE_VALIDATION",
label="Certificate Validation",
description="Controls SSL/TLS certificate verification for outbound connections. Disable for self-signed certificates on internal services (e.g. OIDC providers, Prowlarr).",
options=[
{"value": "enabled", "label": "Enabled (Recommended)"},
{"value": "disabled_local", "label": "Disabled for Local Addresses"},
{"value": "disabled", "label": "Disabled"},
],
default="enabled",
),
SelectField(
key="CUSTOM_DNS",
label="DNS Provider",
View File
@@ -63,6 +63,30 @@ def _emit_activity_event(ws_manager: Any | None, *, room: str, payload: dict[str
logger.warning("Failed to emit activity_update event: %s", exc)
def _list_admin_user_ids(user_db: UserDB) -> list[int]:
admin_ids: set[int] = set()
try:
users = user_db.list_users()
except Exception as exc:
logger.warning("Failed to list users while resolving admin dismissal scope: %s", exc)
return []
for user in users:
if not isinstance(user, dict):
continue
role = str(user.get("role") or "").strip().lower()
if role != "admin":
continue
try:
user_id = int(user.get("id"))
except (TypeError, ValueError):
continue
if user_id > 0:
admin_ids.add(user_id)
return sorted(admin_ids)
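Extracted as a pure function, the filtering rules above behave like this (stub rows for illustration, not real UserDB output):

```python
def list_admin_user_ids(users: list) -> list[int]:
    """Collect distinct positive admin ids, skipping malformed rows."""
    admin_ids: set[int] = set()
    for user in users:
        if not isinstance(user, dict):
            continue  # non-dict rows are ignored
        if str(user.get("role") or "").strip().lower() != "admin":
            continue  # role comparison is case- and whitespace-insensitive
        try:
            user_id = int(user.get("id"))
        except (TypeError, ValueError):
            continue  # missing or unparseable id
        if user_id > 0:
            admin_ids.add(user_id)
    return sorted(admin_ids)

users = [
    {"id": 3, "role": "Admin "},
    {"id": "7", "role": "admin"},
    {"id": None, "role": "admin"},  # unparseable id: skipped
    {"id": 2, "role": "user"},
    "not-a-dict",
]
print(list_admin_user_ids(users))  # [3, 7]
```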
def _list_visible_requests(user_db: UserDB, *, is_admin: bool, db_user_id: int | None) -> list[dict[str, Any]]:
if is_admin:
request_rows = user_db.list_requests()
@@ -302,9 +326,16 @@ def register_activity_routes(
emit_request_updates(updated_requests)
request_rows = _list_visible_requests(user_db, is_admin=is_admin, db_user_id=db_user_id)
-if not is_admin and db_user_id is not None:
+if viewer_db_user_id is not None:
+owner_user_scope = None if is_admin else db_user_id
+if not is_admin and owner_user_scope is None:
+owner_user_scope = viewer_db_user_id
try:
-terminal_rows = activity_service.get_undismissed_terminal_downloads(db_user_id, limit=200)
+terminal_rows = activity_service.get_undismissed_terminal_downloads(
+viewer_db_user_id,
+owner_user_id=owner_user_scope,
+limit=200,
+)
_merge_terminal_snapshot_backfill(status=status, terminal_rows=terminal_rows)
except Exception as exc:
logger.warning("Failed to merge terminal snapshot backfill rows: %s", exc)
@@ -361,26 +392,42 @@ def register_activity_routes(
logger.warning("Failed to resolve activity snapshot id for dismiss payload: %s", exc)
activity_log_id = None
item_type = str(data.get("item_type") or "").strip().lower()
target_user_ids = [db_user_id]
if bool(session.get("is_admin")) and item_type == "request":
admin_ids = _list_admin_user_ids(user_db)
if db_user_id not in admin_ids:
admin_ids.append(db_user_id)
target_user_ids = sorted(set(admin_ids))
dismissal = None
try:
-dismissal = activity_service.dismiss_item(
-user_id=db_user_id,
-item_type=data.get("item_type"),
-item_key=data.get("item_key"),
-activity_log_id=activity_log_id,
-)
+for target_user_id in target_user_ids:
+target_dismissal = activity_service.dismiss_item(
+user_id=target_user_id,
+item_type=data.get("item_type"),
+item_key=data.get("item_key"),
+activity_log_id=activity_log_id,
+)
+if target_user_id == db_user_id:
+dismissal = target_dismissal
except ValueError as exc:
return jsonify({"error": str(exc)}), 400
-_emit_activity_event(
-ws_manager,
-room=f"user_{db_user_id}",
-payload={
-"kind": "dismiss",
-"user_id": db_user_id,
-"item_type": dismissal["item_type"],
-"item_key": dismissal["item_key"],
-},
-)
+if dismissal is None:
+return jsonify({"error": "Failed to persist dismissal"}), 500
+for target_user_id in target_user_ids:
+_emit_activity_event(
+ws_manager,
+room=f"user_{target_user_id}",
+payload={
+"kind": "dismiss",
+"user_id": target_user_id,
+"item_type": dismissal["item_type"],
+"item_key": dismissal["item_key"],
+},
+)
return jsonify({"status": "dismissed", "item": dismissal})
@@ -427,20 +474,40 @@ def register_activity_routes(
normalized_payload["activity_log_id"] = activity_log_id
normalized_items.append(normalized_payload)
request_items = [
item
for item in normalized_items
if str(item.get("item_type") or "").strip().lower() == "request"
]
actor_is_admin = bool(session.get("is_admin"))
target_user_ids = [db_user_id]
if actor_is_admin and request_items:
admin_ids = _list_admin_user_ids(user_db)
if db_user_id not in admin_ids:
admin_ids.append(db_user_id)
target_user_ids = sorted(set(admin_ids))
try:
dismissed_count = activity_service.dismiss_many(user_id=db_user_id, items=normalized_items)
if actor_is_admin and request_items:
for target_user_id in target_user_ids:
if target_user_id == db_user_id:
continue
activity_service.dismiss_many(user_id=target_user_id, items=request_items)
except ValueError as exc:
return jsonify({"error": str(exc)}), 400
-_emit_activity_event(
-ws_manager,
-room=f"user_{db_user_id}",
-payload={
-"kind": "dismiss_many",
-"user_id": db_user_id,
-"count": dismissed_count,
-},
-)
+for target_user_id in target_user_ids:
+target_count = dismissed_count if target_user_id == db_user_id else len(request_items)
+_emit_activity_event(
+ws_manager,
+room=f"user_{target_user_id}",
+payload={
+"kind": "dismiss_many",
+"user_id": target_user_id,
+"count": target_count,
+},
+)
return jsonify({"status": "dismissed", "count": dismissed_count})
View File
@@ -543,9 +543,25 @@ class ActivityService:
finally:
conn.close()
-def get_undismissed_terminal_downloads(self, user_id: int, *, limit: int = 200) -> list[dict[str, Any]]:
-"""Return latest undismissed terminal download snapshots for one user."""
-normalized_user_id = self._coerce_positive_int(user_id, "user_id")
+def get_undismissed_terminal_downloads(
+self,
+viewer_user_id: int,
+*,
+owner_user_id: int | None,
+limit: int = 200,
+) -> list[dict[str, Any]]:
+"""Return latest undismissed terminal download snapshots for a viewer.
+`viewer_user_id` controls which dismissals are applied.
+`owner_user_id` scopes activity rows to one owner when provided; when
+omitted, rows across all owners are considered.
+"""
+normalized_viewer_user_id = self._coerce_positive_int(viewer_user_id, "viewer_user_id")
+normalized_owner_user_id = (
+self._coerce_positive_int(owner_user_id, "owner_user_id")
+if owner_user_id is not None
+else None
+)
normalized_limit = max(1, min(int(limit), 500))
conn = self._connect()
@@ -568,14 +584,19 @@ class ActivityService:
ON d.user_id = ?
AND d.item_type = l.item_type
AND d.item_key = l.item_key
-WHERE l.user_id = ?
+WHERE (? IS NULL OR l.user_id = ?)
AND l.item_type = 'download'
AND l.final_status IN ('complete', 'error', 'cancelled')
AND d.id IS NULL
ORDER BY l.terminal_at DESC, l.id DESC
LIMIT ?
""",
-(normalized_user_id, normalized_user_id, normalized_limit * 2),
+(
+normalized_viewer_user_id,
+normalized_owner_user_id,
+normalized_owner_user_id,
+normalized_limit * 2,
+),
).fetchall()
payload: list[dict[str, Any]] = []
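The `(? IS NULL OR l.user_id = ?)` clause lets one query serve both the scoped and unscoped cases by binding the same parameter twice; a minimal sqlite3 sketch with a simplified schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (id INTEGER, user_id INTEGER)")
conn.executemany("INSERT INTO log VALUES (?, ?)",
                 [(1, 10), (2, 20), (3, 10)])

def rows_for_owner(owner_id):
    # Binding NULL disables the owner filter entirely; a concrete id scopes it.
    return conn.execute(
        "SELECT id FROM log WHERE (? IS NULL OR user_id = ?) ORDER BY id",
        (owner_id, owner_id),
    ).fetchall()

print(rows_for_owner(10))    # [(1,), (3,)]
print(rows_for_owner(None))  # [(1,), (2,), (3,)] -- all owners
```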
View File
@@ -11,6 +11,7 @@ from typing import Any, Dict, Optional, Tuple
import requests
from shelfmark.core.logger import setup_logger
from shelfmark.download.network import get_ssl_verify
logger = setup_logger(__name__)
@@ -482,6 +483,7 @@ class ImageCacheService:
timeout=(5, 10),
headers=FETCH_HEADERS,
stream=True,
verify=get_ssl_verify(url),
)
response.raise_for_status()
View File
@@ -19,6 +19,7 @@ from shelfmark.core.oidc_auth import (
)
from shelfmark.core.settings_registry import load_config_file
from shelfmark.core.user_db import UserDB
from shelfmark.download.network import get_ssl_verify
logger = setup_logger(__name__)
oauth = OAuth()
@@ -38,16 +39,6 @@ def _normalize_claims(raw_claims: Any) -> dict[str, Any]:
return {}
-def _is_email_verified(claims: dict[str, Any]) -> bool:
-"""Normalize provider-specific email_verified values into a strict boolean."""
-value = claims.get("email_verified", False)
-if isinstance(value, bool):
-return value
-if isinstance(value, str):
-return value.strip().lower() == "true"
-return False
def _has_username_or_email(claims: dict[str, Any]) -> bool:
"""Return True when claims include a usable username or email."""
for key in ("preferred_username", "email"):
@@ -90,6 +81,11 @@ def _get_oidc_client() -> tuple[Any, dict[str, Any]]:
if admin_group and use_admin_group and group_claim and group_claim not in scopes:
scopes.append(group_claim)
def _ssl_compliance_fix(session, **kwargs):
"""Set session.verify based on the Certificate Validation setting."""
session.verify = get_ssl_verify(discovery_url)
return session
oauth._clients.pop("shelfmark_idp", None)
oauth.register(
name="shelfmark_idp",
@@ -100,6 +96,7 @@ def _get_oidc_client() -> tuple[Any, dict[str, Any]]:
"scope": " ".join(scopes),
"code_challenge_method": "S256",
},
compliance_fix=_ssl_compliance_fix,
overwrite=True,
)
@@ -196,7 +193,7 @@ def register_oidc_routes(app: Flask, user_db: UserDB) -> None:
if admin_group and use_admin_group:
is_admin = admin_group in groups
-allow_email_link = bool(user_info.get("email")) and _is_email_verified(claims)
+allow_email_link = bool(user_info.get("email"))
user = provision_oidc_user(
user_db,
user_info,
View File
@@ -1092,6 +1092,18 @@ def update_settings(tab_name: str, values: Dict[str, Any]) -> Dict[str, Any]:
):
_apply_dns_settings(config_obj)
# Apply certificate validation changes live (network tab)
if (
config_obj is not None
and tab_name == "network"
and "CERTIFICATE_VALIDATION" in values_to_save
):
try:
from shelfmark.download.network import _apply_ssl_warning_suppression
_apply_ssl_warning_suppression()
except Exception as e:
logger.warning(f"Failed to apply certificate validation setting: {e}")
# Apply AA mirror settings changes live (mirrors tab)
aa_keys = {"AA_BASE_URL", "AA_MIRROR_URLS", "AA_ADDITIONAL_URLS"}
if (
View File
@@ -18,6 +18,7 @@ from urllib.parse import urlparse
import requests
from shelfmark.core.config import config
from shelfmark.download.network import get_ssl_verify
from shelfmark.core.logger import setup_logger
from shelfmark.core.utils import normalize_http_url
from shelfmark.download.clients import (
@@ -110,7 +111,7 @@ class DelugeClient(DownloadClient):
"params": list(params),
}
-response = self._session.post(self._rpc_url, json=payload, timeout=timeout)
+response = self._session.post(self._rpc_url, json=payload, timeout=timeout, verify=get_ssl_verify(self._rpc_url))
response.raise_for_status()
data = response.json()
View File
@@ -12,6 +12,7 @@ import requests
from shelfmark.core.config import config
from shelfmark.core.logger import setup_logger
from shelfmark.core.utils import normalize_http_url
from shelfmark.download.network import get_ssl_verify
from shelfmark.download.clients import (
DownloadClient,
DownloadStatus,
@@ -79,6 +80,7 @@ class NZBGetClient(DownloadClient):
headers={"Content-Type": "application/json"},
auth=(self.username, self.password),
timeout=30,
verify=get_ssl_verify(rpc_url),
)
response.raise_for_status()
@@ -135,7 +137,7 @@ class NZBGetClient(DownloadClient):
try:
# Fetch NZB content from the URL (handles Prowlarr proxy redirects)
logger.debug(f"Fetching NZB from: {url}")
-response = requests.get(url, timeout=30)
+response = requests.get(url, timeout=30, verify=get_ssl_verify(url))
response.raise_for_status()
nzb_content = base64.b64encode(response.content).decode('ascii')
View File
@@ -7,6 +7,7 @@ from typing import Optional, Tuple
from shelfmark.core.config import config
from shelfmark.core.logger import setup_logger
from shelfmark.core.utils import normalize_http_url
from shelfmark.download.network import get_ssl_verify
from shelfmark.download.clients import (
DownloadClient,
DownloadStatus,
@@ -136,6 +137,7 @@ class QBittorrentClient(DownloadClient):
host=self._base_url,
username=config.get("QBITTORRENT_USERNAME", ""),
password=config.get("QBITTORRENT_PASSWORD", ""),
VERIFY_WEBUI_CERTIFICATE=get_ssl_verify(self._base_url),
)
self._category = config.get("QBITTORRENT_CATEGORY", "books")
self._download_dir = config.get("QBITTORRENT_DOWNLOAD_DIR", "")
View File
@@ -4,12 +4,14 @@ rTorrent download client for Prowlarr integration.
Uses xmlrpc to communicate with rTorrent's RPC interface.
"""
-from typing import Optional, Tuple
+import ssl
+from typing import Any, Optional, Tuple
from urllib.parse import urlparse
from shelfmark.core.config import config
from shelfmark.core.logger import setup_logger
from shelfmark.core.utils import normalize_http_url
from shelfmark.download.network import get_ssl_verify
from shelfmark.download.clients import (
DownloadClient,
DownloadStatus,
@@ -22,6 +24,21 @@ from shelfmark.download.clients.torrent_utils import (
logger = setup_logger(__name__)
def _create_rtorrent_server_proxy(url: str) -> Any:
"""Create an XML-RPC ServerProxy honoring certificate validation mode."""
from xmlrpc.client import SafeTransport, ServerProxy
verify = get_ssl_verify(url)
if url.startswith("https://") and not verify:
ssl_context = ssl.create_default_context()
ssl_context.check_hostname = False
ssl_context.verify_mode = ssl.CERT_NONE
transport = SafeTransport(context=ssl_context)
return ServerProxy(url, transport=transport)
return ServerProxy(url)
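The permissive context built above can be exercised in isolation (no rTorrent server required); note that `check_hostname` must be disabled before `verify_mode` is set to `CERT_NONE`, or `ssl` raises a `ValueError`:

```python
import ssl
from xmlrpc.client import SafeTransport

# Assumed policy: build the permissive context only when certificate
# validation has been explicitly disabled for this URL.
ctx = ssl.create_default_context()
ctx.check_hostname = False          # must come first
ctx.verify_mode = ssl.CERT_NONE     # then drop certificate checks

transport = SafeTransport(context=ctx)
print(ctx.verify_mode == ssl.CERT_NONE)  # True
```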
@register_client("torrent")
class RTorrentClient(DownloadClient):
"""rTorrent download client using xmlrpc."""
@@ -31,8 +48,6 @@ class RTorrentClient(DownloadClient):
def __init__(self):
"""Initialize rTorrent client with settings from config."""
-from xmlrpc.client import ServerProxy
raw_url = config.get("RTORRENT_URL", "")
if not raw_url:
raise ValueError("RTORRENT_URL is required")
@@ -50,7 +65,7 @@ class RTorrentClient(DownloadClient):
f"{parsed.scheme}://{username}:{password}@{parsed.netloc}{parsed.path}"
)
-self._rpc = ServerProxy(self._base_url)
+self._rpc = _create_rtorrent_server_proxy(self._base_url)
self._download_dir = config.get("RTORRENT_DOWNLOAD_DIR", "")
self._label = config.get("RTORRENT_LABEL", "")
View File
@@ -12,6 +12,7 @@ import requests
from shelfmark.core.config import config
from shelfmark.core.logger import setup_logger
from shelfmark.core.utils import normalize_http_url
from shelfmark.download.network import get_ssl_verify
from shelfmark.download.clients import (
DownloadClient,
DownloadStatus,
@@ -148,7 +149,7 @@ class SABnzbdClient(DownloadClient):
if params:
request_params.update(params)
-response = requests.get(api_url, params=request_params, timeout=30)
+response = requests.get(api_url, params=request_params, timeout=30, verify=get_ssl_verify(api_url))
response.raise_for_status()
result = response.json()
@@ -177,7 +178,7 @@ class SABnzbdClient(DownloadClient):
}
files = {"name": (filename, nzb_content, "application/x-nzb")}
-response = requests.post(api_url, params=request_params, files=files, timeout=30)
+response = requests.post(api_url, params=request_params, files=files, timeout=30, verify=get_ssl_verify(api_url))
response.raise_for_status()
result = response.json()
@@ -190,7 +191,7 @@ class SABnzbdClient(DownloadClient):
def _fetch_nzb_content(self, url: str) -> bytes:
"""Fetch NZB content, including Prowlarr auth headers when appropriate."""
headers = self._get_prowlarr_headers(url)
-response = requests.get(url, timeout=30, headers=headers)
+response = requests.get(url, timeout=30, headers=headers, verify=get_ssl_verify(url))
response.raise_for_status()
return response.content
View File
@@ -1,5 +1,6 @@
"""Shared download client settings registration."""
from contextlib import contextmanager
from typing import Any, Dict, Optional
from shelfmark.core.settings_registry import (
@@ -12,10 +13,39 @@ from shelfmark.core.settings_registry import (
TagListField,
)
from shelfmark.core.utils import normalize_http_url
from shelfmark.download.network import get_ssl_verify
# ==================== Test Connection Callbacks ====================
@contextmanager
def _transmission_session_verify_override(url: str):
"""Ensure transmission-rpc constructor uses the configured TLS verify mode."""
verify = get_ssl_verify(url)
if verify:
yield
return
try:
import transmission_rpc.client as transmission_rpc_client
except Exception:
yield
return
original_session_factory = transmission_rpc_client.requests.Session
def _session_factory(*args: Any, **kwargs: Any) -> Any:
session = original_session_factory(*args, **kwargs)
session.verify = False
return session
transmission_rpc_client.requests.Session = _session_factory
try:
yield
finally:
transmission_rpc_client.requests.Session = original_session_factory
def _test_qbittorrent_connection(current_values: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
"""Test the qBittorrent connection using current form values."""
from shelfmark.core.config import config
@@ -36,7 +66,7 @@ def _test_qbittorrent_connection(current_values: Optional[Dict[str, Any]] = None
if not url:
return {"success": False, "message": "qBittorrent URL is invalid"}
-client = Client(host=url, username=username, password=password)
+client = Client(host=url, username=username, password=password, VERIFY_WEBUI_CERTIFICATE=get_ssl_verify(url))
client.auth_log_in()
api_version = client.app.web_api_version
return {"success": True, "message": f"Connected to qBittorrent (API v{api_version})"}
@@ -81,17 +111,25 @@ def _test_transmission_connection(current_values: Optional[Dict[str, Any]] = Non
"protocol": protocol,
}
try:
-client = Client(**client_kwargs)
+with _transmission_session_verify_override(url):
+client = Client(**client_kwargs)
except TypeError as e:
if "protocol" not in str(e):
raise
client_kwargs.pop("protocol", None)
-client = Client(**client_kwargs)
+with _transmission_session_verify_override(url):
+client = Client(**client_kwargs)
if protocol == "https" and hasattr(client, "protocol"):
try:
setattr(client, "protocol", protocol)
except Exception:
pass
# Keep session verify aligned for subsequent calls beyond constructor bootstrap.
http_session = getattr(client, "_http_session", None)
if http_session is not None:
http_session.verify = get_ssl_verify(url)
session = client.get_session()
version = session.version
return {"success": True, "message": f"Connected to Transmission {version}"}
@@ -151,7 +189,7 @@ def _test_deluge_connection(current_values: Optional[Dict[str, Any]] = None) ->
def rpc_call(session: requests.Session, rpc_id: int, method: str, *params: Any) -> Any:
payload = {"id": rpc_id, "method": method, "params": list(params)}
-resp = session.post(rpc_url, json=payload, timeout=15)
+resp = session.post(rpc_url, json=payload, timeout=15, verify=get_ssl_verify(rpc_url))
resp.raise_for_status()
data = resp.json()
if data.get("error"):
@@ -214,8 +252,9 @@ def _test_deluge_connection(current_values: Optional[Dict[str, Any]] = None) ->
def _test_rtorrent_connection(current_values: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
"""Test the rTorrent connection using current form values."""
from shelfmark.core.config import config
import ssl
from urllib.parse import urlparse
-from xmlrpc.client import ServerProxy
+from xmlrpc.client import SafeTransport, ServerProxy
current_values = current_values or {}
@@ -236,7 +275,16 @@ def _test_rtorrent_connection(current_values: Optional[Dict[str, Any]] = None) -
parsed = urlparse(url)
url = f"{parsed.scheme}://{username}:{password}@{parsed.netloc}{parsed.path}"
-rpc = ServerProxy(url.rstrip("/"))
+rpc_url = url.rstrip("/")
+verify = get_ssl_verify(rpc_url)
+if rpc_url.startswith("https://") and not verify:
+ssl_context = ssl.create_default_context()
+ssl_context.check_hostname = False
+ssl_context.verify_mode = ssl.CERT_NONE
+rpc = ServerProxy(rpc_url, transport=SafeTransport(context=ssl_context))
+else:
+rpc = ServerProxy(rpc_url)
version = rpc.system.client_version()
return {"success": True, "message": f"Connected to rTorrent {version}"}
except Exception as e:
@@ -264,7 +312,7 @@ def _test_nzbget_connection(current_values: Optional[Dict[str, Any]] = None) ->
try:
rpc_url = f"{url.rstrip('/')}/jsonrpc"
payload = {"jsonrpc": "2.0", "method": "status", "params": [], "id": 1}
-response = requests.post(rpc_url, json=payload, auth=(username, password), timeout=30)
+response = requests.post(rpc_url, json=payload, auth=(username, password), timeout=30, verify=get_ssl_verify(rpc_url))
response.raise_for_status()
result = response.json()
if "error" in result and result["error"]:
@@ -301,7 +349,7 @@ def _test_sabnzbd_connection(current_values: Optional[Dict[str, Any]] = None) ->
try:
api_url = f"{url.rstrip('/')}/api"
params = {"apikey": api_key, "mode": "version", "output": "json"}
-response = requests.get(api_url, params=params, timeout=30)
+response = requests.get(api_url, params=params, timeout=30, verify=get_ssl_verify(api_url))
response.raise_for_status()
result = response.json()
version = result.get("version", "unknown")
View File
@@ -11,6 +11,7 @@ import requests
from shelfmark.core.config import config
from shelfmark.core.logger import setup_logger
from shelfmark.download.network import get_ssl_verify
logger = setup_logger(__name__)
@@ -88,7 +89,7 @@ def extract_torrent_info(
# Use allow_redirects=False to handle magnet link redirects manually
# Some indexers redirect download URLs to magnet links
-resp = requests.get(url, timeout=30, allow_redirects=False, headers=headers)
+resp = requests.get(url, timeout=30, allow_redirects=False, headers=headers, verify=get_ssl_verify(url))
# Check if this is a redirect to a magnet link
if resp.status_code in (301, 302, 303, 307, 308):
@@ -103,7 +104,7 @@ def extract_torrent_info(
)
# Not a magnet redirect, follow it manually
logger.debug(f"Following redirect to: {redirect_url[:80]}...")
-resp = requests.get(redirect_url, timeout=30, headers=headers)
+resp = requests.get(redirect_url, timeout=30, headers=headers, verify=get_ssl_verify(redirect_url))
resp.raise_for_status()
torrent_data = resp.content
View File
@@ -4,12 +4,14 @@ Transmission download client for Prowlarr integration.
Uses the transmission-rpc library to communicate with Transmission's RPC API.
"""
-from typing import Optional, Tuple
+from contextlib import contextmanager
+from typing import Any, Iterator, Optional, Tuple
from shelfmark.core.config import config
from shelfmark.core.logger import setup_logger
from shelfmark.core.utils import normalize_http_url
from shelfmark.download.network import get_ssl_verify
from shelfmark.download.clients import (
DownloadClient,
DownloadStatus,
@@ -23,6 +25,50 @@ from shelfmark.download.clients.torrent_utils import (
logger = setup_logger(__name__)
@contextmanager
def _transmission_session_verify_override(url: str) -> Iterator[None]:
"""Temporarily override transmission-rpc's session factory when verify is disabled.
transmission-rpc performs an RPC call inside Client.__init__, so verify must be
set before the client is constructed.
"""
verify = get_ssl_verify(url)
if verify:
yield
return
try:
import transmission_rpc.client as transmission_rpc_client
except Exception:
# If internals differ, gracefully fall back to default behavior.
yield
return
original_session_factory = transmission_rpc_client.requests.Session
def _session_factory(*args: Any, **kwargs: Any) -> Any:
session = original_session_factory(*args, **kwargs)
session.verify = False
return session
transmission_rpc_client.requests.Session = _session_factory
try:
yield
finally:
transmission_rpc_client.requests.Session = original_session_factory
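The factory-swap pattern above can be demonstrated against a stub module, since transmission-rpc constructs its HTTP session inside `Client.__init__` (the stub below stands in for the library's module-level `requests` reference):

```python
from contextlib import contextmanager
import types

class _Session:
    """Stand-in for requests.Session: verification is on by default."""
    def __init__(self):
        self.verify = True

fake_requests = types.SimpleNamespace(Session=_Session)

@contextmanager
def session_verify_override(module):
    """Temporarily wrap module.Session so new sessions start with verify=False."""
    original = module.Session
    def factory(*args, **kwargs):
        session = original(*args, **kwargs)
        session.verify = False
        return session
    module.Session = factory
    try:
        yield
    finally:
        module.Session = original  # always restore the real factory

with session_verify_override(fake_requests):
    print(fake_requests.Session().verify)  # False inside the override
print(fake_requests.Session().verify)      # True once restored
```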
def _apply_transmission_ssl_verify(client: Any, url: str) -> None:
"""Apply global certificate validation policy to transmission-rpc client."""
session = getattr(client, "_http_session", None)
if session is None:
return
try:
session.verify = get_ssl_verify(url)
except Exception as e:
logger.debug("Unable to apply Transmission TLS verify setting: %s", e)
@register_client("torrent")
class TransmissionClient(DownloadClient):
"""Transmission download client using transmission-rpc library."""
@@ -57,19 +103,22 @@ class TransmissionClient(DownloadClient):
"protocol": protocol,
}
try:
-self._client = Client(**client_kwargs)
+with _transmission_session_verify_override(url):
+self._client = Client(**client_kwargs)
except TypeError as e:
# Older transmission-rpc versions may not accept protocol as a kwarg.
if "protocol" not in str(e):
raise
client_kwargs.pop("protocol", None)
-self._client = Client(**client_kwargs)
+with _transmission_session_verify_override(url):
+self._client = Client(**client_kwargs)
# Some versions expose protocol as an attribute rather than kwarg.
if protocol == "https" and hasattr(self._client, "protocol"):
try:
setattr(self._client, "protocol", protocol)
except Exception:
pass
_apply_transmission_ssl_verify(self._client, url)
self._category = config.get("TRANSMISSION_CATEGORY", "books")
self._download_dir = config.get("TRANSMISSION_DOWNLOAD_DIR", "")
View File
@@ -11,7 +11,7 @@ import requests
from tqdm import tqdm
from shelfmark.download import network
-from shelfmark.download.network import get_proxies
+from shelfmark.download.network import get_proxies, get_ssl_verify
from shelfmark.core.config import config as app_config
from shelfmark.core.logger import setup_logger
@@ -261,6 +261,7 @@ def html_get_page(
cookies=cookies,
headers=headers,
allow_redirects=allow_redirects,
verify=get_ssl_verify(current_url),
)
if is_aa_url and response.is_redirect:
@@ -403,7 +404,7 @@ def download_url(
logger.info(f"Downloading: {current_url} (attempt {attempt + 1}/{MAX_DOWNLOAD_RETRIES})")
# Try with CF cookies/UA if available
cookies = _apply_cf_bypass(current_url, headers)
-response = requests.get(current_url, stream=True, proxies=get_proxies(current_url), timeout=REQUEST_TIMEOUT, cookies=cookies, headers=headers)
+response = requests.get(current_url, stream=True, proxies=get_proxies(current_url), timeout=REQUEST_TIMEOUT, cookies=cookies, headers=headers, verify=get_ssl_verify(current_url))
response.raise_for_status()
if status_callback:
@@ -514,7 +515,7 @@ def _try_resume(
cookies = _apply_cf_bypass(url, resume_headers)
response = requests.get(
url, stream=True, proxies=get_proxies(url), timeout=REQUEST_TIMEOUT,
-headers=resume_headers, cookies=cookies
+headers=resume_headers, cookies=cookies, verify=get_ssl_verify(url)
)
# Check resume support
View File
@@ -90,6 +90,59 @@ def get_proxies(url: str = "") -> dict:
return {}
def get_ssl_verify(url: str = "") -> bool:
"""Return the ``verify`` value for outbound requests based on the
CERTIFICATE_VALIDATION setting.
- ``enabled`` → always ``True``
- ``disabled_local`` → ``False`` for local/private addresses, ``True`` otherwise
- ``disabled`` → always ``False``
"""
mode = app_config.get("CERTIFICATE_VALIDATION", "enabled")
if mode == "disabled":
return False
if mode == "disabled_local" and url:
try:
parsed = urllib.parse.urlparse(url)
hostname = parsed.hostname or ""
if hostname and _is_local_address(hostname):
return False
except Exception:
pass
return True
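A self-contained sketch of the three modes (the `_is_local_address` below is a simplified stand-in for the module's real helper, and the mode is passed in directly instead of read from config):

```python
import ipaddress
import urllib.parse

def _is_local_address(hostname: str) -> bool:
    # Simplified: treat "localhost" and private/loopback IPs as local.
    if hostname == "localhost":
        return True
    try:
        ip = ipaddress.ip_address(hostname)
        return ip.is_private or ip.is_loopback
    except ValueError:
        return False  # not an IP literal; assume non-local

def get_ssl_verify(mode: str, url: str = "") -> bool:
    if mode == "disabled":
        return False
    if mode == "disabled_local" and url:
        hostname = urllib.parse.urlparse(url).hostname or ""
        if hostname and _is_local_address(hostname):
            return False
    return True

print(get_ssl_verify("enabled", "https://192.168.1.5"))         # True
print(get_ssl_verify("disabled_local", "https://192.168.1.5"))  # False
print(get_ssl_verify("disabled_local", "https://example.com"))  # True
print(get_ssl_verify("disabled", "https://example.com"))        # False
```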
_ssl_warnings_suppressed = False
def _apply_ssl_warning_suppression() -> None:
"""Suppress or restore urllib3 InsecureRequestWarning based on the
CERTIFICATE_VALIDATION setting.
Called once at init and again whenever the setting changes via the UI.
Only modifies warning filters when the mode is not 'enabled', so the
default case is a complete no-op (zero behavioural change for users who
never touch the setting).
"""
global _ssl_warnings_suppressed # noqa: PLW0603
import urllib3
mode = app_config.get("CERTIFICATE_VALIDATION", "enabled")
if mode in ("disabled", "disabled_local"):
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
_ssl_warnings_suppressed = True
logger.debug("SSL warnings suppressed (certificate validation: %s)", mode)
elif _ssl_warnings_suppressed:
import warnings
warnings.simplefilter("default", urllib3.exceptions.InsecureRequestWarning)
_ssl_warnings_suppressed = False
logger.debug("SSL warnings restored (certificate validation: enabled)")
# DNS state - authoritative values managed by this module
# Other modules should use get_dns_config() to read these
CUSTOM_DNS: List[str] = []
@@ -418,7 +471,8 @@ class DoHResolver:
self.base_url,
params=params,
proxies=get_proxies(self.base_url),
timeout=10 # Increased from 5s to handle slow network conditions
timeout=10, # Increased from 5s to handle slow network conditions
verify=get_ssl_verify(self.base_url),
)
response.raise_for_status()
@@ -940,7 +994,7 @@ def _initialize_aa_state() -> None:
logger.debug(f"AA_BASE_URL: auto, checking available urls {_aa_urls}")
for i, url in enumerate(_aa_urls):
try:
response = requests.get(url, proxies=get_proxies(url), timeout=3)
response = requests.get(url, proxies=get_proxies(url), timeout=3, verify=get_ssl_verify(url))
if response.status_code == 200:
_current_aa_url_index = i
_aa_base_url = url
@@ -1036,6 +1090,7 @@ def init(force: bool = False) -> None:
try:
init_dns(force=force)
init_aa(force=force)
_apply_ssl_warning_suppression()
# Only set flag AFTER work completes successfully
_initialized = True
except Exception:


@@ -20,6 +20,7 @@ from shelfmark.core.settings_registry import (
HeadingField,
)
from shelfmark.core.config import config as app_config
from shelfmark.download.network import get_ssl_verify
from shelfmark.metadata_providers import (
BookMetadata,
DisplayField,
@@ -233,7 +234,7 @@ class GoogleBooksProvider(MetadataProvider):
url = f"{GOOGLE_BOOKS_BASE_URL}{endpoint}"
try:
response = self.session.get(url, params=params, timeout=15)
response = self.session.get(url, params=params, timeout=15, verify=get_ssl_verify(url))
response.raise_for_status()
return response.json()


@@ -16,6 +16,7 @@ from shelfmark.core.settings_registry import (
HeadingField,
)
from shelfmark.core.config import config as app_config
from shelfmark.download.network import get_ssl_verify
from shelfmark.metadata_providers import (
BookMetadata,
DisplayField,
@@ -635,7 +636,8 @@ class HardcoverProvider(MetadataProvider):
response = self.session.post(
HARDCOVER_API_URL,
json={"query": query, "variables": variables},
timeout=15
timeout=15,
verify=get_ssl_verify(HARDCOVER_API_URL),
)
response.raise_for_status()


@@ -10,6 +10,7 @@ import requests
from shelfmark.core.cache import cacheable
from shelfmark.core.logger import setup_logger
from shelfmark.download.network import get_ssl_verify
from shelfmark.core.settings_registry import (
register_settings,
CheckboxField,
@@ -188,7 +189,8 @@ class OpenLibraryProvider(MetadataProvider):
response = self.session.get(
f"{OPENLIBRARY_BASE_URL}/search.json",
params=params,
timeout=15
timeout=15,
verify=get_ssl_verify(OPENLIBRARY_BASE_URL),
)
response.raise_for_status()
data = response.json()
@@ -229,7 +231,8 @@ class OpenLibraryProvider(MetadataProvider):
try:
response = self.session.get(
f"{OPENLIBRARY_BASE_URL}/works/{book_id}.json",
timeout=15
timeout=15,
verify=get_ssl_verify(OPENLIBRARY_BASE_URL),
)
response.raise_for_status()
work = response.json()
@@ -261,7 +264,8 @@ class OpenLibraryProvider(MetadataProvider):
# First try the ISBN API which returns edition data
response = self.session.get(
f"{OPENLIBRARY_BASE_URL}/isbn/{clean_isbn}.json",
timeout=15
timeout=15,
verify=get_ssl_verify(OPENLIBRARY_BASE_URL),
)
response.raise_for_status()
edition = response.json()
@@ -485,7 +489,8 @@ class OpenLibraryProvider(MetadataProvider):
try:
response = self.session.get(
f"{OPENLIBRARY_BASE_URL}{author_key}.json",
timeout=10
timeout=10,
verify=get_ssl_verify(OPENLIBRARY_BASE_URL),
)
response.raise_for_status()
author = response.json()
@@ -504,7 +509,8 @@ def _test_openlibrary_connection() -> Dict[str, Any]:
response = provider.session.get(
f"{OPENLIBRARY_BASE_URL}/search.json",
params={"q": "test", "limit": 1},
timeout=10
timeout=10,
verify=get_ssl_verify(OPENLIBRARY_BASE_URL),
)
response.raise_for_status()
data = response.json()


@@ -113,6 +113,13 @@ class LeadingCellConfig:
uppercase: bool = False # Force uppercase for badge text
@dataclass
class SortOption:
"""A sort option that appears in the sort dropdown without being tied to a column."""
label: str # Display label in the sort dropdown
sort_key: str # Field to sort by on the Release object
@dataclass
class SourceActionButton:
"""Action button configuration for a release source."""
@@ -131,6 +138,7 @@ class ReleaseColumnConfig:
default_indexers: Optional[List[str]] = None # For Prowlarr: indexers selected in settings (pre-selected in filter)
cache_ttl_seconds: Optional[int] = None # How long to cache results (default: 5 min)
supported_filters: Optional[List[str]] = None # Which filters this source supports: ["format", "language", "indexer"]
extra_sort_options: Optional[List[SortOption]] = None # Additional sort options not tied to a column
action_button: Optional[SourceActionButton] = None # Custom action button (replaces default expand search)
@@ -191,6 +199,13 @@ def serialize_column_config(config: ReleaseColumnConfig) -> Dict[str, Any]:
if config.supported_filters is not None:
result["supported_filters"] = config.supported_filters
# Include extra sort options (sort entries not tied to a column)
if config.extra_sort_options:
result["extra_sort_options"] = [
{"label": opt.label, "sort_key": opt.sort_key}
for opt in config.extra_sort_options
]
# Include action button if specified (replaces default expand search)
if config.action_button is not None:
result["action_button"] = {

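Serialized, each `SortOption` becomes a plain dict that the frontend's `ExtraSortOption` interface mirrors. A minimal sketch of that round-trip, using simplified stand-in types rather than the full `serialize_column_config`:

```python
from dataclasses import dataclass
from typing import Any, Dict, List, Optional


@dataclass
class SortOption:
    """Sort entry shown in the dropdown without a backing column."""
    label: str
    sort_key: str


def serialize_extra_sort_options(options: Optional[List[SortOption]]) -> Dict[str, Any]:
    """Mirror of the extra_sort_options branch of serialize_column_config."""
    result: Dict[str, Any] = {}
    if options:  # omitted entirely when None or empty, matching the truthiness check above
        result["extra_sort_options"] = [
            {"label": opt.label, "sort_key": opt.sort_key} for opt in options
        ]
    return result


payload = serialize_extra_sort_options([SortOption(label="Peers", sort_key="seeders")])
```

For the Prowlarr "Peers" entry registered later in this commit, `payload` is `{"extra_sort_options": [{"label": "Peers", "sort_key": "seeders"}]}`.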

@@ -164,7 +164,7 @@ def search_books(query: str, filters: SearchFilters) -> List[BookInfo]:
filters_query = ""
for value in filters.lang or config.BOOK_LANGUAGE:
for value in filters.lang if filters.lang is not None else config.BOOK_LANGUAGE:
if value != "all":
filters_query += f"&lang={quote(value)}"
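The old `filters.lang or config.BOOK_LANGUAGE` collapsed an explicit empty selection ("All languages") back to the configured defaults, because an empty list is falsy; the fix falls back only when the value is `None`. A standalone sketch of the distinction (`DEFAULT_BOOK_LANGUAGE` stands in for `config.BOOK_LANGUAGE`):

```python
from typing import List, Optional
from urllib.parse import quote

DEFAULT_BOOK_LANGUAGE = ["en"]  # stand-in for config.BOOK_LANGUAGE


def build_lang_query(lang: Optional[List[str]]) -> str:
    """None -> fall back to defaults; [] -> no language filter ('All'); list -> explicit filter."""
    filters_query = ""
    for value in lang if lang is not None else DEFAULT_BOOK_LANGUAGE:
        if value != "all":
            filters_query += f"&lang={quote(value)}"
    return filters_query
```

Here `build_lang_query(None)` yields `"&lang=en"`, `build_lang_query(["de", "hu"])` yields an explicit filter, and `build_lang_query([])` yields the empty string — whereas the pre-fix `lang or DEFAULT_BOOK_LANGUAGE` would have silently reverted `[]` to the default language.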
@@ -704,6 +704,7 @@ def _extract_libgen_download_url(link: str, cancel_flag: Optional[Event] = None)
timeout=(5, 10),
allow_redirects=True,
proxies=network.get_proxies(link),
verify=network.get_ssl_verify(link),
)
if response.status_code != 200:
@@ -1172,8 +1173,7 @@ class DirectDownloadSource(ReleaseSource):
if isbn:
logger.debug(f"Searching direct_download: isbn='{isbn}', langs={lang_filter}")
filters = SearchFilters(isbn=[isbn])
if lang_filter:
filters.lang = lang_filter
filters.lang = lang_filter if lang_filter is not None else []
try:
results = search_books(isbn, filters)
if results:
@@ -1200,7 +1200,7 @@ class DirectDownloadSource(ReleaseSource):
continue
logger.debug(f"Searching direct_download: title_author='{query}', langs={langs}")
filters = SearchFilters(lang=langs) if langs else SearchFilters()
filters = SearchFilters(lang=langs if langs is not None else [])
try:
for bi in search_books(query, filters):
if bi.id not in seen_ids:


@@ -6,6 +6,7 @@ import requests
from shelfmark.core.logger import setup_logger
from shelfmark.core.utils import normalize_http_url
from shelfmark.download.network import get_ssl_verify
from shelfmark.release_sources.prowlarr.torznab import parse_torznab_xml
logger = setup_logger(__name__)
@@ -42,6 +43,7 @@ class ProwlarrClient:
params=params,
json=json_data,
timeout=self.timeout,
verify=get_ssl_verify(url),
)
if not response.ok:
@@ -193,6 +195,7 @@ class ProwlarrClient:
# Override the session default JSON accept header.
"Accept": "application/rss+xml, application/xml;q=0.9, */*;q=0.8"
},
verify=get_ssl_verify(url),
)
if not response.ok:
try:


@@ -22,6 +22,7 @@ from shelfmark.release_sources import (
ColumnColorHint,
LeadingCellConfig,
LeadingCellType,
SortOption,
)
from shelfmark.release_sources.prowlarr.api import ProwlarrClient
from shelfmark.core.utils import normalize_http_url
@@ -467,6 +468,9 @@ class ProwlarrSource(ReleaseSource):
sort_key="size_bytes",
),
],
extra_sort_options=[
SortOption(label="Peers", sort_key="seeders"),
],
grid_template="minmax(0,2fr) minmax(140px,1fr) 50px 50px 90px 80px",
leading_cell=LeadingCellConfig(type=LeadingCellType.NONE), # No leading cell for Prowlarr
available_indexers=available_indexers,


@@ -10,6 +10,7 @@ import {
ButtonStateInfo,
RequestPolicyMode,
CreateRequestPayload,
isMetadataBook,
} from './types';
import {
getBookInfo,
@@ -673,12 +674,6 @@ function App() {
}
};
// Handle "Find Downloads" from DetailsModal
const handleFindDownloads = (book: Book) => {
setSelectedBook(null);
setReleaseBook(book);
};
const submitRequest = useCallback(
async (payload: CreateRequestPayload, successMessage: string): Promise<boolean> => {
try {
@@ -1327,9 +1322,16 @@ function App() {
book={selectedBook}
onClose={() => setSelectedBook(null)}
onDownload={handleDownload}
onFindDownloads={handleFindDownloads}
onFindDownloads={(book) => {
setSelectedBook(null);
void handleGetReleases(book);
}}
onSearchSeries={handleSearchSeries}
buttonState={getDirectActionButtonState(selectedBook.id)}
buttonState={
isMetadataBook(selectedBook)
? getUniversalActionButtonState(selectedBook.id)
: getDirectActionButtonState(selectedBook.id)
}
/>
)}


@@ -69,6 +69,11 @@ export const BookDownloadButton = ({
const displayText = isQueuing ? 'Queuing...' : buttonState.text;
const showCircularProgress = buttonState.state === 'downloading' && buttonState.progress !== undefined;
const showSpinner = (isInProgress && !showCircularProgress) || isQueuing;
const isRequestAction = buttonState.state === 'download' && buttonState.text === 'Request';
const iconVariantActionIconPath = isRequestAction
? 'M12 4.5v15m7.5-7.5h-15'
: 'M3 16.5v2.25A2.25 2.25 0 0 0 5.25 21h13.5A2.25 2.25 0 0 0 21 18.75V16.5M16.5 12 12 16.5m0 0L7.5 12m4.5 4.5V3';
const primaryActionIconPath = isRequestAction ? 'M12 4.5v15m7.5-7.5h-15' : 'M12 4v12m0 0l-4-4m4 4 4-4M6 20h12';
const primaryStateClasses =
isCompleted
@@ -198,10 +203,10 @@ export const BookDownloadButton = ({
return (
<>
<svg className={`${iconSizes.mobile} sm:hidden`} fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={1.5} d="M3 16.5v2.25A2.25 2.25 0 0 0 5.25 21h13.5A2.25 2.25 0 0 0 21 18.75V16.5M16.5 12 12 16.5m0 0L7.5 12m4.5 4.5V3" />
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={1.5} d={iconVariantActionIconPath} />
</svg>
<svg className={`${iconSizes.desktop} hidden sm:block`} fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={1.5} d="M3 16.5v2.25A2.25 2.25 0 0 0 5.25 21h13.5A2.25 2.25 0 0 0 21 18.75V16.5M16.5 12 12 16.5m0 0L7.5 12m4.5 4.5V3" />
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={1.5} d={iconVariantActionIconPath} />
</svg>
</>
);
@@ -221,7 +226,7 @@ export const BookDownloadButton = ({
>
{variant === 'primary' && showIcon && !isCompleted && !hasError && !showCircularProgress && !showSpinner && (
<svg className={primaryIconSizes[size]} fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M12 4v12m0 0l-4-4m4 4 4-4M6 20h12" />
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d={primaryActionIconPath} />
</svg>
)}


@@ -78,7 +78,10 @@ export const DetailsModal = ({ book, onClose, onDownload, onFindDownloads, onSea
// Determine if this is a metadata book (Universal mode) vs a release (Direct Download)
const isMetadata = isMetadataBook(book);
const metadataActionText =
isMetadata && buttonState.state === 'download' && buttonState.text === 'Get'
? 'Find Downloads'
: buttonState.text;
const publisherInfo = { label: 'Publisher', value: book.publisher || '-' };
// Build metadata grid based on mode
@@ -336,19 +339,21 @@ export const DetailsModal = ({ book, onClose, onDownload, onFindDownloads, onSea
</svg>
</a>
)}
{/* Action button - Find Downloads (Universal) or Download (Direct) */}
{/* Action button - mirrors search result action state/flow */}
<button
onClick={isMetadata ? () => onFindDownloads?.(book) : handleDownload}
disabled={!isMetadata && buttonState.state !== 'download'}
disabled={isMetadata ? buttonState.state === 'blocked' : buttonState.state !== 'download'}
className={`ml-auto rounded-full px-6 py-2.5 text-sm font-medium text-white transition-colors focus:outline-none focus:ring-2 focus:ring-offset-2 disabled:opacity-50 disabled:cursor-not-allowed ${
isMetadata
? 'bg-emerald-600 hover:bg-emerald-700 focus:ring-emerald-500'
? buttonState.state === 'blocked'
? 'bg-gray-500 focus:ring-gray-400'
: 'bg-emerald-600 hover:bg-emerald-700 focus:ring-emerald-500'
: buttonState.state === 'blocked'
? 'bg-gray-500 focus:ring-gray-400'
: 'bg-sky-700 hover:bg-sky-800 focus:ring-sky-500'
}`}
>
{isMetadata ? 'Find Downloads' : buttonState.text}
{isMetadata ? metadataActionText : buttonState.text}
</button>
</div>
</footer>


@@ -24,7 +24,13 @@ import { ReleaseCell } from './ReleaseCell';
import { getColorStyleFromHint } from '../utils/colorMaps';
import { getNestedValue } from '../utils/objectHelpers';
import { LanguageMultiSelect } from './LanguageMultiSelect';
import { LANGUAGE_OPTION_ALL, LANGUAGE_OPTION_DEFAULT, getLanguageFilterValues, releaseLanguageMatchesFilter, buildLanguageNormalizer } from '../utils/languageFilters';
import {
LANGUAGE_OPTION_DEFAULT,
getLanguageFilterValues,
getReleaseSearchLanguageParams,
releaseLanguageMatchesFilter,
buildLanguageNormalizer,
} from '../utils/languageFilters';
// Module-level cache for release search results
// Key format: `${provider}:${provider_id}:${source}:${contentType}`
@@ -1125,10 +1131,7 @@ export const ReleaseModal = ({
try {
// Resolve language codes for the API call (same logic as Apply button)
const langCodes = getLanguageFilterValues(languageFilter, bookLanguages, defaultLanguages);
const languagesParam = (langCodes === null || langCodes?.includes(LANGUAGE_OPTION_ALL))
? undefined
: langCodes;
const languagesParam = getReleaseSearchLanguageParams(languageFilter, bookLanguages, defaultLanguages);
// Pass indexer filter only if the source supports it (empty array = search all)
const supportsIndexerFilter = releasesBySource[activeTab]?.column_config?.supported_filters?.includes('indexer');
@@ -1289,6 +1292,21 @@ export const ReleaseModal = ({
return columnConfig.columns.filter(col => col.sortable) || [];
}, [columnConfig]);
// Build unified list of all sort options (from sortable columns + extra_sort_options)
const allSortOptions = useMemo(() => {
const fromColumns = sortableColumns.map(col => ({
label: col.label,
sortKey: col.sort_key || col.key,
defaultDirection: inferDefaultDirection(col.render_type) as 'asc' | 'desc',
}));
const fromExtra = (columnConfig.extra_sort_options || []).map(opt => ({
label: opt.label,
sortKey: opt.sort_key,
defaultDirection: 'desc' as const, // Extra sort options are typically numeric (e.g., peers)
}));
return [...fromColumns, ...fromExtra];
}, [sortableColumns, columnConfig.extra_sort_options]);
// Get current sort state for active tab (from state, localStorage, or default to null = best match)
const currentSort = useMemo((): SortState | null => {
// Check state first - explicit null means "Default" was selected
@@ -1299,17 +1317,17 @@ export const ReleaseModal = ({
const saved = getSavedSort(activeTab);
if (saved) {
// Verify the saved sort is still valid for this source
const isValid = sortableColumns.some(col => (col.sort_key || col.key) === saved.key);
const isValid = allSortOptions.some(opt => opt.sortKey === saved.key);
if (isValid) {
return saved;
}
}
// Default to null (best-match sorting)
return null;
}, [activeTab, sortBySource, sortableColumns]);
}, [activeTab, sortBySource, allSortOptions]);
// Handle sort change - null means "Default" (best title match), otherwise toggle direction or set new column
const handleSortChange = useCallback((sortKey: string | null, column: ColumnSchema | null) => {
const handleSortChange = useCallback((sortKey: string | null, defaultDirection: 'asc' | 'desc') => {
if (sortKey === null) {
// "Default" selected - use best-match sorting
setSortBySource(prev => {
@@ -1330,16 +1348,16 @@ export const ReleaseModal = ({
let newState: SortState;
if (currentState && currentState.key === sortKey) {
// Same column - toggle direction
// Same key - toggle direction
newState = {
key: sortKey,
direction: currentState.direction === 'asc' ? 'desc' : 'asc',
};
} else {
// New column - use default direction for this column type
// New key - use provided default direction
newState = {
key: sortKey,
direction: inferDefaultDirection(column!.render_type),
direction: defaultDirection,
};
}
@@ -1385,7 +1403,7 @@ export const ReleaseModal = ({
});
// Then, sort by explicit column, or default to book-title relevance with exact author boost
if (currentSort && sortableColumns.length > 0) {
if (currentSort && allSortOptions.length > 0) {
filtered = sortReleases(filtered, currentSort.key, currentSort.direction);
} else {
const responseBook = releasesBySource[activeTab]?.book;
@@ -1395,7 +1413,7 @@ export const ReleaseModal = ({
}
return filtered;
}, [releasesBySource, activeTab, formatFilter, resolvedLanguageCodes, effectiveFormats, defaultLanguages, languageNormalizer, indexerFilter, currentSort, sortableColumns, columnConfig, book]);
}, [releasesBySource, activeTab, formatFilter, resolvedLanguageCodes, effectiveFormats, defaultLanguages, languageNormalizer, indexerFilter, currentSort, allSortOptions, columnConfig, book]);
// Pre-compute display field lookups to avoid repeated .find() calls in JSX
const displayFields = useMemo(() => {
@@ -1771,8 +1789,8 @@ export const ReleaseModal = ({
</svg>
</button>
{/* Sort dropdown - only show if source has sortable columns */}
{sortableColumns.length > 0 && (
{/* Sort dropdown - only show if source has sort options */}
{allSortOptions.length > 0 && (
<Dropdown
align="right"
widthClassName="w-auto flex-shrink-0"
@@ -1800,7 +1818,7 @@ export const ReleaseModal = ({
<button
type="button"
onClick={() => {
handleSortChange(null, null);
handleSortChange(null, 'asc');
close();
}}
className={`w-full px-3 py-2 text-left text-sm flex items-center justify-between hover-surface rounded ${!currentSort
@@ -1815,16 +1833,15 @@ export const ReleaseModal = ({
</svg>
)}
</button>
{sortableColumns.map((col) => {
const sortKey = col.sort_key || col.key;
const isSelected = currentSort?.key === sortKey;
{allSortOptions.map((opt) => {
const isSelected = currentSort?.key === opt.sortKey;
const direction = isSelected ? currentSort?.direction : null;
return (
<button
key={sortKey}
key={opt.sortKey}
type="button"
onClick={() => {
handleSortChange(sortKey, col);
handleSortChange(opt.sortKey, opt.defaultDirection);
// Don't close - allow toggling direction
if (!isSelected) close();
}}
@@ -1833,7 +1850,7 @@ export const ReleaseModal = ({
: 'text-gray-700 dark:text-gray-300'
}`}
>
<span>{col.label}</span>
<span>{opt.label}</span>
{isSelected && direction && (
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24" strokeWidth={2}>
{direction === 'asc' ? (
@@ -1955,11 +1972,7 @@ export const ReleaseModal = ({
setLoadingBySource((prev) => ({ ...prev, [activeTab]: true }));
try {
// Resolve language codes for the API call
const langCodes = getLanguageFilterValues(languageFilter, bookLanguages, defaultLanguages);
// Don't pass languages if "All" is selected or null
const languagesParam = (langCodes === null || langCodes?.includes(LANGUAGE_OPTION_ALL))
? undefined
: langCodes;
const languagesParam = getReleaseSearchLanguageParams(languageFilter, bookLanguages, defaultLanguages);
// Pass indexer filter only if the source supports it (empty array = search all)
const supportsIndexerFilter = columnConfig.supported_filters?.includes('indexer');


@@ -0,0 +1,46 @@
import * as assert from 'node:assert/strict';
import { describe, it } from 'node:test';
import type { Language } from '../types/index.js';
import {
LANGUAGE_OPTION_ALL,
LANGUAGE_OPTION_DEFAULT,
getReleaseSearchLanguageParams,
} from '../utils/languageFilters.js';
const supportedLanguages: Language[] = [
{ code: 'en', language: 'English' },
{ code: 'de', language: 'German' },
{ code: 'hu', language: 'Hungarian' },
];
describe('languageFilters release search params', () => {
it('omits languages when only default selection is active', () => {
const result = getReleaseSearchLanguageParams(
[LANGUAGE_OPTION_DEFAULT],
supportedLanguages,
['en'],
);
assert.equal(result, undefined);
});
it('preserves explicit all-languages selection', () => {
const result = getReleaseSearchLanguageParams(
[LANGUAGE_OPTION_ALL],
supportedLanguages,
['en'],
);
assert.deepEqual(result, [LANGUAGE_OPTION_ALL]);
});
it('resolves explicit language selections to codes', () => {
const result = getReleaseSearchLanguageParams(
['de', 'hu'],
supportedLanguages,
['en'],
);
assert.deepEqual(result, ['de', 'hu']);
});
});


@@ -315,6 +315,11 @@ export interface LeadingCellConfig {
uppercase?: boolean; // Force uppercase for badge text
}
export interface ExtraSortOption {
label: string; // Display label in the sort dropdown
sort_key: string; // Field to sort by on the Release object
}
export interface SourceActionButton {
label: string; // Button text (e.g., "Refresh search")
action: string; // Action type: "expand" triggers expand_search
@@ -329,6 +334,7 @@ export interface ReleaseColumnConfig {
default_indexers?: string[]; // For Prowlarr: indexers selected in settings (pre-selected in filter)
cache_ttl_seconds?: number; // How long to cache results (default: 300 = 5 min)
supported_filters?: string[]; // Which filters this source supports: ["format", "language", "indexer"]
extra_sort_options?: ExtraSortOption[]; // Additional sort options not tied to a column
action_button?: SourceActionButton; // Custom action button (replaces default expand search)
}


@@ -66,6 +66,21 @@ export const getLanguageFilterValues = (
return resolved.size ? Array.from(resolved) : null;
};
/**
* Resolve language selection for /api/releases requests.
* - undefined: use backend defaults
* - ["all"]: disable language filtering
* - ["en", ...]: explicit filter list
*/
export const getReleaseSearchLanguageParams = (
selection: string[],
supportedLanguages: Language[],
defaultLanguageCodes: string[] = [],
): string[] | undefined => {
const resolved = getLanguageFilterValues(selection, supportedLanguages, defaultLanguageCodes);
return resolved === null ? undefined : resolved;
};
export const formatDefaultLanguageLabel = (
languageCodes: string[],
supportedLanguages: Language[],
@@ -136,4 +151,3 @@ export const releaseLanguageMatchesFilter = (
const selectedSet = new Set(selectedCodes.map(c => c.toLowerCase()));
return releaseCodes.every(code => selectedSet.has(code));
};


@@ -309,7 +309,7 @@ class TestSecuritySettings:
action = next((f for f in fields if f.key == "open_users_tab"), None)
assert action is not None
assert action.label == "Go to Users"
assert action.show_when == {"field": "AUTH_METHOD", "value": "builtin"}
assert action.show_when == {"field": "AUTH_METHOD", "value": ["builtin", "oidc"]}
class TestSecurityOnSave:


@@ -279,6 +279,40 @@ class TestActivityRoutes:
assert "expired-task-1" in response.json["status"]["complete"]
assert response.json["status"]["complete"]["expired-task-1"]["id"] == "expired-task-1"
def test_admin_snapshot_backfills_terminal_downloads_across_users(self, main_module, client):
admin = _create_user(main_module, prefix="admin", role="admin")
request_owner = _create_user(main_module, prefix="reader")
_set_session(client, user_id=admin["username"], db_user_id=admin["id"], is_admin=True)
main_module.activity_service.record_terminal_snapshot(
user_id=request_owner["id"],
item_type="download",
item_key="download:cross-user-expired-task",
origin="requested",
final_status="complete",
source_id="cross-user-expired-task",
snapshot={
"kind": "download",
"download": {
"id": "cross-user-expired-task",
"title": "Cross User Task",
"author": "Another User",
"added_time": 123,
"status_message": "Finished",
"source": "direct_download",
"user_id": request_owner["id"],
},
},
)
with patch.object(main_module, "get_auth_mode", return_value="builtin"):
with patch.object(main_module.backend, "queue_status", return_value=_sample_status_payload()):
response = client.get("/api/activity/snapshot")
assert response.status_code == 200
assert "cross-user-expired-task" in response.json["status"]["complete"]
assert response.json["status"]["complete"]["cross-user-expired-task"]["id"] == "cross-user-expired-task"
def test_snapshot_clears_stale_download_dismissal_when_same_task_is_active(self, main_module, client):
user = _create_user(main_module, prefix="reader")
_set_session(client, user_id=user["username"], db_user_id=user["id"], is_admin=False)
@@ -332,6 +366,29 @@ class TestActivityRoutes:
assert snapshot_two.status_code == 200
assert {"item_type": "download", "item_key": "download:shared-task"} not in snapshot_two.json["dismissed"]
def test_admin_request_dismissal_is_shared_across_admin_users(self, main_module, client):
admin_one = _create_user(main_module, prefix="admin-one", role="admin")
admin_two = _create_user(main_module, prefix="admin-two", role="admin")
with patch.object(main_module, "get_auth_mode", return_value="builtin"):
_set_session(client, user_id=admin_one["username"], db_user_id=admin_one["id"], is_admin=True)
dismiss_response = client.post(
"/api/activity/dismiss",
json={"item_type": "request", "item_key": "request:999999"},
)
assert dismiss_response.status_code == 200
_set_session(client, user_id=admin_two["username"], db_user_id=admin_two["id"], is_admin=True)
with patch.object(main_module.backend, "queue_status", return_value=_sample_status_payload()):
snapshot_response = client.get("/api/activity/snapshot")
history_response = client.get("/api/activity/history?limit=50&offset=0")
assert snapshot_response.status_code == 200
assert {"item_type": "request", "item_key": "request:999999"} in snapshot_response.json["dismissed"]
assert history_response.status_code == 200
assert any(row["item_key"] == "request:999999" for row in history_response.json)
def test_history_paging_is_stable_and_non_overlapping(self, main_module, client):
user = _create_user(main_module, prefix="history-user")
_set_session(client, user_id=user["username"], db_user_id=user["id"], is_admin=False)


@@ -235,7 +235,10 @@ class TestActivityService:
item_key="download:task-2",
)
rows = activity_service.get_undismissed_terminal_downloads(user["id"])
rows = activity_service.get_undismissed_terminal_downloads(
user["id"],
owner_user_id=user["id"],
)
assert len(rows) == 1
assert rows[0]["item_key"] == "download:task-1"
assert rows[0]["final_status"] == "complete"
@@ -243,3 +246,50 @@ class TestActivityService:
"kind": "download",
"download": {"id": "task-1", "status_message": "done"},
}
def test_get_undismissed_terminal_downloads_can_span_owners_for_admin_viewer(
self,
user_db,
activity_service,
):
viewer = user_db.create_user(username="admin-viewer", role="admin")
owner_one = user_db.create_user(username="owner-one")
owner_two = user_db.create_user(username="owner-two")
activity_service.record_terminal_snapshot(
user_id=owner_one["id"],
item_type="download",
item_key="download:owner-one-task",
origin="direct",
final_status="complete",
source_id="owner-one-task",
terminal_at="2026-01-01T10:00:00+00:00",
snapshot={"kind": "download", "download": {"id": "owner-one-task"}},
)
activity_service.record_terminal_snapshot(
user_id=owner_two["id"],
item_type="download",
item_key="download:owner-two-task",
origin="direct",
final_status="complete",
source_id="owner-two-task",
terminal_at="2026-01-01T11:00:00+00:00",
snapshot={"kind": "download", "download": {"id": "owner-two-task"}},
)
activity_service.dismiss_item(
user_id=viewer["id"],
item_type="download",
item_key="download:owner-two-task",
)
all_owner_rows = activity_service.get_undismissed_terminal_downloads(
viewer["id"],
owner_user_id=None,
)
assert [row["item_key"] for row in all_owner_rows] == ["download:owner-one-task"]
owner_one_rows = activity_service.get_undismissed_terminal_downloads(
viewer["id"],
owner_user_id=owner_one["id"],
)
assert [row["item_key"] for row in owner_one_rows] == ["download:owner-one-task"]
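The behavior these assertions pin down separates two roles: the viewer (whose dismissals always apply) and the `owner_user_id` filter (`None` spans all owners, for the admin view). A minimal sketch of that filtering — the real `get_undismissed_terminal_downloads` is a DB query, so the names and shapes here are illustrative only:

```python
from typing import Any, Dict, List, Optional, Set


def undismissed_terminal_downloads(
    rows: List[Dict[str, Any]],
    viewer_dismissed_keys: Set[str],
    owner_user_id: Optional[int] = None,
) -> List[Dict[str, Any]]:
    """Viewer's dismissals always apply; owner_user_id=None spans all owners (admin view)."""
    out = []
    for row in rows:
        if owner_user_id is not None and row["user_id"] != owner_user_id:
            continue  # restricted to a single owner's rows
        if row["item_key"] in viewer_dismissed_keys:
            continue  # hidden for this viewer regardless of who owns the row
        out.append(row)
    return out
```

Mirroring the test above: with rows owned by two users and the viewer having dismissed `download:owner-two-task`, passing `owner_user_id=None` returns only `download:owner-one-task`, and restricting to owner one returns the same row.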


@@ -259,3 +259,51 @@ class TestProvisionOIDCUser:
assert user["username"] != "john" # Should have a suffix
assert user["oidc_subject"] == "sub-456"
assert user["auth_source"] == "oidc"
def test_provision_links_to_existing_user_by_email(self, user_db):
"""When allow_email_link=True and emails match, link to existing local user."""
from shelfmark.core.oidc_auth import provision_oidc_user
user_db.create_user(
username="localuser",
email="shared@example.com",
password_hash="hash",
)
user_info = {
"oidc_subject": "oidc-sub-789",
"username": "oidcuser",
"email": "shared@example.com",
"display_name": "OIDC User",
}
user = provision_oidc_user(
user_db, user_info, is_admin=False, allow_email_link=True,
)
assert user["username"] == "localuser"
assert user["oidc_subject"] == "oidc-sub-789"
assert user["auth_source"] == "oidc"
assert user["email"] == "shared@example.com"
def test_provision_does_not_link_by_email_when_disabled(self, user_db):
"""When allow_email_link=False (default), don't link by email."""
from shelfmark.core.oidc_auth import provision_oidc_user
user_db.create_user(
username="localuser",
email="shared@example.com",
password_hash="hash",
)
user_info = {
"oidc_subject": "oidc-sub-no-link",
"username": "oidcuser",
"email": "shared@example.com",
"display_name": "OIDC User",
}
user = provision_oidc_user(
user_db, user_info, is_admin=False, allow_email_link=False,
)
# Should create a new user, not link to existing
assert user["username"] == "oidcuser"
assert user["oidc_subject"] == "oidc-sub-no-link"
original = user_db.get_user(username="localuser")
assert original["oidc_subject"] is None
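The email-link branch these two tests exercise can be sketched as follows. This is an illustrative in-memory model only — the shipped `provision_oidc_user` works against the user DB and handles username collisions, admin flags, and other details not shown here:

```python
from typing import Any, Dict, Optional


def provision_by_email(
    users: Dict[str, Dict[str, Any]],  # username -> user row (stand-in for the user DB)
    user_info: Dict[str, Any],
    allow_email_link: bool,
) -> Dict[str, Any]:
    """Link the OIDC identity to an existing local user by email, else create a new user."""
    email = user_info.get("email")
    if allow_email_link and email:
        for user in users.values():
            if user.get("email") == email:
                user["oidc_subject"] = user_info["oidc_subject"]
                user["auth_source"] = "oidc"
                return user
    new_user = {
        "username": user_info["username"],
        "email": email,
        "oidc_subject": user_info["oidc_subject"],
        "auth_source": "oidc",
    }
    users[new_user["username"]] = new_user
    return new_user
```

With `allow_email_link=True` a matching email returns the existing local row with its `oidc_subject` filled in; with the flag off, a fresh user is created and the local account is left untouched — the same two outcomes the tests assert.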


@@ -253,13 +253,14 @@ class TestOIDCCallbackEndpoint:
assert "issuer validation failed" in error
@patch("shelfmark.core.oidc_routes._get_oidc_client")
def test_callback_redirects_when_auto_provision_disabled(self, mock_get_client, client):
def test_callback_redirects_when_auto_provision_disabled_and_no_email_match(
self, mock_get_client, client
):
config = {**MOCK_OIDC_CONFIG, "OIDC_AUTO_PROVISION": False}
fake_client = Mock()
fake_client.authorize_access_token.return_value = {
"userinfo": {
"sub": "unknown-user",
"email": "unknown@example.com",
"preferred_username": "unknown",
"groups": [],
}
@@ -272,7 +273,7 @@ class TestOIDCCallbackEndpoint:
assert "Account not found" in error
@patch("shelfmark.core.oidc_routes._get_oidc_client")
def test_callback_allows_pre_created_user_by_verified_email_when_no_provision(
def test_callback_links_pre_created_user_by_email_when_no_provision(
self, mock_get_client, client, user_db
):
config = {**MOCK_OIDC_CONFIG, "OIDC_AUTO_PROVISION": False}
@@ -283,7 +284,6 @@ class TestOIDCCallbackEndpoint:
"userinfo": {
"sub": "oidc-alice-sub",
"email": "alice@example.com",
"email_verified": True,
"preferred_username": "alice_oidc",
"groups": [],
}
@@ -298,18 +298,16 @@ class TestOIDCCallbackEndpoint:
assert sess.get("db_user_id") is not None
@patch("shelfmark.core.oidc_routes._get_oidc_client")
def test_callback_does_not_link_unverified_email_when_no_provision(
def test_callback_does_not_link_when_no_email_and_no_provision(
self, mock_get_client, client, user_db
):
config = {**MOCK_OIDC_CONFIG, "OIDC_AUTO_PROVISION": False}
user = user_db.create_user(username="bob", email="bob@example.com", password_hash="hash")
user_db.create_user(username="bob", email="bob@example.com", password_hash="hash")
fake_client = Mock()
fake_client.authorize_access_token.return_value = {
"userinfo": {
"sub": "oidc-bob-sub",
"email": "bob@example.com",
"email_verified": False,
"preferred_username": "bob_oidc",
"groups": [],
}
@@ -321,7 +319,7 @@ class TestOIDCCallbackEndpoint:
assert error is not None
assert "Account not found" in error
updated_user = user_db.get_user(user_id=user["id"])
updated_user = user_db.get_user(username="bob")
assert updated_user["oidc_subject"] is None
@patch("shelfmark.core.oidc_routes._get_oidc_client")
@@ -359,3 +357,84 @@ class TestOIDCCallbackEndpoint:
error = _get_oidc_error(resp)
assert error is not None
assert "Authentication failed" in error
@patch("shelfmark.core.oidc_routes._get_oidc_client")
def test_callback_links_to_existing_user_by_email(
self, mock_get_client, client, user_db
):
"""OIDC login with matching email should link to existing local user."""
user_db.create_user(username="localuser", email="shared@example.com", password_hash="hash")
fake_client = Mock()
fake_client.authorize_access_token.return_value = {
"userinfo": {
"sub": "oidc-new-sub",
"email": "shared@example.com",
"preferred_username": "oidcuser",
"groups": [],
}
}
mock_get_client.return_value = (fake_client, MOCK_OIDC_CONFIG)
resp = client.get("/api/auth/oidc/callback?code=abc123&state=test-state")
assert resp.status_code == 302
with client.session_transaction() as sess:
assert sess["user_id"] == "localuser"
linked = user_db.get_user(username="localuser")
assert linked["oidc_subject"] == "oidc-new-sub"
assert linked["auth_source"] == "oidc"
@patch("shelfmark.core.oidc_routes._get_oidc_client")
def test_callback_creates_new_user_when_no_email_match(
self, mock_get_client, client, user_db
):
"""OIDC login without matching email creates a new user."""
user_db.create_user(username="existing", email="other@example.com", password_hash="hash")
fake_client = Mock()
fake_client.authorize_access_token.return_value = {
"userinfo": {
"sub": "oidc-nomatch",
"email": "different@example.com",
"preferred_username": "newuser",
"groups": [],
}
}
mock_get_client.return_value = (fake_client, MOCK_OIDC_CONFIG)
resp = client.get("/api/auth/oidc/callback?code=abc123&state=test-state")
assert resp.status_code == 302
with client.session_transaction() as sess:
assert sess["user_id"] == "newuser"
original = user_db.get_user(username="existing")
assert original["oidc_subject"] is None
@patch("shelfmark.core.oidc_routes._get_oidc_client")
def test_callback_no_email_link_when_oidc_has_no_email(
self, mock_get_client, client, user_db
):
"""OIDC login without email in claims should not attempt email linking."""
user_db.create_user(username="existing", email="existing@example.com", password_hash="hash")
fake_client = Mock()
fake_client.authorize_access_token.return_value = {
"userinfo": {
"sub": "oidc-noemail",
"preferred_username": "noemailuser",
"groups": [],
}
}
mock_get_client.return_value = (fake_client, MOCK_OIDC_CONFIG)
resp = client.get("/api/auth/oidc/callback?code=abc123&state=test-state")
assert resp.status_code == 302
with client.session_transaction() as sess:
assert sess["user_id"] == "noemailuser"
original = user_db.get_user(username="existing")
assert original["oidc_subject"] is None

View File

@@ -2,10 +2,11 @@ import requests
class _FakeResponse:
def __init__(self, status_code: int, *, headers: dict | None = None, text: str = "") -> None:
def __init__(self, status_code: int, *, headers: dict | None = None, text: str = "", url: str = "") -> None:
self.status_code = status_code
self.headers = headers or {}
self.text = text
self.url = url
@property
def is_redirect(self) -> bool: # requests.Response compatibility
@@ -53,9 +54,9 @@ def test_html_get_page_aa_cross_host_redirect_rotates_mirror(monkeypatch):
def fake_get(url: str, **kwargs):
calls.append({"url": url, "allow_redirects": kwargs.get("allow_redirects")})
if url.startswith("https://annas-archive.li/"):
return _FakeResponse(302, headers={"Location": "https://annas-archive.pm/search?q=test"})
return _FakeResponse(302, headers={"Location": "https://annas-archive.pm/search?q=test"}, url=url)
if url.startswith("https://annas-archive.gl/"):
return _FakeResponse(200, text="OK")
return _FakeResponse(200, text="OK", url=url)
raise AssertionError(f"Unexpected URL: {url}")
monkeypatch.setattr(http.requests, "get", fake_get)
@@ -88,9 +89,9 @@ def test_html_get_page_aa_same_host_redirect_is_followed(monkeypatch):
def fake_get(url: str, **kwargs):
calls.append({"url": url, "allow_redirects": kwargs.get("allow_redirects")})
if url == "https://annas-archive.li/search?q=test":
return _FakeResponse(302, headers={"Location": "/search?q=test&page=1"})
return _FakeResponse(302, headers={"Location": "/search?q=test&page=1"}, url=url)
if url == "https://annas-archive.li/search?q=test&page=1":
return _FakeResponse(200, text="OK2")
return _FakeResponse(200, text="OK2", url=url)
raise AssertionError(f"Unexpected URL: {url}")
monkeypatch.setattr(http.requests, "get", fake_get)
@@ -125,7 +126,7 @@ def test_html_get_page_locked_aa_does_not_fail_over_on_cross_host_redirect(monke
def fake_get(url: str, **kwargs):
calls.append(url)
if url.startswith("https://annas-archive.li/"):
return _FakeResponse(302, headers={"Location": "https://annas-archive.pm/search?q=test"})
return _FakeResponse(302, headers={"Location": "https://annas-archive.pm/search?q=test"}, url=url)
raise AssertionError(f"Unexpected URL: {url}")
monkeypatch.setattr(http.requests, "get", fake_get)
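The fakes above distinguish two redirect cases: a same-host `Location` is followed manually, while a cross-host `Location` is treated as a dead mirror and triggers rotation. A small sketch of that classification under assumed names (`classify_redirect` is not the real function; the actual logic lives inside shelfmark's `http` module):

```python
from urllib.parse import urljoin, urlparse

def classify_redirect(request_url: str, location: str) -> tuple[str, str]:
    """Return ("follow", next_url) for same-host redirects,
    ("rotate", next_url) when the Location points at another host."""
    # Resolve relative Locations (e.g. "/search?q=test&page=1") against
    # the request URL before comparing hosts.
    next_url = urljoin(request_url, location)
    if urlparse(next_url).hostname == urlparse(request_url).hostname:
        return ("follow", next_url)
    return ("rotate", next_url)
```

With `allow_redirects=False` on the underlying `requests.get`, the caller sees each 302 explicitly, which is what lets these tests assert on the per-hop behavior.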

View File

@@ -0,0 +1,274 @@
"""Tests for certificate validation / SSL verify utilities."""
import warnings
import pytest
# ---------------------------------------------------------------------------
# get_ssl_verify()
# ---------------------------------------------------------------------------
class TestGetSslVerify:
"""Tests for get_ssl_verify() return values across all modes."""
def test_enabled_returns_true(self, monkeypatch):
import shelfmark.download.network as network
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "enabled" if k == "CERTIFICATE_VALIDATION" else d)
assert network.get_ssl_verify("https://example.com") is True
def test_enabled_returns_true_for_local_url(self, monkeypatch):
import shelfmark.download.network as network
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "enabled" if k == "CERTIFICATE_VALIDATION" else d)
assert network.get_ssl_verify("https://localhost:8080") is True
def test_disabled_returns_false_for_public_url(self, monkeypatch):
import shelfmark.download.network as network
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "disabled" if k == "CERTIFICATE_VALIDATION" else d)
assert network.get_ssl_verify("https://example.com") is False
def test_disabled_returns_false_for_local_url(self, monkeypatch):
import shelfmark.download.network as network
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "disabled" if k == "CERTIFICATE_VALIDATION" else d)
assert network.get_ssl_verify("https://192.168.1.1:9091") is False
def test_disabled_returns_false_with_no_url(self, monkeypatch):
import shelfmark.download.network as network
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "disabled" if k == "CERTIFICATE_VALIDATION" else d)
assert network.get_ssl_verify() is False
def test_default_when_unset_returns_true(self, monkeypatch):
"""When CERTIFICATE_VALIDATION is not in config, default is 'enabled'."""
import shelfmark.download.network as network
monkeypatch.setattr(network.app_config, "get", lambda k, d="": d)
assert network.get_ssl_verify("https://example.com") is True
class TestGetSslVerifyDisabledLocal:
"""Tests for 'disabled_local' mode with various address types."""
@pytest.fixture(autouse=True)
def _set_mode(self, monkeypatch):
import shelfmark.download.network as network
self.network = network
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "disabled_local" if k == "CERTIFICATE_VALIDATION" else d)
# --- Should return False (local addresses) ---
def test_localhost(self):
assert self.network.get_ssl_verify("https://localhost:8080/path") is False
def test_127_0_0_1(self):
assert self.network.get_ssl_verify("http://127.0.0.1:9091") is False
def test_ipv6_loopback(self):
assert self.network.get_ssl_verify("http://[::1]:8080") is False
def test_private_10_x(self):
assert self.network.get_ssl_verify("https://10.0.0.5:443") is False
def test_private_172_16_x(self):
assert self.network.get_ssl_verify("https://172.16.0.1:8080") is False
def test_private_172_31_x(self):
assert self.network.get_ssl_verify("https://172.31.255.255:443") is False
def test_private_192_168_x(self):
assert self.network.get_ssl_verify("https://192.168.1.100:9696") is False
def test_dot_local_domain(self):
assert self.network.get_ssl_verify("https://authelia.local:9091") is False
def test_dot_internal_domain(self):
assert self.network.get_ssl_verify("https://prowlarr.internal:9696") is False
def test_dot_lan_domain(self):
assert self.network.get_ssl_verify("https://server.lan:443") is False
def test_dot_home_domain(self):
assert self.network.get_ssl_verify("https://nas.home:5000") is False
def test_dot_docker_domain(self):
assert self.network.get_ssl_verify("https://app.docker:8080") is False
def test_simple_hostname_no_dot(self):
"""Docker-style service names like 'prowlarr', 'deluge'."""
assert self.network.get_ssl_verify("http://prowlarr:9696") is False
def test_link_local_169_254(self):
assert self.network.get_ssl_verify("http://169.254.1.1:8080") is False
# --- Should return True (public addresses) ---
def test_public_domain(self):
assert self.network.get_ssl_verify("https://example.com") is True
def test_public_ip(self):
assert self.network.get_ssl_verify("https://8.8.8.8:443") is True
def test_public_subdomain(self):
assert self.network.get_ssl_verify("https://api.hardcover.app/v1/graphql") is True
def test_172_32_is_public(self):
"""172.32.x.x is NOT in the private range (only 172.16-31.x.x)."""
assert self.network.get_ssl_verify("https://172.32.0.1:443") is True
def test_empty_url_returns_true(self):
"""No URL means we can't determine locality — default to verify."""
assert self.network.get_ssl_verify("") is True
def test_no_url_returns_true(self):
assert self.network.get_ssl_verify() is True
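Taken together, these cases fully specify the three modes. A standalone sketch consistent with every assertion above — the mode is passed as a parameter here purely for illustration (the real `get_ssl_verify` reads `CERTIFICATE_VALIDATION` from `app_config`, and its internals may differ):

```python
import ipaddress
from urllib.parse import urlparse

# Hostname suffixes the tests treat as local networks.
LOCAL_SUFFIXES = (".local", ".internal", ".lan", ".home", ".docker")

def _is_local_host(url: str) -> bool:
    host = urlparse(url).hostname or ""
    if not host:
        return False
    try:
        ip = ipaddress.ip_address(host)
        # Covers 127.0.0.1, ::1, 10/8, 172.16/12, 192.168/16, 169.254/16.
        return ip.is_loopback or ip.is_private or ip.is_link_local
    except ValueError:
        pass  # Not an IP literal; fall through to hostname heuristics.
    if host == "localhost" or host.endswith(LOCAL_SUFFIXES):
        return True
    return "." not in host  # Docker-style service names like "prowlarr".

def get_ssl_verify(url: str = "", mode: str = "enabled") -> bool:
    if mode == "disabled":
        return False
    if mode == "disabled_local" and url:
        return not _is_local_host(url)
    # "enabled", unknown modes, or no URL to classify: verify.
    return True
```

Note the `172.32.x.x` case: `ipaddress` only treats 172.16.0.0/12 as private, so 172.32.0.1 correctly verifies.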
# ---------------------------------------------------------------------------
# _apply_ssl_warning_suppression()
# ---------------------------------------------------------------------------
class TestApplySslWarningSuppression:
"""Tests for urllib3 InsecureRequestWarning suppression toggling."""
@pytest.fixture(autouse=True)
def _reset_suppression_flag(self):
"""Ensure the module-level flag is clean before each test."""
import shelfmark.download.network as network
original = network._ssl_warnings_suppressed
yield
network._ssl_warnings_suppressed = original
def test_enabled_at_init_is_noop(self, monkeypatch):
"""When mode is 'enabled' and warnings were never suppressed, nothing changes."""
import shelfmark.download.network as network
network._ssl_warnings_suppressed = False
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "enabled" if k == "CERTIFICATE_VALIDATION" else d)
filters_before = list(warnings.filters)
network._apply_ssl_warning_suppression()
filters_after = list(warnings.filters)
assert filters_before == filters_after
def test_disabled_mode_suppresses_warnings(self, monkeypatch):
import urllib3
import shelfmark.download.network as network
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "disabled" if k == "CERTIFICATE_VALIDATION" else d)
network._apply_ssl_warning_suppression()
with warnings.catch_warnings(record=True) as w:
warnings.simplefilter("always")
warnings.warn("test", urllib3.exceptions.InsecureRequestWarning)
# urllib3.disable_warnings() installs an "ignore" filter for
# InsecureRequestWarning, but catch_warnings with "always" takes
# precedence inside this context manager, so the recorded list is not
# a reliable signal here. Assert the filter was installed instead.
filters = [f for f in warnings.filters if len(f) >= 3 and f[2] is urllib3.exceptions.InsecureRequestWarning]
assert len(filters) > 0
def test_disabled_local_mode_suppresses_warnings(self, monkeypatch):
import urllib3
import shelfmark.download.network as network
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "disabled_local" if k == "CERTIFICATE_VALIDATION" else d)
network._apply_ssl_warning_suppression()
filters = [f for f in warnings.filters if len(f) >= 3 and f[2] is urllib3.exceptions.InsecureRequestWarning]
assert len(filters) > 0
def test_enabled_mode_restores_warnings(self, monkeypatch):
import urllib3
import shelfmark.download.network as network
# First suppress
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "disabled" if k == "CERTIFICATE_VALIDATION" else d)
network._apply_ssl_warning_suppression()
# Then restore
monkeypatch.setattr(network.app_config, "get", lambda k, d="": "enabled" if k == "CERTIFICATE_VALIDATION" else d)
network._apply_ssl_warning_suppression()
# "default" filter should be present for InsecureRequestWarning
default_filters = [
f for f in warnings.filters
if len(f) >= 3 and f[0] == "default" and f[2] is urllib3.exceptions.InsecureRequestWarning
]
assert len(default_filters) > 0
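The toggle these tests exercise is a thin wrapper over the `warnings` filter list (`urllib3.disable_warnings()` is itself just `warnings.simplefilter("ignore", category)`). A hedged, stdlib-only sketch — the function name, the `category` parameter (standing in for `urllib3.exceptions.InsecureRequestWarning`), and the module-level flag all simplify the real `_apply_ssl_warning_suppression`:

```python
import warnings

_ssl_warnings_suppressed = False  # Stands in for the real module flag.

def apply_ssl_warning_suppression(mode: str, category=Warning) -> None:
    global _ssl_warnings_suppressed
    if mode in ("disabled", "disabled_local"):
        # Equivalent to urllib3.disable_warnings(category): prepends an
        # "ignore" filter for the category.
        warnings.simplefilter("ignore", category)
        _ssl_warnings_suppressed = True
    elif _ssl_warnings_suppressed:
        # Restore the default behavior only if we previously silenced it,
        # matching the no-op assertion in test_enabled_at_init_is_noop.
        warnings.filterwarnings("default", category=category)
        _ssl_warnings_suppressed = False
```

Re-applying on every settings save (see the live-apply test below the settings-registration tests) is what makes the toggle take effect without a restart.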
# ---------------------------------------------------------------------------
# Settings registration
# ---------------------------------------------------------------------------
class TestCertificateValidationSetting:
"""Tests for the CERTIFICATE_VALIDATION settings field registration."""
def _get_network_fields(self):
import shelfmark.config.settings # noqa: F401 — ensure settings tabs are registered
from shelfmark.core.settings_registry import get_settings_tab
tab = get_settings_tab("network")
assert tab is not None
return {field.key: field for field in tab.fields if hasattr(field, "key")}
def test_field_registered(self):
fields = self._get_network_fields()
assert "CERTIFICATE_VALIDATION" in fields
def test_field_is_select(self):
from shelfmark.core.settings_registry import SelectField
fields = self._get_network_fields()
assert isinstance(fields["CERTIFICATE_VALIDATION"], SelectField)
def test_field_default_is_enabled(self):
fields = self._get_network_fields()
assert fields["CERTIFICATE_VALIDATION"].default == "enabled"
def test_field_has_three_options(self):
fields = self._get_network_fields()
options = fields["CERTIFICATE_VALIDATION"].options
assert len(options) == 3
def test_field_option_values(self):
fields = self._get_network_fields()
values = [opt["value"] for opt in fields["CERTIFICATE_VALIDATION"].options]
assert values == ["enabled", "disabled_local", "disabled"]
# ---------------------------------------------------------------------------
# Live-apply on settings save
# ---------------------------------------------------------------------------
def test_update_settings_certificate_validation_triggers_suppression(monkeypatch):
"""Changing CERTIFICATE_VALIDATION via update_settings calls _apply_ssl_warning_suppression."""
import shelfmark.config.settings # noqa: F401 — ensure settings tabs are registered
from shelfmark.core.config import config as config_obj
from shelfmark.core.settings_registry import update_settings
monkeypatch.setattr("shelfmark.core.settings_registry.save_config_file", lambda _tab, _values: True)
monkeypatch.setattr(config_obj, "refresh", lambda: None)
called = {"count": 0}
import shelfmark.download.network as network
def fake_apply():
called["count"] += 1
monkeypatch.setattr(network, "_apply_ssl_warning_suppression", fake_apply)
result = update_settings("network", {"CERTIFICATE_VALIDATION": "disabled"})
assert result["success"] is True
assert called["count"] == 1

View File

@@ -0,0 +1,115 @@
"""SSL verification behavior for download client settings test callbacks."""
import types
from types import SimpleNamespace
from unittest.mock import MagicMock, patch
def make_config_getter(values):
"""Create a config.get function that returns values from a dict."""
def getter(key, default=""):
return values.get(key, default)
return getter
def test_transmission_settings_test_connection_applies_ssl_verify(monkeypatch):
"""Transmission settings callback should apply verify mode to transmission-rpc session."""
from shelfmark.core.config import config as config_obj
from shelfmark.download.clients import settings as settings_module
current_values = {
"TRANSMISSION_URL": "https://localhost:9091",
"TRANSMISSION_USERNAME": "admin",
"TRANSMISSION_PASSWORD": "password",
}
monkeypatch.setattr(config_obj, "get", make_config_getter(current_values))
monkeypatch.setattr(settings_module, "get_ssl_verify", lambda _url: False)
mock_http_session = SimpleNamespace(verify=True)
mock_client = MagicMock()
mock_client._http_session = mock_http_session
mock_client.get_session.return_value = SimpleNamespace(version="4.0.0")
mock_transmission_rpc = MagicMock()
mock_transmission_rpc.Client = MagicMock(return_value=mock_client)
with patch.dict("sys.modules", {"transmission_rpc": mock_transmission_rpc}):
result = settings_module._test_transmission_connection(current_values=current_values)
assert result["success"] is True
assert mock_http_session.verify is False
def test_transmission_settings_test_connection_disables_verify_during_constructor(monkeypatch):
"""Settings callback should disable verify before transmission-rpc constructor bootstraps."""
from shelfmark.core.config import config as config_obj
from shelfmark.download.clients import settings as settings_module
current_values = {
"TRANSMISSION_URL": "https://localhost:9091",
"TRANSMISSION_USERNAME": "admin",
"TRANSMISSION_PASSWORD": "password",
}
monkeypatch.setattr(config_obj, "get", make_config_getter(current_values))
monkeypatch.setattr(settings_module, "get_ssl_verify", lambda _url: False)
transmission_pkg = types.ModuleType("transmission_rpc")
transmission_pkg.__path__ = []
transmission_client_mod = types.ModuleType("transmission_rpc.client")
def _base_session_factory():
return types.SimpleNamespace(verify=True)
transmission_client_mod.requests = types.SimpleNamespace(Session=_base_session_factory)
def _fake_client_ctor(**_kwargs):
bootstrap_session = transmission_client_mod.requests.Session()
if bootstrap_session.verify is not False:
raise RuntimeError("verify not disabled during constructor bootstrap")
client = MagicMock()
client._http_session = bootstrap_session
client.get_session.return_value = types.SimpleNamespace(version="4.0.0")
return client
transmission_pkg.Client = _fake_client_ctor
transmission_pkg.client = transmission_client_mod
with patch.dict(
"sys.modules",
{
"transmission_rpc": transmission_pkg,
"transmission_rpc.client": transmission_client_mod,
},
):
result = settings_module._test_transmission_connection(current_values=current_values)
assert result["success"] is True
def test_rtorrent_settings_test_connection_uses_unverified_transport_when_disabled(monkeypatch):
"""rTorrent settings callback should pass SafeTransport for HTTPS when verify is disabled."""
from shelfmark.core.config import config as config_obj
from shelfmark.download.clients import settings as settings_module
current_values = {
"RTORRENT_URL": "https://localhost:8080/RPC2",
"RTORRENT_USERNAME": "",
"RTORRENT_PASSWORD": "",
}
monkeypatch.setattr(config_obj, "get", make_config_getter(current_values))
monkeypatch.setattr(settings_module, "get_ssl_verify", lambda _url: False)
mock_rpc = MagicMock()
mock_rpc.system.client_version.return_value = "0.9.8"
mock_xmlrpc = MagicMock()
mock_xmlrpc.ServerProxy = MagicMock(return_value=mock_rpc)
with patch.dict("sys.modules", {"xmlrpc.client": mock_xmlrpc}):
result = settings_module._test_rtorrent_connection(current_values=current_values)
assert result["success"] is True
assert mock_xmlrpc.SafeTransport.called is True
assert "transport" in mock_xmlrpc.ServerProxy.call_args.kwargs

View File

@@ -80,6 +80,36 @@ class TestRTorrentClientIsConfigured:
class TestRTorrentClientTestConnection:
"""Tests for RTorrentClient.test_connection()."""
def test_init_https_disabled_verification_uses_unverified_transport(self, monkeypatch):
"""HTTPS rTorrent with verify disabled should use a SafeTransport with custom SSL context."""
config_values = {
"RTORRENT_URL": "https://localhost:8080/RPC2",
"RTORRENT_USERNAME": "",
"RTORRENT_PASSWORD": "",
"RTORRENT_DOWNLOAD_DIR": "/downloads",
"RTORRENT_LABEL": "cwabd",
}
monkeypatch.setattr(
"shelfmark.download.clients.rtorrent.config.get",
make_config_getter(config_values),
)
mock_rpc = MagicMock()
mock_xmlrpc = create_mock_xmlrpc_module()
mock_xmlrpc.ServerProxy.return_value = mock_rpc
with patch.dict("sys.modules", {"xmlrpc.client": mock_xmlrpc}):
if "shelfmark.download.clients.rtorrent" in sys.modules:
del sys.modules["shelfmark.download.clients.rtorrent"]
from shelfmark.download.clients import rtorrent as rtorrent_module
monkeypatch.setattr(rtorrent_module, "get_ssl_verify", lambda _url: False)
rtorrent_module.RTorrentClient()
assert mock_xmlrpc.SafeTransport.called is True
assert "transport" in mock_xmlrpc.ServerProxy.call_args.kwargs
def test_test_connection_success(self, monkeypatch):
"""Test successful connection."""
config_values = {

View File

@@ -9,6 +9,7 @@ from unittest.mock import MagicMock, patch
from datetime import timedelta
import pytest
import sys
import types
from shelfmark.download.clients import DownloadStatus
@@ -154,6 +155,87 @@ class TestTransmissionClientTestConnection:
TransmissionClient()
assert mock_transmission_rpc.Client.call_args.kwargs.get("protocol") == "https"
def test_init_applies_certificate_validation_to_session(self, monkeypatch):
"""Test Transmission client applies verify mode onto transmission-rpc session."""
config_values = {
"TRANSMISSION_URL": "https://localhost:9091",
"TRANSMISSION_USERNAME": "admin",
"TRANSMISSION_PASSWORD": "password",
"TRANSMISSION_CATEGORY": "test",
}
monkeypatch.setattr(
"shelfmark.download.clients.transmission.config.get",
make_config_getter(config_values),
)
mock_http_session = MagicMock()
mock_client_instance = MagicMock()
mock_client_instance._http_session = mock_http_session
mock_transmission_rpc = create_mock_transmission_rpc_module()
mock_transmission_rpc.Client.return_value = mock_client_instance
with patch.dict("sys.modules", {"transmission_rpc": mock_transmission_rpc}):
if "shelfmark.download.clients.transmission" in sys.modules:
del sys.modules["shelfmark.download.clients.transmission"]
from shelfmark.download.clients import transmission as transmission_module
monkeypatch.setattr(transmission_module, "get_ssl_verify", lambda _url: False)
transmission_module.TransmissionClient()
assert mock_http_session.verify is False
def test_init_disables_verify_before_constructor_bootstrap(self, monkeypatch):
"""verify=False must be in place before transmission-rpc constructor bootstraps RPC session."""
config_values = {
"TRANSMISSION_URL": "https://localhost:9091",
"TRANSMISSION_USERNAME": "admin",
"TRANSMISSION_PASSWORD": "password",
"TRANSMISSION_CATEGORY": "test",
}
monkeypatch.setattr(
"shelfmark.download.clients.transmission.config.get",
make_config_getter(config_values),
)
transmission_pkg = types.ModuleType("transmission_rpc")
transmission_pkg.__path__ = [] # Mark as package for submodule imports.
transmission_client_mod = types.ModuleType("transmission_rpc.client")
def _base_session_factory():
return types.SimpleNamespace(verify=True)
transmission_client_mod.requests = types.SimpleNamespace(Session=_base_session_factory)
def _fake_client_ctor(**_kwargs):
bootstrap_session = transmission_client_mod.requests.Session()
if bootstrap_session.verify is not False:
raise RuntimeError("verify not disabled during constructor bootstrap")
client = MagicMock()
client._http_session = bootstrap_session
client.get_session.return_value = MockSession(version="4.0.5")
return client
transmission_pkg.Client = _fake_client_ctor
transmission_pkg.client = transmission_client_mod
with patch.dict(
"sys.modules",
{
"transmission_rpc": transmission_pkg,
"transmission_rpc.client": transmission_client_mod,
},
):
if "shelfmark.download.clients.transmission" in sys.modules:
del sys.modules["shelfmark.download.clients.transmission"]
from shelfmark.download.clients import transmission as transmission_module
monkeypatch.setattr(transmission_module, "get_ssl_verify", lambda _url: False)
client = transmission_module.TransmissionClient()
assert client._client._http_session.verify is False
def test_test_connection_success(self, monkeypatch):
"""Test successful connection."""
config_values = {