Compare commits


127 Commits

Author SHA1 Message Date
Safihre
e1ea4f1e7e Update text files for 5.0.0Beta1 2026-02-02 15:26:00 +01:00
mnightingale
e40098d0e7 Remove sleep when cache is full (#3300)
* Remove cache full sleep

* Remove override_trigger
2026-02-02 10:27:48 +01:00
SABnzbd Automation
5025f9ec5d Update translatable texts
[skip ci]
2026-02-02 08:15:36 +00:00
mnightingale
26a485374c Only wake servers while requests are available (#3299) 2026-02-02 09:14:54 +01:00
mnightingale
5b3a8fcd3f Fix nzb deadlocks (#3298)
* Fix nzb deadlocks

* Keep the lock behaviour unchanged but ensure correct order
2026-01-30 17:23:57 +01:00
mnightingale
44447ab416 Add more logging to stop_idle_jobs (#3294)
* Add more logging to stop_idle_jobs

* Would log too much

* Reduce logging a little

* Tweak message

* Spelling

* Log first articles
2026-01-30 15:24:52 +01:00
mnightingale
040573c75c Fix deadlock in hard_reset/remove_socket (#3297) 2026-01-29 18:14:22 +01:00
mnightingale
16a6936053 Bind socket throughout test but don't listen and configure a timeout (#3296) 2026-01-29 12:07:27 +01:00
mnightingale
e2921e7b9c Add guards to process_nw_read (#3295) 2026-01-29 08:10:12 +01:00
mnightingale
e1cd1eed83 Remove unused logging arguments (#3293) 2026-01-27 20:35:23 +01:00
SABnzbd Automation
a4de704967 Update translatable texts
[skip ci]
2026-01-26 18:07:10 +00:00
mnightingale
d9f9aa5bea Fix adding sockets mid-connect (#3291)
* Do not add sockets that are not already connected

* Don't preemptively mark thread busy

* Clear nntp instance on failed connect

* Just use reset_nw like everywhere else

* Track when the socket is connected; idle connections can handle requests when connected (completed auth) or socket_connected

* Add tests for connection state handling

* Windows is really slow at this

* Rename connected to ready and socket_connected to connected
2026-01-26 19:06:22 +01:00
renovate[bot]
f4b73cf9ec Update all dependencies (#3292)
* Update all dependencies

* pycparser dropped support for Python 3.9

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Safihre <safihre@sabnzbd.org>
2026-01-26 11:09:39 +01:00
SABnzbd Automation
ddc84542eb Update translatable texts
[skip ci]
2026-01-26 09:50:06 +00:00
Safihre
9624a285f1 Add Apprise documentation URL to Notifications page 2026-01-26 10:45:11 +01:00
Safihre
43a9678f07 Do not show tracebacks externally
Relates to #3286
2026-01-23 13:36:46 +01:00
mnightingale
4ee41e331c Handle SSLWantWriteError exceptions and buffer writes (#3289)
* Handle SSLWantWriteError

* Add buffer for non-blocking writes
2026-01-23 09:21:32 +01:00
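
The commit above pairs `SSLWantWriteError` handling with a write buffer: when a non-blocking TLS socket cannot accept more data, the unsent bytes must be kept and retried later. A minimal sketch of that pattern — the `BufferedWriter` class and its interface are illustrative, not SABnzbd's actual code:

```python
import ssl
from collections import deque


class BufferedWriter:
    """Buffer bytes that a non-blocking TLS socket could not send.

    `sock` is any object with a send(bytes) -> int method (assumed
    interface for this sketch).
    """

    def __init__(self, sock):
        self.sock = sock
        self.buffer = bytearray()

    def write(self, data: bytes) -> None:
        self.buffer += data
        self.flush()

    def flush(self) -> None:
        # Push out as much as possible; on SSLWantWriteError the kernel/TLS
        # buffer is full, so keep the remainder and retry when the selector
        # reports the socket writable again.
        while self.buffer:
            try:
                sent = self.sock.send(bytes(self.buffer))
            except (ssl.SSLWantWriteError, BlockingIOError):
                return
            del self.buffer[:sent]
```

The key point is that a failed send loses nothing: the data stays in `buffer` until a later `flush()` succeeds.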
mnightingale
062dc9fa11 Fix assembler waiting for failed article (#3290) 2026-01-23 07:12:16 +01:00
SABnzbd Automation
d215d4b0d7 Update translatable texts
[skip ci]
2026-01-21 20:44:06 +00:00
Safihre
04711886d9 Bump next release from 4.6 to 5.0 due to major changes
No release yet, just text bumps
2026-01-21 21:42:59 +01:00
mnightingale
a19b3750e3 Handle non-fatal errors during read (#3280)
* Handle non-fatal errors during read

* sabctools 9.3.1
2026-01-21 21:18:58 +01:00
mnightingale
eff5f663ab Add database indexes (#3283)
* Add database indexes

* Remove completed bytes index

* Allow duplicate query to short circuit

* Remove duplicate indexes

* Remove most of the query changes
2026-01-20 11:52:49 +01:00
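
The commit above adds indexes to the history database so lookups no longer scan the whole table. A generic sqlite sketch of the idea; the table and column names here are illustrative, not SABnzbd's actual schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE history (id INTEGER PRIMARY KEY, nzo_id TEXT, completed INTEGER)"
)
# An index on the column used in WHERE clauses turns a full-table scan
# into a B-tree lookup
con.execute("CREATE INDEX IF NOT EXISTS idx_history_nzo_id ON history (nzo_id)")

# EXPLAIN QUERY PLAN shows whether the query can use the index
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM history WHERE nzo_id = ?", ("abc",)
).fetchall()
```

Duplicate indexes (as the later bullets remove) only add write overhead without speeding reads, which is why the PR also prunes them.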
mnightingale
46c98acff3 Fix inconsistent NzbFile sorting (#3276)
* Fix inconsistent NzbFile sorting

* Add more groups and tiers

* Black formatting
2026-01-20 11:06:02 +01:00
renovate[bot]
df5fad29bc Update all dependencies (#3285)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-19 13:04:31 +01:00
SABnzbd Automation
27d222943c Update translatable texts
[skip ci]
2026-01-19 11:43:49 +00:00
Safihre
3384beed24 Make black 26.1.0 happy again - almost 2026-01-19 12:42:51 +01:00
mnightingale
bf41237135 Trigger assemble when next is available and it has been 5 seconds (#3281) 2026-01-17 09:53:06 +01:00
mnightingale
3d4fabfbdf Log socket exception type on write error (#3279) 2026-01-16 22:08:16 +01:00
mnightingale
cf14e24036 Revert pipelining stat/head check (#3278) 2026-01-15 21:12:20 +01:00
Safihre
d0c2b74181 Add current version/environment metadata to _api_showlog
Closes #3277
2026-01-15 10:02:34 +01:00
mnightingale
d21a111993 Fix pipelining connection read/write logic errors (#3272)
* Fix commands which fail to be sent are lost

* Force macOS to use the select implementation

* Do not recreate lock when reinitialised

* Suppress errors when closing socket to ensure socket is closed

* Make connection errors on read or write both only wait 5 seconds before reconnecting

* Fix selector selection

* Only check generation under lock

* Use PollSelector
2026-01-12 15:19:07 +01:00
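
Two of the bullets above concern selector choice: macOS is forced onto the `select()` implementation, and `PollSelector` is used elsewhere to avoid `select()`'s FD_SETSIZE limit. A hedged sketch of that decision using the stdlib `selectors` module — not the actual SABnzbd code:

```python
import selectors
import sys


def make_selector() -> selectors.BaseSelector:
    # selectors.DefaultSelector would pick kqueue on macOS, which the
    # commit above works around by forcing select(); poll() avoids the
    # file-descriptor limit of select() on other platforms.
    if sys.platform == "darwin":
        return selectors.SelectSelector()
    if hasattr(selectors, "PollSelector"):
        return selectors.PollSelector()
    return selectors.SelectSelector()
```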
mnightingale
3e7dcce365 Fix queue cannot be loaded (#3271)
* Fix queue cannot be restored

* Also change init

* Add a test

Fixes #3269
2026-01-12 09:54:59 +01:00
renovate[bot]
5594d4d6eb Update dependency urllib3 to v2.6.3 (#3274)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-12 01:34:09 +00:00
SABnzbd Automation
605a1b30be Update translatable texts
[skip ci]
2026-01-11 21:42:10 +00:00
mnightingale
a2cb861640 Fix bug when bandwidth limit is removed (#3273) 2026-01-11 22:41:27 +01:00
mnightingale
df1c0915d0 Recreate frènch_german_demö test data (#3268) 2026-01-09 10:59:51 +01:00
mnightingale
4d73c3e9c0 Adjust monitored socket events as required to prevent hot looping (#3267)
* Adjust monitored socket events as required to prevent hot looping

* Prevent write hot looping

* Guard against pending recursive call

* Already have EVENT_READ and don't need to handle it in two places
2026-01-08 15:14:59 +01:00
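
Monitoring `EVENT_WRITE` while there is nothing to send makes an event loop spin, because a healthy socket is almost always writable. The commit above adjusts the monitored events on the fly instead; a minimal sketch of that pattern (helper name invented):

```python
import selectors
import socket


def set_events(sel: selectors.BaseSelector, sock, want_write: bool) -> None:
    # Always watch for reads; only watch for writes while output is pending,
    # then drop EVENT_WRITE again so select() does not hot-loop.
    events = selectors.EVENT_READ
    if want_write:
        events |= selectors.EVENT_WRITE
    try:
        sel.modify(sock, events)
    except KeyError:  # not registered yet
        sel.register(sock, events)
```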
Safihre
17dcff49b2 Generalize locking strategy (#3264)
* Use per-nzo/nzf lock in wrapper

* Replace global TryList lock with per-nzo one

* Offset does require file_lock
2026-01-06 17:00:27 +01:00
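
The idea in this commit is to replace one global TryList lock with a lock per NZB job, so work on one job no longer blocks every other. A sketch of the per-object pattern under assumed names (class and attributes are illustrative, not SABnzbd's real API):

```python
import threading


class Nzo:
    """Each job carries its own re-entrant lock instead of sharing a
    global one."""

    def __init__(self, name: str):
        self.name = name
        self.lock = threading.RLock()
        self.try_list = []

    def register_server_tried(self, server: str) -> None:
        with self.lock:  # protects only this job, not the whole queue
            if server not in self.try_list:
                self.try_list.append(server)
```

An `RLock` keeps the pattern safe when a method holding the lock calls another method on the same object.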
mnightingale
220186299b Do not throttle flushing cache during shutdown (#3265)
* Add a deadline for flushing cache contents on shutdown and don't throttle

* Revert "Add a deadline for flushing cache contents on shutdown and don't throttle"

This reverts commit e405b4c4f4.

* Always flush the whole cache but don't sleep when shutting down
2026-01-06 16:58:39 +01:00
Safihre
ae30be382b Merge platform-specific memory functions into one 2026-01-06 13:38:06 +01:00
SABnzbd Automation
13b10fd9bb Update translatable texts
[skip ci]
2026-01-06 11:46:06 +00:00
mnightingale
d9bb544caf Always enqueue when file_done (#3263) 2026-01-06 12:45:25 +01:00
SABnzbd Automation
bf2080068c Update translatable texts
[skip ci]
2026-01-06 09:03:40 +00:00
mnightingale
b4e8c80bc9 Implement Direct Write (#3236)
* Implement direct write

* Support direct_write changes at runtime

* Check sparse support when download_dir changes

* Fixes to reverting to append mode and add tests

* Single write path, remove truncate, improve tests, add test for append mode with out of order direct writes

* assert expected nzf.assembler_next_index

* bytes_written_sequentially assertions

* Slim tests and mock load_article as a dictionary

* More robust bytes_written_sequentially

* Worked but guard Python -1 semantics

* os.path.getsize silly

* Add test with force followed by append to gaps

* Split flush_cache into its own function so the loop does not need to clear the article variable

* Fewer private functions

* Extract article cache limit for waiting constant

* Move option back to specials

* Use Status.DELETED for clarity

* Use nzo.lock in articlecache

* Document why assembler_next_index increments

* Remove duplicated code from write

* load_data formatting

* Create files with the same permissions as with open(...)

* Options are callable

* Fix crash if direct writing from cache but has been deleted

* Fix crash in next_index check via article cache

* Fix assembler waiting for register_article and cache waiting for assembler to write

* Simplify flush_cache loop and only log once per second

* Document why we would leave the assembler when forced at the first not tried article

* When skipped we can't increment the next_index

* Rename bytes_written_sequentially to sequential_offset improve comments and logic

* Don't need to check when the config changes, due to the runtime changes any failure during assembly will disable it

* Remove unused constant

* Improve append triggering based on contiguous bytes ready to write to file and add a trigger to direct write

* Throttle downloader threads when direct writing out of order

* Clear ready_bytes when removed from queue

* Rework check_assembler_levels sleeping to have a deadline, be based on the assembler's actual pending bytes, and on whether delaying could have any impact

* Always write first articles if filenames are checked

* Rename force to allow_non_contiguous so it is clearer what it means

* Article is required

* Tweak delay triggers

* Fix for possible dictionary changed size during iteration

* postproc only gets the nzo

* Rename constants and remove redundant calculation

* For safety just key by nzf_id

* Not redundant because capped at 500M

* Tweak a little more

* Only delay if assembler is busy

* Remove unused constant and rename the remaining one

* Calculate if direct write is allowed when cache limit changes

* Allow direct writes to bypass trigger

* Avoid race to requeue

* Break up the queuing logic so it's understandable
2026-01-06 10:03:00 +01:00
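
Direct Write, as the bullets above describe, assembles article data straight into the final file at its target offset instead of staging everything in the article cache. A minimal sketch of the mechanism, assuming nothing about SABnzbd's internals: pre-size the file (sparse on most filesystems), then write segments as they arrive, in any order.

```python
import os
import tempfile


def write_at(path: str, offset: int, data: bytes) -> None:
    # Open without truncating and write at the segment's final offset
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(data)


fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.truncate(1024)  # reserve the final size up front

write_at(path, 512, b"second-article")  # out-of-order arrival is fine
write_at(path, 0, b"first-article")
```

Tracking a `sequential_offset` (as one bullet renames it) then tells the assembler how many contiguous bytes from the start are already on disk.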
Safihre
33aa4f1199 Revert "OptionBool should return bool"
This reverts commit ecb36442d3.

It messes up the sabnzbd.ini reading/writing!
2026-01-05 15:58:54 +01:00
Safihre
ecb36442d3 OptionBool should return bool
Hmm why wasn't this the case?
2026-01-05 14:50:35 +01:00
Safihre
0bbe34242e Stop updating uvicorn branch 2026-01-05 09:51:55 +01:00
renovate[bot]
7c6abd9528 Update all dependencies (#3258)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-01-05 01:53:32 +00:00
Safihre
448c034f79 Simplify get_windows_memory 2026-01-03 22:03:26 +01:00
mnightingale
9d5cf9fc5b Fix binding outgoing ip test when ipv6 is available (#3257) 2026-01-02 15:53:48 +01:00
mnightingale
4f9d0fb7d4 Fix connection test failing when bad credentials are used (#3256) 2026-01-02 15:03:04 +01:00
mnightingale
240d5b4ff7 Fix failing downloader tests and make tests less fragile (#3254)
* Fix failing downloader tests and make tests less fragile

* Implement feedback

* Spelling and only sleep if necessary

* Grammar is hard
2026-01-02 15:02:34 +01:00
SABnzbd Automation
a2161ba89b Update translatable texts
[skip ci]
2026-01-02 11:50:29 +00:00
Safihre
68e193bf56 Use blocking writes instead of buffering (#3248) 2026-01-02 12:49:42 +01:00
SABnzbd Automation
b5dda7c52d Update translatable texts
[skip ci]
2025-12-30 08:24:04 +00:00
mnightingale
b6691003db Refactor RSS flow (#3247) 2025-12-30 09:23:20 +01:00
Safihre
ed655553c8 Pausing the queue with Force'd downloads doesn't let them download
Closes #3246
2025-12-29 12:37:44 +01:00
Safihre
316b96c653 Add context folder with documents for AI-tools 2025-12-29 12:12:58 +01:00
mnightingale
62401cba27 Simplify RSS rule evaluation (#3243) 2025-12-29 10:50:55 +01:00
renovate[bot]
3cabf44ce3 Update dependency pyinstaller-hooks-contrib to v2025.11 (#3244)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-29 08:55:14 +01:00
mnightingale
a637d218c4 Refactor preparing the rss config (#3242)
* Refactor preparing the rss config

* Defaults, bool, and iteration
2025-12-24 21:47:15 +01:00
Safihre
63c03b42a9 Update text files for 4.6.0Beta2 2025-12-22 22:00:02 +01:00
SABnzbd Automation
4539837fad Update translatable texts
[skip ci]
2025-12-22 20:53:43 +00:00
Safihre
a0cd48e3f5 Notify user if they run AMD64 version on ARM64 Windows machine
Closes #3235
2025-12-22 21:52:58 +01:00
Safihre
ceeb7cb162 Add Windows ARM64 binary 2025-12-22 21:25:17 +01:00
SABnzbd Automation
f9f4e1b028 Update translatable texts
[skip ci]
2025-12-22 15:40:48 +00:00
Safihre
6487944c6c Move Pipelining setting to Server-level 2025-12-22 16:38:46 +01:00
renovate[bot]
239fddf39c Update all dependencies (develop) (#3238)
* Update all dependencies

* Compare fakefs result after sorting

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Safihre <safihre@sabnzbd.org>
2025-12-22 12:45:00 +00:00
SABnzbd Automation
8ada8b2fd9 Update translatable texts
[skip ci]
2025-12-19 11:41:18 +00:00
Safihre
b19bd65495 Show error in case of failed NZB upload
Closes #3233
2025-12-19 12:40:34 +01:00
Safihre
e3ea5fdd64 Update appdata file with Flathub suggestions
@jcfp
2025-12-19 11:54:53 +01:00
Safihre
4fdb89701a Add release URL's to appdata 2025-12-19 11:42:36 +01:00
SABnzbd Automation
9165c4f304 Update translatable texts
[skip ci]
2025-12-18 20:10:14 +00:00
mnightingale
4152f0ba6a Increase max pipelining (#3234) 2025-12-18 20:09:24 +00:00
SABnzbd Automation
3eaab17739 Update translatable texts
[skip ci]
2025-12-16 09:04:29 +00:00
Safihre
578bfd083d Update text files 4.6.0 Beta 1 2025-12-16 10:03:41 +01:00
renovate[bot]
dd464456e4 Update all dependencies (#3231)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-15 02:08:17 +00:00
mnightingale
e7a0255359 Make behaviour after reset more robust (#3229)
* Make behaviour after reset more robust

* Remove use of hasattr and rename to generation

* I had a feeling this would be a circular reference

* Reset and increment generation under lock
2025-12-14 22:39:48 +01:00
mnightingale
2e1281d9e8 Fix nzb types (#3230) 2025-12-14 15:46:14 +01:00
SABnzbd Automation
efecefdd3b Update translatable texts
[skip ci]
2025-12-09 20:22:55 +00:00
Safihre
a91e718ef5 Split nzbstuff into separate files for Article, NzbFile and NzbObject (#3221) 2025-12-09 21:21:51 +01:00
mnightingale
b420975267 Fix read/write actions after reset_nw (#3223) 2025-12-09 19:39:49 +01:00
SABnzbd Automation
c4211df8dc Update translatable texts
[skip ci]
2025-12-08 21:37:36 +00:00
Safihre
e182707d3a Update text files for 4.6.0Alpha2 2025-12-08 22:36:51 +01:00
Safihre
05cbd9d7c4 Correct process_nzb_only_download and add tests 2025-12-08 11:42:22 +01:00
Safihre
6e8683349f Keep NZB name prefix when processing multiple NZBs
Closes #3217
2025-12-08 10:27:16 +01:00
Safihre
adb4816552 Update to Python 3.14.2 2025-12-08 10:27:16 +01:00
renovate[bot]
3914290c11 Update all dependencies (#3219)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-08 01:00:54 +00:00
Safihre
f76bf55b4a Handle aborted Direct Unpack better
Closes #3212
2025-12-05 16:18:17 +01:00
SABnzbd Automation
1cde764336 Update translatable texts
[skip ci]
2025-12-05 12:34:16 +00:00
mnightingale
44d94226ec Pipelining and performance optimisations (#3199)
* Pipelining and performance optimisations

* Refactor to remove handle_remainder and add on_response callback to allow inspecting of nntp messages

* Logic fix if there are sockets but nothing to read/write

* Fix logic errors for failed article requests

* Fix logic for reconfiguring servers

* Add guard_restart callback to pipelining_requests

* Fix article download stats

* Fix current article request shown via api

* Removal of DecodingStatus

* Fix circular reference

* Cleanup imports

* Handle reset_nw and hard_reset for inflight requests

* Improve __request_article behaviour using discard helper

* Article should be None here (before auth) but just in case

* Remove command_queue_condition, unnecessary with the pull rather than push queue system

* During reset discard any data received prior to sending quit request

* Circular references again

* Revert to using bytearray

* Revert "During reset discard any data received prior to sending quit request"

This reverts commit ed522e3e80.

* Simpler interaction with sabctools

* Temporarily use the sabctools streaming decoder branch

* Fix most uu tests

* Reduce maximum pipelining requests

* Fix the squiggly line

* Remove some LOG_ALL debug code

* Make get_articles return consistent (None) - it now populates the server deque

* Reduce NNTP_BUFFER_SIZE

* Rename PIPELINING_REQUESTS to DEF_PIPELINING_REQUESTS

* A little refactoring

* Reduce default pipelining until it is dynamic

* Use BoundedSemaphore and fix the unacquired release

* Use crc from sabctools for uu and make filename logic consistent with yenc

* Use sabctools 9.0.0

* Fix Check Before Download

* Move lock to NzbFile

* Use sabctools 9.1.0

* Minor change

* Fix 430 on check before download

* Update sabnews to work reliably with pipelining

* Minor tidy up

* Why does only Linux complain about this

* Leave this as it was

* Remove unused import

* Compare enum by identity

* Remove command_queue and just prepare a single request
Check if it should be sent and discard when paused

* Kick-start idle connections

* Modify events sockets are monitored for
2025-12-05 13:33:35 +01:00
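
NNTP pipelining, the core of this commit, keeps several article requests in flight per connection instead of paying one network round-trip per article. A toy model of the windowed, pull-based queue the bullets describe — the constant and method names echo the commit messages but are not SABnzbd's real API:

```python
from collections import deque

DEF_PIPELINING_REQUESTS = 10  # max requests in flight per connection


class PipelinedConnection:
    def __init__(self, send):
        self.send = send          # callable that writes one NNTP command
        self.inflight = deque()   # requests sent but not yet answered

    def fill(self, articles: deque) -> None:
        # Pull model: the connection takes work until its window is full,
        # rather than having work pushed at it
        while articles and len(self.inflight) < DEF_PIPELINING_REQUESTS:
            article = articles.popleft()
            self.send("BODY <%s>\r\n" % article)
            self.inflight.append(article)

    def on_response(self, body):
        # NNTP answers commands in the order they were sent
        return self.inflight.popleft(), body
```

Because responses arrive in send order, matching them back to requests is a simple FIFO pop; on high-latency links the window hides the round-trip time almost entirely.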
Safihre
e8e8fff5bf Prevent filepath creation before first article is processed (#3215) 2025-12-05 13:18:27 +01:00
SABnzbd Automation
1b04e07d40 Update translatable texts
[skip ci]
2025-12-04 14:14:01 +00:00
Safihre
54db889f05 Update sfv help text
Closes #3214 and #3213
2025-12-04 15:13:07 +01:00
Safihre
777d279267 Only clear work-flag for post processing when needed 2025-12-01 16:40:59 +01:00
Safihre
75be6b5850 Use Events to handle the Post Processing queue
See #3209
2025-12-01 15:28:05 +01:00
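
The commit above replaces polling with a `threading.Event`: the post-processing worker sleeps until it is woken instead of waking on a timer. A simplified sketch with invented names:

```python
import queue
import threading

work_event = threading.Event()
jobs = queue.Queue()
processed = []


def worker():
    while True:
        work_event.wait()        # sleep until a job is queued
        work_event.clear()
        while not jobs.empty():
            job = jobs.get()
            if job is None:      # sentinel: shut down
                return
            processed.append(job)


thread = threading.Thread(target=worker)
thread.start()
jobs.put("job-1")
jobs.put(None)
work_event.set()                 # one wake-up drains the whole queue
thread.join()
```

Compared with a sleep-and-poll loop, the worker reacts immediately when work arrives and consumes no CPU while idle.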
Safihre
a4657e2bd3 Correct rar-version logging line 2025-12-01 11:36:10 +01:00
renovate[bot]
095b48ca47 Update all dependencies (#3210)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-01 01:32:06 +00:00
renovate[bot]
d459f69113 Update all dependencies (#3204)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-11-24 01:49:05 +00:00
mnightingale
2ecdd0b940 Optimise bpsmeter to avoid repeat lookups and try/except (#3203) 2025-11-22 14:46:21 +01:00
Safihre
73a4ad50e5 Allow longer build timeout for Snap arm64 build 2025-11-21 12:02:41 +01:00
Safihre
9b59e24961 Lower macOS build version to support older clients 2025-11-21 11:50:45 +01:00
Safihre
27e164763e Remove unused imports and shorten build timeouts 2025-11-21 11:29:53 +01:00
Safihre
eb544d85c7 Update text files for 4.6.0Alpha1 2025-11-21 11:02:24 +01:00
Safihre
ad85a241df Enable verify_xff_header by default 2025-11-21 10:12:47 +01:00
Safihre
e4d8642b4f Correct mobile layout if Full Width is enabled 2025-11-21 10:12:19 +01:00
Safihre
77b35e7904 Re-enable all Python versions for CI tests 2025-11-21 10:05:01 +01:00
Safihre
f8a0b3db52 Remove hostname resolution in get_webhost
#3131
2025-11-21 10:00:12 +01:00
Safihre
9c8b26ab4e Use new removeprefix and removesuffix 2025-11-21 10:00:11 +01:00
Safihre
67a5a552fd Add missing typing hints to several files 2025-11-21 10:00:10 +01:00
Safihre
80f57a2b9a Drop support for Python 3.8 2025-11-21 10:00:09 +01:00
Safihre
baaf7edc89 Windows tray icon disappears after Explorer restart
Closes #3200
2025-11-20 16:05:52 +01:00
Safihre
2d9f480af1 Only measure real write time during disk speed test 2025-11-20 15:34:34 +01:00
L-Cie
2266ac33aa Address low throughput reporting in diskspeed.py (#3197)
* Increased buffer and measurement time, changed file management and calculation of result

* Write smaller chunks first, abort if time exceeds

* Move urandom dump to diskspeedmeasure, reduced buffer size to 16MB and recycled buffer for more efficient resource usage during writes

* fixed formatting issues

* fixed formatting issues

* fixed formatting issues

---------

Co-authored-by: L-Cie <lcie@sturmklinge.ch>
2025-11-20 06:49:18 +01:00
renovate[bot]
1ba479398c Update all dependencies (develop) (#3195)
* Update all dependencies

* Pin tavern due to failure in newer versions

* Use SABnzbd User-agent in wiki test

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Safihre <safihre@sabnzbd.org>
2025-11-18 13:30:07 +01:00
Safihre
f71a81f7a8 Stop endless loop in edge-case of no STAT or HEAD support
Closes #3191
2025-11-13 13:33:11 +01:00
Safihre
1916c01bd9 Use general failure flag for pre-check result check
Closes #3190
2025-11-11 16:48:05 +01:00
Safihre
699d97965c Make Assembler queue configurable and auto increase on high bw-limit 2025-11-10 15:54:26 +01:00
renovate[bot]
399935ad21 Update all dependencies (develop) (#3186)
* Update all dependencies

* Allow older markdown for Python 3.9 and below

---------

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Safihre <safihre@sabnzbd.org>
2025-11-10 14:19:10 +00:00
Sander
0824fdc7c7 STAT: tell on which server an article is present (#3185)
* STAT: tell on which server an article is present

* Update logging format for article presence: old format
2025-11-10 11:08:31 +01:00
SABnzbd Automation
a3f8e89af8 Update translatable texts
[skip ci]
2025-11-05 21:37:39 +00:00
Safihre
f9f17731c8 Certificate validation should also be Strict in Wizard
Closes #3183
2025-11-05 22:36:49 +01:00
SABnzbd Automation
b052325ea7 Update translatable texts
[skip ci]
2025-11-03 13:29:35 +00:00
Safihre
daca14f97e Update Apprise texts 2025-11-03 14:28:47 +01:00
renovate[bot]
daa26bc1a6 Update dependency cheroot to v11.1.0 (#3180)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-11-03 06:43:18 +00:00
renovate[bot]
70d5134d28 Update all dependencies (#3174)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-27 01:46:12 +00:00
Safihre
a32458d9a9 Resolve another PyGithub deprecation 2025-10-24 13:22:07 +02:00
155 changed files with 7227 additions and 5436 deletions

View File

@@ -7,7 +7,7 @@
 "schedule": [
 "before 8am on Monday"
 ],
-"baseBranches": ["develop", "feature/uvicorn"],
+"baseBranches": ["develop"],
 "pip_requirements": {
 "fileMatch": [
 "requirements.txt",
@@ -23,7 +23,8 @@
 "jaraco.collections",
 "sabctools",
 "paho-mqtt",
-"werkzeug"
+"werkzeug",
+"tavern"
 ],
 "packageRules": [
 {

View File

@@ -8,16 +8,26 @@ env:
 jobs:
 build_windows:
-name: Build Windows binary
-runs-on: windows-2022
-timeout-minutes: 30
+name: Build Windows binary (${{ matrix.architecture }})
+strategy:
+fail-fast: false
+matrix:
+include:
+- architecture: x64
+runs-on: windows-2022
+- architecture: arm64
+runs-on: windows-11-arm
+runs-on: ${{ matrix.runs-on }}
+timeout-minutes: 15
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 - name: Set up Python
 uses: actions/setup-python@v6
 with:
 python-version: "3.14"
-architecture: "x64"
+architecture: ${{ matrix.architecture }}
 cache: pip
 cache-dependency-path: "**/requirements.txt"
 - name: Install Python dependencies
@@ -31,13 +41,13 @@ jobs:
 id: windows_binary
 run: python builder/package.py binary
 - name: Upload Windows standalone binary (unsigned)
-uses: actions/upload-artifact@v4
+uses: actions/upload-artifact@v6
 id: upload-unsigned-binary
 with:
-path: "*-win64-bin.zip"
-name: Windows standalone binary
+path: "*-win*-bin.zip"
+name: Windows standalone binary (${{ matrix.architecture }})
 - name: Sign Windows standalone binary
-uses: signpath/github-action-submit-signing-request@v1
+uses: signpath/github-action-submit-signing-request@v2
 if: contains(github.ref, 'refs/tags/')
 with:
 api-token: ${{ secrets.SIGNPATH_API_TOKEN }}
@@ -49,22 +59,24 @@
 wait-for-completion: true
 output-artifact-directory: "signed"
 - name: Upload Windows standalone binary (signed)
-uses: actions/upload-artifact@v4
+uses: actions/upload-artifact@v6
 if: contains(github.ref, 'refs/tags/')
 with:
-name: Windows standalone binary (signed)
+name: Windows standalone binary (${{ matrix.architecture }}, signed)
 path: "signed"
 - name: Build Windows installer
+if: matrix.architecture == 'x64'
 run: python builder/package.py installer
 - name: Upload Windows installer
-uses: actions/upload-artifact@v4
+if: matrix.architecture == 'x64'
+uses: actions/upload-artifact@v6
 id: upload-unsigned-installer
 with:
 path: "*-win-setup.exe"
-name: Windows installer
+name: Windows installer (${{ matrix.architecture }})
 - name: Sign Windows installer
-uses: signpath/github-action-submit-signing-request@v1
-if: contains(github.ref, 'refs/tags/')
+if: matrix.architecture == 'x64' && contains(github.ref, 'refs/tags/')
+uses: signpath/github-action-submit-signing-request@v2
 with:
 api-token: ${{ secrets.SIGNPATH_API_TOKEN }}
 organization-id: ${{ secrets.SIGNPATH_ORG_ID }}
@@ -75,27 +87,27 @@
 wait-for-completion: true
 output-artifact-directory: "signed"
 - name: Upload Windows installer (signed)
-if: contains(github.ref, 'refs/tags/')
-uses: actions/upload-artifact@v4
+if: matrix.architecture == 'x64' && contains(github.ref, 'refs/tags/')
+uses: actions/upload-artifact@v6
 with:
-name: Windows installer (signed)
+name: Windows installer (${{ matrix.architecture }}, signed)
 path: "signed/*-win-setup.exe"
 build_macos:
 name: Build macOS binary
 runs-on: macos-14
-timeout-minutes: 30
+timeout-minutes: 15
 env:
 # We need the official Python, because the GA ones only support newer macOS versions
 # The deployment target is picked up by the Python build tools automatically
 # If updated, make sure to also set LSMinimumSystemVersion in SABnzbd.spec
-PYTHON_VERSION: "3.14.0"
+PYTHON_VERSION: "3.14.2"
 MACOSX_DEPLOYMENT_TARGET: "10.15"
 # We need to force compile for universal2 support
 CFLAGS: -arch x86_64 -arch arm64
 ARCHFLAGS: -arch x86_64 -arch arm64
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 - name: Set up Python
 # Only use this for the caching of pip packages!
 uses: actions/setup-python@v6
@@ -105,7 +117,7 @@
 cache-dependency-path: "**/requirements.txt"
 - name: Cache Python download
 id: cache-python-download
-uses: actions/cache@v4
+uses: actions/cache@v5
 with:
 path: ~/python.pkg
 key: cache-macOS-Python-${{ env.PYTHON_VERSION }}
@@ -140,7 +152,7 @@
 # Run this on macOS so the line endings are correct by default
 run: python builder/package.py source
 - name: Upload source distribution
-uses: actions/upload-artifact@v4
+uses: actions/upload-artifact@v6
 with:
 path: "*-src.tar.gz"
 name: Source distribution
@@ -153,7 +165,7 @@
 python3 builder/package.py app
 python3 builder/make_dmg.py
 - name: Upload macOS binary
-uses: actions/upload-artifact@v4
+uses: actions/upload-artifact@v6
 with:
 path: "*-macos.dmg"
 name: macOS binary
@@ -167,14 +179,14 @@
 matrix:
 include:
 - os: ubuntu-latest
-linux_arch: amd64
+linux_arch: x64
 - os: ubuntu-24.04-arm
 linux_arch: arm64
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 - name: Cache par2cmdline-turbo tarball
-uses: actions/cache@v4
+uses: actions/cache@v5
 id: cache-par2cmdline
 # Clearing the cache in case of new version requires manual clearing in GitHub!
 with:
@@ -196,7 +208,7 @@
 timeout 10s snap run sabnzbd --help || true
 sudo snap remove sabnzbd
 - name: Upload snap
-uses: actions/upload-artifact@v4
+uses: actions/upload-artifact@v6
 with:
 name: Snap package (${{ matrix.linux_arch }})
 path: ${{ steps.snapcraft.outputs.snap }}
@@ -215,7 +227,7 @@
 runs-on: ubuntu-latest
 needs: [build_windows, build_macos]
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 - name: Set up Python
 uses: actions/setup-python@v6
 with:
@@ -223,15 +235,15 @@
 cache: pip
 cache-dependency-path: "builder/release-requirements.txt"
 - name: Download Source distribution artifact
-uses: actions/download-artifact@v5
+uses: actions/download-artifact@v7
 with:
 name: Source distribution
 - name: Download macOS artifact
-uses: actions/download-artifact@v5
+uses: actions/download-artifact@v7
 with:
 name: macOS binary
 - name: Download Windows artifacts
-uses: actions/download-artifact@v5
+uses: actions/download-artifact@v7
 with:
 pattern: ${{ (contains(github.ref, 'refs/tags/')) && '*signed*' || '*Windows*' }}
 merge-multiple: true

View File

@@ -7,20 +7,20 @@ jobs:
 name: Black Code Formatter
 runs-on: ubuntu-latest
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 - name: Black Code Formatter
 uses: lgeiger/black-action@master
 with:
+# Tools folder excluded for now due to https://github.com/psf/black/issues/4963
 args: >
 SABnzbd.py
 sabnzbd
 scripts
-tools
 builder
+builder/SABnzbd.spec
 tests
 --line-length=120
---target-version=py38
+--target-version=py39
 --check
 --diff
@@ -31,19 +31,19 @@
 strategy:
 fail-fast: false
 matrix:
-python-version: ["3.9", "3.10", "3.11", "3.12", "3.13", "3.14"]
+python-version: [ "3.9", "3.10", "3.11", "3.12", "3.13", "3.14" ]
 name: ["Linux"]
 os: [ubuntu-latest]
 include:
 - name: macOS
-os: macos-13
+os: macos-latest
 python-version: "3.14"
 - name: Windows
 os: windows-2022
 python-version: "3.14"
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 - name: Set up Python ${{ matrix.python-version }}
 uses: actions/setup-python@v6
 with:

View File

@@ -26,7 +26,7 @@
 if: github.repository_owner == 'sabnzbd'
 runs-on: ubuntu-latest
 steps:
-- uses: dessant/lock-threads@v5
+- uses: dessant/lock-threads@v6
 with:
 log-output: true
 issue-inactive-days: 60

View File

@@ -12,7 +12,7 @@
 env:
 TX_TOKEN: ${{ secrets.TX_TOKEN }}
 steps:
-- uses: actions/checkout@v5
+- uses: actions/checkout@v6
 with:
 token: ${{ secrets.AUTOMATION_GITHUB_TOKEN }}
 - name: Generate translatable texts
@@ -30,7 +30,7 @@
 run: |
 python3 tools/make_mo.py
 - name: Push translatable and translated texts back to repo
-uses: stefanzweifel/git-auto-commit-action@v7.0.0
+uses: stefanzweifel/git-auto-commit-action@v7.1.0
 if: env.TX_TOKEN
 with:
 commit_message: |

View File

@@ -52,7 +52,7 @@ Specific guides to install from source are available for Windows and macOS:
 https://sabnzbd.org/wiki/installation/install-macos
 https://sabnzbd.org/wiki/installation/install-from-source-windows
-Only Python 3.8 and above is supported.
+Only Python 3.9 and above is supported.
 On Linux systems you need to install:
 par2 unrar python3-setuptools python3-pip

View File

@@ -16,7 +16,7 @@ If you want to know more you can head over to our website: https://sabnzbd.org.
 SABnzbd has a few dependencies you'll need before you can get running. If you've previously run SABnzbd from one of the various Linux packages, then you likely already have all the needed dependencies. If not, here's what you're looking for:
-- `python` (Python 3.8 and above, often called `python3`)
+- `python` (Python 3.9 and above, often called `python3`)
 - Python modules listed in `requirements.txt`. Install with `python3 -m pip install -r requirements.txt -U`
 - `par2` (Multi-threaded par2 installation guide can be found [here](https://sabnzbd.org/wiki/installation/multicore-par2))
 - `unrar` (make sure you get the "official" non-free version of unrar)

View File

@@ -1,95 +1,45 @@
-Release Notes - SABnzbd 4.5.5
+Release Notes - SABnzbd 5.0.0 Beta 1
 =========================================================
-## Bug fixes and changes in 4.5.5
+This is the first beta release of version 5.0.
-* macOS: Failed to start on versions of macOS older than 11.
-Python 3.14 dropped support for macOS 10.13 and 10.14.
-Because of that macOS 10.15 is required to run 4.5.5.
+Due to several fundamental changes we decided to
+not just call this 4.6 but promote it to 5.0!
-## Bug fixes and changes in 4.5.4
+## New features in 5.0.0
+### New Features
+* History details now includes option to mark job as `Completed`.
+* `Quota` notifications available for all notification services.
+- Sends alerts at 75%, 90%, and 100% quota usage.
+* Multi-Operations now supports Move to Top/Bottom.
+* New `outgoing_nntp_ip` option to bind outgoing NNTP connections to specific IP address.
+* Added support for NNTP Pipelining which eliminates idle waiting between
+requests, significantly improving speeds on high-latency connections.
+Read more here: https://sabnzbd.org/wiki/advanced/nntp-pipelining
+* Implemented Direct Write to optimize assembly of downloaded files.
+Read more here: https://sabnzbd.org/wiki/advanced/direct-write
+* Complete redesign of article cache.
+* Improved disk speed measurement in Status window.
+* Enable `verify_xff_header` by default.
+* Reduce delays between jobs during post-processing.
+* If a download only has `.nzb` files inside, the new downloads
+will include the name of the original download.
+* No longer show tracebacks in the browser, only in the logs.
+* Dropped support for Python 3.8.
+* Windows: Added Windows ARM (portable) release.
+### Improvements
+* Setup wizard now requires successful Server Test before proceeding.
+* Anime episode notation `S04 - 10` now supported for Sorting and Duplicate Detection.
+* Multi-Operations: Play/Resume button unselects on second click for better usability.
+* Unrar now handles renaming of invalid characters on Windows filesystem.
+* Switched from vendored `sabnzbd.rarfile` module to `rarfile>=4.2`.
+* Warning displayed when removing all Orphaned jobs (clears Temporary Download folder).
+## Bug fixes since 4.5.0
+### Bug Fixes
+* Active connections counter in Status window now updates correctly.
+* Job setting changes during URL-grabbing no longer ignored.
+* Incomplete `.par2` file parsing no longer leaves files behind.
+* `Local IPv4 address` now detectable when using Socks5 proxy.
+* Server configuration changes no longer show `Failure` message during page reload.
+* `Check before download` could get stuck or fail to reject.
+* No error was shown in case NZB upload failed.
+* Correct mobile layout if `Full Width` is enabled.
+* Aborted Direct Unpack could result in no files being unpacked.
+* Sorting of files inside jobs was inconsistent.
+* Windows: Tray icon disappears after Explorer restart.
+* macOS: Slow to start on some network setups.
+### Platform-Specific
+* Linux: `Make Windows compatible` automatically enabled when needed.
+* Windows: Executables are now signed using SignPath Foundation certificate.
+* Windows: Can now start SABnzbd directly from installer.
+* Windows and macOS: Binaries now use Python 3.14.
-## Bug fixes and changes in 4.5.3
-* Remember if `Permanently delete` was previously checked.
-* All available IP-addresses will be included when selecting the fastest.
* Pre-queue script rejected NZBs were sometimes reported as `URL Fetching failed`.
* RSS `Next scan` time was not adjusted after manual `Read All Feeds Now`.
* Prevent renaming of `.cbr` files during verification.
* If `--disable-file-log` was enabled, `Show Logging` would crash.
* API: Added `time_added`, timestamp of when the job was added to the queue.
* API: History output could contain duplicate items.
* Snap: Updated packages and changed build process for reliability.
* macOS: Repair would fail on macOS 10.13 High Sierra.
* Windows: Unable to start on Windows 8.
* Windows: Updated Unrar to 7.13, which resolves CVE-2025-8088.
## Bug fixes and changes in 4.5.2
* Added Tab and Shift+Tab navigation to move between rename fields in queue.
* Invalid cookies of other services could result in errors.
* Internet Bandwidth test could be stuck in infinite loop.
* RSS readout did not ignore torrent alternatives.
* Prowl and Pushover settings did not load correctly.
* Renamed `osx` to `macos` internally.
* API: Removed `B` post-fix from `quota` and `left_quota` fields in `queue`.
* Windows: Support more languages in the installer.
* Windows and macOS: Updated par2cmdline-turbo to 1.3.0 and Unrar to 7.12.
## Bug fixes and changes in 4.5.1
* Correct platform detection on Linux.
* The `From SxxEyy` RSS filters did not always work.
* Windows and macOS: Update Unrar to 7.11.
## New features in 4.5.0
* Improved failure detection by downloading additional par2 files right away.
* Added more diagnostic information about the system.
* Use XFF headers for login validation if `verify_xff_header` is enabled.
* Added Turkish translation (by @cardpuncher).
* Added `unrar_parameters` option to supply custom Unrar parameters.
* Windows: Removed MultiPar support.
* Windows and macOS: Updated Python to 3.13.2, 7zip to 24.09,
Unrar to 7.10 and par2cmdline-turbo to 1.2.0.
## Bug fixes since 4.4.0
* Handle filenames that exceed maximum filesystem lengths.
* Directly decompress gzip responses when retrieving NZB's.
## Upgrade notices
* Direct upgrade supported from version 3.0.0 and newer.
* Older versions require performing a `Queue repair` after upgrading.
* You can directly upgrade from version 3.0.0 and newer.
* Upgrading from older versions will require performing a `Queue repair`.
* Downgrading from version 4.2.0 or newer to 3.7.2 or older will require
performing a `Queue repair` due to changes in the internal data format.
## Known problems and solutions


@@ -19,8 +19,8 @@ import sys
# Trick to show a better message on older Python
# releases that don't support walrus operator
if Python_38_is_required_to_run_SABnzbd := sys.hexversion < 0x03080000:
print("Sorry, requires Python 3.8 or above")
if Python_39_is_required_to_run_SABnzbd := sys.hexversion < 0x03090000:
print("Sorry, requires Python 3.9 or above")
print("You can read more at: https://sabnzbd.org/wiki/installation/install-off-modules")
sys.exit(1)
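The `sys.hexversion` comparison above works because CPython packs the version tuple into a single integer (major, minor, micro in the top three bytes), so a plain `<` test against `0x03090000` catches any interpreter older than 3.9. A small decoder, written here purely for illustration:

```python
import sys

def decode_hexversion(h: int) -> tuple[int, int, int]:
    """Unpack a sys.hexversion-style integer into (major, minor, micro).
    Helper name is ours, not part of SABnzbd."""
    return (h >> 24) & 0xFF, (h >> 16) & 0xFF, (h >> 8) & 0xFF

# 0x03090000 sorts below every real 3.9.x build and above every 3.8.x one,
# so the startup check rejects exactly the interpreters that lack 3.9 features.
assert decode_hexversion(0x03090000) == (3, 9, 0)
assert (sys.hexversion >= 0x03090000) == (sys.version_info[:2] >= (3, 9))
```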
@@ -40,7 +40,7 @@ import re
import gc
import threading
import http.cookies
from typing import List, Dict, Any
from typing import Any
try:
import sabctools
@@ -142,7 +142,7 @@ class GUIHandler(logging.Handler):
"""Initializes the handler"""
logging.Handler.__init__(self)
self._size: int = size
self.store: List[Dict[str, Any]] = []
self.store: list[dict[str, Any]] = []
def emit(self, record: logging.LogRecord):
"""Emit a record by adding it to our private queue"""
@@ -236,9 +236,7 @@ def print_help():
def print_version():
print(
(
"""
print(("""
%s-%s
(C) Copyright 2007-2025 by The SABnzbd-Team (sabnzbd.org)
@@ -247,10 +245,7 @@ This is free software, and you are welcome to redistribute it
under certain conditions. It is licensed under the
GNU GENERAL PUBLIC LICENSE Version 2 or (at your option) any later version.
"""
% (sabnzbd.MY_NAME, sabnzbd.__version__)
)
)
""" % (sabnzbd.MY_NAME, sabnzbd.__version__)))
def daemonize():
@@ -540,21 +535,19 @@ def get_webhost(web_host, web_port, https_port):
# If only APIPA's or IPV6 are found, fall back to localhost
ipv4 = ipv6 = False
localhost = hostip = "localhost"
try:
info = socket.getaddrinfo(socket.gethostname(), None)
# Valid user defined name?
info = socket.getaddrinfo(web_host, None)
except socket.error:
# Hostname does not resolve
if not is_localhost(web_host):
web_host = "0.0.0.0"
try:
# Valid user defined name?
info = socket.getaddrinfo(web_host, None)
info = socket.getaddrinfo(localhost, None)
except socket.error:
if not is_localhost(web_host):
web_host = "0.0.0.0"
try:
info = socket.getaddrinfo(localhost, None)
except socket.error:
info = socket.getaddrinfo("127.0.0.1", None)
localhost = "127.0.0.1"
info = socket.getaddrinfo("127.0.0.1", None)
localhost = "127.0.0.1"
for item in info:
ip = str(item[4][0])
if ip.startswith("169.254."):
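The reworked `get_webhost` hunk above tries the user-supplied host first, then falls back to `0.0.0.0` plus `localhost`, and finally to `127.0.0.1`. A simplified sketch of that fallback chain; the injectable `resolver` parameter and the inline localhost check are ours (the real code uses an `is_localhost` helper and mutates more state):

```python
import socket

def resolve_webhost(web_host: str, resolver=socket.getaddrinfo):
    """Return (web_host, addrinfo) using the fallback order sketched above."""
    try:
        return web_host, resolver(web_host, None)
    except socket.error:
        # Hostname does not resolve: serve on all interfaces unless it
        # was already a localhost-style name
        if web_host not in ("localhost", "127.0.0.1", "::1"):
            web_host = "0.0.0.0"
        try:
            return web_host, resolver("localhost", None)
        except socket.error:
            return web_host, resolver("127.0.0.1", None)

# Demo with a stub resolver so no real DNS lookup happens:
def stub(host, port):
    if host.endswith(".invalid"):
        raise socket.error("name does not resolve")
    return [(None, None, None, "", (host, 0))]

host, info = resolve_webhost("myhost.invalid", resolver=stub)
# host falls back to "0.0.0.0"; info is the addrinfo for "localhost"
```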
@@ -872,7 +865,7 @@ def main():
elif opt in ("-t", "--templates"):
web_dir = arg
elif opt in ("-s", "--server"):
(web_host, web_port) = split_host(arg)
web_host, web_port = split_host(arg)
elif opt in ("-n", "--nobrowser"):
autobrowser = False
elif opt in ("-b", "--browser"):
@@ -1282,7 +1275,6 @@ def main():
"tools.encode.on": True,
"tools.gzip.on": True,
"tools.gzip.mime_types": mime_gzip,
"request.show_tracebacks": True,
"error_page.401": sabnzbd.panic.error_page_401,
"error_page.404": sabnzbd.panic.error_page_404,
}


@@ -16,6 +16,7 @@
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import os
import platform
import re
# Constants
@@ -43,11 +44,17 @@ RELEASE_VERSION_BASE = f"{RELEASE_VERSION_TUPLE[0]}.{RELEASE_VERSION_TUPLE[1]}.{
RELEASE_NAME = "SABnzbd-%s" % RELEASE_VERSION
RELEASE_TITLE = "SABnzbd %s" % RELEASE_VERSION
RELEASE_SRC = RELEASE_NAME + "-src.tar.gz"
RELEASE_BINARY = RELEASE_NAME + "-win64-bin.zip"
RELEASE_INSTALLER = RELEASE_NAME + "-win-setup.exe"
RELEASE_WIN_BIN_X64 = RELEASE_NAME + "-win64-bin.zip"
RELEASE_WIN_BIN_ARM64 = RELEASE_NAME + "-win-arm64-bin.zip"
RELEASE_WIN_INSTALLER = RELEASE_NAME + "-win-setup.exe"
RELEASE_MACOS = RELEASE_NAME + "-macos.dmg"
RELEASE_README = "README.mkd"
# Detect architecture
RELEASE_WIN_BIN = RELEASE_WIN_BIN_X64
if platform.machine() == "ARM64":
RELEASE_WIN_BIN = RELEASE_WIN_BIN_ARM64
# Used in package.py and SABnzbd.spec
EXTRA_FILES = [
RELEASE_README,

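The constants hunk above picks the ARM64 zip only when `platform.machine()` reports `"ARM64"`, defaulting to the x64 artifact otherwise. The same selection as a pure function, written here just for illustration (the real code assigns module-level constants instead):

```python
import platform

def windows_release_asset(release_name: str, machine: str) -> str:
    """Map a platform.machine() string to the Windows build artifact name."""
    if machine == "ARM64":
        return release_name + "-win-arm64-bin.zip"
    return release_name + "-win64-bin.zip"

# At build time the selection is driven by the current machine:
current_asset = windows_release_asset("SABnzbd-5.0.0Beta1", platform.machine())
```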

@@ -18,7 +18,6 @@
import os
from constants import RELEASE_VERSION
# We need to call dmgbuild from command-line, so here we can setup how
if __name__ == "__main__":
# Check for DMGBuild


@@ -28,7 +28,6 @@ import urllib.request
import urllib.error
import configobj
import packaging.version
from typing import List
from constants import (
RELEASE_VERSION,
@@ -36,8 +35,8 @@ from constants import (
VERSION_FILE,
RELEASE_README,
RELEASE_NAME,
RELEASE_BINARY,
RELEASE_INSTALLER,
RELEASE_WIN_BIN,
RELEASE_WIN_INSTALLER,
ON_GITHUB_ACTIONS,
RELEASE_THIS,
RELEASE_SRC,
@@ -70,7 +69,7 @@ def delete_files_glob(glob_pattern: str, allow_no_matches: bool = False):
raise FileNotFoundError(f"No files found that match '{glob_pattern}'")
def run_external_command(command: List[str], print_output: bool = True, **kwargs):
def run_external_command(command: list[str], print_output: bool = True, **kwargs):
"""Wrapper to ease the use of calling external programs"""
process = subprocess.Popen(command, text=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, **kwargs)
output, _ = process.communicate()
@@ -258,7 +257,7 @@ if __name__ == "__main__":
# Remove any leftovers
safe_remove(RELEASE_NAME)
safe_remove(RELEASE_BINARY)
safe_remove(RELEASE_WIN_BIN)
# Run PyInstaller and check output
shutil.copyfile("builder/SABnzbd.spec", "SABnzbd.spec")
@@ -276,8 +275,8 @@ if __name__ == "__main__":
test_sab_binary("dist/SABnzbd/SABnzbd.exe")
# Create the archive
run_external_command(["win/7zip/7za.exe", "a", RELEASE_BINARY, "SABnzbd"], cwd="dist")
shutil.move(f"dist/{RELEASE_BINARY}", RELEASE_BINARY)
run_external_command(["win/7zip/7za.exe", "a", RELEASE_WIN_BIN, "SABnzbd"], cwd="dist")
shutil.move(f"dist/{RELEASE_WIN_BIN}", RELEASE_WIN_BIN)
if "installer" in sys.argv:
# Check if we have the dist folder
@@ -285,10 +284,10 @@ if __name__ == "__main__":
raise FileNotFoundError("SABnzbd executable not found, run binary creation first")
# Check if we have a signed version
if os.path.exists(f"signed/{RELEASE_BINARY}"):
if os.path.exists(f"signed/{RELEASE_WIN_BIN}"):
print("Using signed version of SABnzbd binaries")
safe_remove("dist/SABnzbd")
run_external_command(["win/7zip/7za.exe", "x", "-odist", f"signed/{RELEASE_BINARY}"])
run_external_command(["win/7zip/7za.exe", "x", "-odist", f"signed/{RELEASE_WIN_BIN}"])
# Make sure it exists
if not os.path.exists("dist/SABnzbd/SABnzbd.exe"):
@@ -311,7 +310,7 @@ if __name__ == "__main__":
"/V3",
"/DSAB_VERSION=%s" % RELEASE_VERSION,
"/DSAB_VERSIONKEY=%s" % ".".join(map(str, RELEASE_VERSION_TUPLE)),
"/DSAB_FILE=%s" % RELEASE_INSTALLER,
"/DSAB_FILE=%s" % RELEASE_WIN_INSTALLER,
"NSIS_Installer.nsi.tmp",
]
)


@@ -29,8 +29,9 @@ from constants import (
RELEASE_VERSION_BASE,
PRERELEASE,
RELEASE_SRC,
RELEASE_BINARY,
RELEASE_INSTALLER,
RELEASE_WIN_BIN_X64,
RELEASE_WIN_BIN_ARM64,
RELEASE_WIN_INSTALLER,
RELEASE_MACOS,
RELEASE_README,
RELEASE_THIS,
@@ -42,8 +43,9 @@ from constants import (
# Verify we have all assets
files_to_check = (
RELEASE_SRC,
RELEASE_BINARY,
RELEASE_INSTALLER,
RELEASE_WIN_BIN_X64,
RELEASE_WIN_BIN_ARM64,
RELEASE_WIN_INSTALLER,
RELEASE_MACOS,
RELEASE_README,
)
@@ -112,7 +114,7 @@ if RELEASE_THIS and gh_token:
print("Removing existing asset %s " % gh_asset.name)
gh_asset.delete_asset()
# Upload the new one
print("Uploading %s to release %s" % (file_to_check, gh_release.title))
print("Uploading %s to release %s" % (file_to_check, gh_release.name))
gh_release.upload_asset(file_to_check)
# Check if we now have all files


@@ -1,19 +1,19 @@
# Basic build requirements
# Note that not all sub-dependencies are listed, but only ones we know could cause trouble
pyinstaller==6.16.0
packaging==25.0
pyinstaller-hooks-contrib==2025.9
altgraph==0.17.4
wrapt==2.0.0
setuptools==80.9.0
pyinstaller==6.18.0
packaging==26.0
pyinstaller-hooks-contrib==2026.0
altgraph==0.17.5
wrapt==2.0.1
setuptools==80.10.2
# For the Windows build
pefile==2024.8.26; sys_platform == 'win32'
pywin32-ctypes==0.2.3; sys_platform == 'win32'
# For the macOS build
dmgbuild==1.6.5; sys_platform == 'darwin'
mac-alias==2.2.2; sys_platform == 'darwin'
macholib==1.16.3; sys_platform == 'darwin'
ds-store==1.3.1; sys_platform == 'darwin'
PyNaCl==1.6.0; sys_platform == 'darwin'
dmgbuild==1.6.7; sys_platform == 'darwin'
mac-alias==2.2.3; sys_platform == 'darwin'
macholib==1.16.4; sys_platform == 'darwin'
ds-store==1.3.2; sys_platform == 'darwin'
PyNaCl==1.6.2; sys_platform == 'darwin'

context/Download-flow.md Normal file

@@ -0,0 +1,39 @@
## Download flow (Downloader + NewsWrapper)
1. **Job ingestion**
- NZBs arrive via UI/API/URL; `urlgrabber.py` fetches remote NZBs, `nzbparser.py` turns them into `NzbObject`s, and `nzbqueue.NzbQueue` stores ordered jobs with priorities and categories.
2. **Queue to articles**
- When servers need work, `NzbQueue.get_articles` (called from `Server.get_article` in `downloader.py`) hands out batches of `Article`s per server, respecting retention, priority, and forced/paused items.
3. **Downloader setup**
- `Downloader` thread loads server configs (`config.get_servers`), instantiates `Server` objects (per host/port/SSL/threads), and spawns `NewsWrapper` instances per configured connection.
- A `selectors.DefaultSelector` watches all sockets; `BPSMeter` tracks throughput and speed limits; timers manage server penalties/restarts.
4. **Connection establishment (NewsWrapper.init_connect → NNTP.connect)**
- `Server.request_addrinfo` resolves fastest address; `NewsWrapper` builds an `NNTP` socket, wraps SSL if needed, sets non-blocking, and registers with the selector.
- First server greeting (200/201) is queued; `finish_connect` drives the login handshake (`AUTHINFO USER/PASS`) and handles temporary (480) or permanent (400/502) errors.
5. **Request scheduling & pipelining**
- `write()` chooses the next article command (`STAT/HEAD` for precheck, `BODY` or `ARTICLE` otherwise).
- Concurrency is limited by `server.pipelining_requests`; commands are queued and sent with `sock.sendall`, so there is no local send buffer.
- Sockets stay registered for `EVENT_WRITE`: without write readiness events, a temporarily full kernel send buffer could stall queued commands when there is nothing to read, so WRITE interest is needed to resume sending promptly.
6. **Receiving data**
- Selector events route to `process_nw_read`; `NewsWrapper.read` pulls bytes (SSL optimized via sabctools), parses NNTP responses, and calls `on_response`.
- Successful BODY/ARTICLE (220/222) updates per-server stats; missing/500 variants toggle capability flags (BODY/STAT support).
7. **Decoding and caching**
- `Downloader.decode` hands responses to `decoder.decode`, which yEnc/UU decodes, CRC-checks, and stores payloads in `ArticleCache` (memory or disk spill).
- Articles with DMCA/bad data trigger retry on other servers until `max_art_tries` is exceeded.
8. **Assembly to files**
- `Assembler` worker consumes decoded pieces, writes to the target file, updates CRC, and cleans admin markers. It guards disk space (`diskspace_check`) and schedules direct unpack or PAR2 handling when files finish.
9. **Queue bookkeeping**
- `NzbQueue.register_article` records success/failure; completed files advance NZF/NZO state. If all files done, the job moves to post-processing (`PostProcessor.process`), which runs `newsunpack`, scripts, sorting, etc.
10. **Control & resilience**
- Pausing/resuming (`Downloader.pause/resume`), bandwidth limiting, and sleep tuning happen in the main loop.
- Errors/timeouts lead to `reset_nw` (close socket, return article, maybe penalize server). Optional servers can be temporarily disabled; required ones schedule resumes.
- Forced disconnect/shutdown drains sockets, refreshes DNS, and exits cleanly.
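Steps 5 and 6 above describe bounded pipelining: up to `server.pipelining_requests` commands are in flight per connection, and each response frees a slot. A toy model of that scheduling with a fake transport; the class and method names are illustrative, not SABnzbd's actual API:

```python
from collections import deque

class PipelinedConnection:
    """Keep up to `limit` commands in flight, refilling as responses arrive."""

    def __init__(self, send, limit: int):
        self.send = send          # callable that transmits one command
        self.limit = limit        # mirrors server.pipelining_requests
        self.pending = deque()    # queued commands not yet sent
        self.in_flight = 0        # sent commands awaiting a response

    def queue(self, command: str):
        self.pending.append(command)
        self._fill()

    def on_response(self):
        self.in_flight -= 1
        self._fill()              # a finished request frees a slot

    def _fill(self):
        while self.in_flight < self.limit and self.pending:
            self.send(self.pending.popleft())
            self.in_flight += 1

sent = []
conn = PipelinedConnection(sent.append, limit=2)
for msgid in ("a", "b", "c"):
    conn.queue(f"BODY <{msgid}>\r\n")
# Only two commands go out before any response arrives; the third is
# released by conn.on_response()
```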

context/Repo-layout.md Normal file

@@ -0,0 +1,32 @@
## Repo layout
- Entry points & metadata
- `SABnzbd.py`: starts the app.
- `README.md` / `README.mkd`: release notes and overview.
- `requirements.txt`: runtime deps.
- Core application package `sabnzbd/`
- Download engine: `downloader.py` (main loop), `newswrapper.py` (NNTP connections), `urlgrabber.py`, `nzbqueue.py` (queue), `nzbparser.py` (parse NZB), `assembler.py` (writes decoded parts), `decoder.py` (yEnc/UU decode), `articlecache.py` (in-memory/on-disk cache).
- Post-processing: `newsunpack.py`, `postproc.py`, `directunpacker.py`, `sorting.py`, `deobfuscate_filenames.py`.
- Config/constants/utilities: `cfg.py`, `config.py`, `constants.py`, `misc.py`, `filesystem.py`, `encoding.py`, `lang.py`, `scheduler.py`, `notifier.py`, `emailer.py`, `rss.py`.
- UI plumbing: `interface.py`, `skintext.py`, `version.py`, platform helpers (`macosmenu.py`, `sabtray*.py`).
- Subpackages: `sabnzbd/nzb/` (NZB model objects), `sabnzbd/utils/` (helpers).
- Web interfaces & assets
- `interfaces/Glitter`, `interfaces/Config`, `interfaces/wizard`: HTML/JS/CSS skins.
- `icons/`: tray/web icons.
- `locale/`, `po/`, `tools/`: translation sources and helper scripts (`make_mo.py`, etc.).
- Testing & samples
- `tests/`: pytest suite plus `data/` fixtures and `test_utils/`.
- `scripts/`: sample post-processing hooks (`Sample-PostProc.*`).
- Packaging/build
- `builder/`: platform build scripts (DMG/EXE specs, `package.py`, `release.py`).
- Platform folders `win/`, `macos/`, `linux/`, `snap/`: installer or platform-specific assets.
- `admin/`, `builder/constants.py`, `licenses/`: release and licensing support files.
- Documentation
- Documentation website source is stored in the `sabnzbd.github.io` repo.
- That repo is typically checked out one level above this repo's root folder.
- Documentation is split per SABnzbd version, in the `wiki` folder.


@@ -187,7 +187,8 @@
<td><label for="apprise_enable"> $T('opt-apprise_enable')</label></td>
</tr>
</table>
<em>$T('explain-apprise_enable')</em><br>
<p>$T('explain-apprise_enable')</p>
<p><a href="https://appriseit.com/" target="_blank">Apprise documentation</a></p>
<p>$T('version'): ${apprise.__version__}</p>
$show_cat_box('apprise')
@@ -197,7 +198,7 @@
<div class="field-pair">
<label class="config" for="apprise_urls">$T('opt-apprise_urls')</label>
<input type="text" name="apprise_urls" id="apprise_urls" value="$apprise_urls" />
<span class="desc">$T('explain-apprise_urls'). <br>$T('readwiki')</span>
<span class="desc">$T('explain-apprise_urls')</span>
</div>
<div class="field-pair">
<span class="desc">$T('explain-apprise_extra_urls')</span>


@@ -117,6 +117,12 @@
<input type="checkbox" name="optional" id="optional" value="1" />
<span class="desc">$T('explain-optional')</span>
</div>
<div class="field-pair advanced-settings">
<label class="config" for="pipelining_requests">$T('srv-pipelining_requests')</label>
<input type="number" name="pipelining_requests" id="pipelining_requests" min="1" max="20" value="1" />
<span class="desc">$T('explain-pipelining_requests')<br>$T('readwiki')
<a href="https://sabnzbd.org/wiki/advanced/nntp-pipelining" target="_blank">https://sabnzbd.org/wiki/advanced/nntp-pipelining</a></span>
</div>
<div class="field-pair advanced-settings">
<label class="config" for="expire_date">$T('srv-expire_date')</label>
<input type="date" name="expire_date" id="expire_date" />
@@ -248,6 +254,12 @@
<input type="checkbox" name="optional" id="optional$cur" value="1" <!--#if int($server['optional']) != 0 then 'checked="checked"' else ""#--> />
<span class="desc">$T('explain-optional')</span>
</div>
<div class="field-pair advanced-settings">
<label class="config" for="pipelining_requests$cur">$T('srv-pipelining_requests')</label>
<input type="number" name="pipelining_requests" id="pipelining_requests$cur" value="$server['pipelining_requests']" min="1" max="20" required />
<span class="desc">$T('explain-pipelining_requests')<br>$T('readwiki')
<a href="https://sabnzbd.org/wiki/advanced/nntp-pipelining" target="_blank">https://sabnzbd.org/wiki/advanced/nntp-pipelining</a></span>
</div>
<div class="field-pair advanced-settings">
<label class="config" for="expire_date$cur">$T('srv-expire_date')</label>
<input type="date" name="expire_date" id="expire_date$cur" value="$server['expire_date']" />


@@ -6,8 +6,12 @@
<span class="glyphicon glyphicon-open"></span> $T('Glitter-notification-uploading') <span class="main-notification-box-file-count"></span>
</div>
<div class="main-notification-box-uploading-failed">
<span class="glyphicon glyphicon-exclamation-sign"></span> $T('Glitter-notification-upload-failed').replace('%s', '') <span class="main-notification-box-file-count"></span>
</div>
<div class="main-notification-box-queue-repair">
<span class="glyphicon glyphicon glyphicon-wrench"></span> $T('Glitter-repairQueue')
<span class="glyphicon glyphicon-wrench"></span> $T('Glitter-repairQueue')
</div>
<div class="main-notification-box-disconnect">


@@ -726,6 +726,9 @@ function ViewModel() {
$('#nzbname').val('')
$('.btn-file em').html(glitterTranslate.chooseFile + '&hellip;')
}
}).fail(function(xhr, status, error) {
// Update the uploading notification text to show error
showNotification('.main-notification-box-uploading-failed', 0, error)
});
}


@@ -69,6 +69,10 @@ legend,
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.2);
}
.main-notification-box-uploading-failed {
color: #F95151;
}
.container,
.modal-body,
.modal-footer {


@@ -7,6 +7,10 @@
padding-right: 8px;
}
.container-full-width .container {
width: 100%;
}
.main-navbar {
margin-top: 0;
padding: 0;


@@ -5,6 +5,10 @@
<metadata_license>MIT</metadata_license>
<name>SABnzbd</name>
<summary>Free and easy binary newsreader</summary>
<branding>
<color type="primary" scheme_preference="light">#e7e7e7</color>
<color type="primary" scheme_preference="dark">#444444</color>
</branding>
<description>
<p>
SABnzbd is a free and Open Source web-based binary newsreader,
@@ -17,6 +21,13 @@
and services that help automate the download process.
</p>
</description>
<keywords>
<keyword>usenet</keyword>
<keyword>nzb</keyword>
<keyword>download</keyword>
<keyword>newsreader</keyword>
<keyword>binary</keyword>
</keywords>
<categories>
<category>Network</category>
<category>FileTransfer</category>
@@ -24,33 +35,49 @@
<url type="homepage">https://sabnzbd.org</url>
<url type="bugtracker">https://github.com/sabnzbd/sabnzbd/issues</url>
<url type="vcs-browser">https://github.com/sabnzbd/sabnzbd</url>
<url type="contribute">https://github.com/sabnzbd/sabnzbd</url>
<url type="translate">https://sabnzbd.org/wiki/translate</url>
<url type="donation">https://sabnzbd.org/donate</url>
<url type="help">https://sabnzbd.org/wiki/</url>
<url type="faq">https://sabnzbd.org/wiki/faq</url>
<url type="contact">https://sabnzbd.org/live-chat.html</url>
<releases>
<release version="4.5.5" date="2025-10-24" type="stable"/>
<release version="4.5.4" date="2025-10-22" type="stable"/>
<release version="4.5.3" date="2025-08-25" type="stable"/>
<release version="4.5.2" date="2025-07-09" type="stable"/>
<release version="4.5.1" date="2025-04-11" type="stable"/>
<release version="4.5.0" date="2025-04-01" type="stable"/>
<release version="4.4.1" date="2024-12-23" type="stable"/>
<release version="4.4.0" date="2024-12-09" type="stable"/>
<release version="4.3.3" date="2024-08-01" type="stable"/>
<release version="4.3.2" date="2024-05-30" type="stable"/>
<release version="4.3.1" date="2024-05-03" type="stable"/>
<release version="4.3.0" date="2024-05-01" type="stable"/>
<release version="4.2.2" date="2024-02-01" type="stable"/>
<release version="4.2.1" date="2024-01-05" type="stable"/>
<release version="4.2.0" date="2024-01-03" type="stable"/>
<release version="4.1.0" date="2023-09-26" type="stable"/>
<release version="4.0.3" date="2023-06-16" type="stable"/>
<release version="4.0.2" date="2023-06-09" type="stable"/>
<release version="4.0.1" date="2023-05-01" type="stable"/>
<release version="4.0.0" date="2023-04-28" type="stable"/>
<release version="3.7.2" date="2023-02-05" type="stable"/>
<release version="5.0.0" date="2026-03-01" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/5.0.0</url>
</release>
<release version="4.5.5" date="2025-10-24" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.5.5</url>
</release>
<release version="4.5.4" date="2025-10-22" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.5.4</url>
</release>
<release version="4.5.3" date="2025-08-25" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.5.3</url>
</release>
<release version="4.5.2" date="2025-07-09" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.5.2</url>
</release>
<release version="4.5.1" date="2025-04-11" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.5.1</url>
</release>
<release version="4.5.0" date="2025-04-01" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.5.0</url>
</release>
<release version="4.4.1" date="2024-12-23" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.4.1</url>
</release>
<release version="4.4.0" date="2024-12-09" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.4.0</url>
</release>
<release version="4.3.3" date="2024-08-01" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.3.3</url>
</release>
<release version="4.3.2" date="2024-05-30" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.3.2</url>
</release>
<release version="4.3.1" date="2024-05-03" type="stable">
<url type="details">https://github.com/sabnzbd/sabnzbd/releases/tag/4.3.1</url>
</release>
</releases>
<launchable type="desktop-id">sabnzbd.desktop</launchable>
<provides>
@@ -73,11 +100,59 @@
<screenshots>
<screenshot type="default">
<image>https://sabnzbd.org/images/landing/screenshots/interface.png</image>
<caption>Web interface</caption>
<caption>Intuitive interface</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/night-mode.png</image>
<caption>Night mode</caption>
<caption>Also comes in Night-mode</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/add-nzb.png</image>
<caption>Add NZB's or use drag-and-drop!</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/phone-interface.png</image>
<caption>Scales to any screen size</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/history-details.png</image>
<caption>Easy overview of all history details</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/phone-extra.png</image>
<caption>Every option, on every screen size</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/file-lists.png</image>
<caption>Manage a job's individual files</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/set-speedlimit.png</image>
<caption>Easy speed limiting</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/set-options.png</image>
<caption>Quickly change settings</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/dashboard.png</image>
<caption>Easy system check</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/connections-overview.png</image>
<caption>See active connections</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/skin-settings.png</image>
<caption>Customize the interface</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/tabbed.png</image>
<caption>Tabbed-mode</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/set-custom-pause.png</image>
<caption>Specify any pause duration</caption>
</screenshot>
<screenshot>
<image>https://sabnzbd.org/images/landing/screenshots/config.png</image>


@@ -4,7 +4,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: team@sabnzbd.org\n"
"Language-Team: SABnzbd <team@sabnzbd.org>\n"


@@ -4,7 +4,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: team@sabnzbd.org\n"
"Language-Team: SABnzbd <team@sabnzbd.org>\n"
@@ -125,6 +125,11 @@ msgstr ""
msgid "Current umask (%o) might deny SABnzbd access to the files and folders it creates."
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Completed Download Folder %s is on FAT file system, limiting maximum file size to 4GB"
@@ -279,7 +284,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr ""
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr ""
@@ -508,11 +513,6 @@ msgstr ""
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr ""
@@ -530,11 +530,6 @@ msgstr ""
msgid "Connecting %s@%s failed, message=%s"
msgstr ""
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr ""
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr ""
@@ -680,6 +675,11 @@ msgstr ""
msgid "%s is not writable with special character filenames. This can cause problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr ""
@@ -894,7 +894,7 @@ msgid "Update Available!"
msgstr ""
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr ""
@@ -1127,6 +1127,16 @@ msgstr ""
msgid "left"
msgstr ""
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr ""
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr ""
@@ -1300,103 +1310,18 @@ msgstr ""
msgid "NZB added to queue"
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr ""
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr ""
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr ""
@@ -1638,6 +1563,14 @@ msgstr ""
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr ""
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr ""
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1664,14 +1597,6 @@ msgstr ""
msgid "RSS Feed %s was empty"
msgstr ""
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr ""
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr ""
@@ -3127,7 +3052,7 @@ msgid "Enable SFV-based checks"
msgstr ""
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgid "If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
@@ -3482,6 +3407,14 @@ msgstr ""
msgid "Enable"
msgstr ""
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid "Request multiple articles per connection without waiting for each response first.<br />This can improve download speeds, especially on connections with higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -3885,17 +3818,16 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid "Send notifications directly to any notification service you use.<br>For example: Slack, Discord, Telegram, or any service from over 100 supported services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid "Apprise defines service connection information using URLs.<br>Read the Apprise wiki how to define the URL for each service.<br>Use a comma and/or space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4190,6 +4122,11 @@ msgstr ""
msgid "Filename"
msgstr ""
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr ""
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr ""
@@ -4599,6 +4536,10 @@ msgstr ""
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr ""
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2023\n"
"Language-Team: Czech (https://app.transifex.com/sabnzbd/teams/111101/cs/)\n"
@@ -144,6 +144,11 @@ msgid ""
"creates."
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -316,7 +321,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "Neočekávaná přípona v rar souboru %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Přerušeno, nalezena neočekávaná připona"
@@ -560,11 +565,6 @@ msgstr ""
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Příliš mnoho spojení k serveru %s [%s]"
@@ -584,11 +584,6 @@ msgstr "Přihlášení k serveru %s se nezdařilo [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr ""
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Nejspíše chyba downloaderu"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Vypínání"
@@ -736,6 +731,11 @@ msgid ""
"problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr "Odmítnuto spojení z:"
@@ -965,7 +965,7 @@ msgid "Update Available!"
msgstr "Dostupná aktualizace!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "Nezdařilo se nahrát soubor: %s"
@@ -1207,6 +1207,16 @@ msgstr "Zkouším SFV ověření"
msgid "left"
msgstr ""
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Nejspíše chyba downloaderu"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Tento server nepovoluje SSL na tomto portu"
@@ -1388,103 +1398,18 @@ msgstr "Nelze nahrát %s, detekován porušený soubor"
msgid "NZB added to queue"
msgstr "NZB přidáno do fronty"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ignoruji duplikátní NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "Nezdařilo se duplikovat NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "Duplikátní NZB"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Prázdný NZB soubor %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr "Nechtěná přípona v souboru %s (%s)"
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Zrušeno, nelze dokončit"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Chyba při importu %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUPLIKÁT"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "ŠIFROVANÉ"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "PŘÍLIŠ VELKÝ"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "NEKOMPLETNÍ"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "NECHTĚNÝ"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "ČEKÁNÍ %s s"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "PROPAGUJI %s min"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Staženo do %s s průměrnou rychlostí %s B/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Stáří"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Pozastavuji duplikátní NZB \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problém s"
@@ -1729,6 +1654,14 @@ msgstr ""
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Prázdný RSS záznam nalezen (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Nekompatibilní kanál"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1755,14 +1688,6 @@ msgstr ""
msgid "RSS Feed %s was empty"
msgstr "RSS kanál %s byl prázdný"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Nekompatibilní kanál"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Prázdný RSS záznam nalezen (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Zobrazit rozhraní"
@@ -3287,7 +3212,8 @@ msgid "Enable SFV-based checks"
msgstr ""
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
@@ -3671,6 +3597,17 @@ msgstr ""
msgid "Enable"
msgstr ""
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4080,17 +4017,22 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4407,6 +4349,11 @@ msgstr ""
msgid "Filename"
msgstr ""
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Stáří"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr ""
@@ -4826,6 +4773,10 @@ msgstr ""
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Prázdný NZB soubor %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"


@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Danish (https://app.transifex.com/sabnzbd/teams/111101/da/)\n"
@@ -147,6 +147,11 @@ msgid ""
msgstr ""
"Aktuel umask (%o) kan nægte SABnzbd adgang til filer og mapper den opretter."
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -331,7 +336,7 @@ msgstr "I \"%s\" uønsket extension i RAR fil. Uønsket fil er \"%s\" "
msgid "Unwanted extension is in rar file %s"
msgstr "Uønsket extension i rar fil %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Afbrudt, uønsket extension fundet"
@@ -360,11 +365,11 @@ msgstr "Kvota"
#: sabnzbd/bpsmeter.py
msgid "Quota limit warning (%d%%)"
msgstr ""
msgstr "Advarsel om kvotegrænse (%d%%)"
#: sabnzbd/bpsmeter.py
msgid "Downloading resumed after quota reset"
msgstr ""
msgstr "Download genoptaget efter nulstilling af kvote"
#: sabnzbd/cfg.py, sabnzbd/interface.py
msgid "Incorrect parameter"
@@ -585,11 +590,6 @@ msgstr "Det lykkedes ikke at initialisere %s@%s med begrundelse %s"
msgid "Fatal error in Downloader"
msgstr "Alvorlig fejl i Downloader"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: Modtog ukendt statuskode %s for artikel %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Alt for mange forbindelser til serveren %s [%s]"
@@ -611,11 +611,6 @@ msgstr "Det lykkedes ikke at logge på serveren %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Forbindelse %s@%s mislykkedes, besked %s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Suspect fejl i downloader"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Påbegynder lukning af SABnzbd"
@@ -774,6 +769,11 @@ msgid ""
msgstr ""
"%s er ikke skrivbar med filnavne med specialtegn. Dette kan give problemer."
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr "Afviste forbindelse fra:"
@@ -1006,7 +1006,7 @@ msgid "Update Available!"
msgstr "Opdatering tilgængelig!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "Kunne ikke uploade fil: %s"
@@ -1248,6 +1248,16 @@ msgstr "Forsøger SFV verifikation"
msgid "left"
msgstr "tilbage"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: Modtog ukendt statuskode %s for artikel %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Suspect fejl i downloader"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Denne server tillader ikke SSL på denne port"
@@ -1432,103 +1442,18 @@ msgstr "Downloadnings fejl %s, ødelagt fil fundet"
msgid "NZB added to queue"
msgstr "NZB tilføjet i køen"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ignorerer identiske NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "Fejler dublet NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "Dublet NZB"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr "Ødelagt NZB fil %s, springer over (årsag=%s)"
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Tom NZB fil %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr "Før-kø script job markeret som mislykkedet"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Afbrudt, kan ikke afsluttes"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Det lykkedes ikke at importere %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUPLIKERE"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "KRYPTEREDE"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "FOR STOR"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "UFULDSTÆNDIG"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "UØNSKET"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "VENT %s sekunder"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "PROPAGATING %s min"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Hentede i %s med et gennemsnit på %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Alder"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s artikler misdannede"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s artikler manglede"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s artikler havde ikke-matchende dubletter"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Pause duplikeret NZB \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problem med"
@@ -1717,7 +1642,7 @@ msgstr "Efterbehandling mislykkedes for %s (%s)"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
msgstr "Efterbehandling blev afbrudt"
#: sabnzbd/postproc.py
msgid "Download Failed"
@@ -1771,12 +1696,12 @@ msgstr "RAR filer kunne ikke bekræfte"
#: sabnzbd/postproc.py
msgid "Trying RAR renamer"
msgstr ""
msgstr "Forsøger RAR-omdøbning"
#. Warning message
#: sabnzbd/postproc.py
msgid "No matching earlier rar file for %s"
msgstr ""
msgstr "Ingen matchende tidligere rar-fil for %s"
#. Error message
#: sabnzbd/postproc.py
@@ -1801,7 +1726,15 @@ msgstr "Fejl ved lukning af system"
#. Error message
#: sabnzbd/powersup.py
msgid "Received a DBus exception %s"
msgstr ""
msgstr "Modtog en DBus-undtagelse %s"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Tom RSS post blev fundet (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Inkompatibel feed"
#. Error message
#: sabnzbd/rss.py
@@ -1829,14 +1762,6 @@ msgstr "Server %s bruger et upålideligt HTTPS-certifikat"
msgid "RSS Feed %s was empty"
msgstr "RSS Feed %s er tom"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Inkompatibel feed"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Tom RSS post blev fundet (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Vis grænseflade"
@@ -2177,7 +2102,7 @@ msgstr "Denne måned"
#: sabnzbd/skintext.py
msgid "Selected date range"
msgstr ""
msgstr "Valgt datointerval"
#: sabnzbd/skintext.py
msgid "Today"
@@ -2272,7 +2197,7 @@ msgstr "Forum"
#. Main menu item
#: sabnzbd/skintext.py
msgid "Live Chat"
msgstr ""
msgstr "Live chat"
#. Main menu item
#: sabnzbd/skintext.py
@@ -2421,7 +2346,7 @@ msgstr "Forsøg igen"
#. History page button
#: sabnzbd/skintext.py
msgid "Mark as Completed & Remove Temporary Files"
msgstr ""
msgstr "Markér som fuldført og fjern midlertidige filer"
#. Queue page table, script selection menu
#: sabnzbd/skintext.py
@@ -2436,7 +2361,7 @@ msgstr "Fjern alt fra køen?"
#. Delete confirmation popup
#: sabnzbd/skintext.py
msgid "Are you sure you want to remove these jobs?"
msgstr ""
msgstr "Er du sikker på, at du vil fjerne disse jobs?"
#. Queue page button
#: sabnzbd/skintext.py
@@ -2461,7 +2386,7 @@ msgstr "Fjern NZB & slet filer"
#. Checkbox if job should be added to Archive
#: sabnzbd/skintext.py
msgid "Permanently delete (skip archive)"
msgstr ""
msgstr "Slet permanent (spring arkiv over)"
#. Caption for missing articles in Queue
#: sabnzbd/skintext.py
@@ -2484,7 +2409,7 @@ msgstr "Nulstil kvota nu"
#: sabnzbd/skintext.py
msgid "Archive"
msgstr ""
msgstr "Arkiv"
#. Button/link hiding History job details
#: sabnzbd/skintext.py
@@ -2509,7 +2434,7 @@ msgstr "Vis Alt"
#. Button showing all archived jobs
#: sabnzbd/skintext.py
msgid "Show Archive"
msgstr ""
msgstr "Vis arkiv"
#. History table header - Size of the download quota
#: sabnzbd/skintext.py
@@ -2560,6 +2485,8 @@ msgid ""
"Disconnect all active connections to usenet servers. Connections will be "
"reopened after a few seconds if there are items in the queue."
msgstr ""
"Afbryd alle aktive forbindelser til usenet-servere. Forbindelser genåbnes "
"efter få sekunder, hvis der er elementer i køen."
#: sabnzbd/skintext.py
msgid "This will send a test email to your account."
@@ -2750,6 +2677,8 @@ msgid ""
"Speed up repairs by installing par2cmdline-turbo, it is available for many "
"platforms."
msgstr ""
"Sæt fart på reparationer ved at installere par2cmdline-turbo, det er "
"tilgængeligt for mange platforme."
#: sabnzbd/skintext.py
msgid "Version"
@@ -2825,6 +2754,8 @@ msgid ""
"If the SABnzbd Host or Port is exposed to the internet, your current "
"settings allow full external access to the SABnzbd interface."
msgstr ""
"Hvis SABnzbd-værten eller porten er eksponeret på internettet, tillader dine"
" nuværende indstillinger fuld ekstern adgang til SABnzbd-grænsefladen."
#: sabnzbd/skintext.py
msgid "Security"
@@ -2935,6 +2866,10 @@ msgid ""
"the Completed Download Folder.<br>Recurring backups can be configured on the"
" Scheduling page."
msgstr ""
"Opret en sikkerhedskopi af konfigurationsfilen og databaser i "
"sikkerhedskopimappen.<br>Hvis sikkerhedskopimappen ikke er indstillet, "
"oprettes sikkerhedskopien i den fuldførte downloadmappe.<br>Tilbagevendende "
"sikkerhedskopier kan konfigureres på planlægningssiden."
#: sabnzbd/skintext.py
msgid "Cleanup List"
@@ -3049,6 +2984,8 @@ msgstr "Eksterne internetadgang"
#: sabnzbd/skintext.py
msgid "You can set access rights for systems outside your local network."
msgstr ""
"Du kan indstille adgangsrettigheder for systemer uden for dit lokale "
"netværk."
#: sabnzbd/skintext.py
msgid "No access"
@@ -3152,6 +3089,9 @@ msgid ""
" again.<br />Applies to both the Temporary and Complete Download Folder.<br "
"/>Checked every few minutes."
msgstr ""
"Download genoptages automatisk, hvis den minimale ledige plads er "
"tilgængelig igen.<br />Gælder for både den midlertidige og den fuldførte "
"downloadmappe.<br />Kontrolleres hvert par minutter."
#: sabnzbd/skintext.py
msgid "Permissions for completed downloads"
@@ -3237,6 +3177,9 @@ msgid ""
"stored.<br />If left empty, the backup will be created in the Completed "
"Download Folder."
msgstr ""
"Placering, hvor sikkerhedskopier af konfigurationsfilen og databaser "
"gemmes.<br />Hvis den efterlades tom, oprettes sikkerhedskopien i den "
"fuldførte downloadmappe."
#: sabnzbd/skintext.py
msgid "<i>Data will <b>not</b> be moved. Requires SABnzbd restart!</i>"
@@ -3254,7 +3197,7 @@ msgstr ""
#: sabnzbd/skintext.py
msgid "Purge Logs"
msgstr ""
msgstr "Ryd logfiler"
#: sabnzbd/skintext.py
msgid ".nzb Backup Folder"
@@ -3318,6 +3261,8 @@ msgid ""
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Udpak kun og kør scripts på jobs, der bestod verifikationsstadiet. Hvis "
"slået fra, markeres alle jobs som fuldført, selvom de er ufuldstændige."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -3330,19 +3275,19 @@ msgstr ""
#: sabnzbd/skintext.py
msgid "Identical download detection"
msgstr ""
msgstr "Identisk downloaddetektering"
#: sabnzbd/skintext.py
msgid "Detect identical downloads based on name or NZB contents."
msgstr ""
msgstr "Detektér identiske downloads baseret på navn eller NZB-indhold."
#: sabnzbd/skintext.py
msgid "Smart duplicate detection"
msgstr ""
msgstr "Smart dubletdetektering"
#: sabnzbd/skintext.py
msgid "Detect duplicates based on analysis of the filename."
msgstr ""
msgstr "Detektér dubletter baseret på analyse af filnavnet."
#: sabnzbd/skintext.py
msgid "Allow proper releases"
@@ -3353,6 +3298,8 @@ msgid ""
"Bypass smart duplicate detection if PROPER, REAL or REPACK is detected in "
"the download name."
msgstr ""
"Spring smart dubletdetektering over, hvis PROPER, REAL eller REPACK "
"registreres i downloadnavnet."
#. Four way switch for duplicates
#: sabnzbd/skintext.py
@@ -3371,7 +3318,7 @@ msgstr "Mislykkes job (flyt til historik)"
#: sabnzbd/skintext.py
msgid "Abort post-processing"
msgstr ""
msgstr "Afbryd efterbehandling"
#: sabnzbd/skintext.py
msgid "Action when unwanted extension detected"
@@ -3379,7 +3326,7 @@ msgstr "Aktion når uønsket extension er fundet"
#: sabnzbd/skintext.py
msgid "Action when an unwanted extension is detected"
msgstr ""
msgstr "Handling når en uønsket filtype registreres"
#: sabnzbd/skintext.py
msgid "Unwanted extensions"
@@ -3387,25 +3334,28 @@ msgstr "Uønsket extension"
#: sabnzbd/skintext.py
msgid "Blacklist"
msgstr ""
msgstr "Sortliste"
#: sabnzbd/skintext.py
msgid "Whitelist"
msgstr ""
msgstr "Hvidliste"
#: sabnzbd/skintext.py
msgid ""
"Select a mode and list all (un)wanted extensions. For example: <b>exe</b> or"
" <b>exe, com</b>"
msgstr ""
"Vælg en tilstand og angiv alle (u)ønskede filtypeendelser. For eksempel: "
"<b>exe</b> eller <b>exe, com</b>"
#: sabnzbd/skintext.py
msgid "Enable SFV-based checks"
msgstr "Aktiver SFV-baseret kontrol"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Udfør en ekstra kontrol baseret på SFV-filer."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3473,15 +3423,15 @@ msgstr "Afbryd fra usenet-serverne når køen er tom eller sat på pause."
#: sabnzbd/skintext.py
msgid "Automatically sort queue"
msgstr ""
msgstr "Sortér kø automatisk"
#: sabnzbd/skintext.py
msgid "Automatically sort jobs in the queue when a new job is added."
msgstr ""
msgstr "Sortér automatisk jobs i køen, når et nyt job tilføjes."
#: sabnzbd/skintext.py
msgid "The queue will resort every 30 seconds if % downloaded is selected."
msgstr ""
msgstr "Køen vil sortere hver 30. sekund, hvis % downloadet er valgt."
#: sabnzbd/skintext.py
msgid "Propagation delay"
@@ -3514,11 +3464,11 @@ msgstr "Erstat mellemrum med understreg i mappenavn."
#: sabnzbd/skintext.py
msgid "Replace underscores in folder name"
msgstr ""
msgstr "Erstat understreger i mappenavn"
#: sabnzbd/skintext.py
msgid "Replace underscores with dots in folder names."
msgstr ""
msgstr "Erstat understreger med punktummer i mappenavne."
#: sabnzbd/skintext.py
msgid "Replace dots in Foldername"
@@ -3570,19 +3520,23 @@ msgstr "Fjern efter download"
#: sabnzbd/skintext.py
msgid "Deobfuscate final filenames"
msgstr ""
msgstr "Afslør endelige filnavne"
#: sabnzbd/skintext.py
msgid ""
"If filenames of (large) files in the final folder look obfuscated or "
"meaningless they will be renamed to the job name."
msgstr ""
"Hvis filnavne på (store) filer i den endelige mappe ser slørede eller "
"meningsløse ud, omdøbes de til jobnavnet."
#: sabnzbd/skintext.py
msgid ""
"Additionally, attempts to set the correct file extension based on the file "
"signature if the extension is not present or meaningless."
msgstr ""
"Forsøger derudover at indstille den korrekte filendelse baseret på "
"filsignaturen, hvis endelsen ikke er til stede eller meningsløs."
#: sabnzbd/skintext.py
msgid "HTTPS certificate verification"
@@ -3597,11 +3551,11 @@ msgstr ""
#: sabnzbd/skintext.py
msgid "SOCKS5 Proxy"
msgstr ""
msgstr "SOCKS5-proxy"
#: sabnzbd/skintext.py
msgid "Use the specified SOCKS5 proxy for all outgoing connections."
msgstr ""
msgstr "Brug den angivne SOCKS5-proxy til alle udgående forbindelser."
#: sabnzbd/skintext.py
msgid "Server"
@@ -3714,11 +3668,11 @@ msgstr "Tidsudløb"
#: sabnzbd/skintext.py
msgid "Account expiration date"
msgstr ""
msgstr "Kontoudløbsdato"
#: sabnzbd/skintext.py
msgid "Warn 5 days in advance of account expiration date."
msgstr ""
msgstr "Advar 5 dage før kontoudløbsdato."
#: sabnzbd/skintext.py
msgid ""
@@ -3726,6 +3680,9 @@ msgid ""
" follow with K,M,G.<br />Checked every few minutes. Notification is sent "
"when quota is spent."
msgstr ""
"Kvote for denne server, talt fra det tidspunkt, den indstilles. I bytes, "
"efterfulgt eventuelt af K,M,G.<br />Kontrolleres hvert par minutter. Besked "
"sendes, når kvoten er brugt."
#. Server's retention time in days
#: sabnzbd/skintext.py
@@ -3756,6 +3713,13 @@ msgid ""
"used. - Disabled: no certification verification. This is not secure at all, "
"anyone could intercept your connection. "
msgstr ""
"Når SSL er aktiveret: - Streng: gennemtving fuld certifikatverifikation. "
"Dette er den mest sikre indstilling. - Medium: verificér at certifikatet er "
"gyldigt og matcher serveradressen, men tillad lokalt injicerede certifikater"
" (f.eks. af firewall eller virusscanner). - Minimal: verificér at "
"certifikatet er gyldigt. Dette er ikke sikkert, ethvert gyldigt certifikat "
"kan bruges. - Deaktiveret: ingen certifikatverifikation. Dette er slet ikke "
"sikkert, enhver kan opfange din forbindelse."
#: sabnzbd/skintext.py
msgid "Disabled"
@@ -3767,7 +3731,7 @@ msgstr "Minimal"
#: sabnzbd/skintext.py
msgid "Medium"
msgstr ""
msgstr "Medium"
#: sabnzbd/skintext.py
msgid "Strict"
@@ -3781,13 +3745,15 @@ msgstr "0 er højeste prioritet, 100 er den laveste prioritet"
#. Server required tickbox
#: sabnzbd/skintext.py
msgid "Required"
msgstr ""
msgstr "Påkrævet"
#: sabnzbd/skintext.py
msgid ""
"In case of connection failures, the download queue will be paused for a few "
"minutes instead of skipping this server"
msgstr ""
"I tilfælde af forbindelsesfejl vil downloadkøen blive sat på pause i et par "
"minutter i stedet for at springe denne server over"
#. Server optional tickbox
#: sabnzbd/skintext.py
@@ -3804,6 +3770,17 @@ msgstr ""
msgid "Enable"
msgstr "Aktivere"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -3833,11 +3810,11 @@ msgstr "Personlige notater"
#: sabnzbd/skintext.py
msgid "Article availability"
msgstr ""
msgstr "Artikeltilgængelighed"
#: sabnzbd/skintext.py
msgid "%f% available of %d requested articles"
msgstr ""
msgstr "%f% tilgængelige af %d anmodede artikler"
#. Config->Scheduling
#: sabnzbd/skintext.py
@@ -3898,12 +3875,12 @@ msgstr "Anvend filtre"
#. Config->RSS edit button
#: sabnzbd/skintext.py
msgid "Edit"
msgstr ""
msgstr "Redigér"
#. Config->RSS when will be the next RSS scan
#: sabnzbd/skintext.py
msgid "Next scan at"
msgstr ""
msgstr "Næste scanning kl."
#. Config->RSS table column header
#: sabnzbd/skintext.py
@@ -3985,6 +3962,8 @@ msgid ""
"If only the <em>Default</em> category is selected, notifications are enabled"
" for jobs in all categories."
msgstr ""
"Hvis kun kategorien <em>Standard</em> er valgt, er beskeder aktiveret for "
"jobs i alle kategorier."
#: sabnzbd/skintext.py
msgid "Email Notification On Job Completion"
@@ -4161,20 +4140,20 @@ msgstr "Enhed(er) som meddelelse skal sendes til"
#. Pushover settings
#: sabnzbd/skintext.py
msgid "Emergency retry"
msgstr ""
msgstr "Nødforsøg"
#: sabnzbd/skintext.py
msgid "How often (in seconds) the same notification will be sent"
msgstr ""
msgstr "Hvor ofte (i sekunder) samme besked vil blive sendt"
#. Pushover settings
#: sabnzbd/skintext.py
msgid "Emergency expire"
msgstr ""
msgstr "Nødudløb"
#: sabnzbd/skintext.py
msgid "How many seconds your notification will continue to be retried"
msgstr ""
msgstr "Hvor mange sekunder din besked fortsætter med at blive forsøgt"
#. Header for Pushbullet notification section
#: sabnzbd/skintext.py
@@ -4217,19 +4196,30 @@ msgid "Enable Apprise notifications"
msgstr "Aktiver Apprise-notifikationer"
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
"Send notifikationer via Apprise til næsten enhver notifikationstjeneste"
"Send beskeder direkte til enhver beskedtjeneste, du bruger.<br>For eksempel:"
" Slack, Discord, Telegram eller enhver tjeneste fra over 100 understøttede "
"tjenester!"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgstr "Standard Apprise-URL'er"
msgid "Use default Apprise URLs"
msgstr "Brug standard Apprise-URL'er"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgstr "Brug komma og/eller mellemrum for at angive flere URL'er."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
"Apprise definerer tjenesteforbindelsesoplysninger ved hjælp af "
"URL'er.<br>Læs Apprise-wikien om, hvordan man definerer URL'en for hver "
"tjeneste.<br>Brug komma og/eller mellemrum til at identificere mere end én "
"URL."
#: sabnzbd/skintext.py
msgid ""
@@ -4408,15 +4398,15 @@ msgstr "Sorteringsstreng"
#: sabnzbd/skintext.py
msgid "Multi-part Label"
msgstr ""
msgstr "Fler-dels-etiket"
#: sabnzbd/skintext.py
msgid "Show folder"
msgstr ""
msgstr "Vis mappe"
#: sabnzbd/skintext.py
msgid "Season folder"
msgstr ""
msgstr "Sæsonmappe"
#: sabnzbd/skintext.py
msgid "In folders"
@@ -4432,7 +4422,7 @@ msgstr "Job Navn som Filnavn"
#: sabnzbd/skintext.py
msgid "Series"
msgstr ""
msgstr "Serier"
#. Note for title expression in Sorting that does case adjustment
#: sabnzbd/skintext.py
@@ -4445,31 +4435,31 @@ msgstr "Forarbejdede resultat"
#: sabnzbd/skintext.py
msgid "Any property"
msgstr ""
msgstr "Enhver egenskab"
#: sabnzbd/skintext.py
msgid "property"
msgstr ""
msgstr "egenskab"
#: sabnzbd/skintext.py
msgid "GuessIt Property"
msgstr ""
msgstr "GuessIt-egenskab"
#: sabnzbd/skintext.py
msgid "GuessIt.Property"
msgstr ""
msgstr "GuessIt.Egenskab"
#: sabnzbd/skintext.py
msgid "GuessIt_Property"
msgstr ""
msgstr "GuessIt_Egenskab"
#: sabnzbd/skintext.py
msgid "Minimum Filesize"
msgstr ""
msgstr "Minimum filstørrelse"
#: sabnzbd/skintext.py
msgid "Affected Job Types"
msgstr ""
msgstr "Berørte jobtyper"
#: sabnzbd/skintext.py
msgid "All"
@@ -4477,15 +4467,15 @@ msgstr "Alle"
#: sabnzbd/skintext.py
msgid "Series with air dates"
msgstr ""
msgstr "Serier med sendetidspunkter"
#: sabnzbd/skintext.py
msgid "Movies"
msgstr ""
msgstr "Film"
#: sabnzbd/skintext.py
msgid "Other / Unknown"
msgstr ""
msgstr "Andet / Ukendt"
#: sabnzbd/skintext.py
msgid ""
@@ -4497,34 +4487,43 @@ msgid ""
"applied.</p><p>More options are available when Advanced Settings is "
"checked.<br/>Detailed information can be found on the Wiki.</p>"
msgstr ""
"<p>Brug sorteringsværktøjer til automatisk at organisere dine fuldførte "
"downloads. For eksempel, placer alle episoder fra en serie i en "
"sæsonspecifik mappe. Eller placer film i en mappe opkaldt efter "
"filmen.</p><p>Sorteringsværktøjer afprøves i den rækkefølge, de vises, og "
"kan omarrangeres ved at trække og slippe.<br/>Den første aktive sortering, "
"der matcher både den berørte kategori og jobtype, anvendes.</p><p>Flere "
"muligheder er tilgængelige, når Avancerede indstillinger er "
"markeret.<br/>Detaljeret information kan findes på Wiki'en.</p>"
#: sabnzbd/skintext.py
msgid "Add Sorter"
msgstr ""
msgstr "Tilføj sortering"
#: sabnzbd/skintext.py
msgid "Remove Sorter"
msgstr ""
msgstr "Fjern sortering"
#: sabnzbd/skintext.py
msgid "Test Data"
msgstr ""
msgstr "Testdata"
#: sabnzbd/skintext.py
msgid "Quick start"
msgstr ""
msgstr "Hurtig start"
#: sabnzbd/skintext.py
msgid ""
"Move and rename all episodes in the \"tv\" category to a show-specific "
"folder"
msgstr ""
"Flyt og omdøb alle episoder i kategorien \"tv\" til en programspecifik mappe"
#: sabnzbd/skintext.py
msgid ""
"Move and rename all movies in the \"movies\" category to a movie-specific "
"folder"
msgstr ""
msgstr "Flyt og omdøb alle film i kategorien \"movies\" til en filmspecifik mappe"
#: sabnzbd/skintext.py
msgid ""
@@ -4557,6 +4556,11 @@ msgstr "Slet"
msgid "Filename"
msgstr "Filnavn"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Alder"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Ledig diskplads"
@@ -4635,11 +4639,11 @@ msgstr "Datoformat"
#: sabnzbd/skintext.py
msgid "Extra queue columns"
msgstr ""
msgstr "Ekstra køkolonner"
#: sabnzbd/skintext.py
msgid "Extra history columns"
msgstr ""
msgstr "Ekstra historikkolonner"
#: sabnzbd/skintext.py
msgid "page"
@@ -4690,6 +4694,8 @@ msgid ""
"Are you sure you want to delete all folders in your Temporary Download "
"Folder? This cannot be undone!"
msgstr ""
"Er du sikker på, at du vil slette alle mapper i din midlertidige "
"downloadmappe? Dette kan ikke fortrydes!"
#: sabnzbd/skintext.py
msgid "Fetch NZB from URL"
@@ -4728,6 +4734,8 @@ msgid ""
"When you Retry a job, 'Duplicate Detection' and 'Abort jobs that cannot be "
"completed' are disabled."
msgstr ""
"Når du genforsøger et job, er 'Dubletdetektering' og 'Afbryd jobs, der ikke "
"kan fuldføres' deaktiveret."
#: sabnzbd/skintext.py
msgid "View Script Log"
@@ -4735,7 +4743,7 @@ msgstr "Vis scriptlog"
#: sabnzbd/skintext.py
msgid "Renaming the job will abort Direct Unpack."
msgstr ""
msgstr "Omdøbning af jobbet vil afbryde direkte udpakning."
#: sabnzbd/skintext.py
msgid ""
@@ -4759,7 +4767,7 @@ msgstr "Kompakt layout"
#: sabnzbd/skintext.py
msgid "Always use full screen width"
msgstr ""
msgstr "Brug altid fuld skærmbredde"
#: sabnzbd/skintext.py
msgid "Tabbed layout <br/>(separate queue and history)"
@@ -4779,11 +4787,11 @@ msgstr "Bekræft Historik-fjernelse"
#: sabnzbd/skintext.py
msgid "Keyboard shortcuts"
msgstr ""
msgstr "Tastaturgenveje"
#: sabnzbd/skintext.py
msgid "Shift+Arrow key: Browse Queue and History pages"
msgstr ""
msgstr "Shift+piletast: Gennemse Kø- og Historiksider"
#: sabnzbd/skintext.py
msgid "How long or untill when do you want to pause? (in English!)"
@@ -4806,10 +4814,12 @@ msgid ""
"All usernames, passwords and API-keys are automatically removed from the log"
" and the included copy of your settings."
msgstr ""
"Alle brugernavne, adgangskoder og API-nøgler fjernes automatisk fra loggen "
"og den inkluderede kopi af dine indstillinger."
#: sabnzbd/skintext.py
msgid "Sort by % downloaded <small>Most&rarr;Least</small>"
msgstr ""
msgstr "Sortér efter % downloadet <small>Mest&rarr;Mindst</small>"
#: sabnzbd/skintext.py
msgid "Sort by Age <small>Oldest&rarr;Newest</small>"
@@ -4944,11 +4954,11 @@ msgstr "Start guide"
#. Tooltip for disabled Next button
#: sabnzbd/skintext.py
msgid "Click on Test Server before continuing"
msgstr ""
msgstr "Klik på Test server før du fortsætter"
#: sabnzbd/skintext.py
msgid "Restore backup"
msgstr ""
msgstr "Gendan sikkerhedskopi"
#: sabnzbd/skintext.py
msgid ""
@@ -4965,7 +4975,7 @@ msgstr ""
#. Error message
#: sabnzbd/sorting.py
msgid "Failed to rename %s to %s"
msgstr ""
msgstr "Kunne ikke omdøbe %s til %s"
#. Error message
#: sabnzbd/sorting.py
@@ -4984,6 +4994,10 @@ msgstr "Fil ikke på server"
msgid "Server could not complete request"
msgstr "Serveren kunne ikke fuldføre anmodningen"
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Tom NZB fil %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"

@@ -15,14 +15,14 @@
# Stefan Rodriguez Galeano, 2024
# M Z, 2024
# Gjelbrim Haskaj, 2024
# Safihre <safihre@sabnzbd.org>, 2024
# Media Cat, 2025
# Safihre <safihre@sabnzbd.org>, 2025
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Media Cat, 2025\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: German (https://app.transifex.com/sabnzbd/teams/111101/de/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -58,6 +58,8 @@ msgid ""
"Unable to link to OpenSSL, optimized SSL connection functions will not be "
"used."
msgstr ""
"OpenSSL kann nicht verknüpft werden, optimierte SSL-Verbindungsfunktionen "
"werden nicht verwendet."
#. Error message
#: SABnzbd.py
@@ -165,6 +167,11 @@ msgstr ""
"Die aktuellen Zugriffseinstellungen (%o) könnte SABnzbd den Zugriff auf die "
"erstellten Dateien und Ordner von SABnzbd verweigern."
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -355,7 +362,7 @@ msgstr "Unerwünschter Typ \"%s\" in RAR Datei. Unerwünschte Datei ist %s "
msgid "Unwanted extension is in rar file %s"
msgstr "Unerwünschter Dateityp im RAR-Archiv %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Abgebrochen, unerwünschte Dateieindung gefunden"
@@ -384,11 +391,11 @@ msgstr "Kontingent"
#: sabnzbd/bpsmeter.py
msgid "Quota limit warning (%d%%)"
msgstr ""
msgstr "Warnung zur Kontingentgrenze (%d%%)"
#: sabnzbd/bpsmeter.py
msgid "Downloading resumed after quota reset"
msgstr ""
msgstr "Download nach Kontingentzurücksetzung fortgesetzt"
#: sabnzbd/cfg.py, sabnzbd/interface.py
msgid "Incorrect parameter"
@@ -615,11 +622,6 @@ msgstr "Fehler %s@%s zu initialisieren, aus folgendem Grund: %s"
msgid "Fatal error in Downloader"
msgstr "Schwerer Fehler im Downloader"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s:Unbekannter Statuscode%s für Artikel erhalten %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Zu viele Verbindungen zu Server %s [%s]"
@@ -641,11 +643,6 @@ msgstr "Anmelden beim Server fehlgeschlagen. %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Fehler beim Verbinden mit %s@%s, Meldung = %s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Vermute Fehler im Downloader"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Wird beendet …"
@@ -811,6 +808,11 @@ msgstr ""
"Dateinamen mit Umlaute können nicht in %s gespeichert werden. Dies kann zu "
"Problemen führen."
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr "Abgelehnte Verbindung von:"
@@ -1044,7 +1046,7 @@ msgid "Update Available!"
msgstr "Neue Version verfügbar!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "Hochladen fehlgeschlagen: %s"
@@ -1291,6 +1293,16 @@ msgstr "Versuche SFV-Überprüfung"
msgid "left"
msgstr "rest"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s:Unbekannter Statuscode%s für Artikel erhalten %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Vermute Fehler im Downloader"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Dieser Server erlaubt kein SSL auf diesem Port"
@@ -1477,106 +1489,18 @@ msgstr "Fehler beim Laden von %s. Beschädigte Datei gefunden."
msgid "NZB added to queue"
msgstr "NZB zur Warteschlange hinzugefügt"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Doppelte NZB \"%s\" wird ignoriert"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "kopieren der NZB \"%s\" fehlgeschlagen"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "Doppelte NZB"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr "Ungültige NZB-Datei %s wird übersprungen (Fehler: %s)"
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Leere NZB-Datei %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
"Das Vorwarteschlangen (pre-queue) Skript hat die Downloadaufgabe als "
"gescheitert markiert"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr "Ungewollte Dateiendung in der Datei %s (%s)"
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Abgebrochen, kann nicht fertiggestellt werden"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Fehler beim Importieren von %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUPLIKAT"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr "ALTERNATIVE"
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "VERSCHLÜSSELT"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "ZU GROSS"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "UNVOLLSTÄNDIG"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "UNERWÜNSCHT"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "WARTE %s Sek"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "AUSBREITUNG %s min"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr ""
"Heruntergeladen in %s mit einer Durchschnittsgeschwindigkeit von %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Alter"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s Artikel hatten ein ungültiges Format"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s Artikel fehlten"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s Artikel hatten nicht übereinstimmende Duplikate"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Doppelt vorhandene NZB \"%s\" angehalten"
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problem mit"
@@ -1860,6 +1784,14 @@ msgstr "Fehler beim Herunterfahren des Systems"
msgid "Received a DBus exception %s"
msgstr "DBus-Ausnahmefehler empfangen %s "
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Leerer RSS-Feed gefunden: %s"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Inkompatibeler RSS-Feed"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1886,14 +1818,6 @@ msgstr "Der Server %s nutzt ein nicht vertrauenswürdiges HTTPS-Zertifikat"
msgid "RSS Feed %s was empty"
msgstr "RSS-Feed %s war leer"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Inkompatibeler RSS-Feed"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Leerer RSS-Feed gefunden: %s"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Interface anzeigen"
@@ -2478,7 +2402,7 @@ msgstr "Erneut versuchen"
#. History page button
#: sabnzbd/skintext.py
msgid "Mark as Completed & Remove Temporary Files"
msgstr ""
msgstr "Als abgeschlossen markieren und temporäre Dateien entfernen"
#. Queue page table, script selection menu
#: sabnzbd/skintext.py
@@ -3517,8 +3441,9 @@ msgid "Enable SFV-based checks"
msgstr "SFV-basierte Überprüfung aktivieren"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Zusätzliche Überprüfung mittels SFV-Dateien durchführen"
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3861,6 +3786,9 @@ msgid ""
" follow with K,M,G.<br />Checked every few minutes. Notification is sent "
"when quota is spent."
msgstr ""
"Kontingent für diesen Server, gezählt ab dem Zeitpunkt der Festlegung. In "
"Bytes, optional gefolgt von K,M,G.<br />Wird alle paar Minuten überprüft. "
"Benachrichtigung wird gesendet, wenn das Kontingent aufgebraucht ist."
#. Server's retention time in days
#: sabnzbd/skintext.py
@@ -3948,6 +3876,17 @@ msgstr "Für unzuverlässige Server, wird bei Fehlern länger ignoriert"
msgid "Enable"
msgstr "Aktivieren"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4367,22 +4306,30 @@ msgid "Enable Apprise notifications"
msgstr "Aktivieren Sie Info-Benachrichtigungen"
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
"Senden Sie Benachrichtigungen mit Anfragen an fast jeden "
"Benachrichtigungsdienst"
"Senden Sie Benachrichtigungen direkt an jeden von Ihnen genutzten "
"Benachrichtigungsdienst.<br>Zum Beispiel: Slack, Discord, Telegram oder "
"jeden anderen Dienst aus über 100 unterstützten Diensten!"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgstr "Standard Apprise URLs"
msgid "Use default Apprise URLs"
msgstr "Standard-Apprise-URLs verwenden"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
"Verwenden Sie ein Komma und/oder ein Leerzeichen, um mehr als eine URL zu "
"kennzeichnen."
"Apprise definiert Dienstverbindungsinformationen über URLs.<br>Lesen Sie das"
" Apprise-Wiki, um zu erfahren, wie Sie die URL für jeden Dienst "
"definieren.<br>Verwenden Sie ein Komma und/oder Leerzeichen, um mehr als "
"eine URL anzugeben."
#: sabnzbd/skintext.py
msgid ""
@@ -4723,6 +4670,11 @@ msgstr "Löschen"
msgid "Filename"
msgstr "Dateiname"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Alter"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Freier Speicherplatz"
@@ -4856,6 +4808,8 @@ msgid ""
"Are you sure you want to delete all folders in your Temporary Download "
"Folder? This cannot be undone!"
msgstr ""
"Sind Sie sicher, dass Sie alle Ordner in Ihrem temporären Download-Ordner "
"löschen möchten? Dies kann nicht rückgängig gemacht werden!"
#: sabnzbd/skintext.py
msgid "Fetch NZB from URL"
@@ -5115,7 +5069,7 @@ msgstr "Assistenten starten"
#. Tooltip for disabled Next button
#: sabnzbd/skintext.py
msgid "Click on Test Server before continuing"
msgstr ""
msgstr "Klicken Sie auf \"Server testen\", bevor Sie fortfahren"
#: sabnzbd/skintext.py
msgid "Restore backup"
@@ -5155,6 +5109,10 @@ msgstr "Datei nicht auf dem Server"
msgid "Server could not complete request"
msgstr "Server konnte nicht vollständig antworten"
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Leere NZB-Datei %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"

@@ -9,7 +9,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Spanish (https://app.transifex.com/sabnzbd/teams/111101/es/)\n"
@@ -156,6 +156,11 @@ msgstr ""
"La umask actual (%o) podría denegarle acceso a SABnzbd a los archivos y "
"carpetas que este crea."
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -342,7 +347,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "Se ha encontrado una extensión desconocida en el fichero rar %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Se interrumpió la acción porque se detectó una extensión no deseada"
@@ -373,11 +378,11 @@ msgstr "Cuota"
#: sabnzbd/bpsmeter.py
msgid "Quota limit warning (%d%%)"
msgstr ""
msgstr "Advertencia de límite de cuota (%d%%)"
#: sabnzbd/bpsmeter.py
msgid "Downloading resumed after quota reset"
msgstr ""
msgstr "Descarga reanudada después de reiniciar la cuota"
#: sabnzbd/cfg.py, sabnzbd/interface.py
msgid "Incorrect parameter"
@@ -602,12 +607,6 @@ msgstr "Error al inicializar %s@%s con la razón: %s"
msgid "Fatal error in Downloader"
msgstr "Error grave en el descargador"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
"%s@%s: Se recibió un código de estado desconocido %s para el artículo %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Demasiadas conexiones con el servidor %s [%s]"
@@ -629,11 +628,6 @@ msgstr "Registraccion fallo para servidor %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Ha fallado la conexión a %s@%s, el mensaje=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Error sospechoso en downloader"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Apagando"
@@ -798,6 +792,11 @@ msgstr ""
"%s no permite escribir nombres de archivo con caracteres especiales. Esto "
"puede causar problemas."
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr "Conexión rechazada de:"
@@ -1031,7 +1030,7 @@ msgid "Update Available!"
msgstr "¡Actualización Disponible!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "Error al subir archivo: %s"
@@ -1282,6 +1281,17 @@ msgstr "Intentando verificación por SFV"
msgid "left"
msgstr "Restante"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
"%s@%s: Se recibió un código de estado desconocido %s para el artículo %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Error sospechoso en downloader"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Este servidor no permite SSL en este puerto"
@@ -1471,105 +1481,18 @@ msgstr "Error al cargar %s, archivo corrupto"
msgid "NZB added to queue"
msgstr "NZB añadido a la cola"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ignorando NZB Duplicado \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "Fallo al duplicar NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "Duplicar NZB"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr "Fichero NBZ inválido: %s, omitiendo (razón=%s)"
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Fichero NZB vacío: %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
"La secuencia de comandos de la cola preestablecida ha marcado la tarea como "
"fallida"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr "Extensión no deseada en el archivo %s (%s)"
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Abortado, No puede ser completado"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Error importando %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUPLICADO"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr "ALTERNATIVO"
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "ENCRIPTADO"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "DEMASIADO GRANDE"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "INCOMPLETO"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "NO DESEADO"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "ESPERAR %s seg"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "PROPAGANDO %s min"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Descargado en %s a una media de %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Edad"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s artículos estaban mal formados."
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s artículos no encontrados"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s artículos contenían duplicados inconexos"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Pausando NZB duplicados \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problema con"
@@ -1849,6 +1772,14 @@ msgstr "Error al apagarel sistema"
msgid "Received a DBus exception %s"
msgstr "Se ha recibido una excepción DBus %s"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Entrada RSS vacía (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Canal Incorrecto"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1877,14 +1808,6 @@ msgstr "El servidor %s utiliza un certificado HTTPS no fiable"
msgid "RSS Feed %s was empty"
msgstr "El canal RSS %s estaba vacío"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Canal Incorrecto"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Entrada RSS vacía (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Mostrar interfaz"
@@ -2469,7 +2392,7 @@ msgstr "Reintentar"
#. History page button
#: sabnzbd/skintext.py
msgid "Mark as Completed & Remove Temporary Files"
msgstr ""
msgstr "Marcar como completado y eliminar archivos temporales"
#. Queue page table, script selection menu
#: sabnzbd/skintext.py
@@ -3495,8 +3418,9 @@ msgid "Enable SFV-based checks"
msgstr "Habilitar verificacion basada en SFV"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Realiza una verificación extra basada en ficheros SFV."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3831,6 +3755,9 @@ msgid ""
" follow with K,M,G.<br />Checked every few minutes. Notification is sent "
"when quota is spent."
msgstr ""
"Cuota para este servidor, contada desde el momento en que se establece. En "
"bytes, opcionalmente seguido de K,M,G.<br />Comprobado cada pocos minutos. "
"Se envía una notificación cuando se agota la cuota."
#. Server's retention time in days
#: sabnzbd/skintext.py
@@ -3920,6 +3847,17 @@ msgstr ""
msgid "Enable"
msgstr "Habilitar"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4338,20 +4276,29 @@ msgid "Enable Apprise notifications"
msgstr "Habilitar notificaciones Apprise"
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
"Enviar notificaciones usando Apprise a casi cualquier servicio de "
"notificación"
"Envíe notificaciones directamente a cualquier servicio de notificaciones que"
" utilice.<br>Por ejemplo: Slack, Discord, Telegram o cualquier servicio de "
"más de 100 servicios compatibles."
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgstr "URLs predeterminadas de Apprise"
msgid "Use default Apprise URLs"
msgstr "Usar URLs de Apprise predeterminadas"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgstr "Use una coma y/o espacio para identificar más de una URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
"Apprise define la información de conexión del servicio mediante URLs.<br>Lea"
" el wiki de Apprise para saber cómo definir la URL de cada servicio.<br>Use "
"una coma y/o espacio para identificar más de una URL."
#: sabnzbd/skintext.py
msgid ""
@@ -4693,6 +4640,11 @@ msgstr "Eliminar"
msgid "Filename"
msgstr "Nombre de archivo"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Edad"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Espacio libre"
@@ -4828,6 +4780,8 @@ msgid ""
"Are you sure you want to delete all folders in your Temporary Download "
"Folder? This cannot be undone!"
msgstr ""
"¿Está seguro de que desea eliminar todas las carpetas en su carpeta de "
"descargas temporales? ¡Esto no se puede deshacer!"
#: sabnzbd/skintext.py
msgid "Fetch NZB from URL"
@@ -5087,7 +5041,7 @@ msgstr "Iniciar Asistente"
#. Tooltip for disabled Next button
#: sabnzbd/skintext.py
msgid "Click on Test Server before continuing"
msgstr ""
msgstr "Haga clic en Probar servidor antes de continuar"
#: sabnzbd/skintext.py
msgid "Restore backup"
@@ -5127,6 +5081,10 @@ msgstr "El fichero no se encuentra en el servidor"
msgid "Server could not complete request"
msgstr "El servidor no ha podido completar la solicitud"
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Fichero NZB vacío: %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2023\n"
"Language-Team: Finnish (https://app.transifex.com/sabnzbd/teams/111101/fi/)\n"
@@ -146,6 +146,11 @@ msgid ""
"creates."
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -315,7 +320,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "Ei toivottu tiedostopääte on rar arkistossa %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Peruutettu, ei toivottu tiedostopääte havaittu"
@@ -556,11 +561,6 @@ msgstr "Alustaminen epäonnistui kohteessa %s@%s syy: %s"
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Liikaa yhteyksiä palvelimelle %s [%s]"
@@ -580,11 +580,6 @@ msgstr "Kirjautuminen palvelimelle %s epäonnistui [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Yhdistäminen %s@%s epäonnistui, viesti=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Mahdollinen virhe lataajassa"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Sammutetaan"
@@ -742,6 +737,11 @@ msgid ""
"problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr ""
@@ -971,7 +971,7 @@ msgid "Update Available!"
msgstr "Päivitys saatavilla!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr ""
@@ -1208,6 +1208,16 @@ msgstr "Yritetään SFV varmennusta"
msgid "left"
msgstr "jäljellä"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Mahdollinen virhe lataajassa"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Tämä palvelin ei salli SSL yhteyksiä tähän porttiin"
@@ -1387,103 +1397,18 @@ msgstr "Virhe ladattaessa %s, korruptoitunut tiedosto havaittu"
msgid "NZB added to queue"
msgstr "NZB lisätty jonoon"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ohitetaan kaksoiskappale NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Tyhjä NZB tiedosto %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Peruutettu, ei voi valmistua"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Virhe tuotaessa %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "KAKSOISKAPPALE"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "SALATTU"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "LIIAN SUURI"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "KESKENERÄINEN"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "EI TOIVOTTU"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "ODOTA %s sekuntia"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "LEVITETÄÄN %s min"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Ladattiin ajassa %s keskilatausnopeudella %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Ikä"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s artikkelia oli väärin muotoiltuja"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s artikkelia puuttui"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s artikkelissa oli ei-vastaavia kaksoiskappaleita"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Keskeytetään kaksoiskappale NZB \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Ongelma"
@@ -1756,6 +1681,14 @@ msgstr "Virhe sammutettaessa järjestelmää"
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Tyhjä RSS kohde löytyi (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Puutteellinen syöte"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1782,14 +1715,6 @@ msgstr "Palvelin %s käyttää epäluotettavaa HTTPS sertifikaattia"
msgid "RSS Feed %s was empty"
msgstr "RSS syöte %s oli tyhjä"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Puutteellinen syöte"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Tyhjä RSS kohde löytyi (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Näytä käyttöliittymä"
@@ -3363,8 +3288,9 @@ msgid "Enable SFV-based checks"
msgstr "SFV-pohjaiset tarkistukset käytössä"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Suorittaa ylimääräisen varmennuksen SFV tiedostojen avulla."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3762,6 +3688,17 @@ msgstr ""
msgid "Enable"
msgstr "Ota käyttöön"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4175,17 +4112,22 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4513,6 +4455,11 @@ msgstr "Poista"
msgid "Filename"
msgstr "Tiedostonimi"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Ikä"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Vapaa tila"
@@ -4943,6 +4890,10 @@ msgstr "Tiedostoa ei ole palvelimella"
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Tyhjä NZB tiedosto %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"


@@ -3,13 +3,13 @@
#
# Translators:
# Safihre <safihre@sabnzbd.org>, 2025
# Fred L <88com88@gmail.com>, 2025
# Fred L <88com88@gmail.com>, 2026
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Fred L <88com88@gmail.com>, 2025\n"
"Last-Translator: Fred L <88com88@gmail.com>, 2026\n"
"Language-Team: French (https://app.transifex.com/sabnzbd/teams/111101/fr/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -157,6 +157,13 @@ msgstr ""
"L'umask actuel (%o) pourrait refuser à SABnzbd l'accès aux fichiers et "
"dossiers qu'il crée."
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
"La version Windows ARM de SABnzbd est disponible depuis notre page "
"Téléchargements!"
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -347,7 +354,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "L'extension indésirable est dans le fichier rar %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Interrompu, extension indésirable détectée"
@@ -606,11 +613,6 @@ msgstr "Échec d'initialisation de %s@%s pour la raison suivante : %s"
msgid "Fatal error in Downloader"
msgstr "Erreur fatale dans le Téléchargeur"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s a reçu le code d'état inconnu %s pour l'article %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Trop de connexions au serveur %s [%s]"
@@ -632,11 +634,6 @@ msgstr "Échec de la connexion au serveur %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "La connexion à %s@%s a échoué, message=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Erreur suspecte dans le téléchargeur"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Arrêt en cours..."
@@ -802,6 +799,13 @@ msgstr ""
"Le fichier %s n'est pas inscriptible à cause des caractères spéciaux dans le"
" nom. Cela peut causer des problèmes."
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
"%s ne prend pas en charge les fichiers fragmentés. Désactivation du mode "
"d'écriture directe."
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr "Connexion refusée de:"
@@ -1035,7 +1039,7 @@ msgid "Update Available!"
msgstr "Mise à Jour disponible!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "Échec de l'upload du fichier : %s"
@@ -1282,6 +1286,16 @@ msgstr "Essai vérification SFV"
msgid "left"
msgstr "restant"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s a reçu le code d'état inconnu %s pour l'article %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Erreur suspecte dans le téléchargeur"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Ce serveur n'authorise pas de connexion SSL sur ce port"
@@ -1466,103 +1480,18 @@ msgstr "Erreur lors du chargement de %s, fichier corrompu détecté"
msgid "NZB added to queue"
msgstr "NZB ajouté à la file d'attente"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Doublon NZB ignoré \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "Échec de duplication du NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "Dupliquer NZB"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr "Fichier NZB %s invalide, sera ignoré (erreur : %s)"
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Fichier NZB %s vide"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr "Le script de pré-file d'attente a marqué la tâche comme échouée"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr "Extension non souhaitée dans le fichier %s (%s)"
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Interrompu, ne peut être achevé"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Erreur lors de l'importation de %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DOUBLON"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr "ALTERNATIVE"
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "CHIFFRÉ"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "TROP VOLUMINEUX"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "INCOMPLET"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "INDÉSIRABLE"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "PATIENTER %s sec"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "PROPAGATION %s min"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Téléchargé en %s à %sB/s de moyenne"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Âge"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s articles malformés"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s articles manquants"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s articles avec doublons sans correspondance"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Mise en pause du doublon NZB \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problème avec"
@@ -1845,6 +1774,14 @@ msgstr "Erreur lors de l'arrêt du système"
msgid "Received a DBus exception %s"
msgstr "Exception DBus reçue %s"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Entrée vide de flux RSS trouvée (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Flux incompatible"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1872,14 +1809,6 @@ msgstr "Le serveur %s utilise un certificat de sécurité HTTPS non authentifié
msgid "RSS Feed %s was empty"
msgstr "Le flux RSS %s était vide"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Flux incompatible"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Entrée vide de flux RSS trouvée (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Afficher l'interface"
@@ -3502,8 +3431,11 @@ msgid "Enable SFV-based checks"
msgstr "Activer les contrôles SFV"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Fait une vérification supplémentaire basée sur les fichiers SFV."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
"Si aucun fichier par2 n'est disponible, utiliser les fichiers sfv (si "
"présents) pour vérifier les fichiers"
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3934,6 +3866,20 @@ msgstr ""
msgid "Enable"
msgstr "Activer"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr "Articles par demande"
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
"Demandez plusieurs articles par connexion sans attendre chaque réponse.<br "
"/>Cela peut améliorer les vitesses de téléchargement, en particulier sur les"
" connexions à latence élevée."
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4353,20 +4299,30 @@ msgid "Enable Apprise notifications"
msgstr "Activer les notifications Apprise"
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
"Envoyer des notifications en utilisant Apprise vers presque n'importe quel "
"service de notification"
"Envoyez des notifications directement vers n'importe quel service de "
"notification que vous utilisez.<br>Par exemple : Slack, Discord, Telegram ou"
" tout autre service parmi plus de 100 services pris en charge !"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgstr "URLs par défaut d'Apprise"
msgid "Use default Apprise URLs"
msgstr "Utiliser les URLs Apprise par défaut"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgstr "Utilisez une virgule et/ou un espace pour identifier plusieurs URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
"Apprise définit les informations de connexion au service à l'aide "
"d'URL.<br>Consultez le wiki Apprise pour savoir comment définir l'URL de "
"chaque service.<br>Utilisez une virgule et/ou un espace pour identifier "
"plusieurs URL."
#: sabnzbd/skintext.py
msgid ""
@@ -4709,6 +4665,11 @@ msgstr "Supprimer"
msgid "Filename"
msgstr "Nom de fichier"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Âge"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Espace libre"
@@ -5147,6 +5108,10 @@ msgstr "Fichier introuvable sur le serveur"
msgid "Server could not complete request"
msgstr "Le serveur n'a pas pu terminer la requête"
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Fichier NZB %s vide"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"


@@ -2,14 +2,14 @@
# Copyright 2007-2025 by The SABnzbd-Team (sabnzbd.org)
#
# Translators:
# Safihre <safihre@sabnzbd.org>, 2023
# ION, 2025
# Safihre <safihre@sabnzbd.org>, 2025
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: ION, 2025\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Hebrew (https://app.transifex.com/sabnzbd/teams/111101/he/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -42,7 +42,7 @@ msgstr "לא ניתן למצוא תבניות רשת: %s, מנסה תבנית ת
msgid ""
"Unable to link to OpenSSL, optimized SSL connection functions will not be "
"used."
msgstr ""
msgstr "לא ניתן לקשר ל-OpenSSL, פונקציות חיבור SSL מותאמות לא יהיו בשימוש."
#. Error message
#: SABnzbd.py
@@ -143,6 +143,11 @@ msgstr ""
"פקודת umask נוכחית (%o) עשויה לדחות גישה מן SABnzbd אל הקבצים והתיקיות שהוא "
"יוצר."
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -210,12 +215,16 @@ msgid ""
"Could not connect to %s on port %s. Use the default usenet settings: port "
"563 and SSL turned on"
msgstr ""
"לא ניתן להתחבר ל-%s בפורט %s. השתמש בהגדרות ברירת המחדל של usenet: פורט 563 "
"ו-SSL מופעל"
#: sabnzbd/api.py
msgid ""
"Could not connect to %s on port %s. Use the default usenet settings: port "
"119 and SSL turned off"
msgstr ""
"לא ניתן להתחבר ל-%s בפורט %s. השתמש בהגדרות ברירת המחדל של usenet: פורט 119 "
"ו-SSL כבוי"
#: sabnzbd/api.py, sabnzbd/interface.py
msgid "Server address \"%s:%s\" is not valid."
@@ -316,7 +325,7 @@ msgstr "בעבודה \"%s\" יש סיומת בלתי רצויה בתוך קוב
msgid "Unwanted extension is in rar file %s"
msgstr "סיומת בלתי רצויה בקובץ rar %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "בוטל, סיומת בלתי רצויה התגלתה"
@@ -343,11 +352,11 @@ msgstr "מכסה"
#: sabnzbd/bpsmeter.py
msgid "Quota limit warning (%d%%)"
msgstr ""
msgstr "אזהרת מגבלת מכסה (%d%%)"
#: sabnzbd/bpsmeter.py
msgid "Downloading resumed after quota reset"
msgstr ""
msgstr "ההורדה התחדשה לאחר איפוס מכסה"
#: sabnzbd/cfg.py, sabnzbd/interface.py
msgid "Incorrect parameter"
@@ -411,7 +420,7 @@ msgstr ""
#: sabnzbd/cfg.py
msgid ""
"The par2 application was switched, any custom par2 parameters were removed"
msgstr ""
msgstr "יישום par2 הוחלף, כל פרמטרי par2 מותאמים אישית הוסרו"
#. Warning message
#: sabnzbd/config.py
@@ -487,7 +496,7 @@ msgstr "אי־האפלה שינתה שם של %d קבצים"
#: sabnzbd/deobfuscate_filenames.py
msgid "Deobfuscate renamed %d subtitle file(s)"
msgstr ""
msgstr "בוצע ביטול ערפול של %d קבצי כתוביות ששמם שונה"
#: sabnzbd/directunpacker.py, sabnzbd/skintext.py
msgid "Direct Unpack"
@@ -563,11 +572,6 @@ msgstr "כישלון באתחול %s@%s עם סיבה: %s"
msgid "Fatal error in Downloader"
msgstr "שגיאה גורלית במורידן"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: קוד בלתי ידוע של מעמד התקבל %s עבור מאמר %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "יותר מדי חיבורים לשרת %s [%s]"
@@ -589,11 +593,6 @@ msgstr "כניסה נכשלה עבור שרת %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "התחברות אל %s@%s נכשלה, הודעה=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "הורדה חשודה במורידן"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "מכבה"
@@ -751,6 +750,11 @@ msgid ""
"problems."
msgstr "%s אינו בר־כתיבה עם שמות קבצים עם תו מיוחד. זה יכול לגרום לבעיות."
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr "חיבור מסורב מאת:"
@@ -979,7 +983,7 @@ msgid "Update Available!"
msgstr "עדכון זמין!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "כישלון בהעלאת קובץ: %s"
@@ -1217,6 +1221,16 @@ msgstr "מנסה וידוא SFV"
msgid "left"
msgstr "נותר"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: קוד בלתי ידוע של מעמד התקבל %s עבור מאמר %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "הורדה חשודה במורידן"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "שרת זה אינו מתיר SSL על פתחה זו"
@@ -1235,6 +1249,8 @@ msgid ""
" locally injected certificate (for example by firewall or virus scanner). "
"Try setting Certificate verification to Medium."
msgstr ""
"לא ניתן לאמת את האישור. זה יכול להיות בעיית שרת או בגלל אישור מוזרק מקומית "
"(לדוגמה על ידי חומת אש או סורק וירוסים). נסה להגדיר את אימות האישור לבינוני."
#: sabnzbd/newswrapper.py
msgid "Server %s uses an untrusted certificate [%s]"
@@ -1315,7 +1331,7 @@ msgstr "כישלון בשליחת הודעת Prowl"
#. Warning message
#: sabnzbd/notifier.py
msgid "Failed to send Apprise message - no URLs defined"
msgstr ""
msgstr "שליחת הודעת Apprise נכשלה - לא הוגדרו כתובות URL"
#. Warning message
#: sabnzbd/notifier.py
@@ -1398,103 +1414,18 @@ msgstr "שגיאה בטעינת %s, קובץ פגום התגלה"
msgid "NZB added to queue"
msgstr "NZB התווסף לתור"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "מתעלם מן NZB כפול \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "מכשיל NZB כפול \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "NZB כפול"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr "קובץ NZB בלתי תקף %s, מדלג (שגיאה: %s)"
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "קובץ NZB ריק %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr "תסריט קדם־תור סומן כנכשל"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr "סיומת בלתי רצויה בקובץ %s (%s)"
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "בוטל, לא יכול להיות שלם"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "שגיאה ביבוא %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "כפול"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr "חלופה"
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "מוצפן"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "גדול מדי"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "בלתי שלם"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "בלתי רצוי"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "המתן %s שניות"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "מפיץ %s דקות"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "ירד תוך %s בממוצע של %s ב/ש"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "גיל"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s מאמרים עוותו"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s מאמרים היו חסרים"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "אל %s מאמרים יש כפילויות בלתי תואמות"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "משהה NZB כפול \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "בעיה עם"
@@ -1769,6 +1700,14 @@ msgstr "שגיאה בזמן כיבוי מערכת"
msgid "Received a DBus exception %s"
msgstr "חריגת DBus התקבלה %s"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "כניסת RSS ריקה נמצאה (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "הזנה בלתי תואמת"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1795,14 +1734,6 @@ msgstr "השרת %s משתמש בתעודת HTTPS בלתי מהימנה"
msgid "RSS Feed %s was empty"
msgstr "הזנת RSS %s הייתה ריקה"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "הזנה בלתי תואמת"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "כניסת RSS ריקה נמצאה (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "הראה ממשק"
@@ -2387,7 +2318,7 @@ msgstr "נסה שוב"
#. History page button
#: sabnzbd/skintext.py
msgid "Mark as Completed & Remove Temporary Files"
msgstr ""
msgstr "סמן כהושלם והסר קבצים זמניים"
#. Queue page table, script selection menu
#: sabnzbd/skintext.py
@@ -2933,7 +2864,7 @@ msgstr "העבר עבודות אל הארכיון אם ההיסטוריה חור
#: sabnzbd/skintext.py
msgid ""
"Delete jobs if the history and archive exceeds specified number of jobs"
msgstr ""
msgstr "מחק עבודות אם ההיסטוריה והארכיון עוברים את מספר העבודות שצוין"
#: sabnzbd/skintext.py
msgid "Move jobs to the archive after specified number of days"
@@ -2942,7 +2873,7 @@ msgstr "העבר עבודות אל הארכיון לאחר מספר מצוין
#: sabnzbd/skintext.py
msgid ""
"Delete jobs from the history and archive after specified number of days"
msgstr ""
msgstr "מחק עבודות מההיסטוריה והארכיון לאחר מספר הימים שצוין"
#: sabnzbd/skintext.py
msgid "Move all completed jobs to archive"
@@ -3371,8 +3302,9 @@ msgid "Enable SFV-based checks"
msgstr "אפשר בדיקות מבוססות SFV"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "בצע וידוא נוסף שמבוסס על קבצי SFV."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3688,6 +3620,8 @@ msgid ""
" follow with K,M,G.<br />Checked every few minutes. Notification is sent "
"when quota is spent."
msgstr ""
"מכסה לשרת זה, נספרת מהרגע שהיא נקבעה. בבייטים, באופן אופציונלי ניתן להוסיף "
"K,M,G.<br />נבדקת כל כמה דקות. הודעה נשלחת כאשר המכסה מוצתה."
#. Server's retention time in days
#: sabnzbd/skintext.py
@@ -3718,6 +3652,11 @@ msgid ""
"used. - Disabled: no certification verification. This is not secure at all, "
"anyone could intercept your connection. "
msgstr ""
"כאשר SSL מופעל: - מחמיר: אכוף אימות אישור מלא. זוהי ההגדרה המאובטחת ביותר. -"
" בינוני: אמת שהאישור תקף ותואם לכתובת השרת, אך אפשר אישורים המוזרקים מקומית "
"(למשל על ידי חומת אש או סורק וירוסים). - מינימלי: אמת שהאישור תקף. זה לא "
"מאובטח, כל אישור תקף יכול לשמש. - מושבת: ללא אימות אישור. זה לא מאובטח כלל, "
"כל אחד יכול ליירט את החיבור שלך."
#: sabnzbd/skintext.py
msgid "Disabled"
@@ -3729,7 +3668,7 @@ msgstr "מזערי"
#: sabnzbd/skintext.py
msgid "Medium"
msgstr ""
msgstr "בינוני"
#: sabnzbd/skintext.py
msgid "Strict"
@@ -3767,6 +3706,17 @@ msgstr "עבור שרתים בלתי מהימנים, ייתקל בהתעלמות
msgid "Enable"
msgstr "אפשר"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4181,18 +4131,28 @@ msgid "Enable Apprise notifications"
msgstr "אפשר התראות Apprise"
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgstr "שלח התראות ע״י שימוש בשירות Apprise אל כמעט כל שירות התראות"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
"שלח הודעות ישירות לכל שירות הודעות שאתה משתמש בו.<br>לדוגמה: Slack, Discord,"
" Telegram או כל שירות מתוך למעלה מ-100 שירותים נתמכים!"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgstr "כתובות Apprise ברירות מחדל"
msgid "Use default Apprise URLs"
msgstr "השתמש בכתובות URL של Apprise המוגדרות כברירת מחדל"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgstr "השתמש בפסיק, ברווח או בשניהם כדי לזהות יותר מכתובת אחת."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
"Apprise מגדיר מידע על חיבור שירות באמצעות כתובות URL.<br>קרא את הוויקי של "
"Apprise כדי ללמוד כיצד להגדיר את כתובת ה-URL עבור כל שירות.<br>השתמש בפסיק "
"ו/או רווח כדי לזהות יותר מכתובת URL אחת."
#: sabnzbd/skintext.py
msgid ""
@@ -4522,6 +4482,11 @@ msgstr "מחק"
msgid "Filename"
msgstr "שם קובץ"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "גיל"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "שטח פנוי"
@@ -4655,6 +4620,8 @@ msgid ""
"Are you sure you want to delete all folders in your Temporary Download "
"Folder? This cannot be undone!"
msgstr ""
"האם אתה בטוח שברצונך למחוק את כל התיקיות בתיקיית ההורדות הזמנית שלך? לא ניתן"
" לבטל פעולה זו!"
#: sabnzbd/skintext.py
msgid "Fetch NZB from URL"
@@ -4913,7 +4880,7 @@ msgstr "התחל אשף"
#. Tooltip for disabled Next button
#: sabnzbd/skintext.py
msgid "Click on Test Server before continuing"
msgstr ""
msgstr "לחץ על בדיקת שרת לפני המשך"
#: sabnzbd/skintext.py
msgid "Restore backup"
@@ -4953,6 +4920,10 @@ msgstr "קובץ לא על השרת"
msgid "Server could not complete request"
msgstr "השרת לא היה יכול להשלים בקשה"
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "קובץ NZB ריק %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"


@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Italian (https://app.transifex.com/sabnzbd/teams/111101/it/)\n"
@@ -42,6 +42,8 @@ msgid ""
"Unable to link to OpenSSL, optimized SSL connection functions will not be "
"used."
msgstr ""
"Impossibile collegarsi a OpenSSL, le funzioni di connessione SSL ottimizzate"
" non verranno utilizzate."
#. Error message
#: SABnzbd.py
@@ -148,6 +150,11 @@ msgstr ""
"L'umask corrente (%o) potrebbe negare a SABnzbd l'accesso ai file e alle "
"cartelle che crea."
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -338,7 +345,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "L'estensione non desiderata è nel file rar %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Annullato, rilevata estensione non desiderata"
@@ -369,11 +376,11 @@ msgstr "Quota"
#: sabnzbd/bpsmeter.py
msgid "Quota limit warning (%d%%)"
msgstr ""
msgstr "Avviso limite quota (%d%%)"
#: sabnzbd/bpsmeter.py
msgid "Downloading resumed after quota reset"
msgstr ""
msgstr "Download ripreso dopo il ripristino della quota"
#: sabnzbd/cfg.py, sabnzbd/interface.py
msgid "Incorrect parameter"
@@ -597,11 +604,6 @@ msgstr "Inizializzazione di %s@%s fallita con motivo: %s"
msgid "Fatal error in Downloader"
msgstr "Errore fatale nel Downloader"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: Ricevuto codice di stato sconosciuto %s per l'articolo %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Troppe connessioni al server %s [%s]"
@@ -623,11 +625,6 @@ msgstr "Accesso fallito per il server %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Connessione a %s@%s fallita, messaggio=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Sospetto errore nel downloader"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Spegnimento in corso"
@@ -791,6 +788,11 @@ msgstr ""
"%s non è scrivibile con nomi di file con caratteri speciali. Questo può "
"causare problemi."
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr "Connessione rifiutata da:"
@@ -1024,7 +1026,7 @@ msgid "Update Available!"
msgstr "Aggiornamento disponibile!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "Caricamento del file %s fallito"
@@ -1267,6 +1269,16 @@ msgstr "Tentativo di verifica SFV"
msgid "left"
msgstr "rimanente"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: Ricevuto codice di stato sconosciuto %s per l'articolo %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Sospetto errore nel downloader"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Questo server non permette SSL su questa porta"
@@ -1452,103 +1464,18 @@ msgstr "Errore durante il caricamento di %s, rilevato file corrotto"
msgid "NZB added to queue"
msgstr "NZB aggiunto alla coda"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ignorando NZB duplicato \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "Fallimento NZB duplicato \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "NZB duplicato"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr "File NZB non valido %s, saltato (errore: %s)"
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "File NZB vuoto %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr "Lo script pre-coda ha contrassegnato il processo come fallito"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr "Estensione non desiderata nel file %s (%s)"
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Annullato, non può essere completato"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Errore durante l'importazione di %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUPLICATO"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr "ALTERNATIVO"
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "CRITTOGRAFATO"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "TROPPO GRANDE"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "INCOMPLETO"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "NON DESIDERATO"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "ATTENDI %s sec"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "PROPAGAZIONE %s min"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Scaricato in %s a una media di %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Età"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s articoli erano malformati"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s articoli erano mancanti"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s articoli avevano duplicati non corrispondenti"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Messa in pausa NZB duplicato \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problema con"
@@ -1824,6 +1751,14 @@ msgstr "Errore durante lo spegnimento del sistema"
msgid "Received a DBus exception %s"
msgstr "Ricevuta un'eccezione DBus %s"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Trovata voce RSS vuota (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Feed incompatibile"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1850,14 +1785,6 @@ msgstr "Il server %s utilizza un certificato HTTPS non attendibile"
msgid "RSS Feed %s was empty"
msgstr "Il feed RSS %s era vuoto"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Feed incompatibile"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Trovata voce RSS vuota (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Mostra interfaccia"
@@ -2442,7 +2369,7 @@ msgstr "Riprova"
#. History page button
#: sabnzbd/skintext.py
msgid "Mark as Completed & Remove Temporary Files"
msgstr ""
msgstr "Segna come completato e rimuovi i file temporanei"
#. Queue page table, script selection menu
#: sabnzbd/skintext.py
@@ -3469,8 +3396,9 @@ msgid "Enable SFV-based checks"
msgstr "Abilita controlli basati su SFV"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Esegui una verifica extra basata sui file SFV."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3800,6 +3728,9 @@ msgid ""
" follow with K,M,G.<br />Checked every few minutes. Notification is sent "
"when quota is spent."
msgstr ""
"Quota per questo server, contata dal momento in cui viene impostata. In "
"byte, opzionalmente seguito da K,M,G.<br />Controllato ogni pochi minuti. La"
" notifica viene inviata quando la quota è esaurita."
#. Server's retention time in days
#: sabnzbd/skintext.py
@@ -3888,6 +3819,17 @@ msgstr ""
msgid "Enable"
msgstr "Abilita"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4304,18 +4246,29 @@ msgid "Enable Apprise notifications"
msgstr "Abilita notifiche Apprise"
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgstr "Invia notifiche usando Apprise a quasi tutti i servizi di notifica"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
"Invia notifiche direttamente a qualsiasi servizio di notifica che "
"utilizzi.<br>Ad esempio: Slack, Discord, Telegram o qualsiasi servizio tra "
"oltre 100 servizi supportati!"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgstr "URL predefiniti di Apprise"
msgid "Use default Apprise URLs"
msgstr "Usa URL Apprise predefiniti"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgstr "Usa una virgola e/o uno spazio per identificare più di un URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
"Apprise definisce le informazioni di connessione del servizio utilizzando "
"URL.<br>Leggi il wiki di Apprise per sapere come definire l'URL per ogni "
"servizio.<br>Usa una virgola e/o uno spazio per identificare più di un URL."
#: sabnzbd/skintext.py
msgid ""
@@ -4658,6 +4611,11 @@ msgstr "Elimina"
msgid "Filename"
msgstr "Nome file"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Età"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Spazio libero"
@@ -4792,6 +4750,8 @@ msgid ""
"Are you sure you want to delete all folders in your Temporary Download "
"Folder? This cannot be undone!"
msgstr ""
"Sei sicuro di voler eliminare tutte le cartelle nella tua cartella di "
"download temporanei? Questo non può essere annullato!"
#: sabnzbd/skintext.py
msgid "Fetch NZB from URL"
@@ -5052,7 +5012,7 @@ msgstr "Avvia procedura guidata"
#. Tooltip for disabled Next button
#: sabnzbd/skintext.py
msgid "Click on Test Server before continuing"
msgstr ""
msgstr "Fai clic su Prova server prima di continuare"
#: sabnzbd/skintext.py
msgid "Restore backup"
@@ -5092,6 +5052,10 @@ msgstr "File non presente sul server"
msgid "Server could not complete request"
msgstr "Il server non ha potuto completare la richiesta"
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "File NZB vuoto %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"


@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2023\n"
"Language-Team: Norwegian Bokmål (https://app.transifex.com/sabnzbd/teams/111101/nb/)\n"
@@ -142,6 +142,11 @@ msgid ""
"creates."
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -313,7 +318,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "Uønsket forlenging finnes i rar fil %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Avbryt, uønsket forlenging oppdaget"
@@ -553,11 +558,6 @@ msgstr "Feilet å starte %s@%s grunnet: %s"
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "For mange tilkoblinger til server %s [%s]"
@@ -577,11 +577,6 @@ msgstr "Kunne ikke logge inn på server %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Kontaker %s@%s feilet, feilmelding=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Mistenker feil i nedlaster"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Starter avslutning av SABnzbd.."
@@ -739,6 +734,11 @@ msgid ""
"problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr ""
@@ -968,7 +968,7 @@ msgid "Update Available!"
msgstr "Oppdatering tilgjengelig"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr ""
@@ -1206,6 +1206,16 @@ msgstr "Prøver SFV-verifisering"
msgid "left"
msgstr "gjenstår"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Mistenker feil i nedlaster"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Denne serveren tillater ikke SSL på denne porten"
@@ -1385,103 +1395,18 @@ msgstr "Lastingsfeil %s, feilaktig fil oppdaget"
msgid "NZB added to queue"
msgstr "NZB er lagt til i køen"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ignorerer duplikatfil \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Tom NZB-fil %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Avbrutt, kan ikke fullføres"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Kunne ikke importere %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUPLIKAT"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "KRYPTERT"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "FOR STOR"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "UFULLSTENDIG"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "UØNSKET"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "VENT %s sek"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Hentet filer på %s med gjenomsnitts hastighet på %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Tid"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s artikler var korrupte"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s artikler manglet"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s artikler hadde ulike duplikater"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Stanser duplikatfil \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problem med"
@@ -1754,6 +1679,14 @@ msgstr "Feil under avslutting av systemet"
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Tom RSS post funnet (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Ukompatibel nyhetsstrøm"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1780,14 +1713,6 @@ msgstr "Server %s bruker et usikkert HTTP sertifikat"
msgid "RSS Feed %s was empty"
msgstr "RSS-kilde %s var tom"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Ukompatibel nyhetsstrøm"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Tom RSS post funnet (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Vis grensesnitt"
@@ -3346,8 +3271,9 @@ msgid "Enable SFV-based checks"
msgstr "Aktiver SFV-baserte sjekker"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Utfør ekstra verifisering basert på SFV filer"
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3741,6 +3667,17 @@ msgstr ""
msgid "Enable"
msgstr "Aktivere"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4154,17 +4091,22 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4487,6 +4429,11 @@ msgstr "Fjern"
msgid "Filename"
msgstr "Filnavn"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Tid"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Ledig plass"
@@ -4915,6 +4862,10 @@ msgstr ""
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Tom NZB-fil %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"


@@ -8,7 +8,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Dutch (https://app.transifex.com/sabnzbd/teams/111101/nl/)\n"
@@ -44,6 +44,8 @@ msgid ""
"Unable to link to OpenSSL, optimized SSL connection functions will not be "
"used."
msgstr ""
"Kan niet koppelen aan OpenSSL, geoptimaliseerde SSL-verbindingsfuncties "
"worden niet gebruikt."
#. Error message
#: SABnzbd.py
@@ -150,6 +152,11 @@ msgstr ""
"Huidige umask (%o) zou kunnen beletten dat SABnzbd toegang heeft tot de "
"aangemaakte bestanden en mappen."
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -335,7 +342,7 @@ msgstr "Ongewenste extensie ontdekt in \"%s\". Het ongewenste bestand is \"%s\"
msgid "Unwanted extension is in rar file %s"
msgstr "De ongewenste extensie zit in RAR-bestand %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Afgebroken, ongewenste extensie ontdekt"
@@ -366,11 +373,11 @@ msgstr "Quotum"
#: sabnzbd/bpsmeter.py
msgid "Quota limit warning (%d%%)"
msgstr ""
msgstr "Waarschuwing quotumlimiet (%d%%)"
#: sabnzbd/bpsmeter.py
msgid "Downloading resumed after quota reset"
msgstr ""
msgstr "Downloaden hervat na quotumreset"
#: sabnzbd/cfg.py, sabnzbd/interface.py
msgid "Incorrect parameter"
@@ -598,11 +605,6 @@ msgstr "Initialisatie van %s@%s mislukt, vanwege: %s"
msgid "Fatal error in Downloader"
msgstr "Onherstelbare fout in de Downloader"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: Onbekende statuscode %s ontvangen voor artikel %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Te veel verbindingen met server %s [%s]"
@@ -624,11 +626,6 @@ msgstr "Aanmelden bij server %s mislukt [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Verbinding %s@%s mislukt, bericht=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Vedachte fout in downloader"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Afsluiten"
@@ -794,6 +791,11 @@ msgstr ""
"Het is niet mogelijk bestanden met speciale tekens op te slaan in %s. Dit "
"geeft mogelijk problemen bij het verwerken van downloads."
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr "Verbinding geweigerd van: "
@@ -1027,7 +1029,7 @@ msgid "Update Available!"
msgstr "Update beschikbaar!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "Kon het volgende bestand niet uploaden: %s"
@@ -1270,6 +1272,16 @@ msgstr "Probeer SFV-verificatie"
msgid "left"
msgstr "over"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: Onbekende statuscode %s ontvangen voor artikel %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Vedachte fout in downloader"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "De server staat geen SSL toe op deze poort"
@@ -1455,103 +1467,18 @@ msgstr "Fout bij inladen van %s, corrupt bestand gevonden"
msgid "NZB added to queue"
msgstr "Download aan wachtrij toegevoegd"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Dubbele download \"%s\" overgeslagen"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "Download '%s' geweigerd omdat het een dubbele is"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "Dubbele download"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr "Corrupte NZB %s wordt overgeslagen (foutmelding: %s)"
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "NZB-bestand %s is leeg"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr "Wachtrij filter script heeft de download afgekeurd"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr "Ongewenste extensie gevonden in %s (%s) "
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Afgebroken, kan niet voltooid worden"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Fout bij importeren van %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUBBEL"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr "ALTERNATIEF"
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "VERSLEUTELD"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "TE GROOT"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "ONVOLLEDIG"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "ONGEWENST"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "WACHT %s sec"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "VERSPREIDINGSWACHTTIJD %s min"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Gedownload in %s met een gemiddelde snelheid van %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Leeftijd"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s artikelen zijn misvormd"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s artikelen ontbreken"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s artikelen hadden afwijkende duplicaten"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Dubbele download \"%s\" gepauzeerd"
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Probleem met"
@@ -1827,6 +1754,14 @@ msgstr "Fout bij het afsluiten van het systeem"
msgid "Received a DBus exception %s"
msgstr "DBus foutmelding %s "
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Lege RSS-feed gevonden (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Ongeschikte RSS-feed"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1853,14 +1788,6 @@ msgstr "Server %s gebruikt een onbetrouwbaar HTTPS-certificaat"
msgid "RSS Feed %s was empty"
msgstr "RSS-feed %s is leeg"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Ongeschikte RSS-feed"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Lege RSS-feed gevonden (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Toon webinterface"
@@ -2445,7 +2372,7 @@ msgstr "Opnieuw"
#. History page button
#: sabnzbd/skintext.py
msgid "Mark as Completed & Remove Temporary Files"
msgstr ""
msgstr "Markeer als voltooid en verwijder tijdelijke bestanden"
#. Queue page table, script selection menu
#: sabnzbd/skintext.py
@@ -3467,8 +3394,9 @@ msgid "Enable SFV-based checks"
msgstr "Voer SFV-gebaseerde controles uit"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Doe een extra verificatie m.b.v. SFV-bestanden"
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3802,6 +3730,9 @@ msgid ""
" follow with K,M,G.<br />Checked every few minutes. Notification is sent "
"when quota is spent."
msgstr ""
"Quotum voor deze server, geteld vanaf het moment dat het is ingesteld. In "
"bytes, optioneel gevolgd door K,M,G.<br />Wordt om de paar minuten "
"gecontroleerd. Melding wordt verzonden wanneer het quotum is opgebruikt."
#. Server's retention time in days
#: sabnzbd/skintext.py
@@ -3888,6 +3819,17 @@ msgstr ""
msgid "Enable"
msgstr "Inschakelen"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4306,19 +4248,30 @@ msgid "Enable Apprise notifications"
msgstr "Apprise-meldingen activeren"
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
"Stuur meldingen met behulp van Apprise naar bijna elke bestaande service."
"Stuur meldingen rechtstreeks naar elke meldingsservice die u "
"gebruikt.<br>Bijvoorbeeld: Slack, Discord, Telegram of elke andere service "
"uit meer dan 100 ondersteunde services!"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgstr "Standaard Apprise-URL's"
msgid "Use default Apprise URLs"
msgstr "Gebruik standaard Apprise-URL's"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgstr "Gebruik een komma en/of spatie om meer dan één URL op te geven."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
"Apprise definieert serviceverbindingsinformatie met behulp van "
"URL's.<br>Lees de Apprise-wiki om te leren hoe u de URL voor elke service "
"definieert.<br>Gebruik een komma en/of spatie om meer dan één URL te "
"identificeren."
#: sabnzbd/skintext.py
msgid ""
@@ -4656,6 +4609,11 @@ msgstr "Verwijder"
msgid "Filename"
msgstr "Bestandsnaam"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Leeftijd"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Vrije ruimte"
@@ -4789,6 +4747,8 @@ msgid ""
"Are you sure you want to delete all folders in your Temporary Download "
"Folder? This cannot be undone!"
msgstr ""
"Weet u zeker dat u alle mappen in uw tijdelijke downloadmap wilt "
"verwijderen? Dit kan niet ongedaan worden gemaakt!"
#: sabnzbd/skintext.py
msgid "Fetch NZB from URL"
@@ -5048,7 +5008,7 @@ msgstr "Wizard starten"
#. Tooltip for disabled Next button
#: sabnzbd/skintext.py
msgid "Click on Test Server before continuing"
msgstr ""
msgstr "Klik op Test server voordat u doorgaat"
#: sabnzbd/skintext.py
msgid "Restore backup"
@@ -5088,6 +5048,10 @@ msgstr "Bestand bestaat niet op de server"
msgid "Server could not complete request"
msgstr "De server kon de opdracht niet uitvoeren"
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "NZB-bestand %s is leeg"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"


@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2023\n"
"Language-Team: Polish (https://app.transifex.com/sabnzbd/teams/111101/pl/)\n"
@@ -138,6 +138,11 @@ msgid ""
"creates."
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -312,7 +317,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "Niepożądane rozszerzenie w pliku RAR %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Przerwano, wykryto niepożądane rozszerzenie"
@@ -554,11 +559,6 @@ msgstr "Błąd podczas inicjalizacji %s@%s: %s"
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Zbyt wiele połączeń do serwera %s [%s]"
@@ -578,11 +578,6 @@ msgstr "Błąd logowania do serwera %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Błąd połączenia %s@%s, komunikat=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Nieobsługiwany błąd w module pobierania"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Wyłączanie"
@@ -742,6 +737,11 @@ msgid ""
"problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr ""
@@ -971,7 +971,7 @@ msgid "Update Available!"
msgstr "Dostępna aktualizacja!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr ""
@@ -1211,6 +1211,16 @@ msgstr "Próba weryfikacji SFV"
msgid "left"
msgstr "pozostało"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Nieobsługiwany błąd w module pobierania"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Serwer nie obsługuje SSL na tym porcie"
@@ -1390,103 +1400,18 @@ msgstr "Błąd ładowania %s, wykryto uszkodzony plik"
msgid "NZB added to queue"
msgstr "NZB dodany do kolejki"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ignoruję zduplikowany NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Pusty plik NZB %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Przerwano, nie można ukończyć"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Błąd importu %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUPLIKAT"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "ZASZYFROWANY"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "ZA DUŻY"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "NIEKOMPLETNY"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "NIEPOŻĄDANY"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "CZEKAM %s s"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Pobrano w %s ze średnią %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Wiek"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s artykułów było uszkodzonych"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "Brakowało %s artykułów"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s artykułów posiadało niepasujące duplikaty"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Wstrzymuję zduplikowany NZB \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problem z"
@@ -1763,6 +1688,14 @@ msgstr "Wyłączenie systemu nie powiodło się"
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Znaleziono pusty wpis RSS (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Niekompatybilny kanał"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1789,14 +1722,6 @@ msgstr "Serwer %s używa niezaufanego certyfikatu HTTPS"
msgid "RSS Feed %s was empty"
msgstr "Kanał RSS %s był pusty"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Niekompatybilny kanał"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Znaleziono pusty wpis RSS (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Pokaż interfejs"
@@ -3357,8 +3282,9 @@ msgid "Enable SFV-based checks"
msgstr "Włącz sprawdzanie przy użyciu SFV"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Wykonuj dodatkową weryfikację na podstawie plików SFV"
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3753,6 +3679,17 @@ msgstr ""
msgid "Enable"
msgstr "Włączony"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4166,17 +4103,22 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4499,6 +4441,11 @@ msgstr "Usuń"
msgid "Filename"
msgstr "Nazwa pliku"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Wiek"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Wolne miejsce"
@@ -4925,6 +4872,10 @@ msgstr ""
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Pusty plik NZB %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"
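The `%s` placeholders visible in the msgid/msgstr pairs above are preserved verbatim by translators, which is what lets SABnzbd format the localized string identically in every language. A minimal sketch of how such gettext catalogs are consumed (using Python's stdlib `gettext`; `NullTranslations` stands in for a compiled `.mo` catalog, so the msgid is returned unchanged — a real install would load the language-specific catalog instead):

```python
import gettext

# NullTranslations is the no-catalog fallback: gettext() returns the
# msgid itself when no compiled .mo file is installed for the locale.
catalog = gettext.NullTranslations()
_ = catalog.gettext

# The %s placeholder survives translation, so callers apply the same
# formatting regardless of which language's msgstr was looked up.
message = _("Empty NZB file %s") % "broken.nzb"
print(message)  # Empty NZB file broken.nzb
```

In a deployed setup, `gettext.translation("SABnzbd", localedir, languages=["pl"])` would return the Polish catalog so that `_("Empty NZB file %s")` yields "Pusty plik NZB %s" before formatting.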

@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2023\n"
"Language-Team: Portuguese (Brazil) (https://app.transifex.com/sabnzbd/teams/111101/pt_BR/)\n"
@@ -147,6 +147,11 @@ msgstr ""
"Mascara atual (%o) pode negar ao SABnzbd acesso aos arquivos e diretórios "
"criados."
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -324,7 +329,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "A extensão indesejada está no arquivo rar %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Cancelado, extensão indesejada detectada"
@@ -568,11 +573,6 @@ msgstr "Falha ao iniciar %s@%s devido as seguintes razões: %s"
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Excesso de conexões ao servidor %s [%s]"
@@ -592,11 +592,6 @@ msgstr "Falha de logon ao servidor %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "A conexão a %s@%s falhou. Mensagem=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Erro suspeito no downloader"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Encerrando"
@@ -754,6 +749,11 @@ msgid ""
"problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr ""
@@ -983,7 +983,7 @@ msgid "Update Available!"
msgstr "Atualização Disponível!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr ""
@@ -1220,6 +1220,16 @@ msgstr "Tentando verificação SFV"
msgid "left"
msgstr "restantes"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Erro suspeito no downloader"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Este servidor não permite SSL nesta porta"
@@ -1399,103 +1409,18 @@ msgstr "Erro ao carregar %s. Arquivo corrompido detectado"
msgid "NZB added to queue"
msgstr "NZB adicionado à fila"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ignorando NZB duplicado \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Arquivo NZB %s vazio"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Cancelado, não é possível concluir"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Erro ao importar %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUPLICADO"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "CRIPTOGRAFADO"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "MUITO GRANDE"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "INCOMPLETO"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "INDESEJADO"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "Espere %s segundo(s)"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Baixado em %s a uma média de %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Idade"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s artigos estavam malformados"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s artigos estavam faltando"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s artigos tinham duplicatas não-correspondentes"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Pausando NZB duplicado \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problema com"
@@ -1773,6 +1698,14 @@ msgstr "Erro ao desligar o sistema"
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Entrada RSS vazia encontrada (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Feed incompatível"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1800,14 +1733,6 @@ msgstr "Servidor %s usa um certificado HTTPS não confiável"
msgid "RSS Feed %s was empty"
msgstr "O feed RSS %s estava vazio"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Feed incompatível"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Entrada RSS vazia encontrada (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Exibir interface"
@@ -3367,8 +3292,9 @@ msgid "Enable SFV-based checks"
msgstr "Habilitar verificações baseadas em SFV"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Fazer uma verificação extra baseada em arquivos SFV."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3764,6 +3690,17 @@ msgstr ""
msgid "Enable"
msgstr "Habilitar"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4177,17 +4114,22 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4510,6 +4452,11 @@ msgstr "Eliminar"
msgid "Filename"
msgstr "Nome do arquivo"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Idade"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Espaço Disponível"
@@ -4936,6 +4883,10 @@ msgstr ""
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Arquivo NZB %s vazio"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"

@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2023\n"
"Language-Team: Romanian (https://app.transifex.com/sabnzbd/teams/111101/ro/)\n"
@@ -147,6 +147,11 @@ msgid ""
"creates."
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -326,7 +331,7 @@ msgstr "Extensie nedorită în fișierul RAR al „%s”. Fișierul nedorit este
msgid "Unwanted extension is in rar file %s"
msgstr "Extensii fișier nedorite în fișierul rar %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Oprit, extensii nedorite detectate"
@@ -576,11 +581,6 @@ msgstr "Nu am putu inițializa %s@%s din cauza următorului motiv: %s"
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Prea multe conexiuni la serverul %s [%s]"
@@ -600,11 +600,6 @@ msgstr "Autentificare nereuşită la serverul %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Conectare %s@%s eșuată, mesaj=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Eroare suspectă în sistemul de descprcare"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Închidere"
@@ -762,6 +757,11 @@ msgid ""
"problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr ""
@@ -993,7 +993,7 @@ msgid "Update Available!"
msgstr "Actualizare Disponibilă!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "Eșuare la încărcarea fișierului: %s"
@@ -1236,6 +1236,16 @@ msgstr "Încerc verificare SFV"
msgid "left"
msgstr "rămas"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Eroare suspectă în sistemul de descprcare"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Acest server nu permite SSL pe acest port"
@@ -1417,103 +1427,18 @@ msgstr "Eroare încărcare %s, fişier corupt detectat"
msgid "NZB added to queue"
msgstr "NZB adăugat în coadă"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ignorăm duplicat NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "Eșuare duplicat NZB „%s”"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "NZB duplicat"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Fişier NZB gol %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr "Scriptul pre-coadă a marcat sarcina ca nereușită"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr "Extensie nedorită în fișierul %s (%s)"
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Anulat nu poate fi finalizat"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Eroare importare %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUPLICAT"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "ENCRIPTAT"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "PREA MARE"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "INCOMPLET"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "NEDORIT"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "AŞTEAPTĂ %s sec"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "SE PROPAGHEAZĂ %s min"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Descărcat în %s cu o medie de %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Vârsta"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s articolele au fost incorecte"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s articolele au fost lipsă"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s articolele au avut duplicate diferite"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Întrerupem duplicat NZB \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problemă cu"
@@ -1792,6 +1717,14 @@ msgstr "Eroare la oprirea sistemului"
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Valoare RSS gasită a fost goală (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Fulx RSS incompatibil"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1818,14 +1751,6 @@ msgstr "Serverul %s utilizează un certificat HTTPS nesigur"
msgid "RSS Feed %s was empty"
msgstr "Fluxul RSS %s a fost gol"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Fulx RSS incompatibil"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Valoare RSS gasită a fost goală (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Arată interfața"
@@ -3385,8 +3310,9 @@ msgid "Enable SFV-based checks"
msgstr "Activează verficări SFV"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Fă o verificare extra bazată pe fişiere SFV"
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3785,6 +3711,17 @@ msgstr ""
msgid "Enable"
msgstr "Activează"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4198,17 +4135,22 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4530,6 +4472,11 @@ msgstr "Şterge"
msgid "Filename"
msgstr "Nume de fișier"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Vârsta"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Spațiu liber"
@@ -4958,6 +4905,10 @@ msgstr "Fișierul nu este pe server"
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Fişier NZB gol %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"

@@ -3,12 +3,13 @@
#
# Translators:
# Safihre <safihre@sabnzbd.org>, 2023
# ST02, 2026
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2023\n"
"Last-Translator: ST02, 2026\n"
"Language-Team: Russian (https://app.transifex.com/sabnzbd/teams/111101/ru/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -24,7 +25,7 @@ msgstr "Предупреждение"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr ""
msgstr "Ошибка"
#. Error message
#: SABnzbd.py
@@ -88,7 +89,7 @@ msgstr ""
#. Error message
#: SABnzbd.py
msgid "HTTP and HTTPS ports cannot be the same"
msgstr ""
msgstr "HTTP и HTTPS порты не могут быть одинаковыми"
#. Warning message
#: SABnzbd.py
@@ -103,12 +104,12 @@ msgstr "HTTPS отключён, поскольку отсутствуют фай
#. Warning message
#: SABnzbd.py
msgid "Disabled HTTPS because of invalid CERT and KEY files"
msgstr ""
msgstr "HTTPS отключён, поскольку файлы CERT и KEY недействительны"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface: "
msgstr ""
msgstr "Не удалось запустить веб-интерфейс:"
#: SABnzbd.py
msgid "SABnzbd %s started"
@@ -142,6 +143,11 @@ msgid ""
"creates."
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -301,7 +307,7 @@ msgstr ""
#: sabnzbd/assembler.py
msgid "Aborted, encryption detected"
msgstr ""
msgstr "Прервано, обнаружено шифрование"
#. Warning message
#: sabnzbd/assembler.py
@@ -312,9 +318,9 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr ""
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr ""
msgstr "Прервано, обнаружено нежелательное расширение"
#. Warning message
#: sabnzbd/assembler.py
@@ -343,7 +349,7 @@ msgstr ""
#: sabnzbd/bpsmeter.py
msgid "Downloading resumed after quota reset"
msgstr ""
msgstr "Загрузка возобновилась после сброса квоты"
#: sabnzbd/cfg.py, sabnzbd/interface.py
msgid "Incorrect parameter"
@@ -511,7 +517,7 @@ msgstr "Не удаётся прочитать наблюдаемую папку
#: sabnzbd/downloader.py
msgid "Resuming"
msgstr ""
msgstr "Возобновление"
#. PP status - Priority pick list
#: sabnzbd/downloader.py, sabnzbd/macosmenu.py, sabnzbd/sabtray.py,
@@ -552,11 +558,6 @@ msgstr ""
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr ""
@@ -576,11 +577,6 @@ msgstr "Ошибка входа на сервер %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr ""
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr ""
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Завершение работы"
@@ -738,6 +734,11 @@ msgid ""
"problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr ""
@@ -967,7 +968,7 @@ msgid "Update Available!"
msgstr "Доступно обновление!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr ""
@@ -1206,6 +1207,16 @@ msgstr "Проверка SFV-суммы"
msgid "left"
msgstr "осталось"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr ""
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr ""
@@ -1385,103 +1396,18 @@ msgstr "Ошибка загрузки %s: обнаружен повреждён
msgid "NZB added to queue"
msgstr "NZB-файл добавлен в очередь"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Пропущен повторяющийся NZB-файл «%s»"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Пустой NZB-файл %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr ""
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Ошибка импорта %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "ПОВТОР"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "ЗАШИФРОВАН"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "СЛИШКОМ БОЛЬШОЙ"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "НЕПОЛНЫЙ"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "ОЖИДАНИЕ %s с"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Загружено за %s со средней скоростью %sБ/с"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Возраст"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s статей с ошибками"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s статей отсутствует"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s статей содержат несовпадающие повторы"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Приостановлен повторяющийся NZB-файл «%s»"
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Проблема с"
@@ -1756,6 +1682,14 @@ msgstr "Не удалось завершить работу системы"
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Обнаружена пустая запись RSS (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Несовместимая лента"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1782,14 +1716,6 @@ msgstr ""
msgid "RSS Feed %s was empty"
msgstr "RSS-лента %s была пустой"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Несовместимая лента"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Обнаружена пустая запись RSS (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Показать интерфейс"
@@ -3349,8 +3275,9 @@ msgid "Enable SFV-based checks"
msgstr "Использовать проверку по SFV"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Выполнять дополнительную проверку по SFV-файлам."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3742,6 +3669,17 @@ msgstr ""
msgid "Enable"
msgstr "Включить"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4162,17 +4100,22 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4494,6 +4437,11 @@ msgstr "Удалить"
msgid "Filename"
msgstr "Название файла"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Возраст"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "свободно на диске"
@@ -4921,6 +4869,10 @@ msgstr ""
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Пустой NZB-файл %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2023\n"
"Language-Team: Serbian (https://app.transifex.com/sabnzbd/teams/111101/sr/)\n"
@@ -140,6 +140,11 @@ msgid ""
"creates."
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -309,7 +314,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "Neželjena ekstenzija je u rar datoteci %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Prekinuto, detektovana neželjena ekstenzija"
@@ -550,11 +555,6 @@ msgstr "Neuspešna inicijalizacija %s@%s iz razloga: %s"
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Previše konekcija ka serveru %s [%s]"
@@ -574,11 +574,6 @@ msgstr "Неуспешно пријављивање на сервер %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Povezivanje na %s@%s neuspešno, poruka=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Sumnja u grešku u programu za download"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Гашење"
@@ -736,6 +731,11 @@ msgid ""
"problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr ""
@@ -963,7 +963,7 @@ msgid "Update Available!"
msgstr "Нова верзија доступна!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr ""
@@ -1201,6 +1201,16 @@ msgstr "Pokušaj SFV provere"
msgid "left"
msgstr "остало"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Sumnja u grešku u programu za download"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Ovaj server ne dozvoljava SSL na ovom portu"
@@ -1380,103 +1390,18 @@ msgstr "Грешка учитавање %s, покварена датотека
msgid "NZB added to queue"
msgstr "NZB додат у ред"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Игнорисање дуплог NZB-а \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Празан NZB %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Поништено, не може да се заврши"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Грешка увоза %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "ДУПЛИКАТ"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "ШИФРИРАНО"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "ПРЕВЕЛИКО"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "НЕПОТПУНО"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "NEŽELJENI"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "Чекање %s сек"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Преузето за %s на просек од %sБ/с"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Старост"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s артикла нису добро формирани"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s артикла недостају"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s артикла нису дупликате"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Паузирам због дуплог NZB-а \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Проблем са"
@@ -1749,6 +1674,14 @@ msgstr "Greška pri gašenju sistema"
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Nađen prazan RSS unos (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Некомпатибилан Фид"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1775,14 +1708,6 @@ msgstr "Server %s koristi nepouzdan HTTPS sertifikat"
msgid "RSS Feed %s was empty"
msgstr "RSS фид %s је празан"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Некомпатибилан Фид"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Nađen prazan RSS unos (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Pokaži interfejs"
@@ -3335,8 +3260,9 @@ msgid "Enable SFV-based checks"
msgstr "Упали SFV провере"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Уради још једну проверу базирану на SFV датотеке."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3728,6 +3654,17 @@ msgstr ""
msgid "Enable"
msgstr "Омогући"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4140,17 +4077,22 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4472,6 +4414,11 @@ msgstr "Обриши"
msgid "Filename"
msgstr "Име датотеке"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Старост"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Слободан простор"
@@ -4898,6 +4845,10 @@ msgstr ""
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Празан NZB %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2023\n"
"Language-Team: Swedish (https://app.transifex.com/sabnzbd/teams/111101/sv/)\n"
@@ -140,6 +140,11 @@ msgid ""
"creates."
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr ""
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -309,7 +314,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "Oönskad filändelse i RAR-fil %s"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "Avbruten, oönskad filändelse detekterad"
@@ -550,11 +555,6 @@ msgstr "Misslyckades att initiera %s@%s med orsak %s"
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "För många anslutningar till servern %s [%s]"
@@ -574,11 +574,6 @@ msgstr "Det gick inte att logga in på server %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Anslutning %s@%s misslyckades, meddelande=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Misstänker fel i nedladdare"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Påbörjar nedstängning av SABnzbd.."
@@ -736,6 +731,11 @@ msgid ""
"problems."
msgstr ""
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr ""
@@ -965,7 +965,7 @@ msgid "Update Available!"
msgstr "Uppdatering tillgänglig"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr ""
@@ -1205,6 +1205,16 @@ msgstr "Försöker verifiera SFV"
msgid "left"
msgstr "kvar"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Misstänker fel i nedladdare"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Den här servern tillåter in SSL på denna port"
@@ -1384,103 +1394,18 @@ msgstr "Laddningsfel %s, felaktig fil detekterad"
msgid "NZB added to queue"
msgstr "NZB tillagd i kön"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Ignorerar dubblett för NZB \"%s\""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr ""
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "NZB filen %s är tom"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr ""
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "Avbrutet, kan inte slutföras"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "Det gick inte att importera %s"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "DUBLETT"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "KRYPTERAT"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "FÖR STOR"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "INKOMPLETT"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "OÖNSKAD"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "VÄNTA %s SEKUNDER"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr ""
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "Hämtade i %s vid ett genomsnitt på %sB/s"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Ålder"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s artiklar var felaktiga"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s artiklar saknades"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s artiklar hade icke-matchande dubletter"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Pausar dubblett för NZB \"%s\""
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Problem med"
@@ -1755,6 +1680,14 @@ msgstr "Fel uppstod då systemet skulle stängas"
msgid "Received a DBus exception %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Tom RSS post hittades (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Inkompatibel feed"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1781,14 +1714,6 @@ msgstr "Server %s använder ett otillförlitlig HTTPS-certifikat"
msgid "RSS Feed %s was empty"
msgstr "RSS-flödet %s var tomt"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Inkompatibel feed"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Tom RSS post hittades (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Visa gränssnitt"
@@ -3345,8 +3270,9 @@ msgid "Enable SFV-based checks"
msgstr "Använd SFV-baserade kontroller"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Gör en extra kontroll med SFV filer"
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3740,6 +3666,17 @@ msgstr ""
msgid "Enable"
msgstr "Aktivera"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr ""
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4153,17 +4090,22 @@ msgid "Enable Apprise notifications"
msgstr ""
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgid "Use default Apprise URLs"
msgstr ""
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
#: sabnzbd/skintext.py
@@ -4485,6 +4427,11 @@ msgstr "Ta bort"
msgid "Filename"
msgstr "Filnamn"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Ålder"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Ledigt diskutrymme"
@@ -4912,6 +4859,10 @@ msgstr ""
msgid "Server could not complete request"
msgstr ""
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "NZB filen %s är tom"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"

View File

@@ -3,13 +3,14 @@
#
# Translators:
# Taylan Tatlı, 2025
# mauron, 2025
# Safihre <safihre@sabnzbd.org>, 2025
# mauron, 2026
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: mauron, 2025\n"
"Last-Translator: mauron, 2026\n"
"Language-Team: Turkish (https://app.transifex.com/sabnzbd/teams/111101/tr/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -150,6 +151,11 @@ msgstr ""
"Güncel umask (%o), SABnzbd'nin oluşturduğu dosya ve dizinlere erişimini "
"reddedebilir."
#. Warning message
#: sabnzbd/__init__.py
msgid "Windows ARM version of SABnzbd is available from our Downloads page!"
msgstr "SABnzbd'nin Windows ARM sürümü İndirmeler sayfamızda mevcuttur!"
#. Warning message
#: sabnzbd/__init__.py
msgid ""
@@ -341,7 +347,7 @@ msgstr ""
msgid "Unwanted extension is in rar file %s"
msgstr "İstenmeyen uzantı %s rar dosyasındadır"
#: sabnzbd/assembler.py, sabnzbd/nzbstuff.py
#: sabnzbd/assembler.py
msgid "Aborted, unwanted extension detected"
msgstr "İptal edildi, istenmeyen uzantı tespit edildi"
@@ -596,11 +602,6 @@ msgstr "%s@%s başlatması şu sebepten dolayı başarısız oldu: %s"
msgid "Fatal error in Downloader"
msgstr "İndirici'de ölümcül hata"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: bilinmeyen durum kodu %s, şu makale için alındı: %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "%s [%s] sunucusuna çok fazla bağlantı"
@@ -622,11 +623,6 @@ msgstr "%s [%s] sunucusunda oturum açılışı başarısız oldu"
msgid "Connecting %s@%s failed, message=%s"
msgstr "%s@%s bağlantısı başarısız oldu, mesaj=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "İndiricide şüpheli hata"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Kapatılıyor"
@@ -786,6 +782,13 @@ msgid ""
msgstr ""
"%s özel karakterli dosya isimleri ile yazılamıyor. Bu, sorun oluşturabilir."
#. Warning message
#: sabnzbd/filesystem.py
msgid "%s does not support sparse files. Disabling direct write mode."
msgstr ""
"%s aralıklı dosyaları desteklememektedir. Doğrudan yazma kipi devre dışı "
"bırakılıyor."
#: sabnzbd/interface.py
msgid "Refused connection from:"
msgstr "Şuradan bağlantı reddedildi:"
@@ -1017,7 +1020,7 @@ msgid "Update Available!"
msgstr "Güncelleme Mevcut!"
#. Error message
#: sabnzbd/misc.py
#: sabnzbd/misc.py, sabnzbd/skintext.py
msgid "Failed to upload file: %s"
msgstr "Dosyanın gönderilmesi başarısız oldu: %s"
@@ -1259,6 +1262,16 @@ msgstr "SFV doğrulaması deneniyor"
msgid "left"
msgstr "kaldı"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: bilinmeyen durum kodu %s, şu makale için alındı: %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "İndiricide şüpheli hata"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Bu sunucu, bu bağlantı noktasında SSL kullanımına izin vermiyor"
@@ -1444,103 +1457,18 @@ msgstr "%s yüklenirken hata, bozuk dosya tespit edildi"
msgid "NZB added to queue"
msgstr "NZB kuyruğa ilave edildi"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Ignoring duplicate NZB \"%s\""
msgstr "Yinelenmiş NZB \"%s\" dikkate alınmıyor"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Failing duplicate NZB \"%s\""
msgstr "\"%s\" NSB dosyasının yinelenmesi başarısız"
#: sabnzbd/nzbqueue.py, sabnzbd/nzbstuff.py
#: sabnzbd/nzbqueue.py
msgid "Duplicate NZB"
msgstr "Yinelenmiş NZB"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Invalid NZB file %s, skipping (error: %s)"
msgstr "Geçersiz NZB dosyası %s, atlanıyor (hata: %s)"
#. Warning message
#: sabnzbd/nzbstuff.py, sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Boş NZB dosyası %s"
#: sabnzbd/nzbstuff.py
msgid "Pre-queue script marked job as failed"
msgstr "Kuyruk öncesi betiği işi başarısız oldu olarak işaretlemiş"
#. Warning message
#: sabnzbd/nzbstuff.py
msgid "Unwanted Extension in file %s (%s)"
msgstr "%s (%s) dosyasında İstenmeyen Uzantı"
#: sabnzbd/nzbstuff.py
msgid "Aborted, cannot be completed"
msgstr "İptal edildi, tamamlanamıyor"
#. Error message
#: sabnzbd/nzbstuff.py
msgid "Error importing %s"
msgstr "%s unsurunun içe aktarılmasında hata"
#: sabnzbd/nzbstuff.py
msgid "DUPLICATE"
msgstr "YİNELENMİŞ"
#: sabnzbd/nzbstuff.py
msgid "ALTERNATIVE"
msgstr "ALTERNATİF"
#: sabnzbd/nzbstuff.py
msgid "ENCRYPTED"
msgstr "ŞİFRELENMİŞ"
#: sabnzbd/nzbstuff.py
msgid "TOO LARGE"
msgstr "ÇOK BÜYÜK"
#: sabnzbd/nzbstuff.py
msgid "INCOMPLETE"
msgstr "TAMAMLANMAMIŞ"
#: sabnzbd/nzbstuff.py
msgid "UNWANTED"
msgstr "İSTENMEYEN"
#: sabnzbd/nzbstuff.py
msgid "WAIT %s sec"
msgstr "%s saniye BEKLEYİN"
#: sabnzbd/nzbstuff.py
msgid "PROPAGATING %s min"
msgstr "YAYINLANIYOR %s dakika"
#: sabnzbd/nzbstuff.py
msgid "Downloaded in %s at an average of %sB/s"
msgstr "%s içinde ortalama %sB/s hızında indirildi"
#. Job details page, file age column header
#: sabnzbd/nzbstuff.py, sabnzbd/skintext.py
msgid "Age"
msgstr "Yaş"
#: sabnzbd/nzbstuff.py
msgid "%s articles were malformed"
msgstr "%s makale yanlış şekillendirilmişti"
#: sabnzbd/nzbstuff.py
msgid "%s articles were missing"
msgstr "%s makale eksikti"
#: sabnzbd/nzbstuff.py
msgid "%s articles had non-matching duplicates"
msgstr "%s makale eşleşmeyen yinelenmişler bulunduruyordu"
#: sabnzbd/nzbstuff.py
msgid "Pausing duplicate NZB \"%s\""
msgstr "Yinelenmiş NZB \"%s\" duraklatılıyor"
#: sabnzbd/panic.py
msgid "Problem with"
msgstr "Şununla sorun"
@@ -1817,6 +1745,14 @@ msgstr "Sistemin kapatılması esnasında hata"
msgid "Received a DBus exception %s"
msgstr "Bir DBUS istisnası alındı %s"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Boş RSS girdisi bulundu (%s)"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Uyumsuz besleme"
#. Error message
#: sabnzbd/rss.py
msgid "Incorrect RSS feed description \"%s\""
@@ -1843,14 +1779,6 @@ msgstr "%s sunucusu güvenilmez bir HTTPS sertifikası kullanıyor"
msgid "RSS Feed %s was empty"
msgstr "%s RSS Beselemesi boştu"
#: sabnzbd/rss.py
msgid "Incompatible feed"
msgstr "Uyumsuz besleme"
#: sabnzbd/rss.py
msgid "Empty RSS entry found (%s)"
msgstr "Boş RSS girdisi bulundu (%s)"
#: sabnzbd/sabtray.py, sabnzbd/sabtraylinux.py
msgid "Show interface"
msgstr "Arayüzü göster"
@@ -3457,8 +3385,11 @@ msgid "Enable SFV-based checks"
msgstr "SFV temelli kontrolleri etkinleştir"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "SFV dosyalarına dayalı ilave bir doğrulama yap."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
"Eğer hiçbir par2 dosyası mevcut değilse, dosyaları kontrol etmek için "
"(mevcutsa) sfv dosyalarını kullan"
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"
@@ -3879,6 +3810,20 @@ msgstr ""
msgid "Enable"
msgstr "Etkinleştir"
#: sabnzbd/skintext.py
msgid "Articles per request"
msgstr "Talep başı makale"
#: sabnzbd/skintext.py
msgid ""
"Request multiple articles per connection without waiting for each response "
"first.<br />This can improve download speeds, especially on connections with"
" higher latency."
msgstr ""
"Her bir cevabı beklemeden bağlantı başına birden fazla makale talep et.<br "
"/>Bu, indirme hızlarını bilhassa yüksek gecikmeli bağlantılarda "
"arttırabilir."
#. Button: Remove server
#: sabnzbd/skintext.py
msgid "Remove Server"
@@ -4295,20 +4240,29 @@ msgid "Enable Apprise notifications"
msgstr "Apprise bildirimlerini etkinleştir"
#: sabnzbd/skintext.py
msgid "Send notifications using Apprise to almost any notification service"
msgid ""
"Send notifications directly to any notification service you use.<br>For "
"example: Slack, Discord, Telegram, or any service from over 100 supported "
"services!"
msgstr ""
"Apprise kullanarak neredeyse tüm bildirim hizmetlerine bildirim gönderin"
"Bildirimleri kullandığınız herhangi bir bildirim hizmetine doğrudan "
"gönderin.<br>Örneğin: Slack, Discord, Telegram veya 100'den fazla "
"desteklenen hizmetten herhangi biri!"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Default Apprise URLs"
msgstr "Varsayılan Apprise URL'leri"
msgid "Use default Apprise URLs"
msgstr "Varsayılan Apprise URL'lerini kullan"
#. Apprise settings
#: sabnzbd/skintext.py
msgid "Use a comma and/or space to identify more than one URL."
msgid ""
"Apprise defines service connection information using URLs.<br>Read the "
"Apprise wiki how to define the URL for each service.<br>Use a comma and/or "
"space to identify more than one URL."
msgstr ""
"Birden fazla URL (adres) tanımlamak için virgül ve/veya boşluk kullanın."
"Apprise, hizmet bağlantı bilgilerini URL'ler kullanarak tanımlar.<br>Her "
"hizmet için URL'nin nasıl tanımlanacağını öğrenmek için Apprise wiki'sini "
"okuyun.<br>Birden fazla URL tanımlamak için virgül ve/veya boşluk kullanın."
#: sabnzbd/skintext.py
msgid ""
@@ -4650,6 +4604,11 @@ msgstr "Sil"
msgid "Filename"
msgstr "Dosya ismi"
#. Job details page, file age column header
#: sabnzbd/skintext.py
msgid "Age"
msgstr "Yaş"
#: sabnzbd/skintext.py
msgid "Free Space"
msgstr "Boş alan"
@@ -5089,6 +5048,10 @@ msgstr "Dosya sunucuda yok"
msgid "Server could not complete request"
msgstr "Sunucu talebi tamamlayamadı"
#: sabnzbd/urlgrabber.py
msgid "Empty NZB file %s"
msgstr "Boş NZB dosyası %s"
#. Error message
#: sabnzbd/urlgrabber.py
msgid "URLGRABBER CRASHED"

View File

File diff suppressed because it is too large

View File

@@ -4,7 +4,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: team@sabnzbd.org\n"
"Language-Team: SABnzbd <team@sabnzbd.org>\n"

View File

@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Czech (https://app.transifex.com/sabnzbd/teams/111101/cs/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Danish (https://app.transifex.com/sabnzbd/teams/111101/da/)\n"

View File

@@ -8,7 +8,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: German (https://app.transifex.com/sabnzbd/teams/111101/de/)\n"

View File

@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Spanish (https://app.transifex.com/sabnzbd/teams/111101/es/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Finnish (https://app.transifex.com/sabnzbd/teams/111101/fi/)\n"

View File

@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: French (https://app.transifex.com/sabnzbd/teams/111101/fr/)\n"

View File

@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Hebrew (https://app.transifex.com/sabnzbd/teams/111101/he/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Italian (https://app.transifex.com/sabnzbd/teams/111101/it/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Norwegian Bokmål (https://app.transifex.com/sabnzbd/teams/111101/nb/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Dutch (https://app.transifex.com/sabnzbd/teams/111101/nl/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Polish (https://app.transifex.com/sabnzbd/teams/111101/pl/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Portuguese (Brazil) (https://app.transifex.com/sabnzbd/teams/111101/pt_BR/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Romanian (https://app.transifex.com/sabnzbd/teams/111101/ro/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Russian (https://app.transifex.com/sabnzbd/teams/111101/ru/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Serbian (https://app.transifex.com/sabnzbd/teams/111101/sr/)\n"

View File

@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Swedish (https://app.transifex.com/sabnzbd/teams/111101/sv/)\n"

View File

@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Turkish (https://app.transifex.com/sabnzbd/teams/111101/tr/)\n"

View File

@@ -6,7 +6,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-4.6.0\n"
"Project-Id-Version: SABnzbd-5.0.0\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2025\n"
"Language-Team: Chinese (China) (https://app.transifex.com/sabnzbd/teams/111101/zh_CN/)\n"

View File

@@ -1,16 +1,17 @@
# Main requirements
# Note that not all sub-dependencies are listed, but only ones we know could cause trouble
apprise==1.9.5
sabctools==8.2.6
CT3==3.4.0
apprise==1.9.7
sabctools==9.3.1
CT3==3.4.0.post5
cffi==2.0.0
pycparser==2.23
pycparser # Version-less for Python 3.9 and below
pycparser==3.0; python_version > '3.9'
feedparser==6.0.12
configobj==5.0.9
cheroot==11.0.0
cheroot==11.1.2
six==1.17.0
cherrypy==18.10.0
jaraco.functools==4.3.0
jaraco.functools==4.4.0
jaraco.collections==5.0.0
jaraco.text==3.8.1 # Newer version introduces irrelevant extra dependencies
jaraco.classes==3.4.0
@@ -37,7 +38,7 @@ cryptography==46.0.3
# We recommend using "orjson" as it is 2x as fast as "ujson". However, it requires
# Rust so SABnzbd works just as well with "ujson" or the Python built in "json" module
ujson==5.11.0
orjson==3.11.3
orjson==3.11.5
# Windows system integration
pywin32==311; sys_platform == 'win32'
@@ -50,8 +51,8 @@ winrt-Windows.UI.Notifications==3.2.1; sys_platform == 'win32'
typing_extensions==4.15.0; sys_platform == 'win32'
# macOS system calls
pyobjc-core==12.0; sys_platform == 'darwin'
pyobjc-framework-Cocoa==12.0; sys_platform == 'darwin'
pyobjc-core==12.1; sys_platform == 'darwin'
pyobjc-framework-Cocoa==12.1; sys_platform == 'darwin'
# Linux notifications
notify2==0.3.1; sys_platform != 'win32' and sys_platform != 'darwin'
@@ -60,14 +61,15 @@ notify2==0.3.1; sys_platform != 'win32' and sys_platform != 'darwin'
requests==2.32.5
requests-oauthlib==2.0.0
PyYAML==6.0.3
markdown==3.9
markdown # Version-less for Python 3.9 and below
markdown==3.10.1; python_version > '3.9'
paho-mqtt==1.6.1 # Pinned, newer versions don't work with AppRise yet
# Requests Requirements
charset_normalizer==3.4.4
idna==3.11
urllib3==2.5.0
certifi==2025.10.5
urllib3==2.6.3
certifi==2026.1.4
oauthlib==3.3.1
PyJWT==2.10.1
blinker==1.9.0
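The pins above rely on PEP 508 environment markers (`python_version > '3.9'`, `sys_platform == 'win32'`) so one requirements file can serve several interpreters and platforms. A detail worth noting: version comparison is numeric per component, not lexical — as a string, `"3.10" > "3.9"` is False, which is exactly the trap the markers avoid. A minimal sketch of the two marker forms used here (not pip's full marker grammar):

```python
import sys

def python_version_exceeds(bound: str) -> bool:
    """Sketch of the `python_version > 'X.Y'` marker: compare version
    components numerically, since a plain string compare would wrongly
    rank "3.10" below "3.9"."""
    return sys.version_info[:2] > tuple(int(part) for part in bound.split("."))

def sys_platform_is(name: str) -> bool:
    """Sketch of the `sys_platform == '...'` marker."""
    return sys.platform == name

# e.g. pip would select pycparser==3.0 only in environments where this is True,
# and fall back to the version-less pin elsewhere
print(python_version_exceeds("3.9"))
```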

View File

@@ -32,11 +32,12 @@ from threading import Lock, Condition
# Determine platform flags
##############################################################################
WINDOWS = MACOS = MACOSARM64 = FOUNDATION = False
WINDOWS = WINDOWSARM64 = MACOS = MACOSARM64 = FOUNDATION = False
KERNEL32 = LIBC = MACOSLIBC = PLATFORM = None
if os.name == "nt":
    WINDOWS = True
    WINDOWSARM64 = platform.uname().machine == "ARM64"
    if platform.uname().machine not in ["AMD64", "ARM64"]:
        print("SABnzbd only supports 64-bit Windows")
@@ -82,15 +83,15 @@ from sabnzbd.version import __version__, __baseline__
import sabnzbd.misc as misc
import sabnzbd.filesystem as filesystem
import sabnzbd.powersup as powersup
import sabnzbd.rss as rss
import sabnzbd.emailer as emailer
import sabnzbd.encoding as encoding
import sabnzbd.config as config
import sabnzbd.cfg as cfg
import sabnzbd.database
import sabnzbd.lang as lang
import sabnzbd.nzb
import sabnzbd.nzbparser as nzbparser
import sabnzbd.nzbstuff
import sabnzbd.rss as rss
import sabnzbd.emailer as emailer
import sabnzbd.getipaddress
import sabnzbd.newsunpack
import sabnzbd.par2file
@@ -248,6 +249,7 @@ def initialize(pause_downloader=False, clean_up=False, repair=0):
    # Set call backs for Config items
    cfg.cache_limit.callback(cfg.new_limit)
    cfg.direct_write.callback(cfg.new_direct_write)
    cfg.web_host.callback(cfg.guard_restart)
    cfg.web_port.callback(cfg.guard_restart)
    cfg.web_dir.callback(cfg.guard_restart)
@@ -302,6 +304,7 @@ def initialize(pause_downloader=False, clean_up=False, repair=0):
    sabnzbd.NzbQueue.read_queue(repair)
    sabnzbd.Scheduler.analyse(pause_downloader)
    sabnzbd.ArticleCache.new_limit(cfg.cache_limit.get_int())
    sabnzbd.Assembler.new_limit(sabnzbd.ArticleCache.cache_info().cache_limit)
    logging.info("All processes started")
    sabnzbd.RESTART_REQ = False
@@ -314,6 +317,9 @@ def start():
    logging.debug("Starting postprocessor")
    sabnzbd.PostProcessor.start()
    logging.debug("Starting article cache")
    sabnzbd.ArticleCache.start()
    logging.debug("Starting assembler")
    sabnzbd.Assembler.start()
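The hunk above adds the article cache to the same worker lifecycle the other components use: `start()` the thread, later `stop()` it, then `join(timeout=...)` so shutdown cannot hang forever (as in the `halt()` changes below). A standalone sketch of that pattern with a hypothetical `Worker` class — the event-gated loop is an assumption, not SABnzbd's actual cache implementation:

```python
import threading

class Worker(threading.Thread):
    """Hypothetical worker illustrating the start()/stop()/join(timeout)
    lifecycle: the run loop polls a stop Event so stop() can end it cleanly."""

    def __init__(self):
        super().__init__(daemon=True)
        self._stop_event = threading.Event()

    def run(self):
        # wait() returns False on timeout (keep working) and True once
        # the event is set (exit the loop)
        while not self._stop_event.wait(timeout=0.01):
            pass  # one unit of work per iteration would go here

    def stop(self):
        self._stop_event.set()

worker = Worker()
worker.start()
worker.stop()
worker.join(timeout=3)  # bounded wait, mirroring the halt() code
```

Using a bounded `join` keeps shutdown responsive even if a worker misbehaves, at the cost of possibly leaving a daemon thread behind.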
@@ -382,6 +388,13 @@ def halt():
    except Exception:
        pass
    logging.debug("Stopping article cache")
    sabnzbd.ArticleCache.stop()
    try:
        sabnzbd.ArticleCache.join(timeout=3)
    except Exception:
        pass
    logging.debug("Stopping postprocessor")
    sabnzbd.PostProcessor.stop()
    try:
@@ -481,6 +494,10 @@ def delayed_startup_actions():
        sabnzbd.ORG_UMASK,
    )
    # Check if maybe we are running x64 version on ARM hardware
    if sabnzbd.WINDOWSARM64 and "AMD64" in sys.version:
        misc.helpful_warning(T("Windows ARM version of SABnzbd is available from our Downloads page!"))
    # List the number of certificates available (can take up to 1.5 seconds)
    if cfg.log_level() > 1:
        logging.debug("Available certificates = %s", repr(ssl.create_default_context().cert_store_stats()))
@@ -495,7 +512,7 @@ def delayed_startup_actions():
    logging.debug("Completed Download Folder %s is not on FAT", complete_dir)
    if filesystem.directory_is_writable(sabnzbd.cfg.download_dir.get_path()):
        filesystem.check_filesystem_capabilities(sabnzbd.cfg.download_dir.get_path())
        filesystem.check_filesystem_capabilities(sabnzbd.cfg.download_dir.get_path(), is_download_dir=True)
    if filesystem.directory_is_writable(sabnzbd.cfg.complete_dir.get_path()):
        filesystem.check_filesystem_capabilities(sabnzbd.cfg.complete_dir.get_path())
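The diff above introduces a `WINDOWSARM64` platform flag and warns when an x64 build is running (emulated) on Windows ARM. A standalone sketch of that detection pattern — the module-level flags mirror the style above, and the `sys.version` check relies on the fact that x64 CPython builds embed "AMD64" in their version banner:

```python
import os
import platform
import sys

# One boolean per platform, computed once at import time, as in the diff above.
WINDOWS = os.name == "nt"
WINDOWSARM64 = WINDOWS and platform.uname().machine == "ARM64"

# x64 CPython builds on Windows include "AMD64" in sys.version, so an x64
# interpreter running under emulation on ARM hardware can be spotted and a
# native ARM build suggested instead.
RUNNING_X64_ON_ARM = WINDOWSARM64 and "AMD64" in sys.version
```

Checking `platform.uname().machine` rather than `platform.machine()` alone matches the diff; on Windows it reports the real hardware architecture even when the process itself is emulated x64.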

View File

@@ -20,6 +20,7 @@ sabnzbd.api - api
"""
import os
import sys
import logging
import re
import gc
@@ -28,7 +29,9 @@ import time
import getpass
import cherrypy
from threading import Thread
from typing import Tuple, Optional, List, Dict, Any, Union
from typing import Optional, Any, Union
import sabctools
# For json.dumps, orjson is magnitudes faster than ujson, but it is harder to
# compile due to Rust dependency. Since the output is the same, we support all modules.
@@ -55,6 +58,7 @@ from sabnzbd.constants import (
    PP_LOOKUP,
    STAGES,
    DEF_NETWORKING_TEST_TIMEOUT,
    DEF_PIPELINING_REQUESTS,
)
import sabnzbd.config as config
import sabnzbd.cfg as cfg
@@ -77,13 +81,14 @@ from sabnzbd.misc import (
    clean_comma_separated_list,
    match_str,
    bool_conv,
    get_platform_description,
)
from sabnzbd.filesystem import diskspace, get_ext, clip_path, remove_all, list_scripts, purge_log_files, pathbrowser
from sabnzbd.encoding import xml_name, utob
from sabnzbd.getipaddress import local_ipv4, public_ipv4, public_ipv6, dnslookup, active_socks5_proxy
from sabnzbd.database import HistoryDB
from sabnzbd.lang import is_rtl
from sabnzbd.nzbstuff import NzbObject
from sabnzbd.nzb import TryList, NzbObject
from sabnzbd.newswrapper import NewsWrapper, NNTPPermanentError
import sabnzbd.emailer
import sabnzbd.sorting
@@ -103,7 +108,7 @@ _MSG_NO_SUCH_CONFIG = "Config item does not exist"
_MSG_CONFIG_LOCKED = "Configuration locked"
def api_handler(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def api_handler(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
    """API Dispatcher"""
    # Clean-up the arguments
    for vr in ("mode", "name", "value", "value2", "value3", "start", "limit", "search"):
@@ -117,13 +122,13 @@ def api_handler(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
    return response
def _api_get_config(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_get_config(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
    """API: accepts keyword, section"""
    _, data = config.get_dconfig(kwargs.get("section"), kwargs.get("keyword"))
    return report(keyword="config", data=data)
def _api_set_config(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_set_config(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
    """API: accepts keyword, section"""
    if cfg.configlock():
        return report(_MSG_CONFIG_LOCKED)
@@ -144,7 +149,7 @@ def _api_set_config(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> byte
    return report(keyword="config", data=data)
def _api_set_config_default(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_set_config_default(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
    """API: Reset requested config variables back to defaults. Currently only for misc-section"""
    if cfg.configlock():
        return report(_MSG_CONFIG_LOCKED)
@@ -159,7 +164,7 @@ def _api_set_config_default(name: str, kwargs: Dict[str, Union[str, List[str]]])
    return report()
def _api_del_config(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_del_config(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
    """API: accepts keyword, section"""
    if cfg.configlock():
        return report(_MSG_CONFIG_LOCKED)
@@ -169,13 +174,13 @@ def _api_del_config(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> byte
    return report(_MSG_NOT_IMPLEMENTED)
def _api_queue(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: Dispatcher for mode=queue"""
value = kwargs.get("value", "")
return _api_queue_table.get(name, (_api_queue_default, 2))[0](value, kwargs)
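The `mode=queue` dispatcher above looks handlers up in a table of `(function, N)` tuples and falls back to a default handler for unknown names. A minimal sketch of the pattern (the integer is treated as opaque here; in SABnzbd it appears to gate API access levels, which is an assumption):

```python
def handle_pause(value, kwargs):
    return b"paused"

def handle_default(value, kwargs):
    return b"queue listing"

# Table maps mode names to (handler, level) tuples; unknown names
# fall through to the default entry supplied to dict.get():
_table = {"pause": (handle_pause, 2)}

def dispatch(name, value, kwargs):
    return _table.get(name, (handle_default, 2))[0](value, kwargs)

assert dispatch("pause", "", {}) == b"paused"
assert dispatch("unknown", "", {}) == b"queue listing"
```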
def _api_queue_delete(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_delete(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value"""
if value.lower() == "all":
removed = sabnzbd.NzbQueue.remove_all(kwargs.get("search"))
@@ -188,7 +193,7 @@ def _api_queue_delete(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> b
return report(_MSG_NO_VALUE)
def _api_queue_delete_nzf(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_delete_nzf(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=nzo_id), value2(=nzf_ids)"""
nzf_ids = clean_comma_separated_list(kwargs.get("value2"))
if value and nzf_ids:
@@ -198,7 +203,7 @@ def _api_queue_delete_nzf(value: str, kwargs: Dict[str, Union[str, List[str]]])
return report(_MSG_NO_VALUE2)
def _api_queue_rename(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_rename(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=old name), value2(=new name), value3(=password)"""
value2 = kwargs.get("value2")
value3 = kwargs.get("value3")
@@ -209,18 +214,18 @@ def _api_queue_rename(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> b
return report(_MSG_NO_VALUE2)
def _api_queue_change_complete_action(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_change_complete_action(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=action)"""
change_queue_complete_action(value)
return report()
def _api_queue_purge(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_purge(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
removed = sabnzbd.NzbQueue.remove_all(kwargs.get("search"))
return report(keyword="", data={"status": bool(removed), "nzo_ids": removed})
def _api_queue_pause(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_pause(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=list of nzo_id)"""
if items := clean_comma_separated_list(value):
handled = sabnzbd.NzbQueue.pause_multiple_nzo(items)
@@ -229,7 +234,7 @@ def _api_queue_pause(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> by
return report(keyword="", data={"status": bool(handled), "nzo_ids": handled})
def _api_queue_resume(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_resume(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=list of nzo_id)"""
if items := clean_comma_separated_list(value):
handled = sabnzbd.NzbQueue.resume_multiple_nzo(items)
@@ -238,7 +243,7 @@ def _api_queue_resume(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> b
return report(keyword="", data={"status": bool(handled), "nzo_ids": handled})
def _api_queue_priority(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_priority(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=nzo_id), value2(=priority)"""
nzo_ids = clean_comma_separated_list(value)
priority = kwargs.get("value2")
@@ -257,7 +262,7 @@ def _api_queue_priority(value: str, kwargs: Dict[str, Union[str, List[str]]]) ->
return report(_MSG_NO_VALUE2)
def _api_queue_sort(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_sort(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts sort, dir"""
sort = kwargs.get("sort", "")
direction = kwargs.get("dir", "")
@@ -268,7 +273,7 @@ def _api_queue_sort(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> byt
return report(_MSG_NO_VALUE2)
def _api_queue_default(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_queue_default(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts sort, dir, start, limit and search terms"""
start = int_conv(kwargs.get("start"))
limit = int_conv(kwargs.get("limit"))
@@ -296,12 +301,12 @@ def _api_queue_default(value: str, kwargs: Dict[str, Union[str, List[str]]]) ->
)
def _api_translate(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_translate(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=acronym)"""
return report(keyword="value", data=T(kwargs.get("value", "")))
def _api_addfile(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_addfile(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts name, pp, script, cat, priority, nzbname"""
# Normal upload will send the nzb in a kw arg called name or nzbfile
if not name or isinstance(name, str):
@@ -322,7 +327,7 @@ def _api_addfile(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
return report(_MSG_NO_VALUE)
def _api_retry(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_retry(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts name, value(=nzo_id), nzbfile(=optional NZB), password (optional)"""
value = kwargs.get("value")
# Normal upload will send the nzb in a kw arg called nzbfile
@@ -337,7 +342,7 @@ def _api_retry(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
return report(_MSG_NO_ITEM)
def _api_cancel_pp(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_cancel_pp(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts name, value(=nzo_ids)"""
if nzo_ids := clean_comma_separated_list(kwargs.get("value")):
if sabnzbd.PostProcessor.cancel_pp(nzo_ids):
@@ -345,7 +350,7 @@ def _api_cancel_pp(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes
return report(_MSG_NO_ITEM)
def _api_addlocalfile(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_addlocalfile(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts name, pp, script, cat, priority, nzbname"""
if name:
if os.path.exists(name):
@@ -372,7 +377,7 @@ def _api_addlocalfile(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> by
return report(_MSG_NO_VALUE)
def _api_switch(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_switch(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=first id), value2(=second id)"""
value = kwargs.get("value")
value2 = kwargs.get("value2")
@@ -384,7 +389,7 @@ def _api_switch(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
return report(_MSG_NO_VALUE2)
def _api_change_cat(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_change_cat(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=nzo_id), value2(=category)"""
nzo_ids = clean_comma_separated_list(kwargs.get("value"))
cat = kwargs.get("value2")
@@ -397,7 +402,7 @@ def _api_change_cat(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> byte
return report(_MSG_NO_VALUE)
def _api_change_script(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_change_script(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=nzo_id), value2(=script)"""
nzo_ids = clean_comma_separated_list(kwargs.get("value"))
script = kwargs.get("value2")
@@ -410,7 +415,7 @@ def _api_change_script(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> b
return report(_MSG_NO_VALUE)
def _api_change_opts(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_change_opts(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=nzo_id), value2(=pp)"""
nzo_ids = clean_comma_separated_list(kwargs.get("value"))
pp = kwargs.get("value2")
@@ -420,7 +425,7 @@ def _api_change_opts(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> byt
return report(_MSG_NO_ITEM)
def _api_fullstatus(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_fullstatus(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: full history status"""
status = build_status(
calculate_performance=bool_conv(kwargs.get("calculate_performance")),
@@ -429,19 +434,19 @@ def _api_fullstatus(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> byte
return report(keyword="status", data=status)
def _api_status(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_status(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: Dispatcher for mode=status, passing on the value"""
value = kwargs.get("value", "")
return _api_status_table.get(name, (_api_fullstatus, 2))[0](value, kwargs)
def _api_unblock_server(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_unblock_server(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""Unblock a blocked server"""
sabnzbd.Downloader.unblock(value)
return report()
def _api_delete_orphan(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_delete_orphan(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""Remove orphaned job"""
if value:
path = os.path.join(cfg.download_dir.get_path(), value)
@@ -452,7 +457,7 @@ def _api_delete_orphan(value: str, kwargs: Dict[str, Union[str, List[str]]]) ->
return report(_MSG_NO_ITEM)
def _api_delete_all_orphan(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_delete_all_orphan(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""Remove all orphaned jobs"""
paths = sabnzbd.NzbQueue.scan_jobs(all_jobs=False, action=False)
for path in paths:
@@ -460,7 +465,7 @@ def _api_delete_all_orphan(value: str, kwargs: Dict[str, Union[str, List[str]]])
return report()
def _api_add_orphan(value: str, kwargs: Dict[str, Union[str, List[str]]]):
def _api_add_orphan(value: str, kwargs: dict[str, Union[str, list[str]]]):
"""Add orphaned job"""
if value:
path = os.path.join(cfg.download_dir.get_path(), value)
@@ -471,7 +476,7 @@ def _api_add_orphan(value: str, kwargs: Dict[str, Union[str, List[str]]]):
return report(_MSG_NO_ITEM)
def _api_add_all_orphan(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_add_all_orphan(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""Add all orphaned jobs"""
paths = sabnzbd.NzbQueue.scan_jobs(all_jobs=False, action=False)
for path in paths:
@@ -479,13 +484,13 @@ def _api_add_all_orphan(value: str, kwargs: Dict[str, Union[str, List[str]]]) ->
return report()
def _api_history(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_history(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: Dispatcher for mode=history"""
value = kwargs.get("value", "")
return _api_history_table.get(name, (_api_history_default, 2))[0](value, kwargs)
def _api_history_delete(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_history_delete(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=nzo_id or special), search, archive, del_files"""
search = kwargs.get("search")
archive = True
@@ -531,7 +536,7 @@ def _api_history_delete(value: str, kwargs: Dict[str, Union[str, List[str]]]) ->
return report(_MSG_NO_VALUE)
def _api_history_mark_as_completed(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_history_mark_as_completed(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=nzo_id)"""
if value:
history_db = sabnzbd.get_db_connection()
@@ -550,7 +555,7 @@ def _api_history_mark_as_completed(value: str, kwargs: Dict[str, Union[str, List
return report(_MSG_NO_VALUE)
def _api_history_default(value: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_history_default(value: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts start, limit, search, failed_only, archive, cat, status, nzo_ids"""
start = int_conv(kwargs.get("start"))
limit = int_conv(kwargs.get("limit"))
@@ -595,7 +600,7 @@ def _api_history_default(value: str, kwargs: Dict[str, Union[str, List[str]]]) -
return report(keyword="history", data=history)
def _api_get_files(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_get_files(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=nzo_id)"""
value = kwargs.get("value")
if value:
@@ -604,7 +609,7 @@ def _api_get_files(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes
return report(_MSG_NO_VALUE)
def _api_move_nzf_bulk(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_move_nzf_bulk(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts name(=top/up/down/bottom), value=(=nzo_id), nzf_ids, size (optional)"""
nzo_id = kwargs.get("value")
nzf_ids = clean_comma_separated_list(kwargs.get("nzf_ids"))
@@ -630,7 +635,7 @@ def _api_move_nzf_bulk(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> b
return report(_MSG_NO_VALUE)
def _api_addurl(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_addurl(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts name, output, pp, script, cat, priority, nzbname"""
pp = kwargs.get("pp")
script = kwargs.get("script")
@@ -648,24 +653,24 @@ def _api_addurl(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
return report(_MSG_NO_VALUE)
def _api_pause(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_pause(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.Downloader.pause()
return report()
def _api_resume(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_resume(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.downloader.unpause_all()
return report()
def _api_shutdown(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_shutdown(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
sabnzbd.shutdown_program()
return report()
def _api_warnings(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_warnings(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts name, output"""
if name == "clear":
return report(keyword="warnings", data=sabnzbd.GUIHANDLER.clear())
@@ -685,11 +690,18 @@ LOG_INI_HIDE_RE = re.compile(
LOG_HASH_RE = re.compile(rb"([a-zA-Z\d]{25})", re.I)
def _api_showlog(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_showlog(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""Fetch the INI and the log-data and add a message at the top"""
-    log_data = b"--------------------------------\n\n"
-    log_data += b"The log includes a copy of your sabnzbd.ini with\nall usernames, passwords and API-keys removed."
-    log_data += b"\n\n--------------------------------\n"
+    # Build header with version and environment info
+    header = "--------------------------------\n"
+    header += f"SABnzbd version: {sabnzbd.__version__}\n"
+    header += f"Commit: {sabnzbd.__baseline__}\n"
+    header += f"Python-version: {sys.version}\n"
+    header += f"Platform: {get_platform_description()}\n"
+    header += "--------------------------------\n\n"
+    header += "The log includes a copy of your sabnzbd.ini with\nall usernames, passwords and API-keys removed."
+    header += "\n\n--------------------------------\n"
+    log_data = header.encode("utf-8")
if sabnzbd.LOGFILE and os.path.exists(sabnzbd.LOGFILE):
with open(sabnzbd.LOGFILE, "rb") as f:
@@ -718,19 +730,19 @@ def _api_showlog(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
return log_data
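The `_api_showlog` hunk switches from concatenating bytes literals to building the header as `str` (so f-strings can interpolate version info) and encoding once at the end, since the log file itself is read in binary mode. A sketch with hypothetical stand-ins for `sabnzbd.__version__` and `__baseline__`:

```python
import sys

# Hypothetical stand-ins for sabnzbd.__version__ / sabnzbd.__baseline__:
version, commit = "5.0.0Beta1", "e1ea4f1"

header = "--------------------------------\n"
header += f"SABnzbd version: {version}\n"
header += f"Commit: {commit}\n"
header += f"Python-version: {sys.version}\n"
header += "--------------------------------\n\n"

# The log is read with open(..., "rb"), so encode once before concatenating:
log_data = header.encode("utf-8")
assert isinstance(log_data, bytes) and log_data.startswith(b"----")
```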
def _api_get_cats(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_get_cats(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
return report(keyword="categories", data=list_cats(False))
def _api_get_scripts(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_get_scripts(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
return report(keyword="scripts", data=list_scripts())
def _api_version(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_version(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
return report(keyword="version", data=sabnzbd.__version__)
def _api_auth(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_auth(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
key = kwargs.get("key", "")
if not key:
auth = "apikey"
@@ -743,14 +755,14 @@ def _api_auth(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
return report(keyword="auth", data=auth)
def _api_restart(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_restart(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
logging.info("Restart requested by API")
# Do the shutdown async to still send goodbye to browser
Thread(target=sabnzbd.trigger_restart, kwargs={"timeout": 1}).start()
return report()
def _api_restart_repair(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_restart_repair(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
logging.info("Queue repair requested by API")
request_repair()
# Do the shutdown async to still send goodbye to browser
@@ -758,12 +770,12 @@ def _api_restart_repair(name: str, kwargs: Dict[str, Union[str, List[str]]]) ->
return report()
def _api_disconnect(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_disconnect(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
sabnzbd.Downloader.disconnect()
return report()
def _api_eval_sort(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_eval_sort(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: evaluate sorting expression"""
sort_string = kwargs.get("sort_string", "")
job_name = kwargs.get("job_name", "")
@@ -775,28 +787,28 @@ def _api_eval_sort(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes
return report(keyword="result", data=path)
def _api_watched_now(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_watched_now(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
sabnzbd.DirScanner.scan()
return report()
-def _api_resume_pp(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
-    sabnzbd.PostProcessor.paused = False
+def _api_resume_pp(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
+    sabnzbd.PostProcessor.resume()
     return report()


-def _api_pause_pp(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
-    sabnzbd.PostProcessor.paused = True
+def _api_pause_pp(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
+    sabnzbd.PostProcessor.pause()
     return report()
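The pause/resume hunk replaces direct writes to a `paused` attribute with `pause()`/`resume()` method calls. A sketch of why that encapsulation matters: the methods can change state atomically and wake a blocked worker, which a bare attribute assignment cannot (this `PostProcessor` is a hypothetical stand-in, not the real class):

```python
import threading

class PostProcessor:
    def __init__(self):
        self._unpaused = threading.Event()
        self._unpaused.set()

    def pause(self):
        self._unpaused.clear()

    def resume(self):
        self._unpaused.set()  # also wakes any thread blocked in wait()

    @property
    def paused(self):
        return not self._unpaused.is_set()

pp = PostProcessor()
pp.pause()
assert pp.paused
pp.resume()
assert not pp.paused
```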
def _api_rss_now(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_rss_now(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
# Run RSS scan async, because it can take a long time
sabnzbd.Scheduler.force_rss()
return report()
def _api_retry_all(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_retry_all(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: Retry all failed items in History"""
items = sabnzbd.api.build_history()[0]
nzo_ids = []
@@ -806,13 +818,13 @@ def _api_retry_all(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes
return report(keyword="status", data=nzo_ids)
def _api_reset_quota(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_reset_quota(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""Reset quota left"""
sabnzbd.BPSMeter.reset_quota(force=True)
return report()
def _api_test_email(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_test_email(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: send a test email, return result"""
logging.info("Sending test email")
pack = {"download": ["action 1", "action 2"], "unpack": ["action 1", "action 2"]}
@@ -834,67 +846,67 @@ def _api_test_email(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> byte
return report(error=res)
def _api_test_windows(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_test_windows(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: send a test to Windows, return result"""
logging.info("Sending test notification")
res = sabnzbd.notifier.send_windows("SABnzbd", T("Test Notification"), "other")
return report(error=res)
def _api_test_notif(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_test_notif(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: send a test to Notification Center, return result"""
logging.info("Sending test notification")
res = sabnzbd.notifier.send_notification_center("SABnzbd", T("Test Notification"), "other")
return report(error=res)
def _api_test_osd(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_test_osd(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: send a test OSD notification, return result"""
logging.info("Sending OSD notification")
res = sabnzbd.notifier.send_notify_osd("SABnzbd", T("Test Notification"))
return report(error=res)
def _api_test_prowl(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_test_prowl(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: send a test Prowl notification, return result"""
logging.info("Sending Prowl notification")
res = sabnzbd.notifier.send_prowl("SABnzbd", T("Test Notification"), "other", force=True, test=kwargs)
return report(error=res)
def _api_test_pushover(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_test_pushover(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: send a test Pushover notification, return result"""
logging.info("Sending Pushover notification")
res = sabnzbd.notifier.send_pushover("SABnzbd", T("Test Notification"), "other", force=True, test=kwargs)
return report(error=res)
def _api_test_pushbullet(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_test_pushbullet(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: send a test Pushbullet notification, return result"""
logging.info("Sending Pushbullet notification")
res = sabnzbd.notifier.send_pushbullet("SABnzbd", T("Test Notification"), "other", force=True, test=kwargs)
return report(error=res)
def _api_test_apprise(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_test_apprise(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: send a test Apprise notification, return result"""
logging.info("Sending Apprise notification")
res = sabnzbd.notifier.send_apprise("SABnzbd", T("Test Notification"), "other", force=True, test=kwargs)
return report(error=res)
def _api_test_nscript(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_test_nscript(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: execute a test notification script, return result"""
logging.info("Executing notification script")
res = sabnzbd.notifier.send_nscript("SABnzbd", T("Test Notification"), "other", force=True, test=kwargs)
return report(error=res)
def _api_undefined(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_undefined(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
return report(_MSG_NOT_IMPLEMENTED)
def _api_browse(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_browse(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""Return tree of local path"""
compact = bool_conv(kwargs.get("compact"))
show_files = bool_conv(kwargs.get("show_files"))
@@ -911,14 +923,14 @@ def _api_browse(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
return report(keyword="paths", data=paths)
def _api_config(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: Dispatcher for "config" """
if cfg.configlock():
return report(_MSG_CONFIG_LOCKED)
return _api_config_table.get(name, (_api_config_undefined, 2))[0](kwargs)
def _api_config_speedlimit(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config_speedlimit(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=speed)"""
value = kwargs.get("value")
if not value:
@@ -927,26 +939,26 @@ def _api_config_speedlimit(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
return report()
def _api_config_set_pause(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config_set_pause(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts value(=pause interval)"""
value = kwargs.get("value")
sabnzbd.Scheduler.plan_resume(int_conv(value))
return report()
def _api_config_set_apikey(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config_set_apikey(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
cfg.api_key.set(config.create_api_key())
config.save_config()
return report(keyword="apikey", data=cfg.api_key())
def _api_config_set_nzbkey(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config_set_nzbkey(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
cfg.nzb_key.set(config.create_api_key())
config.save_config()
return report(keyword="nzbkey", data=cfg.nzb_key())
def _api_config_regenerate_certs(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config_regenerate_certs(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
# Make sure we only over-write default locations
result = False
if (
@@ -960,27 +972,27 @@ def _api_config_regenerate_certs(kwargs: Dict[str, Union[str, List[str]]]) -> by
return report(data=result)
def _api_config_test_server(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config_test_server(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""API: accepts server-params"""
result, msg = test_nntp_server_dict(kwargs)
return report(data={"result": result, "message": msg})
def _api_config_create_backup(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config_create_backup(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
backup_file = config.create_config_backup()
return report(data={"result": bool(backup_file), "message": backup_file})
def _api_config_purge_log_files(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config_purge_log_files(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
purge_log_files()
return report()
def _api_config_undefined(kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_config_undefined(kwargs: dict[str, Union[str, list[str]]]) -> bytes:
return report(_MSG_NOT_IMPLEMENTED)
def _api_server_stats(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_server_stats(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
sum_t, sum_m, sum_w, sum_d = sabnzbd.BPSMeter.get_sums()
stats = {"total": sum_t, "month": sum_m, "week": sum_w, "day": sum_d, "servers": {}}
@@ -999,12 +1011,12 @@ def _api_server_stats(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> by
return report(keyword="", data=stats)
def _api_gc_stats(name: str, kwargs: Dict[str, Union[str, List[str]]]) -> bytes:
def _api_gc_stats(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
"""Function only intended for internal testing of the memory handling"""
# Collect before we check
gc.collect()
# We cannot create any lists/dicts, as they would create a reference
-    return report(data=[str(obj) for obj in gc.get_objects() if isinstance(obj, sabnzbd.nzbstuff.TryList)])
+    return report(data=[str(obj) for obj in gc.get_objects() if isinstance(obj, TryList)])
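`_api_gc_stats` scans `gc.get_objects()` for leaked `TryList` instances; the comment explains that creating intermediate containers during the scan would itself add references. A minimal sketch of the technique, with a hypothetical stand-in for `sabnzbd.nzbstuff.TryList`:

```python
import gc

class TryList:  # hypothetical stand-in for sabnzbd.nzbstuff.TryList
    pass

keep = [TryList(), TryList()]

gc.collect()  # drop unreachable objects before counting
# A generator expression avoids building an intermediate list while
# walking the tracked-object heap:
live = sum(1 for obj in gc.get_objects() if isinstance(obj, TryList))
assert live >= 2
```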
##############################################################################
@@ -1210,7 +1222,7 @@ class XmlOutputFactory:
return text
def handle_server_api(kwargs: Dict[str, Union[str, List[str]]]) -> str:
def handle_server_api(kwargs: dict[str, Union[str, list[str]]]) -> str:
"""Special handler for API-call 'set_config' [servers]"""
name = kwargs.get("keyword")
if not name:
@@ -1228,7 +1240,7 @@ def handle_server_api(kwargs: Dict[str, Union[str, List[str]]]) -> str:
return name
def handle_sorter_api(kwargs: Dict[str, Union[str, List[str]]]) -> Optional[str]:
def handle_sorter_api(kwargs: dict[str, Union[str, list[str]]]) -> Optional[str]:
"""Special handler for API-call 'set_config' [sorters]"""
name = kwargs.get("keyword")
if not name:
@@ -1244,7 +1256,7 @@ def handle_sorter_api(kwargs: Dict[str, Union[str, List[str]]]) -> Optional[str]
return name
def handle_rss_api(kwargs: Dict[str, Union[str, List[str]]]) -> Optional[str]:
def handle_rss_api(kwargs: dict[str, Union[str, list[str]]]) -> Optional[str]:
"""Special handler for API-call 'set_config' [rss]"""
name = kwargs.get("keyword")
if not name:
@@ -1278,7 +1290,7 @@ def handle_rss_api(kwargs: Dict[str, Union[str, List[str]]]) -> Optional[str]:
return name
def handle_cat_api(kwargs: Dict[str, Union[str, List[str]]]) -> Optional[str]:
def handle_cat_api(kwargs: dict[str, Union[str, list[str]]]) -> Optional[str]:
"""Special handler for API-call 'set_config' [categories]"""
name = kwargs.get("keyword")
if not name:
@@ -1295,7 +1307,7 @@ def handle_cat_api(kwargs: Dict[str, Union[str, List[str]]]) -> Optional[str]:
return name
def test_nntp_server_dict(kwargs: Dict[str, Union[str, List[str]]]) -> Tuple[bool, str]:
def test_nntp_server_dict(kwargs: dict[str, Union[str, list[str]]]) -> tuple[bool, str]:
"""Will connect (blocking) to the NNTP server and report back any errors"""
host = kwargs.get("host", "").strip()
port = int_conv(kwargs.get("port", 0))
@@ -1307,6 +1319,7 @@ def test_nntp_server_dict(kwargs: Dict[str, Union[str, List[str]]]) -> Tuple[boo
     ssl = int_conv(kwargs.get("ssl", 0))
     ssl_verify = int_conv(kwargs.get("ssl_verify", 3))
     ssl_ciphers = kwargs.get("ssl_ciphers", "").strip()
+    pipelining_requests = int_conv(kwargs.get("pipelining_requests", DEF_PIPELINING_REQUESTS))
if not host:
return False, T("The hostname is not set.")
@@ -1343,6 +1356,7 @@ def test_nntp_server_dict(kwargs: Dict[str, Union[str, List[str]]]) -> Tuple[boo
         use_ssl=ssl,
         ssl_verify=ssl_verify,
         ssl_ciphers=ssl_ciphers,
+        pipelining_requests=lambda: pipelining_requests,
         username=username,
         password=password,
     )
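Note that the new parameter is passed as `pipelining_requests=lambda: pipelining_requests`, not as a plain integer. The apparent reason (an assumption based on this line; SABnzbd config options are callables that re-read their value on each call) is that the server object expects a getter, so the fixed test value is wrapped in a lambda to match that interface:

```python
# TestServer is hypothetical; it mimics a server that stores a getter
# and calls it whenever the current value is needed:
pipelining_requests = 4

class TestServer:
    def __init__(self, pipelining_requests):
        self.pipelining_requests = pipelining_requests

server = TestServer(pipelining_requests=lambda: pipelining_requests)
assert server.pipelining_requests() == 4

pipelining_requests = 8  # later reads observe the updated value
assert server.pipelining_requests() == 8
```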
@@ -1387,12 +1401,22 @@ def test_nntp_server_dict(kwargs: Dict[str, Union[str, List[str]]]) -> Tuple[boo
# Sorry, no clever analysis:
return False, T('Server address "%s:%s" is not valid.') % (host, port)
-    nw = NewsWrapper(server=test_server, thrdnum=-1, block=True)
+    nntp_code: int = 0
+    nntp_message: str = ""
+
+    def on_response(code: int, message: str):
+        nonlocal nntp_code, nntp_message
+        nntp_code = code
+        nntp_message = message
+
     try:
+        nw = NewsWrapper(server=test_server, thrdnum=-1, block=True)
         nw.init_connect()
-        while not nw.connected:
-            nw.recv_chunk()
-            nw.finish_connect(nw.status_code)
+        while test_server.active:
+            nw.write()
+            nw.read(on_response=on_response)
+            if nw.ready:
+                break
except socket.timeout:
if port != 119 and not ssl:
@@ -1414,37 +1438,37 @@ def test_nntp_server_dict(kwargs: Dict[str, Union[str, List[str]]]) -> Tuple[boo
return False, str(err)
     if not username or not password:
-        nw.nntp.sock.sendall(b"ARTICLE <test@home>\r\n")
+        nw.queue_command(b"ARTICLE <test@home>\r\n")
         try:
-            nw.reset_data_buffer()
-            nw.recv_chunk()
+            nw.write()
+            nw.read(on_response=on_response)
         except Exception as err:
             # Some internal error, not always safe to close connection
             return False, str(err)
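This hunk replaces a raw `sock.sendall()` with `queue_command()`, so the `write()`/`read()` cycle owns all socket traffic instead of bypassing it. A sketch of the queue-then-flush pattern (`CommandQueue` is a hypothetical stand-in):

```python
from collections import deque

class CommandQueue:
    def __init__(self):
        self._pending = deque()
        self.sent = []

    def queue_command(self, command: bytes):
        self._pending.append(command)

    def write(self):
        # Flush everything queued; stands in for sendall() on the socket
        while self._pending:
            self.sent.append(self._pending.popleft())

q = CommandQueue()
q.queue_command(b"ARTICLE <test@home>\r\n")
q.write()
assert q.sent == [b"ARTICLE <test@home>\r\n"]
```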
# Parse result
return_status = ()
if nw.status_code:
if nw.status_code == 480:
if nntp_code:
if nntp_code == 480:
return_status = (False, T("Server requires username and password."))
elif nw.status_code < 300 or nw.status_code in (411, 423, 430):
elif nntp_code < 300 or nntp_code in (411, 423, 430):
# If no username/password set and we requested fake-article, it will return 430 Not Found
return_status = (True, T("Connection Successful!"))
elif nw.status_code == 502 or sabnzbd.downloader.clues_login(nw.nntp_msg):
elif nntp_code == 502 or sabnzbd.downloader.clues_login(nntp_message):
return_status = (False, T("Authentication failed, check username/password."))
elif sabnzbd.downloader.clues_too_many(nw.nntp_msg):
elif sabnzbd.downloader.clues_too_many(nntp_message):
return_status = (False, T("Too many connections, please pause downloading or try again later"))
# Fallback in case no data was received or unknown status
if not return_status:
return_status = (False, T("Could not determine connection result (%s)") % nw.nntp_msg)
return_status = (False, T("Could not determine connection result (%s)") % nntp_message)
# Close the connection and return result
nw.hard_reset()
return return_status
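The status-code mapping in this hunk can be sketched in isolation from `NewsWrapper`. The helper name below is hypothetical; the code mirrors the branches above (0 means no response, 480 needs credentials, below 300 or 411/423/430 counts as success, 502 or a login clue means auth failure):

```python
def interpret_nntp_test(code: int, message: str) -> tuple[bool, str]:
    """Hypothetical helper mirroring the server-test result parsing above."""
    if not code:
        # No data received at all
        return False, f"Could not determine connection result ({message})"
    if code == 480:
        return False, "Server requires username and password."
    if code < 300 or code in (411, 423, 430):
        # A fake-article request returns 430 Not Found when no auth is set
        return True, "Connection Successful!"
    if code == 502:
        return False, "Authentication failed, check username/password."
    return False, f"Could not determine connection result ({message})"
```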
def build_status(calculate_performance: bool = False, skip_dashboard: bool = False) -> Dict[str, Any]:
def build_status(calculate_performance: bool = False, skip_dashboard: bool = False) -> dict[str, Any]:
# build up header full of basic information
info = build_header(trans_functions=False)
@@ -1497,18 +1521,18 @@ def build_status(calculate_performance: bool = False, skip_dashboard: bool = Fal
info["servers"] = []
# Servers-list could be modified during iteration, so we need a copy
for server in sabnzbd.Downloader.servers[:]:
activeconn = sum(nw.connected for nw in server.idle_threads.copy())
activeconn = sum(nw.ready for nw in server.idle_threads.copy())
serverconnections = []
for nw in server.busy_threads.copy():
if nw.connected:
if nw.ready:
activeconn += 1
if nw.article:
if article := nw.article:
serverconnections.append(
{
"thrdnum": nw.thrdnum,
"art_name": nw.article.article,
"nzf_name": nw.article.nzf.filename,
"nzo_name": nw.article.nzf.nzo.final_name,
"art_name": article.article,
"nzf_name": article.nzf.filename,
"nzo_name": article.nzf.nzo.final_name,
}
)
@@ -1546,11 +1570,11 @@ def build_queue(
start: int = 0,
limit: int = 0,
search: Optional[str] = None,
categories: Optional[List[str]] = None,
priorities: Optional[List[str]] = None,
statuses: Optional[List[str]] = None,
nzo_ids: Optional[List[str]] = None,
) -> Dict[str, Any]:
categories: Optional[list[str]] = None,
priorities: Optional[list[str]] = None,
statuses: Optional[list[str]] = None,
nzo_ids: Optional[list[str]] = None,
) -> dict[str, Any]:
info = build_header(for_template=False)
(
queue_bytes_total,
@@ -1659,7 +1683,7 @@ def build_queue(
return info
def fast_queue() -> Tuple[bool, int, float, str]:
def fast_queue() -> tuple[bool, int, float, str]:
"""Return paused, bytes_left, bpsnow, time_left"""
bytes_left = sabnzbd.sabnzbd.NzbQueue.remaining()
paused = sabnzbd.Downloader.paused
@@ -1668,7 +1692,7 @@ def fast_queue() -> Tuple[bool, int, float, str]:
return paused, bytes_left, bpsnow, time_left
def build_file_list(nzo_id: str) -> List[Dict[str, Any]]:
def build_file_list(nzo_id: str) -> list[dict[str, Any]]:
"""Build file lists for specified job"""
jobs = []
nzo = sabnzbd.sabnzbd.NzbQueue.get_nzo(nzo_id)
@@ -1742,7 +1766,7 @@ def retry_job(
return None
def del_job_files(job_paths: List[str]):
def del_job_files(job_paths: list[str]):
"""Remove files of each path in the list"""
for path in job_paths:
if path and clip_path(path).lower().startswith(cfg.download_dir.get_clipped_path().lower()):
@@ -1785,7 +1809,7 @@ def clear_trans_cache():
sabnzbd.WEBUI_READY = True
def build_header(webdir: str = "", for_template: bool = True, trans_functions: bool = True) -> Dict[str, Any]:
def build_header(webdir: str = "", for_template: bool = True, trans_functions: bool = True) -> dict[str, Any]:
"""Build the basic header"""
header = {}
@@ -1852,10 +1876,10 @@ def build_history(
limit: int = 1000000,
archive: bool = False,
search: Optional[str] = None,
categories: Optional[List[str]] = None,
statuses: Optional[List[str]] = None,
nzo_ids: Optional[List[str]] = None,
) -> Tuple[List[Dict[str, Any]], int, int]:
categories: Optional[list[str]] = None,
statuses: Optional[list[str]] = None,
nzo_ids: Optional[list[str]] = None,
) -> tuple[list[dict[str, Any]], int, int]:
"""Combine the jobs still in post-processing and the database history"""
if not archive:
# Grab any items that are active or queued in postproc
@@ -1931,7 +1955,7 @@ def build_history(
return items, postproc_queue_size, total_items
def add_active_history(postproc_queue: List[NzbObject], items: List[Dict[str, Any]]):
def add_active_history(postproc_queue: list[NzbObject], items: list[dict[str, Any]]):
"""Get the active history queue and add it to the existing items list"""
nzo_ids = set([nzo["nzo_id"] for nzo in items])
@@ -1990,7 +2014,7 @@ def calc_timeleft(bytesleft: float, bps: float) -> str:
return format_time_left(int(bytesleft / bps))
def list_cats(default: bool = True) -> List[str]:
def list_cats(default: bool = True) -> list[str]:
"""Return list of (ordered) categories,
when default==False use '*' for Default category
"""
@@ -2019,7 +2043,7 @@ def plural_to_single(kw, def_kw=""):
return def_kw
def del_from_section(kwargs: Dict[str, Union[str, List[str]]]) -> bool:
def del_from_section(kwargs: dict[str, Union[str, list[str]]]) -> bool:
"""Remove keyword in section"""
section = kwargs.get("section", "")
if section in ("sorters", "servers", "rss", "categories"):


@@ -22,26 +22,39 @@ sabnzbd.articlecache - Article cache handling
import logging
import threading
import struct
from typing import Dict, Collection
import time
from typing import Collection, Optional
import sabnzbd
import sabnzbd.cfg as cfg
from sabnzbd.decorators import synchronized
from sabnzbd.constants import GIGI, ANFO, ASSEMBLER_WRITE_THRESHOLD
from sabnzbd.nzbstuff import Article
from sabnzbd.constants import (
GIGI,
ANFO,
ARTICLE_CACHE_NON_CONTIGUOUS_FLUSH_PERCENTAGE,
)
from sabnzbd.nzb import Article, NzbFile
from sabnzbd.misc import to_units
# Operations on the article table are handled via try/except.
# The counters need to be made atomic to ensure consistency.
ARTICLE_COUNTER_LOCK = threading.RLock()
_SECONDS_BETWEEN_FLUSHES = 0.5
class ArticleCache:
class ArticleCache(threading.Thread):
def __init__(self):
super().__init__()
self.shutdown = False
self.__direct_write: bool = bool(cfg.direct_write())
self.__cache_limit_org = 0
self.__cache_limit = 0
self.__cache_size = 0
self.__article_table: Dict[Article, bytes] = {} # Dict of buffered articles
self.assembler_write_trigger: int = 1
self.__article_table: dict[Article, bytearray] = {} # Dict of buffered articles
self.__cache_size_cv: threading.Condition = threading.Condition(ARTICLE_COUNTER_LOCK)
self.__last_flush: float = 0
self.__non_contiguous_trigger: int = 0 # Force flush trigger
# On 32 bit we only allow the user to set 1GB
# For 64 bit we allow up to 4GB, in case somebody wants that
@@ -49,9 +62,62 @@ class ArticleCache:
if sabnzbd.MACOS or sabnzbd.WINDOWS or (struct.calcsize("P") * 8) == 64:
self.__cache_upper_limit = 4 * GIGI
def cache_info(self):
return ANFO(len(self.__article_table), abs(self.__cache_size), self.__cache_limit_org)
def change_direct_write(self, direct_write: bool) -> None:
self.__direct_write = direct_write and self.__cache_limit > 1
def stop(self):
self.shutdown = True
with self.__cache_size_cv:
self.__cache_size_cv.notify_all()
def should_flush(self) -> bool:
"""
Should we flush the cache?
Only when direct write is enabled and cache usage exceeds the non-contiguous trigger,
or when there are no active jobs and the cache is not empty.
"""
return (
self.__direct_write
and self.__cache_limit
and (
self.__cache_size > self.__non_contiguous_trigger
or self.__cache_size
and sabnzbd.Downloader.no_active_jobs()
)
)
def flush_cache(self) -> None:
"""In direct_write mode flush cache contents to file"""
forced: set[NzbFile] = set()
for article in self.__article_table.copy():
if not article.can_direct_write or article.nzf in forced:
continue
forced.add(article.nzf)
if time.monotonic() - self.__last_flush > 1:
logging.debug("Forcing write of %s", article.nzf.filepath)
sabnzbd.Assembler.process(article.nzf.nzo, article.nzf, allow_non_contiguous=True, article=article)
self.__last_flush = time.monotonic()
def run(self):
while True:
with self.__cache_size_cv:
self.__cache_size_cv.wait_for(
lambda: self.shutdown or self.should_flush(),
timeout=5.0,
)
if self.shutdown:
break
# Could be reached by timeout when paused and no further articles arrive
with self.__cache_size_cv:
if not self.should_flush():
continue
self.flush_cache()
time.sleep(_SECONDS_BETWEEN_FLUSHES)
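The `run` loop above follows the standard `threading.Condition.wait_for` pattern: producers notify the condition whenever the tracked size changes, and the worker wakes on notify, shutdown, or timeout. A minimal self-contained sketch (flush logic stubbed; names are illustrative, not SABnzbd's):

```python
import threading


class FlushWorker(threading.Thread):
    """Minimal sketch of the wait_for loop used by ArticleCache.run."""

    def __init__(self):
        super().__init__()
        self.shutdown = False
        self.pending = 0  # stand-in for cache usage
        self.trigger = 5  # stand-in for the flush trigger
        self.flushed = 0
        self.cv = threading.Condition()

    def add(self, n: int):
        with self.cv:
            self.pending += n
            self.cv.notify_all()  # wake the worker, as reserve_space does

    def stop(self):
        with self.cv:
            self.shutdown = True
            self.cv.notify_all()

    def should_flush(self) -> bool:
        return self.pending > self.trigger

    def run(self):
        while True:
            with self.cv:
                # Sleep until there is work, shutdown, or the timeout elapses
                self.cv.wait_for(lambda: self.shutdown or self.should_flush(), timeout=0.1)
                if self.shutdown:
                    break
                if not self.should_flush():
                    continue  # woken by timeout with nothing to do
                self.flushed += self.pending
                self.pending = 0
```

The timeout matters for the same reason noted in the hunk: a paused downloader may stop producing notifies while data still sits in the cache.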
def cache_info(self):
return ANFO(len(self.__article_table), abs(self.__cache_size), self.__cache_limit)
@synchronized(ARTICLE_COUNTER_LOCK)
def new_limit(self, limit: int):
"""Called when cache limit changes"""
self.__cache_limit_org = limit
@@ -59,31 +125,32 @@ class ArticleCache:
self.__cache_limit = self.__cache_upper_limit
else:
self.__cache_limit = min(limit, self.__cache_upper_limit)
# Set assembler_write_trigger to be the equivalent of ASSEMBLER_WRITE_THRESHOLD %
# of the total cache, assuming an article size of 750 000 bytes
self.assembler_write_trigger = int(self.__cache_limit * ASSEMBLER_WRITE_THRESHOLD / 100 / 750_000) + 1
logging.debug(
"Assembler trigger = %d",
self.assembler_write_trigger,
)
self.__non_contiguous_trigger = self.__cache_limit * ARTICLE_CACHE_NON_CONTIGUOUS_FLUSH_PERCENTAGE
if self.__cache_limit:
logging.debug("Article cache trigger:%s", to_units(self.__non_contiguous_trigger))
self.change_direct_write(cfg.direct_write())
@synchronized(ARTICLE_COUNTER_LOCK)
def reserve_space(self, data_size: int):
def reserve_space(self, data_size: int) -> bool:
"""Reserve space in the cache"""
self.__cache_size += data_size
if (usage := self.__cache_size + data_size) > self.__cache_limit:
return False
self.__cache_size = usage
self.__cache_size_cv.notify_all()
return True
@synchronized(ARTICLE_COUNTER_LOCK)
def free_reserved_space(self, data_size: int):
"""Remove previously reserved space"""
self.__cache_size -= data_size
self.__cache_size_cv.notify_all()
@synchronized(ARTICLE_COUNTER_LOCK)
def space_left(self) -> bool:
"""Is there space left in the set limit?"""
return self.__cache_size < self.__cache_limit
def save_article(self, article: Article, data: bytes):
def save_article(self, article: Article, data: bytearray):
"""Save article in cache, either memory or disk"""
nzo = article.nzf.nzo
# Skip if already post-processing or fully finished
@@ -91,7 +158,8 @@ class ArticleCache:
return
# Register article for bookkeeping in case the job is deleted
nzo.saved_articles.add(article)
with nzo.lock:
nzo.saved_articles.add(article)
if article.lowest_partnum and not (article.nzf.import_finished or article.nzf.filename_checked):
# Write the first-fetched articles to temporary file unless downloading
@@ -100,24 +168,17 @@ class ArticleCache:
self.__flush_article_to_disk(article, data)
return
if self.__cache_limit:
# Check if we exceed the limit
data_size = len(data)
self.reserve_space(data_size)
if self.space_left():
# Add new article to the cache
self.__article_table[article] = data
else:
# Return the space and save to disk
self.free_reserved_space(data_size)
self.__flush_article_to_disk(article, data)
# Check if we exceed the limit
if self.__cache_limit and self.reserve_space(len(data)):
# Add new article to the cache
self.__article_table[article] = data
else:
# No data saved in memory, direct to disk
self.__flush_article_to_disk(article, data)
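The simplified `save_article` path relies on `reserve_space` checking and reserving atomically, so concurrent savers cannot overshoot the limit. A sketch of that reserve-or-spill pattern, with illustrative names (the disk write is stubbed as a list):

```python
import threading


class BoundedCache:
    """Sketch of the reserve-or-spill pattern from save_article/reserve_space."""

    def __init__(self, limit: int):
        self.limit = limit
        self.size = 0
        self.lock = threading.RLock()
        self.table = {}
        self.spilled = []  # stand-in for __flush_article_to_disk

    def reserve(self, n: int) -> bool:
        # Check and reserve under one lock so the limit cannot be exceeded
        with self.lock:
            if self.size + n > self.limit:
                return False
            self.size += n
            return True

    def save(self, key, data: bytes):
        if self.limit and self.reserve(len(data)):
            self.table[key] = data  # fits: keep in memory
        else:
            self.spilled.append(key)  # no space (or no cache): straight to disk
```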
def load_article(self, article: Article):
def load_article(self, article: Article) -> Optional[bytearray]:
"""Load the data of the article"""
data = None
data: Optional[bytearray] = None
nzo = article.nzf.nzo
if article in self.__article_table:
@@ -131,9 +192,10 @@ class ArticleCache:
return data
elif article.art_id:
data = sabnzbd.filesystem.load_data(
article.art_id, nzo.admin_path, remove=True, do_pickle=False, silent=True
article.art_id, nzo.admin_path, remove=True, do_pickle=False, silent=True, mutable=True
)
nzo.saved_articles.discard(article)
with nzo.lock:
nzo.saved_articles.discard(article)
return data
def flush_articles(self):
@@ -161,10 +223,16 @@ class ArticleCache:
elif article.art_id:
sabnzbd.filesystem.remove_data(article.art_id, article.nzf.nzo.admin_path)
@staticmethod
def __flush_article_to_disk(article: Article, data):
def __flush_article_to_disk(self, article: Article, data: bytearray):
# Save data, but don't complain when destination folder is missing
# because this flush may come after completion of the NZO.
# Direct write to destination if cache is being used
if self.__cache_limit and self.__direct_write and sabnzbd.Assembler.assemble_article(article, data):
with article.nzf.nzo.lock:
article.nzf.nzo.saved_articles.discard(article)
return
# Fallback to disk cache
sabnzbd.filesystem.save_data(
data, article.get_art_id(), article.nzf.nzo.admin_path, do_pickle=False, silent=True
)


@@ -23,13 +23,16 @@ import os
import queue
import logging
import re
import threading
from threading import Thread
import ctypes
from typing import Tuple, Optional, List
from typing import Optional, NamedTuple, Union
import rarfile
import time
import sabctools
import sabnzbd
from sabnzbd.misc import get_all_passwords, match_str, SABRarFile
from sabnzbd.misc import get_all_passwords, match_str, SABRarFile, to_units
from sabnzbd.filesystem import (
set_permissions,
clip_path,
@@ -39,32 +42,222 @@ from sabnzbd.filesystem import (
has_unwanted_extension,
get_basename,
)
from sabnzbd.constants import Status, GIGI, MAX_ASSEMBLER_QUEUE
from sabnzbd.constants import (
Status,
GIGI,
ASSEMBLER_WRITE_THRESHOLD_FACTOR_APPEND,
ASSEMBLER_WRITE_THRESHOLD_FACTOR_DIRECT_WRITE,
ASSEMBLER_MAX_WRITE_THRESHOLD_DIRECT_WRITE,
SOFT_ASSEMBLER_QUEUE_LIMIT,
ASSEMBLER_DELAY_FACTOR_DIRECT_WRITE,
ARTICLE_CACHE_NON_CONTIGUOUS_FLUSH_PERCENTAGE,
ASSEMBLER_WRITE_INTERVAL,
)
import sabnzbd.cfg as cfg
from sabnzbd.nzbstuff import NzbObject, NzbFile
from sabnzbd.nzb import NzbFile, NzbObject, Article
import sabnzbd.par2file as par2file
class AssemblerTask(NamedTuple):
nzo: Optional[NzbObject] = None
nzf: Optional[NzbFile] = None
file_done: bool = False
allow_non_contiguous: bool = False
direct_write: bool = False
class Assembler(Thread):
def __init__(self):
super().__init__()
self.queue: queue.Queue[Tuple[Optional[NzbObject], Optional[NzbFile], Optional[bool]]] = queue.Queue()
self.max_queue_size: int = cfg.assembler_max_queue_size()
self.direct_write: bool = cfg.direct_write()
self.cache_limit: int = 0
# Contiguous bytes required to trigger append writes
self.append_trigger: int = 1
# Total bytes required to trigger direct-write assembles
self.direct_write_trigger: int = 1
self.delay_trigger: int = 1
self.queue: queue.Queue[AssemblerTask] = queue.Queue()
self.queued_lock = threading.Lock()
self.queued_nzf: set[str] = set()
self.queued_nzf_non_contiguous: set[str] = set()
self.queued_next_time: dict[str, float] = dict()
self.ready_bytes_lock = threading.Lock()
self.ready_bytes: dict[str, int] = dict()
def stop(self):
self.queue.put((None, None, None))
self.queue.put(AssemblerTask())
def process(self, nzo: NzbObject, nzf: Optional[NzbFile] = None, file_done: Optional[bool] = None):
self.queue.put((nzo, nzf, file_done))
def new_limit(self, limit: int):
"""Called when cache limit changes"""
self.cache_limit = limit
self.append_trigger = max(1, int(limit * ASSEMBLER_WRITE_THRESHOLD_FACTOR_APPEND))
self.direct_write_trigger = max(
1,
min(
max(1, int(limit * ASSEMBLER_WRITE_THRESHOLD_FACTOR_DIRECT_WRITE)),
ASSEMBLER_MAX_WRITE_THRESHOLD_DIRECT_WRITE,
),
)
self.calculate_delay_trigger()
self.change_direct_write(cfg.direct_write())
logging.debug(
"Assembler trigger append=%s, direct=%s, delay=%s",
to_units(self.append_trigger),
to_units(self.direct_write_trigger),
to_units(self.delay_trigger),
)
def queue_level(self) -> float:
return self.queue.qsize() / MAX_ASSEMBLER_QUEUE
def change_direct_write(self, direct_write: bool) -> None:
self.direct_write = direct_write and self.direct_write_trigger > 1
self.calculate_delay_trigger()
def calculate_delay_trigger(self):
"""Point at which downloader should start being delayed, recalculated when cache limit or direct write changes"""
self.delay_trigger = int(
max(
(
750_000 * self.max_queue_size * ASSEMBLER_DELAY_FACTOR_DIRECT_WRITE
if self.direct_write
else 750_000 * self.max_queue_size
),
(
self.cache_limit * ARTICLE_CACHE_NON_CONTIGUOUS_FLUSH_PERCENTAGE
if self.direct_write
else min(self.append_trigger * self.max_queue_size, int(self.cache_limit * 0.5))
),
)
)
def is_busy(self) -> bool:
"""Returns True if the assembler thread has at least one NzbFile it is assembling"""
return bool(self.queued_nzf or self.queued_nzf_non_contiguous)
def total_ready_bytes(self) -> int:
with self.ready_bytes_lock:
return sum(self.ready_bytes.values())
def update_ready_bytes(self, nzf: NzbFile, delta: int) -> int:
with self.ready_bytes_lock:
cur = self.ready_bytes.get(nzf.nzf_id, 0) + delta
if cur <= 0:
self.ready_bytes.pop(nzf.nzf_id, None)
else:
self.ready_bytes[nzf.nzf_id] = cur
return cur
def clear_ready_bytes(self, *nzfs: NzbFile) -> None:
with self.ready_bytes_lock:
for nzf in nzfs:
self.ready_bytes.pop(nzf.nzf_id, None)
self.queued_next_time.pop(nzf.nzf_id, None)
def process(
self,
nzo: NzbObject = None,
nzf: Optional[NzbFile] = None,
file_done: bool = False,
allow_non_contiguous: bool = False,
article: Optional[Article] = None,
) -> None:
if nzf is None:
# post-proc
self.queue.put(AssemblerTask(nzo))
return
# Track bytes pending being written for this nzf
if self.should_track_ready_bytes(article, allow_non_contiguous):
ready_bytes = self.update_ready_bytes(nzf, article.decoded_size)
else:
ready_bytes = 0
article_has_first_part = bool(article and article.lowest_partnum)
if article_has_first_part:
self.queued_next_time[nzf.nzf_id] = time.monotonic() + ASSEMBLER_WRITE_INTERVAL
if not self.should_queue_nzf(
nzf,
article_has_first_part=article_has_first_part,
filename_checked=nzf.filename_checked,
import_finished=nzf.import_finished,
file_done=file_done,
allow_non_contiguous=allow_non_contiguous,
ready_bytes=ready_bytes,
):
return
with self.queued_lock:
# Recheck not already in the normal queue under lock, but always enqueue when file_done
if not file_done and nzf.nzf_id in self.queued_nzf:
return
if allow_non_contiguous:
if not file_done and nzf.nzf_id in self.queued_nzf_non_contiguous:
return
self.queued_nzf_non_contiguous.add(nzf.nzf_id)
else:
self.queued_nzf.add(nzf.nzf_id)
self.queued_next_time[nzf.nzf_id] = time.monotonic() + ASSEMBLER_WRITE_INTERVAL
can_direct_write = self.direct_write and nzf.type == "yenc"
self.queue.put(AssemblerTask(nzo, nzf, file_done, allow_non_contiguous, can_direct_write))
def should_queue_nzf(
self,
nzf: NzbFile,
*,
article_has_first_part: bool,
filename_checked: bool,
import_finished: bool,
file_done: bool,
allow_non_contiguous: bool,
ready_bytes: int,
) -> bool:
# Always queue if done
if file_done:
return True
if nzf.nzf_id in self.queued_nzf:
return False
# Always write
if article_has_first_part and filename_checked and not import_finished:
return True
next_ready = (next_article := nzf.assembler_next_article) and (next_article.decoded or next_article.on_disk)
# Trigger every 5 seconds if next article is decoded or on_disk
if next_ready and time.monotonic() > self.queued_next_time.get(nzf.nzf_id, 0):
return True
# Append
if not self.direct_write or nzf.type != "yenc":
return nzf.contiguous_ready_bytes() >= self.append_trigger
# Direct Write
if allow_non_contiguous:
return True
# Direct Write ready bytes trigger if next is also ready
if next_ready and ready_bytes >= self.direct_write_trigger:
return True
return False
@staticmethod
def should_track_ready_bytes(article: Optional[Article], allow_non_contiguous: bool) -> bool:
""""""
return article and not allow_non_contiguous and article.decoded_size
def delay(self) -> float:
"""Calculate how long if at all the downloader thread should sleep to allow the assembler to catch up"""
ready_total = self.total_ready_bytes()
# Below trigger: no delay possible
if ready_total <= self.delay_trigger:
return 0
pressure = (ready_total - self.delay_trigger) / max(1.0, self.cache_limit - self.delay_trigger)
if pressure <= SOFT_ASSEMBLER_QUEUE_LIMIT:
return 0
# 50-100%: 0-0.25 seconds, capped at 0.15
sleep = min((pressure - SOFT_ASSEMBLER_QUEUE_LIMIT) / 2, 0.15)
return max(0.001, sleep)
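The delay calculation above maps "ready bytes over the trigger" to a pressure fraction of the remaining cache headroom, then ramps a capped sleep. A standalone sketch, with `soft_limit` standing in for `SOFT_ASSEMBLER_QUEUE_LIMIT` (whose real value lives in constants; 0.5 is an assumption here):

```python
def assembler_delay(ready_total: int, delay_trigger: int, cache_limit: int,
                    soft_limit: float = 0.5) -> float:
    """Sketch of Assembler.delay(); soft_limit is an assumed default."""
    if ready_total <= delay_trigger:
        return 0.0
    # Fraction of the headroom between the trigger and the cache limit consumed
    pressure = (ready_total - delay_trigger) / max(1.0, cache_limit - delay_trigger)
    if pressure <= soft_limit:
        return 0.0
    # Above the soft limit, ramp the sleep up, capped at 0.15 seconds
    return max(0.001, min((pressure - soft_limit) / 2, 0.15))
```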
def run(self):
while 1:
# Set NzbObject and NzbFile objects to None so references
# from this thread do not keep the objects alive (see #1628)
nzo = nzf = None
nzo, nzf, file_done = self.queue.get()
nzo, nzf, file_done, allow_non_contiguous, direct_write = self.queue.get()
if not nzo:
logging.debug("Shutting down assembler")
break
@@ -74,11 +267,15 @@ class Assembler(Thread):
if file_done and not sabnzbd.Downloader.paused:
self.diskspace_check(nzo, nzf)
# Prepare filepath
if filepath := nzf.prepare_filepath():
try:
# Prepare filepath
if not (filepath := nzf.prepare_filepath()):
logging.debug("Prepare filepath failed for file %s in job %s", nzf.filename, nzo.final_name)
continue
try:
logging.debug("Decoding part of %s", filepath)
self.assemble(nzo, nzf, file_done)
self.assemble(nzo, nzf, file_done, allow_non_contiguous, direct_write)
# Continue after partly written data
if not file_done:
@@ -121,9 +318,16 @@ class Assembler(Thread):
except Exception:
logging.error(T("Fatal error in Assembler"), exc_info=True)
break
finally:
with self.queued_lock:
if allow_non_contiguous:
self.queued_nzf_non_contiguous.discard(nzf.nzf_id)
else:
self.queued_nzf.discard(nzf.nzf_id)
else:
sabnzbd.NzbQueue.remove(nzo.nzo_id, cleanup=False)
sabnzbd.PostProcessor.process(nzo)
self.clear_ready_bytes(*nzo.files)
@staticmethod
def diskspace_check(nzo: NzbObject, nzf: NzbFile):
@@ -161,52 +365,115 @@ class Assembler(Thread):
sabnzbd.emailer.diskfull_mail()
@staticmethod
def assemble(nzo: NzbObject, nzf: NzbFile, file_done: bool):
def assemble(nzo: NzbObject, nzf: NzbFile, file_done: bool, allow_non_contiguous: bool, direct_write: bool) -> None:
"""Assemble a NZF from its table of articles
1) Partial write: write what we have
2) Nothing written before: write all
"""
load_article = sabnzbd.ArticleCache.load_article
downloader = sabnzbd.Downloader
decodetable = nzf.decodetable
fd: Optional[int] = None
skipped: bool = False # have any articles been skipped
offset: int = 0 # sequential offset for append writes
try:
# Resume assembly from where we got to previously
for idx in range(nzf.assembler_next_index, len(decodetable)):
article = decodetable[idx]
# We write large article-sized chunks, so we can safely skip the buffering of Python
with open(nzf.filepath, "ab", buffering=0) as fout:
for article in nzf.decodetable:
# Break if deleted during writing
if nzo.status is Status.DELETED:
break
# allow_non_contiguous is when the cache forces the assembler to write all articles, even if it leaves gaps.
# In most cases we can stop at the first article that has not been tried, because they are requested in order.
# However, if we are paused then always consider the whole decodetable to ensure everything possible is written.
if allow_non_contiguous and not article.tries and not downloader.paused:
break
# Skip already written articles
if article.on_disk:
if fd is not None and article.decoded_size is not None:
# Move the file descriptor forward past this article
offset += article.decoded_size
if not skipped:
with nzf.lock:
nzf.assembler_next_index = idx + 1
continue
# Write all decoded articles
if article.decoded:
# Could be empty in case nzo was deleted
if data := sabnzbd.ArticleCache.load_article(article):
written = fout.write(data)
# In raw/non-buffered mode fout.write may not write everything requested:
# https://docs.python.org/3/library/io.html?highlight=write#io.RawIOBase.write
while written < len(data):
written += fout.write(data[written:])
nzf.update_crc32(article.crc32, len(data))
article.on_disk = True
else:
logging.info("No data found when trying to write %s", article)
else:
# stop if next piece not yet decoded
if not article.decoded:
# If the article was not decoded but the file
# is done, it is just a missing piece, so keep writing
if file_done:
continue
# We reach an article that was not decoded
if allow_non_contiguous:
skipped = True
continue
break
# Could be empty in case nzo was deleted
data = load_article(article)
if not data:
if file_done:
continue
if allow_non_contiguous:
skipped = True
continue
else:
# We reach an article that was not decoded
logging.info("No data found when trying to write %s", article)
break
# If required open the file
if fd is None:
fd, offset, direct_write = Assembler.open(
nzf, direct_write and article.can_direct_write, article.file_size
)
if not direct_write and allow_non_contiguous:
# We can only get here with allow_non_contiguous if direct_write was requested; file_done is always queued separately
break
if direct_write and article.can_direct_write:
offset += Assembler.write(fd, idx, nzf, article, data)
else:
if direct_write and skipped and not file_done:
# If we have already skipped an article then need to abort, unless this is the final assemble
break
offset += Assembler.write(fd, idx, nzf, article, data, offset)
finally:
if fd is not None:
os.close(fd)
# Final steps
if file_done:
sabnzbd.Assembler.clear_ready_bytes(nzf)
set_permissions(nzf.filepath)
nzf.assembled = True
@staticmethod
def assemble_article(article: Article, data: bytearray) -> bool:
"""Write a single article to disk"""
if not article.can_direct_write:
return False
nzf = article.nzf
with nzf.file_lock:
fd, _, direct_write = Assembler.open(nzf, True, article.file_size)
try:
if not direct_write:
cfg.direct_write.set(False)
return False
Assembler.write(fd, None, nzf, article, data)
except FileNotFoundError:
# nzo has probably been deleted, ArticleCache tries the fallback and handles it
return False
finally:
os.close(fd)
return True
@staticmethod
def check_encrypted_and_unwanted(nzo: NzbObject, nzf: NzbFile):
"""Encryption and unwanted extension detection"""
@@ -244,12 +511,77 @@ class Assembler(Thread):
nzo.fail_msg = T("Aborted, unwanted extension detected")
sabnzbd.NzbQueue.end_job(nzo)
@staticmethod
def write(
fd: int, nzf_index: Optional[int], nzf: NzbFile, article: Article, data: bytearray, offset: Optional[int] = None
) -> int:
"""Write data at position in a file"""
pos = article.data_begin if offset is None else offset
written = Assembler._write(fd, nzf, data, pos)
# In raw/non-buffered mode os.write may not write everything requested:
# https://docs.python.org/3/library/io.html?highlight=write#io.RawIOBase.write
if written < len(data) and (mv := memoryview(data)):
while written < len(data):
written += Assembler._write(fd, nzf, mv[written:], pos + written)
nzf.update_crc32(article.crc32, len(data))
article.on_disk = True
sabnzbd.Assembler.update_ready_bytes(nzf, -len(data))
with nzf.lock:
# assembler_next_index is the lowest index that has not yet been written sequentially from the start of the file.
# If this was the next required index to remain sequential, it can be incremented which allows the assembler to
# resume without rechecking articles that are already known to be on disk.
# If nzf_index is None, determine it now.
if nzf_index is None:
idx = nzf.assembler_next_index
if idx < len(nzf.decodetable) and article == nzf.decodetable[idx]:
nzf_index = idx
if nzf_index is not None and nzf.assembler_next_index == nzf_index:
nzf.assembler_next_index += 1
return written
@staticmethod
def _write(fd: int, nzf: NzbFile, data: Union[bytearray, memoryview], offset: int) -> int:
if sabnzbd.WINDOWS:
# pwrite is not implemented on Windows, so fall back to os.lseek and os.write
# Must lock since it is possible to write from multiple threads (assembler + downloader)
with nzf.file_lock:
os.lseek(fd, offset, os.SEEK_SET)
return os.write(fd, data)
else:
return os.pwrite(fd, data, offset)
@staticmethod
def open(nzf: NzbFile, direct_write: bool, file_size: int) -> tuple[int, int, bool]:
"""Open file for nzf
Use direct_write if requested, with a fallback to setting the current file position for append mode
:returns (file_descriptor, current_offset, can_direct_write)
"""
with nzf.file_lock:
# Read the current umask (set and immediately restore it) to create a file with the same permissions as `with open(...)`
os.umask(os.umask(0))
fd = os.open(nzf.filepath, os.O_CREAT | os.O_WRONLY | getattr(os, "O_BINARY", 0), 0o666)
offset = nzf.contiguous_offset()
os.lseek(fd, offset, os.SEEK_SET)
if direct_write:
if not file_size:
direct_write = False
if os.fstat(fd).st_size == 0:
try:
sabctools.sparse(fd, file_size)
except OSError:
logging.debug("Sparse call failed for %s", nzf.filepath)
cfg.direct_write.set(False)
direct_write = False
return fd, offset, direct_write
RE_SUBS = re.compile(r"\W+sub|subs|subpack|subtitle|subtitles(?![a-z])", re.I)
SAFE_EXTS = (".mkv", ".mp4", ".avi", ".wmv", ".mpg", ".webm")
def is_cloaked(nzo: NzbObject, path: str, names: List[str]) -> bool:
def is_cloaked(nzo: NzbObject, path: str, names: list[str]) -> bool:
"""Return True if this is likely to be a cloaked encrypted post"""
fname = get_basename(get_filename(path.lower()))
for name in names:
@@ -278,7 +610,7 @@ def is_cloaked(nzo: NzbObject, path: str, names: List[str]) -> bool:
return False
def check_encrypted_and_unwanted_files(nzo: NzbObject, filepath: str) -> Tuple[bool, Optional[str]]:
def check_encrypted_and_unwanted_files(nzo: NzbObject, filepath: str) -> tuple[bool, Optional[str]]:
"""Combines check for unwanted and encrypted files to save on CPU and IO"""
encrypted = False
unwanted = None


@@ -22,7 +22,7 @@ sabnzbd.bpsmeter - bpsmeter
import time
import logging
import re
from typing import List, Dict, Optional
from typing import Optional
import sabnzbd
from sabnzbd.constants import BYTES_FILE_NAME, KIBI
@@ -132,20 +132,20 @@ class BPSMeter:
self.speed_log_time = t
self.last_update = t
self.bps = 0.0
self.bps_list: List[int] = []
self.bps_list: list[int] = []
self.server_bps: Dict[str, float] = {}
self.cached_amount: Dict[str, int] = {}
self.server_bps: dict[str, float] = {}
self.cached_amount: dict[str, int] = {}
self.sum_cached_amount: int = 0
self.day_total: Dict[str, int] = {}
self.week_total: Dict[str, int] = {}
self.month_total: Dict[str, int] = {}
self.grand_total: Dict[str, int] = {}
self.day_total: dict[str, int] = {}
self.week_total: dict[str, int] = {}
self.month_total: dict[str, int] = {}
self.grand_total: dict[str, int] = {}
self.timeline_total: Dict[str, Dict[str, int]] = {}
self.timeline_total: dict[str, dict[str, int]] = {}
self.article_stats_tried: Dict[str, Dict[str, int]] = {}
self.article_stats_failed: Dict[str, Dict[str, int]] = {}
self.article_stats_tried: dict[str, dict[str, int]] = {}
self.article_stats_failed: dict[str, dict[str, int]] = {}
self.delayed_assembler: int = 0
@@ -254,8 +254,6 @@ class BPSMeter:
self.week_total[server] = 0
if server not in self.month_total:
self.month_total[server] = 0
if server not in self.month_total:
self.month_total[server] = 0
if server not in self.grand_total:
self.grand_total[server] = 0
if server not in self.timeline_total:
@@ -302,45 +300,51 @@ class BPSMeter:
for server in sabnzbd.Downloader.servers[:]:
self.init_server_stats(server.id)
# Cache dict references for faster access
day_total = self.day_total
week_total = self.week_total
month_total = self.month_total
grand_total = self.grand_total
timeline_total = self.timeline_total
cached_amount = self.cached_amount
server_bps = self.server_bps
start_time = self.start_time
last_update = self.last_update
# Minimum epsilon to avoid division by zero
dt_total = max(t - start_time, 1e-6)
dt_last = max(last_update - start_time, 1e-6)
# Add amounts that have been stored temporarily to statistics
for srv in self.cached_amount:
if self.cached_amount[srv]:
self.day_total[srv] += self.cached_amount[srv]
self.week_total[srv] += self.cached_amount[srv]
self.month_total[srv] += self.cached_amount[srv]
self.grand_total[srv] += self.cached_amount[srv]
self.timeline_total[srv][self.day_label] += self.cached_amount[srv]
if cached := self.cached_amount[srv]:
day_total[srv] += cached
week_total[srv] += cached
month_total[srv] += cached
grand_total[srv] += cached
timeline_total[srv][self.day_label] += cached
# Reset for next time
cached_amount[srv] = 0
# Update server bps
try:
self.server_bps[srv] = (
self.server_bps[srv] * (self.last_update - self.start_time) + self.cached_amount[srv]
) / (t - self.start_time)
except ZeroDivisionError:
self.server_bps[srv] = 0.0
# Reset for next time
self.cached_amount[srv] = 0
server_bps[srv] = (server_bps[srv] * dt_last + cached) / dt_total
# Quota check
total_cached = self.sum_cached_amount
if self.have_quota and self.quota_enabled:
self.left -= self.sum_cached_amount
self.left -= total_cached
self.check_quota()
# Speedometer
try:
self.bps = (self.bps * (self.last_update - self.start_time) + self.sum_cached_amount) / (
t - self.start_time
)
except ZeroDivisionError:
self.bps = 0.0
self.bps = (self.bps * dt_last + total_cached) / dt_total
self.sum_cached_amount = 0
self.last_update = t
check_time = t - 5.0
if self.start_time < check_time:
if start_time < check_time:
self.start_time = check_time
if self.bps < 0.01:
@@ -382,7 +386,7 @@ class BPSMeter:
# Always trim the list to the max-length
if len(self.bps_list) > BPS_LIST_MAX:
self.bps_list = self.bps_list[len(self.bps_list) - BPS_LIST_MAX :]
self.bps_list = self.bps_list[-BPS_LIST_MAX:]
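The refactored speedometer above keeps a smoothed byte rate: the previous average is weighted by the elapsed measurement window, the freshly cached bytes are folded in over the full window, and a minimum epsilon replaces the old `ZeroDivisionError` handler. A minimal sketch of that update (the function name is illustrative, not from the source):

```python
def update_bps(bps: float, start_time: float, last_update: float, now: float, new_bytes: int) -> float:
    """Running byte-rate: weight the old average by its window, then add the new bytes."""
    # Minimum epsilon avoids division by zero without a try/except
    dt_total = max(now - start_time, 1e-6)
    dt_last = max(last_update - start_time, 1e-6)
    return (bps * dt_last + new_bytes) / dt_total
```

The negative-slice change in the same hunk (`bps_list[-BPS_LIST_MAX:]`) is the idiomatic way to keep only the newest samples and is equivalent to the removed length arithmetic.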
def get_sums(self):
"""return tuple of grand, month, week, day totals"""

View File

@@ -25,7 +25,8 @@ import re
import argparse
import socket
import ipaddress
from typing import List, Tuple, Union
import threading
from typing import Union
import sabnzbd
from sabnzbd.config import (
@@ -52,12 +53,14 @@ from sabnzbd.constants import (
DEF_STD_WEB_COLOR,
DEF_HTTPS_CERT_FILE,
DEF_HTTPS_KEY_FILE,
DEF_MAX_ASSEMBLER_QUEUE,
DEF_PIPELINING_REQUESTS,
)
from sabnzbd.filesystem import same_directory, real_path, is_valid_script, is_network_path
# Validators are currently only made for string/list-of-strings
# and return those on success or an error message.
ValidateResult = Union[Tuple[None, str], Tuple[None, List[str]], Tuple[str, None]]
ValidateResult = Union[tuple[None, str], tuple[None, list[str]], tuple[str, None]]
##############################################################################
@@ -122,21 +125,21 @@ def supported_unrar_parameters(value: str) -> ValidateResult:
return None, value
def all_lowercase(value: Union[str, List]) -> Tuple[None, Union[str, List]]:
def all_lowercase(value: Union[str, list]) -> tuple[None, Union[str, list]]:
"""Lowercase and strip everything!"""
if isinstance(value, list):
return None, [item.lower().strip() for item in value]
return None, value.lower().strip()
def lower_case_ext(value: Union[str, List]) -> Tuple[None, Union[str, List]]:
def lower_case_ext(value: Union[str, list]) -> tuple[None, Union[str, list]]:
"""Generate lower case extension(s), without dot"""
if isinstance(value, list):
return None, [item.lower().strip(" .") for item in value]
return None, value.lower().strip(" .")
def validate_single_tag(value: List[str]) -> Tuple[None, List[str]]:
def validate_single_tag(value: list[str]) -> tuple[None, list[str]]:
"""Don't split single indexer tags like "TV > HD"
into ['TV', '>', 'HD']
"""
@@ -146,7 +149,7 @@ def validate_single_tag(value: List[str]) -> Tuple[None, List[str]]:
return None, value
def validate_url_base(value: str) -> Tuple[None, str]:
def validate_url_base(value: str) -> tuple[None, str]:
"""Strips the right slash and adds starting slash, if not present"""
if value and isinstance(value, str):
if not value.startswith("/"):
@@ -158,7 +161,7 @@ def validate_url_base(value: str) -> Tuple[None, str]:
RE_VAL = re.compile(r"[^@ ]+@[^.@ ]+\.[^.@ ]")
def validate_email(value: Union[List, str]) -> ValidateResult:
def validate_email(value: Union[list, str]) -> ValidateResult:
if email_endjob() or email_full() or email_rss():
if isinstance(value, list):
values = value
@@ -285,7 +288,7 @@ def validate_download_vs_complete_dir(root: str, value: str, default: str):
return validate_safedir(root, value, default)
def validate_scriptdir_not_appdir(root: str, value: str, default: str) -> Tuple[None, str]:
def validate_scriptdir_not_appdir(root: str, value: str, default: str) -> tuple[None, str]:
"""Warn users to not use the Program Files folder for their scripts"""
# Need to add separator so /mnt/sabnzbd and /mnt/sabnzbd-data are not detected as equal
if value and same_directory(sabnzbd.DIR_PROG, os.path.join(root, value)):
@@ -298,7 +301,7 @@ def validate_scriptdir_not_appdir(root: str, value: str, default: str) -> Tuple[
return None, value
def validate_default_if_empty(root: str, value: str, default: str) -> Tuple[None, str]:
def validate_default_if_empty(root: str, value: str, default: str) -> tuple[None, str]:
"""If value is empty, return default"""
if value:
return None, value
@@ -505,7 +508,8 @@ no_penalties = OptionBool("misc", "no_penalties", False)
x_frame_options = OptionBool("misc", "x_frame_options", True)
allow_old_ssl_tls = OptionBool("misc", "allow_old_ssl_tls", False)
enable_season_sorting = OptionBool("misc", "enable_season_sorting", True)
verify_xff_header = OptionBool("misc", "verify_xff_header", False)
verify_xff_header = OptionBool("misc", "verify_xff_header", True)
direct_write = OptionBool("misc", "direct_write", True)
# Text values
rss_odd_titles = OptionList("misc", "rss_odd_titles", ["nzbindex.nl/", "nzbindex.com/", "nzbclub.com/"])
@@ -527,6 +531,7 @@ local_ranges = OptionList("misc", "local_ranges", protect=True)
max_url_retries = OptionNumber("misc", "max_url_retries", 10, minval=1)
downloader_sleep_time = OptionNumber("misc", "downloader_sleep_time", 10, minval=0)
receive_threads = OptionNumber("misc", "receive_threads", 2, minval=1)
assembler_max_queue_size = OptionNumber("misc", "assembler_max_queue_size", DEF_MAX_ASSEMBLER_QUEUE, minval=1)
switchinterval = OptionNumber("misc", "switchinterval", 0.005, minval=0.001)
ssdp_broadcast_interval = OptionNumber("misc", "ssdp_broadcast_interval", 15, minval=1, maxval=600)
ext_rename_ignore = OptionList("misc", "ext_rename_ignore", validation=lower_case_ext)
@@ -740,6 +745,13 @@ def new_limit():
if sabnzbd.__INITIALIZED__:
# Only update after full startup
sabnzbd.ArticleCache.new_limit(cache_limit.get_int())
sabnzbd.Assembler.new_limit(sabnzbd.ArticleCache.cache_info().cache_limit)
def new_direct_write():
"""Callback for direct write changes"""
sabnzbd.Assembler.change_direct_write(bool(direct_write()))
sabnzbd.ArticleCache.change_direct_write(bool(direct_write()))
def guard_restart():

View File

@@ -28,7 +28,7 @@ import time
import uuid
import io
import zipfile
from typing import List, Dict, Any, Callable, Optional, Union, Tuple
from typing import Any, Callable, Optional, Union
from urllib.parse import urlparse
import configobj
@@ -42,6 +42,7 @@ from sabnzbd.constants import (
CONFIG_BACKUP_HTTPS,
DEF_INI_FILE,
DEF_SORTER_RENAME_SIZE,
DEF_PIPELINING_REQUESTS,
)
from sabnzbd.decorators import synchronized
from sabnzbd.filesystem import clip_path, real_path, create_real_path, renamer, remove_file, is_writable
@@ -101,14 +102,14 @@ class Option:
def get_string(self) -> str:
return str(self.get())
def get_dict(self, for_public_api: bool = False) -> Dict[str, Any]:
def get_dict(self, for_public_api: bool = False) -> dict[str, Any]:
"""Return value as a dictionary.
Will not show non-public options if needed for the API"""
if not self.__public and for_public_api:
return {}
return {self.__keyword: self.get()}
def set_dict(self, values: Dict[str, Any]):
def set_dict(self, values: dict[str, Any]):
"""Set value based on dictionary"""
if not self.__protect:
try:
@@ -209,7 +210,8 @@ class OptionBool(Option):
super().set(sabnzbd.misc.bool_conv(value))
def __call__(self) -> int:
"""get() replacement"""
"""Many places assume 0/1 is used for historical reasons.
Using pure bools breaks in random places"""
return int(self.get())
@@ -307,7 +309,7 @@ class OptionList(Option):
self,
section: str,
keyword: str,
default_val: Union[str, List, None] = None,
default_val: Union[str, list, None] = None,
validation: Optional[Callable] = None,
add: bool = True,
public: bool = True,
@@ -318,7 +320,7 @@ class OptionList(Option):
default_val = []
super().__init__(section, keyword, default_val, add=add, public=public, protect=protect)
def set(self, value: Union[str, List]) -> Optional[str]:
def set(self, value: Union[str, list]) -> Optional[str]:
"""Set the list given a comma-separated string or a list"""
error = None
if value is not None:
@@ -341,7 +343,7 @@ class OptionList(Option):
"""Return the default list as a comma-separated string"""
return ", ".join(self.default)
def __call__(self) -> List[str]:
def __call__(self) -> list[str]:
"""get() replacement"""
return self.get()
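`OptionList.set` accepts either a real list or a comma-separated string, and `__call__` stands in for `get()`. A reduced sketch of that pattern (class name is illustrative; the validation hook and config persistence are omitted):

```python
from typing import Optional, Union

class MiniOptionList:
    """Reduced sketch of the comma-separated list option pattern."""

    def __init__(self, default_val: Optional[list[str]] = None):
        self._value: list[str] = default_val or []

    def set(self, value: Union[str, list]) -> Optional[str]:
        """Set the list given a comma-separated string or a list"""
        if value is None:
            return None
        if isinstance(value, str):
            value = [item.strip() for item in value.split(",") if item.strip()]
        self._value = value
        return None  # a validator could return an error message here

    def __call__(self) -> list[str]:
        """get() replacement"""
        return self._value
```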
@@ -406,7 +408,7 @@ class OptionPassword(Option):
return "*" * 10
return ""
def get_dict(self, for_public_api: bool = False) -> Dict[str, str]:
def get_dict(self, for_public_api: bool = False) -> dict[str, str]:
"""Return value a dictionary"""
if for_public_api:
return {self.keyword: self.get_stars()}
@@ -444,6 +446,7 @@ class ConfigServer:
self.enable = OptionBool(name, "enable", True, add=False)
self.required = OptionBool(name, "required", False, add=False)
self.optional = OptionBool(name, "optional", False, add=False)
self.pipelining_requests = OptionNumber(name, "pipelining_requests", DEF_PIPELINING_REQUESTS, 1, 20, add=False)
self.retention = OptionNumber(name, "retention", 0, add=False)
self.expire_date = OptionStr(name, "expire_date", add=False)
self.quota = OptionStr(name, "quota", add=False)
@@ -454,7 +457,7 @@ class ConfigServer:
self.set_dict(values)
add_to_database("servers", self.__name, self)
def set_dict(self, values: Dict[str, Any]):
def set_dict(self, values: dict[str, Any]):
"""Set one or more fields, passed as dictionary"""
# Replace usage_at_start value with most recent statistics if the user changes the quota value
# Only when we are updating it from the Config
@@ -476,6 +479,7 @@ class ConfigServer:
"enable",
"required",
"optional",
"pipelining_requests",
"retention",
"expire_date",
"quota",
@@ -491,7 +495,7 @@ class ConfigServer:
if not self.displayname():
self.displayname.set(self.__name)
def get_dict(self, for_public_api: bool = False) -> Dict[str, Any]:
def get_dict(self, for_public_api: bool = False) -> dict[str, Any]:
"""Return a dictionary with all attributes"""
output_dict = {}
output_dict["name"] = self.__name
@@ -511,6 +515,7 @@ class ConfigServer:
output_dict["enable"] = self.enable()
output_dict["required"] = self.required()
output_dict["optional"] = self.optional()
output_dict["pipelining_requests"] = self.pipelining_requests()
output_dict["retention"] = self.retention()
output_dict["expire_date"] = self.expire_date()
output_dict["quota"] = self.quota()
@@ -531,7 +536,7 @@ class ConfigServer:
class ConfigCat:
"""Class defining a single category"""
def __init__(self, name: str, values: Dict[str, Any]):
def __init__(self, name: str, values: dict[str, Any]):
self.__name = clean_section_name(name)
name = "categories," + self.__name
@@ -545,7 +550,7 @@ class ConfigCat:
self.set_dict(values)
add_to_database("categories", self.__name, self)
def set_dict(self, values: Dict[str, Any]):
def set_dict(self, values: dict[str, Any]):
"""Set one or more fields, passed as dictionary"""
for kw in ("order", "pp", "script", "dir", "newzbin", "priority"):
try:
@@ -554,7 +559,7 @@ class ConfigCat:
except KeyError:
continue
def get_dict(self, for_public_api: bool = False) -> Dict[str, Any]:
def get_dict(self, for_public_api: bool = False) -> dict[str, Any]:
"""Return a dictionary with all attributes"""
output_dict = {}
output_dict["name"] = self.__name
@@ -589,7 +594,7 @@ class ConfigSorter:
self.set_dict(values)
add_to_database("sorters", self.__name, self)
def set_dict(self, values: Dict[str, Any]):
def set_dict(self, values: dict[str, Any]):
"""Set one or more fields, passed as dictionary"""
for kw in ("order", "min_size", "multipart_label", "sort_string", "sort_cats", "sort_type", "is_active"):
try:
@@ -598,7 +603,7 @@ class ConfigSorter:
except KeyError:
continue
def get_dict(self, for_public_api: bool = False) -> Dict[str, Any]:
def get_dict(self, for_public_api: bool = False) -> dict[str, Any]:
"""Return a dictionary with all attributes"""
output_dict = {}
output_dict["name"] = self.__name
@@ -639,7 +644,7 @@ class OptionFilters(Option):
return
self.set(lst)
def update(self, pos: int, value: Tuple):
def update(self, pos: int, value: tuple):
"""Update filter 'pos' definition, value is a list
Append if 'pos' outside list
"""
@@ -659,14 +664,14 @@ class OptionFilters(Option):
return
self.set(lst)
def get_dict(self, for_public_api: bool = False) -> Dict[str, str]:
def get_dict(self, for_public_api: bool = False) -> dict[str, str]:
"""Return filter list as a dictionary with keys 'filter[0-9]+'"""
output_dict = {}
for n, rss_filter in enumerate(self.get()):
output_dict[f"filter{n}"] = rss_filter
return output_dict
def set_dict(self, values: Dict[str, Any]):
def set_dict(self, values: dict[str, Any]):
"""Create filter list from dictionary with keys 'filter[0-9]+'"""
filters = []
# We don't know how many filters there are, so just assume all values are filters
@@ -677,7 +682,7 @@ class OptionFilters(Option):
if filters:
self.set(filters)
def __call__(self) -> List[List[str]]:
def __call__(self) -> list[list[str]]:
"""get() replacement"""
return self.get()
@@ -701,7 +706,7 @@ class ConfigRSS:
self.set_dict(values)
add_to_database("rss", self.__name, self)
def set_dict(self, values: Dict[str, Any]):
def set_dict(self, values: dict[str, Any]):
"""Set one or more fields, passed as dictionary"""
for kw in ("uri", "cat", "pp", "script", "priority", "enable"):
try:
@@ -711,7 +716,7 @@ class ConfigRSS:
continue
self.filters.set_dict(values)
def get_dict(self, for_public_api: bool = False) -> Dict[str, Any]:
def get_dict(self, for_public_api: bool = False) -> dict[str, Any]:
"""Return a dictionary with all attributes"""
output_dict = {}
output_dict["name"] = self.__name
@@ -755,7 +760,7 @@ AllConfigTypes = Union[
ConfigRSS,
ConfigServer,
]
CFG_DATABASE: Dict[str, Dict[str, AllConfigTypes]] = {}
CFG_DATABASE: dict[str, dict[str, AllConfigTypes]] = {}
@synchronized(CONFIG_LOCK)
@@ -1103,7 +1108,7 @@ def restore_config_backup(config_backup_data: bytes):
@synchronized(CONFIG_LOCK)
def get_servers() -> Dict[str, ConfigServer]:
def get_servers() -> dict[str, ConfigServer]:
global CFG_DATABASE
try:
return CFG_DATABASE["servers"]
@@ -1112,7 +1117,7 @@ def get_servers() -> Dict[str, ConfigServer]:
@synchronized(CONFIG_LOCK)
def get_sorters() -> Dict[str, ConfigSorter]:
def get_sorters() -> dict[str, ConfigSorter]:
global CFG_DATABASE
try:
return CFG_DATABASE["sorters"]
@@ -1120,7 +1125,7 @@ def get_sorters() -> Dict[str, ConfigSorter]:
return {}
def get_ordered_sorters() -> List[Dict]:
def get_ordered_sorters() -> list[dict]:
"""Return sorters as an ordered list"""
database_sorters = get_sorters()
@@ -1131,7 +1136,7 @@ def get_ordered_sorters() -> List[Dict]:
@synchronized(CONFIG_LOCK)
def get_categories() -> Dict[str, ConfigCat]:
def get_categories() -> dict[str, ConfigCat]:
"""Return link to categories section.
This section will always contain special category '*'
"""
@@ -1163,7 +1168,7 @@ def get_category(cat: str = "*") -> ConfigCat:
return cats["*"]
def get_ordered_categories() -> List[Dict]:
def get_ordered_categories() -> list[dict]:
"""Return list-copy of categories section that's ordered
by user's ordering including Default-category
"""
@@ -1183,7 +1188,7 @@ def get_ordered_categories() -> List[Dict]:
@synchronized(CONFIG_LOCK)
def get_rss() -> Dict[str, ConfigRSS]:
def get_rss() -> dict[str, ConfigRSS]:
global CFG_DATABASE
try:
# We have to remove non-separator commas by detecting if they are valid URL's

View File

@@ -50,7 +50,7 @@ RENAMES_FILE = "__renames__"
ATTRIB_FILE = "SABnzbd_attrib"
REPAIR_REQUEST = "repair-all.sab"
SABCTOOLS_VERSION_REQUIRED = "8.2.6"
SABCTOOLS_VERSION_REQUIRED = "9.3.1"
DB_HISTORY_VERSION = 1
DB_HISTORY_NAME = "history%s.db" % DB_HISTORY_VERSION
@@ -97,12 +97,19 @@ CONFIG_BACKUP_HTTPS = { # "basename": "associated setting"
}
# Constants affecting download performance
MAX_ASSEMBLER_QUEUE = 12
SOFT_QUEUE_LIMIT = 0.5
DEF_MAX_ASSEMBLER_QUEUE = 12
SOFT_ASSEMBLER_QUEUE_LIMIT = 0.5
# Percentage of cache to use before adding file to assembler
ASSEMBLER_WRITE_THRESHOLD = 5
NNTP_BUFFER_SIZE = int(800 * KIBI)
ASSEMBLER_WRITE_THRESHOLD_FACTOR_APPEND = 0.05
ASSEMBLER_WRITE_THRESHOLD_FACTOR_DIRECT_WRITE = 0.75
ASSEMBLER_MAX_WRITE_THRESHOLD_DIRECT_WRITE = int(1 * GIGI)
ASSEMBLER_DELAY_FACTOR_DIRECT_WRITE = 1.5
ASSEMBLER_WRITE_INTERVAL = 5.0
NNTP_BUFFER_SIZE = int(256 * KIBI)
NNTP_MAX_BUFFER_SIZE = int(10 * MEBI)
DEF_PIPELINING_REQUESTS = 1
# Article cache capacity factor to force a non-contiguous flush to disk
ARTICLE_CACHE_NON_CONTIGUOUS_FLUSH_PERCENTAGE = 0.9
REPAIR_PRIORITY = 3
FORCE_PRIORITY = 2

View File

@@ -27,7 +27,7 @@ import sys
import threading
import sqlite3
from sqlite3 import Connection, Cursor
from typing import Optional, List, Sequence, Dict, Any, Tuple, Union
from typing import Optional, Sequence, Any
import sabnzbd
import sabnzbd.cfg
@@ -114,6 +114,12 @@ class HistoryDB:
_ = self.execute("PRAGMA user_version = 5;") and self.execute(
"ALTER TABLE history ADD COLUMN time_added INTEGER;"
)
if version < 6:
_ = (
self.execute("PRAGMA user_version = 6;")
and self.execute("CREATE UNIQUE INDEX idx_history_nzo_id ON history(nzo_id);")
and self.execute("CREATE INDEX idx_history_archive_completed ON history(archive, completed DESC);")
)
HistoryDB.startup_done = True
@@ -160,8 +166,7 @@ class HistoryDB:
def create_history_db(self):
"""Create a new (empty) database file"""
self.execute(
"""
self.execute("""
CREATE TABLE history (
"id" INTEGER PRIMARY KEY,
"completed" INTEGER NOT NULL,
@@ -194,9 +199,10 @@ class HistoryDB:
"archive" INTEGER,
"time_added" INTEGER
)
"""
)
self.execute("PRAGMA user_version = 5;")
""")
self.execute("PRAGMA user_version = 6;")
self.execute("CREATE UNIQUE INDEX idx_history_nzo_id ON history(nzo_id);")
self.execute("CREATE INDEX idx_history_archive_completed ON history(archive, completed DESC);")
def close(self):
"""Close database connection"""
@@ -237,7 +243,7 @@ class HistoryDB:
self.execute("""UPDATE history SET status = ? WHERE nzo_id = ?""", (Status.COMPLETED, job))
logging.info("[%s] Marked job %s as completed", caller_name(), job)
def get_failed_paths(self, search: Optional[str] = None) -> List[str]:
def get_failed_paths(self, search: Optional[str] = None) -> list[str]:
"""Return list of all storage paths of failed jobs (may contain non-existing or empty paths)"""
search = convert_search(search)
fetch_ok = self.execute(
@@ -315,10 +321,10 @@ class HistoryDB:
limit: Optional[int] = None,
archive: Optional[bool] = None,
search: Optional[str] = None,
categories: Optional[List[str]] = None,
statuses: Optional[List[str]] = None,
nzo_ids: Optional[List[str]] = None,
) -> Tuple[List[Dict[str, Any]], int]:
categories: Optional[list[str]] = None,
statuses: Optional[list[str]] = None,
nzo_ids: Optional[list[str]] = None,
) -> tuple[list[dict[str, Any]], int]:
"""Return records for specified jobs"""
command_args = [convert_search(search)]
@@ -369,35 +375,36 @@ class HistoryDB:
def have_duplicate_key(self, duplicate_key: str) -> bool:
"""Check whether History contains this duplicate key"""
total = 0
if self.execute(
"""
SELECT COUNT(*)
FROM History
WHERE
duplicate_key = ? AND
STATUS != ?""",
SELECT EXISTS(
SELECT 1
FROM history
WHERE duplicate_key = ? AND status != ?
) as found
""",
(duplicate_key, Status.FAILED),
):
total = self.cursor.fetchone()["COUNT(*)"]
return total > 0
return bool(self.cursor.fetchone()["found"])
return False
def have_name_or_md5sum(self, name: str, md5sum: str) -> bool:
"""Check whether this name or md5sum is already in History"""
total = 0
if self.execute(
"""
SELECT COUNT(*)
FROM History
WHERE
( LOWER(name) = LOWER(?) OR md5sum = ? ) AND
STATUS != ?""",
SELECT EXISTS(
SELECT 1
FROM history
WHERE (name = ? COLLATE NOCASE OR md5sum = ?)
AND status != ?
) as found
""",
(name, md5sum, Status.FAILED),
):
total = self.cursor.fetchone()["COUNT(*)"]
return total > 0
return bool(self.cursor.fetchone()["found"])
return False
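Both lookups were switched from `SELECT COUNT(*)` to `SELECT EXISTS(...)`, which lets SQLite stop scanning at the first matching row instead of counting every one. A standalone sketch of the duplicate-key check (the `"Failed"` default is an assumption standing in for `Status.FAILED`):

```python
import sqlite3

def have_duplicate_key(con: sqlite3.Connection, duplicate_key: str, failed_status: str = "Failed") -> bool:
    """True if any non-failed history row carries this duplicate key."""
    row = con.execute(
        "SELECT EXISTS(SELECT 1 FROM history WHERE duplicate_key = ? AND status != ?) AS found",
        (duplicate_key, failed_status),
    ).fetchone()
    return bool(row[0])
```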
def get_history_size(self) -> Tuple[int, int, int]:
def get_history_size(self) -> tuple[int, int, int]:
"""Returns the total size of the history and
amounts downloaded in the last month and week
"""
@@ -457,7 +464,7 @@ class HistoryDB:
return path
return path
def get_other(self, nzo_id: str) -> Tuple[str, str, str, str, str]:
def get_other(self, nzo_id: str) -> tuple[str, str, str, str, str]:
"""Return additional data for job `nzo_id`"""
if self.execute("""SELECT * FROM history WHERE nzo_id = ?""", (nzo_id,)):
try:
@@ -498,9 +505,14 @@ def convert_search(search: str) -> str:
return search
def build_history_info(nzo, workdir_complete: str, postproc_time: int, script_output: str, script_line: str):
def build_history_info(
nzo: "sabnzbd.nzb.NzbObject",
workdir_complete: str,
postproc_time: int,
script_output: str,
script_line: str,
):
"""Collects all the information needed for the database"""
nzo: sabnzbd.nzbstuff.NzbObject
completed = int(time.time())
pp = PP_LOOKUP.get(opts_to_pp(nzo.repair, nzo.unpack, nzo.delete), "X")
@@ -554,7 +566,7 @@ def build_history_info(nzo, workdir_complete: str, postproc_time: int, script_ou
)
def unpack_history_info(item: sqlite3.Row) -> Dict[str, Any]:
def unpack_history_info(item: sqlite3.Row) -> dict[str, Any]:
"""Expands the single line stage_log from the DB
into a python dictionary for use in the history display
"""

View File

@@ -21,14 +21,11 @@ sabnzbd.decoder - article decoder
import logging
import hashlib
import binascii
from io import BytesIO
from zlib import crc32
from typing import Optional
import sabnzbd
from sabnzbd.constants import SABCTOOLS_VERSION_REQUIRED
from sabnzbd.encoding import ubtou
from sabnzbd.nzbstuff import Article
from sabnzbd.nzb import Article
from sabnzbd.misc import match_str
# Check for correct SABCTools version
@@ -50,7 +47,7 @@ except Exception:
class BadData(Exception):
def __init__(self, data: bytes):
def __init__(self, data: bytearray):
super().__init__()
self.data = data
@@ -63,8 +60,8 @@ class BadUu(Exception):
pass
def decode(article: Article, data_view: memoryview):
decoded_data = None
def decode(article: Article, decoder: sabctools.NNTPResponse):
decoded_data: Optional[bytearray] = None
nzo = article.nzf.nzo
art_id = article.article
@@ -78,10 +75,10 @@ def decode(article: Article, data_view: memoryview):
if sabnzbd.LOG_ALL:
logging.debug("Decoding %s", art_id)
if article.nzf.type == "uu":
decoded_data = decode_uu(article, bytes(data_view))
if decoder.format is sabctools.EncodingFormat.UU:
decoded_data = decode_uu(article, decoder)
else:
decoded_data = decode_yenc(article, data_view)
decoded_data = decode_yenc(article, decoder)
article_success = True
@@ -112,28 +109,18 @@ def decode(article: Article, data_view: memoryview):
except (BadYenc, ValueError):
# Handles precheck and badly formed articles
if nzo.precheck and data_view and data_view[:4] == b"223 ":
if nzo.precheck and decoder.status_code == 223:
# STAT was used, so we only get a status code
article_success = True
else:
# Try uu-decoding
if not nzo.precheck and article.nzf.type != "yenc":
try:
decoded_data = decode_uu(article, bytes(data_view))
logging.debug("Found uu-encoded article %s in job %s", art_id, nzo.final_name)
article_success = True
except Exception:
pass
# Only bother with further checks if uu-decoding didn't work out
if not article_success:
# Convert the first 2000 bytes of raw socket data to article lines,
# and examine the headers (for precheck) or body (for download).
for line in bytes(data_view[:2000]).split(b"\r\n"):
# Examine the headers (for precheck) or body (for download).
if lines := decoder.lines:
for line in lines:
lline = line.lower()
if lline.startswith(b"message-id:"):
if lline.startswith("message-id:"):
article_success = True
# Look for DMCA clues (while skipping "X-" headers)
if not lline.startswith(b"x-") and match_str(lline, (b"dmca", b"removed", b"cancel", b"blocked")):
if not lline.startswith("x-") and match_str(lline, ("dmca", "removed", "cancel", "blocked")):
article_success = False
logging.info("Article removed from server (%s)", art_id)
break
@@ -170,164 +157,65 @@ def decode(article: Article, data_view: memoryview):
sabnzbd.NzbQueue.register_article(article, article_success)
def decode_yenc(article: Article, data_view: memoryview) -> bytearray:
def decode_yenc(article: Article, response: sabctools.NNTPResponse) -> bytearray:
# Let SABCTools do all the heavy lifting
(
decoded_data,
yenc_filename,
article.file_size,
article.data_begin,
article.data_size,
crc_correct,
) = sabctools.yenc_decode(data_view)
decoded_data = response.data
article.file_size = response.file_size
article.data_begin = response.part_begin
article.data_size = response.part_size
article.decoded_size = response.bytes_decoded
nzf = article.nzf
# Assume it is yenc
nzf.type = "yenc"
# Only set the name if it was found and not obfuscated
if not nzf.filename_checked and yenc_filename:
if not nzf.filename_checked and (file_name := response.file_name):
# Set the md5-of-16k if this is the first article
if article.lowest_partnum:
nzf.md5of16k = hashlib.md5(decoded_data[:16384]).digest()
nzf.md5of16k = hashlib.md5(memoryview(decoded_data)[:16384]).digest()
# Try the rename, even if it's not the first article
# For example when the first article was missing
nzf.nzo.verify_nzf_filename(nzf, yenc_filename)
nzf.nzo.verify_nzf_filename(nzf, file_name)
# CRC check
if crc_correct is None:
if (crc := response.crc) is None:
logging.info("CRC Error in %s", article.article)
raise BadData(decoded_data)
article.crc32 = crc_correct
article.crc32 = crc
return decoded_data
def decode_uu(article: Article, raw_data: bytes) -> bytes:
"""Try to uu-decode an article. The raw_data may or may not contain headers.
If there are headers, they will be separated from the body by at least one
empty line. In case of no headers, the first line seems to always be the nntp
response code (220/222) directly followed by the msg body."""
if not raw_data:
def decode_uu(article: Article, response: sabctools.NNTPResponse) -> bytearray:
"""Process a uu-decoded response"""
if not response.bytes_decoded:
logging.debug("No data to decode")
raise BadUu
# Line up the raw_data
raw_data = raw_data.split(b"\r\n")
if response.baddata:
raise BadData(response.data)
# Index of the uu payload start in raw_data
uu_start = 0
# Limit the number of lines to check for the onset of uu data
limit = min(len(raw_data), 32) - 1
if limit < 3:
logging.debug("Article too short to contain valid uu-encoded data")
raise BadUu
# Try to find an empty line separating the body from headers or response
# code and set the expected payload start to the next line.
try:
uu_start = raw_data[:limit].index(b"") + 1
except ValueError:
# No empty line, look for a response code instead
if raw_data[0].startswith(b"220 ") or raw_data[0].startswith(b"222 "):
uu_start = 1
else:
# Invalid data?
logging.debug("Failed to locate start of uu payload")
raise BadUu
def is_uu_junk(line: bytes) -> bool:
"""Determine if the line is empty or contains known junk data"""
return (not line) or line == b"-- " or line.startswith(b"Posted via ")
# Check the uu 'begin' line
if article.lowest_partnum:
try:
# Make sure the line after the uu_start one isn't empty as well or
# detection of the 'begin' line won't work. For articles other than
# lowest_partnum, filtering out empty lines (and other junk) can
# wait until the actual decoding step.
for index in range(uu_start, limit):
if is_uu_junk(raw_data[index]):
uu_start = index + 1
else:
# Bingo
break
else:
# Search reached the limit
raise IndexError
uu_begin_data = raw_data[uu_start].split(b" ")
# Filename may contain spaces
uu_filename = ubtou(b" ".join(uu_begin_data[2:]).strip())
# Sanity check the 'begin' line
if (
len(uu_begin_data) < 3
or uu_begin_data[0].lower() != b"begin"
or (not int(uu_begin_data[1], 8))
or (not uu_filename)
):
raise ValueError
# Consider this enough proof to set the type, avoiding further
# futile attempts at decoding articles in this nzf as yenc.
article.nzf.type = "uu"
# Bump the pointer for the payload to the next line
uu_start += 1
except Exception:
logging.debug("Missing or invalid uu 'begin' line: %s", raw_data[uu_start] if uu_start < limit else None)
raise BadUu
# Do the actual decoding
with BytesIO() as decoded_data:
for line in raw_data[uu_start:]:
# Ignore junk
if is_uu_junk(line):
continue
# End of the article
if line in (b"`", b"end", b"."):
break
# Remove dot stuffing
if line.startswith(b".."):
line = line[1:]
try:
decoded_line = binascii.a2b_uu(line)
except binascii.Error as msg:
try:
# Workaround for broken uuencoders by Fredrik Lundh
nbytes = (((line[0] - 32) & 63) * 4 + 5) // 3
decoded_line = binascii.a2b_uu(line[:nbytes])
except Exception as msg2:
logging.info(
"Error while uu-decoding %s: %s (line: %s; workaround: %s)", article.article, msg, line, msg2
)
raise BadData(decoded_data.getvalue())
# Store the decoded data
decoded_data.write(decoded_line)
# Set the type to uu; the latter is still needed in
# case the lowest_partnum article was damaged or slow to download.
article.nzf.type = "uu"
decoded_data = response.data
article.decoded_size = response.bytes_decoded
nzf = article.nzf
nzf.type = "uu"
# Only set the name if it was found and not obfuscated
if not nzf.filename_checked and (file_name := response.file_name):
# Set the md5-of-16k if this is the first article
if article.lowest_partnum:
decoded_data.seek(0)
article.nzf.md5of16k = hashlib.md5(decoded_data.read(16384)).digest()
# Handle the filename
if not article.nzf.filename_checked and uu_filename:
article.nzf.nzo.verify_nzf_filename(article.nzf, uu_filename)
nzf.md5of16k = hashlib.md5(memoryview(decoded_data)[:16384]).digest()
data = decoded_data.getvalue()
article.crc32 = crc32(data)
return data
# Try the rename, even if it's not the first article
# For example when the first article was missing
nzf.nzo.verify_nzf_filename(nzf, file_name)
article.crc32 = response.crc
return decoded_data
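The removed pure-Python uu path (now delegated to SABCTools) combined NNTP dot-unstuffing, `binascii.a2b_uu`, and Fredrik Lundh's fallback for broken uuencoders that derives the byte count from the line's length character. The per-line logic survives here as a reference sketch:

```python
import binascii

def uu_decode_line(line: bytes) -> bytes:
    """Decode one uuencoded line, tolerating over-long lines from broken encoders."""
    if line.startswith(b".."):
        line = line[1:]  # undo NNTP dot-stuffing
    try:
        return binascii.a2b_uu(line)
    except binascii.Error:
        # Workaround for broken uuencoders: the first character encodes the
        # payload length; truncate the line to the implied encoded size.
        nbytes = (((line[0] - 32) & 63) * 4 + 5) // 3
        return binascii.a2b_uu(line[:nbytes])
```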
def search_new_server(article: Article) -> bool:

View File

@@ -20,10 +20,9 @@
##############################################################################
import time
import functools
from typing import Union, Callable
from typing import Union, Callable, Any
from threading import Lock, RLock, Condition
# All operations that modify the queue need to happen in a lock
# Also used when importing NZBs to prevent IO-race conditions
# Names of wrapper-functions should be the same in misc.caller_name
@@ -35,15 +34,21 @@ DOWNLOADER_CV = Condition(NZBQUEUE_LOCK)
DOWNLOADER_LOCK = RLock()
def synchronized(lock: Union[Lock, RLock]):
def synchronized(lock: Union[Lock, RLock, Condition, None] = None):
def wrap(func: Callable):
def call_func(*args, **kw):
# Using the try/finally approach is 25% faster compared to using "with lock"
# Either use the supplied lock or the object-specific one
# Because it's a variable in the upper function, we cannot use it directly
lock_obj = lock
if not lock_obj:
lock_obj = getattr(args[0], "lock")
# Using try/finally is ~25% faster than "with lock"
try:
lock.acquire()
lock_obj.acquire()
return func(*args, **kw)
finally:
lock.release()
lock_obj.release()
return call_func
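The decorator now accepts `None` and falls back to the decorated object's own `lock` attribute, so per-instance locking no longer requires a module-level lock. A runnable sketch of that fallback (the `Counter` class is illustrative):

```python
import threading
from typing import Callable, Optional

def synchronized(lock: Optional[object] = None):
    """If no lock is supplied, use the decorated object's own `lock` attribute."""
    def wrap(func: Callable):
        def call_func(*args, **kwargs):
            # `lock` is a closure variable, so resolve the fallback per call
            lock_obj = lock if lock is not None else getattr(args[0], "lock")
            # try/finally instead of "with lock" mirrors the original's fast path
            try:
                lock_obj.acquire()
                return func(*args, **kwargs)
            finally:
                lock_obj.release()
        return call_func
    return wrap

class Counter:
    def __init__(self):
        self.lock = threading.RLock()
        self.value = 0

    @synchronized()  # no argument: falls back to self.lock
    def increment(self):
        self.value += 1
```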
@@ -70,7 +75,7 @@ def conditional_cache(cache_time: int):
Empty results (None, empty collections, empty strings, False, 0) are not cached.
If a keyword argument of `force=True` is used, the cache is skipped.
Unhashable types (such as List) can not be used as an input to the wrapped function in the current implementation!
Unhashable types (such as list) can not be used as an input to the wrapped function in the current implementation!
:param cache_time: Time in seconds to cache non-empty results
"""

sabnzbd/deobfuscate_filenames.py Executable file → Normal file

@@ -27,6 +27,7 @@ files to the job-name in the queue if the filename looks obfuscated
Based on work by P1nGu1n
"""
import hashlib
import logging
import os
@@ -38,14 +39,13 @@ from sabnzbd.par2file import is_par2_file, parse_par2_file
import sabnzbd.utils.file_extension as file_extension
from sabnzbd.misc import match_str
from sabnzbd.constants import IGNORED_MOVIE_FOLDERS
from typing import List
# Files to exclude and minimal file size for renaming
EXCLUDED_FILE_EXTS = (".vob", ".rar", ".par2", ".mts", ".m2ts", ".cpi", ".clpi", ".mpl", ".mpls", ".bdm", ".bdmv")
MIN_FILE_SIZE = 10 * 1024 * 1024
def decode_par2(parfile: str) -> List[str]:
def decode_par2(parfile: str) -> list[str]:
"""Parse a par2 file and rename files listed in the par2 to their real name. Return list of generated files"""
# Check if really a par2 file
if not is_par2_file(parfile):
@@ -77,7 +77,7 @@ def decode_par2(parfile: str) -> List[str]:
return new_files
def recover_par2_names(filelist: List[str]) -> List[str]:
def recover_par2_names(filelist: list[str]) -> list[str]:
"""Find par2 files and use them for renaming"""
# Check that the files exist
filelist = [f for f in filelist if os.path.isfile(f)]
@@ -168,7 +168,7 @@ def is_probably_obfuscated(myinputfilename: str) -> bool:
return True # default is obfuscated
def get_biggest_file(filelist: List[str]) -> str:
def get_biggest_file(filelist: list[str]) -> str:
"""Returns the biggest file if it is much bigger than the other files
If only one file exists, return that. If no file, return None
Note: the files in filelist must exist, because their sizes on disk are checked"""
@@ -190,7 +190,7 @@ def get_biggest_file(filelist: List[str]) -> str:
return None
def deobfuscate(nzo, filelist: List[str], usefulname: str) -> List[str]:
def deobfuscate(nzo: "sabnzbd.nzb.NzbObject", filelist: list[str], usefulname: str) -> list[str]:
"""
For files in filelist:
1. if a file has no meaningful extension, add it (for example ".txt" or ".png")
@@ -228,9 +228,6 @@ def deobfuscate(nzo, filelist: List[str], usefulname: str) -> List[str]:
"""
# Can't be imported directly due to circular import
nzo: sabnzbd.nzbstuff.NzbObject
# to be sure, only keep really existing files and remove any duplicates:
filtered_filelist = list(set(f for f in filelist if os.path.isfile(f)))
@@ -321,7 +318,7 @@ def without_extension(fullpathfilename: str) -> str:
return os.path.splitext(fullpathfilename)[0]
def deobfuscate_subtitles(nzo, filelist: List[str]):
def deobfuscate_subtitles(nzo: "sabnzbd.nzb.NzbObject", filelist: list[str]):
"""
input:
nzo, so we can update result via set_unpack_info()
@@ -346,10 +343,6 @@ def deobfuscate_subtitles(nzo, filelist: List[str]):
Something.else.txt
"""
# Can't be imported directly due to circular import
nzo: sabnzbd.nzbstuff.NzbObject
# find .srt files
if not (srt_files := [f for f in filelist if f.endswith(".srt")]):
logging.debug("No .srt files found, so nothing to do")


@@ -25,19 +25,18 @@ import subprocess
import time
import threading
import logging
from typing import Optional, Dict, List, Tuple
from typing import Optional
import sabnzbd
import sabnzbd.cfg as cfg
from sabnzbd.misc import int_conv, format_time_string, build_and_run_command
from sabnzbd.filesystem import remove_all, real_path, remove_file, get_basename, clip_path
from sabnzbd.nzbstuff import NzbObject, NzbFile
from sabnzbd.nzb import NzbFile, NzbObject
from sabnzbd.encoding import platform_btou
from sabnzbd.decorators import synchronized
from sabnzbd.newsunpack import RAR_EXTRACTFROM_RE, RAR_EXTRACTED_RE, rar_volumelist, add_time_left
from sabnzbd.postproc import prepare_extraction_path
from sabnzbd.misc import SABRarFile
import rarfile
from sabnzbd.utils.diskspeed import diskspeedmeasure
# Need a lock to make sure start and stop is handled correctly
@@ -62,11 +61,11 @@ class DirectUnpacker(threading.Thread):
self.rarfile_nzf: Optional[NzbFile] = None
self.cur_setname: Optional[str] = None
self.cur_volume: int = 0
self.total_volumes: Dict[str, int] = {}
self.total_volumes: dict[str, int] = {}
self.unpack_time: float = 0.0
self.success_sets: Dict[str, Tuple[List[str], List[str]]] = {}
self.next_sets: List[NzbFile] = []
self.success_sets: dict[str, tuple[list[str], list[str]]] = {}
self.next_sets: list[NzbFile] = []
self.duplicate_lines: int = 0


@@ -23,7 +23,7 @@ import asyncio
import os
import logging
import threading
from typing import Generator, Set, Optional, Tuple
from typing import Generator, Optional
import sabnzbd
from sabnzbd.constants import SCAN_FILE_NAME, VALID_ARCHIVES, VALID_NZB_FILES, AddNzbFileResult
@@ -128,7 +128,7 @@ class DirScanner(threading.Thread):
def get_suspected_files(
self, folder: str, catdir: Optional[str] = None
) -> Generator[Tuple[str, Optional[str], Optional[os.stat_result]], None, None]:
) -> Generator[tuple[str, Optional[str], Optional[os.stat_result]], None, None]:
"""Generator listing possible paths to NZB files"""
if catdir is None:
@@ -222,17 +222,15 @@ class DirScanner(threading.Thread):
async def scan_async(self, dirscan_dir: str):
"""Do one scan of the watched folder"""
# On Python 3.8 we first need an event loop before we can create a asyncio.Lock
if not self.lock:
with DIR_SCANNER_LOCK:
self.lock = asyncio.Lock()
with DIR_SCANNER_LOCK:
self.lock = asyncio.Lock()
async with self.lock:
if sabnzbd.PAUSED_ALL:
return
files: Set[str] = set()
futures: Set[asyncio.Task] = set()
files: set[str] = set()
futures: set[asyncio.Task] = set()
for path, catdir, stat_tuple in self.get_suspected_files(dirscan_dir):
files.add(path)


@@ -19,25 +19,26 @@
sabnzbd.downloader - download engine
"""
import select
import logging
import selectors
from collections import deque
from threading import Thread, RLock, current_thread
import socket
import sys
import ssl
import time
from datetime import date
from typing import List, Dict, Optional, Union, Set
from typing import Optional, Union, Deque, Callable
import sabctools
import sabnzbd
from sabnzbd.decorators import synchronized, NzbQueueLocker, DOWNLOADER_CV, DOWNLOADER_LOCK
from sabnzbd.newswrapper import NewsWrapper, NNTPPermanentError
import sabnzbd.config as config
import sabnzbd.cfg as cfg
from sabnzbd.misc import from_units, helpful_warning, int_conv, MultiAddQueue
from sabnzbd.misc import from_units, helpful_warning, int_conv, MultiAddQueue, to_units
from sabnzbd.get_addrinfo import get_fastest_addrinfo, AddrInfo
from sabnzbd.constants import SOFT_QUEUE_LIMIT
# Timeout penalty in minutes for each cause
_PENALTY_UNKNOWN = 3 # Unknown cause
@@ -82,6 +83,7 @@ class Server:
"retention",
"username",
"password",
"pipelining_requests",
"busy_threads",
"next_busy_threads_check",
"idle_threads",
@@ -110,6 +112,7 @@ class Server:
use_ssl,
ssl_verify,
ssl_ciphers,
pipelining_requests,
username=None,
password=None,
required=False,
@@ -134,10 +137,11 @@ class Server:
self.retention: int = retention
self.username: Optional[str] = username
self.password: Optional[str] = password
self.pipelining_requests: Callable[[], int] = pipelining_requests
self.busy_threads: Set[NewsWrapper] = set()
self.busy_threads: set[NewsWrapper] = set()
self.next_busy_threads_check: float = 0
self.idle_threads: Set[NewsWrapper] = set()
self.idle_threads: set[NewsWrapper] = set()
self.next_article_search: float = 0
self.active: bool = True
self.bad_cons: int = 0
@@ -148,7 +152,7 @@ class Server:
self.request: bool = False # True if a getaddrinfo() request is pending
self.have_body: bool = True # Assume server has "BODY", until proven otherwise
self.have_stat: bool = True # Assume server has "STAT", until proven otherwise
self.article_queue: List[sabnzbd.nzbstuff.Article] = []
self.article_queue: Deque[sabnzbd.nzb.Article] = deque()
# Skip during server testing
if threads:
@@ -167,25 +171,24 @@ class Server:
def stop(self):
"""Remove all connections and cached articles from server"""
for nw in self.idle_threads:
sabnzbd.Downloader.remove_socket(nw)
nw.hard_reset()
self.idle_threads = set()
self.reset_article_queue()
@synchronized(DOWNLOADER_LOCK)
def get_article(self):
def get_article(self, peek: bool = False):
"""Get an article from the pre-fetch queue, refilling it when necessary.
Articles that are too old for this server are immediately marked as tried"""
if self.article_queue:
return self.article_queue.pop(0)
return self.article_queue[0] if peek else self.article_queue.popleft()
if self.next_article_search < time.time():
# Pre-fetch new articles
self.article_queue = sabnzbd.NzbQueue.get_articles(self, sabnzbd.Downloader.servers, _ARTICLE_PREFETCH)
sabnzbd.NzbQueue.get_articles(self, sabnzbd.Downloader.servers, _ARTICLE_PREFETCH)
if self.article_queue:
article = self.article_queue.pop(0)
article = self.article_queue[0] if peek else self.article_queue.popleft()
# Mark expired articles as tried on this server
if self.retention and article.nzf.nzo.avg_stamp < time.time() - self.retention:
if not peek and self.retention and article.nzf.nzo.avg_stamp < time.time() - self.retention:
sabnzbd.Downloader.decode(article)
while self.article_queue:
sabnzbd.Downloader.decode(self.article_queue.pop())
@@ -201,9 +204,12 @@ class Server:
"""Reset articles queued for the Server. Locked to prevent
articles getting stuck in the Server when enabled/disabled"""
logging.debug("Resetting article queue for %s (%s)", self, self.article_queue)
for article in self.article_queue:
article.allow_new_fetcher()
self.article_queue = []
while self.article_queue:
try:
article = self.article_queue.popleft()
article.allow_new_fetcher()
except IndexError:
pass
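The hunks above switch the article queue from a `list` to a `collections.deque` and add a `peek` mode so callers can check for work without claiming an article. A minimal sketch of those semantics (the `ArticleQueue` class is a simplified stand-in, not SABnzbd's `Server`):

```python
from collections import deque


class ArticleQueue:
    """Sketch of the deque-based article queue with peek support."""

    def __init__(self, articles=()):
        self._queue = deque(articles)

    def get_article(self, peek: bool = False):
        if not self._queue:
            return None
        # Peeking inspects the head without consuming it, so an idle
        # connection is only activated when a request is actually available
        return self._queue[0] if peek else self._queue.popleft()

    def reset(self):
        # Draining with popleft() hands each article back exactly once,
        # even if the queue is emptied concurrently (IndexError is benign)
        drained = []
        while self._queue:
            try:
                drained.append(self._queue.popleft())
            except IndexError:
                break
        return drained
```

`popleft()` is O(1) on a deque, unlike `list.pop(0)`, which shifts every remaining element.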
def request_addrinfo(self):
"""Launch async request to resolve server address and select the fastest.
@@ -250,7 +256,7 @@ class Downloader(Thread):
"shutdown",
"server_restarts",
"force_disconnect",
"read_fds",
"selector",
"servers",
"timers",
"last_max_chunk_size",
@@ -290,10 +296,15 @@ class Downloader(Thread):
self.force_disconnect: bool = False
self.read_fds: Dict[int, NewsWrapper] = {}
# macOS/BSD will default to KqueueSelector; it's very efficient but produces separate events for READ and WRITE,
# which causes problems when two receive threads are both trying to use the connection while it is resetting.
if selectors.DefaultSelector is getattr(selectors, "KqueueSelector", None):
self.selector: selectors.BaseSelector = selectors.PollSelector()
else:
self.selector: selectors.BaseSelector = selectors.DefaultSelector()
self.servers: List[Server] = []
self.timers: Dict[str, List[float]] = {}
self.servers: list[Server] = []
self.timers: dict[str, list[float]] = {}
for server in config.get_servers():
self.init_server(None, server)
@@ -319,6 +330,7 @@ class Downloader(Thread):
ssl = srv.ssl()
ssl_verify = srv.ssl_verify()
ssl_ciphers = srv.ssl_ciphers()
pipelining_requests = srv.pipelining_requests
username = srv.username()
password = srv.password()
required = srv.required()
@@ -349,6 +361,7 @@ class Downloader(Thread):
ssl,
ssl_verify,
ssl_ciphers,
pipelining_requests,
username,
password,
required,
@@ -361,15 +374,39 @@ class Downloader(Thread):
self.servers.sort(key=lambda svr: "%02d%s" % (svr.priority, svr.displayname.lower()))
@synchronized(DOWNLOADER_LOCK)
def add_socket(self, fileno: int, nw: NewsWrapper):
"""Add a socket ready to be used to the list to be watched"""
self.read_fds[fileno] = nw
def add_socket(self, nw: NewsWrapper):
"""Add a socket to be watched for read or write availability"""
if nw.nntp:
nw.server.idle_threads.discard(nw)
nw.server.busy_threads.add(nw)
try:
self.selector.register(nw.nntp.fileno, selectors.EVENT_READ | selectors.EVENT_WRITE, nw)
nw.selector_events = selectors.EVENT_READ | selectors.EVENT_WRITE
except KeyError:
pass
@synchronized(DOWNLOADER_LOCK)
def modify_socket(self, nw: NewsWrapper, events: int):
"""Modify the events a socket is watched for"""
if nw.nntp and nw.selector_events != events and not nw.blocking:
try:
self.selector.modify(nw.nntp.fileno, events, nw)
nw.selector_events = events
except KeyError:
pass
@synchronized(DOWNLOADER_LOCK)
def remove_socket(self, nw: NewsWrapper):
"""Remove a socket from being watched"""
if nw.nntp:
self.read_fds.pop(nw.nntp.fileno, None)
nw.server.busy_threads.discard(nw)
nw.server.idle_threads.add(nw)
nw.timeout = None
try:
self.selector.unregister(nw.nntp.fileno)
nw.selector_events = 0
except KeyError:
pass
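The `add_socket`/`modify_socket`/`remove_socket` trio above replaces the old `select.select` loop over a fileno dict with the `selectors` module. A minimal sketch of that register/modify/unregister lifecycle on a plain socketpair (names and flow are illustrative, not SABnzbd's API):

```python
import selectors
import socket

sel = selectors.DefaultSelector()
a, b = socket.socketpair()
a.setblocking(False)

# Register for both read and write, mirroring add_socket() on a fresh connection
sel.register(a, selectors.EVENT_READ | selectors.EVENT_WRITE, data="nw")

# Once there is nothing left to send, narrow the interest set (modify_socket)
sel.modify(a, selectors.EVENT_READ, data="nw")

b.sendall(b"ping")
payload = b""
for key, mask in sel.select(timeout=1.0):
    if mask & selectors.EVENT_READ:
        # key.data would carry the NewsWrapper in the real loop
        payload = key.fileobj.recv(4)

# Finally stop watching the socket (remove_socket)
sel.unregister(a)
a.close()
b.close()
```

Tracking the currently registered interest set (as `nw.selector_events` does in the diff) avoids redundant `modify()` syscalls when the desired events have not changed.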
@NzbQueueLocker
def set_paused_state(self, state: bool):
@@ -409,8 +446,9 @@ class Downloader(Thread):
@NzbQueueLocker
def resume_from_postproc(self):
logging.info("Post-processing finished, resuming download")
self.paused_for_postproc = False
if self.paused_for_postproc:
logging.info("Post-processing finished, resuming download")
self.paused_for_postproc = False
@NzbQueueLocker
def disconnect(self):
@@ -451,6 +489,15 @@ class Downloader(Thread):
self.bandwidth_perc = 0
self.bandwidth_limit = 0
# Increase limits for faster connections
if limit > from_units("150M"):
if cfg.receive_threads() == cfg.receive_threads.default:
cfg.receive_threads.set(4)
logging.info("Receive threads set to 4")
if cfg.assembler_max_queue_size() == cfg.assembler_max_queue_size.default:
cfg.assembler_max_queue_size.set(30)
logging.info("Assembler max_queue_size set to 30")
def sleep_time_set(self):
self.sleep_time = cfg.downloader_sleep_time() * 0.0001
logging.debug("Sleep time: %f seconds", self.sleep_time)
@@ -499,26 +546,30 @@ class Downloader(Thread):
# Remove all connections to server
for nw in server.idle_threads | server.busy_threads:
self.__reset_nw(nw, "Forcing disconnect", warn=False, wait=False, retry_article=False)
self.reset_nw(nw, "Forcing disconnect", warn=False, wait=False, retry_article=False)
# Make sure server address resolution is refreshed
server.addrinfo = None
@staticmethod
def decode(article, data_view: Optional[memoryview] = None):
def decode(article: "sabnzbd.nzb.Article", response: Optional[sabctools.NNTPResponse] = None):
"""Decode article"""
# Need a better way of draining requests
if article.nzf.nzo.removed_from_queue:
return
# Article was requested and fetched, update article stats for the server
sabnzbd.BPSMeter.register_server_article_tried(article.fetcher.id)
# Handle broken articles directly
if not data_view:
if not response or not response.bytes_decoded and not article.nzf.nzo.precheck:
if not article.search_new_server():
article.nzf.nzo.increase_bad_articles_counter("missing_articles")
sabnzbd.NzbQueue.register_article(article, success=False)
return
# Decode and send to article cache
sabnzbd.decoder.decode(article, data_view)
sabnzbd.decoder.decode(article, response)
def run(self):
# Warn if there are servers defined, but none are valid
@@ -538,7 +589,7 @@ class Downloader(Thread):
for _ in range(cfg.receive_threads()):
# Started as daemon, so we don't need any shutdown logic in the worker
# The Downloader code will make sure shutdown is handled gracefully
Thread(target=self.process_nw_worker, args=(self.read_fds, process_nw_queue), daemon=True).start()
Thread(target=self.process_nw_worker, args=(process_nw_queue,), daemon=True).start()
# Catch all errors, just in case
try:
@@ -560,9 +611,9 @@ class Downloader(Thread):
if (nw.nntp and nw.nntp.error_msg) or (nw.timeout and now > nw.timeout):
if nw.nntp and nw.nntp.error_msg:
# Already showed error
self.__reset_nw(nw)
self.reset_nw(nw)
else:
self.__reset_nw(nw, "Timed out", warn=True)
self.reset_nw(nw, "Timed out", warn=True)
server.bad_cons += 1
self.maybe_block_server(server)
@@ -602,16 +653,15 @@ class Downloader(Thread):
server.request_addrinfo()
break
nw.article = server.get_article()
if not nw.article:
if not server.get_article(peek=True):
break
server.idle_threads.remove(nw)
server.busy_threads.add(nw)
if nw.connected:
self.__request_article(nw)
else:
# Assign a request immediately if NewsWrapper is ready, if we wait until the socket is
# selected all idle connections will be activated when there may only be one request
nw.prepare_request()
self.add_socket(nw)
elif not nw.nntp:
try:
logging.info("%s@%s: Initiating connection", nw.thrdnum, server.host)
nw.init_connect()
@@ -622,14 +672,14 @@ class Downloader(Thread):
server.host,
sys.exc_info()[1],
)
self.__reset_nw(nw, "Failed to initialize", warn=True)
self.reset_nw(nw, "Failed to initialize", warn=True)
if self.force_disconnect or self.shutdown:
for server in self.servers:
for nw in server.idle_threads | server.busy_threads:
# Send goodbye if we have open socket
if nw.nntp:
self.__reset_nw(nw, "Forcing disconnect", wait=False, count_article_try=False)
self.reset_nw(nw, "Forcing disconnect", wait=False, count_article_try=False)
# Make sure server address resolution is refreshed
server.addrinfo = None
server.reset_article_queue()
@@ -653,10 +703,13 @@ class Downloader(Thread):
self.last_max_chunk_size = 0
# Use select to find sockets ready for reading/writing
if readkeys := self.read_fds.keys():
read, _, _ = select.select(readkeys, (), (), 1.0)
if self.selector.get_map():
if events := self.selector.select(timeout=1.0):
for key, ev in events:
nw = key.data
process_nw_queue.put((nw, ev, nw.generation))
else:
read = []
events = []
BPSMeter.reset()
time.sleep(0.1)
self.max_chunk_size = _DEFAULT_CHUNK_SIZE
@@ -675,187 +728,140 @@ class Downloader(Thread):
next_bpsmeter_update = now + _BPSMETER_UPDATE_DELAY
self.check_assembler_levels()
if not read:
if not events:
continue
# Submit all readable sockets to be processed and wait for completion
process_nw_queue.put_multiple(read)
# Wait for socket operation completion
process_nw_queue.join()
except Exception:
logging.error(T("Fatal error in Downloader"), exc_info=True)
def process_nw_worker(self, read_fds: Dict[int, NewsWrapper], nw_queue: MultiAddQueue):
def process_nw_worker(self, nw_queue: MultiAddQueue):
"""Worker for the daemon thread to process results.
Wrapped in try/except because in case of an exception, logging
might get lost and the queue.join() would block forever."""
try:
logging.debug("Starting Downloader receive thread: %s", current_thread().name)
while True:
# The read_fds is passed by reference, so we can access its items!
self.process_nw(read_fds[nw_queue.get()])
self.process_nw(*nw_queue.get())
nw_queue.task_done()
except Exception:
# We cannot break out of the Downloader from here, so just pause
logging.error(T("Fatal error in Downloader"), exc_info=True)
self.pause()
def process_nw(self, nw: NewsWrapper):
def process_nw(self, nw: NewsWrapper, event: int, generation: int):
"""Receive data from a NewsWrapper and handle the response"""
try:
bytes_received, end_of_line, article_done = nw.recv_chunk()
except ssl.SSLWantReadError:
return
except (ConnectionError, ConnectionAbortedError):
# The ConnectionAbortedError is also thrown by sabctools in case of fatal SSL-layer problems
self.__reset_nw(nw, "Server closed connection", wait=False)
return
except BufferError:
# The BufferError is thrown when exceeding maximum buffer size
# Make sure to discard the article
self.__reset_nw(nw, "Maximum data buffer size exceeded", wait=False, retry_article=False)
# Drop stale items
if nw.generation != generation:
return
# Read on EVENT_READ, or on EVENT_WRITE if TLS needs a write to complete a read
if (event & selectors.EVENT_READ) or (event & selectors.EVENT_WRITE and nw.tls_wants_write):
self.process_nw_read(nw, generation)
# If read caused a reset, don't proceed to write
if nw.generation != generation:
return
# The read may have removed the socket, so prevent calling prepare_request again
if not (nw.selector_events & selectors.EVENT_WRITE):
return
# Only attempt app-level writes if TLS is not blocked
if (event & selectors.EVENT_WRITE) and not nw.tls_wants_write:
nw.write()
def process_nw_read(self, nw: NewsWrapper, generation: int) -> None:
bytes_received: int = 0
bytes_pending: int = 0
while (
nw.connected
and nw.generation == generation
and not self.force_disconnect
and not self.shutdown
and not (nw.timeout and time.time() > nw.timeout)
):
try:
n, bytes_pending = nw.read(nbytes=bytes_pending, generation=generation)
bytes_received += n
nw.tls_wants_write = False
except ssl.SSLWantReadError:
return
except ssl.SSLWantWriteError:
# TLS needs to write handshake/key-update data before we can continue reading
nw.tls_wants_write = True
self.modify_socket(nw, selectors.EVENT_READ | selectors.EVENT_WRITE)
return
except (ConnectionError, ConnectionAbortedError):
# The ConnectionAbortedError is also thrown by sabctools in case of fatal SSL-layer problems
self.reset_nw(nw, "Server closed connection", wait=False)
return
except BufferError:
# The BufferError is thrown when exceeding maximum buffer size
# Make sure to discard the article
self.reset_nw(nw, "Maximum data buffer size exceeded", wait=False, retry_article=False)
return
if not bytes_pending:
break
# Ignore metrics for reset connections
if nw.generation != generation:
return
article = nw.article
server = nw.server
with DOWNLOADER_LOCK:
sabnzbd.BPSMeter.update(server.id, bytes_received)
if bytes_received > self.last_max_chunk_size:
self.last_max_chunk_size = bytes_received
# Update statistics only when we fetched a whole article
# The side effect is that we don't count things like article-not-available messages
if article_done:
article.nzf.nzo.update_download_stats(sabnzbd.BPSMeter.bps, server.id, nw.data_position)
# Check speedlimit
if (
self.bandwidth_limit
and sabnzbd.BPSMeter.bps + sabnzbd.BPSMeter.sum_cached_amount > self.bandwidth_limit
):
sabnzbd.BPSMeter.update()
while sabnzbd.BPSMeter.bps > self.bandwidth_limit:
while self.bandwidth_limit and sabnzbd.BPSMeter.bps > self.bandwidth_limit:
time.sleep(0.01)
sabnzbd.BPSMeter.update()
# If we are not at the end of a line, more data will follow
if not end_of_line:
return
# Response code depends on request command:
# 220 = ARTICLE, 222 = BODY
if nw.status_code not in (220, 222) and not article_done:
if not nw.connected or nw.status_code == 480:
if not self.__finish_connect_nw(nw):
return
if nw.connected:
logging.info("Connecting %s@%s finished", nw.thrdnum, nw.server.host)
self.__request_article(nw)
elif nw.status_code == 223:
article_done = True
logging.debug("Article <%s> is present", article.article)
elif nw.status_code in (411, 423, 430, 451):
article_done = True
logging.debug(
"Thread %s@%s: Article %s missing (error=%s)",
nw.thrdnum,
nw.server.host,
article.article,
nw.status_code,
)
nw.reset_data_buffer()
elif nw.status_code == 500:
if article.nzf.nzo.precheck:
# Assume "STAT" command is not supported
server.have_stat = False
logging.debug("Server %s does not support STAT", server.host)
else:
# Assume "BODY" command is not supported
server.have_body = False
logging.debug("Server %s does not support BODY", server.host)
nw.reset_data_buffer()
self.__request_article(nw)
else:
# Don't warn for (internal) server errors during downloading
if nw.status_code not in (400, 502, 503):
logging.warning(
T("%s@%s: Received unknown status code %s for article %s"),
nw.thrdnum,
nw.server.host,
nw.status_code,
article.article,
)
# Ditch this thread, we don't know what data we got now so the buffer can be bad
self.__reset_nw(nw, f"Server error or unknown status code: {nw.status_code}", wait=False)
return
if article_done:
# Successful data, clear "bad" counter
server.bad_cons = 0
server.errormsg = server.warning = ""
# Decode
self.decode(article, nw.data_view[: nw.data_position])
if sabnzbd.LOG_ALL:
logging.debug("Thread %s@%s: %s done", nw.thrdnum, server.host, article.article)
# Reset connection for new activity
nw.soft_reset()
# Request a new article immediately if possible
if (
nw.connected
and server.active
and not server.restart
and not (self.paused or self.shutdown or self.paused_for_postproc)
):
nw.article = server.get_article()
if nw.article:
self.__request_article(nw)
return
# Make socket available again
server.busy_threads.discard(nw)
server.idle_threads.add(nw)
self.remove_socket(nw)
def check_assembler_levels(self):
"""Check the Assembler queue to see if we need to delay, depending on queue size"""
if (assembler_level := sabnzbd.Assembler.queue_level()) > SOFT_QUEUE_LIMIT:
time.sleep(min((assembler_level - SOFT_QUEUE_LIMIT) / 4, 0.15))
sabnzbd.BPSMeter.delayed_assembler += 1
logged_counter = 0
if not sabnzbd.Assembler.is_busy() or (delay := sabnzbd.Assembler.delay()) <= 0:
return
time.sleep(delay)
sabnzbd.BPSMeter.delayed_assembler += 1
start_time = time.monotonic()
deadline = start_time + 5
next_log = start_time + 1.0
logged_counter = 0
while not self.shutdown and sabnzbd.Assembler.queue_level() >= 1:
# Only log/update once every second, to not waste any CPU-cycles
if not logged_counter % 10:
# Make sure the BPS-meter is updated
sabnzbd.BPSMeter.update()
# Update who is delaying us
logging.debug(
"Delayed - %d seconds - Assembler queue: %d",
logged_counter / 10,
sabnzbd.Assembler.queue.qsize(),
)
# Wait and update the queue sizes
time.sleep(0.1)
while not self.shutdown and sabnzbd.Assembler.is_busy() and time.monotonic() < deadline:
if (delay := sabnzbd.Assembler.delay()) <= 0:
break
# Sleep for the current delay (but cap to remaining time)
sleep_time = max(0.001, min(delay, deadline - time.monotonic()))
time.sleep(sleep_time)
# Make sure the BPS-meter is updated
sabnzbd.BPSMeter.update()
# Only log/update once every second
if time.monotonic() >= next_log:
logged_counter += 1
logging.debug(
"Delayed - %d seconds - Assembler queue: %s",
logged_counter,
to_units(sabnzbd.Assembler.total_ready_bytes()),
)
next_log += 1.0
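The rewritten `check_assembler_levels` above waits in small increments, caps each sleep to the time remaining before a hard deadline, and logs at most once per second. A generic sketch of that bounded-wait pattern (`is_busy` and `delay_fn` are hypothetical callables standing in for `Assembler.is_busy()` and `Assembler.delay()`):

```python
import time


def bounded_wait(is_busy, delay_fn, deadline_seconds=5.0):
    """Sleep in small steps while is_busy(), but never past the deadline.

    Returns the total time actually waited."""
    start = time.monotonic()
    deadline = start + deadline_seconds
    while is_busy() and time.monotonic() < deadline:
        delay = delay_fn()
        if delay <= 0:
            break
        # Cap the sleep to the remaining time so we never overshoot the deadline
        time.sleep(max(0.001, min(delay, deadline - time.monotonic())))
    return time.monotonic() - start
```

The deadline guarantees the downloader is only ever throttled for a bounded interval, even if the assembler stays busy indefinitely.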
@synchronized(DOWNLOADER_LOCK)
def __finish_connect_nw(self, nw: NewsWrapper) -> bool:
def finish_connect_nw(self, nw: NewsWrapper, response: sabctools.NNTPResponse) -> bool:
server = nw.server
try:
nw.finish_connect(nw.status_code)
nw.finish_connect(response.status_code, response.message)
if sabnzbd.LOG_ALL:
logging.debug("%s@%s last message -> %s", nw.thrdnum, server.host, nw.nntp_msg)
nw.reset_data_buffer()
logging.debug("%s@%s last message -> %d", nw.thrdnum, server.host, response.status_code)
except NNTPPermanentError as error:
# Handle login problems
block = False
@@ -868,7 +874,7 @@ class Downloader(Thread):
errormsg = T("Too many connections to server %s [%s]") % (server.host, error.msg)
if server.active:
# Don't count this for the tries (max_art_tries) on this server
self.__reset_nw(nw)
self.reset_nw(nw)
self.plan_server(server, _PENALTY_TOOMANY)
elif error.code in (502, 481, 482) and clues_too_many_ip(error.msg):
# Login from (too many) different IP addresses
@@ -918,7 +924,7 @@ class Downloader(Thread):
if penalty and (block or server.optional):
self.plan_server(server, penalty)
# Note that the article is discarded for this server if the server is not required
self.__reset_nw(nw, retry_article=retry_article)
self.reset_nw(nw, retry_article=retry_article)
return False
except Exception as err:
logging.error(
@@ -929,11 +935,11 @@ class Downloader(Thread):
)
logging.info("Traceback: ", exc_info=True)
# No reset-warning needed, above logging is sufficient
self.__reset_nw(nw, retry_article=False)
self.reset_nw(nw, retry_article=False)
return True
@synchronized(DOWNLOADER_LOCK)
def __reset_nw(
def reset_nw(
self,
nw: NewsWrapper,
reset_msg: Optional[str] = None,
@@ -941,6 +947,7 @@ class Downloader(Thread):
wait: bool = True,
count_article_try: bool = True,
retry_article: bool = True,
article: Optional["sabnzbd.nzb.Article"] = None,
):
# Some warnings are errors, and not added as server.warning
if warn and reset_msg:
@@ -949,27 +956,8 @@ class Downloader(Thread):
elif reset_msg:
logging.debug("Thread %s@%s: %s", nw.thrdnum, nw.server.host, reset_msg)
# Make sure this NewsWrapper is in the idle threads
nw.server.busy_threads.discard(nw)
nw.server.idle_threads.add(nw)
# Make sure it is not in the readable sockets
self.remove_socket(nw)
if nw.article and not nw.article.nzf.nzo.removed_from_queue:
# Only some errors should count towards the total tries for each server
if count_article_try:
nw.article.tries += 1
# Do we discard, or try again for this server
if not retry_article or (not nw.server.required and nw.article.tries > cfg.max_art_tries()):
# Too many tries on this server, consider article missing
self.decode(nw.article)
nw.article.tries = 0
else:
# Allow all servers again for this article
# Do not use the article_queue, as the server could already have been disabled when we get here!
nw.article.allow_new_fetcher()
# Discard the article request which failed
nw.discard(article, count_article_try=count_article_try, retry_article=retry_article)
# Reset connection object
nw.hard_reset(wait)
@@ -977,21 +965,6 @@ class Downloader(Thread):
# Empty SSL info, it might change on next connect
nw.server.ssl_info = ""
def __request_article(self, nw: NewsWrapper):
try:
if sabnzbd.LOG_ALL:
logging.debug("Thread %s@%s: BODY %s", nw.thrdnum, nw.server.host, nw.article.article)
nw.body()
# Mark as ready to be read
self.add_socket(nw.nntp.fileno, nw)
except socket.error as err:
logging.info("Looks like server closed connection: %s", err)
self.__reset_nw(nw, "Server broke off connection", warn=True)
except Exception:
logging.error(T("Suspect error in downloader"))
logging.info("Traceback: ", exc_info=True)
self.__reset_nw(nw, "Server broke off connection", warn=True)
# ------------------------------------------------------------------------------
# Timed restart of servers admin.
# For each server all planned events are kept in a list.


@@ -255,8 +255,7 @@ def diskfull_mail():
"""Send email about disk full, no templates"""
if cfg.email_full():
return send_email(
T(
"""To: %s
T("""To: %s
From: %s
Date: %s
Subject: SABnzbd reports Disk Full
@@ -266,9 +265,7 @@ Hi,
SABnzbd has stopped downloading, because the disk is almost full.
Please make room and resume SABnzbd manually.
"""
)
% (cfg.email_to.get_string(), cfg.email_from(), get_email_date()),
""") % (cfg.email_to.get_string(), cfg.email_from(), get_email_date()),
cfg.email_to(),
)
else:


@@ -18,6 +18,7 @@
"""
sabnzbd.misc - filesystem operations
"""
import gzip
import os
import pickle
@@ -33,7 +34,7 @@ import fnmatch
import stat
import ctypes
import random
from typing import Union, List, Tuple, Any, Dict, Optional, BinaryIO
from typing import Union, Any, Optional, BinaryIO
try:
import win32api
@@ -42,6 +43,7 @@ try:
except ImportError:
pass
import sabctools
import sabnzbd
from sabnzbd.decorators import synchronized, conditional_cache
from sabnzbd.constants import (
@@ -56,7 +58,6 @@ from sabnzbd.constants import (
from sabnzbd.encoding import correct_unknown_encoding, utob, limit_encoded_length
import rarfile
# For Windows: determine executable extensions
if os.name == "nt":
PATHEXT = os.environ.get("PATHEXT", "").lower().split(";")
@@ -295,10 +296,10 @@ def sanitize_and_trim_path(path: str) -> str:
if sabnzbd.WINDOWS:
if path.startswith("\\\\?\\UNC\\"):
new_path = "\\\\?\\UNC\\"
path = path[8:]
path = path.removeprefix("\\\\?\\UNC\\")
elif path.startswith("\\\\?\\"):
new_path = "\\\\?\\"
path = path[4:]
path = path.removeprefix("\\\\?\\")
path = path.replace("\\", "/")
parts = path.split("/")
@@ -314,7 +315,7 @@ def sanitize_and_trim_path(path: str) -> str:
return os.path.abspath(os.path.normpath(new_path))
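The hunk above replaces manual slicing like `path[8:]` with `str.removeprefix` (Python 3.9+) when splitting off Windows long-path prefixes. A small sketch of that idiom (the helper name is illustrative, not SABnzbd's API):

```python
def split_long_path_prefix(path: str) -> tuple[str, str]:
    """Separate a Windows long-path prefix from the rest of the path (sketch)."""
    # Check the longer UNC prefix first, since "\\?\UNC\" also starts with "\\?\"
    for prefix in ("\\\\?\\UNC\\", "\\\\?\\"):
        if path.startswith(prefix):
            # removeprefix only strips when the prefix matches, and avoids
            # hard-coded offsets like path[8:] or path[4:]
            return prefix, path.removeprefix(prefix)
    return "", path
```

Unlike slicing by a magic length, `removeprefix` keeps the prefix string and the amount removed in one place, so the two cannot drift apart.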
def sanitize_files(folder: Optional[str] = None, filelist: Optional[List[str]] = None) -> List[str]:
def sanitize_files(folder: Optional[str] = None, filelist: Optional[list[str]] = None) -> list[str]:
"""Sanitize each file in the folder or list of filepaths, return list of new names"""
logging.info("Checking if any resulting filenames need to be sanitized")
if folder:
@@ -330,7 +331,7 @@ def sanitize_files(folder: Optional[str] = None, filelist: Optional[List[str]] =
return output_filelist
def strip_extensions(name: str, ext_to_remove: Tuple[str, ...] = (".nzb", ".par", ".par2")):
def strip_extensions(name: str, ext_to_remove: tuple[str, ...] = (".nzb", ".par", ".par2")) -> str:
"""Strip extensions from a filename, without sanitizing the filename"""
name_base, ext = os.path.splitext(name)
while ext.lower() in ext_to_remove:
@@ -378,7 +379,7 @@ def real_path(loc: str, path: str) -> str:
def create_real_path(
name: str, loc: str, path: str, apply_permissions: bool = False, writable: bool = True
) -> Tuple[bool, str, Optional[str]]:
) -> tuple[bool, str, Optional[str]]:
"""When 'path' is relative, create join of 'loc' and 'path'
When 'path' is absolute, create normalized path
'name' is used for logging.
@@ -484,7 +485,7 @@ TS_RE = re.compile(r"\.(\d+)\.(ts$)", re.I)
def build_filelists(
workdir: Optional[str], workdir_complete: Optional[str] = None, check_both: bool = False, check_rar: bool = True
) -> Tuple[List[str], List[str], List[str], List[str]]:
) -> tuple[list[str], list[str], list[str], list[str]]:
"""Build filelists, if workdir_complete has files, ignore workdir.
Optionally scan both directories.
Optionally test content to establish RAR-ness
@@ -535,7 +536,7 @@ def safe_fnmatch(f: str, pattern: str) -> bool:
return False
def globber(path: str, pattern: str = "*") -> List[str]:
def globber(path: str, pattern: str = "*") -> list[str]:
"""Return matching base file/folder names in folder `path`"""
# Cannot use glob.glob() because it doesn't support Windows long name notation
if os.path.exists(path):
@@ -543,7 +544,7 @@ def globber(path: str, pattern: str = "*") -> List[str]:
return []
def globber_full(path: str, pattern: str = "*") -> List[str]:
def globber_full(path: str, pattern: str = "*") -> list[str]:
"""Return matching full file/folder names in folder `path`"""
# Cannot use glob.glob() because it doesn't support Windows long name notation
if os.path.exists(path):
@@ -572,7 +573,7 @@ def is_valid_script(basename: str) -> bool:
return basename in list_scripts(default=False, none=False)
def list_scripts(default: bool = False, none: bool = True) -> List[str]:
def list_scripts(default: bool = False, none: bool = True) -> list[str]:
"""Return a list of script names, optionally with 'Default' added"""
lst = []
path = sabnzbd.cfg.script_dir.get_path()
@@ -613,7 +614,7 @@ def make_script_path(script: str) -> Optional[str]:
return script_path
def get_admin_path(name: str, future: bool):
def get_admin_path(name: str, future: bool) -> str:
"""Return new-style full path to job-admin folder of the named job
or else the old cache path
"""
@@ -660,7 +661,7 @@ def set_permissions(path: str, recursive: bool = True):
UNWANTED_FILE_PERMISSIONS = stat.S_ISUID | stat.S_ISGID | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH
def removexbits(path: str, custom_permissions: int = None):
def removexbits(path: str, custom_permissions: Optional[int] = None):
"""Remove all the x-bits from files, respecting current or custom permissions"""
if os.path.isfile(path):
# Use custom permissions as base
@@ -783,7 +784,7 @@ def get_unique_filename(path: str) -> str:
@synchronized(DIR_LOCK)
def listdir_full(input_dir: str, recursive: bool = True) -> List[str]:
def listdir_full(input_dir: str, recursive: bool = True) -> list[str]:
"""List all files in dirs and sub-dirs"""
filelist = []
for root, dirs, files in os.walk(input_dir):
@@ -797,7 +798,7 @@ def listdir_full(input_dir: str, recursive: bool = True) -> List[str]:
@synchronized(DIR_LOCK)
def move_to_path(path: str, new_path: str) -> Tuple[bool, Optional[str]]:
def move_to_path(path: str, new_path: str) -> tuple[bool, Optional[str]]:
"""Move a file to a new path, optionally give unique filename
Return (ok, new_path)
"""
@@ -990,7 +991,7 @@ def remove_all(path: str, pattern: str = "*", keep_folder: bool = False, recursi
##############################################################################
# Diskfree
##############################################################################
def diskspace_base(dir_to_check: str) -> Tuple[float, float]:
def diskspace_base(dir_to_check: str) -> tuple[float, float]:
"""Return amount of free and used diskspace in GBytes"""
# Find first folder level that exists in the path
x = "x"
@@ -1024,7 +1025,7 @@ def diskspace_base(dir_to_check: str) -> Tuple[float, float]:
@conditional_cache(cache_time=10)
def diskspace(force: bool = False) -> Dict[str, Tuple[float, float]]:
def diskspace(force: bool = False) -> dict[str, tuple[float, float]]:
"""Wrapper to keep results cached by conditional_cache
If called with force=True, the wrapper will clear the results"""
return {
@@ -1033,7 +1034,7 @@ def diskspace(force: bool = False) -> Dict[str, Tuple[float, float]]:
}
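`conditional_cache` itself lives in `sabnzbd.decorators`; a hypothetical minimal version of the pattern it implements (return the cached result for `cache_time` seconds, recompute when called with `force=True`) could look like:

```python
import time
import functools

def conditional_cache(cache_time):
    """Hypothetical sketch, not the real sabnzbd.decorators implementation."""
    def deco(func):
        state = {"t": 0.0, "value": None}

        @functools.wraps(func)
        def wrapper(force=False):
            now = time.monotonic()
            # Recompute when forced, never computed, or the result expired
            if force or state["value"] is None or now - state["t"] > cache_time:
                state["value"] = func(force)
                state["t"] = now
            return state["value"]

        return wrapper
    return deco

calls = []

@conditional_cache(cache_time=10)
def expensive(force=False):
    calls.append(1)
    return len(calls)
```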
def get_new_id(prefix, folder, check_list=None):
def get_new_id(prefix: str, folder: str, check_list: Optional[list] = None) -> str:
"""Return unique prefixed admin identifier within folder
optionally making sure that id is not in the check_list.
"""
@@ -1054,7 +1055,7 @@ def get_new_id(prefix, folder, check_list=None):
raise IOError
def save_data(data, _id, path, do_pickle=True, silent=False):
def save_data(data: Any, _id: str, path: str, do_pickle: bool = True, silent: bool = False):
"""Save data to a diskfile"""
if not silent:
logging.debug("[%s] Saving data for %s in %s", sabnzbd.misc.caller_name(), _id, path)
@@ -1081,7 +1082,14 @@ def save_data(data, _id, path, do_pickle=True, silent=False):
time.sleep(0.1)
def load_data(data_id, path, remove=True, do_pickle=True, silent=False):
def load_data(
data_id: str,
path: str,
remove: bool = True,
do_pickle: bool = True,
silent: bool = False,
mutable: bool = False,
) -> Any:
"""Read data from disk file"""
path = os.path.join(path, data_id)
@@ -1100,6 +1108,9 @@ def load_data(data_id, path, remove=True, do_pickle=True, silent=False):
except UnicodeDecodeError:
# Could be Python 2 data that we can load using old encoding
data = pickle.load(data_file, encoding="latin1")
elif mutable:
data = bytearray(os.fstat(data_file.fileno()).st_size)
data_file.readinto(data)
else:
data = data_file.read()
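The new `mutable` branch sizes a `bytearray` from `os.fstat` and fills it in place with `readinto`, so the caller gets a mutable buffer without the extra copy a plain `read()` would produce. The same pattern in isolation:

```python
import os

# Sketch of the mutable-read path added to load_data
def read_mutable(path: str) -> bytearray:
    with open(path, "rb") as data_file:
        # Preallocate a buffer of exactly the file's size, then fill it
        data = bytearray(os.fstat(data_file.fileno()).st_size)
        data_file.readinto(data)
    return data
```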
@@ -1129,7 +1140,7 @@ def save_admin(data: Any, data_id: str):
save_data(data, data_id, sabnzbd.cfg.admin_dir.get_path())
def load_admin(data_id: str, remove=False, silent=False) -> Any:
def load_admin(data_id: str, remove: bool = False, silent: bool = False) -> Any:
"""Read data in admin folder in specified format"""
logging.debug("[%s] Loading data for %s", sabnzbd.misc.caller_name(), data_id)
return load_data(data_id, sabnzbd.cfg.admin_dir.get_path(), remove=remove, silent=silent)
@@ -1196,7 +1207,7 @@ def purge_log_files():
logging.debug("Finished purging log files")
def directory_is_writable_with_file(mydir, myfilename):
def directory_is_writable_with_file(mydir: str, myfilename: str) -> bool:
filename = os.path.join(mydir, myfilename)
if os.path.exists(filename):
try:
@@ -1222,7 +1233,7 @@ def directory_is_writable(test_dir: str) -> bool:
return True
def check_filesystem_capabilities(test_dir: str) -> bool:
def check_filesystem_capabilities(test_dir: str, is_download_dir: bool = False) -> bool:
"""Checks if we can write long and unicode filenames to the given directory.
If not on Windows, also check for special chars like slashes and :
Returns True if all OK, otherwise False"""
@@ -1250,10 +1261,25 @@ def check_filesystem_capabilities(test_dir: str) -> bool:
)
allgood = False
# sparse files allow efficient use of empty space in files
if is_download_dir and not check_sparse_and_disable(test_dir):
# Writing to correct file offsets will be disabled, and it won't be possible to flush the article cache
# directly to the destination file
sabnzbd.misc.helpful_warning(T("%s does not support sparse files. Disabling direct write mode."), test_dir)
allgood = False
return allgood
def get_win_drives() -> List[str]:
def check_sparse_and_disable(test_dir: str) -> bool:
"""Check if sparse files are supported, otherwise disable direct write mode"""
if sabnzbd.cfg.direct_write() and not is_sparse_supported(test_dir):
sabnzbd.cfg.direct_write.set(False)
return False
return True
def get_win_drives() -> list[str]:
"""Return list of detected drives, adapted from:
http://stackoverflow.com/questions/827371/is-there-a-way-to-list-all-the-available-drive-letters-in-python/827490
"""
@@ -1281,7 +1307,7 @@ PATHBROWSER_JUNKFOLDERS = (
)
def pathbrowser(path: str, show_hidden: bool = False, show_files: bool = False) -> List[Dict[str, str]]:
def pathbrowser(path: str, show_hidden: bool = False, show_files: bool = False) -> list[dict[str, str]]:
"""Returns a list of dictionaries with the folders and files contained at the given path
Give the empty string as the path to list the contents of the root path
under Unix this means "/", on Windows this will be a list of drive letters
@@ -1367,3 +1393,44 @@ def pathbrowser(path: str, show_hidden: bool = False, show_files: bool = False)
)
return file_list
def create_work_name(name: str) -> str:
"""Remove ".nzb" and ".par(2)" and sanitize, skip URLs"""
if name.find("://") < 0:
# Invalid characters need to be removed before and after (see unit-tests)
return sanitize_foldername(strip_extensions(sanitize_foldername(name)))
else:
return name.strip()
def is_sparse(path: str) -> bool:
"""Check if a path is a sparse file"""
info = os.stat(path)
if sabnzbd.WINDOWS:
return bool(info.st_file_attributes & stat.FILE_ATTRIBUTE_SPARSE_FILE)
# Linux and macOS
if info.st_blocks * 512 < info.st_size:
return True
# Filesystem with SEEK_HOLE (ZFS)
try:
with open(path, "rb") as f:
pos = f.seek(0, os.SEEK_HOLE)
return pos < info.st_size
except (AttributeError, OSError):
pass
return False
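On Linux and macOS the first check compares allocated blocks against the apparent size: `st_blocks` is counted in 512-byte units regardless of the filesystem block size, so a sparse file allocates fewer bytes than it claims to contain. The heuristic in isolation, with illustrative numbers rather than a real `stat` result:

```python
# Heuristic from is_sparse: allocated bytes (st_blocks * 512) smaller
# than the apparent size means holes exist in the file
def looks_sparse(st_blocks: int, st_size: int) -> bool:
    return st_blocks * 512 < st_size
```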
def is_sparse_supported(check_dir: str) -> bool:
"""Check if a directory supports sparse files"""
sparse_file = tempfile.NamedTemporaryFile(dir=check_dir, delete=False)
try:
sabctools.sparse(sparse_file.fileno(), 64)
sparse_file.close()
return is_sparse(sparse_file.name)
finally:
os.remove(sparse_file.name)

View File

@@ -23,10 +23,9 @@ import socket
import threading
import time
import logging
import functools
from dataclasses import dataclass
from more_itertools import roundrobin
from typing import Tuple, Union, Optional
from typing import Union, Optional
import sabnzbd.cfg as cfg
from sabnzbd.constants import DEF_NETWORKING_TIMEOUT
@@ -61,7 +60,7 @@ class AddrInfo:
type: socket.SocketKind
proto: int
canonname: str
sockaddr: Union[Tuple[str, int], Tuple[str, int, int, int]]
sockaddr: Union[tuple[str, int], tuple[str, int, int, int]]
ipaddress: str = ""
port: int = 0
connection_time: float = 0.0

View File

@@ -34,7 +34,7 @@ import copy
from random import randint
from xml.sax.saxutils import escape
from Cheetah.Template import Template
from typing import Optional, Callable, Union, Any, Dict, List
from typing import Optional, Callable, Union, Any
from guessit.api import properties as guessit_properties
import sabnzbd
@@ -264,7 +264,7 @@ def check_hostname():
COOKIE_SECRET = str(randint(1000, 100000) * os.getpid())
def remote_ip_from_xff(xff_ips: List[str]) -> str:
def remote_ip_from_xff(xff_ips: list[str]) -> str:
# Per MDN docs, the first non-local/non-trusted IP (reading the list right-to-left) is our "client"
# However, it's possible that all IPs are local/trusted, so we may also
# return the first ip in the list as it "should" be the client
@@ -399,7 +399,7 @@ def check_apikey(kwargs):
return _MSG_APIKEY_INCORRECT
def template_filtered_response(file: str, search_list: Dict[str, Any]):
def template_filtered_response(file: str, search_list: dict[str, Any]):
"""Wrapper for Cheetah response"""
# We need a copy, because otherwise source-dicts might be modified
search_list_copy = copy.deepcopy(search_list)
@@ -558,7 +558,7 @@ class Wizard:
info["password"] = ""
info["connections"] = ""
info["ssl"] = 1
info["ssl_verify"] = 2
info["ssl_verify"] = 3
else:
# Sort servers to get the first enabled one
server_names = sorted(
@@ -895,6 +895,7 @@ SPECIAL_BOOL_LIST = (
"allow_old_ssl_tls",
"enable_season_sorting",
"verify_xff_header",
"direct_write",
)
SPECIAL_VALUE_LIST = (
"downloader_sleep_time",
@@ -906,6 +907,7 @@ SPECIAL_VALUE_LIST = (
"max_foldername_length",
"url_base",
"receive_threads",
"assembler_max_queue_size",
"switchinterval",
"direct_unpack_threads",
"selftest_host",
@@ -1268,7 +1270,7 @@ class ConfigRss:
active_feed,
download=self.__refresh_download,
force=self.__refresh_force,
ignoreFirst=self.__refresh_ignore,
ignore_first=self.__refresh_ignore,
readout=readout,
)
else:

View File

@@ -26,7 +26,6 @@ import socket
import ssl
import time
import threading
from typing import Dict
import sabctools
import sabnzbd
@@ -44,7 +43,7 @@ NR_CONNECTIONS = 5
TIME_LIMIT = 3
def internetspeed_worker(secure_sock: ssl.SSLSocket, socket_speed: Dict[ssl.SSLSocket, float]):
def internetspeed_worker(secure_sock: ssl.SSLSocket, socket_speed: dict[ssl.SSLSocket, float]):
"""Worker to perform the requests in parallel"""
secure_sock.sendall(TEST_REQUEST.encode())
empty_buffer = memoryview(sabctools.bytearray_malloc(BUFFER_SIZE))

View File

@@ -41,7 +41,7 @@ import math
import rarfile
from threading import Thread
from collections.abc import Iterable
from typing import Union, Tuple, Any, AnyStr, Optional, List, Dict, Collection
from typing import Union, Any, AnyStr, Optional, Collection
import sabnzbd
import sabnzbd.getipaddress
@@ -57,11 +57,12 @@ import sabnzbd.config as config
import sabnzbd.cfg as cfg
from sabnzbd.decorators import conditional_cache
from sabnzbd.encoding import ubtou, platform_btou
from sabnzbd.filesystem import userxbit, make_script_path, remove_file
from sabnzbd.filesystem import userxbit, make_script_path, remove_file, strip_extensions
if sabnzbd.WINDOWS:
try:
import winreg
import win32api
import win32process
import win32con
@@ -85,6 +86,10 @@ RE_SAMPLE = re.compile(r"((^|[\W_])(sample|proof))", re.I) # something-sample o
RE_IP4 = re.compile(r"inet\s+(addr:\s*)?(\d+\.\d+\.\d+\.\d+)")
RE_IP6 = re.compile(r"inet6\s+(addr:\s*)?([0-9a-f:]+)", re.I)
# Name patterns for NZB parsing
RE_SUBJECT_FILENAME_QUOTES = re.compile(r'"([^"]*)"')
RE_SUBJECT_BASIC_FILENAME = re.compile(r"\b([\w\-+()' .,]+(?:\[[\w\-/+()' .,]*][\w\-+()' .,]*)*\.[A-Za-z0-9]{2,4})\b")
# Check if strings are defined for AM and PM
HAVE_AMPM = bool(time.strftime("%p"))
@@ -178,7 +183,7 @@ def is_none(inp: Any) -> bool:
return not inp or (isinstance(inp, str) and inp.lower() == "none")
def clean_comma_separated_list(inp: Any) -> List[str]:
def clean_comma_separated_list(inp: Any) -> list[str]:
"""Return a list of stripped values from a string or list, empty ones removed"""
result_ids = []
if isinstance(inp, str):
@@ -190,7 +195,7 @@ def clean_comma_separated_list(inp: Any) -> List[str]:
return result_ids
def cmp(x, y):
def cmp(x: Any, y: Any) -> int:
"""
Replacement for built-in function cmp that was removed in Python 3
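The body is cut off by the diff, but the standard Python 3 stand-in for the removed built-in is a bool subtraction that yields -1, 0, or 1:

```python
# Common replacement idiom for the cmp() built-in removed in Python 3
def cmp(x, y):
    return (x > y) - (x < y)
```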
@@ -217,7 +222,7 @@ def cat_pp_script_sanitizer(
cat: Optional[str] = None,
pp: Optional[Union[int, str]] = None,
script: Optional[str] = None,
) -> Tuple[Optional[Union[int, str]], Optional[str], Optional[str]]:
) -> tuple[Optional[Union[int, str]], Optional[str], Optional[str]]:
"""Basic sanitizer from outside input to a bit more predictable values"""
# * and Default are valid values
if safe_lower(cat) in ("", "none"):
@@ -234,7 +239,7 @@ def cat_pp_script_sanitizer(
return cat, pp, script
def name_to_cat(fname, cat=None):
def name_to_cat(fname: str, cat: Optional[str] = None) -> tuple[str, Optional[str]]:
"""Retrieve category from file name, but only if "cat" is None."""
if cat is None and fname.startswith("{{"):
n = fname.find("}}")
@@ -246,7 +251,9 @@ def name_to_cat(fname, cat=None):
return fname, cat
def cat_to_opts(cat, pp=None, script=None, priority=None) -> Tuple[str, int, str, int]:
def cat_to_opts(
cat: Optional[str], pp: Optional[int] = None, script: Optional[str] = None, priority: Optional[int] = None
) -> tuple[str, int, str, int]:
"""Derive options from category, if options not already defined.
Specified options have priority over category-options.
If no valid category is given, special category '*' will supply default values
@@ -279,7 +286,7 @@ def cat_to_opts(cat, pp=None, script=None, priority=None) -> Tuple[str, int, str
return cat, pp, script, priority
def pp_to_opts(pp: Optional[int]) -> Tuple[bool, bool, bool]:
def pp_to_opts(pp: Optional[int]) -> tuple[bool, bool, bool]:
"""Convert numeric processing options to (repair, unpack, delete)"""
# Convert the pp to an int
pp = int_conv(pp)
@@ -331,12 +338,12 @@ _wildcard_to_regex = {
}
def wildcard_to_re(text):
def wildcard_to_re(text: str) -> str:
"""Convert plain wildcard string (with '*' and '?') to regex."""
return "".join([_wildcard_to_regex.get(ch, ch) for ch in text])
def convert_filter(text):
def convert_filter(text: str) -> Optional[re.Pattern]:
"""Return compiled regex.
If string starts with re: it's a real regex
else quote all regex specials, replace '*' by '.*'
@@ -353,7 +360,7 @@ def convert_filter(text):
return None
def cat_convert(cat):
def cat_convert(cat: Optional[str]) -> Optional[str]:
"""Convert indexer's category/group-name to user categories.
If no match found, but indexer-cat equals user-cat, then return user-cat
If no match found, but the indexer-cat starts with the user-cat, return user-cat
@@ -397,7 +404,7 @@ _SERVICE_KEY = "SYSTEM\\CurrentControlSet\\services\\"
_SERVICE_PARM = "CommandLine"
def get_serv_parms(service):
def get_serv_parms(service: str) -> list[str]:
"""Get the service command line parameters from Registry"""
service_parms = []
try:
@@ -416,7 +423,7 @@ def get_serv_parms(service):
return service_parms
def set_serv_parms(service, args):
def set_serv_parms(service: str, args: list) -> bool:
"""Set the service command line parameters in Registry"""
serv = []
for arg in args:
@@ -444,7 +451,7 @@ def get_from_url(url: str) -> Optional[str]:
return None
def convert_version(text):
def convert_version(text: str) -> tuple[int, bool]:
"""Convert version string to numerical value and a testversion indicator"""
version = 0
test = True
@@ -551,7 +558,7 @@ def check_latest_version():
)
def upload_file_to_sabnzbd(url, fp):
def upload_file_to_sabnzbd(url: str, fp: str):
"""Function for uploading nzbs to a running SABnzbd instance"""
try:
fp = urllib.parse.quote_plus(fp)
@@ -644,7 +651,7 @@ def to_units(val: Union[int, float], postfix="") -> str:
return f"{sign}{val:.{decimals}f}{units}"
def caller_name(skip=2):
def caller_name(skip: int = 2) -> str:
"""Get a name of a caller in the format module.method
Originally used: https://gist.github.com/techtonik/2151727
Adapted for speed by using sys calls directly
@@ -682,7 +689,7 @@ def exit_sab(value: int):
os._exit(value)
def split_host(srv):
def split_host(srv: Optional[str]) -> tuple[Optional[str], Optional[int]]:
"""Split host:port notation, allowing for IPV6"""
if not srv:
return None, None
@@ -704,22 +711,14 @@ def split_host(srv):
return out[0], port
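The IPv6 allowance means a trailing `:port` must be split off only when the remainder is a digit run, and surrounding brackets are dropped from the host. A simplified sketch of the behavior, not the full implementation:

```python
from typing import Optional

# Simplified host:port splitter tolerating bracketed IPv6 literals;
# a sketch, not the real split_host
def split_host(srv: Optional[str]):
    if not srv:
        return None, None
    host, sep, port = srv.rpartition(":")
    if sep and port.isdigit() and "]" not in port:
        return host.strip("[]"), int(port)
    return srv.strip("[]"), None
```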
def get_cache_limit():
def get_cache_limit() -> str:
"""Depending on OS, calculate cache limits.
In ArticleCache it will make sure we stay
within system limits for 32/64 bit
"""
# Calculate, if possible
try:
if sabnzbd.WINDOWS:
# Windows
mem_bytes = get_windows_memory()
elif sabnzbd.MACOS:
# macOS
mem_bytes = get_macos_memory()
else:
# Linux
mem_bytes = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
mem_bytes = get_memory()
# Use 1/4th of available memory
mem_bytes = mem_bytes / 4
@@ -742,40 +741,31 @@ def get_cache_limit():
return ""
def get_windows_memory():
"""Use ctypes to extract available memory"""
class MEMORYSTATUSEX(ctypes.Structure):
_fields_ = [
("dwLength", ctypes.c_ulong),
("dwMemoryLoad", ctypes.c_ulong),
("ullTotalPhys", ctypes.c_ulonglong),
("ullAvailPhys", ctypes.c_ulonglong),
("ullTotalPageFile", ctypes.c_ulonglong),
("ullAvailPageFile", ctypes.c_ulonglong),
("ullTotalVirtual", ctypes.c_ulonglong),
("ullAvailVirtual", ctypes.c_ulonglong),
("sullAvailExtendedVirtual", ctypes.c_ulonglong),
]
def __init__(self):
# have to initialize this to the size of MEMORYSTATUSEX
self.dwLength = ctypes.sizeof(self)
super(MEMORYSTATUSEX, self).__init__()
stat = MEMORYSTATUSEX()
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))
return stat.ullTotalPhys
def get_macos_memory():
"""Use system-call to extract total memory on macOS"""
system_output = run_command(["sysctl", "hw.memsize"])
return float(system_output.split()[1])
def get_memory() -> int:
try:
if sabnzbd.WINDOWS:
# Use win32api to get total physical memory
mem_info = win32api.GlobalMemoryStatusEx()
return mem_info["TotalPhys"]
elif sabnzbd.MACOS:
# Use system-call to extract total memory on macOS
system_output = run_command(["sysctl", "-n", "hw.memsize"]).strip()
else:
try:
with open("/proc/meminfo") as f:
for line in f:
if line.startswith("MemTotal:"):
return int(line.split()[1]) * 1024
except Exception:
pass
return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
except Exception:
pass
return 0
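The Linux branch reads `MemTotal` from `/proc/meminfo`, which reports kibibytes, hence the `* 1024` to get bytes. Parsing one such line in isolation (the sample value is made up):

```python
# /proc/meminfo reports MemTotal in kB (kibibytes); convert to bytes
def parse_memtotal(line: str) -> int:
    return int(line.split()[1]) * 1024
```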
@conditional_cache(cache_time=3600)
def get_cpu_name():
def get_cpu_name() -> Optional[str]:
"""Find the CPU name (which needs a different method per OS), and return it
If none found, return platform.platform()"""
@@ -875,7 +865,7 @@ def on_cleanup_list(filename: str, skip_nzb: bool = False) -> bool:
return False
def memory_usage():
def memory_usage() -> Optional[str]:
try:
# Probably only works on Linux because it uses /proc/<pid>/statm
with open("/proc/%d/statm" % os.getpid()) as t:
@@ -897,7 +887,7 @@ except Exception:
_HAVE_STATM = _PAGE_SIZE and memory_usage()
def loadavg():
def loadavg() -> str:
"""Return 1, 5 and 15 minute load average of host or "" if not supported"""
p = ""
if not sabnzbd.WINDOWS and not sabnzbd.MACOS:
@@ -972,7 +962,7 @@ def bool_conv(value: Any) -> bool:
return bool(int_conv(value))
def create_https_certificates(ssl_cert, ssl_key):
def create_https_certificates(ssl_cert: str, ssl_key: str) -> bool:
"""Create self-signed HTTPS certificates and store in paths 'ssl_cert' and 'ssl_key'"""
try:
from sabnzbd.utils.certgen import generate_key, generate_local_cert
@@ -988,7 +978,7 @@ def create_https_certificates(ssl_cert, ssl_key):
return True
def get_all_passwords(nzo) -> List[str]:
def get_all_passwords(nzo) -> list[str]:
"""Get all passwords, from the NZB, meta and password file. In case a working password is
already known, try it first."""
passwords = []
@@ -1051,7 +1041,7 @@ def is_sample(filename: str) -> bool:
return bool(re.search(RE_SAMPLE, filename))
def find_on_path(targets):
def find_on_path(targets: Union[str, tuple[str, ...]]) -> Optional[str]:
"""Search the PATH for a program and return full path"""
if sabnzbd.WINDOWS:
paths = os.getenv("PATH").split(";")
@@ -1170,7 +1160,7 @@ def is_local_addr(ip: str) -> bool:
return is_lan_addr(ip)
def ip_extract() -> List[str]:
def ip_extract() -> list[str]:
"""Return list of IP addresses of this system"""
ips = []
program = find_on_path("ip")
@@ -1215,7 +1205,7 @@ def get_base_url(url: str) -> str:
return ""
def match_str(text: AnyStr, matches: Tuple[AnyStr, ...]) -> Optional[AnyStr]:
def match_str(text: AnyStr, matches: tuple[AnyStr, ...]) -> Optional[AnyStr]:
"""Return first matching element of list 'matches' in 'text', otherwise None"""
text = text.lower()
for match in matches:
@@ -1224,7 +1214,7 @@ def match_str(text: AnyStr, matches: Tuple[AnyStr, ...]) -> Optional[AnyStr]:
return None
def recursive_html_escape(input_dict_or_list: Union[Dict[str, Any], List], exclude_items: Tuple[str, ...] = ()):
def recursive_html_escape(input_dict_or_list: Union[dict[str, Any], list], exclude_items: tuple[str, ...] = ()):
"""Recursively update the input_dict in-place with html-safe values"""
if isinstance(input_dict_or_list, (dict, list)):
if isinstance(input_dict_or_list, dict):
@@ -1245,7 +1235,7 @@ def recursive_html_escape(input_dict_or_list: Union[Dict[str, Any], List], exclu
raise ValueError("Expected dict or str, got %s" % type(input_dict_or_list))
def list2cmdline_unrar(lst: List[str]) -> str:
def list2cmdline_unrar(lst: list[str]) -> str:
"""convert list to a unrar.exe-compatible command string
Unrar uses "" instead of \" to escape the double quote"""
nlst = []
@@ -1259,7 +1249,9 @@ def list2cmdline_unrar(lst: List[str]) -> str:
return " ".join(nlst)
def build_and_run_command(command: List[str], windows_unrar_command: bool = False, text_mode: bool = True, **kwargs):
def build_and_run_command(
command: list[str], windows_unrar_command: bool = False, text_mode: bool = True, **kwargs
) -> subprocess.Popen:
"""Builds and then runs command with necessary flags and optional
IONice and Nice commands. Optional Popen arguments can be supplied.
On Windows we need to run our own list2cmdline for Unrar.
@@ -1326,7 +1318,7 @@ def build_and_run_command(command: List[str], windows_unrar_command: bool = Fals
return subprocess.Popen(command, **popen_kwargs)
def run_command(cmd: List[str], **kwargs):
def run_command(cmd: list[str], **kwargs) -> str:
"""Run simple external command and return output as a string."""
with build_and_run_command(cmd, **kwargs) as p:
txt = p.stdout.read()
@@ -1359,7 +1351,7 @@ def set_socks5_proxy():
socket.socket = socks.socksocket
def set_https_verification(value):
def set_https_verification(value: bool) -> bool:
"""Set HTTPS-verification state while returning current setting
False = disable verification
"""
@@ -1381,7 +1373,7 @@ def request_repair():
pass
def check_repair_request():
def check_repair_request() -> bool:
"""Return True if repair request found, remove afterwards"""
path = os.path.join(cfg.admin_dir.get_path(), REPAIR_REQUEST)
if os.path.exists(path):
@@ -1514,8 +1506,8 @@ def convert_sorter_settings():
min_size: Union[str, int] = "50M"
multipart_label: Optional[str] = ""
sort_string: str
sort_cats: List[str]
sort_type: List[int]
sort_cats: list[str]
sort_type: list[int]
is_active: bool = 1
}
@@ -1575,7 +1567,7 @@ def convert_sorter_settings():
def convert_history_retention():
"""Convert single-option to the split history retention setting"""
if "d" in cfg.history_retention():
days_to_keep = int_conv(cfg.history_retention().strip()[:-1])
days_to_keep = int_conv(cfg.history_retention().strip().removesuffix("d"))
cfg.history_retention_option.set("days-delete")
cfg.history_retention_number.set(days_to_keep)
else:
@@ -1587,6 +1579,66 @@ def convert_history_retention():
cfg.history_retention_option.set("all-delete")
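The change swaps the fixed `[:-1]` slice for `removesuffix("d")`, which only strips the suffix when it is actually present. The conversion step in isolation:

```python
# "30d" -> 30; removesuffix is a no-op when there is no trailing "d"
def days_from_retention(value: str) -> int:
    return int(value.strip().removesuffix("d"))
```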
def scan_password(name: str) -> tuple[str, Optional[str]]:
"""Get password (if any) from the title"""
if "http://" in name or "https://" in name:
return name, None
# Strip any unwanted usenet-related extensions
name = strip_extensions(name)
# Identify any braces
braces = name[1:].find("{{")
if braces < 0:
braces = len(name)
else:
braces += 1
slash = name.find("/")
# Look for name/password, but make sure that '/' comes before any {{
if 0 < slash < braces and "password=" not in name:
# Is it maybe in 'name / password' notation?
if slash == name.find(" / ") + 1 and name[: slash - 1].strip(". "):
# Remove the extra space after name and before password
return name[: slash - 1].strip(". "), name[slash + 2 :]
if name[:slash].strip(". "):
return name[:slash].strip(". "), name[slash + 1 :]
# Look for "name password=password"
pw = name.find("password=")
if pw > 0 and name[:pw].strip(". "):
return name[:pw].strip(". "), name[pw + 9 :]
# Look for name{{password}}
if braces < len(name):
closing_braces = name.rfind("}}")
if closing_braces > braces and name[:braces].strip(". "):
return name[:braces].strip(". "), name[braces + 2 : closing_braces]
# Look again for name/password
if slash > 0 and name[:slash].strip(". "):
return name[:slash].strip(". "), name[slash + 1 :]
# No password found
return name, None
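Of the notations handled above, the `name password=secret` branch is the simplest to isolate. A reduced sketch of just that branch (the real function also handles `name/password` and `name{{password}}`):

```python
from typing import Optional

# Reduced sketch of the "password=" branch of scan_password
def split_password_kv(name: str) -> tuple[str, Optional[str]]:
    pw = name.find("password=")
    if pw > 0 and name[:pw].strip(". "):
        # Everything before the marker is the name, everything after it
        # (len("password=") == 9) is the password
        return name[:pw].strip(". "), name[pw + 9:]
    return name, None
```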
def subject_name_extractor(subject: str) -> str:
"""Try to extract a file name from a subject line, return `subject` if in doubt"""
# Filename nicely wrapped in quotes
for name in re.findall(RE_SUBJECT_FILENAME_QUOTES, subject):
if name := name.strip(' "'):
return name
# Found nothing? Try a basic filename-like search
for name in re.findall(RE_SUBJECT_BASIC_FILENAME, subject):
if name := name.strip():
return name
# Return the subject
return subject
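The quoted-filename case is the common one in Usenet subjects. A self-contained sketch of that first pass (the fallback `RE_SUBJECT_BASIC_FILENAME` pattern is considerably more involved):

```python
import re

# First pass of subject_name_extractor: a filename wrapped in quotes
RE_QUOTES = re.compile(r'"([^"]*)"')

def name_from_subject(subject: str) -> str:
    for name in re.findall(RE_QUOTES, subject):
        if name := name.strip(' "'):
            return name
    return subject
```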
##
## SABnzbd patched rarfile classes
## Patch for https://github.com/markokr/rarfile/issues/56#issuecomment-711146569
@@ -1615,7 +1667,7 @@ class SABRarFile(rarfile.RarFile):
self._file_parser._info_list.append(rar_obj)
self._file_parser._info_map[rar_obj.filename.rstrip("/")] = rar_obj
def filelist(self):
def filelist(self) -> list[str]:
"""Return list of filenames in archive."""
return [f.filename for f in self.infolist() if not f.isdir()]

View File

@@ -29,7 +29,7 @@ import io
import shutil
import functools
import rarfile
from typing import Tuple, List, BinaryIO, Optional, Dict, Any, Union, Set
from typing import BinaryIO, Optional, Any, Union
import sabnzbd
from sabnzbd.encoding import correct_unknown_encoding, ubtou
@@ -64,12 +64,12 @@ from sabnzbd.filesystem import (
SEVENMULTI_RE,
is_size,
get_basename,
create_all_dirs,
)
from sabnzbd.nzbstuff import NzbObject
from sabnzbd.nzb import NzbObject
import sabnzbd.cfg as cfg
from sabnzbd.constants import Status
# Regex globals
RAR_V3_RE = re.compile(r"\.(?P<ext>part\d*)$", re.I)
RAR_EXTRACTFROM_RE = re.compile(r"^Extracting\sfrom\s(.+)")
@@ -117,7 +117,14 @@ def find_programs(curdir: str):
sabnzbd.newsunpack.SEVENZIP_COMMAND = check(curdir, "macos/7zip/7zz")
if sabnzbd.WINDOWS:
sabnzbd.newsunpack.PAR2_COMMAND = check(curdir, "win/par2/par2.exe")
if sabnzbd.WINDOWSARM64:
# ARM64 version of par2
sabnzbd.newsunpack.PAR2_COMMAND = check(curdir, "win/par2/arm64/par2.exe")
else:
# Regular x64 version
sabnzbd.newsunpack.PAR2_COMMAND = check(curdir, "win/par2/par2.exe")
# UnRAR has no arm64 version, so we skip it also for 7zip
sabnzbd.newsunpack.RAR_COMMAND = check(curdir, "win/unrar/UnRAR.exe")
sabnzbd.newsunpack.SEVENZIP_COMMAND = check(curdir, "win/7zip/7za.exe")
else:
@@ -200,7 +207,7 @@ ENV_NZO_FIELDS = [
def external_processing(
extern_proc: str, nzo: NzbObject, complete_dir: str, nicename: str, status: int
) -> Tuple[str, int]:
) -> tuple[str, int]:
"""Run a user postproc script, return console output and exit value"""
failure_url = nzo.nzo_info.get("failure", "")
# Items can be bool or null, causing POpen to fail
@@ -262,12 +269,12 @@ def unpacker(
nzo: NzbObject,
workdir_complete: str,
one_folder: bool,
joinables: List[str] = [],
rars: List[str] = [],
sevens: List[str] = [],
ts: List[str] = [],
joinables: list[str] = [],
rars: list[str] = [],
sevens: list[str] = [],
ts: list[str] = [],
depth: int = 0,
) -> Tuple[Union[int, bool], List[str]]:
) -> tuple[Union[int, bool], list[str]]:
"""Do a recursive unpack from all archives in 'download_path' to 'workdir_complete'"""
if depth > 2:
# Prevent going too deep down the rabbit-hole
@@ -359,7 +366,7 @@ def unpacker(
##############################################################################
# Filejoin Functions
##############################################################################
def match_ts(file: str) -> Tuple[str, int]:
def match_ts(file: str) -> tuple[str, int]:
"""Return the set name and sequence number if file is a joinable TS file"""
match = TS_RE.search(file)
if not match:
@@ -374,7 +381,7 @@ def match_ts(file: str) -> Tuple[str, int]:
return setname, num
def clean_up_joinables(names: List[str]):
def clean_up_joinables(names: list[str]):
"""Remove joinable files and their .1 backups"""
for name in names:
if os.path.exists(name):
@@ -403,7 +410,7 @@ def get_seq_number(name: str) -> int:
return 0
def file_join(nzo: NzbObject, workdir_complete: str, joinables: List[str]) -> Tuple[bool, List[str]]:
def file_join(nzo: NzbObject, workdir_complete: str, joinables: list[str]) -> tuple[bool, list[str]]:
"""Join all joinable files in 'workdir' to 'workdir_complete' and
when successful, delete originals
"""
@@ -494,7 +501,7 @@ def file_join(nzo: NzbObject, workdir_complete: str, joinables: List[str]) -> Tu
##############################################################################
# (Un)Rar Functions
##############################################################################
def rar_unpack(nzo: NzbObject, workdir_complete: str, one_folder: bool, rars: List[str]) -> Tuple[int, List[str]]:
def rar_unpack(nzo: NzbObject, workdir_complete: str, one_folder: bool, rars: list[str]) -> tuple[int, list[str]]:
"""Unpack multiple sets 'rars' of RAR files from 'download_path' to 'workdir_complete'.
When 'delete' is set, originals will be deleted.
When 'one_folder' is set, all files will be in a single folder
@@ -616,7 +623,7 @@ def rar_unpack(nzo: NzbObject, workdir_complete: str, one_folder: bool, rars: Li
def rar_extract(
rarfile_path: str, numrars: int, one_folder: bool, nzo: NzbObject, setname: str, extraction_path: str
) -> Tuple[int, List[str], List[str]]:
) -> tuple[int, list[str], list[str]]:
"""Unpack single rar set 'rarfile' to 'extraction_path',
with password tries
Return fail==0(ok)/fail==1(error)/fail==2(wrong password)/fail==3(crc-error), new_files, rars
@@ -626,6 +633,12 @@ def rar_extract(
rars = []
passwords = get_all_passwords(nzo)
# Sanity check, does the folder exist? Could be removed by aborted Direct Unpack
if not os.path.exists(extraction_path):
# Similar to prepare_extraction_path
extraction_path = create_all_dirs(extraction_path, apply_permissions=True)
logging.info("Extraction path (re)created because it was missing: %s", extraction_path)
for password in passwords:
if password:
logging.debug('Trying unrar with password "%s"', password)
@@ -642,14 +655,14 @@ def rar_extract(
def rar_extract_core(
rarfile_path: str, numrars: int, one_folder: bool, nzo: NzbObject, setname: str, extraction_path: str, password: str
) -> Tuple[int, List[str], List[str]]:
) -> tuple[int, list[str], list[str]]:
"""Unpack single rar set 'rarfile_path' to 'extraction_path'
Return fail==0(ok)/fail==1(error)/fail==2(wrong password)/fail==3(crc-error), new_files, rars
"""
start = time.time()
logging.debug("Extraction path: %s", extraction_path)
logging.debug("Found rar version: %s", rarfile.is_rarfile(rarfile_path))
logging.debug("Found rar version: %s", rarfile.get_rar_version(rarfile_path))
if password:
password_command = "-p%s" % password
@@ -866,7 +879,7 @@ def rar_extract_core(
##############################################################################
# 7Zip Functions
##############################################################################
def unseven(nzo: NzbObject, workdir_complete: str, one_folder: bool, sevens: List[str]):
def unseven(nzo: NzbObject, workdir_complete: str, one_folder: bool, sevens: list[str]) -> tuple[bool, list[str]]:
"""Unpack multiple sets '7z' of 7Zip files from 'download_path' to 'workdir_complete.
When 'delete' is set, originals will be deleted.
"""
@@ -914,7 +927,7 @@ def unseven(nzo: NzbObject, workdir_complete: str, one_folder: bool, sevens: Lis
def seven_extract(
nzo: NzbObject, seven_path: str, seven_set: str, extraction_path: str, one_folder: bool
) -> Tuple[int, List[str]]:
) -> tuple[int, list[str]]:
"""Unpack single set 'sevenset' to 'extraction_path', with password tries
Return fail==0(ok)/fail==1(error)/fail==2(wrong password), new_files, sevens
"""
@@ -938,7 +951,7 @@ def seven_extract(
def seven_extract_core(
nzo: NzbObject, seven_path: str, extraction_path: str, seven_set: str, one_folder: bool, password: str
) -> Tuple[int, List[str]]:
) -> tuple[int, list[str]]:
"""Unpack single 7Z set 'sevenset' to 'extraction_path'
Return fail==0(ok)/fail==1(error)/fail==2(wrong password), new_files, message
"""
@@ -1004,7 +1017,7 @@ def seven_extract_core(
##############################################################################
# PAR2 Functions
##############################################################################
def par2_repair(nzo: NzbObject, setname: str) -> Tuple[bool, bool]:
def par2_repair(nzo: NzbObject, setname: str) -> tuple[bool, bool]:
"""Try to repair a set, return readd and correctness"""
# Check which of the files exists
for new_par in nzo.extrapars[setname]:
@@ -1117,8 +1130,8 @@ def par2_repair(nzo: NzbObject, setname: str) -> Tuple[bool, bool]:
def par2cmdline_verify(
parfile: str, nzo: NzbObject, setname: str, joinables: List[str]
) -> Tuple[bool, bool, List[str], List[str]]:
parfile: str, nzo: NzbObject, setname: str, joinables: list[str]
) -> tuple[bool, bool, list[str], list[str]]:
"""Run par2 on par-set"""
used_joinables = []
used_for_repair = []
@@ -1403,7 +1416,7 @@ def par2cmdline_verify(
return finished, readd, used_joinables, used_for_repair
def create_env(nzo: Optional[NzbObject] = None, extra_env_fields: Dict[str, Any] = {}) -> Optional[Dict[str, Any]]:
def create_env(nzo: Optional[NzbObject] = None, extra_env_fields: dict[str, Any] = {}) -> Optional[dict[str, Any]]:
"""Modify the environment for pp-scripts with extra information
macOS: Return copy of environment without PYTHONPATH and PYTHONHOME
other: return None
@@ -1460,7 +1473,7 @@ def create_env(nzo: Optional[NzbObject] = None, extra_env_fields: Dict[str, Any]
return env
def rar_volumelist(rarfile_path: str, password: str, known_volumes: List[str]) -> List[str]:
def rar_volumelist(rarfile_path: str, password: str, known_volumes: list[str]) -> list[str]:
"""List volumes that are part of this rarset
and merge them with parsed paths list, removing duplicates.
We assume RarFile is right and use parsed paths as backup.
@@ -1516,7 +1529,7 @@ def quick_check_set(setname: str, nzo: NzbObject) -> bool:
result = True
nzf_list = nzo.finished_files
renames = {}
found_paths: Set[str] = set()
found_paths: set[str] = set()
# Files to ignore
ignore_ext = cfg.quick_check_ext_ignore()
@@ -1590,7 +1603,7 @@ def quick_check_set(setname: str, nzo: NzbObject) -> bool:
return result
def unrar_check(rar: str) -> Tuple[int, bool]:
def unrar_check(rar: str) -> tuple[int, bool]:
"""Return version number of unrar, where "5.01" returns 501
Also return whether an original version is found
(version, original)
@@ -1678,7 +1691,7 @@ def is_sfv_file(myfile: str) -> bool:
return sfv_info_line_counter >= 1
def sfv_check(sfvs: List[str], nzo: NzbObject) -> bool:
def sfv_check(sfvs: list[str], nzo: NzbObject) -> bool:
"""Verify files using SFV files"""
# Update status
nzo.status = Status.VERIFYING
@@ -1762,7 +1775,7 @@ def sfv_check(sfvs: List[str], nzo: NzbObject) -> bool:
return result
def parse_sfv(sfv_filename):
def parse_sfv(sfv_filename: str) -> dict[str, bytes]:
"""Parse SFV file and return dictionary of crc32's and filenames"""
results = {}
with open(sfv_filename, mode="rb") as sfv_list:
@@ -1787,12 +1800,12 @@ def add_time_left(perc: float, start_time: Optional[float] = None, time_used: Op
return ""
def pre_queue(nzo: NzbObject, pp, cat):
def pre_queue(nzo: NzbObject, pp: str, cat: str) -> list[Any]:
"""Run pre-queue script (if any) and process results.
pp and cat are supplied separately since they can change.
"""
def fix(p):
def fix(p: Any) -> str:
# If added via API, some items can still be "None" (as a string)
if is_none(p):
return ""
@@ -1886,7 +1899,7 @@ class SevenZip:
if not is_sevenfile(self.path):
raise TypeError("File is not a 7zip file")
def namelist(self) -> List[str]:
def namelist(self) -> list[str]:
"""Return list of names in 7Zip"""
names = []
command = [SEVENZIP_COMMAND, "l", "-p", "-y", "-slt", "-sccUTF-8", self.path]
@@ -1909,6 +1922,6 @@ class SevenZip:
p.wait()
return data
def close(self):
def close(self) -> None:
"""Close file"""
pass


@@ -21,20 +21,23 @@ sabnzbd.newswrapper
import errno
import socket
import threading
from collections import deque
from contextlib import suppress
from selectors import EVENT_READ, EVENT_WRITE
from threading import Thread
import time
import logging
import ssl
import sabctools
from typing import Optional, Tuple, Union
from typing import Optional, Tuple, Union, Callable
import sabctools
import sabnzbd
import sabnzbd.cfg
from sabnzbd.constants import DEF_NETWORKING_TIMEOUT, NNTP_BUFFER_SIZE, NTTP_MAX_BUFFER_SIZE
from sabnzbd.encoding import utob, ubtou
from sabnzbd.constants import DEF_NETWORKING_TIMEOUT, NNTP_BUFFER_SIZE, Status, FORCE_PRIORITY
from sabnzbd.encoding import utob
from sabnzbd.get_addrinfo import AddrInfo
from sabnzbd.decorators import synchronized, DOWNLOADER_LOCK
from sabnzbd.misc import int_conv
# Set pre-defined socket timeout
socket.setdefaulttimeout(DEF_NETWORKING_TIMEOUT)
@@ -57,35 +60,41 @@ class NewsWrapper:
"thrdnum",
"blocking",
"timeout",
"article",
"data",
"data_view",
"data_position",
"decoder",
"nntp",
"connected",
"ready",
"user_sent",
"pass_sent",
"group",
"user_ok",
"pass_ok",
"force_login",
"next_request",
"concurrent_requests",
"_response_queue",
"selector_events",
"lock",
"generation",
"tls_wants_write",
)
def __init__(self, server, thrdnum, block=False):
def __init__(self, server: "sabnzbd.downloader.Server", thrdnum: int, block: bool = False, generation: int = 0):
self.server: sabnzbd.downloader.Server = server
self.thrdnum: int = thrdnum
self.blocking: bool = block
self.generation: int = generation
if getattr(self, "lock", None) is None:
self.lock: threading.Lock = threading.Lock()
self.timeout: Optional[float] = None
self.article: Optional[sabnzbd.nzbstuff.Article] = None
self.data: Optional[bytearray] = None
self.data_view: Optional[memoryview] = None
self.data_position: int = 0
self.decoder: Optional[sabctools.Decoder] = None
self.nntp: Optional[NNTP] = None
self.connected: bool = False
self.connected: bool = False # TCP/TLS handshake complete
self.ready: bool = False # Auth complete, can serve requests
self.user_sent: bool = False
self.pass_sent: bool = False
self.user_ok: bool = False
@@ -93,14 +102,22 @@ class NewsWrapper:
self.force_login: bool = False
self.group: Optional[str] = None
@property
def status_code(self) -> Optional[int]:
if self.data_position >= 3:
return int_conv(self.data[:3])
# Command queue and concurrency
self.next_request: Optional[tuple[bytes, Optional["sabnzbd.nzb.Article"]]] = None
self.concurrent_requests: threading.BoundedSemaphore = threading.BoundedSemaphore(
self.server.pipelining_requests()
)
self._response_queue: deque[Optional[sabnzbd.nzb.Article]] = deque()
self.selector_events = 0
self.tls_wants_write: bool = False
@property
def nntp_msg(self) -> str:
return ubtou(self.data[: self.data_position]).strip()
def article(self) -> Optional["sabnzbd.nzb.Article"]:
"""The article currently being downloaded"""
with self.lock:
if self._response_queue:
return self._response_queue[0]
return None
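The pipelining machinery introduced above pairs a `BoundedSemaphore` (capping requests in flight) with a FIFO deque (matching responses to requests, since NNTP answers in order). A condensed sketch of that pattern, with illustrative names:

```python
import threading
from collections import deque


class RequestPipeline:
    """Bound in-flight requests with a semaphore; a FIFO deque pairs each
    response with the oldest outstanding request (NNTP replies in order)."""

    def __init__(self, max_inflight: int):
        self.slots = threading.BoundedSemaphore(max_inflight)
        self.inflight: deque = deque()

    def try_send(self, request) -> bool:
        # Non-blocking acquire: the write path must never stall the event loop
        if self.slots.acquire(blocking=False):
            self.inflight.append(request)
            return True
        return False  # Limit reached; wait until a response frees a slot

    def on_response(self):
        request = self.inflight.popleft()
        self.slots.release()  # A new request may be pipelined now
        return request
```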
def init_connect(self):
"""Setup the connection in NNTP object"""
@@ -109,16 +126,18 @@ class NewsWrapper:
raise socket.error(errno.EADDRNOTAVAIL, T("Invalid server address."))
# Construct buffer and NNTP object
self.data = sabctools.bytearray_malloc(NNTP_BUFFER_SIZE)
self.data_view = memoryview(self.data)
self.reset_data_buffer()
self.decoder = sabctools.Decoder(NNTP_BUFFER_SIZE)
self.nntp = NNTP(self, self.server.addrinfo)
self.timeout = time.time() + self.server.timeout
def finish_connect(self, code: int):
# On connect the first "response" will be 200 Welcome
self._response_queue.append(None)
self.concurrent_requests.acquire()
def finish_connect(self, code: int, message: str) -> None:
"""Perform login options"""
if not (self.server.username or self.server.password or self.force_login):
self.connected = True
self.ready = True
self.user_sent = True
self.user_ok = True
self.pass_sent = True
@@ -126,18 +145,17 @@ class NewsWrapper:
if code == 480:
self.force_login = True
self.connected = False
self.ready = False
self.user_sent = False
self.user_ok = False
self.pass_sent = False
self.pass_ok = False
if code in (400, 500, 502):
raise NNTPPermanentError(self.nntp_msg, code)
raise NNTPPermanentError(message, code)
elif not self.user_sent:
command = utob("authinfo user %s\r\n" % self.server.username)
self.nntp.sock.sendall(command)
self.reset_data_buffer()
self.queue_command(command)
self.user_sent = True
elif not self.user_ok:
if code == 381:
@@ -147,108 +165,301 @@ class NewsWrapper:
self.user_ok = True
self.pass_sent = True
self.pass_ok = True
self.connected = True
self.ready = True
if self.user_ok and not self.pass_sent:
command = utob("authinfo pass %s\r\n" % self.server.password)
self.nntp.sock.sendall(command)
self.reset_data_buffer()
self.queue_command(command)
self.pass_sent = True
elif self.user_ok and not self.pass_ok:
if code != 281:
# Assume that login failed (code 481 or other)
raise NNTPPermanentError(self.nntp_msg, code)
raise NNTPPermanentError(message, code)
else:
self.connected = True
self.ready = True
self.timeout = time.time() + self.server.timeout
def body(self):
def queue_command(
self,
command: bytes,
article: Optional["sabnzbd.nzb.Article"] = None,
) -> None:
"""Add a command to the command queue"""
self.next_request = command, article
def body(self, article: "sabnzbd.nzb.Article") -> tuple[bytes, "sabnzbd.nzb.Article"]:
"""Request the body of the article"""
self.timeout = time.time() + self.server.timeout
if self.article.nzf.nzo.precheck:
if article.nzf.nzo.precheck:
if self.server.have_stat:
command = utob("STAT <%s>\r\n" % self.article.article)
command = utob("STAT <%s>\r\n" % article.article)
else:
command = utob("HEAD <%s>\r\n" % self.article.article)
command = utob("HEAD <%s>\r\n" % article.article)
elif self.server.have_body:
command = utob("BODY <%s>\r\n" % self.article.article)
command = utob("BODY <%s>\r\n" % article.article)
else:
command = utob("ARTICLE <%s>\r\n" % self.article.article)
self.nntp.sock.sendall(command)
self.reset_data_buffer()
command = utob("ARTICLE <%s>\r\n" % article.article)
return command, article
def recv_chunk(self) -> Tuple[int, bool, bool]:
"""Receive data, return #bytes, end-of-line, end-of-article"""
# Resize the buffer in the extremely unlikely case that it got full
if self.data_position == len(self.data):
self.nntp.nw.increase_data_buffer()
def on_response(self, response: sabctools.NNTPResponse, article: Optional["sabnzbd.nzb.Article"]) -> None:
"""A response to a NNTP request is received"""
self.concurrent_requests.release()
server = self.server
article_done = response.status_code in (220, 222) and article
# Receive data into the pre-allocated buffer
if self.nntp.nw.server.ssl and not self.nntp.nw.blocking and sabctools.openssl_linked:
if article_done:
with DOWNLOADER_LOCK:
# Update statistics only when we fetched a whole article
# The side effect is that we don't count things like article-not-available messages
article.nzf.nzo.update_download_stats(sabnzbd.BPSMeter.bps, server.id, response.bytes_read)
# Response code depends on request command:
# 220 = ARTICLE, 222 = BODY
if not article_done:
if not self.ready or not article or response.status_code in (281, 381, 480, 481, 482):
self.discard(article, count_article_try=False)
if not sabnzbd.Downloader.finish_connect_nw(self, response):
return
if self.ready:
logging.info("Connecting %s@%s finished", self.thrdnum, server.host)
elif response.status_code == 223:
article_done = True
logging.debug("Article <%s> is present on %s", article.article, server.host)
elif response.status_code in (411, 423, 430, 451):
article_done = True
logging.debug(
"Thread %s@%s: Article %s missing (error=%s)",
self.thrdnum,
server.host,
article.article,
response.status_code,
)
elif response.status_code == 500:
if article.nzf.nzo.precheck:
# Assume "STAT" command is not supported
server.have_stat = False
logging.debug("Server %s does not support STAT", server.host)
else:
# Assume "BODY" command is not supported
server.have_body = False
logging.debug("Server %s does not support BODY", server.host)
self.discard(article, count_article_try=False)
else:
# Don't warn for (internal) server errors during downloading
if response.status_code not in (400, 502, 503):
logging.warning(
T("%s@%s: Received unknown status code %s for article %s"),
self.thrdnum,
server.host,
response.status_code,
article.article,
)
# Ditch this thread, we don't know what data we got now so the buffer can be bad
sabnzbd.Downloader.reset_nw(
self, f"Server error or unknown status code: {response.status_code}", wait=False, article=article
)
return
if article_done:
# Successful data, clear "bad" counter
server.bad_cons = 0
server.errormsg = server.warning = ""
# Decode
sabnzbd.Downloader.decode(article, response)
if sabnzbd.LOG_ALL:
logging.debug("Thread %s@%s: %s done", self.thrdnum, server.host, article.article)
def read(
self,
nbytes: int = 0,
on_response: Optional[Callable[[int, str], None]] = None,
generation: Optional[int] = None,
) -> Tuple[int, Optional[int]]:
"""Receive data, return #bytes, #pendingbytes
:param nbytes: maximum number of bytes to read
:param on_response: callback for each complete response received
:param generation: expected reset generation
:return: #bytes, #pendingbytes
"""
if generation is None:
generation = self.generation
# NewsWrapper is being reset
if self.decoder is None:
return 0, None
# Receive data into the decoder pre-allocated buffer
if not nbytes and self.nntp.nw.server.ssl and not self.nntp.nw.blocking and sabctools.openssl_linked:
# Use patched version when downloading
bytes_recv = sabctools.unlocked_ssl_recv_into(self.nntp.sock, self.data_view[self.data_position :])
bytes_recv = sabctools.unlocked_ssl_recv_into(self.nntp.sock, self.decoder)
else:
bytes_recv = self.nntp.sock.recv_into(self.data_view[self.data_position :])
bytes_recv = self.nntp.sock.recv_into(self.decoder, nbytes=nbytes)
# No data received
if bytes_recv == 0:
raise ConnectionError("Server closed connection")
# Success, move timeout and internal data position
# Success, move timeout
self.timeout = time.time() + self.server.timeout
self.data_position += bytes_recv
self.decoder.process(bytes_recv)
if self.decoder:
for response in self.decoder:
with self.lock:
# Check generation under lock to avoid racing with hard_reset
if self.generation != generation or not self._response_queue:
break
article = self._response_queue.popleft()
if on_response:
on_response(response.status_code, response.message)
self.on_response(response, article)
# After each response this socket may need to be made available to write the next request,
# or removed from socket monitoring to prevent hot looping.
if self.prepare_request():
# There is either a next_request or an inflight request
# If there is a next_request to send, ensure the socket is registered for write events
# Checks before calling modify_socket to prevent locks on the hot path
if self.next_request and self.selector_events != EVENT_READ | EVENT_WRITE:
sabnzbd.Downloader.modify_socket(self, EVENT_READ | EVENT_WRITE)
else:
# Only remove the socket if it's not SSL or has no pending data, otherwise the recursive call may
# call prepare_request again and find a request, but the socket would have already been removed.
if not self.server.ssl or not self.nntp or not self.nntp.sock.pending():
# No further work for this socket
sabnzbd.Downloader.remove_socket(self)
# The SSL-layer might still contain data even though the socket does not. Another Downloader-loop would
# not identify this socket anymore as it is not returned by select(). So, we have to forcefully trigger
# another recv_chunk so the buffer is increased and the data from the SSL-layer is read. See #2752.
if self.nntp.nw.server.ssl and self.data_position == len(self.data) and self.nntp.sock.pending() > 0:
# We do not perform error-handling, as we know there is data available to read
additional_bytes_recv, additional_end_of_line, additional_end_of_article = self.recv_chunk()
return bytes_recv + additional_bytes_recv, additional_end_of_line, additional_end_of_article
if self.server.ssl and self.nntp and (pending := self.nntp.sock.pending()):
return bytes_recv, pending
return bytes_recv, None
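The `generation` check above is a standard guard against use-after-reset races: the reader captures the generation before processing and re-checks it under the lock, so responses arriving after a `hard_reset` are dropped rather than matched against the wrong queue. A minimal sketch of the idea (all names illustrative):

```python
import threading


class ResettableChannel:
    """Results delivered with a stale generation number are discarded,
    so work started before a reset cannot corrupt post-reset state."""

    def __init__(self):
        self.lock = threading.Lock()
        self.generation = 0
        self.results: list = []

    def hard_reset(self) -> None:
        with self.lock:
            self.generation += 1  # Invalidate all in-flight work
            self.results.clear()

    def deliver(self, result, generation: int) -> bool:
        with self.lock:
            if generation != self.generation:
                return False  # Stale: a reset happened after the read began
            self.results.append(result)
            return True
```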
# Check for end of line
# Using the data directly seems faster than the memoryview
if self.data[self.data_position - 2 : self.data_position] == b"\r\n":
# Official end-of-article is "\r\n.\r\n"
if self.data[self.data_position - 5 : self.data_position] == b"\r\n.\r\n":
return bytes_recv, True, True
return bytes_recv, True, False
def prepare_request(self) -> bool:
"""Queue an article request if appropriate."""
server = self.server
# Still in middle of data, so continue!
return bytes_recv, False, False
# Do not pipeline requests until authentication is completed (ready)
if self.ready or not self._response_queue:
server_ready = (
server.active
and not server.restart
and not (
sabnzbd.Downloader.no_active_jobs()
or sabnzbd.Downloader.shutdown
or sabnzbd.Downloader.paused_for_postproc
)
)
def soft_reset(self):
"""Reset for the next article"""
self.timeout = None
self.article = None
self.reset_data_buffer()
if server_ready:
# Queue the next article if none exists
if not self.next_request and (article := server.get_article()):
self.next_request = self.body(article)
return True
else:
# Server not ready, discard any queued next_request
if self.next_request and self.next_request[1]:
self.discard(self.next_request[1], count_article_try=False, retry_article=True)
self.next_request = None
def reset_data_buffer(self):
"""Reset the data position"""
self.data_position = 0
# Return True if there is work queued or in flight
return bool(self.next_request or self._response_queue)
def increase_data_buffer(self):
"""Resize the buffer in the extremely unlikely case that it overflows"""
# Sanity check before we go any further
if len(self.data) > NTTP_MAX_BUFFER_SIZE:
raise BufferError("Maximum data buffer size exceeded")
def write(self):
"""Send data to server"""
server = self.server
# Input needs to be integer, floats don't work
new_buffer = sabctools.bytearray_malloc(len(self.data) + NNTP_BUFFER_SIZE // 2)
new_buffer[: len(self.data)] = self.data
logging.info("Increased buffer from %d to %d for %s", len(self.data), len(new_buffer), str(self))
self.data = new_buffer
self.data_view = memoryview(self.data)
try:
# Flush any buffered data
if self.nntp.write_buffer:
sent = self.nntp.sock.send(self.nntp.write_buffer)
self.nntp.write_buffer = self.nntp.write_buffer[sent:]
# If buffer still has data, wait for next write opportunity
if self.nntp.write_buffer:
return
# If available, try to send new command
if self.prepare_request():
# Nothing to send but already requests in-flight
if not self.next_request:
sabnzbd.Downloader.modify_socket(self, EVENT_READ)
return
if self.concurrent_requests.acquire(blocking=False):
command, article = self.next_request
if article:
nzo = article.nzf.nzo
if nzo.removed_from_queue or nzo.status is Status.PAUSED and nzo.priority is not FORCE_PRIORITY:
self.discard(article, count_article_try=False, retry_article=True)
self.concurrent_requests.release()
self.next_request = None
return
if sabnzbd.LOG_ALL:
logging.debug("Thread %s@%s: %s", self.thrdnum, server.host, command)
# Non-blocking send - buffer any unsent data
sent = self.nntp.sock.send(command)
if sent < len(command):
logging.debug("%s@%s: Partial send", self.thrdnum, server.host)
self.nntp.write_buffer = command[sent:]
self._response_queue.append(article)
self.next_request = None
else:
# Concurrency limit reached; wait until a response is read to prevent hot looping on EVENT_WRITE
sabnzbd.Downloader.modify_socket(self, EVENT_READ)
else:
# No further work for this socket
sabnzbd.Downloader.remove_socket(self)
except ssl.SSLWantWriteError:
# Socket not ready for writing, keep buffer and wait for next write event
pass
except ssl.SSLWantReadError:
# SSL renegotiation needs read first
sabnzbd.Downloader.modify_socket(self, EVENT_READ)
except BlockingIOError:
# Socket not ready for writing, keep buffer and wait for next write event
pass
except socket.error as err:
logging.info("Looks like server closed connection: %s, type: %s", err, type(err))
sabnzbd.Downloader.reset_nw(self, "Server broke off connection", warn=True, wait=False)
except Exception:
logging.error(T("Suspect error in downloader"))
logging.info("Traceback: ", exc_info=True)
sabnzbd.Downloader.reset_nw(self, "Server broke off connection", warn=True)
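The non-blocking send path above keeps whatever `send()` could not take in `write_buffer` and retries it on the next write event, always flushing old bytes before new commands. A self-contained sketch of that flush logic, using a fake socket to make the partial sends visible (names are illustrative):

```python
class FakeSocket:
    """Accepts at most `limit` bytes per send(), like a congested socket."""

    def __init__(self, limit: int):
        self.limit = limit
        self.wire = b""

    def send(self, data: bytes) -> int:
        n = min(self.limit, len(data))
        self.wire += data[:n]
        return n  # May be less than len(data): a partial send


def buffered_send(sock, pending: bytes) -> bytes:
    """Try to send all pending bytes; return the unsent tail for the next write event."""
    if not pending:
        return b""
    sent = sock.send(pending)
    return pending[sent:]
```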
@synchronized(DOWNLOADER_LOCK)
def hard_reset(self, wait: bool = True):
"""Destroy and restart"""
if self.nntp:
self.nntp.close(send_quit=self.connected)
self.nntp = None
with self.lock:
# Drain unsent requests
if self.next_request:
_, article = self.next_request
if article:
self.discard(article, count_article_try=False, retry_article=True)
self.next_request = None
# Drain responses
while self._response_queue:
if article := self._response_queue.popleft():
self.discard(article, count_article_try=False, retry_article=True)
# Reset all variables (including the NNTP connection)
self.__init__(self.server, self.thrdnum)
if self.nntp:
sabnzbd.Downloader.remove_socket(self)
self.nntp.close(send_quit=self.ready)
self.nntp = None
# Reset all variables (including the NNTP connection) and increment the generation counter
self.__init__(self.server, self.thrdnum, generation=self.generation + 1)
# Wait before re-using this newswrapper
if wait:
@@ -258,18 +469,40 @@ class NewsWrapper:
# Reset for internal reasons, just wait 5 sec
self.timeout = time.time() + 5
def discard(
self,
article: Optional["sabnzbd.nzb.Article"],
count_article_try: bool = True,
retry_article: bool = True,
) -> None:
"""Discard an article back to the queue"""
if article and not article.nzf.nzo.removed_from_queue:
# Only some errors should count towards the total tries for each server
if count_article_try:
article.tries += 1
# Do we discard, or try again for this server
if not retry_article or (not self.server.required and article.tries > sabnzbd.cfg.max_art_tries()):
# Too many tries on this server, consider article missing
sabnzbd.Downloader.decode(article)
article.tries = 0
else:
# Allow all servers again for this article
# Do not use the article_queue, as the server could already have been disabled when we get here!
article.allow_new_fetcher()
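The retry budget in `discard` reduces to a pure decision: required servers retry indefinitely, other servers give up once an article exceeds the configured try count. A hedged sketch of just that decision (the helper is illustrative; the real method also mutates article state):

```python
def discard_outcome(tries: int, server_required: bool, max_art_tries: int, retry_article: bool = True) -> str:
    """Return "missing" when the article should be decoded as failed,
    "retry" when all servers may fetch it again."""
    if not retry_article or (not server_required and tries > max_art_tries):
        return "missing"  # Try budget exhausted, consider article missing
    return "retry"  # Allow all servers again for this article
```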
def __repr__(self):
return "<NewsWrapper: server=%s:%s, thread=%s, connected=%s>" % (
self.server.host,
self.server.port,
self.thrdnum,
self.connected,
self.ready,
)
class NNTP:
# Pre-define attributes to save memory
__slots__ = ("nw", "addrinfo", "error_msg", "sock", "fileno", "closed")
__slots__ = ("nw", "addrinfo", "error_msg", "sock", "fileno", "closed", "write_buffer")
def __init__(self, nw: NewsWrapper, addrinfo: AddrInfo):
self.nw: NewsWrapper = nw
@@ -280,6 +513,9 @@ class NNTP:
# Prevent closing this socket until it's done connecting
self.closed = False
# Buffer for non-blocking writes
self.write_buffer: bytes = b""
# Create SSL-context if it is needed and not created yet
if self.nw.server.ssl and not self.nw.server.ssl_context:
# Setup the SSL socket
@@ -379,7 +615,8 @@ class NNTP:
# Locked, so it can't interleave with any of the Downloader "__nw" actions
with DOWNLOADER_LOCK:
if not self.closed:
sabnzbd.Downloader.add_socket(self.fileno, self.nw)
self.nw.connected = True
sabnzbd.Downloader.add_socket(self.nw)
except OSError as e:
self.error(e)
@@ -436,6 +673,8 @@ class NNTP:
else:
logging.warning(msg)
self.nw.server.warning = msg
# No reset-warning needed, above logging is sufficient
sabnzbd.Downloader.reset_nw(self.nw)
@synchronized(DOWNLOADER_LOCK)
def close(self, send_quit: bool):
@@ -443,10 +682,12 @@ class NNTP:
Locked to match connect(), even though most likely the caller already holds the same lock."""
# Set status first, so any calls in connect/error are handled correctly
self.closed = True
self.write_buffer = b""
try:
if send_quit:
self.sock.sendall(b"QUIT\r\n")
time.sleep(0.01)
with suppress(socket.error):
self.sock.sendall(b"QUIT\r\n")
time.sleep(0.01)
self.sock.close()
except Exception as e:
logging.info("%s@%s: Failed to close socket (error=%s)", self.nw.thrdnum, self.nw.server.host, str(e))


@@ -20,7 +20,6 @@
sabnzbd.notifier - Send notifications to any notification services
"""
import sys
import os.path
import logging
@@ -31,7 +30,7 @@ import http.client
import json
import apprise
from threading import Thread
from typing import Optional, Dict, Union
from typing import Optional, Union
import sabnzbd
import sabnzbd.cfg
@@ -160,7 +159,7 @@ def send_notification(
msg: str,
notification_type: str,
job_cat: Optional[str] = None,
actions: Optional[Dict[str, str]] = None,
actions: Optional[dict[str, str]] = None,
):
"""Send Notification message"""
logging.info("Sending notification: %s - %s (type=%s, job_cat=%s)", title, msg, notification_type, job_cat)
@@ -243,7 +242,7 @@ def send_notify_osd(title, message):
return error
def send_notification_center(title: str, msg: str, notification_type: str, actions: Optional[Dict[str, str]] = None):
def send_notification_center(title: str, msg: str, notification_type: str, actions: Optional[dict[str, str]] = None):
"""Send message to macOS Notification Center.
Only 1 button is possible on macOS!"""
logging.debug("Sending macOS notification")
@@ -531,7 +530,7 @@ def send_nscript(title, msg, notification_type, force=False, test=None):
return ""
def send_windows(title: str, msg: str, notification_type: str, actions: Optional[Dict[str, str]] = None):
def send_windows(title: str, msg: str, notification_type: str, actions: Optional[dict[str, str]] = None):
"""Send Windows notifications, either fancy with buttons (Windows 10+) or basic ones"""
# Skip any notifications if ran as a Windows Service, it can result in crashes
if sabnzbd.WIN_SERVICE:

sabnzbd/nzb/__init__.py Normal file

@@ -0,0 +1,56 @@
#!/usr/bin/python3 -OO
# Copyright 2007-2025 by The SABnzbd-Team (sabnzbd.org)
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
sabnzbd.nzb - NZB-related classes and functionality
"""
# Article-related classes
from sabnzbd.nzb.article import Article, ArticleSaver, TryList
# File-related classes
from sabnzbd.nzb.file import NzbFile, NzbFileSaver, SkippedNzbFile
# Object-related classes
from sabnzbd.nzb.object import (
NzbObject,
NzbObjectSaver,
NzoAttributeSaver,
NzbEmpty,
NzbRejected,
NzbPreQueueRejected,
NzbRejectToHistory,
)
__all__ = [
# Article
"Article",
"ArticleSaver",
"TryList",
# File
"NzbFile",
"NzbFileSaver",
"SkippedNzbFile",
# Object
"NzbObject",
"NzbObjectSaver",
"NzoAttributeSaver",
"NzbEmpty",
"NzbRejected",
"NzbPreQueueRejected",
"NzbRejectToHistory",
]

sabnzbd/nzb/article.py Normal file

@@ -0,0 +1,226 @@
#!/usr/bin/python3 -OO
# Copyright 2007-2025 by The SABnzbd-Team (sabnzbd.org)
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
sabnzbd.nzb.article - Article and TryList classes for NZB downloading
"""
import logging
import threading
from typing import Optional
import sabnzbd
from sabnzbd.downloader import Server
from sabnzbd.filesystem import get_new_id
from sabnzbd.decorators import synchronized
##############################################################################
# Trylist
##############################################################################
class TryList:
"""TryList keeps track of which servers have been tried for a specific article"""
# Pre-define attributes to save memory
__slots__ = ("try_list",)
def __init__(self):
# Sets are faster than lists
self.try_list: set[Server] = set()
@synchronized()
def server_in_try_list(self, server: Server) -> bool:
"""Return whether specified server has been tried"""
return server in self.try_list
@synchronized()
def all_servers_in_try_list(self, all_servers: set[Server]) -> bool:
"""Check if all servers have been tried"""
return all_servers.issubset(self.try_list)
@synchronized()
def add_to_try_list(self, server: Server):
"""Register server as having been tried already"""
# Sets cannot contain duplicate items
self.try_list.add(server)
@synchronized()
def remove_from_try_list(self, server: Server):
"""Remove server from list of tried servers"""
# Discard does not require the item to be present
self.try_list.discard(server)
@synchronized()
def reset_try_list(self):
"""Clean the list"""
self.try_list = set()
def __getstate__(self):
"""Save the servers"""
return set(server.id for server in self.try_list)
def __setstate__(self, servers_ids: list[str]):
self.try_list = set()
for server in sabnzbd.Downloader.servers:
if server.id in servers_ids:
self.add_to_try_list(server)
##############################################################################
# Article
##############################################################################
ArticleSaver = (
"article",
"art_id",
"bytes",
"lowest_partnum",
"decoded",
"file_size",
"data_begin",
"data_size",
"on_disk",
"nzf",
"crc32",
"decoded_size",
)
class Article(TryList):
"""Representation of one article"""
# Pre-define attributes to save memory
__slots__ = ArticleSaver + ("fetcher", "fetcher_priority", "tries", "lock")
def __init__(self, article, article_bytes, nzf):
super().__init__()
self.article: str = article
self.art_id: Optional[str] = None
self.bytes: int = article_bytes
self.lowest_partnum: bool = False
self.fetcher: Optional[Server] = None
self.fetcher_priority: int = 0
self.tries: int = 0 # Try count
self.decoded: bool = False
self.file_size: Optional[int] = None
self.data_begin: Optional[int] = None
self.data_size: Optional[int] = None
self.decoded_size: Optional[int] = None # Size of the decoded article
self.on_disk: bool = False
self.crc32: Optional[int] = None
self.nzf: "sabnzbd.nzb.NzbFile" = nzf # NzbFile reference
# Share NzbFile lock for file-wide atomicity of try-list ops
self.lock: threading.RLock = nzf.lock
@synchronized()
def reset_try_list(self):
"""In addition to resetting the try list, also reset fetcher so all servers
are tried again. Locked so fetcher setting changes are also protected."""
self.fetcher = None
self.fetcher_priority = 0
super().reset_try_list()
def allow_new_fetcher(self, remove_fetcher_from_try_list: bool = True):
"""Let article get new fetcher and reset try lists of file and job.
Locked so all resets are performed at once.
Must acquire nzo lock first, then nzf lock (which is self.lock) to prevent deadlock."""
with self.nzf.nzo.lock, self.lock:
if remove_fetcher_from_try_list:
self.remove_from_try_list(self.fetcher)
self.fetcher = None
self.tries = 0
self.nzf.reset_try_list()
self.nzf.nzo.reset_try_list()
def get_article(self, server: Server, servers: list[Server]):
"""Return article when appropriate for specified server"""
if self.fetcher or self.server_in_try_list(server):
return None
if server.priority > self.fetcher_priority:
# Check for higher priority server, taking advantage of servers list being sorted by priority
for server_check in servers:
if server_check.priority < server.priority:
if server_check.active and not self.server_in_try_list(server_check):
# There is a higher priority server, so set article priority and return
self.fetcher_priority = server_check.priority
return None
else:
# All servers with a higher priority have been checked
break
# If no higher priority servers, use this server
self.fetcher_priority = server.priority
self.fetcher = server
self.tries += 1
return self
def get_art_id(self):
"""Return unique article storage name, create if needed"""
if not self.art_id:
self.art_id = get_new_id("article", self.nzf.nzo.admin_path)
return self.art_id
def search_new_server(self):
"""Search for a new server for this article"""
# Since we need a new server, this one can be listed as failed
sabnzbd.BPSMeter.register_server_article_failed(self.fetcher.id)
self.add_to_try_list(self.fetcher)
# Servers-list could be modified during iteration, so we need a copy
for server in sabnzbd.Downloader.servers[:]:
if server.active and not self.server_in_try_list(server):
if server.priority >= self.fetcher.priority:
self.tries = 0
# Allow all servers for this nzo and nzf again (but not this fetcher for this article)
self.allow_new_fetcher(remove_fetcher_from_try_list=False)
return True
logging.info("Article %s unavailable on all servers, discarding", self.article)
return False
@property
def can_direct_write(self) -> bool:
return bool(
self.data_size # decoder sets data_size to 0 when offsets or file_size are outside allowed range
and self.nzf.type == "yenc"
and self.nzf.prepare_filepath()
)
def __getstate__(self):
"""Save to pickle file, selecting attributes"""
dict_ = {}
for item in ArticleSaver:
dict_[item] = getattr(self, item)
dict_["try_list"] = super().__getstate__()
return dict_
def __setstate__(self, dict_):
"""Load from pickle file, selecting attributes"""
for item in ArticleSaver:
try:
setattr(self, item, dict_[item])
except KeyError:
# Handle new attributes
setattr(self, item, None)
self.lock = threading.RLock()
super().__setstate__(dict_.get("try_list", []))
self.fetcher = None
self.fetcher_priority = 0
self.tries = 0
def __repr__(self):
return "<Article: article=%s, bytes=%s, art_id=%s>" % (self.article, self.bytes, self.art_id)

sabnzbd/nzb/file.py

@@ -0,0 +1,387 @@
#!/usr/bin/python3 -OO
# Copyright 2007-2025 by The SABnzbd-Team (sabnzbd.org)
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
sabnzbd.nzb.file - NzbFile class for representing files in NZB downloads
"""
import datetime
import logging
import os
import threading
from typing import Optional, Any
import sabctools
from sabnzbd.nzb.article import TryList, Article
from sabnzbd.downloader import Server
from sabnzbd.filesystem import (
sanitize_filename,
get_unique_filename,
get_filename,
remove_file,
get_new_id,
save_data,
load_data,
RAR_RE,
)
from sabnzbd.misc import int_conv, subject_name_extractor
from sabnzbd.decorators import synchronized
##############################################################################
# NzbFile
##############################################################################
class SkippedNzbFile(Exception):
pass
NzbFileSaver = (
"date",
"filename",
"filename_checked",
"filepath",
"type",
"is_par2",
"vol",
"blocks",
"setname",
"articles",
"decodetable",
"bytes",
"bytes_left",
"nzo",
"nzf_id",
"deleted",
"import_finished",
"crc32",
"assembled",
"md5of16k",
)
class NzbFile(TryList):
"""Representation of one file consisting of multiple articles"""
# Pre-define attributes to save memory
__slots__ = NzbFileSaver + ("lock", "file_lock", "assembler_next_index")
def __init__(self, date, subject, raw_article_db, file_bytes, nzo):
"""Setup object"""
super().__init__()
self.lock: threading.RLock = threading.RLock()
self.file_lock: threading.RLock = threading.RLock()
self.date: datetime.datetime = date
self.type: Optional[str] = None
self.filename: str = sanitize_filename(subject_name_extractor(subject))
self.filename_checked = False
self.filepath: Optional[str] = None
# Identifiers for par2 files
self.is_par2: bool = False
self.vol: Optional[int] = None
self.blocks: Optional[int] = None
self.setname: Optional[str] = None
# Articles are removed from "articles" after being fetched
self.articles: dict[Article, Article] = {}
self.decodetable: list[Article] = []
self.bytes: int = file_bytes
self.bytes_left: int = file_bytes
self.nzo = nzo # NzbObject reference
self.deleted = False
self.import_finished = False
self.crc32: Optional[int] = 0
self.assembled: bool = False
self.md5of16k: Optional[bytes] = None
self.assembler_next_index: int = 0
# Add first article to decodetable, this way we can check
# if this is maybe a duplicate nzf
if raw_article_db:
first_article = self.add_article(raw_article_db.pop(0))
first_article.lowest_partnum = True
if self in nzo.files:
logging.info("File %s occurred twice in NZB, skipping", self.filename)
raise SkippedNzbFile
# Create file on disk, which can fail in case of disk errors
self.nzf_id: str = get_new_id("nzf", nzo.admin_path)
if not self.nzf_id:
# Error already shown to user from get_new_id
raise SkippedNzbFile
# Any articles left?
if raw_article_db:
# Save the rest
save_data(raw_article_db, self.nzf_id, nzo.admin_path)
else:
# All imported
self.import_finished = True
@property
@synchronized()
def assembler_next_article(self) -> Optional[Article]:
if (next_index := self.assembler_next_index) < len(self.decodetable):
return self.decodetable[next_index]
return None
def finish_import(self):
"""Load the article objects from disk"""
logging.debug("Finishing import on %s", self.filename)
if raw_article_db := load_data(self.nzf_id, self.nzo.admin_path, remove=False):
for raw_article in raw_article_db:
self.add_article(raw_article)
# Make sure we have labeled the lowest part number
# Also when DirectUnpack is disabled we need to know
self.decodetable[0].lowest_partnum = True
# Mark safe to continue
self.import_finished = True
@synchronized()
def add_article(self, article_info):
"""Add article to object database and return article object"""
article = Article(article_info[0], article_info[1], self)
self.articles[article] = article
self.decodetable.append(article)
return article
@synchronized()
def remove_article(self, article: Article, success: bool) -> int:
"""Handle completed article, possibly end of file"""
if self.articles.pop(article, None) is not None:
if success:
self.bytes_left -= article.bytes
return len(self.articles)
def set_par2(self, setname, vol, blocks):
"""Designate this file as a par2 file"""
self.is_par2 = True
self.setname = setname
self.vol = vol
self.blocks = int_conv(blocks)
@synchronized()
def update_crc32(self, crc32: Optional[int], length: int) -> None:
if self.crc32 is None or crc32 is None:
self.crc32 = None
else:
self.crc32 = sabctools.crc32_combine(self.crc32, crc32, length)
@synchronized()
def get_articles(self, server: Server, servers: list[Server], fetch_limit: int):
"""Get next articles to be downloaded"""
articles = server.article_queue
for article in self.articles:
if article := article.get_article(server, servers):
articles.append(article)
if len(articles) >= fetch_limit:
return
self.add_to_try_list(server)
@synchronized()
def reset_all_try_lists(self):
"""Reset all try lists. Locked so reset is performed
for all items at the same time without chance of another
thread changing any of the items while we are resetting"""
for art in self.articles:
art.reset_try_list()
self.reset_try_list()
def first_article_processed(self) -> bool:
"""Check if the first article has been processed.
This ensures we have attempted to extract md5of16k and filename information
before creating the filepath.
"""
# The first article of decodetable is always the lowest
first_article = self.decodetable[0]
# If it's still in nzo.first_articles, it hasn't been processed yet
return first_article not in self.nzo.first_articles
def prepare_filepath(self):
"""Do all checks before making the final path"""
if not self.filepath:
# Wait for the first article to be processed so we can get md5of16k
# and proper filename before creating the filepath
if not self.first_article_processed():
return None
self.nzo.verify_nzf_filename(self)
filename = sanitize_filename(self.filename)
self.filepath = get_unique_filename(os.path.join(self.nzo.download_path, filename))
self.filename = get_filename(self.filepath)
return self.filepath
@property
def completed(self):
"""Is this file completed?"""
if not self.import_finished:
return False
with self.lock:
return not self.articles
def remove_admin(self):
"""Remove article database from disk (sabnzbd_nzf_<id>)"""
try:
logging.debug("Removing article database for %s", self.nzf_id)
remove_file(os.path.join(self.nzo.admin_path, self.nzf_id))
except Exception:
pass
@synchronized()
def contiguous_offset(self) -> int:
"""The next file offset to write to continue sequentially.
Note: there could be non-sequential direct writes already beyond this point
"""
# If last written article has valid yenc headers
if self.assembler_next_index:
article = self.decodetable[self.assembler_next_index - 1]
if article.on_disk and article.data_size:
return article.data_begin + article.data_size
# Fallback to summing decoded size
offset = 0
for article in self.decodetable[: self.assembler_next_index]:
if not article.on_disk:
break
if article.data_size:
offset = article.data_begin + article.decoded_size
elif article.decoded_size is not None:
# queues from <= 4.5.5 do not have this attribute
offset += article.decoded_size
elif os.path.exists(self.filepath):
# fallback for <= 4.5.5 because files were always opened in append mode, so use the file size
return os.path.getsize(self.filepath)
return offset
@synchronized()
def contiguous_ready_bytes(self) -> int:
"""How many bytes from assembler_next_index onward are ready to write to file contiguously?"""
bytes_ready: int = 0
for article in self.decodetable[self.assembler_next_index :]:
if not article.decoded:
break
if article.on_disk:
continue
if article.decoded_size is None:
break
bytes_ready += article.decoded_size
return bytes_ready
def sort_key(self) -> tuple[Any, ...]:
"""Comparison function for sorting NZB files.
The comparison will sort .par2 files to the top of the queue followed by .rar files,
they will then be sorted by name.
"""
name = self.filename.lower()
base, ext = os.path.splitext(name)
is_par2 = ext == ".par2"
is_vol_par2 = is_par2 and ".vol" in base
is_mini_par2 = is_par2 and not is_vol_par2
m = RAR_RE.search(name)
is_rar = bool(m)
is_main_rar = is_rar and m.group(1) == "rar"
# Initially group by mini-par2, other files, vol-par2
if is_mini_par2:
tier = 0
elif is_vol_par2:
tier = 2
else:
tier = 1
if tier == 1:
if is_rar and m:
# strip matched RAR suffix including leading dot (.part01.rar, .rar, .r00, ...)
group_base = name[: m.start()]
local_group = 0
type_rank = 0 if is_main_rar else 1
else:
# nfo, sfv, sample.mkv, etc.
group_base = base
local_group = 1
type_rank = 0
else:
# mini/vol par2 ignore the group base
group_base = ""
local_group = 0
type_rank = 0
return tier, group_base, local_group, type_rank, name
def __getstate__(self):
"""Save to pickle file, selecting attributes"""
dict_ = {}
for item in NzbFileSaver:
dict_[item] = getattr(self, item)
dict_["try_list"] = super().__getstate__()
return dict_
def __setstate__(self, dict_):
"""Load from pickle file, selecting attributes"""
for item in NzbFileSaver:
try:
setattr(self, item, dict_[item])
except KeyError:
# Handle new attributes
setattr(self, item, None)
self.lock = threading.RLock()
self.file_lock = threading.RLock()
self.assembler_next_index = 0
if isinstance(self.articles, list):
# Converted from list to dict
self.articles = {x: x for x in self.articles}
for article in self.articles:
article.lock = self.lock
super().__setstate__(dict_.get("try_list", []))
def __lt__(self, other: "NzbFile"):
return self.sort_key() < other.sort_key()
def __eq__(self, other: "NzbFile"):
"""Assume it's the same file if the number bytes and first article
are the same or if there are no articles left, use the filenames.
Some NZB's are just a mess and report different sizes for the same article.
We used to compare (__eq__) articles based on article-ID, however, this failed
because some NZB's had the same article-ID twice within one NZF.
"""
if other and (self.bytes == other.bytes or len(self.decodetable) == len(other.decodetable)):
if self.decodetable and other.decodetable:
return self.decodetable[0].article == other.decodetable[0].article
# Fallback to filename comparison
return self.filename == other.filename
return False
def __hash__(self):
"""Required because we implement eq. The same file can be spread
over multiple NZO's so we make every NZF unique. Even though
it's considered bad practice.
"""
return id(self)
def __repr__(self):
return "<NzbFile: filename=%s, bytes=%s, nzf_id=%s>" % (self.filename, self.bytes, self.nzf_id)


@@ -16,21 +16,22 @@
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
sabnzbd.nzbstuff - misc
sabnzbd.nzb.object - NzbObject class for representing NZB download jobs
"""
import os
import time
import re
import logging
import datetime
import threading
import functools
import difflib
from typing import List, Dict, Any, Tuple, Optional, Union, BinaryIO, Set
from typing import Any, Optional, Union, BinaryIO, Deque
# SABnzbd modules
import sabnzbd
import sabctools
from sabnzbd.nzb.article import TryList, Article
from sabnzbd.nzb.file import NzbFile
from sabnzbd.constants import (
GIGI,
ATTRIB_FILE,
@@ -60,6 +61,8 @@ from sabnzbd.misc import (
opts_to_pp,
pp_to_opts,
duplicate_warning,
scan_password,
subject_name_extractor,
)
from sabnzbd.filesystem import (
sanitize_foldername,
@@ -89,432 +92,18 @@ from sabnzbd.filesystem import (
remove_data,
strip_extensions,
get_ext,
create_work_name,
RAR_RE,
)
from sabnzbd.par2file import FilePar2Info, has_par2_in_filename, analyse_par2, parse_par2_file, is_par2_file
from sabnzbd.decorators import synchronized
import sabnzbd.config as config
import sabnzbd.cfg as cfg
import sabnzbd.nzbparser
from sabnzbd.downloader import Server
from sabnzbd.database import HistoryDB
from sabnzbd.deobfuscate_filenames import is_probably_obfuscated
# Name patterns
# In the subject, we expect the filename within double quotes
RE_SUBJECT_FILENAME_QUOTES = re.compile(r'"([^"]*)"')
# Otherwise something that looks like a filename
RE_SUBJECT_BASIC_FILENAME = re.compile(r"\b([\w\-+()' .,]+(?:\[[\w\-/+()' .,]*][\w\-+()' .,]*)*\.[A-Za-z0-9]{2,4})\b")
RE_RAR = re.compile(r"(\.rar|\.r\d\d|\.s\d\d|\.t\d\d|\.u\d\d|\.v\d\d)$", re.I)
##############################################################################
# Trylist
##############################################################################
TRYLIST_LOCK = threading.RLock()
class TryList:
"""TryList keeps track of which servers have been tried for a specific article"""
# Pre-define attributes to save memory
__slots__ = ("try_list",)
def __init__(self):
# Sets are faster than lists
self.try_list: Set[Server] = set()
def server_in_try_list(self, server: Server) -> bool:
"""Return whether specified server has been tried"""
with TRYLIST_LOCK:
return server in self.try_list
def all_servers_in_try_list(self, all_servers: Set[Server]) -> bool:
"""Check if all servers have been tried"""
with TRYLIST_LOCK:
return all_servers.issubset(self.try_list)
def add_to_try_list(self, server: Server):
"""Register server as having been tried already"""
with TRYLIST_LOCK:
# Sets cannot contain duplicate items
self.try_list.add(server)
def remove_from_try_list(self, server: Server):
"""Remove server from list of tried servers"""
with TRYLIST_LOCK:
# Discard does not require the item to be present
self.try_list.discard(server)
def reset_try_list(self):
"""Clean the list"""
with TRYLIST_LOCK:
self.try_list = set()
def __getstate__(self):
"""Save the servers"""
return set(server.id for server in self.try_list)
def __setstate__(self, servers_ids: List[str]):
self.try_list = set()
for server in sabnzbd.Downloader.servers:
if server.id in servers_ids:
self.add_to_try_list(server)
##############################################################################
# Article
##############################################################################
ArticleSaver = (
"article",
"art_id",
"bytes",
"lowest_partnum",
"decoded",
"file_size",
"data_begin",
"data_size",
"on_disk",
"nzf",
"crc32",
)
class Article(TryList):
"""Representation of one article"""
# Pre-define attributes to save memory
__slots__ = ArticleSaver + ("fetcher", "fetcher_priority", "tries")
def __init__(self, article, article_bytes, nzf):
super().__init__()
self.article: str = article
self.art_id: Optional[str] = None
self.bytes: int = article_bytes
self.lowest_partnum: bool = False
self.fetcher: Optional[Server] = None
self.fetcher_priority: int = 0
self.tries: int = 0 # Try count
self.decoded: bool = False
self.file_size: Optional[int] = None
self.data_begin: Optional[int] = None
self.data_size: Optional[int] = None
self.on_disk: bool = False
self.crc32: Optional[int] = None
self.nzf: NzbFile = nzf
@synchronized(TRYLIST_LOCK)
def reset_try_list(self):
"""In addition to resetting the try list, also reset fetcher so all servers
are tried again. Locked so fetcher setting changes are also protected."""
self.fetcher = None
self.fetcher_priority = 0
super().reset_try_list()
@synchronized(TRYLIST_LOCK)
def allow_new_fetcher(self, remove_fetcher_from_try_list: bool = True):
"""Let article get new fetcher and reset try lists of file and job.
Locked so all resets are performed at once"""
if remove_fetcher_from_try_list:
self.remove_from_try_list(self.fetcher)
self.fetcher = None
self.tries = 0
self.nzf.reset_try_list()
self.nzf.nzo.reset_try_list()
def get_article(self, server: Server, servers: List[Server]):
"""Return article when appropriate for specified server"""
if self.fetcher or self.server_in_try_list(server):
return None
if server.priority > self.fetcher_priority:
# Check for higher priority server, taking advantage of servers list being sorted by priority
for server_check in servers:
if server_check.priority < server.priority:
if server_check.active and not self.server_in_try_list(server_check):
# There is a higher priority server, so set article priority and return
self.fetcher_priority = server_check.priority
return None
else:
# All servers with a higher priority have been checked
break
# If no higher priority servers, use this server
self.fetcher_priority = server.priority
self.fetcher = server
self.tries += 1
return self
def get_art_id(self):
"""Return unique article storage name, create if needed"""
if not self.art_id:
self.art_id = get_new_id("article", self.nzf.nzo.admin_path)
return self.art_id
def search_new_server(self):
"""Search for a new server for this article"""
# Since we need a new server, this one can be listed as failed
sabnzbd.BPSMeter.register_server_article_failed(self.fetcher.id)
self.add_to_try_list(self.fetcher)
# Servers-list could be modified during iteration, so we need a copy
for server in sabnzbd.Downloader.servers[:]:
if server.active and not self.server_in_try_list(server):
if server.priority >= self.fetcher.priority:
self.tries = 0
# Allow all servers for this nzo and nzf again (but not this fetcher for this article)
self.allow_new_fetcher(remove_fetcher_from_try_list=False)
return True
logging.info("Article %s unavailable on all servers, discarding", self.article)
return False
def __getstate__(self):
"""Save to pickle file, selecting attributes"""
dict_ = {}
for item in ArticleSaver:
dict_[item] = getattr(self, item)
dict_["try_list"] = super().__getstate__()
return dict_
def __setstate__(self, dict_):
"""Load from pickle file, selecting attributes"""
for item in ArticleSaver:
try:
setattr(self, item, dict_[item])
except KeyError:
# Handle new attributes
setattr(self, item, None)
super().__setstate__(dict_.get("try_list", []))
self.fetcher = None
self.fetcher_priority = 0
self.tries = 0
def __repr__(self):
return "<Article: article=%s, bytes=%s, art_id=%s>" % (self.article, self.bytes, self.art_id)
##############################################################################
# NzbFile
##############################################################################
class SkippedNzbFile(Exception):
pass
NzbFileSaver = (
"date",
"filename",
"filename_checked",
"filepath",
"type",
"is_par2",
"vol",
"blocks",
"setname",
"articles",
"decodetable",
"bytes",
"bytes_left",
"nzo",
"nzf_id",
"deleted",
"import_finished",
"crc32",
"assembled",
"md5of16k",
)
class NzbFile(TryList):
"""Representation of one file consisting of multiple articles"""
# Pre-define attributes to save memory
__slots__ = NzbFileSaver
def __init__(self, date, subject, raw_article_db, file_bytes, nzo):
"""Setup object"""
super().__init__()
self.date: datetime.datetime = date
self.type: Optional[str] = None
self.filename: str = sanitize_filename(name_extractor(subject))
self.filename_checked = False
self.filepath: Optional[str] = None
# Identifiers for par2 files
self.is_par2: bool = False
self.vol: Optional[int] = None
self.blocks: Optional[int] = None
self.setname: Optional[str] = None
# Articles are removed from "articles" after being fetched
self.articles: List[Article] = []
self.decodetable: List[Article] = []
self.bytes: int = file_bytes
self.bytes_left: int = file_bytes
self.nzo: NzbObject = nzo
self.deleted = False
self.import_finished = False
self.crc32: Optional[int] = 0
self.assembled: bool = False
self.md5of16k: Optional[bytes] = None
# Add first article to decodetable, this way we can check
# if this is maybe a duplicate nzf
if raw_article_db:
first_article = self.add_article(raw_article_db.pop(0))
first_article.lowest_partnum = True
if self in nzo.files:
logging.info("File %s occurred twice in NZB, skipping", self.filename)
raise SkippedNzbFile
# Create file on disk, which can fail in case of disk errors
self.nzf_id: str = get_new_id("nzf", nzo.admin_path)
if not self.nzf_id:
# Error already shown to user from get_new_id
raise SkippedNzbFile
# Any articles left?
if raw_article_db:
# Save the rest
save_data(raw_article_db, self.nzf_id, nzo.admin_path)
else:
# All imported
self.import_finished = True
def finish_import(self):
"""Load the article objects from disk"""
logging.debug("Finishing import on %s", self.filename)
if raw_article_db := load_data(self.nzf_id, self.nzo.admin_path, remove=False):
for raw_article in raw_article_db:
self.add_article(raw_article)
# Make sure we have labeled the lowest part number
# Also when DirectUnpack is disabled we need to know
self.decodetable[0].lowest_partnum = True
# Mark safe to continue
self.import_finished = True
def add_article(self, article_info):
"""Add article to object database and return article object"""
article = Article(article_info[0], article_info[1], self)
self.articles.append(article)
self.decodetable.append(article)
return article
def remove_article(self, article: Article, success: bool) -> int:
"""Handle completed article, possibly end of file"""
if article in self.articles:
self.articles.remove(article)
if success:
self.bytes_left -= article.bytes
return len(self.articles)
def set_par2(self, setname, vol, blocks):
"""Designate this file as a par2 file"""
self.is_par2 = True
self.setname = setname
self.vol = vol
self.blocks = int_conv(blocks)
def update_crc32(self, crc32: Optional[int], length: int) -> None:
if self.crc32 is None or crc32 is None:
self.crc32 = None
else:
self.crc32 = sabctools.crc32_combine(self.crc32, crc32, length)
def get_articles(self, server: Server, servers: List[Server], fetch_limit: int) -> List[Article]:
"""Get next articles to be downloaded"""
articles = []
for article in self.articles:
if article := article.get_article(server, servers):
articles.append(article)
if len(articles) >= fetch_limit:
return articles
self.add_to_try_list(server)
return articles
@synchronized(TRYLIST_LOCK)
def reset_all_try_lists(self):
"""Reset all try lists. Locked so reset is performed
for all items at the same time without chance of another
thread changing any of the items while we are resetting"""
for art in self.articles:
art.reset_try_list()
self.reset_try_list()
def prepare_filepath(self):
"""Do all checks before making the final path"""
if not self.filepath:
self.nzo.verify_nzf_filename(self)
filename = sanitize_filename(self.filename)
self.filepath = get_unique_filename(os.path.join(self.nzo.download_path, filename))
self.filename = get_filename(self.filepath)
return self.filepath
@property
def completed(self):
"""Is this file completed?"""
return self.import_finished and not bool(self.articles)
def remove_admin(self):
"""Remove article database from disk (sabnzbd_nzf_<id>)"""
try:
logging.debug("Removing article database for %s", self.nzf_id)
remove_file(os.path.join(self.nzo.admin_path, self.nzf_id))
except Exception:
pass
def __getstate__(self):
"""Save to pickle file, selecting attributes"""
dict_ = {}
for item in NzbFileSaver:
dict_[item] = getattr(self, item)
dict_["try_list"] = super().__getstate__()
return dict_
def __setstate__(self, dict_):
"""Load from pickle file, selecting attributes"""
for item in NzbFileSaver:
try:
setattr(self, item, dict_[item])
except KeyError:
# Handle new attributes
setattr(self, item, None)
super().__setstate__(dict_.get("try_list", []))
def __eq__(self, other: "NzbFile"):
"""Assume it's the same file if the number bytes and first article
are the same or if there are no articles left, use the filenames.
Some NZB's are just a mess and report different sizes for the same article.
We used to compare (__eq__) articles based on article-ID, however, this failed
because some NZB's had the same article-ID twice within one NZF.
"""
if other and (self.bytes == other.bytes or len(self.decodetable) == len(other.decodetable)):
if self.decodetable and other.decodetable:
return self.decodetable[0].article == other.decodetable[0].article
# Fallback to filename comparison
return self.filename == other.filename
return False
def __hash__(self):
"""Required because we implement eq. The same file can be spread
over multiple NZO's so we make every NZF unique. Even though
it's considered bad practice.
"""
return id(self)
def __repr__(self):
return "<NzbFile: filename=%s, bytes=%s, nzf_id=%s>" % (self.filename, self.bytes, self.nzf_id)
##############################################################################
# NzbObject
##############################################################################
class NzbEmpty(Exception):
pass
@@ -596,9 +185,6 @@ NzbObjectSaver = (
NzoAttributeSaver = ("cat", "pp", "script", "priority", "final_name", "password", "url")
# Lock to prevent errors when saving the NZO data
NZO_LOCK = threading.RLock()
class NzbObject(TryList):
def __init__(
@@ -614,12 +200,13 @@ class NzbObject(TryList):
password: Optional[str] = None,
nzbname: Optional[str] = None,
status: str = Status.QUEUED,
nzo_info: Optional[Dict[str, Any]] = None,
nzo_info: Optional[dict[str, Any]] = None,
reuse: Optional[str] = None,
nzo_id: Optional[str] = None,
dup_check: bool = True,
):
super().__init__()
self.lock: threading.RLock = threading.RLock()
# Use original filename as basis
self.work_name = self.filename = filename
@@ -677,7 +264,7 @@ class NzbObject(TryList):
# Bookkeeping values
self.meta = {}
self.servercount: Dict[str, int] = {} # Dict to keep bytes per server
self.servercount: dict[str, int] = {} # Dict to keep bytes per server
self.direct_unpacker: Optional[sabnzbd.directunpacker.DirectUnpacker] = None # The DirectUnpacker instance
self.bytes: int = 0 # Original bytesize
self.bytes_par2: int = 0 # Bytes available for repair
@@ -686,15 +273,15 @@ class NzbObject(TryList):
self.bytes_missing: int = 0 # Bytes missing
self.bad_articles: int = 0 # How many bad (non-recoverable) articles
self.extrapars: Dict[str, List[NzbFile]] = {} # Holds the extra parfile names for all sets
self.par2packs: Dict[str, Dict[str, FilePar2Info]] = {} # Holds the par2info for each file in each set
self.md5of16k: Dict[bytes, str] = {} # Holds the md5s of the first-16k of all files in the NZB (hash: name)
self.extrapars: dict[str, list[NzbFile]] = {} # Holds the extra parfile names for all sets
self.par2packs: dict[str, dict[str, FilePar2Info]] = {} # Holds the par2info for each file in each set
self.md5of16k: dict[bytes, str] = {} # Holds the md5s of the first-16k of all files in the NZB (hash: name)
self.files: List[NzbFile] = [] # List of all NZFs
self.files_table: Dict[str, NzbFile] = {} # Dictionary of NZFs indexed using NZF_ID
self.renames: Dict[str, str] = {} # Dictionary of all renamed files
self.files: list[NzbFile] = [] # List of all NZFs
self.files_table: dict[str, NzbFile] = {} # Dictionary of NZFs indexed using NZF_ID
self.renames: dict[str, str] = {} # Dictionary of all renamed files
self.finished_files: List[NzbFile] = [] # List of all finished NZFs
self.finished_files: list[NzbFile] = [] # List of all finished NZFs
# The current status of the nzo eg:
# Queued, Downloading, Repairing, Unpacking, Failed, Complete
@@ -703,9 +290,9 @@ class NzbObject(TryList):
self.avg_bps_freq = 0
self.avg_bps_total = 0
self.first_articles: List[Article] = []
self.first_articles: list[Article] = []
self.first_articles_count = 0
self.saved_articles: Set[Article] = set()
self.saved_articles: set[Article] = set()
self.nzo_id: Optional[str] = None
self.duplicate: Optional[str] = None
@@ -727,11 +314,11 @@ class NzbObject(TryList):
# Store one line responses for filejoin/par2/unrar here for history display
self.action_line = ""
# Store the results from various filejoin/par2/unrar stages
self.unpack_info: Dict[str, List[str]] = {}
self.unpack_info: dict[str, list[str]] = {}
# Stores one line containing the last failure
self.fail_msg = ""
# Stores various info about the nzo to be
self.nzo_info: Dict[str, Any] = nzo_info or {}
self.nzo_info: dict[str, Any] = nzo_info or {}
self.next_save = None
self.save_timeout = None
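The recurring `Dict`/`List`/`Tuple` → `dict`/`list`/`tuple` annotation changes throughout this diff rely on PEP 585 (Python 3.9+), where the builtin collection types are subscriptable directly and the `typing` imports can be dropped. A small illustration with a hypothetical helper, not code from this repository:

```python
# No "from typing import Dict, List, Tuple" needed on Python 3.9+ (PEP 585)
def top_servers(counts: dict[str, int]) -> list[tuple[str, int]]:
    """Sort per-server byte counts, busiest first."""
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

print(top_servers({"news.example": 10, "backup.example": 3}))
# [('news.example', 10), ('backup.example', 3)]
```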
@@ -977,7 +564,7 @@ class NzbObject(TryList):
logging.info("File %s added to queue", nzf.filename)
@synchronized(NZO_LOCK)
@synchronized()
def remove_nzf(self, nzf: NzbFile) -> bool:
if nzf in self.files:
self.files.remove(nzf)
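The `@synchronized(NZO_LOCK)` → `@synchronized()` changes swap the module-level lock for the instance's own `self.lock` created in `__init__`. A minimal sketch of how a no-argument `synchronized()` decorator could fall back to the instance lock (hypothetical; SABnzbd's real decorator lives elsewhere and may differ):

```python
import functools
import threading

def synchronized(lock: threading.RLock = None):
    """Without an explicit lock, use the instance's own self.lock."""
    def decorator(method):
        @functools.wraps(method)
        def wrapper(self, *args, **kwargs):
            with (lock or self.lock):
                return method(self, *args, **kwargs)
        return wrapper
    return decorator

class Job:
    def __init__(self):
        self.lock = threading.RLock()  # per-instance, like NzbObject.__init__
        self.files = ["a", "b"]

    @synchronized()
    def remove_file(self, name: str) -> bool:
        if name in self.files:
            self.files.remove(name)
            return True
        return False

job = Job()
print(job.remove_file("a"))  # True
```

Because each `NzbObject` gets its own `RLock`, two jobs no longer serialize against each other the way they did under the shared `NZO_LOCK`.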
@@ -991,7 +578,7 @@ class NzbObject(TryList):
"""Sort the files in the NZO based on name and type
and then optimize for unwanted extensions search.
"""
self.files.sort(key=functools.cmp_to_key(nzf_cmp_name))
self.files.sort()
# In the hunt for Unwanted Extensions:
# The file with the unwanted extension often is in the first or the last rar file
@@ -1001,7 +588,7 @@ class NzbObject(TryList):
logging.debug("Unwanted Extension: putting last rar after first rar")
firstrarpos = lastrarpos = 0
for nzfposcounter, nzf in enumerate(self.files):
if RE_RAR.search(nzf.filename.lower()):
if RAR_RE.search(nzf.filename.lower()):
# a NZF found with '.rar' in the name
if firstrarpos == 0:
# this is the first .rar found, so remember this position
@@ -1020,7 +607,7 @@ class NzbObject(TryList):
except Exception:
logging.debug("The lastrar swap did not go well")
@synchronized(TRYLIST_LOCK)
@synchronized()
def reset_all_try_lists(self):
"""Reset all try lists. Locked so reset is performed
for all items at the same time without chance of another
@@ -1029,7 +616,7 @@ class NzbObject(TryList):
nzf.reset_all_try_lists()
self.reset_try_list()
@synchronized(NZO_LOCK)
@synchronized()
def postpone_pars(self, parset: str):
"""Move all vol-par files matching 'parset' to the extrapars table"""
# Create new extrapars if it didn't already exist
@@ -1060,7 +647,7 @@ class NzbObject(TryList):
# Also re-parse all filenames in case par2 came after first articles
self.verify_all_filenames_and_resort()
@synchronized(NZO_LOCK)
@synchronized()
def handle_par2(self, nzf: NzbFile, filepath):
"""Check if file is a par2 and build up par2 collection"""
# Need to remove it from the other set it might be in
@@ -1112,7 +699,7 @@ class NzbObject(TryList):
self.renamed_file(get_filename(new_fname), nzf.filename)
nzf.filename = get_filename(new_fname)
@synchronized(NZO_LOCK)
@synchronized()
def promote_par2(self, nzf: NzbFile):
"""In case of a broken par2 or missing par2, move another
of the same set to the top (if we can find it)
@@ -1173,7 +760,7 @@ class NzbObject(TryList):
# Not enough
return 0
@synchronized(NZO_LOCK)
@synchronized()
def remove_article(self, article: Article, success: bool):
"""Remove article from the NzbFile and do check if it can succeed"""
job_can_succeed = True
@@ -1464,7 +1051,7 @@ class NzbObject(TryList):
if self.duplicate:
self.duplicate = DuplicateStatus.DUPLICATE_IGNORED
@synchronized(NZO_LOCK)
@synchronized()
def add_parfile(self, parfile: NzbFile) -> bool:
"""Add parfile to the files to be downloaded
Add it to the start so we try it first
@@ -1479,7 +1066,7 @@ class NzbObject(TryList):
return True
return False
@synchronized(NZO_LOCK)
@synchronized()
def remove_extrapar(self, parfile: NzbFile):
"""Remove par file from any/all sets"""
for parset in list(self.extrapars):
@@ -1490,7 +1077,7 @@ class NzbObject(TryList):
if not self.extrapars[parset]:
self.extrapars.pop(parset)
@synchronized(NZO_LOCK)
@synchronized()
def prospective_add(self, nzf: NzbFile):
"""Add par2 files to compensate for missing articles"""
# Get some blocks!
@@ -1522,7 +1109,7 @@ class NzbObject(TryList):
if hasattr(self, "direct_unpacker") and self.direct_unpacker:
self.direct_unpacker.abort()
def check_availability_ratio(self) -> Tuple[bool, float]:
def check_availability_ratio(self) -> tuple[bool, float]:
"""Determine if we are still meeting the required ratio"""
availability_ratio = req_ratio = cfg.req_completion_rate()
@@ -1561,7 +1148,7 @@ class NzbObject(TryList):
return False
return True
@synchronized(NZO_LOCK)
@synchronized()
def set_download_report(self):
"""Format the stats for the history information"""
# Pretty-format the per-server stats
@@ -1616,7 +1203,7 @@ class NzbObject(TryList):
self.set_unpack_info("RSS", rss_feed, unique=True)
self.set_unpack_info("Source", self.url or self.filename, unique=True)
@synchronized(NZO_LOCK)
@synchronized()
def increase_bad_articles_counter(self, bad_article_type: str):
"""Record information about bad articles. Should be called before
register_article, which triggers the availability check."""
@@ -1625,8 +1212,9 @@ class NzbObject(TryList):
self.nzo_info[bad_article_type] += 1
self.bad_articles += 1
def get_articles(self, server: Server, servers: List[Server], fetch_limit: int) -> List[Article]:
articles = []
def get_articles(self, server: Server, servers: list[Server], fetch_limit: int):
"""Assign articles server up to the fetch_limit"""
articles: Deque[Article] = server.article_queue
nzf_remove_list = []
# Did we go through all first-articles?
@@ -1661,7 +1249,8 @@ class NzbObject(TryList):
else:
break
if articles := nzf.get_articles(server, servers, fetch_limit):
nzf.get_articles(server, servers, fetch_limit)
if articles:
break
# Remove all files for which admin could not be read
@@ -1676,10 +1265,9 @@ class NzbObject(TryList):
if not articles:
# No articles for this server, block for next time
self.add_to_try_list(server)
return articles
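The `get_articles` rewrite above changes the producer from returning a list to appending into a per-server `article_queue` deque. A minimal sketch of that pattern, with hypothetical simplified types, not the project's actual classes:

```python
from collections import deque

class Server:
    def __init__(self, name: str):
        self.name = name
        self.article_queue: deque[str] = deque()  # filled in place, not returned

def fill_queue(server: Server, available: list[str], fetch_limit: int) -> None:
    # Producer appends to the server's own queue up to the limit
    while available and len(server.article_queue) < fetch_limit:
        server.article_queue.append(available.pop(0))

s = Server("news.example")
fill_queue(s, ["a1", "a2", "a3"], fetch_limit=2)
print(list(s.article_queue))  # ['a1', 'a2']
```

The caller then checks `server.article_queue` for work instead of inspecting a return value, which matches the `if server.article_queue: break` logic in `NzbQueue.get_articles` further down.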
@synchronized(NZO_LOCK)
def move_top_bulk(self, nzf_ids: List[str]):
@synchronized()
def move_top_bulk(self, nzf_ids: list[str]):
self.cleanup_nzf_ids(nzf_ids)
if nzf_ids:
target = list(range(len(nzf_ids)))
@@ -1695,7 +1283,7 @@ class NzbObject(TryList):
if target == keys:
break
@synchronized(NZO_LOCK)
@synchronized()
def move_bottom_bulk(self, nzf_ids):
self.cleanup_nzf_ids(nzf_ids)
if nzf_ids:
@@ -1712,7 +1300,7 @@ class NzbObject(TryList):
if target == keys:
break
@synchronized(NZO_LOCK)
@synchronized()
def move_up_bulk(self, nzf_ids, cleanup=True):
if cleanup:
self.cleanup_nzf_ids(nzf_ids)
@@ -1729,7 +1317,7 @@ class NzbObject(TryList):
self.files[pos - 1] = nzf
self.files[pos] = tmp_nzf
@synchronized(NZO_LOCK)
@synchronized()
def move_down_bulk(self, nzf_ids, cleanup=True):
if cleanup:
self.cleanup_nzf_ids(nzf_ids)
@@ -1782,7 +1370,7 @@ class NzbObject(TryList):
self.renamed_file(yenc_filename, nzf.filename)
nzf.filename = yenc_filename
@synchronized(NZO_LOCK)
@synchronized()
def verify_all_filenames_and_resort(self):
"""Verify all filenames based on par2 info and then re-sort files.
Locked so all files are verified at once without interruptions.
@@ -1797,7 +1385,7 @@ class NzbObject(TryList):
if self.direct_unpacker:
self.direct_unpacker.set_volumes_for_nzo()
@synchronized(NZO_LOCK)
@synchronized()
def renamed_file(self, name_set, old_name=None):
"""Save renames at various stages (Download/PP)
to be used on Retry. Accepts strings and dicts.
@@ -1825,7 +1413,7 @@ class NzbObject(TryList):
"""Return remaining bytes"""
return self.bytes - self.bytes_tried
@synchronized(NZO_LOCK)
@synchronized()
def purge_data(self, delete_all_data=True):
"""Remove (all) job data"""
logging.info(
@@ -1837,6 +1425,7 @@ class NzbObject(TryList):
# Remove all cached files
sabnzbd.ArticleCache.purge_articles(self.saved_articles)
sabnzbd.Assembler.clear_ready_bytes(*self.files)
# Delete all, or just basic files
if self.futuretype:
@@ -1856,7 +1445,7 @@ class NzbObject(TryList):
if nzf_id in self.files_table:
return self.files_table[nzf_id]
@synchronized(NZO_LOCK)
@synchronized()
def set_unpack_info(self, key: str, msg: str, setname: Optional[str] = None, unique: bool = False):
"""Builds a dictionary containing the stage name (key) and a message
If unique is present, it will only have a single line message
@@ -1884,7 +1473,7 @@ class NzbObject(TryList):
# Make sure it's updated in the interface
sabnzbd.misc.history_updated()
@synchronized(NZO_LOCK)
@synchronized()
def save_to_disk(self):
"""Save job's admin to disk"""
self.save_attribs()
@@ -1899,7 +1488,7 @@ class NzbObject(TryList):
logging.debug("Saving attributes %s for %s", attribs, self.final_name)
save_data(attribs, ATTRIB_FILE, self.admin_path, silent=True)
def load_attribs(self) -> Tuple[Optional[str], Optional[int], Optional[str]]:
def load_attribs(self) -> tuple[Optional[str], Optional[int], Optional[str]]:
"""Load saved attributes and return them to be parsed"""
attribs = load_data(ATTRIB_FILE, self.admin_path, remove=False)
logging.debug("Loaded attributes %s for %s", attribs, self.final_name)
@@ -1921,8 +1510,8 @@ class NzbObject(TryList):
# Rest is to be used directly in the NZO-init flow
return attribs["cat"], attribs["pp"], attribs["script"]
@synchronized(NZO_LOCK)
def build_pos_nzf_table(self, nzf_ids: List[str]) -> Dict[int, NzbFile]:
@synchronized()
def build_pos_nzf_table(self, nzf_ids: list[str]) -> dict[int, NzbFile]:
pos_nzf_table = {}
for nzf_id in nzf_ids:
if nzf_id in self.files_table:
@@ -1932,8 +1521,8 @@ class NzbObject(TryList):
return pos_nzf_table
@synchronized(NZO_LOCK)
def cleanup_nzf_ids(self, nzf_ids: List[str]):
@synchronized()
def cleanup_nzf_ids(self, nzf_ids: list[str]):
for nzf_id in nzf_ids[:]:
if nzf_id in self.files_table:
if self.files_table[nzf_id] not in self.files:
@@ -2073,6 +1662,7 @@ class NzbObject(TryList):
except KeyError:
# Handle new attributes
setattr(self, item, None)
self.lock = threading.RLock()
super().__setstate__(dict_.get("try_list", []))
# Set non-transferable values
@@ -2102,109 +1692,3 @@ class NzbObject(TryList):
def __repr__(self):
return "<NzbObject: filename=%s, bytes=%s, nzo_id=%s>" % (self.filename, self.bytes, self.nzo_id)
def nzf_cmp_name(nzf1: NzbFile, nzf2: NzbFile):
# The comparison sorts .par2 files to the top of the queue, followed by .rar files;
# within each group, files are then sorted by name.
nzf1_name = nzf1.filename.lower()
nzf2_name = nzf2.filename.lower()
# Determine vol-pars
is_par1 = ".vol" in nzf1_name and ".par2" in nzf1_name
is_par2 = ".vol" in nzf2_name and ".par2" in nzf2_name
# mini-par2 in front
if not is_par1 and nzf1_name.endswith(".par2"):
return -1
if not is_par2 and nzf2_name.endswith(".par2"):
return 1
# vol-pars go to the back
if is_par1 and not is_par2:
return 1
if is_par2 and not is_par1:
return -1
# Prioritize .rar files above any other type of file (other than vol-par)
m1 = RE_RAR.search(nzf1_name)
m2 = RE_RAR.search(nzf2_name)
if m1 and not (is_par2 or m2):
return -1
elif m2 and not (is_par1 or m1):
return 1
# Force .rar to come before 'r00'
if m1 and m1.group(1) == ".rar":
nzf1_name = nzf1_name.replace(".rar", ".r//")
if m2 and m2.group(1) == ".rar":
nzf2_name = nzf2_name.replace(".rar", ".r//")
return cmp(nzf1_name, nzf2_name)
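The `.rar` → `.r//` substitution at the end of the comparator exploits ASCII ordering: `/` (0x2F) sorts just before `0` (0x30), so `file.rar` lands ahead of `file.r00` under a plain string sort. A quick demonstration:

```python
names = ["file.r01", "file.rar", "file.r00"]
plain = sorted(names)
# plain leaves the .rar last: ['file.r00', 'file.r01', 'file.rar']
tweaked = sorted(names, key=lambda n: n.replace(".rar", ".r//"))
print(tweaked)  # ['file.rar', 'file.r00', 'file.r01']
```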
def create_work_name(name: str) -> str:
"""Remove ".nzb" and ".par(2)" and sanitize, skip URL's"""
if name.find("://") < 0:
# Invalid characters need to be removed before and after (see unit-tests)
return sanitize_foldername(strip_extensions(sanitize_foldername(name)))
else:
return name.strip()
def scan_password(name: str) -> Tuple[str, Optional[str]]:
"""Get password (if any) from the title"""
if "http://" in name or "https://" in name:
return name, None
# Strip any unwanted usenet-related extensions
name = strip_extensions(name)
# Identify any braces
braces = name[1:].find("{{")
if braces < 0:
braces = len(name)
else:
braces += 1
slash = name.find("/")
# Look for name/password, but make sure that '/' comes before any {{
if 0 < slash < braces and "password=" not in name:
# Is it maybe in 'name / password' notation?
if slash == name.find(" / ") + 1 and name[: slash - 1].strip(". "):
# Remove the extra space after name and before password
return name[: slash - 1].strip(". "), name[slash + 2 :]
if name[:slash].strip(". "):
return name[:slash].strip(". "), name[slash + 1 :]
# Look for "name password=password"
pw = name.find("password=")
if pw > 0 and name[:pw].strip(". "):
return name[:pw].strip(". "), name[pw + 9 :]
# Look for name{{password}}
if braces < len(name):
closing_braces = name.rfind("}}")
if closing_braces > braces and name[:braces].strip(". "):
return name[:braces].strip(". "), name[braces + 2 : closing_braces]
# Look again for name/password
if slash > 0 and name[:slash].strip(". "):
return name[:slash].strip(". "), name[slash + 1 :]
# No password found
return name, None
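`scan_password` recognizes three notations: `name/password`, `name password=secret`, and `name{{password}}`. A simplified sketch of just the braces branch, re-implemented here for illustration rather than extracted from the source:

```python
from typing import Optional

def split_braced_password(name: str) -> tuple[str, Optional[str]]:
    """Handle only the name{{password}} notation (simplified sketch)."""
    braces = name[1:].find("{{")  # skip position 0, as in scan_password
    if braces >= 0:
        braces += 1
        closing = name.rfind("}}")
        if closing > braces and name[:braces].strip(". "):
            return name[:braces].strip(". "), name[braces + 2 : closing]
    return name, None

print(split_braced_password("My.Show.S01{{s3cret}}"))  # ('My.Show.S01', 's3cret')
```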
def name_extractor(subject: str) -> str:
"""Try to extract a file name from a subject line, return `subject` if in doubt"""
# Filename nicely wrapped in quotes
for name in re.findall(RE_SUBJECT_FILENAME_QUOTES, subject):
if name := name.strip(' "'):
return name
# Found nothing? Try a basic filename-like search
for name in re.findall(RE_SUBJECT_BASIC_FILENAME, subject):
if name := name.strip():
return name
# Return the subject
return subject
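`name_extractor` first tries a quoted filename in the subject, then a basic filename-like pattern, then gives up and returns the subject. A sketch of the quoted-name path using a stand-in regex (`RE_SUBJECT_FILENAME_QUOTES` is defined elsewhere in the module and likely differs):

```python
import re

# Hypothetical stand-in for RE_SUBJECT_FILENAME_QUOTES
QUOTED = re.compile(r'"([^"]+)"')

def extract_quoted(subject: str) -> str:
    for name in QUOTED.findall(subject):
        if name := name.strip(' "'):
            return name
    return subject  # in doubt, return the subject unchanged

print(extract_quoted('[1/10] - "my.file.part01.rar" yEnc (1/52)'))
# my.file.part01.rar
```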

View File

@@ -18,6 +18,7 @@
"""
sabnzbd.nzbparser - Parse and import NZB files
"""
import os
import bz2
import gzip
@@ -30,10 +31,18 @@ import zipfile
import tempfile
import cherrypy._cpreqbody
from typing import Optional, Dict, Any, Union, List, Tuple
from typing import Optional, Any, Union
import sabnzbd
from sabnzbd import nzbstuff
from sabnzbd.nzb import (
NzbObject,
NzbEmpty,
NzbRejected,
NzbPreQueueRejected,
NzbRejectToHistory,
NzbFile,
SkippedNzbFile,
)
from sabnzbd.encoding import utob, correct_cherrypy_encoding
from sabnzbd.filesystem import (
get_filename,
@@ -152,12 +161,12 @@ def process_nzb_archive_file(
priority: Optional[Union[int, str]] = None,
nzbname: Optional[str] = None,
reuse: Optional[str] = None,
nzo_info: Optional[Dict[str, Any]] = None,
nzo_info: Optional[dict[str, Any]] = None,
url: Optional[str] = None,
password: Optional[str] = None,
nzo_id: Optional[str] = None,
dup_check: bool = True,
) -> Tuple[AddNzbFileResult, List[str]]:
) -> tuple[AddNzbFileResult, list[str]]:
"""Analyse archive and create job(s).
Accepts archive files containing ONLY nzb/nfo/folder files.
"""
@@ -204,7 +213,7 @@ def process_nzb_archive_file(
if datap:
nzo = None
try:
nzo = nzbstuff.NzbObject(
nzo = NzbObject(
name,
pp=pp,
script=script,
@@ -220,13 +229,13 @@ def process_nzb_archive_file(
dup_check=dup_check,
)
except (
sabnzbd.nzbstuff.NzbEmpty,
sabnzbd.nzbstuff.NzbRejected,
sabnzbd.nzbstuff.NzbPreQueueRejected,
NzbEmpty,
NzbRejected,
NzbPreQueueRejected,
):
# Empty or fully rejected (including pre-queue rejections)
pass
except sabnzbd.nzbstuff.NzbRejectToHistory as err:
except NzbRejectToHistory as err:
# Duplicate or unwanted extension directed to history
sabnzbd.NzbQueue.fail_to_history(err.nzo)
nzo_ids.append(err.nzo.nzo_id)
@@ -271,12 +280,12 @@ def process_single_nzb(
priority: Optional[Union[int, str]] = None,
nzbname: Optional[str] = None,
reuse: Optional[str] = None,
nzo_info: Optional[Dict[str, Any]] = None,
nzo_info: Optional[dict[str, Any]] = None,
url: Optional[str] = None,
password: Optional[str] = None,
nzo_id: Optional[str] = None,
dup_check: bool = True,
) -> Tuple[AddNzbFileResult, List[str]]:
) -> tuple[AddNzbFileResult, list[str]]:
"""Analyze file and create a job from it
Supports NZB, NZB.BZ2, NZB.GZ and GZ.NZB-in-disguise
"""
@@ -315,7 +324,7 @@ def process_single_nzb(
nzo = None
nzo_ids = []
try:
nzo = nzbstuff.NzbObject(
nzo = NzbObject(
filename,
pp=pp,
script=script,
@@ -330,16 +339,16 @@ def process_single_nzb(
nzo_id=nzo_id,
dup_check=dup_check,
)
except sabnzbd.nzbstuff.NzbEmpty:
except NzbEmpty:
# Malformed or might not be an NZB file
result = AddNzbFileResult.NO_FILES_FOUND
except sabnzbd.nzbstuff.NzbRejected:
except NzbRejected:
# Rejected as duplicate
result = AddNzbFileResult.ERROR
except sabnzbd.nzbstuff.NzbPreQueueRejected:
except NzbPreQueueRejected:
# Rejected by pre-queue script - should be silently ignored for URL fetches
result = AddNzbFileResult.PREQUEUE_REJECTED
except sabnzbd.nzbstuff.NzbRejectToHistory as err:
except NzbRejectToHistory as err:
# Duplicate or unwanted extension directed to history
sabnzbd.NzbQueue.fail_to_history(err.nzo)
nzo_ids.append(err.nzo.nzo_id)
@@ -366,7 +375,7 @@ def process_single_nzb(
def nzbfile_parser(full_nzb_path: str, nzo):
# For type-hinting
nzo: sabnzbd.nzbstuff.NzbObject
nzo: NzbObject
# Hash for dupe-checking
md5sum = hashlib.md5()
@@ -470,8 +479,8 @@ def nzbfile_parser(full_nzb_path: str, nzo):
# Create NZF
try:
nzf = sabnzbd.nzbstuff.NzbFile(file_date, file_name, raw_article_db_sorted, file_bytes, nzo)
except sabnzbd.nzbstuff.SkippedNzbFile:
nzf = NzbFile(file_date, file_name, raw_article_db_sorted, file_bytes, nzo)
except SkippedNzbFile:
# Did not meet requirements, so continue
skipped_files += 1
continue

View File

@@ -23,10 +23,10 @@ import os
import logging
import time
import cherrypy._cpreqbody
from typing import List, Dict, Union, Tuple, Optional
from typing import Union, Optional
import sabnzbd
from sabnzbd.nzbstuff import NzbObject, Article
from sabnzbd.nzb import Article, NzbObject
from sabnzbd.misc import exit_sab, cat_to_opts, int_conv, caller_name, safe_lower, duplicate_warning
from sabnzbd.filesystem import get_admin_path, remove_all, globber_full, remove_file, is_valid_script
from sabnzbd.nzbparser import process_single_nzb
@@ -57,8 +57,8 @@ class NzbQueue:
def __init__(self):
self.__top_only: bool = cfg.top_only()
self.__nzo_list: List[NzbObject] = []
self.__nzo_table: Dict[str, NzbObject] = {}
self.__nzo_list: list[NzbObject] = []
self.__nzo_table: dict[str, NzbObject] = {}
def read_queue(self, repair: int):
"""Read queue from disk, supporting repair modes
@@ -121,7 +121,7 @@ class NzbQueue:
pass
@NzbQueueLocker
def scan_jobs(self, all_jobs: bool = False, action: bool = True) -> List[str]:
def scan_jobs(self, all_jobs: bool = False, action: bool = True) -> list[str]:
"""Scan "incomplete" for missing folders,
'all' is True: Include active folders
'action' is True, do the recovery action
@@ -247,7 +247,7 @@ class NzbQueue:
self.__top_only = value
@NzbQueueLocker
def change_opts(self, nzo_ids: List[str], pp: int) -> int:
def change_opts(self, nzo_ids: list[str], pp: int) -> int:
"""Locked so changes during URLGrabbing are correctly passed to new job"""
result = 0
for nzo_id in nzo_ids:
@@ -257,7 +257,7 @@ class NzbQueue:
return result
@NzbQueueLocker
def change_script(self, nzo_ids: List[str], script: str) -> int:
def change_script(self, nzo_ids: list[str], script: str) -> int:
"""Locked so changes during URLGrabbing are correctly passed to new job"""
result = 0
if (script is None) or is_valid_script(script):
@@ -269,7 +269,7 @@ class NzbQueue:
return result
@NzbQueueLocker
def change_cat(self, nzo_ids: List[str], cat: str) -> int:
def change_cat(self, nzo_ids: list[str], cat: str) -> int:
"""Locked so changes during URLGrabbing are correctly passed to new job"""
result = 0
for nzo_id in nzo_ids:
@@ -387,7 +387,7 @@ class NzbQueue:
return nzo
@NzbQueueLocker
def remove_multiple(self, nzo_ids: List[str], delete_all_data=True) -> List[str]:
def remove_multiple(self, nzo_ids: list[str], delete_all_data=True) -> list[str]:
"""Remove multiple jobs from the queue. Also triggers duplicate handling
and downloader-disconnect, so intended for external use only!"""
removed = []
@@ -405,7 +405,7 @@ class NzbQueue:
return removed
@NzbQueueLocker
def remove_all(self, search: Optional[str] = None) -> List[str]:
def remove_all(self, search: Optional[str] = None) -> list[str]:
"""Remove NZO's that match the search-pattern"""
nzo_ids = []
search = safe_lower(search)
@@ -414,7 +414,7 @@ class NzbQueue:
nzo_ids.append(nzo_id)
return self.remove_multiple(nzo_ids)
def remove_nzfs(self, nzo_id: str, nzf_ids: List[str]) -> List[str]:
def remove_nzfs(self, nzo_id: str, nzf_ids: list[str]) -> list[str]:
removed = []
if nzo_id in self.__nzo_table:
nzo = self.__nzo_table[nzo_id]
@@ -441,7 +441,7 @@ class NzbQueue:
logging.info("Removed NZFs %s from job %s", removed, nzo.final_name)
return removed
def pause_multiple_nzo(self, nzo_ids: List[str]) -> List[str]:
def pause_multiple_nzo(self, nzo_ids: list[str]) -> list[str]:
handled = []
for nzo_id in nzo_ids:
self.pause_nzo(nzo_id)
@@ -449,7 +449,7 @@ class NzbQueue:
return handled
@NzbQueueLocker
def pause_nzo(self, nzo_id: str) -> List[str]:
def pause_nzo(self, nzo_id: str) -> list[str]:
"""Locked so changes during URLGrabbing are correctly passed to new job"""
handled = []
if nzo_id in self.__nzo_table:
@@ -459,7 +459,7 @@ class NzbQueue:
handled.append(nzo_id)
return handled
def resume_multiple_nzo(self, nzo_ids: List[str]) -> List[str]:
def resume_multiple_nzo(self, nzo_ids: list[str]) -> list[str]:
handled = []
for nzo_id in nzo_ids:
self.resume_nzo(nzo_id)
@@ -467,7 +467,7 @@ class NzbQueue:
return handled
@NzbQueueLocker
def resume_nzo(self, nzo_id: str) -> List[str]:
def resume_nzo(self, nzo_id: str) -> list[str]:
handled = []
if nzo_id in self.__nzo_table:
nzo = self.__nzo_table[nzo_id]
@@ -477,7 +477,7 @@ class NzbQueue:
return handled
@NzbQueueLocker
def switch(self, item_id_1: str, item_id_2: str) -> Tuple[int, int]:
def switch(self, item_id_1: str, item_id_2: str) -> tuple[int, int]:
try:
# Allow an index as second parameter, easier for some skins
i = int(item_id_2)
@@ -532,24 +532,24 @@ class NzbQueue:
return -1, nzo1.priority
@NzbQueueLocker
def move_nzf_up_bulk(self, nzo_id: str, nzf_ids: List[str], size: int):
def move_nzf_up_bulk(self, nzo_id: str, nzf_ids: list[str], size: int):
if nzo_id in self.__nzo_table:
for _ in range(size):
self.__nzo_table[nzo_id].move_up_bulk(nzf_ids)
@NzbQueueLocker
def move_nzf_top_bulk(self, nzo_id: str, nzf_ids: List[str]):
def move_nzf_top_bulk(self, nzo_id: str, nzf_ids: list[str]):
if nzo_id in self.__nzo_table:
self.__nzo_table[nzo_id].move_top_bulk(nzf_ids)
@NzbQueueLocker
def move_nzf_down_bulk(self, nzo_id: str, nzf_ids: List[str], size: int):
def move_nzf_down_bulk(self, nzo_id: str, nzf_ids: list[str], size: int):
if nzo_id in self.__nzo_table:
for _ in range(size):
self.__nzo_table[nzo_id].move_down_bulk(nzf_ids)
@NzbQueueLocker
def move_nzf_bottom_bulk(self, nzo_id: str, nzf_ids: List[str]):
def move_nzf_bottom_bulk(self, nzo_id: str, nzf_ids: list[str]):
if nzo_id in self.__nzo_table:
self.__nzo_table[nzo_id].move_bottom_bulk(nzf_ids)
@@ -670,7 +670,7 @@ class NzbQueue:
return -1
@NzbQueueLocker
def set_priority(self, nzo_ids: List[str], priority: int) -> int:
def set_priority(self, nzo_ids: list[str], priority: int) -> int:
try:
n = -1
for nzo_id in nzo_ids:
@@ -692,7 +692,7 @@ class NzbQueue:
return False
return False
def get_articles(self, server: Server, servers: List[Server], fetch_limit: int) -> List[Article]:
def get_articles(self, server: Server, servers: list[Server], fetch_limit: int) -> None:
"""Get next article for jobs in the queue
Not locked for performance, since it only reads the queue
"""
@@ -705,12 +705,12 @@ class NzbQueue:
and not nzo.propagation_delay_left
) or nzo.priority == FORCE_PRIORITY:
if not nzo.server_in_try_list(server):
if articles := nzo.get_articles(server, servers, fetch_limit):
return articles
nzo.get_articles(server, servers, fetch_limit)
if server.article_queue:
break
# Stop after first job that wasn't paused/propagating/etc
if self.__top_only:
return []
return []
break
def register_article(self, article: Article, success: bool = True):
"""Register the articles we tried
@@ -730,20 +730,16 @@ class NzbQueue:
articles_left, file_done, post_done = nzo.remove_article(article, success)
# Write data if file is done or at trigger time
# Skip if the file is already queued, since all available articles will then be written
if (
file_done
or (article.lowest_partnum and nzf.filename_checked and not nzf.import_finished)
or (articles_left and (articles_left % sabnzbd.ArticleCache.assembler_write_trigger) == 0)
):
if not nzo.precheck:
# The type is only set if sabctools could decode the article
if nzf.type:
sabnzbd.Assembler.process(nzo, nzf, file_done)
elif sabnzbd.par2file.has_par2_in_filename(nzf.filename):
# Broken par2 file, try to get another one
nzo.promote_par2(nzf)
if not nzo.precheck:
# Mark as on_disk so assembler knows it can skip this article
if not success:
article.on_disk = True
# The type is only set if sabctools could decode the article
if nzf.type:
sabnzbd.Assembler.process(nzo, nzf, file_done, article=article)
elif sabnzbd.par2file.has_par2_in_filename(nzf.filename):
# Broken par2 file, try to get another one
nzo.promote_par2(nzf)
# Save bookkeeping in case of crash
if file_done and (nzo.next_save is None or time.time() > nzo.next_save):
@@ -768,10 +764,9 @@ class NzbQueue:
nzo.removed_from_queue = True
if nzo.precheck:
nzo.save_to_disk()
# Check result
enough, _ = nzo.check_availability_ratio()
if enough:
# Enough data present, do real download
# If not enough data is present, the fail flag will be set (also used by postproc)
if not nzo.fail_msg:
# Send back for real download
self.send_back(nzo)
return
else:
@@ -784,6 +779,7 @@ class NzbQueue:
if not nzo.nzo_id:
self.add(nzo, quiet=True)
self.remove(nzo.nzo_id, cleanup=False)
sabnzbd.Assembler.clear_ready_bytes(*nzo.files)
sabnzbd.PostProcessor.process(nzo)
def actives(self, grabs: bool = True) -> int:
@@ -802,13 +798,13 @@ class NzbQueue:
def queue_info(
self,
search: Optional[str] = None,
categories: Optional[List[str]] = None,
priorities: Optional[List[str]] = None,
statuses: Optional[List[str]] = None,
nzo_ids: Optional[List[str]] = None,
categories: Optional[list[str]] = None,
priorities: Optional[list[str]] = None,
statuses: Optional[list[str]] = None,
nzo_ids: Optional[list[str]] = None,
start: int = 0,
limit: int = 0,
) -> Tuple[int, int, int, List[NzbObject], int, int]:
) -> tuple[int, int, int, list[NzbObject], int, int]:
"""Return list of queued jobs, optionally filtered and limited by start and limit.
Not locked for performance, only reads the queue
"""
@@ -894,14 +890,40 @@ class NzbQueue:
if nzf.all_servers_in_try_list(active_servers):
# Check for articles where all active servers have already been tried
for article in nzf.articles[:]:
if article.all_servers_in_try_list(active_servers):
logging.debug("Removing article %s with bad trylist in file %s", article, nzf.filename)
nzo.increase_bad_articles_counter("missing_articles")
sabnzbd.NzbQueue.register_article(article, success=False)
with nzf.lock:
for article in nzf.articles:
if article.all_servers_in_try_list(active_servers):
logging.debug(
"Removing article %s with bad trylist in file %s", article, nzf.filename
)
nzo.increase_bad_articles_counter("missing_articles")
sabnzbd.NzbQueue.register_article(article, success=False)
logging.info("Resetting bad trylist for file %s in job %s", nzf.filename, nzo.final_name)
nzf.reset_try_list()
if not nzf.assembled and not nzf.articles:
logging.debug("Not assembled but no remaining articles for file %s", nzf.filename)
if not nzf.assembled and (next_article := nzf.assembler_next_article):
logging.debug(
"Next article to assemble for file %s is %s, decoded: %s, on_disk: %s, decoded_size: %s",
nzf.filename,
next_article,
next_article.decoded,
next_article.on_disk,
next_article.decoded_size,
)
for article in nzo.first_articles.copy():
logging.debug(
"First article for file %s is %s, decoded: %s, on_disk: %s, decoded_size: %s, has_fetcher: %s, tries: %s",
article.nzf.filename,
article,
article.decoded,
article.on_disk,
article.decoded_size,
article.fetcher is not None,
article.tries,
)
# Reset main try list, minimal performance impact
logging.info("Resetting bad trylist for job %s", nzo.final_name)
@@ -934,7 +956,7 @@ class NzbQueue:
# Don't use nzo.resume() to avoid resetting job warning flags
nzo.status = Status.QUEUED
def get_urls(self) -> List[Tuple[str, NzbObject]]:
def get_urls(self) -> list[tuple[str, NzbObject]]:
"""Return list of future-types needing URL"""
lst = []
for nzo_id in self.__nzo_table:

View File

@@ -66,14 +66,12 @@ def MSG_BAD_NEWS():
def MSG_BAD_PORT():
return (
T(
r"""
T(r"""
SABnzbd needs a free tcp/ip port for its internal web server.<br>
Port %s on %s was tried, but it is not available.<br>
Some other software uses the port or SABnzbd is already running.<br>
<br>
Please restart SABnzbd with a different port number."""
)
Please restart SABnzbd with a different port number.""")
+ """<br>
<br>
%s<br>
@@ -85,14 +83,12 @@ def MSG_BAD_PORT():
def MSG_BAD_HOST():
return (
T(
r"""
T(r"""
SABnzbd needs a valid host address for its internal web server.<br>
You have specified an invalid address.<br>
Safe values are <b>localhost</b> and <b>0.0.0.0</b><br>
<br>
Please restart SABnzbd with a proper host address."""
)
Please restart SABnzbd with a proper host address.""")
+ """<br>
<br>
%s<br>
@@ -104,15 +100,13 @@ def MSG_BAD_HOST():
def MSG_BAD_QUEUE():
return (
T(
r"""
T(r"""
SABnzbd detected saved data from another SABnzbd version<br>
but cannot re-use the data of the other program.<br><br>
You may want to finish your queue first with the other program.<br><br>
After that, start this program with the "--clean" option.<br>
This will erase the current queue and history!<br>
SABnzbd read the file "%s"."""
)
SABnzbd read the file "%s".""")
+ """<br>
<br>
%s<br>
@@ -123,13 +117,11 @@ def MSG_BAD_QUEUE():
def MSG_BAD_TEMPL():
return T(
r"""
return T(r"""
SABnzbd cannot find its web interface files in %s.<br>
Please install the program again.<br>
<br>
"""
)
""")
def MSG_OTHER():
@@ -137,14 +129,12 @@ def MSG_OTHER():
def MSG_SQLITE():
return T(
r"""
return T(r"""
SABnzbd detected that the file sqlite3.dll is missing.<br><br>
Some poorly designed virus-scanners remove this file.<br>
Please check your virus-scanner, try to re-install SABnzbd and complain to your virus-scanner vendor.<br>
<br>
"""
)
""")
def panic_message(panic_code, a=None, b=None):
@@ -280,8 +270,7 @@ def error_page_401(status, message, traceback, version):
def error_page_404(status, message, traceback, version):
"""Custom handler for 404 error, redirect to main page"""
return (
r"""
return r"""
<html>
<head>
<script type="text/javascript">
@@ -292,6 +281,4 @@ def error_page_404(status, message, traceback, version):
</head>
<body><br/></body>
</html>
"""
% cfg.url_base()
)
""" % cfg.url_base()

View File

@@ -18,6 +18,7 @@
"""
sabnzbd.par2file - All par2-related functionality
"""
import hashlib
import logging
import os
@@ -25,11 +26,11 @@ import re
import struct
import sabctools
from dataclasses import dataclass
from typing import Dict, Optional, Tuple
from typing import Optional
from sabnzbd.constants import MEBI
from sabnzbd.encoding import correct_unknown_encoding
from sabnzbd.filesystem import get_basename, get_ext
from sabnzbd.filesystem import get_basename
PROBABLY_PAR2_RE = re.compile(r"(.*)\.vol(\d*)[+\-](\d*)\.par2", re.I)
SCAN_LIMIT = 10 * MEBI
@@ -71,7 +72,7 @@ def is_par2_file(filepath: str) -> bool:
return False
def analyse_par2(name: str, filepath: Optional[str] = None) -> Tuple[str, int, int]:
def analyse_par2(name: str, filepath: Optional[str] = None) -> tuple[str, int, int]:
"""Check if file is a par2-file and determine vol/block
return setname, vol, block
setname is empty when not a par2 file
@@ -103,7 +104,7 @@ def analyse_par2(name: str, filepath: Optional[str] = None) -> Tuple[str, int, i
return setname, vol, block
-def parse_par2_file(fname: str, md5of16k: Dict[bytes, str]) -> Tuple[str, Dict[str, FilePar2Info]]:
+def parse_par2_file(fname: str, md5of16k: dict[bytes, str]) -> tuple[str, dict[str, FilePar2Info]]:
"""Get the hash table and the first-16k hash table from a PAR2 file
Return as dictionary, indexed on names or hashes for the first-16 table
The input md5of16k is modified in place and thus not returned!


@@ -18,6 +18,7 @@
"""
sabnzbd.postproc - threaded post-processing of jobs
"""
import os
import logging
import functools
@@ -27,7 +28,7 @@ import re
import gc
import queue
import rarfile
-from typing import List, Optional, Tuple
+from typing import Optional
import sabnzbd
from sabnzbd.newsunpack import (
@@ -39,7 +40,7 @@ from sabnzbd.newsunpack import (
rar_sort,
is_sfv_file,
)
-from threading import Thread
+from threading import Thread, Event
from sabnzbd.misc import (
on_cleanup_list,
is_sample,
@@ -73,7 +74,7 @@ from sabnzbd.filesystem import (
get_ext,
get_filename,
)
-from sabnzbd.nzbstuff import NzbObject
+from sabnzbd.nzb import NzbObject
from sabnzbd.sorting import Sorter
from sabnzbd.constants import (
REPAIR_PRIORITY,
@@ -95,7 +96,6 @@ import sabnzbd.utils.rarvolinfo as rarvolinfo
import sabnzbd.utils.checkdir
import sabnzbd.deobfuscate_filenames as deobfuscate
MAX_FAST_JOB_COUNT = 3
@@ -107,7 +107,7 @@ class PostProcessor(Thread):
super().__init__()
# This history queue is simply used to log what active items to display in the web_ui
-self.history_queue: List[NzbObject] = []
+self.history_queue: list[NzbObject] = []
self.load()
# Fast-queue for jobs already finished by DirectUnpack
@@ -116,6 +116,9 @@ class PostProcessor(Thread):
# Regular queue for jobs that might need more attention
self.slow_queue: queue.Queue[Optional[NzbObject]] = queue.Queue()
+# Event to signal when work is available or state changes
+self.work_available = Event()
# Load all old jobs
for nzo in self.history_queue:
self.process(nzo)
@@ -180,6 +183,9 @@ class PostProcessor(Thread):
self.save()
history_updated()
+# Signal that work is available
+self.work_available.set()
def remove(self, nzo: NzbObject):
"""Remove given nzo from the queue"""
try:
@@ -192,10 +198,22 @@ class PostProcessor(Thread):
def stop(self):
"""Stop thread after finishing running job"""
self.__stop = True
self.slow_queue.put(None)
self.fast_queue.put(None)
+# Wake up the processor thread to check stop flag
+self.work_available.set()
-def cancel_pp(self, nzo_ids: List[str]) -> Optional[bool]:
+def pause(self):
+"""Pause post-processing"""
+self.paused = True
+logging.info("Pausing post-processing")
+def resume(self):
+"""Resume post-processing"""
+self.paused = False
+logging.info("Resuming post-processing")
+# Wake up the processor thread
+self.work_available.set()
+def cancel_pp(self, nzo_ids: list[str]) -> Optional[bool]:
"""Abort Direct Unpack and change the status, so that the PP is canceled"""
result = None
for nzo in self.history_queue:
@@ -220,10 +238,10 @@ class PostProcessor(Thread):
def get_queue(
self,
search: Optional[str] = None,
-categories: Optional[List[str]] = None,
-statuses: Optional[List[str]] = None,
-nzo_ids: Optional[List[str]] = None,
-) -> List[NzbObject]:
+categories: Optional[list[str]] = None,
+statuses: Optional[list[str]] = None,
+nzo_ids: Optional[list[str]] = None,
+) -> list[NzbObject]:
"""Return list of NZOs that still need to be processed.
Optionally filtered by the search terms"""
re_search = None
@@ -265,27 +283,40 @@ class PostProcessor(Thread):
while not self.__stop:
self.__busy = False
-if self.paused:
-time.sleep(5)
-continue
# Set NzbObject object to None so references from this thread do not keep the
# object alive until the next job is added to post-processing (see #1628)
nzo = None
+# Wait for work to be available (no timeout!)
+self.work_available.wait()
+# Check if we should stop
+if self.__stop:
+break
+# If paused, clear event and wait for resume
+if self.paused:
+self.work_available.clear()
+continue
+# If queues are empty (spurious wake or race condition), clear and loop back
+if self.slow_queue.empty() and self.fast_queue.empty():
+self.work_available.clear()
+continue
# Something in the fast queue?
try:
-# Every few fast-jobs we should check allow a
+# Every few fast-jobs we should allow a
# slow job so that they don't wait forever
if self.__fast_job_count >= MAX_FAST_JOB_COUNT and self.slow_queue.qsize():
raise queue.Empty
-nzo = self.fast_queue.get(timeout=2)
+nzo = self.fast_queue.get_nowait()
self.__fast_job_count += 1
except queue.Empty:
# Try the slow queue
try:
-nzo = self.slow_queue.get(timeout=2)
+nzo = self.slow_queue.get_nowait()
# Reset fast-counter
self.__fast_job_count = 0
except queue.Empty:
@@ -296,10 +327,6 @@ class PostProcessor(Thread):
# No fast or slow jobs, better luck next loop!
continue
-# Stop job
-if not nzo:
-continue
# Job was already deleted.
if not nzo.work_name:
check_eoq = True
@@ -328,7 +355,7 @@ class PostProcessor(Thread):
self.external_process = None
check_eoq = True
-# Allow download to proceed
+# Allow download to proceed if it was paused for post-processing
sabnzbd.Downloader.resume_from_postproc()
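The reworked loop above replaces `get(timeout=2)` polling with a `threading.Event` that producers set and the consumer clears on spurious wakes; a minimal standalone sketch of the same pattern (class and names hypothetical, not SABnzbd's actual `PostProcessor`):

```python
import queue
import threading

class Worker(threading.Thread):
    """Consumer that blocks on an Event instead of polling with timeouts."""

    def __init__(self):
        super().__init__()
        self.jobs: queue.Queue = queue.Queue()
        self.work_available = threading.Event()
        self.done = []
        self._stop_flag = False

    def add(self, job):
        self.jobs.put(job)
        self.work_available.set()  # signal that work is available

    def stop(self):
        self._stop_flag = True
        self.work_available.set()  # wake the thread so it sees the stop flag

    def run(self):
        while not self._stop_flag:
            self.work_available.wait()  # no timeout: sleep until signalled
            if self._stop_flag:
                break
            try:
                job = self.jobs.get_nowait()
            except queue.Empty:
                # Spurious wake or race: clear the event and wait again
                self.work_available.clear()
                continue
            self.done.append(job)
            self.jobs.task_done()

w = Worker()
w.start()
w.add("job-1")
w.add("job-2")
w.jobs.join()  # block until both jobs have been processed
w.stop()
w.join()
print(w.done)  # ['job-1', 'job-2']
```

The `clear()`-then-retry step is the important detail: clearing only after a failed `get_nowait()` avoids losing a `set()` that arrived between the wait and the queue check, which is the race the hunk's comments describe.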
@@ -392,14 +419,13 @@ def process_job(nzo: NzbObject) -> bool:
par_error = True
unpack_error = 1
-script = nzo.script
logging.info(
"Starting Post-Processing on %s => Repair:%s, Unpack:%s, Delete:%s, Script:%s, Cat:%s",
filename,
flag_repair,
flag_unpack,
nzo.delete,
-script,
+nzo.script,
nzo.cat,
)
@@ -492,10 +518,10 @@ def process_job(nzo: NzbObject) -> bool:
# Check if this is an NZB-only download, if so redirect to queue
# except when PP was Download-only
+nzb_list = None
if flag_repair:
-nzb_list = nzb_redirect(tmp_workdir_complete, nzo.final_name, nzo.pp, script, nzo.cat, nzo.priority)
-else:
-nzb_list = None
+nzb_list = process_nzb_only_download(tmp_workdir_complete, nzo)
if nzb_list:
nzo.set_unpack_info("Download", T("Sent %s to queue") % nzb_list)
cleanup_empty_directories(tmp_workdir_complete)
@@ -503,9 +529,10 @@ def process_job(nzo: NzbObject) -> bool:
# Full cleanup including nzb's
cleanup_list(tmp_workdir_complete, skip_nzb=False)
-script_ret = 0
-script_error = False
+# No further processing for NZB-only downloads
+if not nzb_list:
+    script_ret = 0
+    script_error = False
# Give destination its final name
if cfg.folder_rename() and tmp_workdir_complete and not one_folder:
if not all_ok:
@@ -557,11 +584,11 @@ def process_job(nzo: NzbObject) -> bool:
deobfuscate.deobfuscate_subtitles(nzo, newfiles)
# Run the user script
if script_path := make_script_path(script):
if script_path := make_script_path(nzo.script):
# Set the current nzo status to "Ext Script...". Used in History
nzo.status = Status.RUNNING
nzo.set_action_line(T("Running script"), script)
nzo.set_unpack_info("Script", T("Running user script %s") % script, unique=True)
nzo.set_action_line(T("Running script"), nzo.script)
nzo.set_unpack_info("Script", T("Running user script %s") % nzo.script, unique=True)
script_log, script_ret = external_processing(
script_path, nzo, clip_path(workdir_complete), nzo.final_name, job_result
)
@@ -574,7 +601,7 @@ def process_job(nzo: NzbObject) -> bool:
else:
script_line = T("Script exit code is %s") % script_ret
elif not script_line:
script_line = T("Ran %s") % script
script_line = T("Ran %s") % nzo.script
nzo.set_unpack_info("Script", script_line, unique=True)
# Maybe bad script result should fail job
@@ -583,29 +610,29 @@ def process_job(nzo: NzbObject) -> bool:
all_ok = False
nzo.fail_msg = script_line
-# Email the results
-if not nzb_list and cfg.email_endjob():
-if cfg.email_endjob() == 1 or (cfg.email_endjob() == 2 and (unpack_error or par_error or script_error)):
-emailer.endjob(
-nzo.final_name,
-nzo.cat,
-all_ok,
-workdir_complete,
-nzo.bytes_downloaded,
-nzo.fail_msg,
-nzo.unpack_info,
-script,
-script_log,
-script_ret,
-)
+    # Email the results
+    if cfg.email_endjob():
+        if cfg.email_endjob() == 1 or (cfg.email_endjob() == 2 and (unpack_error or par_error or script_error)):
+            emailer.endjob(
+                nzo.final_name,
+                nzo.cat,
+                all_ok,
+                workdir_complete,
+                nzo.bytes_downloaded,
+                nzo.fail_msg,
+                nzo.unpack_info,
+                nzo.script,
+                script_log,
+                script_ret,
+            )
-if script_log and len(script_log.rstrip().split("\n")) > 1:
-# Can do this only now, otherwise it would show up in the email
-nzo.set_unpack_info(
-"Script",
-'%s <a href="./scriptlog?name=%s">(%s)</a>' % (script_line, nzo.nzo_id, T("More")),
-unique=True,
-)
+    if script_log and len(script_log.rstrip().split("\n")) > 1:
+        # Can do this only now, otherwise it would show up in the email
+        nzo.set_unpack_info(
+            "Script",
+            '%s <a href="./scriptlog?name=%s">(%s)</a>' % (script_line, nzo.nzo_id, T("More")),
+            unique=True,
+        )
# Cleanup again, including NZB files
if all_ok and os.path.isdir(workdir_complete):
@@ -693,7 +720,7 @@ def process_job(nzo: NzbObject) -> bool:
return True
-def prepare_extraction_path(nzo: NzbObject) -> Tuple[str, str, Sorter, bool, Optional[str]]:
+def prepare_extraction_path(nzo: NzbObject) -> tuple[str, str, Sorter, bool, Optional[str]]:
"""Based on the information that we have, generate
the extraction path and create the directory.
Separated so it can be called from DirectUnpacker
@@ -757,7 +784,7 @@ def prepare_extraction_path(nzo: NzbObject) -> Tuple[str, str, Sorter, bool, Opt
return tmp_workdir_complete, workdir_complete, file_sorter, not create_job_dir, marker_file
-def parring(nzo: NzbObject) -> Tuple[bool, bool]:
+def parring(nzo: NzbObject) -> tuple[bool, bool]:
"""Perform par processing. Returns: (par_error, re_add)"""
logging.info("Starting verification and repair of %s", nzo.final_name)
par_error = False
@@ -876,7 +903,7 @@ def try_sfv_check(nzo: NzbObject) -> Optional[bool]:
return True
-def try_rar_check(nzo: NzbObject, rars: List[str]) -> bool:
+def try_rar_check(nzo: NzbObject, rars: list[str]) -> bool:
"""Attempt to verify set using the RARs
Return True if verified, False when failed
When setname is '', all RAR files will be used, otherwise only the matching one
@@ -1132,34 +1159,36 @@ def prefix(path: str, pre: str) -> str:
return os.path.join(p, pre + d)
-def nzb_redirect(wdir, nzbname, pp, script, cat, priority):
+def process_nzb_only_download(workdir: str, nzo: NzbObject) -> Optional[list[str]]:
    """Check if this job contains only NZB files,
    if so send to queue and remove if on clean-up list
    Returns list of processed NZB's
    """
-    files = listdir_full(wdir)
-    for nzb_file in files:
-        if get_ext(nzb_file) != ".nzb":
-            return None
-    # For multiple NZBs, cannot use the current job name
-    if len(files) != 1:
-        nzbname = None
-    # Process all NZB files
-    for nzb_file in files:
-        process_single_nzb(
-            get_filename(nzb_file),
-            nzb_file,
-            pp=pp,
-            script=script,
-            cat=cat,
-            priority=priority,
-            dup_check=False,
-            nzbname=nzbname,
-        )
-    return files
+    if files := listdir_full(workdir):
+        for nzb_file in files:
+            if get_ext(nzb_file) != ".nzb":
+                return None
+        # Process all NZB files
+        new_nzbname = nzo.final_name
+        for nzb_file in files:
+            # Determine name based on number of files
+            nzb_filename = get_filename(nzb_file)
+            if len(files) > 1:
+                new_nzbname = f"{nzo.final_name} - {nzb_filename}"
+            process_single_nzb(
+                nzb_filename,
+                nzb_file,
+                pp=nzo.pp,
+                script=nzo.script,
+                cat=nzo.cat,
+                url=nzo.url,
+                priority=nzo.priority,
+                nzbname=new_nzbname,
+                dup_check=False,
+            )
+        return files
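The rewritten helper above re-queues a job only when every file in the working directory is an `.nzb`; that all-or-nothing check can be sketched in isolation (with `os.path.splitext` standing in for SABnzbd's `get_ext`):

```python
import os

def nzb_only(filenames: list[str]) -> bool:
    """True when the list is non-empty and every entry is an .nzb file."""
    return bool(filenames) and all(
        os.path.splitext(name)[1].lower() == ".nzb" for name in filenames
    )

print(nzb_only(["a.nzb", "b.NZB"]))       # True
print(nzb_only(["a.nzb", "readme.txt"]))  # False
print(nzb_only([]))                       # False
```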
def one_file_or_folder(folder: str) -> str:
@@ -1221,7 +1250,7 @@ def remove_samples(path: str):
logging.info("Skipping sample-removal, false-positive")
-def rename_and_collapse_folder(oldpath: str, newpath: str, files: List[str]) -> List[str]:
+def rename_and_collapse_folder(oldpath: str, newpath: str, files: list[str]) -> list[str]:
"""Rename folder, collapsing when there's just a single subfolder
oldpath --> newpath OR oldpath/subfolder --> newpath
Modify list of filenames accordingly
@@ -1273,7 +1302,7 @@ def del_marker(path: str):
logging.info("Traceback: ", exc_info=True)
-def remove_from_list(name: Optional[str], lst: List[str]):
+def remove_from_list(name: Optional[str], lst: list[str]):
if name:
for n in range(len(lst)):
if lst[n].endswith(name):


@@ -24,7 +24,6 @@ import subprocess
import logging
import time
##############################################################################
# Power management for Windows
##############################################################################


@@ -337,7 +337,11 @@ class Scheduler:
sabnzbd.downloader.unpause_all()
sabnzbd.Downloader.set_paused_state(paused or paused_all)
-sabnzbd.PostProcessor.paused = pause_post
+# Handle pause_post state with proper notification
+if pause_post and not sabnzbd.PostProcessor.paused:
+sabnzbd.PostProcessor.pause()
+elif not pause_post and sabnzbd.PostProcessor.paused:
+sabnzbd.PostProcessor.resume()
if speedlimit is not None:
sabnzbd.Downloader.limit_speed(speedlimit)
@@ -506,11 +510,11 @@ def sort_schedules(all_events, now=None):
def pp_pause():
-sabnzbd.PostProcessor.paused = True
+sabnzbd.PostProcessor.pause()
def pp_resume():
-sabnzbd.PostProcessor.paused = False
+sabnzbd.PostProcessor.resume()
def enable_server(server):


@@ -442,7 +442,7 @@ SKIN_TEXT = {
"Select a mode and list all (un)wanted extensions. For example: <b>exe</b> or <b>exe, com</b>"
),
"opt-sfv_check": TT("Enable SFV-based checks"),
-"explain-sfv_check": TT("Do an extra verification based on SFV files."),
+"explain-sfv_check": TT("If no par2 files are available, use sfv files (if present) to verify files"),
"opt-script_can_fail": TT("User script can flag job as failed"),
"explain-script_can_fail": TT(
"When the user script returns a non-zero exit code, the job will be flagged as failed."
@@ -574,6 +574,11 @@ SKIN_TEXT = {
"For unreliable servers, will be ignored longer in case of failures"
), #: Explain server optional tickbox
"srv-enable": TT("Enable"), #: Enable server tickbox
+"srv-pipelining_requests": TT("Articles per request"),
+"explain-pipelining_requests": TT(
+"Request multiple articles per connection without waiting for each response first.<br />"
+"This can improve download speeds, especially on connections with higher latency."
+),
"button-addServer": TT("Add Server"), #: Button: Add server
"button-delServer": TT("Remove Server"), #: Button: Remove server
"button-testServer": TT("Test Server"), #: Button: Test server
@@ -686,10 +691,15 @@ SKIN_TEXT = {
"explain-pushbullet_device": TT("Device to which message should be sent"), #: Pushbullet settings
"opt-apprise_enable": TT("Enable Apprise notifications"), #: Apprise settings
"explain-apprise_enable": TT(
-"Send notifications using Apprise to almost any notification service"
+"Send notifications directly to any notification service you use.<br>"
+"For example: Slack, Discord, Telegram, or any service from over 100 supported services!"
), #: Apprise settings
+"opt-apprise_urls": TT("Use default Apprise URLs"), #: Apprise settings
+"explain-apprise_urls": TT(
+"Apprise defines service connection information using URLs.<br>"
+"Read the Apprise wiki how to define the URL for each service.<br>"
+"Use a comma and/or space to identify more than one URL."
+), #: Apprise settings
-"opt-apprise_urls": TT("Default Apprise URLs"), #: Apprise settings
-"explain-apprise_urls": TT("Use a comma and/or space to identify more than one URL."), #: Apprise settings
"explain-apprise_extra_urls": TT(
"Override the default URLs for specific notification types below, if desired."
), #: Apprise settings
@@ -894,6 +904,7 @@ SKIN_TEXT = {
"Glitter-notification-removing1": TT("Removing job"), # Notification window
"Glitter-notification-removing": TT("Removing jobs"), # Notification window
"Glitter-notification-shutdown": TT("Shutting down"), # Notification window
+"Glitter-notification-upload-failed": TT("Failed to upload file: %s"), # Notification window
# Wizard
"wizard-quickstart": TT("SABnzbd Quick-Start Wizard"),
"wizard-version": TT("SABnzbd Version"),
@@ -922,11 +933,9 @@ SKIN_TEXT = {
"wizard-test-server-required": TT("Click on Test Server before continuing"), #: Tooltip for disabled Next button
"restore-backup": TT("Restore backup"),
# Special
-"yourRights": TT(
-"""
+"yourRights": TT("""
SABnzbd comes with ABSOLUTELY NO WARRANTY.
This is free software, and you are welcome to redistribute it under certain conditions.
It is licensed under the GNU GENERAL PUBLIC LICENSE Version 2 or (at your option) any later version.
-"""
-),
+"""),
}
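The new "Articles per request" help text added above describes NNTP pipelining: sending several article requests on a connection before reading the first response, so each batch pays only one round-trip. A toy latency model (function name and numbers purely illustrative) shows why this matters most on high-latency links:

```python
def transfer_time_ms(articles: int, rtt_ms: int, per_article_ms: int, depth: int) -> int:
    """Toy model: each batch of `depth` pipelined requests costs one
    round-trip, plus the transfer time of every article in the batch."""
    batches = -(-articles // depth)  # ceiling division
    return batches * rtt_ms + articles * per_article_ms

# 100 articles, 100 ms round-trip, 50 ms transfer per article
print(transfer_time_ms(100, 100, 50, depth=1))   # 15000 ms: one article per round-trip
print(transfer_time_ms(100, 100, 50, depth=10))  # 6000 ms: ten articles per round-trip
```

In this model the per-article transfer cost is fixed; pipelining only amortizes the round-trip term, which is why the help text singles out connections with higher latency.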
