Compare commits

...

20 Commits

Author SHA1 Message Date
Safihre
e182707d3a Update text files for 4.6.0 Alpha 2 2025-12-08 22:36:51 +01:00
Safihre
05cbd9d7c4 Correct process_nzb_only_download and add tests 2025-12-08 11:42:22 +01:00
Safihre
6e8683349f Keep NZB name prefix when processing multiple NZBs
Closes #3217
2025-12-08 10:27:16 +01:00
Safihre
adb4816552 Update to Python 3.14.2 2025-12-08 10:27:16 +01:00
renovate[bot]
3914290c11 Update all dependencies (#3219)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-08 01:00:54 +00:00
Safihre
f76bf55b4a Handle aborted Direct Unpack better
Closes #3212
2025-12-05 16:18:17 +01:00
SABnzbd Automation
1cde764336 Update translatable texts
[skip ci]
2025-12-05 12:34:16 +00:00
mnightingale
44d94226ec Pipelining and performance optimisations (#3199)
* Pipelining and performance optimisations

* Refactor to remove handle_remainder and add on_response callback to allow inspection of NNTP messages

* Logic fix if there are sockets but nothing to read/write

* Fix logic errors for failed article requests

* Fix logic for reconfiguring servers

* Add guard_restart callback to pipelining_requests

* Fix article download stats

* Fix current article request shown via api

* Removal of DecodingStatus

* Fix circular reference

* Cleanup imports

* Handle reset_nw and hard_reset for inflight requests

* Improve __request_article behaviour using discard helper

* Article should be None here (before auth) but just in case

* Remove command_queue_condition, unnecessary with the pull rather than push queue system

* During reset discard any data received prior to sending quit request

* Circular references again

* Revert to using bytearray

* Revert "During reset discard any data received prior to sending quit request"

This reverts commit ed522e3e80.

* Simpler interaction with sabctools

* Temporarily use the sabctools streaming decoder branch

* Fix most uu tests

* Reduce maximum pipelining requests

* Fix the squiggly line

* Remove some LOG_ALL debug code

* Make get_articles return consistent (None) - it now populates the server deque

* Reduce NNTP_BUFFER_SIZE

* Rename PIPELINING_REQUESTS to DEF_PIPELINING_REQUESTS

* A little refactoring

* Reduce default pipelining until it is dynamic

* Use BoundedSemaphore and fix the unacquired release

* Use crc from sabctools for uu and make filename logic consistent with yenc

* Use sabctools 9.0.0

* Fix Check Before Download

* Move lock to NzbFile

* Use sabctools 9.1.0

* Minor change

* Fix 430 on check before download

* Update sabnews to work reliably with pipelining

* Minor tidy up

* Why does only Linux complain about this

* Leave this as it was

* Remove unused import

* Compare enum by identity

* Remove command_queue and just prepare a single request
Check if it should be sent and discard when paused

* Kick-start idle connections

* Modify events sockets are monitored for
2025-12-05 13:33:35 +01:00
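The pipelining change above keeps several article requests in flight on one connection instead of waiting for each response before sending the next. As a rough illustration of why that matters on high-latency links, here is a back-of-the-envelope timing model (a sketch only, not SABnzbd code; the function name, latency, and per-article figures are made up for the example):

```python
def transfer_time(articles: int, rtt: float, per_article: float, pipeline_depth: int = 1) -> float:
    """Rough time to fetch `articles` over one NNTP connection.

    With depth 1, every request pays a full round trip before the next
    one is sent; with depth N, up to N requests are in flight, so the
    round-trip cost is paid roughly once per batch instead of per article.
    """
    batches = -(-articles // pipeline_depth)  # ceiling division
    return batches * rtt + articles * per_article

# 1000 articles, 100 ms round trip, 20 ms server/transfer time each:
serial = transfer_time(1000, 0.100, 0.020)         # ~120 s
pipelined = transfer_time(1000, 0.100, 0.020, 50)  # ~22 s
```

The model ignores bandwidth limits and server-side queuing, but it shows why the gain grows with round-trip time: the latency term shrinks by roughly the pipelining depth.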
Safihre
e8e8fff5bf Prevent filepath creation before first article is processed (#3215) 2025-12-05 13:18:27 +01:00
SABnzbd Automation
1b04e07d40 Update translatable texts
[skip ci]
2025-12-04 14:14:01 +00:00
Safihre
54db889f05 Update sfv help text
Closes #3214 and #3213
2025-12-04 15:13:07 +01:00
Safihre
777d279267 Only clear work-flag for post processing when needed 2025-12-01 16:40:59 +01:00
Safihre
75be6b5850 Use Events to handle Post Processing queue
See #3209
2025-12-01 15:28:05 +01:00
Safihre
a4657e2bd3 Correct rar-version logging line 2025-12-01 11:36:10 +01:00
renovate[bot]
095b48ca47 Update all dependencies (#3210)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-12-01 01:32:06 +00:00
renovate[bot]
d459f69113 Update all dependencies (#3204)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-11-24 01:49:05 +00:00
mnightingale
2ecdd0b940 Optimise bpsmeter to avoid repeat lookups and try/except (#3203) 2025-11-22 14:46:21 +01:00
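The bpsmeter commit above removes repeated dictionary lookups and `try`/`except` from a hot accounting path. A generic sketch of that kind of micro-optimisation (hypothetical function names; not the actual bpsmeter code):

```python
def add_bytes_try(totals: dict, server: str, amount: int) -> None:
    # Exception-based: cheap on a hit, but pays for exception machinery
    # every time the key is missing.
    try:
        totals[server] += amount
    except KeyError:
        totals[server] = amount


def add_bytes_get(totals: dict, server: str, amount: int) -> None:
    # get() with a default: a single lookup path, no exceptions raised.
    totals[server] = totals.get(server, 0) + amount
```

Both produce identical totals; the `get()` form avoids the exception path and keeps the per-call cost predictable, which adds up when the function runs for every received data chunk.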
Safihre
73a4ad50e5 Allow longer build timeout for Snap arm64 build 2025-11-21 12:02:41 +01:00
Safihre
9b59e24961 Lower macOS build version to support older clients 2025-11-21 11:50:45 +01:00
Safihre
27e164763e Remove unused imports and shorten build timeouts 2025-11-21 11:29:53 +01:00
52 changed files with 1159 additions and 892 deletions

View File

@@ -10,9 +10,9 @@ jobs:
   build_windows:
     name: Build Windows binary
     runs-on: windows-2022
-    timeout-minutes: 30
+    timeout-minutes: 15
     steps:
-      - uses: actions/checkout@v5
+      - uses: actions/checkout@v6
       - name: Set up Python
         uses: actions/setup-python@v6
         with:
@@ -84,18 +84,18 @@ jobs:
   build_macos:
     name: Build macOS binary
     runs-on: macos-14
-    timeout-minutes: 30
+    timeout-minutes: 15
     env:
       # We need the official Python, because the GA ones only support newer macOS versions
       # The deployment target is picked up by the Python build tools automatically
       # If updated, make sure to also set LSMinimumSystemVersion in SABnzbd.spec
-      PYTHON_VERSION: "3.14.0"
+      PYTHON_VERSION: "3.14.2"
       MACOSX_DEPLOYMENT_TARGET: "10.15"
       # We need to force compile for universal2 support
       CFLAGS: -arch x86_64 -arch arm64
       ARCHFLAGS: -arch x86_64 -arch arm64
     steps:
-      - uses: actions/checkout@v5
+      - uses: actions/checkout@v6
       - name: Set up Python
         # Only use this for the caching of pip packages!
         uses: actions/setup-python@v6
@@ -172,7 +172,7 @@ jobs:
           linux_arch: arm64
     steps:
-      - uses: actions/checkout@v5
+      - uses: actions/checkout@v6
       - name: Cache par2cmdline-turbo tarball
         uses: actions/cache@v4
         id: cache-par2cmdline
@@ -215,7 +215,7 @@ jobs:
     runs-on: ubuntu-latest
     needs: [build_windows, build_macos]
     steps:
-      - uses: actions/checkout@v5
+      - uses: actions/checkout@v6
       - name: Set up Python
         uses: actions/setup-python@v6
         with:

View File

@@ -7,7 +7,7 @@ jobs:
     name: Black Code Formatter
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v5
+      - uses: actions/checkout@v6
       - name: Black Code Formatter
         uses: lgeiger/black-action@master
         with:
@@ -43,7 +43,7 @@ jobs:
           python-version: "3.14"
     steps:
-      - uses: actions/checkout@v5
+      - uses: actions/checkout@v6
       - name: Set up Python ${{ matrix.python-version }}
         uses: actions/setup-python@v6
         with:

View File

@@ -12,7 +12,7 @@ jobs:
     env:
       TX_TOKEN: ${{ secrets.TX_TOKEN }}
     steps:
-      - uses: actions/checkout@v5
+      - uses: actions/checkout@v6
         with:
           token: ${{ secrets.AUTOMATION_GITHUB_TOKEN }}
       - name: Generate translatable texts

View File

@@ -1,13 +1,19 @@
-Release Notes - SABnzbd 4.6.0 Alpha 1
+Release Notes - SABnzbd 4.6.0 Alpha 2
 =========================================================
-This is the first test release of version 4.6.
+This is the second test release of version 4.6.
 ## New features in 4.6.0
 * Added default support for NNTP Pipelining which eliminates idle waiting
   between requests, significantly improving speeds on high-latency connections.
   Read more here: https://sabnzbd.org/wiki/advanced/nntp-pipelining
 * Dynamically increase Assembler limits on faster connections.
 * Improved disk speed measurement in Status window.
 * Enable `verify_xff_header` by default.
+* Reduce delays between jobs during post-processing.
+* If a download only has `.nzb` files inside, the new downloads
+  will include the name of the original download.
 * Dropped support for Python 3.8.
 ## Bug fixes since 4.5.0
@@ -15,6 +21,7 @@ This is the first test release of version 4.6.
 * `Check before download` could get stuck or fail to reject.
 * Windows: Tray icon disappears after Explorer restart.
 * Correct mobile layout if `Full Width` is enabled.
+* Aborted Direct Unpack could result in no files being unpacked.
 * macOS: Slow to start on some network setups.

View File

@@ -1,9 +1,9 @@
 # Basic build requirements
 # Note that not all sub-dependencies are listed, but only ones we know could cause trouble
-pyinstaller==6.16.0
+pyinstaller==6.17.0
 packaging==25.0
-pyinstaller-hooks-contrib==2025.9
-altgraph==0.17.4
+pyinstaller-hooks-contrib==2025.10
+altgraph==0.17.5
 wrapt==2.0.1
 setuptools==80.9.0
@@ -12,8 +12,8 @@ pefile==2024.8.26; sys_platform == 'win32'
 pywin32-ctypes==0.2.3; sys_platform == 'win32'
 # For the macOS build
-dmgbuild==1.6.5; sys_platform == 'darwin'
-mac-alias==2.2.2; sys_platform == 'darwin'
-macholib==1.16.3; sys_platform == 'darwin'
-ds-store==1.3.1; sys_platform == 'darwin'
+dmgbuild==1.6.6; sys_platform == 'darwin'
+mac-alias==2.2.3; sys_platform == 'darwin'
+macholib==1.16.4; sys_platform == 'darwin'
+ds-store==1.3.2; sys_platform == 'darwin'
 PyNaCl==1.6.1; sys_platform == 'darwin'

View File

@@ -508,11 +508,6 @@ msgstr ""
 msgid "Fatal error in Downloader"
 msgstr ""
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr ""
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr ""
@@ -530,11 +525,6 @@ msgstr ""
 msgid "Connecting %s@%s failed, message=%s"
 msgstr ""
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr ""
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr ""
@@ -1127,6 +1117,16 @@ msgstr ""
 msgid "left"
 msgstr ""
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr ""
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr ""
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr ""
@@ -3127,7 +3127,7 @@ msgid "Enable SFV-based checks"
 msgstr ""
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
+msgid "If no par2 files are available, use sfv files (if present) to verify files"
 msgstr ""
 #: sabnzbd/skintext.py

View File

@@ -560,11 +560,6 @@ msgstr ""
 msgid "Fatal error in Downloader"
 msgstr ""
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr ""
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Příliš mnoho spojení k serveru %s [%s]"
@@ -584,11 +579,6 @@ msgstr "Přihlášení k serveru %s se nezdařilo [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr ""
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Nejspíše chyba downloaderu"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Vypínání"
@@ -1207,6 +1197,16 @@ msgstr "Zkouším SFV ověření"
 msgid "left"
 msgstr ""
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr ""
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Nejspíše chyba downloaderu"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Tento server nepovoluje SSL na tomto portu"
@@ -3287,7 +3287,8 @@ msgid "Enable SFV-based checks"
 msgstr ""
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
 msgstr ""
 #: sabnzbd/skintext.py

View File

@@ -585,11 +585,6 @@ msgstr "Det lykkedes ikke at initialisere %s@%s med begrundelse %s"
 msgid "Fatal error in Downloader"
 msgstr "Alvorlig fejl i Downloader"
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr "%s@%s: Modtog ukendt statuskode %s for artikel %s"
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Alt for mange forbindelser til serveren %s [%s]"
@@ -611,11 +606,6 @@ msgstr "Det lykkedes ikke at logge på serveren %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "Forbindelse %s@%s mislykkedes, besked %s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Suspect fejl i downloader"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Påbegynder lukning af SABnzbd"
@@ -1248,6 +1238,16 @@ msgstr "Forsøger SFV verifikation"
 msgid "left"
 msgstr "tilbage"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr "%s@%s: Modtog ukendt statuskode %s for artikel %s"
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Suspect fejl i downloader"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Denne server tillader ikke SSL på denne port"
@@ -3428,8 +3428,9 @@ msgid "Enable SFV-based checks"
 msgstr "Aktiver SFV-baseret kontrol"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Udfør en ekstra kontrol baseret på SFV-filer."
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -617,11 +617,6 @@ msgstr "Fehler %s@%s zu initialisieren, aus folgendem Grund: %s"
 msgid "Fatal error in Downloader"
 msgstr "Schwerer Fehler im Downloader"
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr "%s@%s:Unbekannter Statuscode%s für Artikel erhalten %s"
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Zu viele Verbindungen zu Server %s [%s]"
@@ -643,11 +638,6 @@ msgstr "Anmelden beim Server fehlgeschlagen. %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "Fehler beim Verbinden mit %s@%s, Meldung = %s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Vermute Fehler im Downloader"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Wird beendet …"
@@ -1293,6 +1283,16 @@ msgstr "Versuche SFV-Überprüfung"
 msgid "left"
 msgstr "rest"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr "%s@%s:Unbekannter Statuscode%s für Artikel erhalten %s"
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Vermute Fehler im Downloader"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Dieser Server erlaubt kein SSL auf diesem Port"
@@ -3519,8 +3519,9 @@ msgid "Enable SFV-based checks"
 msgstr "SFV-basierte Überprüfung aktivieren"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Zusätzliche Überprüfung mittels SFV-Dateien durchführen"
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -602,12 +602,6 @@ msgstr "Error al inicializar %s@%s con la razón: %s"
 msgid "Fatal error in Downloader"
 msgstr "Error grave en el descargador"
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr ""
-"%s@%s: Se recibió un código de estado desconocido %s para el artículo %s"
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Demasiadas conexiones con el servidor %s [%s]"
@@ -629,11 +623,6 @@ msgstr "Registraccion fallo para servidor %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "Ha fallado la conexión a %s@%s, el mensaje=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Error sospechoso en downloader"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Apagando"
@@ -1282,6 +1271,17 @@ msgstr "Intentando verificación por SFV"
 msgid "left"
 msgstr "Restante"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr ""
+"%s@%s: Se recibió un código de estado desconocido %s para el artículo %s"
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Error sospechoso en downloader"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Este servidor no permite SSL en este puerto"
@@ -3495,8 +3495,9 @@ msgid "Enable SFV-based checks"
 msgstr "Habilitar verificacion basada en SFV"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Realiza una verificación extra basada en ficheros SFV."
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -556,11 +556,6 @@ msgstr "Alustaminen epäonnistui kohteessa %s@%s syy: %s"
 msgid "Fatal error in Downloader"
 msgstr ""
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr ""
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Liikaa yhteyksiä palvelimelle %s [%s]"
@@ -580,11 +575,6 @@ msgstr "Kirjautuminen palvelimelle %s epäonnistui [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "Yhdistäminen %s@%s epäonnistui, viesti=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Mahdollinen virhe lataajassa"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Sammutetaan"
@@ -1208,6 +1198,16 @@ msgstr "Yritetään SFV varmennusta"
 msgid "left"
 msgstr "jäljellä"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr ""
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Mahdollinen virhe lataajassa"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Tämä palvelin ei salli SSL yhteyksiä tähän porttiin"
@@ -3363,8 +3363,9 @@ msgid "Enable SFV-based checks"
 msgstr "SFV-pohjaiset tarkistukset käytössä"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Suorittaa ylimääräisen varmennuksen SFV tiedostojen avulla."
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -606,11 +606,6 @@ msgstr "Échec d'initialisation de %s@%s pour la raison suivante : %s"
 msgid "Fatal error in Downloader"
 msgstr "Erreur fatale dans le Téléchargeur"
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr "%s@%s a reçu le code d'état inconnu %s pour l'article %s"
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Trop de connexions au serveur %s [%s]"
@@ -632,11 +627,6 @@ msgstr "Échec de la connexion au serveur %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "La connexion à %s@%s a échoué, message=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Erreur suspecte dans le téléchargeur"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Arrêt en cours..."
@@ -1282,6 +1272,16 @@ msgstr "Essai vérification SFV"
 msgid "left"
 msgstr "restant"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr "%s@%s a reçu le code d'état inconnu %s pour l'article %s"
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Erreur suspecte dans le téléchargeur"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Ce serveur n'authorise pas de connexion SSL sur ce port"
@@ -3502,8 +3502,9 @@ msgid "Enable SFV-based checks"
 msgstr "Activer les contrôles SFV"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Fait une vérification supplémentaire basée sur les fichiers SFV."
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -567,11 +567,6 @@ msgstr "כישלון באתחול %s@%s עם סיבה: %s"
 msgid "Fatal error in Downloader"
 msgstr "שגיאה גורלית במורידן"
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr "%s@%s: קוד בלתי ידוע של מעמד התקבל %s עבור מאמר %s"
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "יותר מדי חיבורים לשרת %s [%s]"
@@ -593,11 +588,6 @@ msgstr "כניסה נכשלה עבור שרת %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "התחברות אל %s@%s נכשלה, הודעה=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "הורדה חשודה במורידן"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "מכבה"
@@ -1221,6 +1211,16 @@ msgstr "מנסה וידוא SFV"
 msgid "left"
 msgstr "נותר"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr "%s@%s: קוד בלתי ידוע של מעמד התקבל %s עבור מאמר %s"
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "הורדה חשודה במורידן"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "שרת זה אינו מתיר SSL על פתחה זו"
@@ -3377,8 +3377,9 @@ msgid "Enable SFV-based checks"
 msgstr "אפשר בדיקות מבוססות SFV"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "בצע וידוא נוסף שמבוסס על קבצי SFV."
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -599,11 +599,6 @@ msgstr "Inizializzazione di %s@%s fallita con motivo: %s"
 msgid "Fatal error in Downloader"
 msgstr "Errore fatale nel Downloader"
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr "%s@%s: Ricevuto codice di stato sconosciuto %s per l'articolo %s"
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Troppe connessioni al server %s [%s]"
@@ -625,11 +620,6 @@ msgstr "Accesso fallito per il server %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "Connessione a %s@%s fallita, messaggio=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Sospetto errore nel downloader"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Spegnimento in corso"
@@ -1269,6 +1259,16 @@ msgstr "Tentativo di verifica SFV"
 msgid "left"
 msgstr "rimanente"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr "%s@%s: Ricevuto codice di stato sconosciuto %s per l'articolo %s"
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Sospetto errore nel downloader"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Questo server non permette SSL su questa porta"
@@ -3471,8 +3471,9 @@ msgid "Enable SFV-based checks"
 msgstr "Abilita controlli basati su SFV"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Esegui una verifica extra basata sui file SFV."
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -553,11 +553,6 @@ msgstr "Feilet å starte %s@%s grunnet: %s"
 msgid "Fatal error in Downloader"
 msgstr ""
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr ""
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "For mange tilkoblinger til server %s [%s]"
@@ -577,11 +572,6 @@ msgstr "Kunne ikke logge inn på server %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "Kontaker %s@%s feilet, feilmelding=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Mistenker feil i nedlaster"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Starter avslutning av SABnzbd.."
@@ -1206,6 +1196,16 @@ msgstr "Prøver SFV-verifisering"
 msgid "left"
 msgstr "gjenstår"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr ""
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Mistenker feil i nedlaster"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Denne serveren tillater ikke SSL på denne porten"
@@ -3346,8 +3346,9 @@ msgid "Enable SFV-based checks"
 msgstr "Aktiver SFV-baserte sjekker"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Utfør ekstra verifisering basert på SFV filer"
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -600,11 +600,6 @@ msgstr "Initialisatie van %s@%s mislukt, vanwege: %s"
 msgid "Fatal error in Downloader"
 msgstr "Onherstelbare fout in de Downloader"
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr "%s@%s: Onbekende statuscode %s ontvangen voor artikel %s"
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Te veel verbindingen met server %s [%s]"
@@ -626,11 +621,6 @@ msgstr "Aanmelden bij server %s mislukt [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "Verbinding %s@%s mislukt, bericht=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Vedachte fout in downloader"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Afsluiten"
@@ -1272,6 +1262,16 @@ msgstr "Probeer SFV-verificatie"
 msgid "left"
 msgstr "over"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr "%s@%s: Onbekende statuscode %s ontvangen voor artikel %s"
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Vedachte fout in downloader"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "De server staat geen SSL toe op deze poort"
@@ -3469,8 +3469,9 @@ msgid "Enable SFV-based checks"
 msgstr "Voer SFV-gebaseerde controles uit"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Doe een extra verificatie m.b.v. SFV-bestanden"
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -554,11 +554,6 @@ msgstr "Błąd podczas inicjalizacji %s@%s: %s"
 msgid "Fatal error in Downloader"
 msgstr ""
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr ""
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Zbyt wiele połączeń do serwera %s [%s]"
@@ -578,11 +573,6 @@ msgstr "Błąd logowania do serwera %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "Błąd połączenia %s@%s, komunikat=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Nieobsługiwany błąd w module pobierania"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Wyłączanie"
@@ -1211,6 +1201,16 @@ msgstr "Próba weryfikacji SFV"
 msgid "left"
 msgstr "pozostało"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr ""
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Nieobsługiwany błąd w module pobierania"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Serwer nie obsługuje SSL na tym porcie"
@@ -3357,8 +3357,9 @@ msgid "Enable SFV-based checks"
 msgstr "Włącz sprawdzanie przy użyciu SFV"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Wykonuj dodatkową weryfikację na podstawie plików SFV"
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -568,11 +568,6 @@ msgstr "Falha ao iniciar %s@%s devido as seguintes razões: %s"
 msgid "Fatal error in Downloader"
 msgstr ""
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr ""
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Excesso de conexões ao servidor %s [%s]"
@@ -592,11 +587,6 @@ msgstr "Falha de logon ao servidor %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "A conexão a %s@%s falhou. Mensagem=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Erro suspeito no downloader"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Encerrando"
@@ -1220,6 +1210,16 @@ msgstr "Tentando verificação SFV"
 msgid "left"
 msgstr "restantes"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr ""
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Erro suspeito no downloader"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Este servidor não permite SSL nesta porta"
@@ -3367,8 +3367,9 @@ msgid "Enable SFV-based checks"
 msgstr "Habilitar verificações baseadas em SFV"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Fazer uma verificação extra baseada em arquivos SFV."
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -576,11 +576,6 @@ msgstr "Nu am putu inițializa %s@%s din cauza următorului motiv: %s"
 msgid "Fatal error in Downloader"
 msgstr ""
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr ""
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr "Prea multe conexiuni la serverul %s [%s]"
@@ -600,11 +595,6 @@ msgstr "Autentificare nereuşită la serverul %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr "Conectare %s@%s eșuată, mesaj=%s"
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr "Eroare suspectă în sistemul de descprcare"
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Închidere"
@@ -1236,6 +1226,16 @@ msgstr "Încerc verificare SFV"
 msgid "left"
 msgstr "rămas"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr ""
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr "Eroare suspectă în sistemul de descprcare"
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr "Acest server nu permite SSL pe acest port"
@@ -3385,8 +3385,9 @@ msgid "Enable SFV-based checks"
 msgstr "Activează verficări SFV"
 #: sabnzbd/skintext.py
-msgid "Do an extra verification based on SFV files."
-msgstr "Fă o verificare extra bazată pe fişiere SFV"
+msgid ""
+"If no par2 files are available, use sfv files (if present) to verify files"
+msgstr ""
 #: sabnzbd/skintext.py
 msgid "User script can flag job as failed"

View File

@@ -552,11 +552,6 @@ msgstr ""
 msgid "Fatal error in Downloader"
 msgstr ""
-#. Warning message
-#: sabnzbd/downloader.py
-msgid "%s@%s: Received unknown status code %s for article %s"
-msgstr ""
 #: sabnzbd/downloader.py
 msgid "Too many connections to server %s [%s]"
 msgstr ""
@@ -576,11 +571,6 @@ msgstr "Ошибка входа на сервер %s [%s]"
 msgid "Connecting %s@%s failed, message=%s"
 msgstr ""
-#. Error message
-#: sabnzbd/downloader.py
-msgid "Suspect error in downloader"
-msgstr ""
 #: sabnzbd/downloader.py, sabnzbd/skintext.py
 msgid "Shutting down"
 msgstr "Завершение работы"
@@ -1206,6 +1196,16 @@ msgstr "Проверка SFV-суммы"
 msgid "left"
 msgstr "осталось"
+#. Warning message
+#: sabnzbd/newswrapper.py
+msgid "%s@%s: Received unknown status code %s for article %s"
+msgstr ""
+#. Error message
+#: sabnzbd/newswrapper.py
+msgid "Suspect error in downloader"
+msgstr ""
 #: sabnzbd/newswrapper.py
 msgid "This server does not allow SSL on this port"
 msgstr ""
@@ -3349,8 +3349,9 @@ msgid "Enable SFV-based checks"
msgstr "Использовать проверку по SFV"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Выполнять дополнительную проверку по SFV-файлам."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"


@@ -550,11 +550,6 @@ msgstr "Neuspešna inicijalizacija %s@%s iz razloga: %s"
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "Previše konekcija ka serveru %s [%s]"
@@ -574,11 +569,6 @@ msgstr "Неуспешно пријављивање на сервер %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Povezivanje na %s@%s neuspešno, poruka=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Sumnja u grešku u programu za download"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Гашење"
@@ -1201,6 +1191,16 @@ msgstr "Pokušaj SFV provere"
msgid "left"
msgstr "остало"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Sumnja u grešku u programu za download"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Ovaj server ne dozvoljava SSL na ovom portu"
@@ -3335,8 +3335,9 @@ msgid "Enable SFV-based checks"
msgstr "Упали SFV провере"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Уради још једну проверу базирану на SFV датотеке."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"


@@ -550,11 +550,6 @@ msgstr "Misslyckades att initiera %s@%s med orsak %s"
msgid "Fatal error in Downloader"
msgstr ""
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "För många anslutningar till servern %s [%s]"
@@ -574,11 +569,6 @@ msgstr "Det gick inte att logga in på server %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "Anslutning %s@%s misslyckades, meddelande=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "Misstänker fel i nedladdare"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Påbörjar nedstängning av SABnzbd.."
@@ -1205,6 +1195,16 @@ msgstr "Försöker verifiera SFV"
msgid "left"
msgstr "kvar"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr ""
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "Misstänker fel i nedladdare"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Den här servern tillåter inte SSL på denna port"
@@ -3345,8 +3345,9 @@ msgid "Enable SFV-based checks"
msgstr "Använd SFV-baserade kontroller"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "Gör en extra kontroll med SFV filer"
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"


@@ -597,11 +597,6 @@ msgstr "%s@%s başlatması şu sebepten dolayı başarısız oldu: %s"
msgid "Fatal error in Downloader"
msgstr "İndirici'de ölümcül hata"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: bilinmeyen durum kodu %s, şu makale için alındı: %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "%s [%s] sunucusuna çok fazla bağlantı"
@@ -623,11 +618,6 @@ msgstr "%s [%s] sunucusunda oturum açılışı başarısız oldu"
msgid "Connecting %s@%s failed, message=%s"
msgstr "%s@%s bağlantısı başarısız oldu, mesaj=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "İndiricide şüpheli hata"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "Kapatılıyor"
@@ -1260,6 +1250,16 @@ msgstr "SFV doğrulaması deneniyor"
msgid "left"
msgstr "kaldı"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s: bilinmeyen durum kodu %s, şu makale için alındı: %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "İndiricide şüpheli hata"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "Bu sunucu, bu bağlantı noktasında SSL kullanımına izin vermiyor"
@@ -3458,8 +3458,9 @@ msgid "Enable SFV-based checks"
msgstr "SFV temelli kontrolleri etkinleştir"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "SFV dosyalarına dayalı ilave bir doğrulama yap."
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"


@@ -550,11 +550,6 @@ msgstr "无法初始化 %s@%s原因为: %s"
msgid "Fatal error in Downloader"
msgstr "下载器出现致命错误"
#. Warning message
#: sabnzbd/downloader.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s收到文章 %s 的未知状态码 %s"
#: sabnzbd/downloader.py
msgid "Too many connections to server %s [%s]"
msgstr "服务器 %s 连接数过多 [%s]"
@@ -574,11 +569,6 @@ msgstr "无法登录服务器 %s [%s]"
msgid "Connecting %s@%s failed, message=%s"
msgstr "连接 %s@%s 失败,消息=%s"
#. Error message
#: sabnzbd/downloader.py
msgid "Suspect error in downloader"
msgstr "下载器疑似错误"
#: sabnzbd/downloader.py, sabnzbd/skintext.py
msgid "Shutting down"
msgstr "正在关闭"
@@ -1197,6 +1187,16 @@ msgstr "正在尝试 SFV 验证"
msgid "left"
msgstr "剩余"
#. Warning message
#: sabnzbd/newswrapper.py
msgid "%s@%s: Received unknown status code %s for article %s"
msgstr "%s@%s收到文章 %s 的未知状态码 %s"
#. Error message
#: sabnzbd/newswrapper.py
msgid "Suspect error in downloader"
msgstr "下载器疑似错误"
#: sabnzbd/newswrapper.py
msgid "This server does not allow SSL on this port"
msgstr "该服务器不允许在该端口使用 SSL"
@@ -3302,8 +3302,9 @@ msgid "Enable SFV-based checks"
msgstr "启用基于 SFV 的检查"
#: sabnzbd/skintext.py
msgid "Do an extra verification based on SFV files."
msgstr "根据 SFV 文件进行额外验证。"
msgid ""
"If no par2 files are available, use sfv files (if present) to verify files"
msgstr ""
#: sabnzbd/skintext.py
msgid "User script can flag job as failed"


@@ -1,8 +1,8 @@
# Main requirements
# Note that not all sub-dependencies are listed, but only ones we know could cause trouble
apprise==1.9.5
sabctools==8.2.6
CT3==3.4.0
apprise==1.9.6
sabctools==9.1.0
CT3==3.4.0.post5
cffi==2.0.0
pycparser==2.23
feedparser==6.0.12
@@ -37,7 +37,7 @@ cryptography==46.0.3
# We recommend using "orjson" as it is 2x as fast as "ujson". However, it requires
# Rust so SABnzbd works just as well with "ujson" or the Python built in "json" module
ujson==5.11.0
orjson==3.11.4
orjson==3.11.5
# Windows system integration
pywin32==311; sys_platform == 'win32'
@@ -67,7 +67,7 @@ paho-mqtt==1.6.1 # Pinned, newer versions don't work with AppRise yet
# Requests Requirements
charset_normalizer==3.4.4
idna==3.11
urllib3==2.5.0
urllib3==2.6.0
certifi==2025.11.12
oauthlib==3.3.1
PyJWT==2.10.1


@@ -269,6 +269,7 @@ def initialize(pause_downloader=False, clean_up=False, repair=0):
cfg.language.callback(cfg.guard_language)
cfg.enable_https_verification.callback(cfg.guard_https_ver)
cfg.guard_https_ver()
cfg.pipelining_requests.callback(cfg.guard_restart)
# Set language files
lang.set_locale_info("SABnzbd", DIR_LANGUAGE)
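The `cfg.pipelining_requests.callback(cfg.guard_restart)` registration above follows the option-callback pattern SABnzbd uses throughout `initialize()`: a function is attached to a config option and fires whenever the option changes. A minimal sketch of that pattern (class and names hypothetical, not the actual `sabnzbd.cfg` implementation):

```python
class Option:
    """Hypothetical stand-in for a SABnzbd config option with a change callback."""

    def __init__(self, value):
        self._value = value
        self._callback = None

    def callback(self, func):
        # Register a function to run whenever the option changes
        self._callback = func

    def set(self, value):
        self._value = value
        if self._callback:
            self._callback()


changed = []
opt = Option(2)
opt.callback(lambda: changed.append(True))
opt.set(3)
```

In the real code the registered callback is `cfg.guard_restart`, which flags that a restart is needed for the new value to take effect.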


@@ -30,6 +30,8 @@ import cherrypy
from threading import Thread
from typing import Optional, Any, Union
import sabctools
# For json.dumps, orjson is magnitudes faster than ujson, but it is harder to
# compile due to Rust dependency. Since the output is the same, we support all modules.
try:
@@ -781,12 +783,12 @@ def _api_watched_now(name: str, kwargs: dict[str, Union[str, list[str]]]) -> byt
def _api_resume_pp(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
sabnzbd.PostProcessor.paused = False
sabnzbd.PostProcessor.resume()
return report()
def _api_pause_pp(name: str, kwargs: dict[str, Union[str, list[str]]]) -> bytes:
sabnzbd.PostProcessor.paused = True
sabnzbd.PostProcessor.pause()
return report()
@@ -1387,12 +1389,20 @@ def test_nntp_server_dict(kwargs: dict[str, Union[str, list[str]]]) -> tuple[boo
# Sorry, no clever analysis:
return False, T('Server address "%s:%s" is not valid.') % (host, port)
nw = NewsWrapper(server=test_server, thrdnum=-1, block=True)
nntp_code: int = 0
nntp_message: str = ""
def on_response(code: int, message: str):
nonlocal nntp_code, nntp_message
nntp_code = code
nntp_message = message
try:
nw = NewsWrapper(server=test_server, thrdnum=-1, block=True)
nw.init_connect()
while not nw.connected:
nw.recv_chunk()
nw.finish_connect(nw.status_code)
nw.write()
nw.read(on_response=on_response)
except socket.timeout:
if port != 119 and not ssl:
@@ -1414,30 +1424,30 @@ def test_nntp_server_dict(kwargs: dict[str, Union[str, list[str]]]) -> tuple[boo
return False, str(err)
if not username or not password:
nw.nntp.sock.sendall(b"ARTICLE <test@home>\r\n")
nw.queue_command(b"ARTICLE <test@home>\r\n")
try:
nw.reset_data_buffer()
nw.recv_chunk()
nw.write()
nw.read(on_response=on_response)
except Exception as err:
# Some internal error, not always safe to close connection
return False, str(err)
# Parse result
return_status = ()
if nw.status_code:
if nw.status_code == 480:
if nntp_code:
if nntp_code == 480:
return_status = (False, T("Server requires username and password."))
elif nw.status_code < 300 or nw.status_code in (411, 423, 430):
elif nntp_code < 300 or nntp_code in (411, 423, 430):
# If no username/password set and we requested fake-article, it will return 430 Not Found
return_status = (True, T("Connection Successful!"))
elif nw.status_code == 502 or sabnzbd.downloader.clues_login(nw.nntp_msg):
elif nntp_code == 502 or sabnzbd.downloader.clues_login(nntp_message):
return_status = (False, T("Authentication failed, check username/password."))
elif sabnzbd.downloader.clues_too_many(nw.nntp_msg):
elif sabnzbd.downloader.clues_too_many(nntp_message):
return_status = (False, T("Too many connections, please pause downloading or try again later"))
# Fallback in case no data was received or unknown status
if not return_status:
return_status = (False, T("Could not determine connection result (%s)") % nw.nntp_msg)
return_status = (False, T("Could not determine connection result (%s)") % nntp_message)
# Close the connection and return result
nw.hard_reset()
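The server test above replaces direct reads of `nw.status_code` with an `on_response` callback that writes the NNTP code and message into enclosing-scope variables via `nonlocal`. The capture pattern in isolation (the `FakeTransport` and `probe` names are hypothetical stand-ins, not SABnzbd API):

```python
def probe(transport) -> tuple[int, str]:
    """Drive any object with read(on_response=...) semantics and
    capture the last reported status, mirroring the pattern above."""
    code, message = 0, ""

    def on_response(c: int, m: str):
        # nonlocal lets the callback write back into the enclosing scope
        nonlocal code, message
        code, message = c, m

    transport.read(on_response=on_response)
    return code, message


class FakeTransport:
    """Hypothetical transport that reports one fixed NNTP response."""

    def read(self, on_response):
        on_response(480, "Authentication required")
```

This lets the result of an asynchronous read loop be inspected after the fact without storing state on the wrapper itself.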
@@ -1502,13 +1512,13 @@ def build_status(calculate_performance: bool = False, skip_dashboard: bool = Fal
for nw in server.busy_threads.copy():
if nw.connected:
activeconn += 1
if nw.article:
if article := nw.article:
serverconnections.append(
{
"thrdnum": nw.thrdnum,
"art_name": nw.article.article,
"nzf_name": nw.article.nzf.filename,
"nzo_name": nw.article.nzf.nzo.final_name,
"art_name": article.article,
"nzf_name": article.nzf.filename,
"nzo_name": article.nzf.nzo.final_name,
}
)


@@ -39,7 +39,7 @@ from sabnzbd.filesystem import (
has_unwanted_extension,
get_basename,
)
from sabnzbd.constants import Status, GIGI, DEF_MAX_ASSEMBLER_QUEUE
from sabnzbd.constants import Status, GIGI
import sabnzbd.cfg as cfg
from sabnzbd.nzbstuff import NzbObject, NzbFile
import sabnzbd.par2file as par2file


@@ -254,8 +254,6 @@ class BPSMeter:
self.week_total[server] = 0
if server not in self.month_total:
self.month_total[server] = 0
if server not in self.month_total:
self.month_total[server] = 0
if server not in self.grand_total:
self.grand_total[server] = 0
if server not in self.timeline_total:
@@ -302,45 +300,51 @@ class BPSMeter:
for server in sabnzbd.Downloader.servers[:]:
self.init_server_stats(server.id)
# Cache dict references for faster access
day_total = self.day_total
week_total = self.week_total
month_total = self.month_total
grand_total = self.grand_total
timeline_total = self.timeline_total
cached_amount = self.cached_amount
server_bps = self.server_bps
start_time = self.start_time
last_update = self.last_update
# Minimum epsilon to avoid division by zero
dt_total = max(t - start_time, 1e-6)
dt_last = max(last_update - start_time, 1e-6)
# Add amounts that have been stored temporarily to statistics
for srv in self.cached_amount:
if self.cached_amount[srv]:
self.day_total[srv] += self.cached_amount[srv]
self.week_total[srv] += self.cached_amount[srv]
self.month_total[srv] += self.cached_amount[srv]
self.grand_total[srv] += self.cached_amount[srv]
self.timeline_total[srv][self.day_label] += self.cached_amount[srv]
if cached := self.cached_amount[srv]:
day_total[srv] += cached
week_total[srv] += cached
month_total[srv] += cached
grand_total[srv] += cached
timeline_total[srv][self.day_label] += cached
# Reset for next time
cached_amount[srv] = 0
# Update server bps
try:
self.server_bps[srv] = (
self.server_bps[srv] * (self.last_update - self.start_time) + self.cached_amount[srv]
) / (t - self.start_time)
except ZeroDivisionError:
self.server_bps[srv] = 0.0
# Reset for next time
self.cached_amount[srv] = 0
server_bps[srv] = (server_bps[srv] * dt_last + cached) / dt_total
# Quota check
total_cached = self.sum_cached_amount
if self.have_quota and self.quota_enabled:
self.left -= self.sum_cached_amount
self.left -= total_cached
self.check_quota()
# Speedometer
try:
self.bps = (self.bps * (self.last_update - self.start_time) + self.sum_cached_amount) / (
t - self.start_time
)
except ZeroDivisionError:
self.bps = 0.0
self.bps = (self.bps * dt_last + total_cached) / dt_total
self.sum_cached_amount = 0
self.last_update = t
check_time = t - 5.0
if self.start_time < check_time:
if start_time < check_time:
self.start_time = check_time
if self.bps < 0.01:
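The refactored speedometer replaces the `try/except ZeroDivisionError` with epsilon-clamped time deltas and computes a time-weighted running average. Stripped of the caching machinery, the update is (function name hypothetical):

```python
def update_bps(bps: float, start_time: float, last_update: float,
               now: float, new_bytes: int) -> float:
    """Time-weighted running average of bytes/second.

    Mirrors the formula above: previous average weighted by the elapsed
    window, plus newly counted bytes, over the total elapsed time.
    Epsilon clamps replace the old ZeroDivisionError handling.
    """
    dt_total = max(now - start_time, 1e-6)
    dt_last = max(last_update - start_time, 1e-6)
    return (bps * dt_last + new_bytes) / dt_total
```

For example, 10 s at 100 B/s (1000 B) followed by 3000 B in the next 10 s averages to 200 B/s over the 20 s window.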


@@ -53,6 +53,7 @@ from sabnzbd.constants import (
DEF_HTTPS_CERT_FILE,
DEF_HTTPS_KEY_FILE,
DEF_MAX_ASSEMBLER_QUEUE,
DEF_PIPELINING_REQUESTS,
)
from sabnzbd.filesystem import same_directory, real_path, is_valid_script, is_network_path
@@ -534,6 +535,7 @@ ssdp_broadcast_interval = OptionNumber("misc", "ssdp_broadcast_interval", 15, mi
ext_rename_ignore = OptionList("misc", "ext_rename_ignore", validation=lower_case_ext)
unrar_parameters = OptionStr("misc", "unrar_parameters", validation=supported_unrar_parameters)
outgoing_nntp_ip = OptionStr("misc", "outgoing_nntp_ip")
pipelining_requests = OptionNumber("misc", "pipelining_requests", DEF_PIPELINING_REQUESTS, minval=1, maxval=10)
##############################################################################


@@ -50,7 +50,7 @@ RENAMES_FILE = "__renames__"
ATTRIB_FILE = "SABnzbd_attrib"
REPAIR_REQUEST = "repair-all.sab"
SABCTOOLS_VERSION_REQUIRED = "8.2.6"
SABCTOOLS_VERSION_REQUIRED = "9.1.0"
DB_HISTORY_VERSION = 1
DB_HISTORY_NAME = "history%s.db" % DB_HISTORY_VERSION
@@ -101,8 +101,9 @@ DEF_MAX_ASSEMBLER_QUEUE = 12
SOFT_ASSEMBLER_QUEUE_LIMIT = 0.5
# Percentage of cache to use before adding file to assembler
ASSEMBLER_WRITE_THRESHOLD = 5
NNTP_BUFFER_SIZE = int(800 * KIBI)
NNTP_BUFFER_SIZE = int(256 * KIBI)
NTTP_MAX_BUFFER_SIZE = int(10 * MEBI)
DEF_PIPELINING_REQUESTS = 2
REPAIR_PRIORITY = 3
FORCE_PRIORITY = 2


@@ -21,13 +21,10 @@ sabnzbd.decoder - article decoder
import logging
import hashlib
import binascii
from io import BytesIO
from zlib import crc32
from typing import Optional
import sabnzbd
from sabnzbd.constants import SABCTOOLS_VERSION_REQUIRED
from sabnzbd.encoding import ubtou
from sabnzbd.nzbstuff import Article
from sabnzbd.misc import match_str
@@ -50,7 +47,7 @@ except Exception:
class BadData(Exception):
def __init__(self, data: bytes):
def __init__(self, data: bytearray):
super().__init__()
self.data = data
@@ -63,8 +60,8 @@ class BadUu(Exception):
pass
def decode(article: Article, data_view: memoryview):
decoded_data = None
def decode(article: Article, decoder: sabctools.NNTPResponse):
decoded_data: Optional[bytearray] = None
nzo = article.nzf.nzo
art_id = article.article
@@ -78,10 +75,10 @@ def decode(article: Article, data_view: memoryview):
if sabnzbd.LOG_ALL:
logging.debug("Decoding %s", art_id)
if article.nzf.type == "uu":
decoded_data = decode_uu(article, bytes(data_view))
if decoder.format is sabctools.EncodingFormat.UU:
decoded_data = decode_uu(article, decoder)
else:
decoded_data = decode_yenc(article, data_view)
decoded_data = decode_yenc(article, decoder)
article_success = True
@@ -112,28 +109,18 @@ def decode(article: Article, data_view: memoryview):
except (BadYenc, ValueError):
# Handles precheck and badly formed articles
if nzo.precheck and data_view and data_view[:4] == b"223 ":
if nzo.precheck and decoder.status_code == 223:
# STAT was used, so we only get a status code
article_success = True
else:
# Try uu-decoding
if not nzo.precheck and article.nzf.type != "yenc":
try:
decoded_data = decode_uu(article, bytes(data_view))
logging.debug("Found uu-encoded article %s in job %s", art_id, nzo.final_name)
article_success = True
except Exception:
pass
# Only bother with further checks if uu-decoding didn't work out
if not article_success:
# Convert the first 2000 bytes of raw socket data to article lines,
# and examine the headers (for precheck) or body (for download).
for line in bytes(data_view[:2000]).split(b"\r\n"):
# Examine the headers (for precheck) or body (for download).
if lines := decoder.lines:
for line in lines:
lline = line.lower()
if lline.startswith(b"message-id:"):
if lline.startswith("message-id:"):
article_success = True
# Look for DMCA clues (while skipping "X-" headers)
if not lline.startswith(b"x-") and match_str(lline, (b"dmca", b"removed", b"cancel", b"blocked")):
if not lline.startswith("x-") and match_str(lline, ("dmca", "removed", "cancel", "blocked")):
article_success = False
logging.info("Article removed from server (%s)", art_id)
break
@@ -170,164 +157,63 @@ def decode(article: Article, data_view: memoryview):
sabnzbd.NzbQueue.register_article(article, article_success)
def decode_yenc(article: Article, data_view: memoryview) -> bytearray:
def decode_yenc(article: Article, response: sabctools.NNTPResponse) -> bytearray:
# Let SABCTools do all the heavy lifting
(
decoded_data,
yenc_filename,
article.file_size,
article.data_begin,
article.data_size,
crc_correct,
) = sabctools.yenc_decode(data_view)
decoded_data = response.data
article.file_size = response.file_size
article.data_begin = response.part_begin
article.data_size = response.part_size
nzf = article.nzf
# Assume it is yenc
nzf.type = "yenc"
# Only set the name if it was found and not obfuscated
if not nzf.filename_checked and yenc_filename:
if not nzf.filename_checked and (file_name := response.file_name):
# Set the md5-of-16k if this is the first article
if article.lowest_partnum:
nzf.md5of16k = hashlib.md5(decoded_data[:16384]).digest()
nzf.md5of16k = hashlib.md5(memoryview(decoded_data)[:16384]).digest()
# Try the rename, even if it's not the first article
# For example when the first article was missing
nzf.nzo.verify_nzf_filename(nzf, yenc_filename)
nzf.nzo.verify_nzf_filename(nzf, file_name)
# CRC check
if crc_correct is None:
if (crc := response.crc) is None:
logging.info("CRC Error in %s", article.article)
raise BadData(decoded_data)
article.crc32 = crc_correct
article.crc32 = crc
return decoded_data
def decode_uu(article: Article, raw_data: bytes) -> bytes:
"""Try to uu-decode an article. The raw_data may or may not contain headers.
If there are headers, they will be separated from the body by at least one
empty line. In case of no headers, the first line seems to always be the nntp
response code (220/222) directly followed by the msg body."""
if not raw_data:
def decode_uu(article: Article, response: sabctools.NNTPResponse) -> bytearray:
"""Process a uu-decoded response"""
if not response.bytes_decoded:
logging.debug("No data to decode")
raise BadUu
# Line up the raw_data
raw_data = raw_data.split(b"\r\n")
if response.baddata:
raise BadData(response.data)
# Index of the uu payload start in raw_data
uu_start = 0
# Limit the number of lines to check for the onset of uu data
limit = min(len(raw_data), 32) - 1
if limit < 3:
logging.debug("Article too short to contain valid uu-encoded data")
raise BadUu
# Try to find an empty line separating the body from headers or response
# code and set the expected payload start to the next line.
try:
uu_start = raw_data[:limit].index(b"") + 1
except ValueError:
# No empty line, look for a response code instead
if raw_data[0].startswith(b"220 ") or raw_data[0].startswith(b"222 "):
uu_start = 1
else:
# Invalid data?
logging.debug("Failed to locate start of uu payload")
raise BadUu
def is_uu_junk(line: bytes) -> bool:
"""Determine if the line is empty or contains known junk data"""
return (not line) or line == b"-- " or line.startswith(b"Posted via ")
# Check the uu 'begin' line
if article.lowest_partnum:
try:
# Make sure the line after the uu_start one isn't empty as well or
# detection of the 'begin' line won't work. For articles other than
# lowest_partnum, filtering out empty lines (and other junk) can
# wait until the actual decoding step.
for index in range(uu_start, limit):
if is_uu_junk(raw_data[index]):
uu_start = index + 1
else:
# Bingo
break
else:
# Search reached the limit
raise IndexError
uu_begin_data = raw_data[uu_start].split(b" ")
# Filename may contain spaces
uu_filename = ubtou(b" ".join(uu_begin_data[2:]).strip())
# Sanity check the 'begin' line
if (
len(uu_begin_data) < 3
or uu_begin_data[0].lower() != b"begin"
or (not int(uu_begin_data[1], 8))
or (not uu_filename)
):
raise ValueError
# Consider this enough proof to set the type, avoiding further
# futile attempts at decoding articles in this nzf as yenc.
article.nzf.type = "uu"
# Bump the pointer for the payload to the next line
uu_start += 1
except Exception:
logging.debug("Missing or invalid uu 'begin' line: %s", raw_data[uu_start] if uu_start < limit else None)
raise BadUu
# Do the actual decoding
with BytesIO() as decoded_data:
for line in raw_data[uu_start:]:
# Ignore junk
if is_uu_junk(line):
continue
# End of the article
if line in (b"`", b"end", b"."):
break
# Remove dot stuffing
if line.startswith(b".."):
line = line[1:]
try:
decoded_line = binascii.a2b_uu(line)
except binascii.Error as msg:
try:
# Workaround for broken uuencoders by Fredrik Lundh
nbytes = (((line[0] - 32) & 63) * 4 + 5) // 3
decoded_line = binascii.a2b_uu(line[:nbytes])
except Exception as msg2:
logging.info(
"Error while uu-decoding %s: %s (line: %s; workaround: %s)", article.article, msg, line, msg2
)
raise BadData(decoded_data.getvalue())
# Store the decoded data
decoded_data.write(decoded_line)
# Set the type to uu; the latter is still needed in
# case the lowest_partnum article was damaged or slow to download.
article.nzf.type = "uu"
decoded_data = response.data
nzf = article.nzf
nzf.type = "uu"
# Only set the name if it was found and not obfuscated
if not nzf.filename_checked and (file_name := response.file_name):
# Set the md5-of-16k if this is the first article
if article.lowest_partnum:
decoded_data.seek(0)
article.nzf.md5of16k = hashlib.md5(decoded_data.read(16384)).digest()
# Handle the filename
if not article.nzf.filename_checked and uu_filename:
article.nzf.nzo.verify_nzf_filename(article.nzf, uu_filename)
nzf.md5of16k = hashlib.md5(memoryview(decoded_data)[:16384]).digest()
data = decoded_data.getvalue()
article.crc32 = crc32(data)
return data
# Try the rename, even if it's not the first article
# For example when the first article was missing
nzf.nzo.verify_nzf_filename(nzf, file_name)
article.crc32 = response.crc
return decoded_data
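The removed hand-rolled uu path included a per-line decode with the "broken uuencoders" workaround credited to Fredrik Lundh: when `binascii.a2b_uu` rejects a line (typically trailing garbage or dropped padding), retry with the line truncated to the length implied by its count character. In isolation that step looks like this (sketch of the removed logic, not the new sabctools streaming decoder):

```python
import binascii


def uu_decode_line(line: bytes) -> bytes:
    """Decode one uuencoded line, tolerating broken encoders.

    The first character encodes the number of binary bytes on the line;
    from it we can recompute how many encoded characters should follow
    and discard anything beyond that before retrying.
    """
    try:
        return binascii.a2b_uu(line)
    except binascii.Error:
        # Expected encoded length: count char + 4 chars per 3 bytes
        nbytes = (((line[0] - 32) & 63) * 4 + 5) // 3
        return binascii.a2b_uu(line[:nbytes])
```

In the new code all of this, including junk-line filtering and CRC computation, is delegated to `sabctools.NNTPResponse`.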
def search_new_server(article: Article) -> bool:


@@ -37,7 +37,6 @@ from sabnzbd.decorators import synchronized
from sabnzbd.newsunpack import RAR_EXTRACTFROM_RE, RAR_EXTRACTED_RE, rar_volumelist, add_time_left
from sabnzbd.postproc import prepare_extraction_path
from sabnzbd.misc import SABRarFile
import rarfile
from sabnzbd.utils.diskspeed import diskspeedmeasure
# Need a lock to make sure start and stop is handled correctly


@@ -23,7 +23,7 @@ import asyncio
import os
import logging
import threading
from typing import Generator, Set, Optional, Tuple
from typing import Generator, Optional
import sabnzbd
from sabnzbd.constants import SCAN_FILE_NAME, VALID_ARCHIVES, VALID_NZB_FILES, AddNzbFileResult


@@ -19,15 +19,18 @@
sabnzbd.downloader - download engine
"""
import select
import logging
import selectors
from collections import deque
from threading import Thread, RLock, current_thread
import socket
import sys
import ssl
import time
from datetime import date
from typing import Optional, Union
from typing import Optional, Union, Deque
import sabctools
import sabnzbd
from sabnzbd.decorators import synchronized, NzbQueueLocker, DOWNLOADER_CV, DOWNLOADER_LOCK
@@ -148,7 +151,7 @@ class Server:
self.request: bool = False # True if a getaddrinfo() request is pending
self.have_body: bool = True # Assume server has "BODY", until proven otherwise
self.have_stat: bool = True # Assume server has "STAT", until proven otherwise
self.article_queue: list[sabnzbd.nzbstuff.Article] = []
self.article_queue: Deque[sabnzbd.nzbstuff.Article] = deque()
# Skip during server testing
if threads:
@@ -173,19 +176,19 @@ class Server:
self.reset_article_queue()
@synchronized(DOWNLOADER_LOCK)
def get_article(self):
def get_article(self, peek: bool = False):
"""Get article from pre-fetched and pre-fetch new ones if necessary.
Articles that are too old for this server are immediately marked as tried"""
if self.article_queue:
return self.article_queue.pop(0)
return self.article_queue[0] if peek else self.article_queue.popleft()
if self.next_article_search < time.time():
# Pre-fetch new articles
self.article_queue = sabnzbd.NzbQueue.get_articles(self, sabnzbd.Downloader.servers, _ARTICLE_PREFETCH)
sabnzbd.NzbQueue.get_articles(self, sabnzbd.Downloader.servers, _ARTICLE_PREFETCH)
if self.article_queue:
article = self.article_queue.pop(0)
article = self.article_queue[0] if peek else self.article_queue.popleft()
# Mark expired articles as tried on this server
if self.retention and article.nzf.nzo.avg_stamp < time.time() - self.retention:
if not peek and self.retention and article.nzf.nzo.avg_stamp < time.time() - self.retention:
sabnzbd.Downloader.decode(article)
while self.article_queue:
sabnzbd.Downloader.decode(self.article_queue.pop())
@@ -201,9 +204,12 @@ class Server:
"""Reset articles queued for the Server. Locked to prevent
articles getting stuck in the Server when enabled/disabled"""
logging.debug("Resetting article queue for %s (%s)", self, self.article_queue)
for article in self.article_queue:
article.allow_new_fetcher()
self.article_queue = []
while self.article_queue:
try:
article = self.article_queue.popleft()
article.allow_new_fetcher()
except IndexError:
pass
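The switch from `list.pop(0)` to `collections.deque` makes head removal O(1), and the new `peek` flag lets callers check whether work is available without consuming it. The core access pattern can be sketched as (function name hypothetical, simplified from `Server.get_article`):

```python
from collections import deque


def get_item(queue: deque, peek: bool = False):
    """Return the head of the queue, or None when empty.

    With peek=True the head is inspected but left in place,
    mirroring the peek behaviour added above.
    """
    if queue:
        return queue[0] if peek else queue.popleft()
    return None


q = deque([1, 2])
```

Peeking first is what allows the downloader loop above to defer assigning `nw.article` until the connection is actually ready to send a request.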
def request_addrinfo(self):
"""Launch async request to resolve server address and select the fastest.
@@ -250,7 +256,7 @@ class Downloader(Thread):
"shutdown",
"server_restarts",
"force_disconnect",
"read_fds",
"selector",
"servers",
"timers",
"last_max_chunk_size",
@@ -290,7 +296,7 @@ class Downloader(Thread):
self.force_disconnect: bool = False
self.read_fds: dict[int, NewsWrapper] = {}
self.selector: selectors.DefaultSelector = selectors.DefaultSelector()
self.servers: list[Server] = []
self.timers: dict[str, list[float]] = {}
@@ -361,15 +367,34 @@ class Downloader(Thread):
self.servers.sort(key=lambda svr: "%02d%s" % (svr.priority, svr.displayname.lower()))
@synchronized(DOWNLOADER_LOCK)
def add_socket(self, fileno: int, nw: NewsWrapper):
"""Add a socket ready to be used to the list to be watched"""
self.read_fds[fileno] = nw
def add_socket(self, nw: NewsWrapper):
"""Add a socket to be watched for read or write availability"""
if nw.nntp:
try:
self.selector.register(nw.nntp.fileno, selectors.EVENT_READ | selectors.EVENT_WRITE, nw)
nw.selector_events = selectors.EVENT_READ | selectors.EVENT_WRITE
except KeyError:
pass
@synchronized(DOWNLOADER_LOCK)
def modify_socket(self, nw: NewsWrapper, events: int):
"""Modify the events a socket is watched for"""
if nw.nntp and nw.selector_events != events:
try:
self.selector.modify(nw.nntp.fileno, events, nw)
nw.selector_events = events
except KeyError:
pass
@synchronized(DOWNLOADER_LOCK)
def remove_socket(self, nw: NewsWrapper):
"""Remove a socket from the watched set"""
if nw.nntp:
self.read_fds.pop(nw.nntp.fileno, None)
try:
self.selector.unregister(nw.nntp.fileno)
nw.selector_events = 0
except KeyError:
pass
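The `add_socket`/`modify_socket`/`remove_socket` trio above replaces the old `read_fds` dict plus `select.select()` with the `selectors` module: register a socket with an interest mask, narrow the mask when there is nothing left to write, and unregister on teardown. The lifecycle on a plain socketpair (stand-in for the NNTP connection):

```python
import selectors
import socket

sel = selectors.DefaultSelector()
a, b = socket.socketpair()

# Watch for both readability and writability, attaching context data
sel.register(a, selectors.EVENT_READ | selectors.EVENT_WRITE, data="nw")

# Narrow the interest set once there is nothing left to send
sel.modify(a, selectors.EVENT_READ, data="nw")

# Make the socket readable and collect ready events
b.sendall(b"ping")
ready = [(key.data, mask) for key, mask in sel.select(timeout=1.0)]

sel.unregister(a)
a.close()
b.close()
```

As in the real code, the attached `data` carries the owning wrapper back out of `select()`, so the event loop can hand `(key.data, ev)` straight to the worker queue.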
@NzbQueueLocker
def set_paused_state(self, state: bool):
@@ -409,8 +434,9 @@ class Downloader(Thread):
@NzbQueueLocker
def resume_from_postproc(self):
logging.info("Post-processing finished, resuming download")
self.paused_for_postproc = False
if self.paused_for_postproc:
logging.info("Post-processing finished, resuming download")
self.paused_for_postproc = False
@NzbQueueLocker
def disconnect(self):
@@ -508,26 +534,30 @@ class Downloader(Thread):
# Remove all connections to server
for nw in server.idle_threads | server.busy_threads:
self.__reset_nw(nw, "Forcing disconnect", warn=False, wait=False, retry_article=False)
self.reset_nw(nw, "Forcing disconnect", warn=False, wait=False, retry_article=False)
# Make sure server address resolution is refreshed
server.addrinfo = None
@staticmethod
def decode(article, data_view: Optional[memoryview] = None):
def decode(article: "sabnzbd.nzbstuff.Article", response: Optional[sabctools.NNTPResponse] = None):
"""Decode article"""
# Need a better way of draining requests
if article.nzf.nzo.removed_from_queue:
return
# Article was requested and fetched, update article stats for the server
sabnzbd.BPSMeter.register_server_article_tried(article.fetcher.id)
# Handle broken articles directly
if not data_view:
if not response or not response.bytes_decoded and not article.nzf.nzo.precheck:
if not article.search_new_server():
article.nzf.nzo.increase_bad_articles_counter("missing_articles")
sabnzbd.NzbQueue.register_article(article, success=False)
return
# Decode and send to article cache
sabnzbd.decoder.decode(article, data_view)
sabnzbd.decoder.decode(article, response)
def run(self):
# Warn if there are servers defined, but none are valid
@@ -547,7 +577,7 @@ class Downloader(Thread):
for _ in range(cfg.receive_threads()):
# Started as daemon, so we don't need any shutdown logic in the worker
# The Downloader code will make sure shutdown is handled gracefully
Thread(target=self.process_nw_worker, args=(self.read_fds, process_nw_queue), daemon=True).start()
Thread(target=self.process_nw_worker, args=(process_nw_queue,), daemon=True).start()
# Catch all errors, just in case
try:
@@ -569,9 +599,9 @@ class Downloader(Thread):
if (nw.nntp and nw.nntp.error_msg) or (nw.timeout and now > nw.timeout):
if nw.nntp and nw.nntp.error_msg:
# Already showed error
self.__reset_nw(nw)
self.reset_nw(nw)
else:
self.__reset_nw(nw, "Timed out", warn=True)
self.reset_nw(nw, "Timed out", warn=True)
server.bad_cons += 1
self.maybe_block_server(server)
@@ -611,15 +641,14 @@ class Downloader(Thread):
server.request_addrinfo()
break
nw.article = server.get_article()
if not nw.article:
if not server.get_article(peek=True):
break
server.idle_threads.remove(nw)
server.busy_threads.add(nw)
if nw.connected:
self.__request_article(nw)
self.add_socket(nw)
else:
try:
logging.info("%s@%s: Initiating connection", nw.thrdnum, server.host)
@@ -631,14 +660,14 @@ class Downloader(Thread):
server.host,
sys.exc_info()[1],
)
self.__reset_nw(nw, "Failed to initialize", warn=True)
self.reset_nw(nw, "Failed to initialize", warn=True)
if self.force_disconnect or self.shutdown:
for server in self.servers:
for nw in server.idle_threads | server.busy_threads:
# Send goodbye if we have open socket
if nw.nntp:
self.__reset_nw(nw, "Forcing disconnect", wait=False, count_article_try=False)
self.reset_nw(nw, "Forcing disconnect", wait=False, count_article_try=False)
# Make sure server address resolution is refreshed
server.addrinfo = None
server.reset_article_queue()
@@ -662,10 +691,12 @@ class Downloader(Thread):
self.last_max_chunk_size = 0
# Use select to find sockets ready for reading/writing
if readkeys := self.read_fds.keys():
read, _, _ = select.select(readkeys, (), (), 1.0)
if self.selector.get_map():
if events := self.selector.select(timeout=1.0):
for key, ev in events:
process_nw_queue.put((key.data, ev))
else:
read = []
events = []
BPSMeter.reset()
time.sleep(0.1)
self.max_chunk_size = _DEFAULT_CHUNK_SIZE
@@ -684,58 +715,65 @@ class Downloader(Thread):
next_bpsmeter_update = now + _BPSMETER_UPDATE_DELAY
self.check_assembler_levels()
if not read:
if not events:
continue
# Submit all readable sockets to be processed and wait for completion
process_nw_queue.put_multiple(read)
# Wait for socket operation completion
process_nw_queue.join()
except Exception:
logging.error(T("Fatal error in Downloader"), exc_info=True)
def process_nw_worker(self, read_fds: dict[int, NewsWrapper], nw_queue: MultiAddQueue):
def process_nw_worker(self, nw_queue: MultiAddQueue):
"""Worker for the daemon thread to process results.
Wrapped in try/except because in case of an exception, logging
might get lost and the queue.join() would block forever."""
try:
logging.debug("Starting Downloader receive thread: %s", current_thread().name)
while True:
# The read_fds is passed by reference, so we can access its items!
self.process_nw(read_fds[nw_queue.get()])
self.process_nw(*nw_queue.get())
nw_queue.task_done()
except Exception:
# We cannot break out of the Downloader from here, so just pause
logging.error(T("Fatal error in Downloader"), exc_info=True)
self.pause()
def process_nw(self, nw: NewsWrapper):
def process_nw(self, nw: NewsWrapper, event: int):
"""Receive data from a NewsWrapper and handle the response"""
try:
bytes_received, end_of_line, article_done = nw.recv_chunk()
except ssl.SSLWantReadError:
return
except (ConnectionError, ConnectionAbortedError):
# The ConnectionAbortedError is also thrown by sabctools in case of fatal SSL-layer problems
self.__reset_nw(nw, "Server closed connection", wait=False)
return
except BufferError:
# The BufferError is thrown when exceeding maximum buffer size
# Make sure to discard the article
self.__reset_nw(nw, "Maximum data buffer size exceeded", wait=False, retry_article=False)
return
if event & selectors.EVENT_READ:
self.process_nw_read(nw)
if event & selectors.EVENT_WRITE:
nw.write()
def process_nw_read(self, nw: NewsWrapper) -> None:
bytes_received: int = 0
bytes_pending: int = 0
while True:
try:
n, bytes_pending = nw.read(nbytes=bytes_pending)
bytes_received += n
except ssl.SSLWantReadError:
return
except (ConnectionError, ConnectionAbortedError):
# The ConnectionAbortedError is also thrown by sabctools in case of fatal SSL-layer problems
self.reset_nw(nw, "Server closed connection", wait=False)
return
except BufferError:
# The BufferError is thrown when exceeding maximum buffer size
# Make sure to discard the article
self.reset_nw(nw, "Maximum data buffer size exceeded", wait=False, retry_article=False)
return
if not bytes_pending:
break
article = nw.article
server = nw.server
with DOWNLOADER_LOCK:
sabnzbd.BPSMeter.update(server.id, bytes_received)
if bytes_received > self.last_max_chunk_size:
self.last_max_chunk_size = bytes_received
# Update statistics only when we fetched a whole article
# The side effect is that we don't count things like article-not-available messages
if article_done:
article.nzf.nzo.update_download_stats(sabnzbd.BPSMeter.bps, server.id, nw.data_position)
# Check speedlimit
if (
self.bandwidth_limit
@@ -746,99 +784,6 @@ class Downloader(Thread):
time.sleep(0.01)
sabnzbd.BPSMeter.update()
# If we are not at the end of a line, more data will follow
if not end_of_line:
return
# Response code depends on request command:
# 220 = ARTICLE, 222 = BODY
if nw.status_code not in (220, 222) and not article_done:
if not nw.connected or nw.status_code == 480:
if not self.__finish_connect_nw(nw):
return
if nw.connected:
logging.info("Connecting %s@%s finished", nw.thrdnum, nw.server.host)
self.__request_article(nw)
elif nw.status_code == 223:
article_done = True
logging.debug("Article <%s> is present on %s", article.article, nw.server.host)
elif nw.status_code in (411, 423, 430, 451):
article_done = True
logging.debug(
"Thread %s@%s: Article %s missing (error=%s)",
nw.thrdnum,
nw.server.host,
article.article,
nw.status_code,
)
nw.reset_data_buffer()
elif nw.status_code == 500:
if article.nzf.nzo.precheck:
# Did we try "STAT" already?
if not server.have_stat:
# Hopeless server, just discard
logging.info("Server %s does not support STAT or HEAD, precheck not possible", server.host)
article_done = True
else:
# Assume "STAT" command is not supported
server.have_stat = False
logging.debug("Server %s does not support STAT, trying HEAD", server.host)
else:
# Assume "BODY" command is not supported
server.have_body = False
logging.debug("Server %s does not support BODY", server.host)
nw.reset_data_buffer()
self.__request_article(nw)
else:
# Don't warn for (internal) server errors during downloading
if nw.status_code not in (400, 502, 503):
logging.warning(
T("%s@%s: Received unknown status code %s for article %s"),
nw.thrdnum,
nw.server.host,
nw.status_code,
article.article,
)
# Ditch this thread, we don't know what data we got now so the buffer can be bad
self.__reset_nw(nw, f"Server error or unknown status code: {nw.status_code}", wait=False)
return
if article_done:
# Successful data, clear "bad" counter
server.bad_cons = 0
server.errormsg = server.warning = ""
# Decode
self.decode(article, nw.data_view[: nw.data_position])
if sabnzbd.LOG_ALL:
logging.debug("Thread %s@%s: %s done", nw.thrdnum, server.host, article.article)
# Reset connection for new activity
nw.soft_reset()
# Request a new article immediately if possible
if (
nw.connected
and server.active
and not server.restart
and not (self.paused or self.shutdown or self.paused_for_postproc)
):
nw.article = server.get_article()
if nw.article:
self.__request_article(nw)
return
# Make socket available again
server.busy_threads.discard(nw)
server.idle_threads.add(nw)
self.remove_socket(nw)
def check_assembler_levels(self):
"""Check the Assembler queue to see if we need to delay, depending on queue size"""
if (assembler_level := sabnzbd.Assembler.queue_level()) > SOFT_ASSEMBLER_QUEUE_LIMIT:
@@ -864,13 +809,12 @@ class Downloader(Thread):
logged_counter += 1
@synchronized(DOWNLOADER_LOCK)
def __finish_connect_nw(self, nw: NewsWrapper) -> bool:
def finish_connect_nw(self, nw: NewsWrapper, response: sabctools.NNTPResponse) -> bool:
server = nw.server
try:
nw.finish_connect(nw.status_code)
nw.finish_connect(response.status_code, response.message)
if sabnzbd.LOG_ALL:
logging.debug("%s@%s last message -> %s", nw.thrdnum, server.host, nw.nntp_msg)
nw.reset_data_buffer()
logging.debug("%s@%s last message -> %d", nw.thrdnum, server.host, response.status_code)
except NNTPPermanentError as error:
# Handle login problems
block = False
@@ -883,7 +827,7 @@ class Downloader(Thread):
errormsg = T("Too many connections to server %s [%s]") % (server.host, error.msg)
if server.active:
# Don't count this for the tries (max_art_tries) on this server
self.__reset_nw(nw)
self.reset_nw(nw)
self.plan_server(server, _PENALTY_TOOMANY)
elif error.code in (502, 481, 482) and clues_too_many_ip(error.msg):
# Login from (too many) different IP addresses
@@ -933,7 +877,7 @@ class Downloader(Thread):
if penalty and (block or server.optional):
self.plan_server(server, penalty)
# Note that the article is discarded for this server if the server is not required
self.__reset_nw(nw, retry_article=retry_article)
self.reset_nw(nw, retry_article=retry_article)
return False
except Exception as err:
logging.error(
@@ -944,11 +888,11 @@ class Downloader(Thread):
)
logging.info("Traceback: ", exc_info=True)
# No reset-warning needed, above logging is sufficient
self.__reset_nw(nw, retry_article=False)
self.reset_nw(nw, retry_article=False)
return True
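The login error handling above distinguishes temporary "too many connections" responses (which earn the server a penalty and a retry) from permanent credential failures. A hedged sketch of that classification; the status codes follow RFC 3977/RFC 4643, but the string matching here is a simplification of SABnzbd's `clues_*` helpers:

```python
def classify_login_error(code, msg=""):
    """Simplified triage of NNTP login failures (assumed categories)."""
    if code in (400, 500, 502) and "too many" in msg.lower():
        return "penalty"      # temporary: plan a reconnect after a delay
    if code in (481, 482):
        return "auth-failed"  # permanent: credentials rejected
    return "unknown"


assert classify_login_error(502, "Too many connections") == "penalty"
assert classify_login_error(481, "Authentication failed") == "auth-failed"
```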
@synchronized(DOWNLOADER_LOCK)
def __reset_nw(
def reset_nw(
self,
nw: NewsWrapper,
reset_msg: Optional[str] = None,
@@ -956,6 +900,7 @@ class Downloader(Thread):
wait: bool = True,
count_article_try: bool = True,
retry_article: bool = True,
article: Optional["sabnzbd.nzbstuff.Article"] = None,
):
# Some warnings are errors, and not added as server.warning
if warn and reset_msg:
@@ -971,20 +916,8 @@ class Downloader(Thread):
# Make sure it is not in the readable sockets
self.remove_socket(nw)
if nw.article and not nw.article.nzf.nzo.removed_from_queue:
# Only some errors should count towards the total tries for each server
if count_article_try:
nw.article.tries += 1
# Do we discard, or try again for this server
if not retry_article or (not nw.server.required and nw.article.tries > cfg.max_art_tries()):
# Too many tries on this server, consider article missing
self.decode(nw.article)
nw.article.tries = 0
else:
# Allow all servers again for this article
# Do not use the article_queue, as the server could already have been disabled when we get here!
nw.article.allow_new_fetcher()
# Discard the article request which failed
nw.discard(article, count_article_try=count_article_try, retry_article=retry_article)
# Reset connection object
nw.hard_reset(wait)
@@ -992,21 +925,6 @@ class Downloader(Thread):
# Empty SSL info, it might change on next connect
nw.server.ssl_info = ""
def __request_article(self, nw: NewsWrapper):
try:
if sabnzbd.LOG_ALL:
logging.debug("Thread %s@%s: BODY %s", nw.thrdnum, nw.server.host, nw.article.article)
nw.body()
# Mark as ready to be read
self.add_socket(nw.nntp.fileno, nw)
except socket.error as err:
logging.info("Looks like server closed connection: %s", err)
self.__reset_nw(nw, "Server broke off connection", warn=True)
except Exception:
logging.error(T("Suspect error in downloader"))
logging.info("Traceback: ", exc_info=True)
self.__reset_nw(nw, "Server broke off connection", warn=True)
# ------------------------------------------------------------------------------
# Timed restart of servers admin.
# For each server all planned events are kept in a list.

View File

@@ -23,7 +23,6 @@ import socket
import threading
import time
import logging
import functools
from dataclasses import dataclass
from more_itertools import roundrobin
from typing import Union, Optional

View File

@@ -913,6 +913,7 @@ SPECIAL_VALUE_LIST = (
"ssdp_broadcast_interval",
"unrar_parameters",
"outgoing_nntp_ip",
"pipelining_requests",
)
SPECIAL_LIST_LIST = (
"rss_odd_titles",

View File

@@ -41,7 +41,7 @@ import math
import rarfile
from threading import Thread
from collections.abc import Iterable
from typing import Union, Tuple, Any, AnyStr, Optional, Collection
from typing import Union, Any, AnyStr, Optional, Collection
import sabnzbd
import sabnzbd.getipaddress

View File

@@ -29,7 +29,7 @@ import io
import shutil
import functools
import rarfile
from typing import BinaryIO, Optional, Any, Union, Callable
from typing import BinaryIO, Optional, Any, Union
import sabnzbd
from sabnzbd.encoding import correct_unknown_encoding, ubtou
@@ -64,6 +64,7 @@ from sabnzbd.filesystem import (
SEVENMULTI_RE,
is_size,
get_basename,
create_all_dirs,
)
from sabnzbd.nzbstuff import NzbObject
import sabnzbd.cfg as cfg
@@ -626,6 +627,12 @@ def rar_extract(
rars = []
passwords = get_all_passwords(nzo)
# Sanity check, does the folder exist? Could be removed by aborted Direct Unpack
if not os.path.exists(extraction_path):
# Similar to prepare_extraction_path
extraction_path = create_all_dirs(extraction_path, apply_permissions=True)
logging.info("Extraction path (re)created because it was missing: %s", extraction_path)
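The guard added above recreates the extraction path when an aborted Direct Unpack removed it. A minimal sketch of the same idea, using `os.makedirs` as a stand-in for SABnzbd's `create_all_dirs` (which additionally applies permissions):

```python
import os
import tempfile


def ensure_extraction_path(path):
    """Recreate the extraction directory if something removed it."""
    if not os.path.exists(path):
        os.makedirs(path, exist_ok=True)
    return path


with tempfile.TemporaryDirectory() as tmp:
    target = os.path.join(tmp, "job", "extract")
    assert ensure_extraction_path(target) == target
    assert os.path.isdir(target)
```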
for password in passwords:
if password:
logging.debug('Trying unrar with password "%s"', password)
@@ -649,7 +656,7 @@ def rar_extract_core(
start = time.time()
logging.debug("Extraction path: %s", extraction_path)
logging.debug("Found rar version: %s", rarfile.is_rarfile(rarfile_path))
logging.debug("Found rar version: %s", rarfile.get_rar_version(rarfile_path))
if password:
password_command = "-p%s" % password

View File

@@ -21,20 +21,22 @@ sabnzbd.newswrapper
import errno
import socket
import threading
from collections import deque
from selectors import EVENT_READ, EVENT_WRITE
from threading import Thread
import time
import logging
import ssl
import sabctools
from typing import Optional, Tuple, Union
from typing import Optional, Tuple, Union, Callable
import sabctools
import sabnzbd
import sabnzbd.cfg
from sabnzbd.constants import DEF_NETWORKING_TIMEOUT, NNTP_BUFFER_SIZE, NTTP_MAX_BUFFER_SIZE
from sabnzbd.encoding import utob, ubtou
from sabnzbd.constants import DEF_NETWORKING_TIMEOUT, NNTP_BUFFER_SIZE, Status, FORCE_PRIORITY
from sabnzbd.encoding import utob
from sabnzbd.get_addrinfo import AddrInfo
from sabnzbd.decorators import synchronized, DOWNLOADER_LOCK
from sabnzbd.misc import int_conv
# Set pre-defined socket timeout
socket.setdefaulttimeout(DEF_NETWORKING_TIMEOUT)
@@ -57,10 +59,8 @@ class NewsWrapper:
"thrdnum",
"blocking",
"timeout",
"article",
"data",
"data_view",
"data_position",
"decoder",
"send_buffer",
"nntp",
"connected",
"user_sent",
@@ -69,6 +69,11 @@ class NewsWrapper:
"user_ok",
"pass_ok",
"force_login",
"next_request",
"concurrent_requests",
"_response_queue",
"selector_events",
"lock",
)
def __init__(self, server, thrdnum, block=False):
@@ -77,11 +82,9 @@ class NewsWrapper:
self.blocking: bool = block
self.timeout: Optional[float] = None
self.article: Optional[sabnzbd.nzbstuff.Article] = None
self.data: Optional[bytearray] = None
self.data_view: Optional[memoryview] = None
self.data_position: int = 0
self.decoder: Optional[sabctools.Decoder] = None
self.send_buffer = b""
self.nntp: Optional[NNTP] = None
@@ -93,14 +96,22 @@ class NewsWrapper:
self.force_login: bool = False
self.group: Optional[str] = None
@property
def status_code(self) -> Optional[int]:
if self.data_position >= 3:
return int_conv(self.data[:3])
# Command queue and concurrency
self.next_request: Optional[tuple[bytes, Optional["sabnzbd.nzbstuff.Article"]]] = None
self.concurrent_requests: threading.BoundedSemaphore = threading.BoundedSemaphore(
sabnzbd.cfg.pipelining_requests()
)
self._response_queue: deque[Optional[sabnzbd.nzbstuff.Article]] = deque()
self.selector_events = 0
self.lock: threading.Lock = threading.Lock()
@property
def nntp_msg(self) -> str:
return ubtou(self.data[: self.data_position]).strip()
def article(self) -> Optional["sabnzbd.nzbstuff.Article"]:
"""The article currently being downloaded"""
with self.lock:
if self._response_queue:
return self._response_queue[0]
return None
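With pipelining, responses arrive in the same order the requests were sent, so the head of `_response_queue` is always the article currently being downloaded (and `None` for handshake/login responses). A standalone sketch of that lock-guarded FIFO peek:

```python
import threading
from collections import deque


class ResponseQueue:
    """Minimal sketch of the pipelined response queue above."""

    def __init__(self):
        self._queue = deque()
        self._lock = threading.Lock()

    def push(self, article):
        # Appended in send order; None for non-article requests
        with self._lock:
            self._queue.append(article)

    def pop(self):
        # The oldest request is the one whose response arrives next
        with self._lock:
            return self._queue.popleft()

    @property
    def current(self):
        """Mirror NewsWrapper.article: peek at the head without removing it."""
        with self._lock:
            return self._queue[0] if self._queue else None


q = ResponseQueue()
assert q.current is None
q.push("article-1")
q.push("article-2")
assert q.current == "article-1"
assert q.pop() == "article-1"
assert q.current == "article-2"
```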
def init_connect(self):
"""Setup the connection in NNTP object"""
@@ -109,13 +120,15 @@ class NewsWrapper:
raise socket.error(errno.EADDRNOTAVAIL, T("Invalid server address."))
# Construct buffer and NNTP object
self.data = sabctools.bytearray_malloc(NNTP_BUFFER_SIZE)
self.data_view = memoryview(self.data)
self.reset_data_buffer()
self.decoder = sabctools.Decoder(NNTP_BUFFER_SIZE)
self.nntp = NNTP(self, self.server.addrinfo)
self.timeout = time.time() + self.server.timeout
def finish_connect(self, code: int):
# On connect the first "response" will be 200 Welcome
self._response_queue.append(None)
self.concurrent_requests.acquire()
def finish_connect(self, code: int, message: str) -> None:
"""Perform login options"""
if not (self.server.username or self.server.password or self.force_login):
self.connected = True
@@ -133,11 +146,10 @@ class NewsWrapper:
self.pass_ok = False
if code in (400, 500, 502):
raise NNTPPermanentError(self.nntp_msg, code)
raise NNTPPermanentError(message, code)
elif not self.user_sent:
command = utob("authinfo user %s\r\n" % self.server.username)
self.nntp.sock.sendall(command)
self.reset_data_buffer()
self.queue_command(command)
self.user_sent = True
elif not self.user_ok:
if code == 381:
@@ -151,98 +163,254 @@ class NewsWrapper:
if self.user_ok and not self.pass_sent:
command = utob("authinfo pass %s\r\n" % self.server.password)
self.nntp.sock.sendall(command)
self.reset_data_buffer()
self.queue_command(command)
self.pass_sent = True
elif self.user_ok and not self.pass_ok:
if code != 281:
# Assume that login failed (code 481 or other)
raise NNTPPermanentError(self.nntp_msg, code)
raise NNTPPermanentError(message, code)
else:
self.connected = True
self.timeout = time.time() + self.server.timeout
def body(self):
def queue_command(
self,
command: bytes,
article: Optional["sabnzbd.nzbstuff.Article"] = None,
) -> None:
"""Add a command to the command queue"""
self.next_request = command, article
def body(self, article: "sabnzbd.nzbstuff.Article") -> tuple[bytes, "sabnzbd.nzbstuff.Article"]:
"""Request the body of the article"""
self.timeout = time.time() + self.server.timeout
if self.article.nzf.nzo.precheck:
if article.nzf.nzo.precheck:
if self.server.have_stat:
command = utob("STAT <%s>\r\n" % self.article.article)
command = utob("STAT <%s>\r\n" % article.article)
else:
command = utob("HEAD <%s>\r\n" % self.article.article)
command = utob("HEAD <%s>\r\n" % article.article)
elif self.server.have_body:
command = utob("BODY <%s>\r\n" % self.article.article)
command = utob("BODY <%s>\r\n" % article.article)
else:
command = utob("ARTICLE <%s>\r\n" % self.article.article)
self.nntp.sock.sendall(command)
self.reset_data_buffer()
command = utob("ARTICLE <%s>\r\n" % article.article)
return command, article
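The reworked `body()` above picks the NNTP verb from the job type and the server's known capabilities: precheck jobs prefer `STAT` (falling back to `HEAD`), downloads prefer `BODY` (falling back to `ARTICLE`). The same selection, reduced to plain booleans:

```python
def build_command(message_id, precheck, have_stat=True, have_body=True):
    """Sketch of the verb selection in NewsWrapper.body()."""
    if precheck:
        verb = "STAT" if have_stat else "HEAD"
    else:
        verb = "BODY" if have_body else "ARTICLE"
    return ("%s <%s>\r\n" % (verb, message_id)).encode()


assert build_command("p1@example", precheck=True) == b"STAT <p1@example>\r\n"
assert build_command("p1@example", precheck=False) == b"BODY <p1@example>\r\n"
assert build_command("p1@example", precheck=False, have_body=False) == b"ARTICLE <p1@example>\r\n"
```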
def recv_chunk(self) -> tuple[int, bool, bool]:
"""Receive data, return #bytes, end-of-line, end-of-article"""
# Resize the buffer in the extremely unlikely case that it got full
if self.data_position == len(self.data):
self.nntp.nw.increase_data_buffer()
def on_response(self, response: sabctools.NNTPResponse, article: Optional["sabnzbd.nzbstuff.Article"]) -> None:
"""A response to a NNTP request is received"""
self.concurrent_requests.release()
sabnzbd.Downloader.modify_socket(self, EVENT_READ | EVENT_WRITE)
server = self.server
article_done = response.status_code in (220, 222) and article
# Receive data into the pre-allocated buffer
if self.nntp.nw.server.ssl and not self.nntp.nw.blocking and sabctools.openssl_linked:
if article_done:
with DOWNLOADER_LOCK:
# Update statistics only when we fetched a whole article
# The side effect is that we don't count things like article-not-available messages
article.nzf.nzo.update_download_stats(sabnzbd.BPSMeter.bps, server.id, response.bytes_read)
# Response code depends on request command:
# 220 = ARTICLE, 222 = BODY
if not article_done:
if not self.connected or not article or response.status_code in (281, 381, 480, 481, 482):
self.discard(article, count_article_try=False)
if not sabnzbd.Downloader.finish_connect_nw(self, response):
return
if self.connected:
logging.info("Connecting %s@%s finished", self.thrdnum, server.host)
elif response.status_code == 223:
article_done = True
logging.debug("Article <%s> is present on %s", article.article, server.host)
elif response.status_code in (411, 423, 430, 451):
article_done = True
logging.debug(
"Thread %s@%s: Article %s missing (error=%s)",
self.thrdnum,
server.host,
article.article,
response.status_code,
)
elif response.status_code == 500:
if article.nzf.nzo.precheck:
# Did we try "STAT" already?
if not server.have_stat:
# Hopeless server, just discard
logging.info("Server %s does not support STAT or HEAD, precheck not possible", server.host)
article_done = True
else:
# Assume "STAT" command is not supported
server.have_stat = False
logging.debug("Server %s does not support STAT, trying HEAD", server.host)
else:
# Assume "BODY" command is not supported
server.have_body = False
logging.debug("Server %s does not support BODY", server.host)
self.discard(article, count_article_try=False)
else:
# Don't warn for (internal) server errors during downloading
if response.status_code not in (400, 502, 503):
logging.warning(
T("%s@%s: Received unknown status code %s for article %s"),
self.thrdnum,
server.host,
response.status_code,
article.article,
)
# Ditch this thread, we don't know what data we got now so the buffer can be bad
sabnzbd.Downloader.reset_nw(
self, f"Server error or unknown status code: {response.status_code}", wait=False, article=article
)
return
if article_done:
# Successful data, clear "bad" counter
server.bad_cons = 0
server.errormsg = server.warning = ""
# Decode
sabnzbd.Downloader.decode(article, response)
if sabnzbd.LOG_ALL:
logging.debug("Thread %s@%s: %s done", self.thrdnum, server.host, article.article)
def read(
self,
nbytes: int = 0,
on_response: Optional[Callable[[int, str], None]] = None,
) -> Tuple[int, Optional[int]]:
"""Receive data, return #bytes, #pendingbytes
:param nbytes: maximum number of bytes to read
:param on_response: callback for each complete response received
:return: #bytes, #pendingbytes
"""
# Receive data into the decoder pre-allocated buffer
if not nbytes and self.nntp.nw.server.ssl and not self.nntp.nw.blocking and sabctools.openssl_linked:
# Use patched version when downloading
bytes_recv = sabctools.unlocked_ssl_recv_into(self.nntp.sock, self.data_view[self.data_position :])
bytes_recv = sabctools.unlocked_ssl_recv_into(self.nntp.sock, self.decoder)
else:
bytes_recv = self.nntp.sock.recv_into(self.data_view[self.data_position :])
bytes_recv = self.nntp.sock.recv_into(self.decoder, nbytes=nbytes)
# No data received
if bytes_recv == 0:
raise ConnectionError("Server closed connection")
# Success, move timeout and internal data position
# Success, move timeout
self.timeout = time.time() + self.server.timeout
self.data_position += bytes_recv
self.decoder.process(bytes_recv)
for response in self.decoder:
with self.lock:
article = self._response_queue.popleft()
if on_response:
on_response(response.status_code, response.message)
self.on_response(response, article)
# The SSL-layer might still contain data even though the socket does not. Another Downloader-loop would
# not identify this socket anymore as it is not returned by select(). So, we have to forcefully trigger
# another recv_chunk so the buffer is increased and the data from the SSL-layer is read. See #2752.
if self.nntp.nw.server.ssl and self.data_position == len(self.data) and self.nntp.sock.pending() > 0:
# We do not perform error-handling, as we know there is data available to read
additional_bytes_recv, additional_end_of_line, additional_end_of_article = self.recv_chunk()
return bytes_recv + additional_bytes_recv, additional_end_of_line, additional_end_of_article
if self.server.ssl and self.nntp and (pending := self.nntp.sock.pending()):
return bytes_recv, pending
return bytes_recv, None
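The reason `read()` returns a pending-byte count: with TLS, the SSL layer can hold decrypted data that `select()`/the selector will never report, so the caller (see `process_nw_read` earlier in this diff) keeps reading until pending drops to zero. A toy reader illustrating that drain loop (no real sockets involved):

```python
class FakeTLSReader:
    """Hypothetical stand-in for a TLS socket with buffered plaintext."""

    def __init__(self, chunks):
        self._chunks = list(chunks)

    def read(self):
        """Return (bytes_read, bytes_still_pending_or_None)."""
        chunk = self._chunks.pop(0)
        pending = sum(len(c) for c in self._chunks)
        return len(chunk), pending or None


reader = FakeTLSReader([b"abc", b"de", b"f"])
total = 0
pending = True
while pending:
    # Keep reading while the "SSL layer" reports buffered data
    n, pending = reader.read()
    total += n
assert total == 6
```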
# Check for end of line
# Using the data directly seems faster than the memoryview
if self.data[self.data_position - 2 : self.data_position] == b"\r\n":
# Official end-of-article is "\r\n.\r\n"
if self.data[self.data_position - 5 : self.data_position] == b"\r\n.\r\n":
return bytes_recv, True, True
return bytes_recv, True, False
def write(self):
"""Send data to server"""
server = self.server
# Still in middle of data, so continue!
return bytes_recv, False, False
try:
# First, try to flush any remaining data
if self.send_buffer:
sent = self.nntp.sock.send(self.send_buffer)
self.send_buffer = self.send_buffer[sent:]
if self.send_buffer:
# Still unsent data, wait for next EVENT_WRITE
return
def soft_reset(self):
"""Reset for the next article"""
self.timeout = None
self.article = None
self.reset_data_buffer()
if self.connected:
if (
server.active
and not server.restart
and not (
sabnzbd.Downloader.paused
or sabnzbd.Downloader.shutdown
or sabnzbd.Downloader.paused_for_postproc
)
):
# Prepare the next request
if not self.next_request and (article := server.get_article()):
self.next_request = self.body(article)
elif self.next_request and self.next_request[1]:
# Discard the next request
self.discard(self.next_request[1], count_article_try=False, retry_article=True)
self.next_request = None
def reset_data_buffer(self):
"""Reset the data position"""
self.data_position = 0
# If no pending buffer, try to send new command
if not self.send_buffer and self.next_request:
if self.concurrent_requests.acquire(blocking=False):
command, article = self.next_request
self.next_request = None
if article:
nzo = article.nzf.nzo
if nzo.removed_from_queue or nzo.status is Status.PAUSED and nzo.priority is not FORCE_PRIORITY:
self.discard(article, count_article_try=False, retry_article=True)
self.concurrent_requests.release()
return
self._response_queue.append(article)
if sabnzbd.LOG_ALL:
logging.debug("Thread %s@%s: %s", self.thrdnum, server.host, command)
try:
sent = self.nntp.sock.send(command)
if sent < len(command):
# Partial send, store remainder
self.send_buffer = command[sent:]
except (BlockingIOError, ssl.SSLWantWriteError):
# Can't send now, store full command
self.send_buffer = command
else:
# Concurrency limit reached
sabnzbd.Downloader.modify_socket(self, EVENT_READ)
else:
# Is it safe to shut down this socket?
if (
not self.send_buffer
and not self.next_request
and not self._response_queue
and (not server.active or server.restart or time.time() > self.timeout)
):
# Make socket available again
server.busy_threads.discard(self)
server.idle_threads.add(self)
sabnzbd.Downloader.remove_socket(self)
def increase_data_buffer(self):
"""Resize the buffer in the extremely unlikely case that it overflows"""
# Sanity check before we go any further
if len(self.data) > NTTP_MAX_BUFFER_SIZE:
raise BufferError("Maximum data buffer size exceeded")
# Input needs to be integer, floats don't work
new_buffer = sabctools.bytearray_malloc(len(self.data) + NNTP_BUFFER_SIZE // 2)
new_buffer[: len(self.data)] = self.data
logging.info("Increased buffer from %d to %d for %s", len(self.data), len(new_buffer), str(self))
self.data = new_buffer
self.data_view = memoryview(self.data)
except (BlockingIOError, ssl.SSLWantWriteError):
# Socket not currently writable, just try again later
return
except socket.error as err:
logging.info("Looks like server closed connection: %s", err)
sabnzbd.Downloader.reset_nw(self, "Server broke off connection", warn=True)
except Exception:
logging.error(T("Suspect error in downloader"))
logging.info("Traceback: ", exc_info=True)
sabnzbd.Downloader.reset_nw(self, "Server broke off connection", warn=True)
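The `write()` path above caps in-flight requests with a `BoundedSemaphore` sized to the new `pipelining_requests` setting: a non-blocking `acquire` refuses to send past the limit, and each completed response `release()`s a slot (in `on_response`). The gating pattern in isolation:

```python
import threading


class PipelineGate:
    """Sketch of the in-flight request cap used by NewsWrapper (limit assumed)."""

    def __init__(self, max_in_flight=3):
        self._slots = threading.BoundedSemaphore(max_in_flight)

    def try_send(self):
        # Non-blocking: False means the pipeline is full
        return self._slots.acquire(blocking=False)

    def on_response(self):
        # A response frees a slot for the next request
        self._slots.release()


gate = PipelineGate(max_in_flight=2)
assert gate.try_send() is True
assert gate.try_send() is True
assert gate.try_send() is False  # limit reached, wait for a response
gate.on_response()
assert gate.try_send() is True   # a slot freed up
```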
def hard_reset(self, wait: bool = True):
"""Destroy and restart"""
with self.lock:
# Drain unsent requests
if self.next_request:
_, article = self.next_request
if article:
self.discard(article, count_article_try=False, retry_article=True)
self.next_request = None
# Drain responses
while self._response_queue:
if article := self._response_queue.popleft():
self.discard(article, count_article_try=False, retry_article=True)
if self.nntp:
self.nntp.close(send_quit=self.connected)
self.nntp = None
@@ -258,6 +426,28 @@ class NewsWrapper:
# Reset for internal reasons, just wait 5 sec
self.timeout = time.time() + 5
def discard(
self,
article: Optional["sabnzbd.nzbstuff.Article"],
count_article_try: bool = True,
retry_article: bool = True,
) -> None:
"""Discard an article back to the queue"""
if article and not article.nzf.nzo.removed_from_queue:
# Only some errors should count towards the total tries for each server
if count_article_try:
article.tries += 1
# Do we discard, or try again for this server
if not retry_article or (not self.server.required and article.tries > sabnzbd.cfg.max_art_tries()):
# Too many tries on this server, consider article missing
sabnzbd.Downloader.decode(article)
article.tries = 0
else:
# Allow all servers again for this article
# Do not use the article_queue, as the server could already have been disabled when we get here!
article.allow_new_fetcher()
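The new `discard()` helper centralizes the retry accounting previously inlined in `__reset_nw`: an article goes back to the queue until its tries exceed `max_art_tries` on a non-required server, after which it is treated as missing. A simplified, self-contained version (names and the limit are placeholders):

```python
MAX_ART_TRIES = 2  # stands in for sabnzbd.cfg.max_art_tries()


def discard(tries, server_required, count_try=True, retry=True):
    """Return (new_tries, 'retry' or 'missing'), mirroring the logic above."""
    if count_try:
        tries += 1
    if not retry or (not server_required and tries > MAX_ART_TRIES):
        # Too many tries: decode() runs with no data, counter resets
        return 0, "missing"
    # Otherwise allow_new_fetcher() re-queues the article for all servers
    return tries, "retry"


assert discard(0, server_required=False) == (1, "retry")
assert discard(2, server_required=False) == (0, "missing")
assert discard(2, server_required=True) == (3, "retry")  # required servers never give up
```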
def __repr__(self):
return "<NewsWrapper: server=%s:%s, thread=%s, connected=%s>" % (
self.server.host,
@@ -379,7 +569,7 @@ class NNTP:
# Locked, so it can't interleave with any of the Downloader "__nw" actions
with DOWNLOADER_LOCK:
if not self.closed:
sabnzbd.Downloader.add_socket(self.fileno, self.nw)
sabnzbd.Downloader.add_socket(self.nw)
except OSError as e:
self.error(e)

View File

@@ -692,7 +692,7 @@ class NzbQueue:
return False
return False
def get_articles(self, server: Server, servers: list[Server], fetch_limit: int) -> list[Article]:
def get_articles(self, server: Server, servers: list[Server], fetch_limit: int) -> None:
"""Get next article for jobs in the queue
Not locked for performance, since it only reads the queue
"""
@@ -705,12 +705,12 @@ class NzbQueue:
and not nzo.propagation_delay_left
) or nzo.priority == FORCE_PRIORITY:
if not nzo.server_in_try_list(server):
if articles := nzo.get_articles(server, servers, fetch_limit):
return articles
nzo.get_articles(server, servers, fetch_limit)
if server.article_queue:
break
# Stop after first job that wasn't paused/propagating/etc
if self.__top_only:
return []
return []
break
def register_article(self, article: Article, success: bool = True):
"""Register the articles we tried
@@ -893,11 +893,14 @@ class NzbQueue:
if nzf.all_servers_in_try_list(active_servers):
# Check for articles where all active servers have already been tried
for article in nzf.articles[:]:
if article.all_servers_in_try_list(active_servers):
logging.debug("Removing article %s with bad trylist in file %s", article, nzf.filename)
nzo.increase_bad_articles_counter("missing_articles")
sabnzbd.NzbQueue.register_article(article, success=False)
with nzf:
for article in nzf.articles:
if article.all_servers_in_try_list(active_servers):
logging.debug(
"Removing article %s with bad trylist in file %s", article, nzf.filename
)
nzo.increase_bad_articles_counter("missing_articles")
sabnzbd.NzbQueue.register_article(article, success=False)
logging.info("Resetting bad trylist for file %s in job %s", nzf.filename, nzo.final_name)
nzf.reset_try_list()

View File

@@ -26,7 +26,7 @@ import datetime
import threading
import functools
import difflib
from typing import Any, Optional, Union, BinaryIO
from typing import Any, Optional, Union, BinaryIO, Deque
# SABnzbd modules
import sabnzbd
@@ -328,11 +328,12 @@ class NzbFile(TryList):
"""Representation of one file consisting of multiple articles"""
# Pre-define attributes to save memory
__slots__ = NzbFileSaver
__slots__ = NzbFileSaver + ("lock",)
def __init__(self, date, subject, raw_article_db, file_bytes, nzo):
"""Setup object"""
super().__init__()
self.lock = threading.RLock()
self.date: datetime.datetime = date
self.type: Optional[str] = None
@@ -347,7 +348,7 @@ class NzbFile(TryList):
self.setname: Optional[str] = None
# Articles are removed from "articles" after being fetched
self.articles: list[Article] = []
self.articles: dict[Article, Article] = {}
self.decodetable: list[Article] = []
self.bytes: int = file_bytes
@@ -402,17 +403,18 @@ class NzbFile(TryList):
def add_article(self, article_info):
"""Add article to object database and return article object"""
article = Article(article_info[0], article_info[1], self)
self.articles.append(article)
self.decodetable.append(article)
with self.lock:
self.articles[article] = article
self.decodetable.append(article)
return article
def remove_article(self, article: Article, success: bool) -> int:
"""Handle completed article, possibly end of file"""
if article in self.articles:
self.articles.remove(article)
if success:
self.bytes_left -= article.bytes
return len(self.articles)
with self.lock:
if self.articles.pop(article, None) is not None:
if success:
self.bytes_left -= article.bytes
return len(self.articles)
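The switch above from `list` to `dict` for `NzbFile.articles` makes removal O(1) instead of `list.remove()`'s O(n) scan, while insertion order is preserved (Python 3.7+ dicts), and the new lock guards concurrent access. The pattern on its own:

```python
import threading


class FileArticles:
    """Sketch of the dict-as-ordered-set used by NzbFile above."""

    def __init__(self):
        self._articles = {}  # dict keys preserve insertion order
        self._lock = threading.RLock()

    def add(self, article):
        with self._lock:
            self._articles[article] = article

    def remove(self, article):
        """Return the number of articles left, as remove_article() does."""
        with self._lock:
            self._articles.pop(article, None)  # absent key is a no-op
            return len(self._articles)


fa = FileArticles()
fa.add("a1")
fa.add("a2")
assert fa.remove("a1") == 1
assert fa.remove("missing") == 1  # removing an unknown article changes nothing
assert fa.remove("a2") == 0
```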
def set_par2(self, setname, vol, blocks):
"""Designate this file as a par2 file"""
@@ -427,29 +429,45 @@ class NzbFile(TryList):
else:
self.crc32 = sabctools.crc32_combine(self.crc32, crc32, length)
def get_articles(self, server: Server, servers: list[Server], fetch_limit: int) -> list[Article]:
def get_articles(self, server: Server, servers: list[Server], fetch_limit: int):
"""Get next articles to be downloaded"""
articles = []
for article in self.articles:
if article := article.get_article(server, servers):
articles.append(article)
if len(articles) >= fetch_limit:
return articles
articles = server.article_queue
with self.lock:
for article in self.articles:
if article := article.get_article(server, servers):
articles.append(article)
if len(articles) >= fetch_limit:
return
self.add_to_try_list(server)
return articles
@synchronized(TRYLIST_LOCK)
def reset_all_try_lists(self):
"""Reset all try lists. Locked so reset is performed
for all items at the same time without chance of another
thread changing any of the items while we are resetting"""
for art in self.articles:
art.reset_try_list()
with self.lock:
for art in self.articles:
art.reset_try_list()
self.reset_try_list()
def first_article_processed(self) -> bool:
"""Check if the first article has been processed.
This ensures we have attempted to extract md5of16k and filename information
before creating the filepath.
"""
# The first article of decodetable is always the lowest
first_article = self.decodetable[0]
# If it's still in nzo.first_articles, it hasn't been processed yet
return first_article not in self.nzo.first_articles
def prepare_filepath(self):
"""Do all checks before making the final path"""
if not self.filepath:
# Wait for the first article to be processed so we can get md5of16k
# and proper filename before creating the filepath
if not self.first_article_processed():
return None
self.nzo.verify_nzf_filename(self)
filename = sanitize_filename(self.filename)
self.filepath = get_unique_filename(os.path.join(self.nzo.download_path, filename))
@@ -459,7 +477,10 @@ class NzbFile(TryList):
@property
def completed(self):
"""Is this file completed?"""
return self.import_finished and not bool(self.articles)
if not self.import_finished:
return False
with self.lock:
return not self.articles
def remove_admin(self):
"""Remove article database from disk (sabnzbd_nzf_<id>)"""
@@ -469,6 +490,12 @@ class NzbFile(TryList):
except Exception:
pass
def __enter__(self):
self.lock.acquire()
def __exit__(self, exc_type, exc_val, exc_tb):
self.lock.release()
def __getstate__(self):
"""Save to pickle file, selecting attributes"""
dict_ = {}
@@ -486,6 +513,10 @@ class NzbFile(TryList):
# Handle new attributes
setattr(self, item, None)
super().__setstate__(dict_.get("try_list", []))
self.lock = threading.RLock()
if isinstance(self.articles, list):
# Converted from list to dict
self.articles = {x: x for x in self.articles}
def __eq__(self, other: "NzbFile"):
"""Assume it's the same file if the number bytes and first article
@@ -1625,8 +1656,9 @@ class NzbObject(TryList):
self.nzo_info[bad_article_type] += 1
self.bad_articles += 1
def get_articles(self, server: Server, servers: list[Server], fetch_limit: int) -> list[Article]:
articles = []
def get_articles(self, server: Server, servers: list[Server], fetch_limit: int):
"""Assign articles server up to the fetch_limit"""
articles: Deque[Article] = server.article_queue
nzf_remove_list = []
# Did we go through all first-articles?
@@ -1661,7 +1693,8 @@ class NzbObject(TryList):
else:
break
if articles := nzf.get_articles(server, servers, fetch_limit):
nzf.get_articles(server, servers, fetch_limit)
if articles:
break
# Remove all files for which admin could not be read
@@ -1676,7 +1709,6 @@ class NzbObject(TryList):
if not articles:
# No articles for this server, block for next time
self.add_to_try_list(server)
return articles
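The hunks above replace the per-file article list with an insertion-ordered dict guarded by an `RLock`, so membership tests and removal become O(1) instead of `list.remove()`'s O(n) scan. A minimal, self-contained sketch of that pattern (class and method names here are hypothetical, not from the SABnzbd codebase):

```python
import threading


class ArticleSet:
    """Insertion-ordered set of pending articles.

    A dict keyed by the article itself preserves insertion order
    (guaranteed since Python 3.7) while giving O(1) lookup and
    removal; the RLock mirrors the per-NzbFile lock in the diff.
    """

    def __init__(self):
        self.lock = threading.RLock()
        self.articles = {}

    def add(self, article):
        with self.lock:
            self.articles[article] = article

    def remove(self, article) -> bool:
        # pop(..., None) makes double-removal a harmless no-op,
        # matching the `pop(article, None) is not None` check above
        with self.lock:
            return self.articles.pop(article, None) is not None
```

Iterating the dict still yields articles in the order they were added, which is why the diff can keep walking `self.articles` unchanged.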
@synchronized(NZO_LOCK)
def move_top_bulk(self, nzf_ids: list[str]):


@@ -29,7 +29,7 @@ from typing import Optional
from sabnzbd.constants import MEBI
from sabnzbd.encoding import correct_unknown_encoding
from sabnzbd.filesystem import get_basename, get_ext
from sabnzbd.filesystem import get_basename
PROBABLY_PAR2_RE = re.compile(r"(.*)\.vol(\d*)[+\-](\d*)\.par2", re.I)
SCAN_LIMIT = 10 * MEBI


@@ -39,7 +39,7 @@ from sabnzbd.newsunpack import (
rar_sort,
is_sfv_file,
)
from threading import Thread
from threading import Thread, Event
from sabnzbd.misc import (
on_cleanup_list,
is_sample,
@@ -116,6 +116,9 @@ class PostProcessor(Thread):
# Regular queue for jobs that might need more attention
self.slow_queue: queue.Queue[Optional[NzbObject]] = queue.Queue()
# Event to signal when work is available or state changes
self.work_available = Event()
# Load all old jobs
for nzo in self.history_queue:
self.process(nzo)
@@ -180,6 +183,9 @@ class PostProcessor(Thread):
self.save()
history_updated()
# Signal that work is available
self.work_available.set()
def remove(self, nzo: NzbObject):
"""Remove given nzo from the queue"""
try:
@@ -192,8 +198,20 @@ class PostProcessor(Thread):
def stop(self):
"""Stop thread after finishing running job"""
self.__stop = True
self.slow_queue.put(None)
self.fast_queue.put(None)
# Wake up the processor thread to check stop flag
self.work_available.set()
def pause(self):
"""Pause post-processing"""
self.paused = True
logging.info("Pausing post-processing")
def resume(self):
"""Resume post-processing"""
self.paused = False
logging.info("Resuming post-processing")
# Wake up the processor thread
self.work_available.set()
def cancel_pp(self, nzo_ids: list[str]) -> Optional[bool]:
"""Abort Direct Unpack and change the status, so that the PP is canceled"""
@@ -265,27 +283,40 @@ class PostProcessor(Thread):
while not self.__stop:
self.__busy = False
if self.paused:
time.sleep(5)
continue
# Set NzbObject object to None so references from this thread do not keep the
# object alive until the next job is added to post-processing (see #1628)
nzo = None
# Wait for work to be available (no timeout!)
self.work_available.wait()
# Check if we should stop
if self.__stop:
break
# If paused, clear event and wait for resume
if self.paused:
self.work_available.clear()
continue
# If queues are empty (spurious wake or race condition), clear and loop back
if self.slow_queue.empty() and self.fast_queue.empty():
self.work_available.clear()
continue
# Something in the fast queue?
try:
# Every few fast-jobs we should check allow a
# Every few fast-jobs we should allow a
# slow job so that they don't wait forever
if self.__fast_job_count >= MAX_FAST_JOB_COUNT and self.slow_queue.qsize():
raise queue.Empty
nzo = self.fast_queue.get(timeout=2)
nzo = self.fast_queue.get_nowait()
self.__fast_job_count += 1
except queue.Empty:
# Try the slow queue
try:
nzo = self.slow_queue.get(timeout=2)
nzo = self.slow_queue.get_nowait()
# Reset fast-counter
self.__fast_job_count = 0
except queue.Empty:
@@ -296,10 +327,6 @@ class PostProcessor(Thread):
# No fast or slow jobs, better luck next loop!
continue
# Stop job
if not nzo:
continue
# Job was already deleted.
if not nzo.work_name:
check_eoq = True
@@ -328,7 +355,7 @@ class PostProcessor(Thread):
self.external_process = None
check_eoq = True
# Allow download to proceed
# Allow download to proceed if it was paused for post-processing
sabnzbd.Downloader.resume_from_postproc()
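The PostProcessor changes above swap polling (`get(timeout=2)` plus `time.sleep(5)` when paused) for a `threading.Event` that producers set whenever work arrives, the thread is resumed, or a stop is requested. A small sketch of that wake-up pattern under the same assumptions (the `Worker` class and its names are illustrative, not the real PostProcessor):

```python
import queue
import threading
import time


class Worker(threading.Thread):
    """Consumer that blocks on an Event instead of polling with timeouts."""

    def __init__(self):
        super().__init__(daemon=True)
        self.jobs = queue.Queue()
        self.work_available = threading.Event()
        self.stopped = False
        self.results = []

    def submit(self, job):
        self.jobs.put(job)
        self.work_available.set()  # wake the worker

    def stop(self):
        self.stopped = True
        self.work_available.set()  # wake it so it can observe the stop flag

    def run(self):
        while not self.stopped:
            self.work_available.wait()  # no timeout: sleep until signalled
            if self.stopped:
                break
            if self.jobs.empty():
                # Spurious wake or race: clear and go back to sleep
                self.work_available.clear()
                continue
            self.results.append(self.jobs.get_nowait())
```

Clearing the event only when the queues are confirmed empty is what makes the "clear and loop back" branches in the diff safe against a job being submitted between the wake-up and the queue check.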
@@ -392,14 +419,13 @@ def process_job(nzo: NzbObject) -> bool:
par_error = True
unpack_error = 1
script = nzo.script
logging.info(
"Starting Post-Processing on %s => Repair:%s, Unpack:%s, Delete:%s, Script:%s, Cat:%s",
filename,
flag_repair,
flag_unpack,
nzo.delete,
script,
nzo.script,
nzo.cat,
)
@@ -492,10 +518,10 @@ def process_job(nzo: NzbObject) -> bool:
# Check if this is an NZB-only download, if so redirect to queue
# except when PP was Download-only
nzb_list = None
if flag_repair:
nzb_list = nzb_redirect(tmp_workdir_complete, nzo.final_name, nzo.pp, script, nzo.cat, nzo.priority)
else:
nzb_list = None
nzb_list = process_nzb_only_download(tmp_workdir_complete, nzo)
if nzb_list:
nzo.set_unpack_info("Download", T("Sent %s to queue") % nzb_list)
cleanup_empty_directories(tmp_workdir_complete)
@@ -503,9 +529,10 @@ def process_job(nzo: NzbObject) -> bool:
# Full cleanup including nzb's
cleanup_list(tmp_workdir_complete, skip_nzb=False)
script_ret = 0
script_error = False
# No further processing for NZB-only downloads
if not nzb_list:
script_ret = 0
script_error = False
# Give destination its final name
if cfg.folder_rename() and tmp_workdir_complete and not one_folder:
if not all_ok:
@@ -557,11 +584,11 @@ def process_job(nzo: NzbObject) -> bool:
deobfuscate.deobfuscate_subtitles(nzo, newfiles)
# Run the user script
if script_path := make_script_path(script):
if script_path := make_script_path(nzo.script):
# Set the current nzo status to "Ext Script...". Used in History
nzo.status = Status.RUNNING
nzo.set_action_line(T("Running script"), script)
nzo.set_unpack_info("Script", T("Running user script %s") % script, unique=True)
nzo.set_action_line(T("Running script"), nzo.script)
nzo.set_unpack_info("Script", T("Running user script %s") % nzo.script, unique=True)
script_log, script_ret = external_processing(
script_path, nzo, clip_path(workdir_complete), nzo.final_name, job_result
)
@@ -574,7 +601,7 @@ def process_job(nzo: NzbObject) -> bool:
else:
script_line = T("Script exit code is %s") % script_ret
elif not script_line:
script_line = T("Ran %s") % script
script_line = T("Ran %s") % nzo.script
nzo.set_unpack_info("Script", script_line, unique=True)
# Maybe bad script result should fail job
@@ -583,29 +610,29 @@ def process_job(nzo: NzbObject) -> bool:
all_ok = False
nzo.fail_msg = script_line
# Email the results
if not nzb_list and cfg.email_endjob():
if cfg.email_endjob() == 1 or (cfg.email_endjob() == 2 and (unpack_error or par_error or script_error)):
emailer.endjob(
nzo.final_name,
nzo.cat,
all_ok,
workdir_complete,
nzo.bytes_downloaded,
nzo.fail_msg,
nzo.unpack_info,
script,
script_log,
script_ret,
)
# Email the results
if cfg.email_endjob():
if cfg.email_endjob() == 1 or (cfg.email_endjob() == 2 and (unpack_error or par_error or script_error)):
emailer.endjob(
nzo.final_name,
nzo.cat,
all_ok,
workdir_complete,
nzo.bytes_downloaded,
nzo.fail_msg,
nzo.unpack_info,
nzo.script,
script_log,
script_ret,
)
if script_log and len(script_log.rstrip().split("\n")) > 1:
# Can do this only now, otherwise it would show up in the email
nzo.set_unpack_info(
"Script",
'%s <a href="./scriptlog?name=%s">(%s)</a>' % (script_line, nzo.nzo_id, T("More")),
unique=True,
)
if script_log and len(script_log.rstrip().split("\n")) > 1:
# Can do this only now, otherwise it would show up in the email
nzo.set_unpack_info(
"Script",
'%s <a href="./scriptlog?name=%s">(%s)</a>' % (script_line, nzo.nzo_id, T("More")),
unique=True,
)
# Cleanup again, including NZB files
if all_ok and os.path.isdir(workdir_complete):
@@ -1132,34 +1159,36 @@ def prefix(path: str, pre: str) -> str:
return os.path.join(p, pre + d)
def nzb_redirect(wdir, nzbname, pp, script, cat, priority):
def process_nzb_only_download(workdir: str, nzo: NzbObject) -> Optional[list[str]]:
"""Check if this job contains only NZB files,
if so send to queue and remove if on clean-up list
Returns the list of processed NZBs
"""
files = listdir_full(wdir)
if files := listdir_full(workdir):
for nzb_file in files:
if get_ext(nzb_file) != ".nzb":
return None
for nzb_file in files:
if get_ext(nzb_file) != ".nzb":
return None
# Process all NZB files
new_nzbname = nzo.final_name
for nzb_file in files:
# Determine name based on number of files
nzb_filename = get_filename(nzb_file)
if len(files) > 1:
new_nzbname = f"{nzo.final_name} - {nzb_filename}"
# For multiple NZBs, cannot use the current job name
if len(files) != 1:
nzbname = None
# Process all NZB files
for nzb_file in files:
process_single_nzb(
get_filename(nzb_file),
nzb_file,
pp=pp,
script=script,
cat=cat,
priority=priority,
dup_check=False,
nzbname=nzbname,
)
return files
process_single_nzb(
nzb_filename,
nzb_file,
pp=nzo.pp,
script=nzo.script,
cat=nzo.cat,
url=nzo.url,
priority=nzo.priority,
nzbname=new_nzbname,
dup_check=False,
)
return files
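The rewritten `process_nzb_only_download` keeps the job's `final_name` for a single inner NZB but prefixes it onto each filename when there are several, so the resulting queue entries stay grouped (this is the "Keep NZB name prefix when processing multiple NZBs" commit). The naming rule alone can be sketched as a standalone helper (hypothetical function, not part of the codebase):

```python
import os


def nzb_job_names(final_name: str, nzb_paths: list) -> list:
    """Mirror the naming rule above: one NZB keeps the job name,
    multiple NZBs get '<job name> - <nzb filename>'."""
    names = []
    for path in nzb_paths:
        filename = os.path.basename(path)
        if len(nzb_paths) > 1:
            names.append(f"{final_name} - {filename}")
        else:
            names.append(final_name)
    return names
```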
def one_file_or_folder(folder: str) -> str:


@@ -337,7 +337,11 @@ class Scheduler:
sabnzbd.downloader.unpause_all()
sabnzbd.Downloader.set_paused_state(paused or paused_all)
sabnzbd.PostProcessor.paused = pause_post
# Handle pause_post state with proper notification
if pause_post and not sabnzbd.PostProcessor.paused:
sabnzbd.PostProcessor.pause()
elif not pause_post and sabnzbd.PostProcessor.paused:
sabnzbd.PostProcessor.resume()
if speedlimit is not None:
sabnzbd.Downloader.limit_speed(speedlimit)
@@ -506,11 +510,11 @@ def sort_schedules(all_events, now=None):
def pp_pause():
sabnzbd.PostProcessor.paused = True
sabnzbd.PostProcessor.pause()
def pp_resume():
sabnzbd.PostProcessor.paused = False
sabnzbd.PostProcessor.resume()
def enable_server(server):


@@ -442,7 +442,7 @@ SKIN_TEXT = {
"Select a mode and list all (un)wanted extensions. For example: <b>exe</b> or <b>exe, com</b>"
),
"opt-sfv_check": TT("Enable SFV-based checks"),
"explain-sfv_check": TT("Do an extra verification based on SFV files."),
"explain-sfv_check": TT("If no par2 files are available, use sfv files (if present) to verify files"),
"opt-script_can_fail": TT("User script can flag job as failed"),
"explain-script_can_fail": TT(
"When the user script returns a non-zero exit code, the job will be flagged as failed."


@@ -19,9 +19,7 @@
"""
sabnzbd.utils.rarvolinfo - Find out volume number and/or original extension of a rar file. Useful with obfuscated files
"""
import logging
import os
import rarfile


@@ -6,5 +6,5 @@
# You MUST use double quotes (so " and not ')
# Do not forget to update the appdata file for every major release!
__version__ = "4.6.0Alpha1"
__version__ = "4.6.0Alpha2"
__baseline__ = "unknown"


@@ -27,7 +27,6 @@ from selenium import webdriver
from selenium.webdriver.chrome.options import Options as ChromeOptions
from warnings import warn
from sabnzbd.constants import DEF_INI_FILE
from tests.testhelper import *


@@ -45,32 +45,38 @@ ARTICLE_INFO = re.compile(
YENC_ESCAPE = [0x00, 0x0A, 0x0D, ord("="), ord(".")]
class NewsServerProtocol(asyncio.Protocol):
def __init__(self):
self.transport = None
self.connected = False
self.in_article = False
super().__init__()
class NewsServerSession:
def __init__(self, reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
self.reader = reader
self.writer = writer
def connection_made(self, transport):
logging.info("Connection from %s", transport.get_extra_info("peername"))
self.transport = transport
self.connected = True
self.transport.write(b"200 Welcome (SABNews)\r\n")
async def run(self):
self.writer.write(b"200 Welcome (SABNews)\r\n")
await self.writer.drain()
def data_received(self, message):
logging.debug("Data received: %s", message.strip())
try:
while not self.reader.at_eof():
message = await self.reader.readuntil(b"\r\n")
logging.debug("Data received: %s", message.strip())
await self.handle_command(message)
except (ConnectionResetError, asyncio.IncompleteReadError):
logging.debug("Client closed connection")
# Handle basic commands
async def handle_command(self, message: bytes):
"""Handle basic NNTP commands, \r\n is already stripped."""
if message.startswith(b"QUIT"):
self.close_connection()
elif message.startswith((b"ARTICLE", b"BODY")):
await self.close_connection()
return
if message.startswith((b"ARTICLE", b"BODY")):
parsed_message = ARTICLE_INFO.search(message)
self.serve_article(parsed_message)
await self.serve_article(parsed_message)
return
# self.transport.write(data)
self.writer.write(b"500 Unknown command\r\n")
await self.writer.drain()
def serve_article(self, parsed_message):
async def serve_article(self, parsed_message):
# Check if we parsed everything
try:
message_id = parsed_message.group("message_id")
@@ -81,34 +87,37 @@ class NewsServerProtocol(asyncio.Protocol):
size = int(parsed_message.group("size"))
except (AttributeError, ValueError):
logging.warning("Can't parse article information")
self.transport.write(b"430 No Such Article Found (bad message-id)\r\n")
self.writer.write(b"430 No Such Article Found (bad message-id)\r\n")
await self.writer.drain()
return
# Check if file exists
if not os.path.exists(file):
logging.warning("File not found: %s", file)
self.transport.write(b"430 No Such Article Found (no file on disk)\r\n")
self.writer.write(b"430 No Such Article Found (no file on disk)\r\n")
await self.writer.drain()
return
# Check if sizes are valid
file_size = os.path.getsize(file)
if start + size > file_size:
logging.warning("Invalid start/size attributes")
self.transport.write(b"430 No Such Article Found (invalid start/size attributes)\r\n")
self.writer.write(b"430 No Such Article Found (invalid start/size attributes)\r\n")
await self.writer.drain()
return
logging.debug("Serving %s" % message_id)
# File is found, send headers
self.transport.write(b"222 0 %s\r\n" % message_id)
self.transport.write(b"Message-ID: %s\r\n" % message_id)
self.transport.write(b'Subject: "%s"\r\n\r\n' % file_base.encode("utf-8"))
self.writer.write(b"222 0 %s\r\n" % message_id)
self.writer.write(b"Message-ID: %s\r\n" % message_id)
self.writer.write(b'Subject: "%s"\r\n\r\n' % file_base.encode("utf-8"))
# Write yEnc headers
self.transport.write(
self.writer.write(
b"=ybegin part=%d line=128 size=%d name=%s\r\n" % (part, file_size, file_base.encode("utf-8"))
)
self.transport.write(b"=ypart begin=%d end=%d\r\n" % (start + 1, start + size))
self.writer.write(b"=ypart begin=%d end=%d\r\n" % (start + 1, start + size))
with open(file, "rb") as inp_file:
inp_file.seek(start)
@@ -116,24 +125,31 @@ class NewsServerProtocol(asyncio.Protocol):
# Encode data
output_string, crc = sabctools.yenc_encode(inp_buffer)
self.transport.write(output_string)
self.writer.write(output_string)
# Write footer
self.transport.write(b"\r\n=yend size=%d part=%d pcrc32=%08x\r\n" % (size, part, crc))
self.transport.write(b".\r\n")
self.writer.write(b"\r\n=yend size=%d part=%d pcrc32=%08x\r\n" % (size, part, crc))
self.writer.write(b".\r\n")
await self.writer.drain()
def close_connection(self):
async def close_connection(self):
logging.debug("Closing connection")
self.transport.write(b"205 Connection closing\r\n")
self.transport.close()
self.writer.write(b"205 Connection closing\r\n")
await self.writer.drain()
self.writer.close()
await self.writer.wait_closed()
async def connection_handler(reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
session = NewsServerSession(reader, writer)
await session.run()
async def serve_sabnews(hostname, port):
# Start server
logging.info("Starting SABNews on %s:%d", hostname, port)
loop = asyncio.get_running_loop()
server = await loop.create_server(lambda: NewsServerProtocol(), hostname, port)
server = await asyncio.start_server(connection_handler, hostname, port)
async with server:
await server.serve_forever()
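The SABNews rewrite above moves from the callback-style `asyncio.Protocol` to the stream API: `asyncio.start_server` hands each connection a `StreamReader`/`StreamWriter` pair, and the session loop simply `await`s `readuntil(b"\r\n")` per command. A minimal, self-contained sketch of that shape (the greeting and reply texts are illustrative, not the real SABNews responses):

```python
import asyncio


async def handle(reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
    # Same session shape as above: greet, then answer one command line
    writer.write(b"200 Welcome\r\n")
    await writer.drain()
    await reader.readuntil(b"\r\n")  # block until a full CRLF-terminated line
    writer.write(b"500 Unknown command\r\n")
    await writer.drain()
    writer.close()
    await writer.wait_closed()


async def main():
    # Port 0 lets the OS pick a free port
    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]

    # Act as a client against our own server
    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    greeting = await reader.readuntil(b"\r\n")
    writer.write(b"HELP\r\n")
    await writer.drain()
    reply = await reader.readuntil(b"\r\n")
    writer.close()
    await writer.wait_closed()

    server.close()
    await server.wait_closed()
    return greeting, reply
```

Compared to the `Protocol` version, backpressure is explicit (`await writer.drain()`) and partial reads never reach the command handler, since `readuntil` only returns complete lines.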


@@ -21,10 +21,12 @@ tests.test_decoder- Testing functions in decoder.py
import binascii
import os
import pytest
from io import BytesIO
from random import randint
from unittest import mock
import sabctools
import sabnzbd.decoder as decoder
from sabnzbd.nzbstuff import Article
@@ -111,7 +113,7 @@ class TestUuDecoder:
result.append(END_DATA)
# Signal the end of the message with a dot on a line of its own
data.append(b".")
data.append(b".\r\n")
# Join the data with \r\n line endings, just like we get from socket reads
data = b"\r\n".join(data)
@@ -120,22 +122,26 @@ class TestUuDecoder:
return article, bytearray(data), result
def test_no_data(self):
with pytest.raises(decoder.BadUu):
assert decoder.decode_uu(None, None)
@staticmethod
def _response(raw_data: bytes) -> sabctools.NNTPResponse:
dec = sabctools.Decoder(len(raw_data))
reader = BytesIO(raw_data)
reader.readinto(dec)
dec.process(len(raw_data))
return next(dec)
@pytest.mark.parametrize(
"raw_data",
[
b"",
b"\r\n\r\n",
b"foobar\r\n", # Plenty of list items, but (too) few actual lines
b"222 0 <artid@woteva>\r\nX-Too-Short: yup\r\n",
b"222 0 <foo@bar>\r\n.\r\n",
b"222 0 <foo@bar>\r\n\r\n.\r\n",
b"222 0 <foo@bar>\r\nfoobar\r\n.\r\n", # Plenty of list items, but (too) few actual lines
b"222 0 <foo@bar>\r\nX-Too-Short: yup\r\n.\r\n",
],
)
def test_short_data(self, raw_data):
with pytest.raises(decoder.BadUu):
assert decoder.decode_uu(None, bytearray(raw_data))
assert decoder.decode_uu(Article("foo@bar", 4321, None), self._response(raw_data))
@pytest.mark.parametrize(
"raw_data",
@@ -158,7 +164,8 @@ class TestUuDecoder:
with pytest.raises(decoder.BadUu):
raw_data = bytearray(raw_data)
raw_data.extend(filler)
assert decoder.decode_uu(article, raw_data)
raw_data.extend(b".\r\n")
assert decoder.decode_uu(article, self._response(raw_data))
@pytest.mark.parametrize("insert_empty_line", [True, False])
@pytest.mark.parametrize("insert_excess_empty_lines", [True, False])
@@ -194,7 +201,7 @@ class TestUuDecoder:
insert_dot_stuffing_line,
begin_line,
)
assert decoder.decode_uu(article, raw_data) == expected_result
assert decoder.decode_uu(article, self._response(raw_data)) == expected_result
assert article.nzf.filename_checked
@pytest.mark.parametrize("insert_empty_line", [True, False])
@@ -205,7 +212,7 @@ class TestUuDecoder:
decoded_data = expected_data = b""
for part in ("begin", "middle", "middle", "end"):
article, data, result = self._generate_msg_part(part, insert_empty_line, False, False, True)
decoded_data += decoder.decode_uu(article, data)
decoded_data += decoder.decode_uu(article, self._response(data))
expected_data += result
# Verify results
@@ -223,4 +230,6 @@ class TestUuDecoder:
article.lowest_partnum = False
filler = b"\r\n".join(VALID_UU_LINES[:4]) + b"\r\n"
with pytest.raises(decoder.BadData):
assert decoder.decode_uu(article, bytearray(b"222 0 <foo@bar>\r\n" + filler + bad_data + b"\r\n"))
assert decoder.decode_uu(
article, self._response(bytearray(b"222 0 <foo@bar>\r\n" + filler + bad_data + b"\r\n.\r\n"))
)


@@ -257,3 +257,137 @@ class TestPostProc:
assert tmp_workdir_complete == workdir_complete
_func()
class TestNzbOnlyDownload:
@mock.patch("sabnzbd.postproc.process_single_nzb")
@mock.patch("sabnzbd.postproc.listdir_full")
def test_process_nzb_only_download_single_nzb(self, mock_listdir, mock_process_single_nzb):
"""Test process_nzb_only_download with a single NZB file"""
# Setup mock NZO
fake_nzo = mock.Mock()
fake_nzo.final_name = "TestDownload"
fake_nzo.pp = 3
fake_nzo.script = "test_script.py"
fake_nzo.cat = "movies"
fake_nzo.url = "http://example.com/test.nzb"
fake_nzo.priority = 0
# Mock single NZB file
workdir = os.path.join(SAB_CACHE_DIR, "test_workdir")
nzb_file = os.path.join(workdir, "test.nzb")
mock_listdir.return_value = [nzb_file]
# Call the function
result = postproc.process_nzb_only_download(workdir, fake_nzo)
# Verify result
assert result == [nzb_file]
# Verify process_single_nzb was called with correct arguments
mock_process_single_nzb.assert_called_once_with(
"test.nzb",
nzb_file,
pp=3,
script="test_script.py",
cat="movies",
url="http://example.com/test.nzb",
priority=0,
nzbname="TestDownload",
dup_check=False,
)
@mock.patch("sabnzbd.postproc.process_single_nzb")
@mock.patch("sabnzbd.postproc.listdir_full")
def test_process_nzb_only_download_multiple_nzbs(self, mock_listdir, mock_process_single_nzb):
"""Test process_nzb_only_download with multiple NZB files"""
# Setup mock NZO
fake_nzo = mock.Mock()
fake_nzo.final_name = "TestDownload"
fake_nzo.pp = 2
fake_nzo.script = None
fake_nzo.cat = "tv"
fake_nzo.url = "http://example.com/test.nzb"
fake_nzo.priority = 1
# Mock multiple NZB files
workdir = os.path.join(SAB_CACHE_DIR, "test_workdir")
first_nzb = os.path.join(workdir, "first.nzb")
second_nzb = os.path.join(workdir, "second.nzb")
mock_listdir.return_value = [first_nzb, second_nzb]
# Call the function
result = postproc.process_nzb_only_download(workdir, fake_nzo)
# Verify result
assert result == [first_nzb, second_nzb]
# Verify process_single_nzb was called twice with correct arguments
assert mock_process_single_nzb.call_count == 2
mock_process_single_nzb.assert_any_call(
"first.nzb",
first_nzb,
pp=2,
script=None,
cat="tv",
url="http://example.com/test.nzb",
priority=1,
nzbname="TestDownload - first.nzb",
dup_check=False,
)
mock_process_single_nzb.assert_any_call(
"second.nzb",
second_nzb,
pp=2,
script=None,
cat="tv",
url="http://example.com/test.nzb",
priority=1,
nzbname="TestDownload - second.nzb",
dup_check=False,
)
@mock.patch("sabnzbd.postproc.process_single_nzb")
@mock.patch("sabnzbd.postproc.listdir_full")
def test_process_nzb_only_download_mixed_files(self, mock_listdir, mock_process_single_nzb):
"""Test process_nzb_only_download with mixed file types returns None"""
# Setup mock NZO
fake_nzo = mock.Mock()
fake_nzo.final_name = "TestDownload"
# Mock mixed files (NZB and non-NZB)
workdir = os.path.join(SAB_CACHE_DIR, "test_workdir")
mock_listdir.return_value = [
os.path.join(workdir, "test.nzb"),
os.path.join(workdir, "readme.txt"),
]
# Call the function
result = postproc.process_nzb_only_download(workdir, fake_nzo)
# Verify result is None (not NZB-only)
assert result is None
# Verify process_single_nzb was NOT called
mock_process_single_nzb.assert_not_called()
@mock.patch("sabnzbd.postproc.process_single_nzb")
@mock.patch("sabnzbd.postproc.listdir_full")
def test_process_nzb_only_download_empty_directory(self, mock_listdir, mock_process_single_nzb):
"""Test process_nzb_only_download with empty directory returns None"""
# Setup mock NZO
fake_nzo = mock.Mock()
fake_nzo.final_name = "TestDownload"
# Mock empty directory
workdir = os.path.join(SAB_CACHE_DIR, "test_workdir")
mock_listdir.return_value = []
# Call the function
result = postproc.process_nzb_only_download(workdir, fake_nzo)
# Verify result is None (no files)
assert result is None
# Verify process_single_nzb was NOT called
mock_process_single_nzb.assert_not_called()