Compare commits


118 Commits

Author SHA1 Message Date
Safihre
9ca80e481c Handle Direct Unpack sets before proceeding to unpack the rest 2020-12-12 19:46:03 +01:00
Safihre
f5f8aa985e Sort script drop-down list alphabetically
Closes #1699
2020-12-12 19:13:27 +01:00
Safihre
1a848cf5fe Smarter extraction of filenames from NZB-subject 2020-12-12 17:18:58 +01:00
puzzledsab
b748b05fbd Only check idle servers for new articles twice per second (#1696)
* Only check idle servers for new articles twice per second

* Fix black complaint

* Store time.time() in variable in DL loop

* No need to check server for last_busy if it was just set
2020-12-12 17:05:29 +01:00
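The throttling this PR describes can be sketched as follows. The class and method names are hypothetical, not SABnzbd's actual code; the point is the pattern of calling `time.time()` once per loop iteration and rate-limiting idle-server checks to twice per second:

```python
import time

class IdleServerThrottle:
    """Sketch: rate-limit idle-server checks to at most twice per second."""

    def __init__(self, interval: float = 0.5):
        self.interval = interval
        self.last_check = 0.0

    def should_check(self, now: float) -> bool:
        # The download loop stores time.time() in a variable once per
        # iteration and passes it in, instead of calling it per server.
        if now - self.last_check >= self.interval:
            self.last_check = now
            return True
        return False

throttle = IdleServerThrottle()
now = time.time()
throttle.should_check(now)  # first call passes; repeats within 0.5 s are skipped
```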
Safihre
9f2a9c32c0 Switch to GitHub Actions for CI
Removed the par2 files for the unicode job; they caused too many problems. It's a bad "fix" for #1509.
2020-12-12 16:52:43 +01:00
jcfp
92d0b0163a prevent repetition of unwanted extension warnings (#1695) 2020-12-11 21:09:16 +01:00
Safihre
c50e2a4026 Small tweak of where set_download_report is called 2020-12-10 16:06:28 +01:00
Safihre
69ffa159c7 Correctly use dict.keys()
Solves https://forums.sabnzbd.org/viewtopic.php?f=2&t=25087
2020-12-08 10:11:35 +01:00
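In Python 3, `dict.keys()` returns a live view, so deleting entries while iterating over it raises `RuntimeError`; taking a snapshot with `list()` avoids that. Whether this is the exact bug behind the commit is an assumption, but the pattern looks like:

```python
jobs = {"a": 1, "b": 2, "c": 3}

# Iterating over a snapshot of the keys makes it safe to delete during the loop:
for key in list(jobs.keys()):
    if jobs[key] > 1:
        del jobs[key]

print(jobs)  # {'a': 1}
```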
Sander
81089fc20a obfuscated rar sets: better handling missing rars (#1688)
* obfuscated rar sets: better handling missing rars

* obfuscated rar sets: make black happy

* rarset: cleanup unused code

* rarset: cleanup unused code

* rarset: wrong is_obfuscated_filename
2020-12-06 16:39:43 +01:00
Sander
3d09f72c90 Fixed pattern obfuscation detection (#1691)
* obfuscated: recognize fixed pattern abc.xyz as obfuscated

* obfuscated: recognize fixed pattern abc.xyz as obfuscated

* obfuscated: recognize fixed pattern abc.xyz as obfuscated - extra test

* obfuscated: recognize fixed pattern abc.xyz as obfuscated - black happy

* obfuscated: recognize fixed pattern abc.xyz as obfuscated - r"bla"
2020-12-03 07:53:16 +01:00
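A minimal sketch of this kind of fixed-pattern check; the regex, function name, and example filename are assumptions, not necessarily SABnzbd's implementation:

```python
import re

# Hypothetical: treat filenames built on the recurring "abc.xyz" release
# pattern as obfuscated, e.g. "abc.xyz.a4c567edbcbf27.xyz".
FIXED_PATTERN = re.compile(r"^abc\.xyz")

def is_fixed_pattern_obfuscated(filename: str) -> bool:
    return bool(FIXED_PATTERN.match(filename))
```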
SABnzbd Automation
ef7d84b24d Update translatable texts 2020-11-28 20:37:02 +00:00
Safihre
9b71f8ca4b Use fully customizable date ranges for server graphs
Closes #1645
2020-11-28 21:35:03 +01:00
Safihre
04c3fc77cb On Travis use Python 3.9 now it is stable
Closes #1677
2020-11-27 15:21:35 +01:00
Safihre
c6cc6f4537 Correct Git-commit detection when running in different folder
Closes #1676, #1675
2020-11-27 15:19:29 +01:00
Sander
f31a4440f1 diskspeed: follow pylint's advice, and more pytesting (#1678)
* diskspeed: follow pylint's advice, and more pytesting

* diskspeed: improved hint, catch relevant exceptions

* diskspeed: lower run time to 0.5 s (as we run it twice)

* diskspeed: make black and pylint happier

* Delete somefile.txt
2020-11-27 14:34:42 +01:00
jcfp
84b1e60803 fix sabnews regex deprecation warning (#1685) 2020-11-26 21:11:33 +01:00
jcfp
a434a5f25d Explicitly set mode for gzip.GzipFile() (#1684) 2020-11-26 21:10:46 +01:00
Safihre
09e844a63f Do not crash in Queue Repair if there was no resulting NZO
Closes #1649
2020-11-22 12:49:04 +01:00
jcfp
c55e114131 normalize shebang for utils, example script (#1679) 2020-11-17 08:55:37 +01:00
Sander
575fbc06aa IPv4 IPv6 library based testing (#1673)
* IPv4 IPv6 library based testing

* IPv4 IPv6 library based testing ... make black happy again
2020-11-13 17:19:52 +01:00
Sander
19376805de Ssdp for real ... more improvements (#1656)
* Add base implementation of SSDP util

* SSDP+XML: working setup #1

* SSDP+XML: with socket ... as sock

* SSDP+XML: unique UUIDs

* SSDP+XML: simpler constructions of XML URL

* SSDP+XML: cleaner SSDP and XML, steady UUID in XML, better logging

* SSDP+XML: UUIDs into __init__(). Better, innit?

* SSDP+XML: Make black happy again

* SSDP+XML: Make black happy again ... now for interface.py

* SSDP+XML: creation of SSDP message and XML to __init__()

* SSDP+XML: changes based on feedback

* SSDP+XML: no more SABnzbd references in ssdp.py. No network is OK now.

* SSDP+XML: references to specs for SSDP and the XML

Co-authored-by: Safihre <safihre@sabnzbd.org>
2020-11-13 15:17:15 +01:00
jcfp
5ea6a31bc2 Api tests (#1668)
* fix deprecation warning in sabnews regex

* enable text, xml returns from get_api_result

* add api tests

* add functional api tests

* add tavern.yaml files to test data

* explicitly add lxml to work around pip dependency issues

* prevent pytest from picking up the tavern files

* Revert "fix deprecation warning in sabnews regex"

This reverts commit 4f0b7131e7.

* address minor issues

* integrate fixtures into conftest

* black :/

* harden queue repair test

* try a workaround for extremely slow test runs on windoze

* Correct server detection in functional tests

* move scripts dir inside SAB_CACHE_DIR

* also relocate the generated script

Co-authored-by: Safihre <safihre@sabnzbd.org>
2020-11-08 18:37:48 +01:00
Safihre
2714ffe04d Do not crash if we cannot format the error message 2020-11-08 15:09:52 +01:00
exizak42
c38eac0e46 Separate email message lines with CRLF (#1671)
SMTP protocol dictates that all lines are supposed to be separated
with CRLF and not LF (even on LF-based systems). This change ensures
that even if the original byte string message is using `\n` for line
separators, the SMTP protocol will still work properly.

This resolves sabnzbd#1669

Fix code formatting
2020-11-06 16:19:38 +01:00
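The normalization this commit describes can be sketched like this (hypothetical helper name; SMTP requires CRLF line endings per RFC 5321):

```python
def to_crlf(message: bytes) -> bytes:
    # First collapse existing CRLF to LF so we never produce CR CR LF,
    # then expand every bare LF to the CRLF the SMTP protocol expects.
    return message.replace(b"\r\n", b"\n").replace(b"\n", b"\r\n")

to_crlf(b"Subject: done\n\nJob finished.\n")
```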
Safihre
fccc57fd52 It was not possible to set directory-settings to empty values 2020-11-06 16:15:08 +01:00
jcfp
fea309da11 fix order for sorting queue by avg_age (#1666) 2020-11-01 19:37:30 +01:00
Safihre
d867881162 Deobfuscate-during-download did not work
https://forums.sabnzbd.org/viewtopic.php?f=3&t=25037
2020-11-01 15:39:41 +01:00
SABnzbd Automation
af9a7d2fb3 Update translatable texts 2020-11-01 13:22:36 +00:00
Safihre
259584b24f Less strict validation in test_functional_downloads due to #1509 2020-11-01 14:21:29 +01:00
SABnzbd Automation
38f61f64c7 Update translatable texts 2020-10-30 16:40:42 +00:00
Safihre
3e9bfba4d6 Improve handling of binary restarts (macOS / Windows) 2020-10-30 17:39:48 +01:00
Safihre
be26c7f080 mode=reset_quota api call returned nothing
Closes #1661
2020-10-28 16:16:49 +01:00
jcfp
6b8befdc67 Fix nzbstuff scan_password, expand tests (#1659)
* fixes for scan_password, expand tests

* correct typ0

* correct check for index of {{
2020-10-27 07:31:07 +01:00
Safihre
423e4e429b Add functional test for Queue Repair
Relates to #1649
2020-10-24 12:03:24 +02:00
SABnzbd Automation
53aba47915 Update translatable texts 2020-10-23 16:24:37 +00:00
jcfp
87f90b004f randomize age for generated nzb files in sabnews (#1655)
* randomize age for generated nzb files

Useful for testing the queue sorting function of the API. Timestamp values are randomly chosen between September '93 and now.

* Sigh.
2020-10-23 18:23:56 +02:00
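The randomized age could look like this sketch; the function name and the exact start date are assumptions based on the PR description:

```python
import random
from datetime import datetime, timezone

def random_article_date() -> int:
    """Return a random Unix timestamp between September 1993 and now."""
    start = int(datetime(1993, 9, 1, tzinfo=timezone.utc).timestamp())
    end = int(datetime.now(tz=timezone.utc).timestamp())
    return random.randint(start, end)
```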
SABnzbd Automation
0b96afb055 Update translatable texts 2020-10-22 16:35:48 +00:00
Safihre
8e99ebe5ef Remove path length limitation on admin_dir and download_dir 2020-10-22 18:34:59 +02:00
Safihre
6e06d954fe Refactor of config.py and added typing hints to config.py and others 2020-10-22 16:10:24 +02:00
jcfp
497abb83da only replace the first occurrence of "script_" (#1651)
* only replace the first occurrence of "script_"

Use of str.replace() without a count replaces all occurrences. As a result, scripts with filenames such as "my_script_for_sab.py" would be mangled when trying to set them as the action on queue completion.

* also modify the check of the action var
2020-10-22 16:03:31 +02:00
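The difference is easy to demonstrate; the action value below is illustrative:

```python
action = "script_my_script_for_sab.py"

# Without a count, every occurrence is replaced and the name is mangled:
action.replace("script_", "")     # "my_for_sab.py"

# With count=1 only the leading prefix is stripped:
action.replace("script_", "", 1)  # "my_script_for_sab.py"
```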
Safihre
7ffebd97b9 Use constant for all localhost-definitions 2020-10-22 12:04:07 +02:00
SABnzbd Automation
55a5855720 Update translatable texts 2020-10-21 09:02:34 +00:00
Safihre
adc828dc8a Pin GitHub-actions versions 2020-10-21 11:01:28 +02:00
Safihre
6c5c9e0147 After pre-check the job was not restored to the original spot 2020-10-16 16:15:42 +02:00
Safihre
baa9ffb948 Applying Filters to a feed would result in crash
Closes #1634
2020-10-15 18:07:18 +02:00
Safihre
92541fec23 Allow failure of download_unicode_made_on_windows test due to bug #1633 2020-10-13 12:35:49 +02:00
Safihre
b1f6448ae0 Update import of sabnzbd.getipaddress 2020-10-12 23:52:34 +02:00
Safihre
fc72cf0451 Use same AppVeyor image as used for the releases 2020-10-12 23:18:30 +02:00
Sander
c76d931b01 bonjour/zeroconf improved (#1638)
* bonjour/zeroconf improved

* bonjour/zeroconf improved black formatting

* bonjour/zeroconf improved import
2020-10-12 23:17:56 +02:00
jcfp
02ef37d381 localhost is all of 127.0.0.0/8 not just /16 2020-10-11 11:42:11 +02:00
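Python's ipaddress module confirms the /8 scope of the loopback range:

```python
import ipaddress

loopback = ipaddress.ip_network("127.0.0.0/8")

print(ipaddress.ip_address("127.0.0.1") in loopback)    # True
print(ipaddress.ip_address("127.200.0.1") in loopback)  # True, yet outside 127.0.0.0/16
print(ipaddress.ip_address("128.0.0.1") in loopback)    # False
```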
Safihre
329b420c0d Use same AppVeyor image as used for the releases 2020-10-09 22:43:32 +02:00
SABnzbd Automation
10049d0c1f Update translatable texts 2020-10-09 20:27:13 +00:00
Safihre
1e602d86bd Only start Direct Unpack after all first-articles are received 2020-10-09 22:26:23 +02:00
Safihre
f22ab0068e Notify Plush users that the skin is no longer maintained 2020-10-09 09:42:37 +02:00
SABnzbd Automation
3700e45e7f Update translatable texts 2020-10-09 07:37:00 +00:00
Safihre
36196a176e Update text for "Post-Process Only Verified Jobs"
Closes #1632
2020-10-09 09:36:18 +02:00
Safihre
72907de5ef Use newer version of black pipeline 2020-10-08 10:53:11 +02:00
Safihre
9a7385789e Show commit hash when running from GitHub sources 2020-10-07 20:50:25 +02:00
Safihre
d13893d1c7 Direct Unpack parsing was broken
Closes #1630
2020-10-07 20:31:34 +02:00
Safihre
1a8031c75d Use browser URL on last page of Wizard
Closes #1617
2020-10-07 12:35:24 +02:00
Safihre
9d10261a9f Reset decoded_data variable in Decoder and some style changes 2020-10-04 22:27:18 +02:00
Safihre
d0a7ff00fc Reference the right GitHub-issue 2020-10-04 22:27:18 +02:00
Safihre
b80d0ee458 URLGrabber would leave reference to NzbObject 2020-10-04 22:27:18 +02:00
Safihre
53069492b1 Add tests to verify no objects are left in memory after downloading 2020-10-04 22:27:18 +02:00
Safihre
3e2dad4a7e Properly manage all references to Nzo/Nzf/Article objects 2020-10-04 22:27:18 +02:00
Safihre
fca1e5355e Remove unused code 2020-10-02 11:42:49 +02:00
SABnzbd Automation
47c0fd706f Update translatable texts 2020-10-02 09:35:35 +00:00
Safihre
4c4ffb2f54 For reliability use internal webserver to test RSS feed parsing
We already have all the dependencies due to pytest-httpbin
2020-10-02 11:34:43 +02:00
SABnzbd Automation
ade477c6e5 Update translatable texts 2020-10-02 08:24:35 +00:00
Safihre
719b966709 Reset updated .pot files after pytest 2020-10-02 10:23:37 +02:00
SABnzbd Automation
2085c04717 Update translatable texts 2020-09-30 20:30:08 +00:00
Safihre
12a4e34075 Remove unused global DIR_APPDATA variable 2020-09-30 22:29:27 +02:00
SABnzbd Automation
13dd81ebbd Update translatable texts 2020-09-30 11:56:39 +00:00
Safihre
a9492eb25f Small refactor of the GUI-logger 2020-09-30 13:55:52 +02:00
SABnzbd Automation
4dabbb7590 Update translatable texts 2020-09-29 20:38:18 +00:00
Safihre
64b78bddd6 CI pipeline optimizations
Remove PPA (not needed)
Remove LGTM (not used)
Stop logging all API-requests
2020-09-29 22:37:15 +02:00
Safihre
5a02554380 Allow aborting at any point during external post-processing
Closes #1271
2020-09-29 22:37:15 +02:00
Safihre
c312f3917f Resolve unresolved references
2020-09-29 22:37:15 +02:00
Safihre
30654af261 Scheduler refactor and add additional typing 2020-09-29 22:37:15 +02:00
Safihre
29aa329038 Notify users of Deobfuscate.py that it is now part of SABnzbd 2020-09-29 14:09:04 +02:00
Safihre
cfbb0d3bf6 Only set the "Waiting" status when the job hits post-processing
https://forums.sabnzbd.org/viewtopic.php?f=11&t=24969
2020-09-29 13:28:31 +02:00
Safihre
388f77ea52 Only run Windows Service code when executed from the executables
Could be made to work with the from-sources code, but it seems like a very small use case.
Closes #1623
2020-09-29 10:42:06 +02:00
SABnzbd Automation
139c2f3c14 Update translatable texts 2020-09-28 20:46:14 +00:00
Safihre
dab544bc93 Use HistoryDB as a contextmanager 2020-09-28 22:44:57 +02:00
Safihre
0070fce88d sqlite Row object does not support get-operation 2020-09-28 16:05:04 +02:00
Safihre
c27ecfe339 Revert "Fixes after the RSS and Rating-refactor"
This reverts commit 746de90700.
2020-09-28 15:09:22 +02:00
Safihre
746de90700 Fixes after the RSS and Rating-refactor 2020-09-27 17:57:29 +02:00
Safihre
c580f1aff7 Skip DirectUnpack parsing when there is nothing new yet 2020-09-27 17:57:10 +02:00
Safihre
93b429af8b We do not need to trim incomplete paths on Windows anymore 2020-09-27 17:57:10 +02:00
Safihre
f0e2e783a8 Force UnRar and Multipar to output UTF8 2020-09-27 17:57:10 +02:00
Safihre
9c2af4281a Set execute bit on Deobfuscate.py 2020-09-27 17:18:47 +02:00
SABnzbd Automation
c12e25217b Update translatable texts 2020-09-27 11:32:35 +00:00
Safihre
d5d0903591 Handle failing RSS-feeds for feedparser 6.0.0+
Closes #1621
Now throws warnings (which can be disabled via helpful_warnings) if the readout fails.
2020-09-27 13:31:51 +02:00
Safihre
72bde214a3 Missed one RSSReader replacement
Closes #1625
2020-09-27 12:46:44 +02:00
Safihre
3ae2cbcd2c Prevent unnecessary tracebacks from Rating.py 2020-09-27 09:29:24 +02:00
Safihre
82b3f210f6 Refactor RSS to fit the rest of the threads 2020-09-27 09:22:51 +02:00
Safihre
b8e67c558d Add NzbRatingV2 to rating.py for backwards compatibility
Closes #1624
2020-09-27 09:02:33 +02:00
Safihre
371bcfbf5b Correct function-calls in scheduler.py
Leftover from previous refactor.
2020-09-27 08:58:34 +02:00
Safihre
d75f1ed966 Small refactor of unpack_history_info 2020-09-26 11:34:49 +02:00
Safihre
5e4c3e0fa4 Small refactor of diskspace function 2020-09-26 10:13:32 +02:00
Safihre
2c2642a92a Small changes to rating.py and additional typing 2020-09-25 15:30:07 +02:00
SABnzbd Automation
afa0a206bc Update translatable texts 2020-09-25 11:47:00 +00:00
Safihre
57a8661988 Existing files were not parsed when re-adding a job 2020-09-25 10:49:20 +02:00
Safihre
a57b58b675 Do not crash if attributes file is not present 2020-09-25 10:43:21 +02:00
Safihre
8b051462a8 Do not crash if we can't save attributes, the job might be gone 2020-09-25 10:02:28 +02:00
Safihre
3bde8373a3 Correctly parse failed_only for Plush 2020-09-23 16:56:45 +02:00
Safihre
73df161cd0 Remove redundant "do" attribute 2020-09-23 15:40:36 +02:00
Safihre
9c83fd14bc Improve typing hints after rework of main threads 2020-09-23 13:13:36 +02:00
Safihre
ab020a0654 Rework the naming of the main SABnzbd threads 2020-09-23 13:13:36 +02:00
Safihre
14e77f3f9b Add typing hints to some SABnzbd-specific objects and general functions
Bye, Python 3.5.
Also includes fixes that I found because I added these type hints!
2020-09-23 13:13:36 +02:00
SABnzbd Automation
730d717936 Update translatable texts 2020-09-21 20:12:52 +00:00
Safihre
91a7a83cd5 Assume RarFile parses the correct filepaths for the RAR-volumes
Parsing UTF8 from command-line still fails.
https://forums.sabnzbd.org/viewtopic.php?p=122267#p122267
2020-09-21 21:31:25 +02:00
Safihre
6fb586e30f work_name would not be sanitized when adding NZBs
Closes #1615
Now with tests, yeah.
2020-09-20 11:57:29 +02:00
SABnzbd Automation
05b069ab8e Update translatable texts 2020-09-19 09:12:20 +00:00
Safihre
33a9eca696 More text-file updates for 3.2.0-develop 2020-09-19 11:11:38 +02:00
SABnzbd Automation
2b969c987c Update translatable texts 2020-09-19 08:59:41 +00:00
Safihre
f6c15490cc Set version to 3.2.0-develop and drop Python 3.5 support 2020-09-19 10:58:49 +02:00
SABnzbd Automation
da5e95595d Update translatable texts 2020-09-19 08:49:15 +00:00
145 changed files with 6324 additions and 3785 deletions


@@ -7,7 +7,7 @@ jobs:
steps:
- uses: actions/checkout@v2
- name: Black Code Formatter
uses: lgeiger/black-action@v1.0.1
uses: lgeiger/black-action@master
with:
args: >
SABnzbd.py
@@ -16,5 +16,5 @@ jobs:
tools
tests
--line-length=120
--target-version=py35
--target-version=py36
--check


@@ -0,0 +1,37 @@
name: CI Tests
on: [push, pull_request]
jobs:
test:
name: Test ${{ matrix.os }} - Python ${{ matrix.python-version }}
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: [3.6, 3.7, 3.8, 3.9]
os: [ubuntu-20.04]
include:
- os: macos-latest
python-version: 3.9
- os: windows-latest
python-version: 3.9
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install system dependencies
if: runner.os == 'Linux'
run: sudo apt-get install unrar p7zip-full par2 chromium-chromedriver
- name: Install Python dependencies
run: |
python --version
pip install --upgrade pip
pip install --upgrade -r requirements.txt
pip install --upgrade -r tests/requirements.txt
- name: Test SABnzbd
run: pytest -s


@@ -9,7 +9,7 @@ jobs:
translations:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@master
- uses: actions/checkout@v2
- name: Generate translatable texts
run: |
python3 tools/extract_pot.py
@@ -25,7 +25,7 @@ jobs:
env:
TX_TOKEN: ${{ secrets.TX_TOKEN }}
- name: Push translatable and translated texts back to repo
uses: stefanzweifel/git-auto-commit-action@master
uses: stefanzweifel/git-auto-commit-action@v4.5.1
with:
commit_message: Update translatable texts
commit_user_name: SABnzbd Automation


@@ -1,7 +0,0 @@
path_classifiers:
oldinterfaces:
- interfaces/smpl
- interfaces/Plush
library:
- "*knockout*"
- "**/*min*"


@@ -1,48 +0,0 @@
matrix:
include:
# On Linux we test all supported Python versions
# On macOS we only test the semi-recent version that is included
- os: linux
language: python
python: "3.5"
- os: linux
language: python
python: "3.6"
- os: linux
language: python
python: "3.7"
- os: linux
language: python
python: "3.8"
- os: linux
language: python
python: "3.9-dev"
- os: osx
addons:
chrome: stable
env:
- HOMEBREW_NO_AUTO_UPDATE=1
install:
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then
LATEST_CHROMEDRIVER=$(curl -s https://chromedriver.storage.googleapis.com/LATEST_RELEASE) &&
wget --no-verbose -O /tmp/chromedriver.zip https://chromedriver.storage.googleapis.com/$LATEST_CHROMEDRIVER/chromedriver_mac64.zip &&
sudo unzip /tmp/chromedriver.zip chromedriver -d /usr/local/bin/;
else
sudo add-apt-repository ppa:jcfp -y;
sudo apt-get update -q;
sudo apt-get install unrar p7zip-full par2 chromium-chromedriver -y;
ln -s /usr/lib/chromium-browser/chromedriver ~/bin/chromedriver;
fi;
- python3 --version
- python3 -m pip install --upgrade pip wheel
- python3 -m pip install --upgrade -r requirements.txt
- python3 -m pip install --upgrade -r tests/requirements.txt
script:
- python3 -m pytest -s
notifications:
email:
on_success: never
on_failure: change


@@ -1,4 +1,4 @@
SABnzbd 3.1.0
SABnzbd 3.2.0
-------------------------------------------------------------------------------
0) LICENSE
@@ -52,7 +52,7 @@ Specific guides to install from source are available for Windows and macOS:
https://sabnzbd.org/wiki/installation/install-macos
https://sabnzbd.org/wiki/installation/install-from-source-windows
Only Python 3.5 and above is supported.
Only Python 3.6 and above is supported.
On Linux systems you need to install:
par2 unrar unzip python3-setuptools python3-pip


@@ -22,7 +22,7 @@
setting the option "api_warnings" to 0.
See: https://sabnzbd.org/wiki/configuration/3.1/special
- On OSX you may encounter downloaded files with foreign characters.
- On macOS you may encounter downloaded files with foreign characters.
The par2 repair may fail when the files were created on a Windows system.
The problem is caused by the PAR2 utility and we cannot fix this now.
This does not apply to files inside RAR files.
@@ -33,25 +33,14 @@
We cannot solve this problem, because the Operating System (read Windows)
prevents the removal.
- Memory usage can sometimes have high peaks. This makes using SABnzbd on very low
memory systems (e.g. a NAS device or a router) a challenge.
In particular on Synology (SynoCommunity) the device may report that SABnzbd is using
a lot of memory even when idle. In this case the memory is usually not actually used by
SABnzbd and will be available if required by other apps or the system. More information
can be found in the discussion here: https://github.com/SynoCommunity/spksrc/issues/2856
- SABnzbd is not compatible with some software firewall versions.
The Microsoft Windows Firewall works fine, but remember to tell this
firewall that SABnzbd is allowed to talk to other computers.
- When SABnzbd cannot send notification emails, check your virus scanner,
firewall or security suite. It may be blocking outgoing email.
- When you are using external drives or network shares on OSX or Linux
- When you are using external drives or network shares on macOS or Linux
make sure that the drives are mounted.
The operating system will simply redirect your files to alternative locations.
You may have trouble finding the files when mounting the drive later.
On OSX, SABnzbd will not create new folders in /Volumes.
On macOS, SABnzbd will not create new folders in /Volumes.
The result will be a failed job that can be retried once the volume has been mounted.
- If you use a mounted drive as "temporary download folder", it must be present when SABnzbd


@@ -1,7 +1,7 @@
Metadata-Version: 1.0
Name: SABnzbd
Version: 3.1.0RC3
Summary: SABnzbd-3.1.0RC3
Version: 3.2.0-develop
Summary: SABnzbd-3.2.0-develop
Home-page: https://sabnzbd.org
Author: The SABnzbd Team
Author-email: team@sabnzbd.org


@@ -18,7 +18,7 @@ If you want to know more you can head over to our website: https://sabnzbd.org.
SABnzbd has a few dependencies you'll need before you can get running. If you've previously run SABnzbd from one of the various Linux packages, then you likely already have all the needed dependencies. If not, here's what you're looking for:
- `python` (Python 3.5 and higher, often called `python3`)
- `python` (Python 3.6 and higher, often called `python3`)
- Python modules listed in `requirements.txt`
- `par2` (Multi-threaded par2 installation guide can be found [here](https://sabnzbd.org/wiki/installation/multicore-par2))
- `unrar` (make sure you get the "official" non-free version of unrar)


@@ -1,18 +1,6 @@
Release Notes - SABnzbd 3.1.0 Release Candidate 3
Release Notes - SABnzbd 3.1.0 Release Candidate 1
=========================================================
## Changes and bugfixes since 3.1.0 Release Candidate 2
- Jobs in post-processing could be left in the "Waiting"-status.
- Notify users of `Deobfuscate.py` that it is now part of SABnzbd.
## Changes and bugfixes since 3.1.0 Release Candidate 1
- Failing RSS-feeds would result in tracebacks, they now show a warning.
- Existing files were not parsed when retrying a job.
- Reading attributes when retrying a job could result in crash.
- Temporary Folder with unicode characters could result in duplicate unpacking.
- Plush skin would only show failed jobs.
- Windows: Folders could end in a period, breaking Windows Explorer.
## Changes and bugfixes since 3.1.0 Beta 2
- Deobfuscate final filenames can now be used when job folders are disabled.
- Deobfuscate final filenames will ignore blu-ray disc files.
@@ -46,15 +34,6 @@ Release Notes - SABnzbd 3.1.0 Release Candidate 3
- Windows: non-Latin languages were displayed incorrectly in the installer.
- Windows: could fail to create folders on some network shares.
## Upgrade notices
- Jobs that failed on versions before 3.1.x, will throw an error about the
attribute file failing to load when they are retried on 3.1.0+. This error
can be ignored.
- When upgrading from 2.x.x or older the queue will be converted. Job order,
settings and data will be preserved, but if you decide to go back to 2.x.x
your queue cannot be downgraded again. But you can restore the jobs by going
to the Status page and running Queue Repair.
## Known problems and solutions
- Read the file "ISSUES.txt"


@@ -17,8 +17,8 @@
import sys
if sys.hexversion < 0x03050000:
print("Sorry, requires Python 3.5 or above")
if sys.hexversion < 0x03060000:
print("Sorry, requires Python 3.6 or above")
print("You can read more at: https://sabnzbd.org/python3")
sys.exit(1)
@@ -34,6 +34,7 @@ import subprocess
import ssl
import time
import re
from typing import List, Dict, Any
try:
import Cheetah
@@ -66,15 +67,17 @@ from sabnzbd.misc import (
get_serv_parms,
get_from_url,
upload_file_to_sabnzbd,
probablyipv4,
)
from sabnzbd.filesystem import get_ext, real_path, long_path, globber_full, remove_file
from sabnzbd.panic import panic_tmpl, panic_port, panic_host, panic, launch_a_browser
import sabnzbd.scheduler as scheduler
import sabnzbd.config as config
import sabnzbd.cfg
import sabnzbd.downloader
import sabnzbd.notifier as notifier
import sabnzbd.zconfig
from sabnzbd.getipaddress import localipv4, publicipv4, ipv6
import sabnzbd.utils.ssdp as ssdp
try:
import win32api
@@ -120,28 +123,31 @@ class GUIHandler(logging.Handler):
def __init__(self, size):
""" Initializes the handler """
logging.Handler.__init__(self)
self.size = size
self.store = []
self._size: int = size
self.store: List[Dict[str, Any]] = []
def emit(self, record):
def emit(self, record: logging.LogRecord):
""" Emit a record by adding it to our private queue """
if record.levelname == "WARNING":
sabnzbd.LAST_WARNING = record.msg % record.args
else:
sabnzbd.LAST_ERROR = record.msg % record.args
if len(self.store) >= self.size:
# Loose the oldest record
self.store.pop(0)
# If % is part of the msg, this could fail
try:
# Append traceback, if available
warning = {"type": record.levelname, "text": record.msg % record.args, "time": int(time.time())}
if record.exc_info:
warning["text"] = "%s\n%s" % (warning["text"], traceback.format_exc())
self.store.append(warning)
except UnicodeDecodeError:
# Catch elusive Unicode conversion problems
pass
parsed_msg = record.msg % record.args
except TypeError:
parsed_msg = record.msg + str(record.args)
if record.levelno == logging.WARNING:
sabnzbd.notifier.send_notification(T("Warning"), parsed_msg, "warning")
else:
sabnzbd.notifier.send_notification(T("Error"), parsed_msg, "error")
# Append traceback, if available
warning = {"type": record.levelname, "text": parsed_msg, "time": int(time.time())}
if record.exc_info:
warning["text"] = "%s\n%s" % (warning["text"], traceback.format_exc())
# Loose the oldest record
if len(self.store) >= self._size:
self.store.pop(0)
self.store.append(warning)
def clear(self):
self.store = []
@@ -243,7 +249,7 @@ def daemonize():
# Get log file path and remove the log file if it got too large
log_path = os.path.join(sabnzbd.cfg.log_dir.get_path(), DEF_LOG_ERRFILE)
if os.path.exists(log_path) and os.path.getsize(log_path) > sabnzbd.cfg.log_size.get_int():
if os.path.exists(log_path) and os.path.getsize(log_path) > sabnzbd.cfg.log_size():
remove_file(log_path)
# Replace file descriptors for stdin, stdout, and stderr
@@ -330,7 +336,6 @@ def get_user_profile_paths(vista_plus):
if sabnzbd.DAEMON:
# In daemon mode, do not try to access the user profile
# just assume that everything defaults to the program dir
sabnzbd.DIR_APPDATA = sabnzbd.DIR_PROG
sabnzbd.DIR_LCLDATA = sabnzbd.DIR_PROG
sabnzbd.DIR_HOME = sabnzbd.DIR_PROG
if sabnzbd.WIN32:
@@ -344,8 +349,6 @@ def get_user_profile_paths(vista_plus):
try:
from win32com.shell import shell, shellcon
path = shell.SHGetFolderPath(0, shellcon.CSIDL_APPDATA, None, 0)
sabnzbd.DIR_APPDATA = os.path.join(path, DEF_WORKDIR)
path = shell.SHGetFolderPath(0, shellcon.CSIDL_LOCAL_APPDATA, None, 0)
sabnzbd.DIR_LCLDATA = os.path.join(path, DEF_WORKDIR)
sabnzbd.DIR_HOME = os.environ["USERPROFILE"]
@@ -354,18 +357,16 @@ def get_user_profile_paths(vista_plus):
if vista_plus:
root = os.environ["AppData"]
user = os.environ["USERPROFILE"]
sabnzbd.DIR_APPDATA = "%s\\%s" % (root.replace("\\Roaming", "\\Local"), DEF_WORKDIR)
sabnzbd.DIR_LCLDATA = "%s\\%s" % (root.replace("\\Roaming", "\\Local"), DEF_WORKDIR)
sabnzbd.DIR_HOME = user
else:
root = os.environ["USERPROFILE"]
sabnzbd.DIR_APPDATA = "%s\\%s" % (root, DEF_WORKDIR)
sabnzbd.DIR_LCLDATA = "%s\\%s" % (root, DEF_WORKDIR)
sabnzbd.DIR_HOME = root
sabnzbd.DIR_LCLDATA = sabnzbd.DIR_APPDATA
except:
pass
# Long-path everything
sabnzbd.DIR_APPDATA = long_path(sabnzbd.DIR_APPDATA)
sabnzbd.DIR_LCLDATA = long_path(sabnzbd.DIR_LCLDATA)
sabnzbd.DIR_HOME = long_path(sabnzbd.DIR_HOME)
return
@@ -373,16 +374,14 @@ def get_user_profile_paths(vista_plus):
elif sabnzbd.DARWIN:
home = os.environ.get("HOME")
if home:
sabnzbd.DIR_APPDATA = "%s/Library/Application Support/SABnzbd" % home
sabnzbd.DIR_LCLDATA = sabnzbd.DIR_APPDATA
sabnzbd.DIR_LCLDATA = "%s/Library/Application Support/SABnzbd" % home
sabnzbd.DIR_HOME = home
return
else:
# Unix/Linux
home = os.environ.get("HOME")
if home:
sabnzbd.DIR_APPDATA = "%s/.%s" % (home, DEF_WORKDIR)
sabnzbd.DIR_LCLDATA = sabnzbd.DIR_APPDATA
sabnzbd.DIR_LCLDATA = "%s/.%s" % (home, DEF_WORKDIR)
sabnzbd.DIR_HOME = home
return
@@ -532,7 +531,7 @@ def get_webhost(cherryhost, cherryport, https_port):
# Valid user defined name?
info = socket.getaddrinfo(cherryhost, None)
except socket.error:
if cherryhost not in ("localhost", "127.0.0.1", "::1"):
if cherryhost not in LOCALHOSTS:
cherryhost = "0.0.0.0"
try:
info = socket.getaddrinfo(localhost, None)
@@ -599,7 +598,7 @@ def get_webhost(cherryhost, cherryport, https_port):
except socket.error:
cherryhost = cherryhost.strip("[]")
if ipv6 and ipv4 and (browserhost not in ("localhost", "127.0.0.1", "[::1]", "::1")):
if ipv6 and ipv4 and browserhost not in LOCALHOSTS:
sabnzbd.AMBI_LOCALHOST = True
logging.info("IPV6 has priority on this system, potential Firefox issue")
@@ -614,7 +613,7 @@ def get_webhost(cherryhost, cherryport, https_port):
if ips[0] != "127.0.0.1":
browserhost = "127.0.0.1"
# This is to please Chrome on OSX
# This is to please Chrome on macOS
if cherryhost == "localhost" and sabnzbd.DARWIN:
cherryhost = "127.0.0.1"
browserhost = "localhost"
@@ -731,7 +730,7 @@ def commandline_handler():
serv_opts = [os.path.normpath(os.path.abspath(sys.argv[0]))]
upload_nzbs = []
# OSX binary: get rid of the weird -psn_0_123456 parameter
# macOS binary: get rid of the weird -psn_0_123456 parameter
for arg in sys.argv:
if arg.startswith("-psn_"):
sys.argv.remove(arg)
@@ -1117,7 +1116,7 @@ def main():
try:
if not no_file_log:
rollover_log = logging.handlers.RotatingFileHandler(
sabnzbd.LOGFILE, "a+", sabnzbd.cfg.log_size.get_int(), sabnzbd.cfg.log_backups()
sabnzbd.LOGFILE, "a+", sabnzbd.cfg.log_size(), sabnzbd.cfg.log_backups()
)
rollover_log.setFormatter(logging.Formatter(logformat))
logger.addHandler(rollover_log)
@@ -1139,8 +1138,19 @@ def main():
if no_file_log:
logging.info("Console logging only")
# Start SABnzbd
logging.info("--------------------------------")
logging.info("%s-%s (rev=%s)", sabnzbd.MY_NAME, sabnzbd.__version__, sabnzbd.__baseline__)
logging.info("%s-%s", sabnzbd.MY_NAME, sabnzbd.__version__)
# See if we can get version from git when running an unknown revision
if sabnzbd.__baseline__ == "unknown":
try:
sabnzbd.__baseline__ = sabnzbd.misc.run_command(
["git", "rev-parse", "--short", "HEAD"], cwd=sabnzbd.DIR_PROG
).strip()
except:
pass
logging.info("Commit: %s", sabnzbd.__baseline__)
logging.info("Full executable path = %s", sabnzbd.MY_FULLNAME)
if sabnzbd.WIN32:
suffix = ""
@@ -1171,10 +1181,6 @@ def main():
sabnzbd.encoding.CODEPAGE,
)
# TODO: Remove after 3.1.0
if sys.hexversion < 0x03060000:
logging.warning_helpful("Python 3.5 is end-of-life. SABnzbd 3.2.0 will only run on Python 3.6 and above.")
# SSL Information
logging.info("SSL version = %s", ssl.OPENSSL_VERSION)
@@ -1198,9 +1204,6 @@ def main():
ctx = ssl.create_default_context()
logging.debug("Available certificates: %s", repr(ctx.cert_store_stats()))
# Show IPv4/IPv6 address
from sabnzbd.getipaddress import localipv4, publicipv4, ipv6
mylocalipv4 = localipv4()
if mylocalipv4:
logging.debug("My local IPv4 address = %s", mylocalipv4)
@@ -1236,7 +1239,7 @@ def main():
if autobrowser is not None:
sabnzbd.cfg.autobrowser.set(autobrowser)
sabnzbd.initialize(pause, clean_up, evalSched=True, repair=repair)
sabnzbd.initialize(pause, clean_up, repair=repair)
os.chdir(sabnzbd.DIR_PROG)
@@ -1481,25 +1484,37 @@ def main():
check_latest_version()
autorestarted = False
# ZeroConfig/Bonjour needs a ip. Lets try to find it.
try:
z_host = socket.gethostbyname(socket.gethostname())
except socket.gaierror:
z_host = cherryhost
sabnzbd.zconfig.set_bonjour(z_host, cherryport)
# bonjour/zeroconf needs an ip. Lets try to find it.
external_host = localipv4() # IPv4 address of the LAN interface. This is the normal use case
if not external_host:
# None, so no network / default route, so let's set to ...
external_host = "127.0.0.1"
elif probablyipv4(cherryhost) and cherryhost not in LOCALHOSTS + ("0.0.0.0", "::"):
# a hard-configured cherryhost other than the usual, so let's take that (good or wrong)
external_host = cherryhost
logging.debug("bonjour/zeroconf/SSDP using host: %s", external_host)
sabnzbd.zconfig.set_bonjour(external_host, cherryport)
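The host-selection order above (LAN IPv4 first, loopback as a last resort, an explicitly configured non-wildcard host winning over a detected one) can be sketched in isolation. Here `probably_ipv4` is a hypothetical stand-in for sabnzbd's `probablyipv4()` helper and the `LOCALHOSTS` tuple is an assumption:

```python
import ipaddress

LOCALHOSTS = ("localhost", "127.0.0.1", "::1")  # assumed to match sabnzbd's constant

def probably_ipv4(value):
    # Hypothetical stand-in for sabnzbd's probablyipv4() helper
    try:
        ipaddress.IPv4Address(value)
        return True
    except ValueError:
        return False

def pick_announce_host(local_ipv4, cherryhost):
    # Same selection order as the diff: LAN IPv4 if we have one, else loopback;
    # a hard-configured non-wildcard cherryhost overrides the detected address.
    if not local_ipv4:
        return "127.0.0.1"
    if probably_ipv4(cherryhost) and cherryhost not in LOCALHOSTS + ("0.0.0.0", "::"):
        return cherryhost
    return local_ipv4
```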
# Start SSDP if SABnzbd is running exposed
if cherryhost not in LOCALHOSTS:
# Set URL for browser for external hosts
if enable_https:
ssdp_url = "https://%s:%s%s" % (external_host, cherryport, sabnzbd.cfg.url_base())
else:
ssdp_url = "http://%s:%s%s" % (external_host, cherryport, sabnzbd.cfg.url_base())
ssdp.start_ssdp(
external_host,
"SABnzbd",
ssdp_url,
"SABnzbd %s" % sabnzbd.__version__,
"SABnzbd Team",
"https://sabnzbd.org/",
"SABnzbd %s" % sabnzbd.__version__,
)
# Have to keep this running, otherwise logging will terminate
timer = 0
while not sabnzbd.SABSTOP:
if sabnzbd.LAST_WARNING:
msg = sabnzbd.LAST_WARNING
sabnzbd.LAST_WARNING = None
sabnzbd.notifier.send_notification(T("Warning"), msg, "warning")
if sabnzbd.LAST_ERROR:
msg = sabnzbd.LAST_ERROR
sabnzbd.LAST_ERROR = None
sabnzbd.notifier.send_notification(T("Error"), msg, "error")
time.sleep(3)
# Check for loglevel changes
@@ -1516,7 +1531,7 @@ def main():
# Keep OS awake (if needed)
sabnzbd.keep_awake()
# Restart scheduler (if needed)
scheduler.restart()
sabnzbd.Scheduler.restart(plan_restart=False)
# Save config (if needed)
config.save_config()
# Check the threads
@@ -1530,17 +1545,18 @@ def main():
# Check for auto-restart request
# Or special restart cases like Mac and WindowsService
if sabnzbd.TRIGGER_RESTART:
logging.info("Performing triggered restart")
# Shutdown
sabnzbd.shutdown_program()
if sabnzbd.downloader.Downloader.do.paused:
if sabnzbd.Downloader.paused:
sabnzbd.RESTART_ARGS.append("-p")
if autorestarted:
sabnzbd.RESTART_ARGS.append("--autorestarted")
sys.argv = sabnzbd.RESTART_ARGS
os.chdir(org_dir)
# If OSX frozen restart of app instead of embedded python
# If macOS frozen restart of app instead of embedded python
if hasattr(sys, "frozen") and sabnzbd.DARWIN:
# [[NSProcessInfo processInfo] processIdentifier]]
# logging.info("%s" % (NSProcessInfo.processInfo().processIdentifier()))
@@ -1548,7 +1564,7 @@ def main():
my_name = sabnzbd.MY_FULLNAME.replace("/Contents/MacOS/SABnzbd", "")
my_args = " ".join(sys.argv[1:])
cmd = 'kill -9 %s && open "%s" --args %s' % (my_pid, my_name, my_args)
logging.info("Launching: ", cmd)
logging.info("Launching: %s", cmd)
os.system(cmd)
elif sabnzbd.WIN_SERVICE:
# Use external service handler to do the restart
@@ -1709,7 +1725,7 @@ if __name__ == "__main__":
elif sabnzbd.DARWIN and sabnzbd.FOUNDATION:
# OSX binary runner
# macOS binary runner
from threading import Thread
from PyObjCTools import AppHelper
from AppKit import NSApplication


@@ -1,14 +0,0 @@
environment:
# We only test the latest Python version
matrix:
- PYTHON: "C:\\Python38-x64"
install:
- "SET PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%"
- python --version
- python -m pip install --upgrade pip wheel
- python -m pip install --upgrade -r requirements.txt
- python -m pip install --upgrade -r tests/requirements.txt
build_script:
- python -m pytest -s


@@ -9,7 +9,7 @@
<tbody>
<tr>
<th scope="row">$T('version'): </th>
<td>$version [$build]</td>
<td>$version [<a href="https://github.com/sabnzbd/sabnzbd/commit/$build" target="_blank">$build</a>]</td>
</tr>
<tr>
<th scope="row">$T('uptime'): </th>


@@ -50,7 +50,7 @@ else:
<select name="action" id="action">
<optgroup label="$T('sch-action')">
<!--#for $action in $actions#-->
<option value="$action" data-action="" data-noarg="<!--#if $action is 'speedlimit' then 0 else 1#-->">$actions_lng[$action]</option>
<option value="$action" data-action="" data-noarg="<!--#if $action == 'speedlimit' then 0 else 1#-->">$actions_lng[$action]</option>
<!--#end for#-->
</optgroup>
<optgroup label="$T('cmenu-servers')">


@@ -2,44 +2,8 @@
<!--#set global $help_uri="configuration/3.1/servers"#-->
<!--#include $webdir + "/_inc_header_uc.tmpl"#-->
<!--
We need to find how many months we have recorded so far, so we
loop over all the dates to find the lowest value and then use
this to calculate the date-selector and maximum value per month.
-->
<!--#import json#-->
<!--#import datetime#-->
<!--#import sabnzbd.misc#-->
<!--#set month_names = [$T('January'), $T('February'), $T('March'), $T('April'), $T('May'), $T('June'), $T('July'), $T('August'), $T('September'), $T('October'), $T('November'), $T('December')] #-->
<!--#set min_date = datetime.date.today()#-->
<!--#set max_data_all = {}#-->
<!--#for $server in $servers #-->
<!--#if 'amounts' in $server#-->
<!--#set max_data_server = {}#-->
<!--#for date in $server['amounts'][4]#-->
<!--#set split_date = $date.split('-')#-->
<!--#set min_date = min(min_date, datetime.date(int(split_date[0]), int(split_date[1]), 1))#-->
<!--#set month_date = $date[:7]#-->
<!--#if $month_date not in $max_data_server#-->
<!--#set max_data_server[$month_date] = 0#-->
<!--#end if#-->
<!--#set max_data_server[$month_date] = max(max_data_server[$month_date], $server['amounts'][4][$date])#-->
<!--#end for#-->
<!--#for month_date in max_data_server#-->
<!--#if $month_date not in $max_data_all#-->
<!--#set max_data_all[$month_date] = 0#-->
<!--#end if#-->
<!--#set max_data_all[$month_date] = max(max_data_all[$month_date], max_data_server[$month_date])#-->
<!--#end for#-->
<!--#end if#-->
<!--#end for#-->
<!--#set months_recorded = list(sabnzbd.misc.monthrange(min_date, datetime.date.today()))#-->
<!--#$months_recorded.reverse()#-->
<script type="text/javascript">
// Define variable needed for the server-plots
@@ -53,21 +17,13 @@
<input type="checkbox" id="advanced-settings-button" name="advanced-settings-button"> $T('button-advanced')
</label>
<!--#if $months_recorded#-->
<div class="advanced-buttonSeperator"></div>
<div class="chart-selector-container" title="$T('srv-bandwidth')">
<span class="glyphicon glyphicon-signal"></span>
<select name="chart-selector" id="chart-selector">
<!--#for $cur_date in months_recorded#-->
<!--#set month_date = '%d-%02d' % ($cur_date.year, $cur_date.month)#-->
<!--#if $month_date not in $max_data_all#-->
<!--#set max_data_all[$month_date] = 0#-->
<!--#end if#-->
<option value="$month_date" data-max="$max_data_all[$month_date]">$month_names[$cur_date.month-1] $cur_date.year</option>
<!--#end for#-->
</select>
<!--#set today = datetime.date.today()#-->
<input type="date" name="chart-start" id="chart-start" value="<!--#echo (today-datetime.timedelta(days=30)).strftime('%Y-%m-%d')#-->"> -
<input type="date" name="chart-end" id="chart-end" value="<!--#echo today.strftime('%Y-%m-%d')#-->">
</div>
<!--#end if#-->
</div>
<div class="section" id="addServerContent" style="display: none;">
<div class="col2">
@@ -287,7 +243,7 @@
$T('today'): $(server['amounts'][3])B<br/>
$T('thisWeek'): $(server['amounts'][2])B<br/>
$T('thisMonth'): $(server['amounts'][1])B<br/>
<span id="server-data-label-${cur}"></span>: <span id="server-data-value-${cur}"></span>
$T('custom'): <span id="server-data-value-${cur}"></span>
</div>
<div class="server-chart" data-serverid="${cur}">
<div id="server-chart-${cur}" class="ct-chart"></div>
@@ -331,58 +287,72 @@
}
function showCharts() {
// This month
var theMonth = \$('#chart-selector').val()
var thisDay = new Date()
// Get the constants
const startDate = new Date(\$('#chart-start').val())
const endDate = new Date(\$('#chart-end').val())
const oneDay = 24 * 60 * 60 * 1000
const nrDays = Math.round((endDate-startDate)/oneDay)
// What month are we doing?
var inputDate = new Date(theMonth+'-01')
var baseDate = new Date(inputDate.getUTCFullYear(), inputDate.getUTCMonth(), 1)
var maxDaysInMonth = new Date(baseDate.getFullYear(), baseDate.getMonth()+1, 0).getDate()
// Show only maximum 10 labels to avoid cluttering
const labelStep = Math.round(nrDays/10)
// Set the new maximum
chartOptions.axisY.high = \$('#chart-selector :selected').data('max');
chartOptions.axisY.low = 0
// Save largest value
var maxVal = 0
// For each chart
\$('.server-chart').each(function(i, elemn) {
var server_id = \$(elemn).data('serverid')
\$('.server-chart').each(function(j, elemn) {
const server_id = \$(elemn).data('serverid')
var totalThisRange = 0
// Fill the data array
var data = {
labels: [],
series: [[]]
};
var totalThisMonth = 0
for(var i = 1; i < maxDaysInMonth+1; i++) {
for(var i = 0; i < nrDays+1; i++) {
// Update the date
const checkDate = new Date(startDate)
checkDate.setDate(checkDate.getDate() + i);
// Add X-label
if(i % 3 == 1) {
data['labels'].push(i)
if(i % labelStep === 0) {
data['labels'].push(checkDate.getDate())
} else {
data['labels'].push(NaN)
}
// Get formatted date
baseDate.setDate(i)
var dateCheck = toFormattedDate(baseDate)
// Date we can check in the array
const dateCheck = toFormattedDate(checkDate)
// Add data if we have it
if(dateCheck in serverData[server_id]) {
data['series'][0].push(serverData[server_id][dateCheck])
totalThisMonth += serverData[server_id][dateCheck]
} else if(thisDay.getYear() == baseDate.getYear() && thisDay.getMonth() == baseDate.getMonth() && thisDay.getDate() < i) {
data['series'][0].push(NaN)
} else {
totalThisRange += serverData[server_id][dateCheck]
maxVal = Math.max(maxVal, serverData[server_id][dateCheck])
} else {
data['series'][0].push(0)
}
}
// Update the text value
\$('#server-data-label-' + server_id).text(\$('#chart-selector :selected').text())
\$('#server-data-value-' + server_id).text(filesize(totalThisMonth, {round: 1}))
\$('#server-data-value-' + server_id).text(filesize(totalThisRange, {round: 1}))
// Save data in a very ugly way, but we need to do this
// so we can calculate the maximum Y-axis for all graphs
\$(elemn).data("chart-data", data)
})
// Set the maximum
chartOptions.axisY.high = maxVal;
chartOptions.axisY.low = 0
// Update all the axis with the largest value and draw the graph
\$('.server-chart').each(function(j, elemn) {
const server_id = \$(elemn).data('serverid')
// Show the chart
chart = new Chartist.Line('#server-chart-'+server_id, data, chartOptions);
chart = new Chartist.Line('#server-chart-'+server_id, \$(elemn).data("chart-data"), chartOptions)
chart.on('created', function(context) {
// Make sure to add this as the first child so it's at the bottom
context.svg.elem('rect', {
@@ -391,7 +361,7 @@
width: context.chartRect.width(),
height: context.chartRect.height()+2,
fill: 'none',
stroke: '#B9B9B9',
stroke: '#b9b9b9',
'stroke-width': '1px'
}, '', context.svg, true)
\$('#server-chart-'+server_id+' .ct-label.ct-vertical').each(function(index, elmn) {
@@ -399,6 +369,10 @@
})
});
})
// Limit input to sensible values
\$('#chart-start').attr("max", \$('#chart-end').val())
\$('#chart-end').attr("min", \$('#chart-start').val())
}
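The reworked showCharts() walks every day in the selected range, prints an X-label only about every tenth day, and accumulates the per-server total for the range. The same iteration, ported to Python for clarity — the function and variable names are ours, not the template's:

```python
from datetime import date, timedelta

def build_chart_series(server_data, start, end):
    # Walk each day from start to end inclusive, as the template's JS does;
    # label roughly every tenth day and sum the downloaded amount in range.
    # server_data maps "YYYY-MM-DD" strings to byte counts.
    nr_days = (end - start).days
    label_step = max(1, round(nr_days / 10))
    labels, series, total = [], [], 0
    for i in range(nr_days + 1):
        day = start + timedelta(days=i)
        labels.append(day.day if i % label_step == 0 else None)
        amount = server_data.get(day.isoformat(), 0)
        series.append(amount)
        total += amount
    return labels, series, total
```

Unlike the fixed-month version it replaces, the loop length is derived from the two date pickers, so the label density has to adapt (`label_step`) instead of hard-coding every third day.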
// Need to mitigate timezone effects!
@@ -425,7 +399,7 @@
/**
Update charts when changed
**/
\$('#chart-selector').on('change', function(elemn) {
\$('#chart-start, #chart-end').on('change', function(elemn) {
showCharts()
// Lets us leave (needs to be called after the change event)


@@ -795,6 +795,7 @@ input[type="submit"]:hover {
input[type="text"],
input[type="email"],
input[type="url"],
input[type="date"],
input[type="number"],
input[type="password"],
textarea,


@@ -228,11 +228,11 @@ function QueueListModel(parent) {
switch($(event.currentTarget).data('action')) {
case 'sortAgeAsc':
sort = 'avg_age';
dir = 'asc';
dir = 'desc';
break;
case 'sortAgeDesc':
sort = 'avg_age';
dir = 'desc';
dir = 'asc';
break;
case 'sortNameAsc':
sort = 'name';
@@ -751,4 +751,4 @@ function QueueModel(parent, data) {
});
}
};
}
}


@@ -103,6 +103,7 @@
<span id="warning_box"><b><a href="${path}status/#tabs-warnings" id="last_warning"><span id="have_warnings">$have_warnings</span> $T('warnings')</a></b></span>
#if $pane=="Main"#
#if $new_release#&sdot; <a href="$new_rel_url" id="new_release" target="_blank">$T('Plush-updateAvailable').replace(' ','&nbsp;')</a>#end if#
This skin is no longer actively maintained! <a href="${path}config/general/#web_dir"><strong>We recommend using the Glitter skin.</strong></a>
#end if#
</div>
</div>


@@ -306,8 +306,8 @@ jQuery(function($){
$('#queue_sort_list .queue_sort').click(function(event) {
var sort, dir;
switch ($(this).attr('id')) {
case 'sortAgeAsc': sort='avg_age'; dir='asc'; break;
case 'sortAgeDesc': sort='avg_age'; dir='desc'; break;
case 'sortAgeAsc': sort='avg_age'; dir='desc'; break;
case 'sortAgeDesc': sort='avg_age'; dir='asc'; break;
case 'sortNameAsc': sort='name'; dir='asc'; break;
case 'sortNameDesc': sort='name'; dir='desc'; break;
case 'sortSizeAsc': sort='size'; dir='asc'; break;


@@ -5,7 +5,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: team@sabnzbd.org\n"
"Language-Team: SABnzbd <team@sabnzbd.org>\n"


@@ -4,7 +4,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Language-Team: Czech (https://www.transifex.com/sabnzbd/teams/111101/cs/)\n"
"MIME-Version: 1.0\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Danish (https://www.transifex.com/sabnzbd/teams/111101/da/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: German (https://www.transifex.com/sabnzbd/teams/111101/de/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Spanish (https://www.transifex.com/sabnzbd/teams/111101/es/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Finnish (https://www.transifex.com/sabnzbd/teams/111101/fi/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: French (https://www.transifex.com/sabnzbd/teams/111101/fr/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: ION, 2020\n"
"Language-Team: Hebrew (https://www.transifex.com/sabnzbd/teams/111101/he/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Norwegian Bokmål (https://www.transifex.com/sabnzbd/teams/111101/nb/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Dutch (https://www.transifex.com/sabnzbd/teams/111101/nl/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Polish (https://www.transifex.com/sabnzbd/teams/111101/pl/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Portuguese (Brazil) (https://www.transifex.com/sabnzbd/teams/111101/pt_BR/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Romanian (https://www.transifex.com/sabnzbd/teams/111101/ro/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Russian (https://www.transifex.com/sabnzbd/teams/111101/ru/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Serbian (https://www.transifex.com/sabnzbd/teams/111101/sr/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Swedish (https://www.transifex.com/sabnzbd/teams/111101/sv/)\n"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:56+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Chinese (China) (https://www.transifex.com/sabnzbd/teams/111101/zh_CN/)\n"


@@ -5,7 +5,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: team@sabnzbd.org\n"
"Language-Team: SABnzbd <team@sabnzbd.org>\n"
@@ -13,6 +13,16 @@ msgstr ""
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr ""
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr ""
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -100,16 +110,6 @@ msgstr ""
msgid "SABnzbd %s started"
msgstr ""
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr ""
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr ""
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr ""
@@ -329,10 +329,6 @@ msgstr ""
msgid "UNC path \"%s\" not allowed here"
msgstr ""
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr ""
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr ""
@@ -718,15 +714,6 @@ msgstr ""
msgid "Running script"
msgstr ""
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr ""
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr ""
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -788,11 +775,6 @@ msgstr ""
msgid "Unpacking"
msgstr ""
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr ""
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr ""
@@ -1476,6 +1458,10 @@ msgstr ""
msgid "see logfile"
msgstr ""
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr ""
@@ -1578,10 +1564,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr ""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr ""
@@ -1590,10 +1572,15 @@ msgstr ""
msgid "Server side error (server code %s); could not get %s on %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr ""
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr ""
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr ""
@@ -1682,6 +1669,16 @@ msgstr ""
msgid "Join files"
msgstr ""
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr ""
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr ""
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2014,6 +2011,10 @@ msgstr ""
msgid "Total"
msgstr ""
#: sabnzbd/skintext.py
msgid "Custom"
msgstr ""
#: sabnzbd/skintext.py
msgid "on"
msgstr ""
@@ -3162,7 +3163,7 @@ msgid "Post-Process Only Verified Jobs"
msgstr ""
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid "Only unpack and run scripts on jobs that passed the verification stage. If turned off, all jobs will be marked as Completed even if they are incomplete."
msgstr ""
#: sabnzbd/skintext.py
@@ -3865,7 +3866,7 @@ msgstr ""
msgid "Enable NotifyOSD"
msgstr ""
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr ""
@@ -4496,10 +4497,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr ""
#: sabnzbd/skintext.py
msgid "Custom"
msgstr ""
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr ""


File diff suppressed because it is too large


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Danish (https://www.transifex.com/sabnzbd/teams/111101/da/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: da\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Advarsel"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Fejl"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -116,16 +126,6 @@ msgstr "Kunne ikke starte web-grænseflade: "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s startet"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Advarsel"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Fejl"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "SABnzbd lukning udført"
@@ -353,10 +353,6 @@ msgstr "%s er ikke et korrekt ciffer værdi"
msgid "UNC path \"%s\" not allowed here"
msgstr "UNC søgning \"%s\" er ikke tilladt her"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Fejl: Sti længde bør være under %s."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Fejl: Køen er ikke tom, kan ikke skifte mappe."
@@ -781,15 +777,6 @@ msgstr "Python script \"%s\" har ikke udfør (+x) tilladelsessæt"
msgid "Running script"
msgstr "Køre script"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Efterbehandling blev afbrudt (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -851,11 +838,6 @@ msgstr "Udpakning mislykkedes, arkivet kræver adgangskode"
msgid "Unpacking"
msgstr "Udpakker"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Udpak"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Udpakning mislykkedes, kunne ikke finde %s"
@@ -1584,6 +1566,10 @@ msgstr "Efterbehandling mislykkedes for %s (%s)"
msgid "see logfile"
msgstr "se logfil"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Download mislykkedes"
@@ -1690,10 +1676,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Forkert RSS-feed beskrivelse \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Mislykkedes at hente RSS fra %s: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Har ikke gyldig godkendelse til feed %s"
@@ -1702,10 +1684,15 @@ msgstr "Har ikke gyldig godkendelse til feed %s"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "Server fejl (server kode %s); kunne ikke få %s på %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Mislykkedes at hente RSS fra %s: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Server %s bruger et upålideligt HTTPS-certifikat"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "RSS Feed %s er tom"
@@ -1795,6 +1782,16 @@ msgstr "Downloader"
msgid "Join files"
msgstr "Sammenlægger filer"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Udpak"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2127,6 +2124,10 @@ msgstr "I dag"
msgid "Total"
msgstr "Totalt"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Tilpasse"
#: sabnzbd/skintext.py
msgid "on"
msgstr "på"
@@ -3376,8 +3377,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Efterbehandling kun verificerede jobs"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgstr "Kun udføre efterbehandling af jobs som har bestået PAR2 kontrollen."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4137,7 +4141,7 @@ msgstr "Notifikation sendt!"
msgid "Enable NotifyOSD"
msgstr "Aktiver NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Notification Center"
@@ -4792,10 +4796,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr "Glitter har nogle (nye) egenskaber, du kan lide!"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Tilpasse"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr "Kompakt layout"


@@ -5,12 +5,15 @@
# Translators:
# N S <reloxx@interia.pl>, 2020
# Safihre <safihre@sabnzbd.org>, 2020
# C E <githubce@eiselt.ch>, 2020
# Nikolai Bohl <n.kay01@gmail.com>, 2020
# hotio, 2020
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Last-Translator: hotio, 2020\n"
"Language-Team: German (https://www.transifex.com/sabnzbd/teams/111101/de/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -18,6 +21,16 @@ msgstr ""
"Language: de\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Achtung"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Fehler"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -100,7 +113,7 @@ msgstr ""
#. Warning message
#: SABnzbd.py
msgid "Could not load additional certificates from certifi package"
msgstr ""
msgstr "Konnte weitere Zertifikate vom Paket certifi nicht laden."
#. Warning message
#: SABnzbd.py
@@ -122,16 +135,6 @@ msgstr "Fehler beim Starten der Web-Oberfläche: "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s gestartet"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Achtung"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Fehler"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "SABnzbd wurde beendet"
@@ -178,7 +181,7 @@ msgstr "Fehler beim Laden von %s"
#. Warning message
#: sabnzbd/__init__.py
msgid "Cannot access PID file %s"
msgstr ""
msgstr "Zugriff auf PID Datei %s nicht möglich"
#: sabnzbd/api.py, sabnzbd/emailer.py
msgid "Email succeeded"
@@ -363,10 +366,6 @@ msgstr "%s ist kein gültiger Oktal-Wert"
msgid "UNC path \"%s\" not allowed here"
msgstr "UNC-Pfad \"%s\" ist hier nicht erlaubt"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Fehler: Dateipfadlänge sollte kürzer als %s sein."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr ""
@@ -682,6 +681,8 @@ msgid ""
"The Completed Download Folder cannot be the same or a subfolder of the "
"Temporary Download Folder"
msgstr ""
"Der \"Abgeschlossene Downloads\"-Ordner darf kein Unterordner des "
"\"Temporäre Downloads\"-Ordners sein."
#: sabnzbd/interface.py
msgid "Warning: LOCALHOST is ambiguous, use numerical IP-address."
@@ -771,7 +772,7 @@ msgstr "m"
#. Error message
#: sabnzbd/misc.py
msgid "Failed to upload file: %s"
msgstr ""
msgstr "Hochladen der Datei %s fehlgeschlagen"
#. Error message
#: sabnzbd/misc.py
@@ -791,7 +792,7 @@ msgstr ""
#. Warning message
#: sabnzbd/misc.py
msgid "Failed to read the password file %s"
msgstr ""
msgstr "Konnte die Passwortdatei %s nicht lesen"
#. Error message
#: sabnzbd/misc.py
@@ -807,15 +808,6 @@ msgstr "Dem Pythonskript \"%s\" fehlen die Ausführungsrechte (+x)"
msgid "Running script"
msgstr "Ausführen des Skripts"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Nachbearbeitung wurde abgebrochen (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Skript"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -877,11 +869,6 @@ msgstr "Entpacken fehlgeschlagen. Archiv benötigt ein Passwort."
msgid "Unpacking"
msgstr "Entpacken"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Entpacken"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Entpacken fehlgeschlagen. Konnte %s nicht finden."
@@ -1632,6 +1619,10 @@ msgstr "Nachbearbeitung von %s fehlgeschlagen (%s)"
msgid "see logfile"
msgstr "Beachten Sie die Protokolldatei"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr "Nachbearbeitung wurde abgebrochen"
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Download Fehlgeschlagen"
@@ -1685,7 +1676,7 @@ msgstr "RAR-Datei konnten nicht überprüft werden"
#. Warning message
#: sabnzbd/postproc.py
msgid "No matching earlier rar file for %s"
msgstr ""
msgstr "Keine zugehörige frühere RAR-Datei für %s"
#. Error message
#: sabnzbd/postproc.py
@@ -1710,7 +1701,7 @@ msgstr "Fehler beim Herunterfahren des Systems"
#. Error message
#: sabnzbd/powersup.py
msgid "Received a DBus exception %s"
msgstr ""
msgstr "DBus-Ausnahmefehler empfangen %s "
#. Warning message
#: sabnzbd/rating.py
@@ -1738,10 +1729,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Ungültige RSS-Feed-Beschreibung \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Abrufen des RSS-Feeds von %s fehlgeschlagen: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Keine gültige Berechtigung für Feed %s"
@@ -1750,10 +1737,15 @@ msgstr "Keine gültige Berechtigung für Feed %s"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "Server-Fehler (Code %s); konnte %s von %s nicht laden"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Abrufen des RSS-Feeds von %s fehlgeschlagen: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Der Server %s nutzt ein nicht vertrauenswürdiges HTTPS-Zertifikat"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "RSS-Feed %s war leer"
@@ -1843,6 +1835,16 @@ msgstr "Herunterladen"
msgid "Join files"
msgstr "Dateien zusammenfügen"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Entpacken"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Skript"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2175,6 +2177,10 @@ msgstr "Heute"
msgid "Total"
msgstr "Gesamt"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Benutzerdefiniert"
#: sabnzbd/skintext.py
msgid "on"
msgstr "An"
@@ -3455,10 +3461,14 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Nur überprüfte Aufträge nachbearbeiten"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Die Nachbearbeitung nur für Aufträge durchführen,<br />die alle "
"PAR2-Überprüfungen bestanden haben."
"Entpacken und starten von Skripten nur bei verifizierten Jobs. Wenn "
"ausgeschaltet werden alle Jobs als vollständig markiert, selbst wenn sie "
"unvollständig sind."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -3724,13 +3734,15 @@ msgstr "Nach dem Download löschen"
#: sabnzbd/skintext.py
msgid "Deobfuscate final filenames"
msgstr ""
msgstr "Entschleiere finale Dateinamen"
#: sabnzbd/skintext.py
msgid ""
"If filenames of (large) files in the final folder look obfuscated or "
"meaningless they will be renamed to the job name."
msgstr ""
"Dateinamen von (großen) Dateien im Zielordner werden in den Auftragsnamen "
"umbenannt, wenn sie verschleiert oder bedeutungslos aussehen."
#: sabnzbd/skintext.py
msgid "HTTPS certificate verification"
@@ -4244,7 +4256,7 @@ msgstr "Benachrichtigung gesendet!"
msgid "Enable NotifyOSD"
msgstr "NotifyOSD aktivieren"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Benachrichtigungscenter"
@@ -4901,10 +4913,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr "Glitter hat ein paar (neue) Feature die du bestimmt magst!"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Benutzerdefiniert"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr "Kompaktes Layout"


@@ -8,7 +8,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Spanish (https://www.transifex.com/sabnzbd/teams/111101/es/)\n"
@@ -18,6 +18,16 @@ msgstr ""
"Language: es\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Advertencia"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Se ha producido un error"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -122,16 +132,6 @@ msgstr "Error al iniciar la interfaz web: "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s comenzó"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Advertencia"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Se ha producido un error"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "Cierre de SABnzbd terminado"
@@ -365,10 +365,6 @@ msgstr "%s no es un valor octal correcto"
msgid "UNC path \"%s\" not allowed here"
msgstr "Ruta de acceso UNC \"%s\" no permitido aqui"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Error: La longitud de ruta debería ser menor que %s."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Error: Cola no esta vacía, no se puede cambiar el directorio"
@@ -807,15 +803,6 @@ msgstr ""
msgid "Running script"
msgstr "Ejecutando script"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Se ha abortado el PostProcesamiento (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -878,11 +865,6 @@ msgstr "Error al descomprimir; El archivo está protegido por contraseña"
msgid "Unpacking"
msgstr "Descomprimiendo"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Descomprimir"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Error al descomprimir; Imposible encontrar %s"
@@ -1636,6 +1618,10 @@ msgstr "Error al post-procesar %s (%s)"
msgid "see logfile"
msgstr "ver fichero de log"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "La descarga falló"
@@ -1744,10 +1730,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "iaDescripción de canal RSS incorrecta \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Error al recuperar RSS desde %s: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "No se encontró autenticación válida para el feed %s"
@@ -1758,10 +1740,15 @@ msgstr ""
"Error del lado servidor (código enviado por el servidor: %s); no se ha "
"podido conseguir %s en %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Error al recuperar RSS desde %s: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "El servidor %s utiliza un certificado HTTPS no fiable"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "El canal RSS %s estaba vacío"
@@ -1851,6 +1838,16 @@ msgstr "Descargar"
msgid "Join files"
msgstr "Unir ficheros"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Descomprimir"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2183,6 +2180,10 @@ msgstr "Hoy"
msgid "Total"
msgstr "Total"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Personalizar"
#: sabnzbd/skintext.py
msgid "on"
msgstr "activado"
@@ -3459,10 +3460,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Post-procesar sólo trabajos verificados"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Sólo realiza el post-procesado en trabajos que han pasado todos los chequeos"
" PAR2."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4252,7 +4254,7 @@ msgstr "¡Notificación enviada!"
msgid "Enable NotifyOSD"
msgstr "Habilitar NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Centro de Notificación"
@@ -4910,10 +4912,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr "¡Glitter tiene alguna nueva funcionalidad que puede gustarte!"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Personalizar"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr "Diseño compacto"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Finnish (https://www.transifex.com/sabnzbd/teams/111101/fi/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: fi\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Varoitus"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Virhe"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -117,16 +127,6 @@ msgstr "Web-käyttöliittymän käynnistys epäonnistui : "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s käynnistetty"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Varoitus"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Virhe"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "SABnzbd sammutus valmis"
@@ -350,10 +350,6 @@ msgstr "%s ei ole oikea oktaalinen arvo"
msgid "UNC path \"%s\" not allowed here"
msgstr "TUNT polku \"%s\" ei ole sallittu"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Virhe: Polun pituus täytyy olla alle %s."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Virhe: Jono ei ole tyhjä, kansiota ei voida vaihtaa."
@@ -777,15 +773,6 @@ msgstr ""
msgid "Running script"
msgstr "Ajetaan skripti"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Jälkikäsittely peruutettiin (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Skripti"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -847,11 +834,6 @@ msgstr "Purkaminen epäonnistui, arkisto vaatii salasanan"
msgid "Unpacking"
msgstr "Puretaan"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Pura"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Purkaminen epäonnistui, %s ei löydy"
@@ -1577,6 +1559,10 @@ msgstr "Jälkikäsittely epäonnistui kohteelle %s (%s)"
msgid "see logfile"
msgstr "katso lokitiedosto"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Lataus epäonnistui"
@@ -1681,10 +1667,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Virheellinen RSS syötteen kuvaus \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "RSS noutaminen epäonnistui kohteesta %s: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Ei ole käyttöoikeutta syötteeseen %s"
@@ -1693,10 +1675,15 @@ msgstr "Ei ole käyttöoikeutta syötteeseen %s"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "Palvelinpään virhe (virhekoodi %s); ei voitu noutaa %s kohteesta %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "RSS noutaminen epäonnistui kohteesta %s: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Palvelin %s käyttää epäluotettavaa HTTPS sertifikaattia"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "RSS syöte %s oli tyhjä"
@@ -1786,6 +1773,16 @@ msgstr "Lataa"
msgid "Join files"
msgstr "Yhdistä tiedostot"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Pura"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Skripti"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2118,6 +2115,10 @@ msgstr "Tänään"
msgid "Total"
msgstr "Yhteensä"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Mukautettu"
#: sabnzbd/skintext.py
msgid "on"
msgstr "käytössä"
@@ -3376,10 +3377,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Jälkikäsittele vain onnistuneet lataukset"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Suorittaa jälkikäsittelyn vain niille latauksille jotka läpäisevät kaikki "
"PAR2 tarkistukset."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4138,7 +4140,7 @@ msgstr "Ilmoitus lähetetty!"
msgid "Enable NotifyOSD"
msgstr "NotifyOSD käytössä"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Ilmoituskeskus"
@@ -4796,10 +4798,6 @@ msgid "Glitter has some (new) features you might like!"
msgstr ""
"Glitter-teemassa on muutamia (uusia) ominaisuuksia joista saatat pitää!"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Mukautettu"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr "Tiivis käyttöliittymä"


@@ -3,14 +3,14 @@
# team@sabnzbd.org
#
# Translators:
# Fred L <88com88@gmail.com>, 2020
# Safihre <safihre@sabnzbd.org>, 2020
# Fred L <88com88@gmail.com>, 2020
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Last-Translator: Fred L <88com88@gmail.com>, 2020\n"
"Language-Team: French (https://www.transifex.com/sabnzbd/teams/111101/fr/)\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
@@ -18,6 +18,16 @@ msgstr ""
"Language: fr\n"
"Plural-Forms: nplurals=2; plural=(n > 1);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Avertissement"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Erreur"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -125,16 +135,6 @@ msgstr "Impossible de démarrer l'interface web : "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s démarré"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Avertissement"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Erreur"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "Arrêt de SABnzbd terminé"
@@ -368,10 +368,6 @@ msgstr "%s n'est pas une valeur octale correcte"
msgid "UNC path \"%s\" not allowed here"
msgstr "Le chemin UNC \"%s\" n'est pas autorisé ici"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Erreur : la longueur du chemin doit être inférieure à %s."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr ""
@@ -685,6 +681,8 @@ msgid ""
"The Completed Download Folder cannot be the same or a subfolder of the "
"Temporary Download Folder"
msgstr ""
"Le dossier des téléchargements terminés ne peut pas être le même dossier que"
" les téléchargements temporaires, ni être l'un de ses sous-dossiers"
#: sabnzbd/interface.py
msgid "Warning: LOCALHOST is ambiguous, use numerical IP-address."
@@ -812,15 +810,6 @@ msgstr ""
msgid "Running script"
msgstr "Exécution du script"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Post-traitement interrompu (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -882,11 +871,6 @@ msgstr "Échec de l'extraction, l'archive nécessite un mot de passe"
msgid "Unpacking"
msgstr "Extraction"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Décompresser"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Échec de l'extraction, %s n'a pas été trouvé"
@@ -1634,6 +1618,10 @@ msgstr "Échec du post-traitement pour %s (%s)"
msgid "see logfile"
msgstr "voir le journal"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr "Le post-traitement a été interrompu"
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Échec du téléchargement"
@@ -1740,10 +1728,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Description du flux RSS incorrecte \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Échec de la récupération RSS de %s : %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Vous n'avez pas d'authentification valide pour ce flux %s"
@@ -1753,10 +1737,15 @@ msgid "Server side error (server code %s); could not get %s on %s"
msgstr ""
"Erreur du côté serveur (code serveur %s) ; n'a pas pu obtenir %s sur %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Échec de la récupération RSS de %s : %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Le serveur %s utilise un certificat de sécurité HTTPS non authentifié"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "Le flux RSS %s était vide"
@@ -1846,6 +1835,16 @@ msgstr "Télécharger"
msgid "Join files"
msgstr "Concaténer"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Décompresser"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2178,6 +2177,10 @@ msgstr "Aujourd'hui"
msgid "Total"
msgstr "Total"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Personnalisé"
#: sabnzbd/skintext.py
msgid "on"
msgstr "oui"
@@ -3459,10 +3462,14 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Ne post-traiter que les tâches vérifiées"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Limite le post-traitement aux tâches qui ont passé avec succès toutes les "
"vérifications PAR2."
"Décompresser et lancer les scripts uniquement sur les tâches qui ont passé "
"l'étape de vérification. Si désactivé, toutes les tâches seront marquées "
"comme Terminées même si elles sont incomplètes."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4254,7 +4261,7 @@ msgstr "Notification envoyée !"
msgid "Enable NotifyOSD"
msgstr "Activer NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Centre de notification"
@@ -4916,10 +4923,6 @@ msgid "Glitter has some (new) features you might like!"
msgstr ""
"Glitter a des (nouvelles) fonctionnalités que vous devriez apprécier !"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Personnalisé"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr "Affichage compact"


@@ -8,7 +8,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: ION, 2020\n"
"Language-Team: Hebrew (https://www.transifex.com/sabnzbd/teams/111101/he/)\n"
@@ -18,6 +18,16 @@ msgstr ""
"Language: he\n"
"Plural-Forms: nplurals=4; plural=(n == 1 && n % 1 == 0) ? 0 : (n == 2 && n % 1 == 0) ? 1: (n % 10 == 0 && n % 1 == 0 && n > 10) ? 2 : 3;\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "אזהרה"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "שגיאה"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -91,7 +101,7 @@ msgstr ""
#. Warning message
#: SABnzbd.py
msgid "Could not load additional certificates from certifi package"
msgstr ""
msgstr "לא היה ניתן לטעון תעודות נוספות מחבילת תעודות"
#. Warning message
#: SABnzbd.py
@@ -112,16 +122,6 @@ msgstr "נכשל בהתחלת ממשק רשת: "
msgid "SABnzbd %s started"
msgstr "התחיל SABnzbd %s"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "אזהרה"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "שגיאה"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "הסתיים SABnzbd כיבוי"
@@ -227,13 +227,14 @@ msgid ""
"Paused job \"%s\" because of encrypted RAR file (if supplied, all passwords "
"were tried)"
msgstr ""
"העבודה \"%s\" הושהתה בגלל קובץ RAR מוצפן (במקרה שסיסמאות סופקו, כולן נוסו)"
#. Warning message
#: sabnzbd/assembler.py
msgid ""
"Aborted job \"%s\" because of encrypted RAR file (if supplied, all passwords"
" were tried)"
msgstr ""
msgstr "העבודה \"%s\" בוטלה בגלל קובץ RAR מוצפן (במקרה שסיסמאות סופקו, כולן נוסו)"
#: sabnzbd/assembler.py
msgid "Aborted, encryption detected"
@@ -242,7 +243,7 @@ msgstr "בוטל, הצפנה התגלתה"
#. Warning message
#: sabnzbd/assembler.py
msgid "In \"%s\" unwanted extension in RAR file. Unwanted file is %s "
msgstr ""
msgstr "בעבודה \"%s\" יש סיומת בלתי רצויה בתוך קובץ RAR. הקובץ הבלתי רצוי הוא %s"
#: sabnzbd/assembler.py
msgid "Unwanted extension is in rar file %s"
@@ -255,12 +256,12 @@ msgstr "בוטל, סיומת בלתי רצויה התגלתה"
#. Warning message
#: sabnzbd/assembler.py
msgid "Paused job \"%s\" because of rating (%s)"
msgstr ""
msgstr "העבודה \"%s\" הושהתה בגלל דירוג (%s)"
#. Warning message
#: sabnzbd/assembler.py
msgid "Aborted job \"%s\" because of rating (%s)"
msgstr ""
msgstr "העבודה \"%s\" בוטלה בגלל דירוג (%s)"
#: sabnzbd/assembler.py
msgid "Aborted, rating filter matched (%s)"
@@ -345,10 +346,6 @@ msgstr "%s אינו ערך אוקטלי נכון"
msgid "UNC path \"%s\" not allowed here"
msgstr "אינו מותר כאן \"%s\" UNC נתיב"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr ".שגיאה: אורך הנתיב צריך להיות מתחת אל %s"
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr ".שגיאה: התור אינו ריק, לא יכול לשנות תיקייה"
@@ -652,6 +649,8 @@ msgid ""
"The Completed Download Folder cannot be the same or a subfolder of the "
"Temporary Download Folder"
msgstr ""
"תיקיית ההורדות השלמות אינה יכולה להיות אותה תיקייה או תת־תיקייה של תיקיית "
"ההורדות הזמניות"
#: sabnzbd/interface.py
msgid "Warning: LOCALHOST is ambiguous, use numerical IP-address."
@@ -773,15 +772,6 @@ msgstr "(+x) אין ערכת הרשאות ביצוע \"%s\" לתסריט פיי
msgid "Running script"
msgstr "מריץ תסריט"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "(%s) בתר־עיבוד בוטל"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "תסריט"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -843,11 +833,6 @@ msgstr "פריקה נכשלה, ארכיון דורש סיסמה"
msgid "Unpacking"
msgstr "פורק"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "פרוק"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "%s פריקה נכשלה, לא היה ניתן למצוא את"
@@ -1578,6 +1563,10 @@ msgstr "%s (%s) בתר־עיבוד נכשל עבור"
msgid "see logfile"
msgstr "ראה קובץ יומן"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr "בתר־עיבוד בוטל"
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "הורדה נכשלה"
@@ -1682,10 +1671,6 @@ msgstr ".מפתח זה מספק זהות למדדן. בדוק את המתאר ש
msgid "Incorrect RSS feed description \"%s\""
msgstr "\"%s\" לא נכון RSS תיאור הזנת"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "%s: %s מן RSS נכשל באחזור"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "%s אין אימות תקף עבור ההזנה"
@@ -1694,10 +1679,15 @@ msgstr "%s אין אימות תקף עבור ההזנה"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "%s על %s שגיאה צדדית של שרת (קוד שרת %s); לא היה ניתן להשיג את"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "%s: %s מן RSS נכשל באחזור"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "בלתי מהימן HTTPS משתמש באישור %s השרת"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "הייתה ריקה %s RSS הזנת"
@@ -1787,6 +1777,16 @@ msgstr "הורדה"
msgid "Join files"
msgstr "אחד קבצים"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "פרוק"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "תסריט"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2119,6 +2119,10 @@ msgstr "היום"
msgid "Total"
msgstr "סה״כ"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "מותאם אישית"
#: sabnzbd/skintext.py
msgid "on"
msgstr "פועל"
@@ -3362,8 +3366,13 @@ msgid "Post-Process Only Verified Jobs"
msgstr "בצע בתר־עיבוד רק על עבודות שוודאו"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgstr ".PAR2-בצע בתר־עיבוד רק בעבודות שעברו את כל בדיקות ה"
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"פרוק והרץ רק על עבודות שעברו את שלב הוידוא. אם מכובה, כל העבודות יסומנו "
"כשלמות אפילו אם הן בלתי שלמות."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -3611,13 +3620,15 @@ msgstr "מחק לאחר הורדה"
#: sabnzbd/skintext.py
msgid "Deobfuscate final filenames"
msgstr ""
msgstr "בטל ערפול של שמות קובץ סופיים"
#: sabnzbd/skintext.py
msgid ""
"If filenames of (large) files in the final folder look obfuscated or "
"meaningless they will be renamed to the job name."
msgstr ""
"אם שמות קבצים של קבצים (גדולים) בתיקייה הסופית נראים מעורפלים או חסרי "
"משמעות, שמותיהם ישונו אל שם העבודה."
#: sabnzbd/skintext.py
msgid "HTTPS certificate verification"
@@ -4116,7 +4127,7 @@ msgstr "!התראה נשלחה"
msgid "Enable NotifyOSD"
msgstr "NotifyOSD אפשר"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "מרכז ההתראות"
@@ -4766,10 +4777,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr "!יש מספר מאפיינים (חדשים) שאתה עשוי לאהוב Glitter אל"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "מותאם אישית"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr "פריסה צפופה"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Norwegian Bokmål (https://www.transifex.com/sabnzbd/teams/111101/nb/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: nb\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Advarsel"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Feil"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -114,16 +124,6 @@ msgstr "Kunne ikke starte webgrensesnittet: "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s startet"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Advarsel"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Feil"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "SABnzbd er nå avsluttet"
@@ -347,10 +347,6 @@ msgstr "%s er ikke en korrekt oktal verdi"
msgid "UNC path \"%s\" not allowed here"
msgstr "UNC-sti \"%s\" er ikke tillatt her"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Feil: Fillengde bør være kortere enn %s."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Feil: Køen er ikke tom, kan ikke bytte mappe."
@@ -773,15 +769,6 @@ msgstr ""
msgid "Running script"
msgstr "Kjører skript"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Etterbehandling ble avbrutt (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Skript"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -843,11 +830,6 @@ msgstr "Utpakking mislyktes, arkivet krever passord"
msgid "Unpacking"
msgstr "Utpakker"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Utpakking"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Utpakking mislyktes, kunne ikke finne %s"
@@ -1574,6 +1556,10 @@ msgstr "Etterbehandling mislyktes for %s (%s)"
msgid "see logfile"
msgstr "se loggfil"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Nedlasting mislyktes"
@@ -1678,10 +1664,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Feilaktig RSS-kilde beskrivelse \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Kunne ikke hente RSS-kilde fra %s: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Ugyldig autentisering for nyhetsstrøm %s"
@@ -1690,10 +1672,15 @@ msgstr "Ugyldig autentisering for nyhetsstrøm %s"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "Serverside-feil (serverkode %s); kunne ikke hente %s på %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Kunne ikke hente RSS-kilde fra %s: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Server %s bruker et usikkert HTTP sertifikat"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "RSS-kilde %s var tom"
@@ -1783,6 +1770,16 @@ msgstr "Nedlastning"
msgid "Join files"
msgstr "Slå sammen filer"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Utpakking"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Skript"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2115,6 +2112,10 @@ msgstr "I dag"
msgid "Total"
msgstr "Totalt"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Tilpasse"
#: sabnzbd/skintext.py
msgid "on"
msgstr "på"
@@ -3359,8 +3360,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Etterbehandle kun verifiserte nedlastinger"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgstr "Etterbehandle kun nedlastinger som har passert PAR2 kontrollen."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4112,7 +4116,7 @@ msgstr "Varsel sendt!"
msgid "Enable NotifyOSD"
msgstr "Aktiver NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Varselsenter"
@@ -4762,10 +4766,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr ""
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Tilpasse"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr ""


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Dutch (https://www.transifex.com/sabnzbd/teams/111101/nl/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: nl\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Waarschuwing"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Fout"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -118,16 +128,6 @@ msgstr "Webinterface kon niet gestart worden: "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s is gestart"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Waarschuwing"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Fout"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "SABnzbd is afgesloten"
@@ -359,10 +359,6 @@ msgstr "%s is geen correct octaal getal"
msgid "UNC path \"%s\" not allowed here"
msgstr "UNC-pad '%s' hier niet toegestaan."
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Fout: het opgegeven pad mag niet langer zijn dan %s."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Fout: Wachtrij is niet leeg, andere map kiezen niet mogelijk."
@@ -674,6 +670,8 @@ msgid ""
"The Completed Download Folder cannot be the same or a subfolder of the "
"Temporary Download Folder"
msgstr ""
"De Map voor verwerkte downloads mag niet een map in de Tijdelijke download "
"map zijn."
#: sabnzbd/interface.py
msgid "Warning: LOCALHOST is ambiguous, use numerical IP-address."
@@ -798,15 +796,6 @@ msgstr "Python-script '%s' heeft geen uitvoerpermissie (+x)"
msgid "Running script"
msgstr "Script uitvoeren"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Nabewerking is afgebroken (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -868,11 +857,6 @@ msgstr "Uitpakken mislukt, archief vereist wachtwoord"
msgid "Unpacking"
msgstr "Uitpakken"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Uitpakken"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Uitpakken mislukt, kan %s niet vinden"
@@ -1609,6 +1593,10 @@ msgstr "Nabewerking van %s mislukt (%s)"
msgid "see logfile"
msgstr "zie logbestand"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr "Nabewerking is afgebroken"
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Download mislukt"
@@ -1715,10 +1703,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Foutieve RSS-feed definitie \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Kan RSS-feed \"%s\" niet lezen vanwege: \"%s\""
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Geen geldige inlog gegevens beschikbaar voor RSS-feed %s"
@@ -1727,10 +1711,15 @@ msgstr "Geen geldige inlog gegevens beschikbaar voor RSS-feed %s"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "Server fout (code is %s); kon geen %s van %s krijgen"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Kan RSS-feed \"%s\" niet lezen vanwege: \"%s\""
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Server %s gebruikt een onbetrouwbaar HTTPS-certificaat"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "RSS-feed %s is leeg"
@@ -1820,6 +1809,16 @@ msgstr "Download"
msgid "Join files"
msgstr "Samenvoegen"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Uitpakken"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2152,6 +2151,10 @@ msgstr "Vandaag"
msgid "Total"
msgstr "Totaal"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Aangepast"
#: sabnzbd/skintext.py
msgid "on"
msgstr "aan"
@@ -3418,10 +3421,14 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Verwerk alleen correct geverifieerde downloads"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Voer de nabewerking alleen uit op downloads die de PAR2 controles hebben "
"doorlopen."
"Uitpakken en scripts worden alleen uitgevoerd op opdrachten die succesvol "
"geverifieerd zijn. Als deze optie uitgeschakeld is zullen alle opdrachten "
"gemarkeerd worden als succesvol, zelfs als dat niet zo is."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4208,7 +4215,7 @@ msgstr "Melding verzonden"
msgid "Enable NotifyOSD"
msgstr "NotifyOSD activeren"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Berichtencentrum"
@@ -4864,10 +4871,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr "Glitter heeft enkele (nieuwe) functies die je mogelijk aanspreken!"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Aangepast"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr "Compacte weergave"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Polish (https://www.transifex.com/sabnzbd/teams/111101/pl/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: pl\n"
"Plural-Forms: nplurals=4; plural=(n==1 ? 0 : (n%10>=2 && n%10<=4) && (n%100<12 || n%100>14) ? 1 : n!=1 && (n%10>=0 && n%10<=1) || (n%10>=5 && n%10<=9) || (n%100>=12 && n%100<=14) ? 2 : 3);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Ostrzeżenie"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Błąd"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -109,16 +119,6 @@ msgstr "Nie udało się uruchomić interfejsu WWW: "
msgid "SABnzbd %s started"
msgstr "Uruchomiono SABnzbd %s"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Ostrzeżenie"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Błąd"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "SABnzbd został wyłączony"
@@ -342,10 +342,6 @@ msgstr "%s nie jest prawidłową wartością w systemie ósemkowym"
msgid "UNC path \"%s\" not allowed here"
msgstr "Ścieżka UNC \"%s\" niedozwolona"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Błąd: Długość ścieżki powinna być mniejsza niż %s"
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Błąd: Kolejka nie jest pusta, nie można zmienić katalogu."
@@ -772,15 +768,6 @@ msgstr ""
msgid "Running script"
msgstr "Uruchamianie skryptu"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Przetwarzanie końcowe zostało przerwane (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Skrypt"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -842,11 +829,6 @@ msgstr "Rozpakowywanie nie powiodło się, archiwum wymaga podania hasła"
msgid "Unpacking"
msgstr "Rozpakowywanie"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Rozpakuj"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Rozpakowywanie nie powiodło się, nie można znaleźć %s"
@@ -1579,6 +1561,10 @@ msgstr "Przetwarzanie końcowe nie powiodło się dla %s (%s)"
msgid "see logfile"
msgstr "sprawdź logi"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Pobieranie nie powiodło się"
@@ -1683,10 +1669,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Nieprawidłowy opis kanału RSS \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Nie udało się pobrać RSS z %s: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Brak poprawnego uwierzytelnienia dla kanału %s"
@@ -1695,10 +1677,15 @@ msgstr "Brak poprawnego uwierzytelnienia dla kanału %s"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "Błąd po stronie serwera (kod: %s); nie udało się pobrać %s z %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Nie udało się pobrać RSS z %s: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Serwer %s używa niezaufanego certyfikatu HTTPS"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "Kanał RSS %s był pusty"
@@ -1788,6 +1775,16 @@ msgstr "Pobierz"
msgid "Join files"
msgstr "Połącz pliki"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Rozpakuj"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Skrypt"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2120,6 +2117,10 @@ msgstr "Dzisiaj"
msgid "Total"
msgstr "Razem"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Własny"
#: sabnzbd/skintext.py
msgid "on"
msgstr "włączone"
@@ -3365,10 +3366,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Przetwarzanie końcowe tylko dla zweryfikowanych zadań"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Uruchom przetwarzanie końcowe tylko dla zadań, które zostały sprawdzone przy"
" użyciu PAR2"
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4124,7 +4126,7 @@ msgstr "Wysłano powiadomienie!"
msgid "Enable NotifyOSD"
msgstr "Włącz NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Centrum powiadomień"
@@ -4774,10 +4776,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr ""
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Własny"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr ""


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Portuguese (Brazil) (https://www.transifex.com/sabnzbd/teams/111101/pt_BR/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: pt_BR\n"
"Plural-Forms: nplurals=2; plural=(n > 1);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Alerta"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Erro"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -113,16 +123,6 @@ msgstr "Falha ao iniciar a interface web "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s iniciado"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Alerta"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Erro"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "Encerramento do SABnzbd concluído"
@@ -346,10 +346,6 @@ msgstr "%s não é um valor octal correto"
msgid "UNC path \"%s\" not allowed here"
msgstr "O caminho UNC \"%s\" não é permitido aqui"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Erro: Tamanho do caminho deve ser menor que %s."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Erro: A fila não está vazia. Não será possível mudar de pasta."
@@ -776,15 +772,6 @@ msgstr ""
msgid "Running script"
msgstr "Executando script"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "O pós-processamento foi cancelado (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -846,11 +833,6 @@ msgstr "A descompactação falhou. O arquivo exige uma senha"
msgid "Unpacking"
msgstr "Descompactando"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Descompactar"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "A descompactação falhou. Não foi possível encontrar %s"
@@ -1581,6 +1563,10 @@ msgstr "O pós-processamento falhou para %s (%s)"
msgid "see logfile"
msgstr "veja o arquivo de log"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "O download falhou"
@@ -1685,10 +1671,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Descrição de feed RSS incorreta \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Falha ao obter RSS de %s: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Não há autenticação válida para o feed %s"
@@ -1698,10 +1680,15 @@ msgid "Server side error (server code %s); could not get %s on %s"
msgstr ""
"Erro do servidor (código do servidor %s); não foi possível obter %s de %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Falha ao obter RSS de %s: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Servidor %s usa um certificado HTTPS não confiável"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "O feed RSS %s estava vazio"
@@ -1791,6 +1778,16 @@ msgstr "Download"
msgid "Join files"
msgstr "Unir arquivos"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Descompactar"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2123,6 +2120,10 @@ msgstr "Hoje"
msgid "Total"
msgstr "Total"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Personalizado"
#: sabnzbd/skintext.py
msgid "on"
msgstr "ligado"
@@ -3370,10 +3371,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Pós-processar apenas os trabalhos verificados"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Realizar pós-processamento apenas em trabalhos que passaram todas as "
"verificações PAR2."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4127,7 +4129,7 @@ msgstr "Notificação Enviada!"
msgid "Enable NotifyOSD"
msgstr "Habilitar NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Centro de Notificações"
@@ -4777,10 +4779,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr ""
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Personalizado"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr ""


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Romanian (https://www.transifex.com/sabnzbd/teams/111101/ro/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: ro\n"
"Plural-Forms: nplurals=3; plural=(n==1?0:(((n%100>19)||((n%100==0)&&(n!=0)))?2:1));\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Avertisment"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Eroare"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -113,16 +123,6 @@ msgstr "Nu am putu porni interfața web: "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s pornit"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Avertisment"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Eroare"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "Închidere SABnzbd terminată"
@@ -346,10 +346,6 @@ msgstr "%s nu este o valoare octală corectă"
msgid "UNC path \"%s\" not allowed here"
msgstr "cale UNC \"%s\" nu este premisă aici"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Eroare: Lungimea cale ar trebuie să fie sub %s."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Eroare: Coada nu este goală, nu pot schimba dosar."
@@ -775,15 +771,6 @@ msgstr ""
msgid "Running script"
msgstr "Rulare script"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Post-Procesarea a fost abandonată (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -845,11 +832,6 @@ msgstr "Dezarhivare nereuşită, arhiva necesită o parolă"
msgid "Unpacking"
msgstr "Dezarhivare"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Dezarhivează"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Dezarhivare nereuşită, nu pot găsi %s"
@@ -1580,6 +1562,10 @@ msgstr "Post Procesare Nereuşită pentru %s (%s)"
msgid "see logfile"
msgstr "vezi fişier jurnal"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Descărcarea a eșuat"
@@ -1684,10 +1670,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Descriere flux RSS incorectă \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Descărcare %s: %s din RSS nereuşită"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Autentificare invalida pentru flux %s"
@@ -1696,10 +1678,15 @@ msgstr "Autentificare invalida pentru flux %s"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "Eroare la server (codul server %s); nu am putu lua %s în data de %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Descărcare %s: %s din RSS nereuşită"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Serverul %s utilizează un certificat HTTPS nesigur"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "Fluxul RSS %s a fost gol"
@@ -1789,6 +1776,16 @@ msgstr "Descarcă"
msgid "Join files"
msgstr "Uneşte fişierele"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Dezarhivează"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Script"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2121,6 +2118,10 @@ msgstr "Azi"
msgid "Total"
msgstr "Total"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Personalizat"
#: sabnzbd/skintext.py
msgid "on"
msgstr "activat"
@@ -3362,9 +3363,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Post-Procesează Doar Sarcinile Verificate"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Execută post-procesarea doar dacă sarcina a trecut toate verificările PAR2."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4122,7 +4125,7 @@ msgstr "Notificare Trimisă!"
msgid "Enable NotifyOSD"
msgstr "Activează NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Centru Notificări"
@@ -4772,10 +4775,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr ""
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Personalizat"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr "Aspect compact"


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Russian (https://www.transifex.com/sabnzbd/teams/111101/ru/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: ru\n"
"Plural-Forms: nplurals=4; plural=(n%10==1 && n%100!=11 ? 0 : n%10>=2 && n%10<=4 && (n%100<12 || n%100>14) ? 1 : n%10==0 || (n%10>=5 && n%10<=9) || (n%100>=11 && n%100<=14)? 2 : 3);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Предупреждение"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr ""
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -113,16 +123,6 @@ msgstr ""
msgid "SABnzbd %s started"
msgstr ""
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Предупреждение"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr ""
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "Завершение работы SABnzbd закончено"
@@ -346,10 +346,6 @@ msgstr "%s не является правильным восьмеричным
msgid "UNC path \"%s\" not allowed here"
msgstr "UNC-путь «%s» здесь не допускается"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr ""
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Ошибка: очередь не пустая, папку нельзя изменить."
@@ -774,15 +770,6 @@ msgstr ""
msgid "Running script"
msgstr "Запуск сценария"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Пост-обработка была прервана (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Сценарий"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -844,11 +831,6 @@ msgstr "Ошибка распаковки: архив защищён парол
msgid "Unpacking"
msgstr "Распаковка"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Распаковать"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Ошибка распаковки: не удаётся найти %s"
@@ -1578,6 +1560,10 @@ msgstr "Ошибка пост-обработки для %s (%s)"
msgid "see logfile"
msgstr "см. журнал"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Не удалось загрузить"
@@ -1682,10 +1668,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Неправильное описание RSS-ленты «%s»"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Не удалось получить RSS-ленту из %s: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Неправильные учётные данные для ленты %s"
@@ -1694,10 +1676,15 @@ msgstr "Неправильные учётные данные для ленты %
msgid "Server side error (server code %s); could not get %s on %s"
msgstr ""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Не удалось получить RSS-ленту из %s: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr ""
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "RSS-лента %s была пустой"
@@ -1787,6 +1774,16 @@ msgstr "Загрузить"
msgid "Join files"
msgstr "Объединить файлы"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Распаковать"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Сценарий"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2119,6 +2116,10 @@ msgstr "за сегодня"
msgid "Total"
msgstr "всего"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Другой"
#: sabnzbd/skintext.py
msgid "on"
msgstr "на"
@@ -3362,8 +3363,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Обрабатывать только проверенные задания"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgstr "Обрабатывать только задания, успешно прошедшие все проверки PAR2."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4117,7 +4121,7 @@ msgstr "Уведомление отправлено"
msgid "Enable NotifyOSD"
msgstr "Использовать NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr ""
@@ -4765,10 +4769,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr ""
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Другой"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr ""


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Serbian (https://www.transifex.com/sabnzbd/teams/111101/sr/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: sr\n"
"Plural-Forms: nplurals=3; plural=(n%10==1 && n%100!=11 ? 0 : n%10>=2 && n%10<=4 && (n%100<10 || n%100>=20) ? 1 : 2);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Упозорење"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Грeшкa"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -111,16 +121,6 @@ msgstr "Neuspešno pokretanje web interfejsa: "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s покренут"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Упозорење"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Грeшкa"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "Гашење SABnzbd је завршено"
@@ -344,10 +344,6 @@ msgstr "%s nije ispravna oktalna vrednost"
msgid "UNC path \"%s\" not allowed here"
msgstr "UNC путања \"%s\" није дозвољена"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Greška: Dužina putanje bi trebala biti ispod %s"
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Грешка: ред није празан, фасцикла се не може променити."
@@ -769,15 +765,6 @@ msgstr ""
msgid "Running script"
msgstr "Покретање скрипта"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Пост-процесирање је заустављено (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Скрипт"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -839,11 +826,6 @@ msgstr "Neuspešno raspakivanje, arhiva zahteva lozinku"
msgid "Unpacking"
msgstr "Распакивање"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Распакуј"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Погрешно распакивање, не може да се нађе %s"
@@ -1570,6 +1552,10 @@ msgstr "Грешка пост-процесирања за %s (%s)"
msgid "see logfile"
msgstr "видети извештај"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Неуспешно преузимање"
@@ -1674,10 +1660,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Погрешан опис RSS фида \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Неуспешно преузимање RSS од %s: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Немам важећу аутентификацију за фид %s"
@@ -1686,10 +1668,15 @@ msgstr "Немам важећу аутентификацију за фид %s"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "Greška na strani servera (kod greške %s); nemoguće dobiti %s na %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Неуспешно преузимање RSS од %s: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Server %s koristi nepouzdan HTTPS sertifikat"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "RSS фид %s је празан"
@@ -1779,6 +1766,16 @@ msgstr "Преузми"
msgid "Join files"
msgstr "Прилепити датотеке"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Распакуј"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Скрипт"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2111,6 +2108,10 @@ msgstr "Данас"
msgid "Total"
msgstr "Укупно"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Прилагођено"
#: sabnzbd/skintext.py
msgid "on"
msgstr "укљ."
@@ -3350,9 +3351,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Пост-процесирај само проверени послови"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
"Огранићи пост-процесирање само за радове који су прешли све PAR2 провере."
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4099,7 +4102,7 @@ msgstr "Обавештење послато!"
msgid "Enable NotifyOSD"
msgstr "Упали „NotifyOSD“"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Центар за обавештења"
@@ -4748,10 +4751,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr ""
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Прилагођено"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr ""


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Swedish (https://www.transifex.com/sabnzbd/teams/111101/sv/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: sv\n"
"Plural-Forms: nplurals=2; plural=(n != 1);\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Varning"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Fel"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -111,16 +121,6 @@ msgstr "Misslyckades att starta webbgränsnitt: "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s startad"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "Varning"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "Fel"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "SABnzbd nedstängning utförd."
@@ -344,10 +344,6 @@ msgstr "%s är inte rätt siffervärde"
msgid "UNC path \"%s\" not allowed here"
msgstr "UNC sökväg \"%s\" är inte tillåten här"
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "Fel: Sökvägen skall vara under %s."
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "Fel: Kön är inte tom, kan inte byta mapp."
@@ -771,15 +767,6 @@ msgstr ""
msgid "Running script"
msgstr "Kör skript"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "Efterbehandling avbröts (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "Skript"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -841,11 +828,6 @@ msgstr "Uppackning misslyckades, arkivet kräver lösenord"
msgid "Unpacking"
msgstr "Packar upp"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "Packa upp"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "Uppackning misslyckades, gick inte att hitta %s"
@@ -1576,6 +1558,10 @@ msgstr "Efterbehandling misslyckades för %s (%s)"
msgid "see logfile"
msgstr "se loggfil"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "Hämtning misslyckades"
@@ -1680,10 +1666,6 @@ msgstr ""
msgid "Incorrect RSS feed description \"%s\""
msgstr "Felaktigt RSS-flödesbeskrivning \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Det gick inte att hämta RSS flödet från %s: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "Har inte giltig autentisering för flöde %s"
@@ -1692,10 +1674,15 @@ msgstr "Har inte giltig autentisering för flöde %s"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "Server fel (serverkod %s); kunde inte få %s på %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "Det gick inte att hämta RSS flödet från %s: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "Server %s använder ett otillförlitlig HTTPS-certifikat"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "RSS-flödet %s var tomt"
@@ -1785,6 +1772,16 @@ msgstr "Nedladdning"
msgid "Join files"
msgstr "Slår ihop filer"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "Packa upp"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "Skript"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2117,6 +2114,10 @@ msgstr "I dag"
msgid "Total"
msgstr "Totalt"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Anpassa"
#: sabnzbd/skintext.py
msgid "on"
msgstr "den"
@@ -3359,8 +3360,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "Efterbehandla endast verifierade jobb"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgstr "Efterbehandla enbart jobb som passerat PAR2 kontrollen."
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4111,7 +4115,7 @@ msgstr "Notis skickad!"
msgid "Enable NotifyOSD"
msgstr "Aktivera NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "Meddelandecenter"
@@ -4759,10 +4763,6 @@ msgstr ""
msgid "Glitter has some (new) features you might like!"
msgstr ""
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "Anpassa"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr ""


@@ -7,7 +7,7 @@
#
msgid ""
msgstr ""
"Project-Id-Version: SABnzbd-3.1.0-develop\n"
"Project-Id-Version: SABnzbd-3.2.0-develop\n"
"PO-Revision-Date: 2020-06-27 15:49+0000\n"
"Last-Translator: Safihre <safihre@sabnzbd.org>, 2020\n"
"Language-Team: Chinese (China) (https://www.transifex.com/sabnzbd/teams/111101/zh_CN/)\n"
@@ -17,6 +17,16 @@ msgstr ""
"Language: zh_CN\n"
"Plural-Forms: nplurals=1; plural=0;\n"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "警告"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "错误"
#. Error message
#: SABnzbd.py
msgid "Failed to start web-interface"
@@ -109,16 +119,6 @@ msgstr "无法启动 web 界面: "
msgid "SABnzbd %s started"
msgstr "SABnzbd %s 已启动"
#. Notification - Status page, table column header, actual message
#: SABnzbd.py, sabnzbd/notifier.py, sabnzbd/skintext.py
msgid "Warning"
msgstr "警告"
#. Notification
#: SABnzbd.py, sabnzbd/notifier.py
msgid "Error"
msgstr "错误"
#: SABnzbd.py, sabnzbd/interface.py
msgid "SABnzbd shutdown finished"
msgstr "SABnzbd 关闭完成"
@@ -342,10 +342,6 @@ msgstr "%s 不是有效的八进制值"
msgid "UNC path \"%s\" not allowed here"
msgstr "此处不允许使用 UNC 路径 \"%s\""
#: sabnzbd/config.py
msgid "Error: Path length should be below %s."
msgstr "错误: 路径长度应不超过 %s。"
#: sabnzbd/config.py
msgid "Error: Queue not empty, cannot change folder."
msgstr "错误: 队列非空,无法变更文件夹。"
@@ -761,15 +757,6 @@ msgstr "Python 脚本 \"%s\" 不具有执行 (+x) 权限"
msgid "Running script"
msgstr "正在执行脚本"
#: sabnzbd/newsunpack.py, sabnzbd/postproc.py
msgid "PostProcessing was aborted (%s)"
msgstr "后期处理已中止 (%s)"
#. PP phase "script" - Notification Script settings
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Script"
msgstr "脚本"
#. Warning message
#: sabnzbd/newsunpack.py
msgid "Unpack nesting too deep [%s]"
@@ -831,11 +818,6 @@ msgstr "解压失败,压缩文件需要密码"
msgid "Unpacking"
msgstr "正在解压"
#. PP phase "unpack"
#: sabnzbd/newsunpack.py, sabnzbd/skintext.py
msgid "Unpack"
msgstr "解压"
#: sabnzbd/newsunpack.py
msgid "Unpacking failed, unable to find %s"
msgstr "解压失败,找不到 %s"
@@ -1560,6 +1542,10 @@ msgstr "后期处理失败:%s (%s)"
msgid "see logfile"
msgstr "查看日志文件"
#: sabnzbd/postproc.py
msgid "Post-processing was aborted"
msgstr ""
#: sabnzbd/postproc.py
msgid "Download Failed"
msgstr "下载失败"
@@ -1664,10 +1650,6 @@ msgstr "这个密钥用来向服务器表明身份。查看您在索引网站上
msgid "Incorrect RSS feed description \"%s\""
msgstr "RSS feed 描述不正确 \"%s\""
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "无法检索 %s 的 RSS: %s"
#: sabnzbd/rss.py
msgid "Do not have valid authentication for feed %s"
msgstr "feed %s 无有效的身份认证凭据"
@@ -1676,10 +1658,15 @@ msgstr "feed %s 无有效的身份认证凭据"
msgid "Server side error (server code %s); could not get %s on %s"
msgstr "服务器端错误 (服务器代码 %s);无法获取 %s (服务器 %s"
#: sabnzbd/rss.py
msgid "Failed to retrieve RSS from %s: %s"
msgstr "无法检索 %s 的 RSS: %s"
#: sabnzbd/rss.py, sabnzbd/urlgrabber.py
msgid "Server %s uses an untrusted HTTPS certificate"
msgstr "服务器 %s 使用的 HTTPS 证书不受信任"
#. Warning message
#: sabnzbd/rss.py
msgid "RSS Feed %s was empty"
msgstr "RSS Feed %s 为空"
@@ -1769,6 +1756,16 @@ msgstr "下载"
msgid "Join files"
msgstr "合并文件"
#. PP phase "unpack"
#: sabnzbd/skintext.py
msgid "Unpack"
msgstr "解压"
#. PP phase "script" - Notification Script settings
#: sabnzbd/skintext.py
msgid "Script"
msgstr "脚本"
#. PP Source of the NZB (path or URL) - Where to find the SABnzbd sourcecode
#: sabnzbd/skintext.py
msgid "Source"
@@ -2101,6 +2098,10 @@ msgstr "今天"
msgid "Total"
msgstr "总计"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "自定义"
#: sabnzbd/skintext.py
msgid "on"
msgstr "开"
@@ -3306,8 +3307,11 @@ msgid "Post-Process Only Verified Jobs"
msgstr "仅对经验证的任务进行后期处理"
#: sabnzbd/skintext.py
msgid "Only perform post-processing on jobs that passed all PAR2 checks."
msgstr "仅对通过全部 PAR2 检查的任务执行后期处理。"
msgid ""
"Only unpack and run scripts on jobs that passed the verification stage. If "
"turned off, all jobs will be marked as Completed even if they are "
"incomplete."
msgstr ""
#: sabnzbd/skintext.py
msgid "Action when encrypted RAR is downloaded"
@@ -4043,7 +4047,7 @@ msgstr "通知已发送!"
msgid "Enable NotifyOSD"
msgstr "启用NotifyOSD"
#. Header for OSX Notfication Center section
#. Header for macOS Notfication Center section
#: sabnzbd/skintext.py
msgid "Notification Center"
msgstr "通知中心"
@@ -4690,10 +4694,6 @@ msgstr "您的浏览器已禁用 LocalStorage (cookies)。界面设置将在您
msgid "Glitter has some (new) features you might like!"
msgstr "你可能会喜欢一些 Glitter 的(新)功能!"
#: sabnzbd/skintext.py
msgid "Custom"
msgstr "自定义"
#: sabnzbd/skintext.py
msgid "Compact layout"
msgstr "精简外观"


@@ -1,8 +1,7 @@
sabyenc3>=4.0.0
cheetah3>=3.0.0
cryptography
feedparser<6.0.0; python_version == '3.5'
feedparser>=6.0.0; python_version > '3.5'
feedparser>=6.0.0
configobj
cheroot<8.4.3
cherrypy
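The requirements change above collapses the two `feedparser` lines into one, dropping the Python 3.5-only pin. A small illustrative sketch (not part of the repo) of what the removed environment markers expressed, written out as plain Python:

```python
# Illustrative only: the version-marker logic the old requirements.txt
# encoded with "; python_version == '3.5'" style environment markers.
import sys

def pick_feedparser_requirement(python_version: tuple) -> str:
    # Old pins: feedparser<6.0.0 on Python 3.5, feedparser>=6.0.0 above it.
    if python_version[:2] == (3, 5):
        return "feedparser<6.0.0"
    return "feedparser>=6.0.0"

print(pick_feedparser_requirement(sys.version_info[:2]))
```

After this commit range only the unconditional `feedparser>=6.0.0` pin remains, i.e. Python 3.5 is no longer catered for in the requirements file.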


@@ -15,9 +15,6 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
# Imported to be referenced from other files directly
from sabnzbd.version import __version__, __baseline__
import os
import logging
import datetime
@@ -28,9 +25,9 @@ import time
import socket
import cherrypy
import sys
import re
import ssl
from threading import Lock, Thread
from typing import Any, AnyStr
##############################################################################
# Determine platform flags
@@ -73,35 +70,40 @@ elif os.name == "posix":
except:
pass
# Imported to be referenced from other files directly
from sabnzbd.version import __version__, __baseline__
# Now we can import safely
from sabnzbd.nzbqueue import NzbQueue
from sabnzbd.postproc import PostProcessor
from sabnzbd.downloader import Downloader
from sabnzbd.decoder import Decoder
from sabnzbd.assembler import Assembler
from sabnzbd.rating import Rating
import sabnzbd.misc as misc
import sabnzbd.filesystem as filesystem
import sabnzbd.powersup as powersup
from sabnzbd.dirscanner import DirScanner
from sabnzbd.urlgrabber import URLGrabber
import sabnzbd.scheduler as scheduler
import sabnzbd.rss as rss
import sabnzbd.emailer as emailer
from sabnzbd.articlecache import ArticleCache
import sabnzbd.newsunpack
import sabnzbd.encoding as encoding
import sabnzbd.config as config
from sabnzbd.bpsmeter import BPSMeter
import sabnzbd.cfg as cfg
import sabnzbd.database
import sabnzbd.lang as lang
import sabnzbd.par2file as par2file
import sabnzbd.nzbparser as nzbparser
import sabnzbd.nzbstuff
import sabnzbd.getipaddress
import sabnzbd.newsunpack
import sabnzbd.par2file
import sabnzbd.api
import sabnzbd.interface
import sabnzbd.nzbstuff as nzbstuff
import sabnzbd.zconfig
import sabnzbd.directunpacker as directunpacker
import sabnzbd.dirscanner
import sabnzbd.urlgrabber
import sabnzbd.nzbqueue
import sabnzbd.postproc
import sabnzbd.downloader
import sabnzbd.decoder
import sabnzbd.assembler
import sabnzbd.rating
import sabnzbd.articlecache
import sabnzbd.bpsmeter
import sabnzbd.scheduler as scheduler
from sabnzbd.decorators import synchronized
from sabnzbd.constants import (
DEFAULT_PRIORITY,
@@ -111,18 +113,29 @@ from sabnzbd.constants import (
QUEUE_VERSION,
QUEUE_FILE_TMPL,
)
import sabnzbd.getipaddress as getipaddress
import sabnzbd.utils.ssdp
LINUX_POWER = powersup.HAVE_DBUS
# Storage for the threads, variables are filled during initialization
ArticleCache: sabnzbd.articlecache.ArticleCache
Rating: sabnzbd.rating.Rating
Assembler: sabnzbd.assembler.Assembler
Decoder: sabnzbd.decoder.Decoder
Downloader: sabnzbd.downloader.Downloader
PostProcessor: sabnzbd.postproc.PostProcessor
NzbQueue: sabnzbd.nzbqueue.NzbQueue
URLGrabber: sabnzbd.urlgrabber.URLGrabber
DirScanner: sabnzbd.dirscanner.DirScanner
BPSMeter: sabnzbd.bpsmeter.BPSMeter
RSSReader: sabnzbd.rss.RSSReader
Scheduler: sabnzbd.scheduler.Scheduler
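The annotated block above is the core of this refactor: instances that used to be reached through a class attribute (the old `Downloader.do` pattern visible in the removed lines further down) become type-annotated module-level slots that `initialize()` fills in. A minimal sketch of the pattern, with illustrative names rather than the real sabnzbd API:

```python
# Minimal sketch of the singleton refactor in this diff: an annotated
# module-level slot, declared up front and assigned during startup,
# replaces the old class-attribute lookup. Names are illustrative.
from typing import Optional

class Downloader:
    def __init__(self, paused: bool = False):
        self.paused = paused

# Declared (and type-annotated) before startup, like the
# "Storage for the threads" block above does for each worker.
DOWNLOADER: Optional[Downloader] = None

def initialize(pause_downloader: bool = False) -> None:
    global DOWNLOADER
    DOWNLOADER = Downloader(pause_downloader)

initialize(pause_downloader=True)
```

Besides removing the `do` indirection, the annotations give type checkers and IDEs a concrete type for each global worker.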
# Regular constants
START = datetime.datetime.now()
MY_NAME = None
MY_FULLNAME = None
RESTART_ARGS = []
NEW_VERSION = (None, None)
DIR_HOME = None
DIR_APPDATA = None
DIR_LCLDATA = None
DIR_PROG = None
DIR_INTERFACES = None
@@ -134,6 +147,7 @@ QUEUECOMPLETEACTION = None # stores the name of the function to be called
QUEUECOMPLETEARG = None # stores extra arguments that need to be passed
DAEMON = None
LINUX_POWER = powersup.HAVE_DBUS
LOGFILE = None
WEBLOGFILE = None
@@ -157,8 +171,6 @@ PAUSED_ALL = False
TRIGGER_RESTART = False # To trigger restart for Scheduler, WinService and Mac
WINTRAY = None # Thread for the Windows SysTray icon
WEBUI_READY = False
LAST_WARNING = None
LAST_ERROR = None
EXTERNAL_IPV6 = False
LAST_HISTORY_UPDATE = 1
@@ -168,6 +180,7 @@ DOWNLOAD_DIR_SPEED = 0
COMPLETE_DIR_SPEED = 0
INTERNET_BANDWIDTH = 0
# Rendering of original command line arguments in Config
CMDLINE = " ".join(['"%s"' % arg for arg in sys.argv])
@@ -179,7 +192,6 @@ __SHUTTING_DOWN__ = False
# Signal Handler
##############################################################################
def sig_handler(signum=None, frame=None):
global SABSTOP, WINTRAY
if sabnzbd.WIN32 and signum is not None and DAEMON and signum == 5:
# Ignore the "logoff" event when running as a Win32 daemon
return True
@@ -196,7 +208,7 @@ def sig_handler(signum=None, frame=None):
time.sleep(0.5)
else:
pid_file()
SABSTOP = True
sabnzbd.SABSTOP = True
os._exit(0)
@@ -214,13 +226,11 @@ def get_db_connection(thread_index=0):
@synchronized(INIT_LOCK)
def initialize(pause_downloader=False, clean_up=False, evalSched=False, repair=0):
global __INITIALIZED__, __SHUTTING_DOWN__, LOGFILE, WEBLOGFILE, LOGHANDLER, GUIHANDLER, AMBI_LOCALHOST, WAITEXIT, DAEMON, MY_NAME, MY_FULLNAME, NEW_VERSION, DIR_HOME, DIR_APPDATA, DIR_LCLDATA, DIR_PROG, DIR_INTERFACES, DARWIN, RESTART_REQ
if __INITIALIZED__:
def initialize(pause_downloader=False, clean_up=False, repair=0):
if sabnzbd.__INITIALIZED__:
return False
__SHUTTING_DOWN__ = False
sabnzbd.__SHUTTING_DOWN__ = False
# Set global database connection for Web-UI threads
cherrypy.engine.subscribe("start_thread", get_db_connection)
@@ -271,11 +281,6 @@ def initialize(pause_downloader=False, clean_up=False, evalSched=False, repair=0
cfg.enable_https_verification.callback(guard_https_ver)
guard_https_ver()
# Set cache limit
if not cfg.cache_limit() or (cfg.cache_limit() in ("200M", "450M") and (sabnzbd.WIN32 or sabnzbd.DARWIN)):
cfg.cache_limit.set(misc.get_cache_limit())
ArticleCache.do.new_limit(cfg.cache_limit.get_int())
check_incomplete_vs_complete()
# Set language files
@@ -283,26 +288,9 @@ def initialize(pause_downloader=False, clean_up=False, evalSched=False, repair=0
lang.set_language(cfg.language())
sabnzbd.api.clear_trans_cache()
# Set end-of-queue action
sabnzbd.change_queue_complete_action(cfg.queue_complete(), new=False)
# One time conversion "speedlimit" in schedules.
if not cfg.sched_converted():
schedules = cfg.schedules()
newsched = []
for sched in schedules:
if "speedlimit" in sched:
newsched.append(re.sub(r"(speedlimit \d+)$", r"\1K", sched))
else:
newsched.append(sched)
cfg.schedules.set(newsched)
cfg.sched_converted.set(1)
# Second time schedule conversion
if cfg.sched_converted() != 2:
cfg.schedules.set(["%s %s" % (1, schedule) for schedule in cfg.schedules()])
cfg.sched_converted.set(2)
config.save_config()
# Convert auto-sort
if cfg.auto_sort() == "0":
cfg.auto_sort.set("")
@@ -319,103 +307,97 @@ def initialize(pause_downloader=False, clean_up=False, evalSched=False, repair=0
pause_downloader = True
# Initialize threads
rss.init()
sabnzbd.ArticleCache = sabnzbd.articlecache.ArticleCache()
sabnzbd.BPSMeter = sabnzbd.bpsmeter.BPSMeter()
sabnzbd.NzbQueue = sabnzbd.nzbqueue.NzbQueue()
sabnzbd.Downloader = sabnzbd.downloader.Downloader(pause_downloader or sabnzbd.BPSMeter.read())
sabnzbd.Decoder = sabnzbd.decoder.Decoder()
sabnzbd.Assembler = sabnzbd.assembler.Assembler()
sabnzbd.PostProcessor = sabnzbd.postproc.PostProcessor()
sabnzbd.DirScanner = sabnzbd.dirscanner.DirScanner()
sabnzbd.Rating = sabnzbd.rating.Rating()
sabnzbd.URLGrabber = sabnzbd.urlgrabber.URLGrabber()
sabnzbd.RSSReader = sabnzbd.rss.RSSReader()
sabnzbd.Scheduler = sabnzbd.scheduler.Scheduler()
paused = BPSMeter.do.read()
# Run startup tasks
sabnzbd.NzbQueue.read_queue(repair)
sabnzbd.Scheduler.analyse(pause_downloader)
NzbQueue()
Downloader(pause_downloader or paused)
Decoder()
Assembler()
PostProcessor()
NzbQueue.do.read_queue(repair)
DirScanner()
Rating()
URLGrabber()
scheduler.init()
if evalSched:
scheduler.analyse(pause_downloader)
# Set cache limit for new users
if not cfg.cache_limit():
cfg.cache_limit.set(misc.get_cache_limit())
sabnzbd.ArticleCache.new_limit(cfg.cache_limit.get_int())
logging.info("All processes started")
RESTART_REQ = False
__INITIALIZED__ = True
return True
sabnzbd.RESTART_REQ = False
sabnzbd.__INITIALIZED__ = True
@synchronized(INIT_LOCK)
def start():
global __INITIALIZED__
if __INITIALIZED__:
if sabnzbd.__INITIALIZED__:
logging.debug("Starting postprocessor")
PostProcessor.do.start()
sabnzbd.PostProcessor.start()
logging.debug("Starting assembler")
Assembler.do.start()
sabnzbd.Assembler.start()
logging.debug("Starting downloader")
Downloader.do.start()
sabnzbd.Downloader.start()
logging.debug("Starting decoders")
Decoder.do.start()
sabnzbd.Decoder.start()
scheduler.start()
logging.debug("Starting scheduler")
sabnzbd.Scheduler.start()
logging.debug("Starting dirscanner")
DirScanner.do.start()
sabnzbd.DirScanner.start()
Rating.do.start()
logging.debug("Starting rating")
sabnzbd.Rating.start()
logging.debug("Starting urlgrabber")
URLGrabber.do.start()
sabnzbd.URLGrabber.start()
@synchronized(INIT_LOCK)
def halt():
global __INITIALIZED__, __SHUTTING_DOWN__
if __INITIALIZED__:
if sabnzbd.__INITIALIZED__:
logging.info("SABnzbd shutting down...")
__SHUTTING_DOWN__ = True
sabnzbd.__SHUTTING_DOWN__ = True
# Stop the windows tray icon
if sabnzbd.WINTRAY:
sabnzbd.WINTRAY.terminate = True
sabnzbd.zconfig.remove_server()
sabnzbd.utils.ssdp.stop_ssdp()
sabnzbd.directunpacker.abort_all()
rss.stop()
logging.debug("Stopping RSSReader")
sabnzbd.RSSReader.stop()
logging.debug("Stopping URLGrabber")
URLGrabber.do.stop()
sabnzbd.URLGrabber.stop()
try:
URLGrabber.do.join()
sabnzbd.URLGrabber.join()
except:
pass
logging.debug("Stopping rating")
Rating.do.stop()
sabnzbd.Rating.stop()
try:
Rating.do.join()
sabnzbd.Rating.join()
except:
pass
logging.debug("Stopping dirscanner")
DirScanner.do.stop()
sabnzbd.DirScanner.stop()
try:
DirScanner.do.join()
sabnzbd.DirScanner.join()
except:
pass
@@ -425,20 +407,20 @@ def halt():
# Decoder handles join gracefully
logging.debug("Stopping decoders")
Decoder.do.stop()
Decoder.do.join()
sabnzbd.Decoder.stop()
sabnzbd.Decoder.join()
logging.debug("Stopping assembler")
Assembler.do.stop()
sabnzbd.Assembler.stop()
try:
Assembler.do.join()
sabnzbd.Assembler.join()
except:
pass
logging.debug("Stopping postprocessor")
PostProcessor.do.stop()
sabnzbd.PostProcessor.stop()
try:
PostProcessor.do.join()
sabnzbd.PostProcessor.join()
except:
pass
@@ -452,11 +434,12 @@ def halt():
# Since all warm-restarts have been removed, it's no longer
# needed to stop the scheduler.
# We must tell the scheduler to deactivate.
scheduler.abort()
logging.debug("Terminating scheduler")
sabnzbd.Scheduler.abort()
logging.info("All processes stopped")
__INITIALIZED__ = False
sabnzbd.__INITIALIZED__ = False
def trigger_restart(timeout=None):
@@ -465,15 +448,6 @@ def trigger_restart(timeout=None):
if timeout:
time.sleep(timeout)
# Add extra arguments
if sabnzbd.downloader.Downloader.do.paused:
sabnzbd.RESTART_ARGS.append("-p")
sys.argv = sabnzbd.RESTART_ARGS
# Stop all services
sabnzbd.halt()
cherrypy.engine.exit()
if sabnzbd.WIN32:
# Remove connection info for faster restart
del_connection_info()
@@ -482,6 +456,15 @@ def trigger_restart(timeout=None):
if hasattr(sys, "frozen"):
sabnzbd.TRIGGER_RESTART = True
else:
# Add extra arguments
if sabnzbd.Downloader.paused:
sabnzbd.RESTART_ARGS.append("-p")
sys.argv = sabnzbd.RESTART_ARGS
# Stop all services
sabnzbd.halt()
cherrypy.engine.exit()
# Do the restart right now
cherrypy.engine._do_execv()
@@ -491,18 +474,17 @@ def trigger_restart(timeout=None):
##############################################################################
def new_limit():
""" Callback for article cache changes """
ArticleCache.do.new_limit(cfg.cache_limit.get_int())
sabnzbd.ArticleCache.new_limit(cfg.cache_limit.get_int())
def guard_restart():
""" Callback for config options requiring a restart """
global RESTART_REQ
sabnzbd.RESTART_REQ = True
def guard_top_only():
""" Callback for change of top_only option """
NzbQueue.do.set_top_only(cfg.top_only())
sabnzbd.NzbQueue.set_top_only(cfg.top_only())
def guard_pause_on_pp():
@@ -511,17 +493,17 @@ def guard_pause_on_pp():
pass # Not safe to idle downloader, because we don't know
# if post-processing is active now
else:
Downloader.do.resume_from_postproc()
sabnzbd.Downloader.resume_from_postproc()
def guard_quota_size():
""" Callback for change of quota_size """
BPSMeter.do.change_quota()
sabnzbd.BPSMeter.change_quota()
def guard_quota_dp():
""" Callback for change of quota_day or quota_period """
scheduler.restart(force=True)
sabnzbd.Scheduler.restart()
def guard_language():
@@ -565,41 +547,39 @@ def add_url(url, pp=None, script=None, cat=None, priority=None, nzbname=None, pa
msg = "%s - %s" % (nzbname, msg)
# Generate the placeholder
future_nzo = NzbQueue.do.generate_future(msg, pp, script, cat, url=url, priority=priority, nzbname=nzbname)
future_nzo = sabnzbd.NzbQueue.generate_future(msg, pp, script, cat, url=url, priority=priority, nzbname=nzbname)
# Set password
if not future_nzo.password:
future_nzo.password = password
# Get it!
URLGrabber.do.add(url, future_nzo)
sabnzbd.URLGrabber.add(url, future_nzo)
return future_nzo.nzo_id
def save_state():
""" Save all internal bookkeeping to disk """
ArticleCache.do.flush_articles()
NzbQueue.do.save()
BPSMeter.do.save()
rss.save()
Rating.do.save()
DirScanner.do.save()
PostProcessor.do.save()
sabnzbd.ArticleCache.flush_articles()
sabnzbd.NzbQueue.save()
sabnzbd.BPSMeter.save()
sabnzbd.Rating.save()
sabnzbd.DirScanner.save()
sabnzbd.PostProcessor.save()
sabnzbd.RSSReader.save()
def pause_all():
""" Pause all activities than cause disk access """
global PAUSED_ALL
PAUSED_ALL = True
Downloader.do.pause()
sabnzbd.PAUSED_ALL = True
sabnzbd.Downloader.pause()
logging.debug("PAUSED_ALL active")
def unpause_all():
""" Resume all activities """
global PAUSED_ALL
PAUSED_ALL = False
Downloader.do.resume()
sabnzbd.PAUSED_ALL = False
sabnzbd.Downloader.resume()
logging.debug("PAUSED_ALL inactive")
@@ -608,20 +588,20 @@ def unpause_all():
##############################################################################
def backup_exists(filename):
def backup_exists(filename: str) -> bool:
""" Return True if backup exists and no_dupes is set """
path = cfg.nzb_backup_dir.get_path()
return path and os.path.exists(os.path.join(path, filename + ".gz"))
def backup_nzb(filename, data):
def backup_nzb(filename: str, data: AnyStr):
""" Backup NZB file """
path = cfg.nzb_backup_dir.get_path()
if path:
save_compressed(path, filename, data)
def save_compressed(folder, filename, data):
def save_compressed(folder: str, filename: str, data: AnyStr):
""" Save compressed NZB file in folder """
if filename.endswith(".nzb"):
filename += ".gz"
@@ -631,7 +611,7 @@ def save_compressed(folder, filename, data):
try:
# Have to get around the path being put inside the tgz
with open(os.path.join(folder, filename), "wb") as tgz_file:
f = gzip.GzipFile(filename, fileobj=tgz_file)
f = gzip.GzipFile(filename, fileobj=tgz_file, mode="wb")
f.write(encoding.utob(data))
f.flush()
f.close()
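The one-line change in this hunk passes `mode="wb"` to `gzip.GzipFile` explicitly. When a `fileobj` is supplied and no mode is given, gzip tries to infer the mode from the file object and falls back to read mode, so being explicit is safer. A self-contained round-trip sketch of the same call shape:

```python
# Sketch of the explicit-mode fix: when GzipFile wraps an already-open
# file object, mode="wb" avoids relying on gzip's mode inference
# (objects like BytesIO have no .mode attribute to infer from).
import gzip
import io

buffer = io.BytesIO()
f = gzip.GzipFile("report.nzb.gz", fileobj=buffer, mode="wb")
f.write(b"<nzb/>")
f.close()

# Round-trip to show the payload survived compression intact.
assert gzip.decompress(buffer.getvalue()) == b"<nzb/>"
```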
@@ -745,7 +725,7 @@ def enable_server(server):
logging.warning(T("Trying to set status of non-existing server %s"), server)
return
config.save_config()
Downloader.do.update_server(server, server)
sabnzbd.Downloader.update_server(server, server)
def disable_server(server):
@@ -756,7 +736,7 @@ def disable_server(server):
logging.warning(T("Trying to set status of non-existing server %s"), server)
return
config.save_config()
Downloader.do.update_server(server, server)
sabnzbd.Downloader.update_server(server, server)
def system_shutdown():
@@ -819,15 +799,13 @@ def change_queue_complete_action(action, new=True):
Scripts are prefixed with 'script_'
When "new" is False, check whether non-script actions are acceptable
"""
global QUEUECOMPLETE, QUEUECOMPLETEACTION, QUEUECOMPLETEARG
_action = None
_argument = None
if "script_" in action:
if action.startswith("script_"):
# all scripts are labeled script_xxx
_action = run_script
_argument = action.replace("script_", "")
elif new or cfg.queue_complete_pers.get():
_argument = action.replace("script_", "", 1)
elif new or cfg.queue_complete_pers():
if action == "shutdown_pc":
_action = system_shutdown
elif action == "hibernate_pc":
@@ -846,9 +824,9 @@ def change_queue_complete_action(action, new=True):
config.save_config()
# keep the name of the action for matching the current select in queue.tmpl
QUEUECOMPLETE = action
QUEUECOMPLETEACTION = _action
QUEUECOMPLETEARG = _argument
sabnzbd.QUEUECOMPLETE = action
sabnzbd.QUEUECOMPLETEACTION = _action
sabnzbd.QUEUECOMPLETEARG = _argument
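The hunk above tightens two string operations: a substring test becomes a prefix test (`action.startswith("script_")`), and the strip becomes a bounded `replace(..., 1)`. A short sketch of why the bound matters, using a hypothetical script name:

```python
# Why the bounded replace matters (hypothetical action name): only the
# leading "script_" label should be stripped, not later occurrences.
action = "script_my_script_tool.py"

# Prefix test instead of substring membership:
assert action.startswith("script_")

# Bounded replace strips only the leading label...
assert action.replace("script_", "", 1) == "my_script_tool.py"

# ...while an unbounded replace would also mangle the script's own name.
assert action.replace("script_", "") == "my_tool.py"
```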
def run_script(script):
@@ -862,20 +840,14 @@ def run_script(script):
logging.info("Failed queue-complete script %s, Traceback: ", script, exc_info=True)
def empty_queues():
""" Return True if queues empty or non-existent """
global __INITIALIZED__
return (not __INITIALIZED__) or (PostProcessor.do.empty() and NzbQueue.do.is_empty())
def keep_awake():
""" If we still have work to do, keep Windows/OSX system awake """
""" If we still have work to do, keep Windows/macOS system awake """
if KERNEL32 or FOUNDATION:
if sabnzbd.cfg.keep_awake():
ES_CONTINUOUS = 0x80000000
ES_SYSTEM_REQUIRED = 0x00000001
if (not Downloader.do.is_paused() and not NzbQueue.do.is_empty()) or (
not PostProcessor.do.paused and not PostProcessor.do.empty()
if (not sabnzbd.Downloader.is_paused() and not sabnzbd.NzbQueue.is_empty()) or (
not sabnzbd.PostProcessor.paused and not sabnzbd.PostProcessor.empty()
):
if KERNEL32:
# Set ES_SYSTEM_REQUIRED until the next call
@@ -975,7 +947,7 @@ def load_data(data_id, path, remove=True, do_pickle=True, silent=False):
return data
def remove_data(_id, path):
def remove_data(_id: str, path: str):
""" Remove admin file """
path = os.path.join(path, _id)
try:
@@ -985,13 +957,13 @@ def remove_data(_id, path):
logging.debug("Failed to remove %s", path)
def save_admin(data, data_id):
def save_admin(data: Any, data_id: str):
""" Save data in admin folder in specified format """
logging.debug("[%s] Saving data for %s", misc.caller_name(), data_id)
save_data(data, data_id, cfg.admin_dir.get_path())
def load_admin(data_id, remove=False, silent=False):
def load_admin(data_id: str, remove=False, silent=False) -> Any:
""" Read data in admin folder in specified format """
logging.debug("[%s] Loading data for %s", misc.caller_name(), data_id)
return load_data(data_id, cfg.admin_dir.get_path(), remove=remove, silent=silent)
@@ -1027,67 +999,66 @@ def check_all_tasks():
return True
# Non-restartable threads, require program restart
if not sabnzbd.PostProcessor.do.is_alive():
if not sabnzbd.PostProcessor.is_alive():
logging.info("Restarting because of crashed postprocessor")
return False
if not Downloader.do.is_alive():
if not sabnzbd.Downloader.is_alive():
logging.info("Restarting because of crashed downloader")
return False
if not Decoder.do.is_alive():
if not sabnzbd.Decoder.is_alive():
logging.info("Restarting because of crashed decoder")
return False
if not Assembler.do.is_alive():
if not sabnzbd.Assembler.is_alive():
logging.info("Restarting because of crashed assembler")
return False
# Kick the downloader, in case it missed the semaphore
Downloader.do.wakeup()
sabnzbd.Downloader.wakeup()
# Make sure the right servers are active
Downloader.do.check_timers()
sabnzbd.Downloader.check_timers()
# Restartable threads
if not DirScanner.do.is_alive():
if not sabnzbd.DirScanner.is_alive():
logging.info("Restarting crashed dirscanner")
DirScanner.do.__init__()
if not URLGrabber.do.is_alive():
sabnzbd.DirScanner.__init__()
if not sabnzbd.URLGrabber.is_alive():
logging.info("Restarting crashed urlgrabber")
URLGrabber.do.__init__()
if not Rating.do.is_alive():
sabnzbd.URLGrabber.__init__()
if not sabnzbd.Rating.is_alive():
logging.info("Restarting crashed rating")
Rating.do.__init__()
if not sabnzbd.scheduler.sched_check():
sabnzbd.Rating.__init__()
if not sabnzbd.Scheduler.is_alive():
logging.info("Restarting crashed scheduler")
sabnzbd.scheduler.init()
sabnzbd.downloader.Downloader.do.unblock_all()
sabnzbd.Scheduler.restart()
sabnzbd.Downloader.unblock_all()
# Check one-shot pause
sabnzbd.scheduler.pause_check()
sabnzbd.Scheduler.pause_check()
# Check (and terminate) idle jobs
sabnzbd.nzbqueue.NzbQueue.do.stop_idle_jobs()
sabnzbd.NzbQueue.stop_idle_jobs()
return True
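The `check_all_tasks()` diff above preserves a two-tier watchdog: dead non-restartable threads (postprocessor, downloader, decoder, assembler) make the function return `False` so the caller restarts the whole program, while restartable ones (dirscanner, urlgrabber, rating, scheduler) are re-created in place. An illustrative sketch of that pattern, with hypothetical worker names:

```python
# Illustrative watchdog sketch of the check_all_tasks() pattern above:
# "fatal" workers trigger a full program restart, restartable workers
# are re-created in place. Names here are hypothetical.
import threading

stop = threading.Event()

class Worker(threading.Thread):
    def run(self):
        stop.wait(2.0)  # stand-in for a real worker loop

def check_workers(fatal, restartable):
    for worker in fatal:
        if not worker.is_alive():
            return False  # caller should restart the whole program
    for name, worker in list(restartable.items()):
        if not worker.is_alive():
            restartable[name] = Worker()  # revive in place
            restartable[name].start()
    return True

downloader = Worker()
downloader.start()
dirscanner = Worker()  # never started, so is_alive() is False
pool = {"dirscanner": dirscanner}

ok = check_workers([downloader], pool)
stop.set()  # let all workers exit
```

In the real code, "revive in place" is done by calling `__init__()` on the crashed singleton again, which keeps every other module's reference to it valid.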
def pid_file(pid_path=None, pid_file=None, port=0):
""" Create or remove pid file """
global DIR_PID
if not sabnzbd.WIN32:
if pid_path and pid_path.startswith("/"):
DIR_PID = os.path.join(pid_path, "sabnzbd-%d.pid" % port)
sabnzbd.DIR_PID = os.path.join(pid_path, "sabnzbd-%d.pid" % port)
elif pid_file and pid_file.startswith("/"):
DIR_PID = pid_file
sabnzbd.DIR_PID = pid_file
if DIR_PID:
if sabnzbd.DIR_PID:
try:
if port:
with open(DIR_PID, "w") as f:
with open(sabnzbd.DIR_PID, "w") as f:
f.write("%d\n" % os.getpid())
else:
filesystem.remove_file(DIR_PID)
filesystem.remove_file(sabnzbd.DIR_PID)
except:
logging.warning(T("Cannot access PID file %s"), DIR_PID)
logging.warning(T("Cannot access PID file %s"), sabnzbd.DIR_PID)
def check_incomplete_vs_complete():
@@ -1111,18 +1082,13 @@ def wait_for_download_folder():
time.sleep(2.0)
# Required wrapper because nzbstuff.py cannot import downloader.py
def highest_server(me):
return sabnzbd.downloader.Downloader.do.highest_server(me)
def test_ipv6():
""" Check if external IPv6 addresses are reachable """
if not cfg.selftest_host():
# User disabled the test, assume active IPv6
return True
try:
info = getipaddress.addresslookup6(cfg.selftest_host())
info = sabnzbd.getipaddress.addresslookup6(cfg.selftest_host())
except:
logging.debug(
"Test IPv6: Disabling IPv6, because it looks like it's not available. Reason: %s", sys.exc_info()[0]


@@ -22,13 +22,14 @@ sabnzbd.api - api
import os
import logging
import re
import gc
import datetime
import time
import json
import cherrypy
import locale
from threading import Thread
from typing import List
try:
import win32api
@@ -50,9 +51,6 @@ from sabnzbd.constants import (
)
import sabnzbd.config as config
import sabnzbd.cfg as cfg
from sabnzbd.downloader import Downloader
from sabnzbd.nzbqueue import NzbQueue
import sabnzbd.scheduler as scheduler
from sabnzbd.skintext import SKIN_TEXT
from sabnzbd.utils.pathbrowser import folders_at_path
from sabnzbd.utils.getperformance import getcpu
@@ -68,16 +66,13 @@ from sabnzbd.misc import (
)
from sabnzbd.filesystem import diskspace, get_ext, globber_full, clip_path, remove_all, userxbit
from sabnzbd.encoding import xml_name
from sabnzbd.postproc import PostProcessor
from sabnzbd.articlecache import ArticleCache
from sabnzbd.utils.servertests import test_nntp_server_dict
from sabnzbd.bpsmeter import BPSMeter
from sabnzbd.rating import Rating
from sabnzbd.getipaddress import localipv4, publicipv4, ipv6, addresslookup
from sabnzbd.database import build_history_info, unpack_history_info, HistoryDB
import sabnzbd.notifier
import sabnzbd.rss
import sabnzbd.emailer
import sabnzbd.sorting
##############################################################################
# API error messages
@@ -199,12 +194,12 @@ def _api_queue(name, output, kwargs):
def _api_queue_delete(output, value, kwargs):
""" API: accepts output, value """
if value.lower() == "all":
removed = NzbQueue.do.remove_all(kwargs.get("search"))
removed = sabnzbd.NzbQueue.remove_all(kwargs.get("search"))
return report(output, keyword="", data={"status": bool(removed), "nzo_ids": removed})
elif value:
items = value.split(",")
delete_all_data = int_conv(kwargs.get("del_files"))
removed = NzbQueue.do.remove_multiple(items, delete_all_data=delete_all_data)
removed = sabnzbd.NzbQueue.remove_multiple(items, delete_all_data=delete_all_data)
return report(output, keyword="", data={"status": bool(removed), "nzo_ids": removed})
else:
return report(output, _MSG_NO_VALUE)
@@ -214,7 +209,7 @@ def _api_queue_delete_nzf(output, value, kwargs):
""" API: accepts value(=nzo_id), value2(=nzf_id) """
value2 = kwargs.get("value2")
if value and value2:
removed = NzbQueue.do.remove_nzf(value, value2, force_delete=True)
removed = sabnzbd.NzbQueue.remove_nzf(value, value2, force_delete=True)
return report(output, keyword="", data={"status": bool(removed), "nzf_ids": removed})
else:
return report(output, _MSG_NO_VALUE2)
@@ -225,7 +220,7 @@ def _api_queue_rename(output, value, kwargs):
value2 = kwargs.get("value2")
value3 = kwargs.get("value3")
if value and value2:
ret = NzbQueue.do.change_name(value, value2, value3)
ret = sabnzbd.NzbQueue.change_name(value, value2, value3)
return report(output, keyword="", data={"status": ret})
else:
return report(output, _MSG_NO_VALUE2)
@@ -239,7 +234,7 @@ def _api_queue_change_complete_action(output, value, kwargs):
def _api_queue_purge(output, value, kwargs):
""" API: accepts output """
removed = NzbQueue.do.remove_all(kwargs.get("search"))
removed = sabnzbd.NzbQueue.remove_all(kwargs.get("search"))
return report(output, keyword="", data={"status": bool(removed), "nzo_ids": removed})
@@ -247,7 +242,7 @@ def _api_queue_pause(output, value, kwargs):
""" API: accepts output, value(=list of nzo_id) """
if value:
items = value.split(",")
handled = NzbQueue.do.pause_multiple_nzo(items)
handled = sabnzbd.NzbQueue.pause_multiple_nzo(items)
else:
handled = False
return report(output, keyword="", data={"status": bool(handled), "nzo_ids": handled})
@@ -257,7 +252,7 @@ def _api_queue_resume(output, value, kwargs):
""" API: accepts output, value(=list of nzo_id) """
if value:
items = value.split(",")
handled = NzbQueue.do.resume_multiple_nzo(items)
handled = sabnzbd.NzbQueue.resume_multiple_nzo(items)
else:
handled = False
return report(output, keyword="", data={"status": bool(handled), "nzo_ids": handled})
@@ -272,7 +267,7 @@ def _api_queue_priority(output, value, kwargs):
priority = int(value2)
except:
return report(output, _MSG_INT_VALUE)
pos = NzbQueue.do.set_priority(value, priority)
pos = sabnzbd.NzbQueue.set_priority(value, priority)
# Returns the position in the queue, -1 is incorrect job-id
return report(output, keyword="position", data=pos)
except:
@@ -286,7 +281,7 @@ def _api_queue_sort(output, value, kwargs):
sort = kwargs.get("sort")
direction = kwargs.get("dir", "")
if sort:
NzbQueue.do.sort_queue(sort, direction)
sabnzbd.NzbQueue.sort_queue(sort, direction)
return report(output)
else:
return report(output, _MSG_NO_VALUE2)
@@ -304,13 +299,13 @@ def _api_queue_default(output, value, kwargs):
def _api_queue_rating(output, value, kwargs):
""" API: accepts output, value(=nzo_id), type, setting, detail """
vote_map = {"up": Rating.VOTE_UP, "down": Rating.VOTE_DOWN}
vote_map = {"up": sabnzbd.Rating.VOTE_UP, "down": sabnzbd.Rating.VOTE_DOWN}
flag_map = {
"spam": Rating.FLAG_SPAM,
"encrypted": Rating.FLAG_ENCRYPTED,
"expired": Rating.FLAG_EXPIRED,
"other": Rating.FLAG_OTHER,
"comment": Rating.FLAG_COMMENT,
"spam": sabnzbd.Rating.FLAG_SPAM,
"encrypted": sabnzbd.Rating.FLAG_ENCRYPTED,
"expired": sabnzbd.Rating.FLAG_EXPIRED,
"other": sabnzbd.Rating.FLAG_OTHER,
"comment": sabnzbd.Rating.FLAG_COMMENT,
}
content_type = kwargs.get("type")
setting = kwargs.get("setting")
@@ -326,7 +321,7 @@ def _api_queue_rating(output, value, kwargs):
if content_type == "flag":
flag = flag_map[setting]
if cfg.rating_enable():
Rating.do.update_user_rating(value, video, audio, vote, flag, kwargs.get("detail"))
sabnzbd.Rating.update_user_rating(value, video, audio, vote, flag, kwargs.get("detail"))
return report(output)
except:
return report(output, _MSG_BAD_SERVER_PARMS)
@@ -389,7 +384,7 @@ def _api_retry(name, output, kwargs):
def _api_cancel_pp(name, output, kwargs):
""" API: accepts name, output, value(=nzo_id) """
nzo_id = kwargs.get("value")
if PostProcessor.do.cancel_pp(nzo_id):
if sabnzbd.PostProcessor.cancel_pp(nzo_id):
return report(output, keyword="", data={"status": True, "nzo_id": nzo_id})
else:
return report(output, _MSG_NO_ITEM)
@@ -438,7 +433,7 @@ def _api_switch(name, output, kwargs):
value = kwargs.get("value")
value2 = kwargs.get("value2")
if value and value2:
pos, prio = NzbQueue.do.switch(value, value2)
pos, prio = sabnzbd.NzbQueue.switch(value, value2)
# Returns the new position and new priority (if different)
return report(output, keyword="result", data={"position": pos, "priority": prio})
else:
@@ -454,7 +449,7 @@ def _api_change_cat(name, output, kwargs):
cat = value2
if cat == "None":
cat = None
result = NzbQueue.do.change_cat(nzo_id, cat)
result = sabnzbd.NzbQueue.change_cat(nzo_id, cat)
return report(output, keyword="status", data=bool(result > 0))
else:
return report(output, _MSG_NO_VALUE)
@@ -469,7 +464,7 @@ def _api_change_script(name, output, kwargs):
script = value2
if script.lower() == "none":
script = None
result = NzbQueue.do.change_script(nzo_id, script)
result = sabnzbd.NzbQueue.change_script(nzo_id, script)
return report(output, keyword="status", data=bool(result > 0))
else:
return report(output, _MSG_NO_VALUE)
@@ -481,7 +476,7 @@ def _api_change_opts(name, output, kwargs):
value2 = kwargs.get("value2")
result = 0
if value and value2 and value2.isdigit():
result = NzbQueue.do.change_opts(value, int(value2))
result = sabnzbd.NzbQueue.change_opts(value, int(value2))
return report(output, keyword="status", data=bool(result > 0))
@@ -534,7 +529,7 @@ def _api_history(name, output, kwargs):
return report(output, _MSG_NO_VALUE)
elif not name:
history = {}
grand, month, week, day = BPSMeter.do.get_sums()
grand, month, week, day = sabnzbd.BPSMeter.get_sums()
history["total_size"], history["month_size"], history["week_size"], history["day_size"] = (
to_units(grand),
to_units(month),
@@ -580,14 +575,14 @@ def _api_addurl(name, output, kwargs):
def _api_pause(name, output, kwargs):
""" API: accepts output """
scheduler.plan_resume(0)
Downloader.do.pause()
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.Downloader.pause()
return report(output)
def _api_resume(name, output, kwargs):
""" API: accepts output """
scheduler.plan_resume(0)
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.unpause_all()
return report(output)
@@ -654,13 +649,14 @@ def _api_restart_repair(name, output, kwargs):
""" API: accepts output """
logging.info("Queue repair requested by API")
sabnzbd.request_repair()
sabnzbd.trigger_restart()
# Do the shutdown async to still send goodbye to browser
Thread(target=sabnzbd.trigger_restart, kwargs={"timeout": 1}).start()
return report(output)
def _api_disconnect(name, output, kwargs):
""" API: accepts output """
Downloader.do.disconnect()
sabnzbd.Downloader.disconnect()
return report(output)
@@ -673,7 +669,7 @@ def _api_osx_icon(name, output, kwargs):
def _api_rescan(name, output, kwargs):
""" API: accepts output """
NzbQueue.do.scan_jobs(all_jobs=False, action=True)
sabnzbd.NzbQueue.scan_jobs(all_jobs=False, action=True)
return report(output)
@@ -692,26 +688,26 @@ def _api_eval_sort(name, output, kwargs):
def _api_watched_now(name, output, kwargs):
""" API: accepts output """
sabnzbd.dirscanner.dirscan()
sabnzbd.DirScanner.scan()
return report(output)
def _api_resume_pp(name, output, kwargs):
""" API: accepts output """
PostProcessor.do.paused = False
sabnzbd.PostProcessor.paused = False
return report(output)
def _api_pause_pp(name, output, kwargs):
""" API: accepts output """
PostProcessor.do.paused = True
sabnzbd.PostProcessor.paused = True
return report(output)
def _api_rss_now(name, output, kwargs):
""" API: accepts output """
# Run RSS scan async, because it can take a long time
scheduler.force_rss()
sabnzbd.Scheduler.force_rss()
return report(output)
@@ -722,7 +718,8 @@ def _api_retry_all(name, output, kwargs):
def _api_reset_quota(name, output, kwargs):
""" Reset quota left """
BPSMeter.do.reset_quota(force=True)
sabnzbd.BPSMeter.reset_quota(force=True)
return report(output)
def _api_test_email(name, output, kwargs):
@@ -827,13 +824,13 @@ def _api_config_speedlimit(output, kwargs):
value = kwargs.get("value")
if not value:
value = "0"
Downloader.do.limit_speed(value)
sabnzbd.Downloader.limit_speed(value)
return report(output)
def _api_config_get_speedlimit(output, kwargs):
""" API: accepts output """
return report(output, keyword="speedlimit", data=Downloader.do.get_limit())
return report(output, keyword="speedlimit", data=sabnzbd.Downloader.get_limit())
def _api_config_set_colorscheme(output, kwargs):
@@ -849,7 +846,7 @@ def _api_config_set_colorscheme(output, kwargs):
def _api_config_set_pause(output, kwargs):
""" API: accepts output, value(=pause interval) """
value = kwargs.get("value")
scheduler.plan_resume(int_conv(value))
sabnzbd.Scheduler.plan_resume(int_conv(value))
return report(output)
@@ -898,16 +895,24 @@ def _api_config_undefined(output, kwargs):
def _api_server_stats(name, output, kwargs):
""" API: accepts output """
sum_t, sum_m, sum_w, sum_d = BPSMeter.do.get_sums()
sum_t, sum_m, sum_w, sum_d = sabnzbd.BPSMeter.get_sums()
stats = {"total": sum_t, "month": sum_m, "week": sum_w, "day": sum_d, "servers": {}}
for svr in config.get_servers():
t, m, w, d, daily = BPSMeter.do.amounts(svr)
t, m, w, d, daily = sabnzbd.BPSMeter.amounts(svr)
stats["servers"][svr] = {"total": t or 0, "month": m or 0, "week": w or 0, "day": d or 0, "daily": daily or {}}
return report(output, keyword="", data=stats)
def _api_gc_stats(name, output, kwargs):
"""Function only intended for internal testing of the memory handling"""
# Collect before we check
gc.collect()
# We cannot create any lists/dicts, as they would create a reference
return report(output, data=[str(obj) for obj in gc.get_objects() if isinstance(obj, sabnzbd.nzbstuff.TryList)])
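The new `_api_gc_stats` endpoint above counts live `TryList` objects via `gc.get_objects()` while carefully avoiding intermediate containers that would pin extra references. The same counting trick in isolation (the `Leaky` class is illustrative):

```python
import gc

class Leaky:
    """Illustrative class; the API above instead counts sabnzbd.nzbstuff.TryList."""

def count_instances(cls):
    # Collect first so already-unreachable objects do not inflate the count;
    # a generator expression avoids building a list that holds extra references
    gc.collect()
    return sum(1 for obj in gc.get_objects() if isinstance(obj, cls))

keep = [Leaky(), Leaky()]
assert count_instances(Leaky) == 2
del keep
assert count_instances(Leaky) == 0  # dropped references are gone after collect
```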
##############################################################################
_api_table = {
"server_stats": (_api_server_stats, 2),
@@ -944,6 +949,7 @@ _api_table = {
"restart_repair": (_api_restart_repair, 2),
"disconnect": (_api_disconnect, 2),
"osx_icon": (_api_osx_icon, 3),
"gc_stats": (_api_gc_stats, 3),
"rescan": (_api_rescan, 2),
"eval_sort": (_api_eval_sort, 2),
"watched_now": (_api_watched_now, 2),
@@ -1119,7 +1125,7 @@ def handle_server_api(output, kwargs):
else:
config.ConfigServer(name, kwargs)
old_name = None
Downloader.do.update_server(old_name, name)
sabnzbd.Downloader.update_server(old_name, name)
return name
@@ -1180,7 +1186,7 @@ def build_status(skip_dashboard=False, output=None):
info["logfile"] = sabnzbd.LOGFILE
info["weblogfile"] = sabnzbd.WEBLOGFILE
info["loglevel"] = str(cfg.log_level())
info["folders"] = NzbQueue.do.scan_jobs(all_jobs=False, action=False)
info["folders"] = sabnzbd.NzbQueue.scan_jobs(all_jobs=False, action=False)
info["configfn"] = config.get_filename()
# Dashboard: Speed of System
@@ -1211,7 +1217,7 @@ def build_status(skip_dashboard=False, output=None):
info["dnslookup"] = None
info["servers"] = []
servers = sorted(Downloader.do.servers[:], key=lambda svr: "%02d%s" % (svr.priority, svr.displayname.lower()))
servers = sorted(sabnzbd.Downloader.servers[:], key=lambda svr: "%02d%s" % (svr.priority, svr.displayname.lower()))
for server in servers:
serverconnections = []
connected = 0
@@ -1343,7 +1349,7 @@ def build_queue(start=0, limit=0, trans=False, output=None, search=None):
slot["mb_fmt"] = locale.format_string("%d", int(mb), True)
slot["mbdone_fmt"] = locale.format_string("%d", int(mb - mbleft), True)
if not Downloader.do.paused and status not in (Status.PAUSED, Status.FETCHING, Status.GRABBING):
if not sabnzbd.Downloader.paused and status not in (Status.PAUSED, Status.FETCHING, Status.GRABBING):
if is_propagating:
slot["status"] = Status.PROP
elif status == Status.CHECKING:
@@ -1357,8 +1363,8 @@ def build_queue(start=0, limit=0, trans=False, output=None, search=None):
slot["status"] = "%s" % status
if (
Downloader.do.paused
or Downloader.do.postproc
sabnzbd.Downloader.paused
or sabnzbd.Downloader.postproc
or is_propagating
or status not in (Status.DOWNLOADING, Status.FETCHING, Status.QUEUED)
) and priority != FORCE_PRIORITY:
@@ -1381,7 +1387,7 @@ def build_queue(start=0, limit=0, trans=False, output=None, search=None):
else:
slot["avg_age"] = calc_age(average_date, bool(trans))
rating = Rating.do.get_rating_by_nzo(nzo_id)
rating = sabnzbd.Rating.get_rating_by_nzo(nzo_id)
slot["has_rating"] = rating is not None
if rating:
slot["rating_avg_video"] = rating.avg_video
@@ -1400,17 +1406,17 @@ def build_queue(start=0, limit=0, trans=False, output=None, search=None):
def fast_queue():
""" Return paused, bytes_left, bpsnow, time_left """
bytes_left = NzbQueue.do.remaining()
paused = Downloader.do.paused
bpsnow = BPSMeter.do.bps
bytes_left = sabnzbd.NzbQueue.remaining()
paused = sabnzbd.Downloader.paused
bpsnow = sabnzbd.BPSMeter.bps
time_left = calc_timeleft(bytes_left, bpsnow)
return paused, bytes_left, bpsnow, time_left
def build_file_list(nzo_id):
def build_file_list(nzo_id: str):
"""Build file lists for specified job"""
jobs = []
nzo = NzbQueue.do.get_nzo(nzo_id)
nzo = sabnzbd.NzbQueue.get_nzo(nzo_id)
if nzo:
pnfo = nzo.gather_info(full=True)
@@ -1487,7 +1493,7 @@ def retry_job(job, new_nzb=None, password=None):
nzo_id = sabnzbd.add_url(url, pp, script, cat)
else:
path = history_db.get_path(job)
nzo_id = NzbQueue.do.repair_job(path, new_nzb, password)
nzo_id = sabnzbd.NzbQueue.repair_job(path, new_nzb, password)
if nzo_id:
# Only remove from history if we repaired something
history_db.remove_history(job)
@@ -1516,9 +1522,9 @@ def del_job_files(job_paths):
def del_hist_job(job, del_files):
""" Remove history element """
if job:
path = PostProcessor.do.get_path(job)
path = sabnzbd.PostProcessor.get_path(job)
if path:
PostProcessor.do.delete(job, del_files=del_files)
sabnzbd.PostProcessor.delete(job, del_files=del_files)
else:
history_db = sabnzbd.get_db_connection()
remove_all(history_db.get_path(job), recursive=True)
@@ -1568,10 +1574,10 @@ def build_header(webdir="", output=None, trans_functions=True):
except:
uptime = "-"
speed_limit = Downloader.do.get_limit()
speed_limit = sabnzbd.Downloader.get_limit()
if speed_limit <= 0:
speed_limit = 100
speed_limit_abs = Downloader.do.get_limit_abs()
speed_limit_abs = sabnzbd.Downloader.get_limit_abs()
if speed_limit_abs <= 0:
speed_limit_abs = ""
@@ -1603,14 +1609,14 @@ def build_header(webdir="", output=None, trans_functions=True):
header["darwin"] = sabnzbd.DARWIN
header["power_options"] = sabnzbd.WIN32 or sabnzbd.DARWIN or sabnzbd.LINUX_POWER
header["pp_pause_event"] = sabnzbd.scheduler.pp_pause_event()
header["pp_pause_event"] = sabnzbd.Scheduler.pp_pause_event
header["apikey"] = cfg.api_key()
header["new_release"], header["new_rel_url"] = sabnzbd.NEW_VERSION
header["version"] = sabnzbd.__version__
header["paused"] = bool(Downloader.do.paused or Downloader.do.postproc)
header["pause_int"] = scheduler.pause_int()
header["paused"] = bool(sabnzbd.Downloader.paused or sabnzbd.Downloader.postproc)
header["pause_int"] = sabnzbd.Scheduler.pause_int()
header["paused_all"] = sabnzbd.PAUSED_ALL
header["diskspace1"] = "%.2f" % diskspace_info["download_dir"][1]
@@ -1626,11 +1632,11 @@ def build_header(webdir="", output=None, trans_functions=True):
header["have_warnings"] = str(sabnzbd.GUIHANDLER.count())
header["finishaction"] = sabnzbd.QUEUECOMPLETE
header["quota"] = to_units(BPSMeter.do.quota)
header["have_quota"] = bool(BPSMeter.do.quota > 0.0)
header["left_quota"] = to_units(BPSMeter.do.left)
header["quota"] = to_units(sabnzbd.BPSMeter.quota)
header["have_quota"] = bool(sabnzbd.BPSMeter.quota > 0.0)
header["left_quota"] = to_units(sabnzbd.BPSMeter.left)
anfo = ArticleCache.do.cache_info()
anfo = sabnzbd.ArticleCache.cache_info()
header["cache_art"] = str(anfo.article_sum)
header["cache_size"] = to_units(anfo.cache_size, "B")
header["cache_max"] = str(anfo.cache_limit)
@@ -1643,8 +1649,8 @@ def build_queue_header(search=None, start=0, limit=0, output=None):
header = build_header(output=output)
bytespersec = BPSMeter.do.bps
qnfo = NzbQueue.do.queue_info(search=search, start=start, limit=limit)
bytespersec = sabnzbd.BPSMeter.bps
qnfo = sabnzbd.NzbQueue.queue_info(search=search, start=start, limit=limit)
bytesleft = qnfo.bytes_left
bytes_total = qnfo.bytes
@@ -1657,7 +1663,7 @@ def build_queue_header(search=None, start=0, limit=0, output=None):
header["size"] = to_units(bytes_total, "B")
header["noofslots_total"] = qnfo.q_fullsize
if Downloader.do.paused or Downloader.do.postproc:
if sabnzbd.Downloader.paused or sabnzbd.Downloader.postproc:
status = Status.PAUSED
elif bytespersec > 0:
status = Status.DOWNLOADING
@@ -1682,7 +1688,7 @@ def build_history(start=0, limit=0, search=None, failed_only=0, categories=None)
limit = 1000000
# Grab any items that are active or queued in postproc
postproc_queue = PostProcessor.do.get_queue()
postproc_queue = sabnzbd.PostProcessor.get_queue()
# Filter out any items that don't match the search term or category
if postproc_queue:
@@ -1762,7 +1768,7 @@ def build_history(start=0, limit=0, search=None, failed_only=0, categories=None)
item["retry"] = True
if rating_enabled:
rating = Rating.do.get_rating_by_nzo(item["nzo_id"])
rating = sabnzbd.Rating.get_rating_by_nzo(item["nzo_id"])
item["has_rating"] = rating is not None
if rating:
item["rating_avg_video"] = rating.avg_video
@@ -1848,7 +1854,7 @@ def calc_timeleft(bytesleft, bps):
return "0:00:00"
def list_scripts(default=False, none=True):
def list_scripts(default: bool = False, none: bool = True) -> List[str]:
""" Return a list of script names, optionally with 'Default' added """
lst = []
path = cfg.script_dir.get_path()
@@ -1865,6 +1871,8 @@ def list_scripts(default=False, none=True):
or (not sabnzbd.WIN32 and userxbit(script) and not os.path.basename(script).startswith("."))
):
lst.append(os.path.basename(script))
# Make sure capitalization is ignored to avoid strange results
lst = sorted(lst, key=str.casefold)
if none:
lst.insert(0, "None")
if default:
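The newly added `key=str.casefold` sort above is what makes the script drop-down alphabetical regardless of capitalization: a plain `sorted()` groups all uppercase names before lowercase ones. A quick illustration:

```python
scripts = ["cleanup.py", "Notify.py", "archive.sh"]

# Default sort is by code point, so uppercase 'N' comes before 'a'
assert sorted(scripts) == ["Notify.py", "archive.sh", "cleanup.py"]

# casefold compares caseless, giving the order users expect
assert sorted(scripts, key=str.casefold) == ["archive.sh", "cleanup.py", "Notify.py"]
```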
@@ -1913,7 +1921,7 @@ def del_from_section(kwargs):
del item
config.save_config()
if section == "servers":
Downloader.do.update_server(keyword, None)
sabnzbd.Downloader.update_server(keyword, None)
return True
else:
return False
@@ -1922,15 +1930,13 @@ def del_from_section(kwargs):
def history_remove_failed():
""" Remove all failed jobs from history, including files """
logging.info("Scheduled removal of all failed jobs")
history_db = HistoryDB()
del_job_files(history_db.get_failed_paths())
history_db.remove_failed()
history_db.close()
with HistoryDB() as history_db:
del_job_files(history_db.get_failed_paths())
history_db.remove_failed()
def history_remove_completed():
""" Remove all completed jobs from history """
logging.info("Scheduled removal of all completed jobs")
history_db = HistoryDB()
history_db.remove_completed()
history_db.close()
with HistoryDB() as history_db:
history_db.remove_completed()
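The two history-cleanup functions above switch from explicit `close()` calls to using `HistoryDB` as a context manager, which guarantees the connection is released even if an exception escapes mid-block. A minimal sketch of the protocol those `with` blocks rely on (`FakeDB` is illustrative, not SABnzbd's class):

```python
class FakeDB:
    def __init__(self):
        self.closed = False

    def remove_completed(self):
        if self.closed:
            raise RuntimeError("connection already closed")

    def __enter__(self):
        # `with FakeDB() as db:` binds the instance itself
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.closed = True
        return False  # do not swallow exceptions

with FakeDB() as db:
    db.remove_completed()
assert db.closed  # closed automatically when the block ends
```

Returning `False` from `__exit__` lets exceptions propagate while still performing the cleanup, which is exactly the behavior the manual `close()` version could not guarantee.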


@@ -22,10 +22,12 @@ sabnzbd.articlecache - Article cache handling
import logging
import threading
import struct
from typing import Dict, List
import sabnzbd
from sabnzbd.decorators import synchronized
from sabnzbd.constants import GIGI, ANFO, MEBI, LIMIT_DECODE_QUEUE, MIN_DECODE_QUEUE
from sabnzbd.nzbstuff import Article
# Operations on the article table are handled via try/except.
# The counters need to be made atomic to ensure consistency.
@@ -33,13 +35,11 @@ ARTICLE_COUNTER_LOCK = threading.RLock()
class ArticleCache:
do = None
def __init__(self):
self.__cache_limit_org = 0
self.__cache_limit = 0
self.__cache_size = 0
self.__article_table = {} # Dict of buffered articles
self.__article_table: Dict[Article, bytes] = {} # Dict of buffered articles
# Limit for the decoder is based on the total available cache
# so it can be larger on memory-rich systems
@@ -51,12 +51,10 @@ class ArticleCache:
if sabnzbd.DARWIN or sabnzbd.WIN64 or (struct.calcsize("P") * 8) == 64:
self.__cache_upper_limit = 4 * GIGI
ArticleCache.do = self
def cache_info(self):
return ANFO(len(self.__article_table), abs(self.__cache_size), self.__cache_limit_org)
def new_limit(self, limit):
def new_limit(self, limit: int):
""" Called when cache limit changes """
self.__cache_limit_org = limit
if limit < 0:
@@ -71,20 +69,20 @@ class ArticleCache:
self.decoder_cache_article_limit = max(decoder_cache_limit, MIN_DECODE_QUEUE)
@synchronized(ARTICLE_COUNTER_LOCK)
def reserve_space(self, data_size):
def reserve_space(self, data_size: int):
""" Reserve space in the cache """
self.__cache_size += data_size
@synchronized(ARTICLE_COUNTER_LOCK)
def free_reserved_space(self, data_size):
def free_reserved_space(self, data_size: int):
""" Remove previously reserved space """
self.__cache_size -= data_size
def space_left(self):
def space_left(self) -> bool:
""" Is there space left in the set limit? """
return self.__cache_size < self.__cache_limit
def save_article(self, article, data):
def save_article(self, article: Article, data: bytes):
""" Save article in cache, either memory or disk """
nzo = article.nzf.nzo
if nzo.is_gone():
@@ -116,7 +114,7 @@ class ArticleCache:
# No data saved in memory, direct to disk
self.__flush_article_to_disk(article, data)
def load_article(self, article):
def load_article(self, article: Article):
""" Load the data of the article """
data = None
nzo = article.nzf.nzo
@@ -131,7 +129,7 @@ class ArticleCache:
logging.debug("Failed to load %s from cache, probably already deleted", article)
return data
elif article.art_id:
data = sabnzbd.load_data(article.art_id, nzo.workpath, remove=True, do_pickle=False, silent=True)
data = sabnzbd.load_data(article.art_id, nzo.admin_path, remove=True, do_pickle=False, silent=True)
nzo.remove_saved_article(article)
return data
@@ -146,7 +144,7 @@ class ArticleCache:
# Could fail if already deleted by purge_articles or load_data
logging.debug("Failed to flush item from cache, probably already deleted or written to disk")
def purge_articles(self, articles):
def purge_articles(self, articles: List[Article]):
""" Remove all saved articles, from memory and disk """
logging.debug("Purging %s articles from the cache/disk", len(articles))
for article in articles:
@@ -158,10 +156,10 @@ class ArticleCache:
# Could fail if already deleted by flush_articles or load_data
logging.debug("Failed to flush %s from cache, probably already deleted or written to disk", article)
elif article.art_id:
sabnzbd.remove_data(article.art_id, article.nzf.nzo.workpath)
sabnzbd.remove_data(article.art_id, article.nzf.nzo.admin_path)
@staticmethod
def __flush_article_to_disk(article, data):
def __flush_article_to_disk(article: Article, data):
nzo = article.nzf.nzo
if nzo.is_gone():
# Don't store deleted jobs
@@ -169,8 +167,4 @@ class ArticleCache:
# Save data, but don't complain when destination folder is missing
# because this flush may come after completion of the NZO.
sabnzbd.save_data(data, article.get_art_id(), nzo.workpath, do_pickle=False, silent=True)
# Create the instance
ArticleCache()
sabnzbd.save_data(data, article.get_art_id(), nzo.admin_path, do_pickle=False, silent=True)
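This hunk, like most of the diff, retires the `ClassName.do = self` singleton convention in favour of instances stored as attributes on the `sabnzbd` package (e.g. `sabnzbd.ArticleCache`). A side-by-side sketch of the two styles, with hypothetical names (`CacheOld`, `CacheNew`, `app` are illustrative):

```python
import types

# Old style: the class itself holds a reference to its one instance,
# and every caller reaches it through ClassName.do
class CacheOld:
    do = None

    def __init__(self):
        CacheOld.do = self

# New style: the instance lives as a plain attribute on a shared module;
# SimpleNamespace stands in for the `sabnzbd` package here
app = types.SimpleNamespace()

class CacheNew:
    def cache_info(self):
        return "ok"

CacheOld()                      # constructor registers the hidden global
app.ArticleCache = CacheNew()   # explicit assignment at startup instead

assert CacheOld.do is not None
assert app.ArticleCache.cache_info() == "ok"
```

The module-attribute style makes the wiring visible at startup and removes the import cycles that the `.do` back-references tended to create.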


@@ -26,46 +26,43 @@ import re
from threading import Thread
from time import sleep
import hashlib
from typing import Tuple, Optional, List
import sabnzbd
from sabnzbd.misc import get_all_passwords
from sabnzbd.filesystem import set_permissions, clip_path, has_win_device, diskspace, get_filename, get_ext
from sabnzbd.constants import Status, GIGI, MAX_ASSEMBLER_QUEUE
import sabnzbd.cfg as cfg
from sabnzbd.articlecache import ArticleCache
from sabnzbd.postproc import PostProcessor
from sabnzbd.nzbstuff import NzbObject, NzbFile
import sabnzbd.downloader
import sabnzbd.par2file as par2file
import sabnzbd.utils.rarfile as rarfile
from sabnzbd.rating import Rating
class Assembler(Thread):
do = None # Link to the instance of this method
def __init__(self):
Thread.__init__(self)
self.queue = queue.Queue()
Assembler.do = self
self.queue: queue.Queue[Tuple[Optional[NzbObject], Optional[NzbFile], Optional[bool]]] = queue.Queue()
def stop(self):
self.process(None)
self.queue.put((None, None, None))
def process(self, job):
self.queue.put(job)
def process(self, nzo: NzbObject, nzf: Optional[NzbFile] = None, file_done: Optional[bool] = None):
self.queue.put((nzo, nzf, file_done))
def queue_full(self):
return self.queue.qsize() >= MAX_ASSEMBLER_QUEUE
def run(self):
while 1:
job = self.queue.get()
if not job:
# Set NzbObject and NzbFile objects to None so references
# from this thread do not keep the objects alive (see #1628)
nzo = nzf = None
nzo, nzf, file_done = self.queue.get()
if not nzo:
logging.info("Shutting down")
break
nzo, nzf, file_done = job
if nzf:
# Check if enough disk space is free after each file is done
# If not enough space left, pause downloader and send email
@@ -74,10 +71,10 @@ class Assembler(Thread):
and diskspace(force=True)["download_dir"][1] < (cfg.download_free.get_float() + nzf.bytes) / GIGI
):
# Only warn and email once
if not sabnzbd.downloader.Downloader.do.paused:
if not sabnzbd.Downloader.paused:
logging.warning(T("Too little diskspace forcing PAUSE"))
# Pause downloader, but don't save, since the disk is almost full!
sabnzbd.downloader.Downloader.do.pause()
sabnzbd.Downloader.pause()
sabnzbd.emailer.diskfull_mail()
# Abort all direct unpackers, just to be sure
sabnzbd.directunpacker.abort_all()
@@ -100,7 +97,7 @@ class Assembler(Thread):
# Log traceback
logging.info("Traceback: ", exc_info=True)
# Pause without saving
sabnzbd.downloader.Downloader.do.pause()
sabnzbd.Downloader.pause()
continue
except:
logging.error(T("Fatal error in Assembler"), exc_info=True)
@@ -135,14 +132,16 @@ class Assembler(Thread):
nzo.final_name,
)
nzo.fail_msg = T("Aborted, encryption detected")
sabnzbd.nzbqueue.NzbQueue.do.end_job(nzo)
sabnzbd.NzbQueue.end_job(nzo)
if unwanted_file:
logging.warning(
T('In "%s" unwanted extension in RAR file. Unwanted file is %s '),
nzo.final_name,
unwanted_file,
)
# Don't repeat the warning after a user override of an unwanted extension pause
if nzo.unwanted_ext == 0:
logging.warning(
T('In "%s" unwanted extension in RAR file. Unwanted file is %s '),
nzo.final_name,
unwanted_file,
)
logging.debug(T("Unwanted extension is in rar file %s"), filepath)
if cfg.action_on_unwanted_extensions() == 1 and nzo.unwanted_ext == 0:
logging.debug("Unwanted extension ... pausing")
@@ -151,7 +150,7 @@ class Assembler(Thread):
if cfg.action_on_unwanted_extensions() == 2:
logging.debug("Unwanted extension ... aborting")
nzo.fail_msg = T("Aborted, unwanted extension detected")
sabnzbd.nzbqueue.NzbQueue.do.end_job(nzo)
sabnzbd.NzbQueue.end_job(nzo)
# Add to direct unpack
nzo.add_to_direct_unpacker(nzf)
@@ -175,14 +174,14 @@ class Assembler(Thread):
reason,
)
nzo.fail_msg = T("Aborted, rating filter matched (%s)") % reason
sabnzbd.nzbqueue.NzbQueue.do.end_job(nzo)
sabnzbd.NzbQueue.end_job(nzo)
else:
sabnzbd.nzbqueue.NzbQueue.do.remove(nzo.nzo_id, add_to_history=False, cleanup=False)
PostProcessor.do.process(nzo)
sabnzbd.NzbQueue.remove(nzo.nzo_id, cleanup=False)
sabnzbd.PostProcessor.process(nzo)
@staticmethod
def assemble(nzf, file_done):
def assemble(nzf: NzbFile, file_done: bool):
"""Assemble a NZF from its table of articles
1) Partial write: write what we have
2) Nothing written before: write all
@@ -203,7 +202,7 @@ class Assembler(Thread):
# Write all decoded articles
if article.decoded:
data = ArticleCache.do.load_article(article)
data = sabnzbd.ArticleCache.load_article(article)
# Could be empty in case nzo was deleted
if data:
fout.write(data)
@@ -226,14 +225,14 @@ class Assembler(Thread):
nzf.md5sum = nzf.md5.digest()
def file_has_articles(nzf):
def file_has_articles(nzf: NzbFile):
"""Do a quick check to see if any articles are present for this file.
Destructive: only to be used to differentiate between unknown encoding and no articles.
"""
has = False
for article in nzf.decodetable:
sleep(0.01)
data = ArticleCache.do.load_article(article)
data = sabnzbd.ArticleCache.load_article(article)
if data:
has = True
return has
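The `Assembler` changes earlier in this file replace a single opaque `job` object with a typed `(nzo, nzf, file_done)` tuple and shut the thread down by queueing `(None, None, None)`. A minimal sketch of that sentinel-tuple pattern (names here are illustrative):

```python
import queue
import threading
from typing import Optional, Tuple

def worker(q: "queue.Queue[Tuple[Optional[str], Optional[bool]]]", results: list):
    while True:
        name, file_done = q.get()
        if name is None:  # sentinel tuple means: shut down
            break
        results.append((name, file_done))

q: "queue.Queue[Tuple[Optional[str], Optional[bool]]]" = queue.Queue()
results = []
t = threading.Thread(target=worker, args=(q, results))
t.start()
q.put(("file.rar", True))
q.put((None, None))  # the equivalent of Assembler.stop()
t.join()
assert results == [("file.rar", True)]
```

Unpacking the tuple at `q.get()` also lets the loop drop its references between iterations, which is the same lifetime concern the original `nzo = nzf = None` workaround (#1628) addressed.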
@@ -243,7 +242,7 @@ RE_SUBS = re.compile(r"\W+sub|subs|subpack|subtitle|subtitles(?![a-z])", re.I)
SAFE_EXTS = (".mkv", ".mp4", ".avi", ".wmv", ".mpg", ".webm")
def is_cloaked(nzo, path, names):
def is_cloaked(nzo: NzbObject, path: str, names: List[str]) -> bool:
""" Return True if this is likely to be a cloaked encrypted post """
fname = os.path.splitext(get_filename(path.lower()))[0]
for name in names:
@@ -272,7 +271,7 @@ def is_cloaked(nzo, path, names):
return False
def check_encrypted_and_unwanted_files(nzo, filepath):
def check_encrypted_and_unwanted_files(nzo: NzbObject, filepath: str) -> Tuple[bool, Optional[str]]:
""" Combines check for unwanted and encrypted files to save on CPU and IO """
encrypted = False
unwanted = None
@@ -366,9 +365,9 @@ def check_encrypted_and_unwanted_files(nzo, filepath):
return encrypted, unwanted
def nzo_filtered_by_rating(nzo):
if Rating.do and cfg.rating_enable() and cfg.rating_filter_enable() and (nzo.rating_filtered < 2):
rating = Rating.do.get_rating_by_nzo(nzo.nzo_id)
def nzo_filtered_by_rating(nzo: NzbObject) -> Tuple[int, str]:
if cfg.rating_enable() and cfg.rating_filter_enable() and (nzo.rating_filtered < 2):
rating = sabnzbd.Rating.get_rating_by_nzo(nzo.nzo_id)
if rating is not None:
nzo.rating_filtered = 1
reason = rating_filtered(rating, nzo.filename.lower(), True)


@@ -22,6 +22,7 @@ sabnzbd.bpsmeter - bpsmeter
import time
import logging
import re
from typing import List, Dict
import sabnzbd
from sabnzbd.constants import BYTES_FILE_NAME, KIBI
@@ -87,8 +88,6 @@ def next_month(t):
class BPSMeter:
do = None
def __init__(self):
t = time.time()
self.start_time = t
@@ -96,20 +95,20 @@ class BPSMeter:
self.speed_log_time = t
self.last_update = t
self.bps = 0.0
self.bps_list = []
self.bps_list: List[int] = []
self.bps_list_max = 275
self.day_total = {}
self.week_total = {}
self.month_total = {}
self.grand_total = {}
self.day_total: Dict[str, int] = {}
self.week_total: Dict[str, int] = {}
self.month_total: Dict[str, int] = {}
self.grand_total: Dict[str, int] = {}
self.timeline_total = {}
self.timeline_total: Dict[str, Dict[str, int]] = {}
self.day_label = time.strftime("%Y-%m-%d")
self.end_of_day = tomorrow(t) # Time that current day will end
self.end_of_week = next_week(t) # Time that current day will end
self.end_of_month = next_month(t) # Time that current month will end
self.day_label: str = time.strftime("%Y-%m-%d")
self.end_of_day: float = tomorrow(t) # Time that current day will end
self.end_of_week: float = next_week(t) # Time that current day will end
self.end_of_month: float = next_month(t) # Time that current month will end
self.q_day = 1 # Day of quota reset
self.q_period = "m" # Daily/Weekly/Monthly quota = d/w/m
self.quota = self.left = 0.0 # Quota and remaining quota
@@ -118,32 +117,32 @@ class BPSMeter:
self.q_hour = 0 # Quota reset hour
self.q_minute = 0 # Quota reset minute
self.quota_enabled = True # Scheduled quota enable/disable
BPSMeter.do = self
def save(self):
""" Save admin to disk """
data = (
self.last_update,
self.grand_total,
self.day_total,
self.week_total,
self.month_total,
self.end_of_day,
self.end_of_week,
self.end_of_month,
self.quota,
self.left,
self.q_time,
self.timeline_total,
sabnzbd.save_admin(
(
self.last_update,
self.grand_total,
self.day_total,
self.week_total,
self.month_total,
self.end_of_day,
self.end_of_week,
self.end_of_month,
self.quota,
self.left,
self.q_time,
self.timeline_total,
),
BYTES_FILE_NAME,
)
sabnzbd.save_admin(data, BYTES_FILE_NAME)
def defaults(self):
""" Get the latest data from the database and assign to a fake server """
logging.debug("Setting default BPS meter values")
history_db = sabnzbd.database.HistoryDB()
grand, month, week = history_db.get_history_size()
history_db.close()
with sabnzbd.database.HistoryDB() as history_db:
grand, month, week = history_db.get_history_size()
self.grand_total = {}
self.month_total = {}
self.week_total = {}
@@ -235,8 +234,8 @@ class BPSMeter:
if self.have_quota and self.quota_enabled:
self.left -= amount
if self.left <= 0.0:
if sabnzbd.downloader.Downloader.do and not sabnzbd.downloader.Downloader.do.paused:
sabnzbd.downloader.Downloader.do.pause()
if not sabnzbd.Downloader.paused:
sabnzbd.Downloader.pause()
logging.warning(T("Quota spent, pausing downloading"))
# Speedometer
@@ -348,15 +347,14 @@ class BPSMeter:
def reset_quota(self, force=False):
"""Check if it's time to reset the quota, optionally resuming
Return True, when still paused
Return True, when still paused or should be paused
"""
if force or (self.have_quota and time.time() > (self.q_time - 50)):
self.quota = self.left = cfg.quota_size.get_float()
logging.info("Quota was reset to %s", self.quota)
if cfg.quota_resume():
logging.info("Auto-resume due to quota reset")
if sabnzbd.downloader.Downloader.do:
sabnzbd.downloader.Downloader.do.resume()
sabnzbd.Downloader.resume()
self.next_reset()
return False
else:
@@ -464,8 +462,8 @@ class BPSMeter:
@staticmethod
def resume():
""" Resume downloading """
if cfg.quota_resume() and sabnzbd.downloader.Downloader.do and sabnzbd.downloader.Downloader.do.paused:
sabnzbd.downloader.Downloader.do.resume()
if cfg.quota_resume() and sabnzbd.Downloader.paused:
sabnzbd.Downloader.resume()
def midnight(self):
""" Midnight action: dummy update for all servers """
@@ -476,12 +474,4 @@ class BPSMeter:
def quota_handler():
""" To be called from scheduler """
logging.debug("Checking quota")
BPSMeter.do.reset_quota()
def midnight_action():
if BPSMeter.do:
BPSMeter.do.midnight()
BPSMeter()
sabnzbd.BPSMeter.reset_quota()
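The `defaults()` hunk above swaps a manual open/close of `HistoryDB` for a `with` block, which guarantees the connection is closed even if `get_history_size()` raises. A minimal sketch of the context-manager protocol this relies on (the class body here is a stand-in, not the real database code):

```python
class HistoryDB:
    def __init__(self):
        self.closed = False

    def get_history_size(self):
        # Stand-in for the real totals query
        return 0, 0, 0

    def close(self):
        self.closed = True

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Runs on both normal exit and when the body raises
        self.close()

with HistoryDB() as history_db:
    grand, month, week = history_db.get_history_size()
print(history_db.closed)  # -> True
```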


@@ -85,11 +85,11 @@ queue_complete_pers = OptionBool("misc", "queue_complete_pers", False)
bandwidth_perc = OptionNumber("misc", "bandwidth_perc", 0, 0, 100)
refresh_rate = OptionNumber("misc", "refresh_rate", 0)
log_level = OptionNumber("logging", "log_level", 1, -1, 2)
log_size = OptionStr("logging", "max_log_size", "5242880")
log_size = OptionNumber("logging", "max_log_size", 5242880)
log_backups = OptionNumber("logging", "log_backups", 5, 1, 1024)
queue_limit = OptionNumber("misc", "queue_limit", 20, 0)
configlock = OptionBool("misc", "config_lock", 0)
configlock = OptionBool("misc", "config_lock", False)
##############################################################################
@@ -115,7 +115,7 @@ password = OptionPassword("misc", "password")
bandwidth_max = OptionStr("misc", "bandwidth_max")
cache_limit = OptionStr("misc", "cache_limit")
web_dir = OptionStr("misc", "web_dir", DEF_STDINTF)
web_color = OptionStr("misc", "web_color", "")
web_color = OptionStr("misc", "web_color")
https_cert = OptionDir("misc", "https_cert", "server.cert", create=False)
https_key = OptionDir("misc", "https_key", "server.key", create=False)
https_chain = OptionDir("misc", "https_chain", create=False)
@@ -130,7 +130,7 @@ nzb_key = OptionStr("misc", "nzb_key", create_api_key())
##############################################################################
# Config - Folders
##############################################################################
umask = OptionStr("misc", "permissions", "", validation=validate_octal)
umask = OptionStr("misc", "permissions", validation=validate_octal)
download_dir = OptionDir("misc", "download_dir", DEF_DOWNLOAD_DIR, create=False, validation=validate_safedir)
download_free = OptionStr("misc", "download_free")
complete_dir = OptionDir(
@@ -154,14 +154,13 @@ top_only = OptionBool("misc", "top_only", False)
sfv_check = OptionBool("misc", "sfv_check", True)
quick_check_ext_ignore = OptionList("misc", "quick_check_ext_ignore", ["nfo", "sfv", "srr"])
script_can_fail = OptionBool("misc", "script_can_fail", False)
ssl_ciphers = OptionStr("misc", "ssl_ciphers", "") # Now per-server setting
enable_recursive = OptionBool("misc", "enable_recursive", True)
flat_unpack = OptionBool("misc", "flat_unpack", False)
par_option = OptionStr("misc", "par_option", "")
par_option = OptionStr("misc", "par_option")
pre_check = OptionBool("misc", "pre_check", False)
nice = OptionStr("misc", "nice", "", validation=clean_nice_ionice_parameters)
nice = OptionStr("misc", "nice", validation=clean_nice_ionice_parameters)
win_process_prio = OptionNumber("misc", "win_process_prio", 3)
ionice = OptionStr("misc", "ionice", "", validation=clean_nice_ionice_parameters)
ionice = OptionStr("misc", "ionice", validation=clean_nice_ionice_parameters)
fail_hopeless_jobs = OptionBool("misc", "fail_hopeless_jobs", True)
fast_fail = OptionBool("misc", "fast_fail", True)
autodisconnect = OptionBool("misc", "auto_disconnect", True)
@@ -291,7 +290,7 @@ show_sysload = OptionNumber("misc", "show_sysload", 2, 0, 2)
history_limit = OptionNumber("misc", "history_limit", 10, 0)
wait_ext_drive = OptionNumber("misc", "wait_ext_drive", 5, 1, 60)
max_foldername_length = OptionNumber("misc", "max_foldername_length", DEF_FOLDER_MAX, 20, 65000)
marker_file = OptionStr("misc", "nomedia_marker", "")
marker_file = OptionStr("misc", "nomedia_marker")
ipv6_servers = OptionNumber("misc", "ipv6_servers", 1, 0, 2)
url_base = OptionStr("misc", "url_base", "/sabnzbd", validation=validate_strip_right_slash)
host_whitelist = OptionList("misc", "host_whitelist", validation=all_lowercase)
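With `log_size` now an `OptionNumber`, its default is the plain byte count 5242880 (5 MiB). Options that stay strings, such as `bandwidth_max` and `cache_limit`, are parsed through `OptionStr.get_float()`, which accepts KMGT notation via `sabnzbd.misc.from_units`. A rough stand-in for such a parser (an illustrative sketch, not SABnzbd's actual implementation):

```python
def from_units(value: str) -> float:
    """Parse '5M', '1G', etc. into a byte count (1024-based)."""
    units = {"K": 1024, "M": 1024 ** 2, "G": 1024 ** 3, "T": 1024 ** 4}
    value = value.strip().upper()
    if value and value[-1] in units:
        return float(value[:-1]) * units[value[-1]]
    return float(value or 0)

print(from_units("5M"))  # -> 5242880.0
```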


@@ -25,12 +25,13 @@ import re
import shutil
import threading
import uuid
from typing import List, Dict, Any, Callable, Optional, Union, Tuple
from urllib.parse import urlparse
import configobj
import sabnzbd.misc
from sabnzbd.constants import CONFIG_VERSION, NORMAL_PRIORITY, DEFAULT_PRIORITY, MAX_WIN_DFOLDER
from sabnzbd.constants import CONFIG_VERSION, NORMAL_PRIORITY, DEFAULT_PRIORITY
from sabnzbd.decorators import synchronized
from sabnzbd.filesystem import clip_path, real_path, create_real_path, renamer, remove_file, is_writable
@@ -38,7 +39,7 @@ CONFIG_LOCK = threading.Lock()
SAVE_CONFIG_LOCK = threading.Lock()
CFG = {} # Holds INI structure
CFG: configobj.ConfigObj # Holds INI structure
# during re-write this variable is global
# to allow direct access to INI structure
@@ -47,13 +48,13 @@ database = {} # Holds the option dictionary
modified = False # Signals a change in option dictionary
# Should be reset after saving to settings file
paramfinder = re.compile(r"""(?:'.*?')|(?:".*?")|(?:[^'",\s][^,]*)""")
RE_PARAMFINDER = re.compile(r"""(?:'.*?')|(?:".*?")|(?:[^'",\s][^,]*)""")
class Option:
""" Basic option class, basic fields """
def __init__(self, section, keyword, default_val=None, add=True, protect=False):
def __init__(self, section: str, keyword: str, default_val: Any = None, add: bool = True, protect: bool = False):
"""Basic option
`section` : single section or comma-separated list of sections
a list will be a hierarchy: "foo, bar" --> [foo][[bar]]
@@ -63,10 +64,10 @@ class Option:
`protect` : Do not allow setting via the API (specifically set_dict)
"""
self.__sections = section.split(",")
self.__keyword = keyword
self.__default_val = default_val
self.__value = None
self.__callback = None
self.__keyword: str = keyword
self.__default_val: Any = default_val
self.__value: Any = None
self.__callback: Optional[Callable] = None
self.__protect = protect
# Add myself to the config dictionary
@@ -79,34 +80,29 @@ class Option:
anchor = anchor[section]
anchor[keyword] = self
def __call__(self):
""" get() replacement """
return self.get()
def get(self):
def get(self) -> Any:
""" Retrieve value field """
if self.__value is not None:
return self.__value
else:
return self.__default_val
def get_string(self):
def get_string(self) -> str:
return str(self.get())
def get_dict(self, safe=False):
def get_dict(self, safe: bool = False) -> Dict[str, Any]:
""" Return value a dictionary """
return {self.__keyword: self.get()}
def set_dict(self, input_dict):
def set_dict(self, values: Dict[str, Any]):
""" Set value based on dictionary """
if self.__protect:
return False
try:
return self.set(input_dict["value"])
except KeyError:
return False
if not self.__protect:
try:
self.set(values["value"])
except KeyError:
pass
def __set(self, value):
def set(self, value: Any):
""" Set new value, no validation """
global modified
if value is not None:
@@ -115,15 +111,11 @@ class Option:
modified = True
if self.__callback:
self.__callback()
return None
def set(self, value):
return self.__set(value)
def default(self):
def default(self) -> Any:
return self.__default_val
def callback(self, callback):
def callback(self, callback: Callable):
""" Set callback function """
self.__callback = callback
@@ -133,18 +125,26 @@ class Option:
class OptionNumber(Option):
""" Numeric option class, int/float is determined from default value """
"""Numeric option class, int/float is determined from default value."""
def __init__(
self, section, keyword, default_val=0, minval=None, maxval=None, validation=None, add=True, protect=False
self,
section: str,
keyword: str,
default_val: Union[int, float] = 0,
minval: Optional[int] = None,
maxval: Optional[int] = None,
validation: Optional[Callable] = None,
add: bool = True,
protect: bool = False,
):
self.__minval = minval
self.__maxval = maxval
self.__validation = validation
self.__int = isinstance(default_val, int)
self.__minval: Optional[int] = minval
self.__maxval: Optional[int] = maxval
self.__validation: Optional[Callable] = validation
self.__int: bool = isinstance(default_val, int)
super().__init__(section, keyword, default_val, add=add, protect=protect)
def set(self, value):
def set(self, value: Any):
""" set new value, limited by range """
if value is not None:
try:
@@ -155,7 +155,7 @@ class OptionNumber(Option):
except ValueError:
value = super().default()
if self.__validation:
error, val = self.__validation(value)
_, val = self.__validation(value)
super().set(val)
else:
if self.__maxval is not None and value > self.__maxval:
@@ -163,39 +163,49 @@ class OptionNumber(Option):
elif self.__minval is not None and value < self.__minval:
value = self.__minval
super().set(value)
return None
def __call__(self) -> Union[int, float]:
""" get() replacement """
return self.get()
class OptionBool(Option):
""" Boolean option class """
""" Boolean option class, always returns 0 or 1."""
def __init__(self, section, keyword, default_val=False, add=True, protect=False):
def __init__(self, section: str, keyword: str, default_val: bool = False, add: bool = True, protect: bool = False):
super().__init__(section, keyword, int(default_val), add=add, protect=protect)
def set(self, value):
if value is None:
value = 0
try:
super().set(int(value))
except ValueError:
super().set(0)
return None
def set(self, value: Any):
# Store the value as integer, easier to parse when reading the config.
super().set(sabnzbd.misc.int_conv(value))
def __call__(self) -> int:
""" get() replacement """
return int(self.get())
class OptionDir(Option):
""" Directory option class """
def __init__(
self, section, keyword, default_val="", apply_umask=False, create=True, validation=None, writable=True, add=True
self,
section: str,
keyword: str,
default_val: str = "",
apply_umask: bool = False,
create: bool = True,
validation: Optional[Callable] = None,
writable: bool = True,
add: bool = True,
):
self.__validation = validation
self.__root = "" # Base directory for relative paths
self.__apply_umask = apply_umask
self.__create = create
self.__writable = writable
self.__validation: Optional[Callable] = validation
self.__root: str = "" # Base directory for relative paths
self.__apply_umask: bool = apply_umask
self.__create: bool = create
self.__writable: bool = writable
super().__init__(section, keyword, default_val, add=add)
def get(self):
def get(self) -> str:
""" Return value, corrected for platform """
p = super().get()
if sabnzbd.WIN32:
@@ -203,7 +213,7 @@ class OptionDir(Option):
else:
return p.replace("\\", "/") if "\\" in p else p
def get_path(self):
def get_path(self) -> str:
""" Return full absolute path """
value = self.get()
path = ""
@@ -213,11 +223,11 @@ class OptionDir(Option):
_, path, _ = create_real_path(self.ident()[1], self.__root, value, self.__apply_umask, self.__writable)
return path
def get_clipped_path(self):
def get_clipped_path(self) -> str:
""" Return clipped full absolute path """
return clip_path(self.get_path())
def test_path(self):
def test_path(self) -> bool:
""" Return True if path exists """
value = self.get()
if value:
@@ -225,18 +235,18 @@ class OptionDir(Option):
else:
return False
def set_root(self, root):
def set_root(self, root: str):
""" Set new root, is assumed to be valid """
self.__root = root
def set(self, value, create=False):
def set(self, value: str, create: bool = False) -> Optional[str]:
"""Set new dir value, validate and create if needed
Return None when directory is accepted
Return error-string when not accepted, value will not be changed
'create' means try to create (but don't set permanent create flag)
"""
error = None
if value and (value != self.get() or create):
if value is not None and (value != self.get() or create):
value = value.strip()
if self.__validation:
error, value = self.__validation(self.__root, value, super().default())
@@ -249,21 +259,33 @@ class OptionDir(Option):
super().set(value)
return error
def set_create(self, value):
def set_create(self, value: bool):
""" Set auto-creation value """
self.__create = value
def __call__(self) -> str:
""" get() replacement """
return self.get()
class OptionList(Option):
""" List option class """
def __init__(self, section, keyword, default_val=None, validation=None, add=True, protect=False):
self.__validation = validation
def __init__(
self,
section: str,
keyword: str,
default_val: Union[str, List, None] = None,
validation: Optional[Callable] = None,
add: bool = True,
protect: bool = False,
):
self.__validation: Optional[Callable] = validation
if default_val is None:
default_val = []
super().__init__(section, keyword, default_val, add=add, protect=protect)
def set(self, value):
def set(self, value: Union[str, List]) -> Optional[str]:
""" Set the list given a comma-separated string or a list """
error = None
if value is not None:
@@ -271,47 +293,52 @@ class OptionList(Option):
if '"' not in value and "," not in value:
value = value.split()
else:
value = paramfinder.findall(value)
value = RE_PARAMFINDER.findall(value)
if self.__validation:
error, value = self.__validation(value)
if not error:
super().set(value)
return error
def get_string(self):
def get_string(self) -> str:
""" Return the list as a comma-separated string """
lst = self.get()
if isinstance(lst, str):
return lst
else:
return ", ".join(lst)
return ", ".join(self.get())
def default_string(self):
def default_string(self) -> str:
""" Return the default list as a comma-separated string """
lst = self.default()
if isinstance(lst, str):
return lst
else:
return ", ".join(lst)
return ", ".join(self.default())
def __call__(self) -> List[str]:
""" get() replacement """
return self.get()
class OptionStr(Option):
""" String class """
""" String class."""
def __init__(self, section, keyword, default_val="", validation=None, add=True, strip=True, protect=False):
self.__validation = validation
self.__strip = strip
def __init__(
self,
section: str,
keyword: str,
default_val: str = "",
validation: Optional[Callable] = None,
add: bool = True,
strip: bool = True,
protect: bool = False,
):
self.__validation: Optional[Callable] = validation
self.__strip: bool = strip
super().__init__(section, keyword, default_val, add=add, protect=protect)
def get_float(self):
def get_float(self) -> float:
""" Return value converted to a float, allowing KMGT notation """
return sabnzbd.misc.from_units(self.get())
def get_int(self):
def get_int(self) -> int:
""" Return value converted to an int, allowing KMGT notation """
return int(self.get_float())
def set(self, value):
def set(self, value: Any) -> Optional[str]:
""" Set stripped value """
error = None
if isinstance(value, str) and self.__strip:
@@ -323,57 +350,43 @@ class OptionStr(Option):
super().set(value)
return error
def __call__(self) -> str:
""" get() replacement """
return self.get()
class OptionPassword(Option):
""" Password class """
""" Password class. """
def __init__(self, section, keyword, default_val="", add=True):
def __init__(self, section: str, keyword: str, default_val: str = "", add: bool = True):
self.get_string = self.get_stars
super().__init__(section, keyword, default_val, add=add)
def get(self):
def get(self) -> Optional[str]:
""" Return decoded password """
return decode_password(super().get(), self.ident())
def get_stars(self):
""" Return decoded password as asterisk string """
return "*" * len(self.get())
def get_stars(self) -> Optional[str]:
""" Return non-descript asterisk string """
if self.get():
return "*" * 10
return ""
def get_dict(self, safe=False):
def get_dict(self, safe: bool = False) -> Dict[str, str]:
""" Return value a dictionary """
if safe:
return {self.ident()[1]: self.get_stars()}
else:
return {self.ident()[1]: self.get()}
def set(self, pw):
def set(self, pw: str):
""" Set password, encode it """
if (pw is not None and pw == "") or (pw and pw.strip("*")):
super().set(encode_password(pw))
return None
@synchronized(CONFIG_LOCK)
def add_to_database(section, keyword, obj):
""" add object as section/keyword to INI database """
global database
if section not in database:
database[section] = {}
database[section][keyword] = obj
@synchronized(CONFIG_LOCK)
def delete_from_database(section, keyword):
""" Remove section/keyword from INI database """
global database, CFG, modified
del database[section][keyword]
if section == "servers" and "[" in keyword:
keyword = keyword.replace("[", "{").replace("]", "}")
try:
del CFG[section][keyword]
except KeyError:
pass
modified = True
def __call__(self) -> str:
""" get() replacement """
return self.get()
class ConfigServer:
@@ -384,28 +397,28 @@ class ConfigServer:
self.__name = name
name = "servers," + self.__name
self.displayname = OptionStr(name, "displayname", "", add=False)
self.host = OptionStr(name, "host", "", add=False)
self.displayname = OptionStr(name, "displayname", add=False)
self.host = OptionStr(name, "host", add=False)
self.port = OptionNumber(name, "port", 119, 0, 2 ** 16 - 1, add=False)
self.timeout = OptionNumber(name, "timeout", 60, 20, 240, add=False)
self.username = OptionStr(name, "username", "", add=False)
self.password = OptionPassword(name, "password", "", add=False)
self.username = OptionStr(name, "username", add=False)
self.password = OptionPassword(name, "password", add=False)
self.connections = OptionNumber(name, "connections", 1, 0, 100, add=False)
self.ssl = OptionBool(name, "ssl", False, add=False)
# 0=No, 1=Normal, 2=Strict (hostname verification)
self.ssl_verify = OptionNumber(name, "ssl_verify", 2, add=False)
self.ssl_ciphers = OptionStr(name, "ssl_ciphers", "", add=False)
self.ssl_ciphers = OptionStr(name, "ssl_ciphers", add=False)
self.enable = OptionBool(name, "enable", True, add=False)
self.optional = OptionBool(name, "optional", False, add=False)
self.retention = OptionNumber(name, "retention", add=False)
self.retention = OptionNumber(name, "retention", 0, add=False)
self.send_group = OptionBool(name, "send_group", False, add=False)
self.priority = OptionNumber(name, "priority", 0, 0, 99, add=False)
self.notes = OptionStr(name, "notes", "", add=False)
self.notes = OptionStr(name, "notes", add=False)
self.set_dict(values)
add_to_database("servers", self.__name, self)
def set_dict(self, values):
def set_dict(self, values: Dict[str, Any]):
""" Set one or more fields, passed as dictionary """
for kw in (
"displayname",
@@ -427,14 +440,13 @@ class ConfigServer:
):
try:
value = values[kw]
getattr(self, kw).set(value)
except KeyError:
continue
exec("self.%s.set(value)" % kw)
if not self.displayname():
self.displayname.set(self.__name)
return True
if not self.displayname():
self.displayname.set(self.__name)
def get_dict(self, safe=False):
def get_dict(self, safe: bool = False) -> Dict[str, Any]:
""" Return a dictionary with all attributes """
output_dict = {}
output_dict["name"] = self.__name
@@ -463,23 +475,23 @@ class ConfigServer:
""" Remove from database """
delete_from_database("servers", self.__name)
def rename(self, name):
def rename(self, name: str):
""" Give server new display name """
self.displayname.set(name)
def ident(self):
def ident(self) -> Tuple[str, str]:
return "servers", self.__name
class ConfigCat:
""" Class defining a single category """
def __init__(self, name, values):
def __init__(self, name: str, values: Dict[str, Any]):
self.__name = name
name = "categories," + name
self.order = OptionNumber(name, "order", 0, 0, 100, add=False)
self.pp = OptionStr(name, "pp", "", add=False)
self.pp = OptionStr(name, "pp", add=False)
self.script = OptionStr(name, "script", "Default", add=False)
self.dir = OptionDir(name, "dir", add=False, create=False)
self.newzbin = OptionList(name, "newzbin", add=False, validation=validate_single_tag)
@@ -488,17 +500,16 @@ class ConfigCat:
self.set_dict(values)
add_to_database("categories", self.__name, self)
def set_dict(self, values):
def set_dict(self, values: Dict[str, Any]):
""" Set one or more fields, passed as dictionary """
for kw in ("order", "pp", "script", "dir", "newzbin", "priority"):
try:
value = values[kw]
getattr(self, kw).set(value)
except KeyError:
continue
exec("self.%s.set(value)" % kw)
return True
def get_dict(self, safe=False):
def get_dict(self, safe: bool = False) -> Dict[str, Any]:
""" Return a dictionary with all attributes """
output_dict = {}
output_dict["name"] = self.__name
@@ -522,7 +533,7 @@ class OptionFilters(Option):
super().__init__(section, keyword, add=add)
self.set([])
def move(self, current, new):
def move(self, current: int, new: int):
""" Move filter from position 'current' to 'new' """
lst = self.get()
try:
@@ -532,7 +543,7 @@ class OptionFilters(Option):
return
self.set(lst)
def update(self, pos, value):
def update(self, pos: int, value: Tuple):
"""Update filter 'pos' definition, value is a list
Append if 'pos' outside list
"""
@@ -543,7 +554,7 @@ class OptionFilters(Option):
lst.append(value)
self.set(lst)
def delete(self, pos):
def delete(self, pos: int):
""" Remove filter 'pos' """
lst = self.get()
try:
@@ -552,34 +563,27 @@ class OptionFilters(Option):
return
self.set(lst)
def get_dict(self, safe=False):
def get_dict(self, safe: bool = False) -> Dict[str, str]:
""" Return filter list as a dictionary with keys 'filter[0-9]+' """
output_dict = {}
n = 0
for filter_name in self.get():
output_dict["filter" + str(n)] = filter_name
n = n + 1
for n, rss_filter in enumerate(self.get()):
output_dict[f"filter{n}"] = rss_filter
return output_dict
def set_dict(self, values):
def set_dict(self, values: Dict[str, Any]):
""" Create filter list from dictionary with keys 'filter[0-9]+' """
filters = []
# We don't know how many filters there are, so just assume all values are filters
for n in range(len(values)):
kw = "filter%d" % n
val = values.get(kw)
if val is not None:
val = values[kw]
if isinstance(val, list):
filters.append(val)
else:
filters.append(paramfinder.findall(val))
while len(filters[-1]) < 7:
filters[-1].append("1")
if not filters[-1][6]:
filters[-1][6] = "1"
kw = f"filter{n}"
if kw in values:
filters.append(values[kw])
if filters:
self.set(filters)
return True
def __call__(self) -> List[List[str]]:
""" get() replacement """
return self.get()
class ConfigRSS:
@@ -591,7 +595,7 @@ class ConfigRSS:
self.uri = OptionList(name, "uri", add=False)
self.cat = OptionStr(name, "cat", add=False)
self.pp = OptionStr(name, "pp", "", add=False)
self.pp = OptionStr(name, "pp", add=False)
self.script = OptionStr(name, "script", add=False)
self.enable = OptionBool(name, "enable", add=False)
self.priority = OptionNumber(name, "priority", DEFAULT_PRIORITY, DEFAULT_PRIORITY, 2, add=False)
@@ -601,19 +605,17 @@ class ConfigRSS:
self.set_dict(values)
add_to_database("rss", self.__name, self)
def set_dict(self, values):
def set_dict(self, values: Dict[str, Any]):
""" Set one or more fields, passed as dictionary """
for kw in ("uri", "cat", "pp", "script", "priority", "enable"):
try:
value = values[kw]
getattr(self, kw).set(value)
except KeyError:
continue
exec("self.%s.set(value)" % kw)
self.filters.set_dict(values)
return True
def get_dict(self, safe=False):
def get_dict(self, safe: bool = False) -> Dict[str, Any]:
""" Return a dictionary with all attributes """
output_dict = {}
output_dict["name"] = self.__name
@@ -632,10 +634,33 @@ class ConfigRSS:
""" Remove from database """
delete_from_database("rss", self.__name)
def ident(self):
def ident(self) -> Tuple[str, str]:
return "rss", self.__name
@synchronized(CONFIG_LOCK)
def add_to_database(section, keyword, obj):
""" add object as section/keyword to INI database """
global database
if section not in database:
database[section] = {}
database[section][keyword] = obj
@synchronized(CONFIG_LOCK)
def delete_from_database(section, keyword):
""" Remove section/keyword from INI database """
global database, CFG, modified
del database[section][keyword]
if section == "servers" and "[" in keyword:
keyword = keyword.replace("[", "{").replace("]", "}")
try:
del CFG[section][keyword]
except KeyError:
pass
modified = True
def get_dconfig(section, keyword, nested=False):
"""Return a config values dictionary,
Single item or slices based on 'section', 'keyword'
@@ -696,7 +721,7 @@ def set_config(kwargs):
return True
def delete(section, keyword):
def delete(section: str, keyword: str):
""" Delete specific config item """
try:
database[section][keyword].delete()
@@ -778,9 +803,16 @@ def _read_config(path, try_backup=False):
except KeyError:
pass
define_categories()
define_rss()
define_servers()
# Define the special settings
if "categories" in CFG:
for cat in CFG["categories"]:
ConfigCat(cat, CFG["categories"][cat])
if "rss" in CFG:
for rss_feed in CFG["rss"]:
ConfigRSS(rss_feed, CFG["rss"][rss_feed])
if "servers" in CFG:
for server in CFG["servers"]:
ConfigServer(server.replace("{", "[").replace("}", "]"), CFG["servers"][server])
modified = False
return True, ""
@@ -825,13 +857,7 @@ def save_config(force=False):
CFG[sec] = {}
value = database[section][option]()
# bool is a subclass of int, check first
if isinstance(value, bool):
# convert bool to int when saving so we store 0 or 1
CFG[sec][kw] = str(int(value))
elif isinstance(value, int):
CFG[sec][kw] = str(value)
else:
CFG[sec][kw] = value
CFG[sec][kw] = value
res = False
filename = CFG.filename
@@ -872,27 +898,7 @@ def save_config(force=False):
return res
def define_servers():
"""Define servers listed in the Setup file
return a list of ConfigServer instances
"""
global CFG
try:
for server in CFG["servers"]:
svr = CFG["servers"][server]
s = ConfigServer(server.replace("{", "[").replace("}", "]"), svr)
# Conversion of global SSL-Ciphers to server ones
if sabnzbd.cfg.ssl_ciphers():
s.ssl_ciphers.set(sabnzbd.cfg.ssl_ciphers())
except KeyError:
pass
# No longer needed
sabnzbd.cfg.ssl_ciphers.set("")
def get_servers():
def get_servers() -> Dict[str, ConfigServer]:
global database
try:
return database["servers"]
@@ -900,22 +906,9 @@ def get_servers():
return {}
def define_categories():
"""Define categories listed in the Setup file
return a list of ConfigCat instances
"""
global CFG, categories
try:
for cat in CFG["categories"]:
ConfigCat(cat, CFG["categories"][cat])
except KeyError:
pass
def get_categories(cat=0):
def get_categories() -> Dict[str, ConfigCat]:
"""Return link to categories section.
This section will always contain special category '*'
When 'cat' is given, a link to that category or to '*' is returned
"""
global database
if "categories" not in database:
@@ -933,15 +926,19 @@ def get_categories(cat=0):
# Save config for future use
save_config(True)
if not isinstance(cat, int):
try:
cats = cats[cat]
except KeyError:
cats = cats["*"]
return cats
def get_ordered_categories():
def get_category(cat: str = "*") -> ConfigCat:
"""Get one specific category or if not found the default one"""
cats = get_categories()
try:
return cats[cat]
except KeyError:
return cats["*"]
def get_ordered_categories() -> List[Dict]:
"""Return list-copy of categories section that's ordered
by user's ordering including Default-category
"""
@@ -960,22 +957,10 @@ def get_ordered_categories():
return categories
def define_rss():
"""Define rss-feeds listed in the Setup file
return a list of ConfigRSS instances
"""
global CFG
try:
for r in CFG["rss"]:
ConfigRSS(r, CFG["rss"][r])
except KeyError:
pass
def get_rss():
def get_rss() -> Dict[str, ConfigRSS]:
global database
try:
# We have to remove non-seperator commas by detecting if they are valid URL's
# We have to remove non-separator commas by detecting if they are valid URL's
for feed_key in database["rss"]:
feed = database["rss"][feed_key]
# Only modify if we have to, to prevent repeated config-saving
@@ -1102,12 +1087,8 @@ def validate_no_unc(root, value, default):
def validate_safedir(root, value, default):
"""Allow only when queues are empty and no UNC
On Windows path should be small
"""
if sabnzbd.WIN32 and value and len(real_path(root, value)) >= MAX_WIN_DFOLDER:
return T("Error: Path length should be below %s.") % MAX_WIN_DFOLDER, None
if sabnzbd.empty_queues():
"""Allow only when queues are empty and no UNC"""
if not sabnzbd.__INITIALIZED__ or (sabnzbd.PostProcessor.empty() and sabnzbd.NzbQueue.is_empty()):
return validate_no_unc(root, value, default)
else:
return T("Error: Queue not empty, cannot change folder."), None
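The `set_dict` hunks above replace `exec("self.%s.set(value)" % kw)` with `getattr(self, kw).set(value)`: the same attribute dispatch without compiling code from a string. A minimal sketch of the pattern (class and option names are illustrative):

```python
class Option:
    def __init__(self, default):
        self.value = default

    def set(self, value):
        self.value = value

class ConfigServer:
    def __init__(self):
        self.host = Option("")
        self.port = Option(119)

    def set_dict(self, values):
        for kw in ("host", "port"):
            try:
                # Look the option object up by name; no exec() needed
                getattr(self, kw).set(values[kw])
            except KeyError:
                continue

server = ConfigServer()
server.set_dict({"host": "news.example.org"})
print(server.host.value, server.port.value)  # -> news.example.org 119
```

Besides being faster and easier to read, `getattr` dispatch keeps untrusted dictionary keys out of anything resembling code evaluation.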


@@ -24,7 +24,7 @@ CONFIG_VERSION = 19
QUEUE_VERSION = 10
POSTPROC_QUEUE_VERSION = 2
REC_RAR_VERSION = 500
REC_RAR_VERSION = 550
PNFO = namedtuple(
"PNFO",
@@ -52,6 +52,7 @@ QUEUE_FILE_NAME = QUEUE_FILE_TMPL % QUEUE_VERSION
POSTPROC_QUEUE_FILE_NAME = "postproc%s.sab" % POSTPROC_QUEUE_VERSION
RSS_FILE_NAME = "rss_data.sab"
SCAN_FILE_NAME = "watched_data2.sab"
RATING_FILE_NAME = "Rating.sab"
FUTURE_Q_FOLDER = "future"
JOB_ADMIN = "__ADMIN__"
VERIFIED_FILE = "__verified__"
@@ -87,7 +88,6 @@ DEF_ARTICLE_CACHE_MAX = "1G"
DEF_TIMEOUT = 60
DEF_SCANRATE = 5
MAX_WARNINGS = 20
- MAX_WIN_DFOLDER = 60
MAX_BAD_ARTICLES = 5
# Constants affecting download performance
@@ -123,6 +123,8 @@ CHEETAH_DIRECTIVES = {"directiveStartToken": "<!--#", "directiveEndToken": "#-->
IGNORED_FOLDERS = ("@eaDir", ".appleDouble")
LOCALHOSTS = ("localhost", "127.0.0.1", "[::1]", "::1")
# (MATCHER, [EXTRA, MATCHERS])
series_match = [
(compile(r"( [sS]|[\d]+)x(\d+)"), [compile(r"^[-\.]+([sS]|[\d])+x(\d+)"), compile(r"^[-\.](\d+)")]), # 1x01
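The first matcher in `series_match` can be exercised on its own; a minimal sketch using Python's `re` module (the sample filename is illustrative):

```python
import re

# First matcher from series_match above: catches "1x01"-style season/episode markers
season_episode = re.compile(r"( [sS]|[\d]+)x(\d+)")

m = season_episode.search("Show.Name.2x14.Some.Title")
assert m is not None
season, episode = m.group(1), m.group(2)  # "2" and "14" for this sample
```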


@@ -26,6 +26,7 @@ import logging
import sys
import threading
import sqlite3
from typing import Union, Dict
import sabnzbd
import sabnzbd.cfg
@@ -84,7 +85,7 @@ class HistoryDB:
""" Create a connection to the database """
create_table = not os.path.exists(HistoryDB.db_path)
self.con = sqlite3.connect(HistoryDB.db_path)
- self.con.row_factory = dict_factory
+ self.con.row_factory = sqlite3.Row
self.c = self.con.cursor()
if create_table:
self.create_history_db()
@@ -98,7 +99,7 @@ class HistoryDB:
self.execute("PRAGMA user_version;")
try:
version = self.c.fetchone()["user_version"]
- except TypeError:
+ except IndexError:
version = 0
if version < 1:
# Add any missing columns added since first DB version
@@ -220,7 +221,7 @@ class HistoryDB:
"""SELECT path FROM history WHERE name LIKE ? AND status = ?""", (search, Status.FAILED)
)
if fetch_ok:
return [item.get("path") for item in self.c.fetchall()]
return [item["path"] for item in self.c.fetchall()]
else:
return []
@@ -309,8 +310,8 @@ class HistoryDB:
total_items = -1
if res:
try:
- total_items = self.c.fetchone().get("COUNT(*)")
- except AttributeError:
+ total_items = self.c.fetchone()["COUNT(*)"]
+ except IndexError:
pass
if not start:
@@ -346,8 +347,8 @@ class HistoryDB:
)
if res:
try:
- total = self.c.fetchone().get("COUNT(*)")
- except AttributeError:
+ total = self.c.fetchone()["COUNT(*)"]
+ except IndexError:
pass
return total > 0
@@ -360,8 +361,8 @@ class HistoryDB:
)
if res:
try:
- total = self.c.fetchone().get("COUNT(*)")
- except AttributeError:
+ total = self.c.fetchone()["COUNT(*)"]
+ except IndexError:
pass
return total > 0
@@ -373,8 +374,8 @@ class HistoryDB:
total = 0
if self.execute("""SELECT sum(bytes) FROM history"""):
try:
- total = self.c.fetchone().get("sum(bytes)")
- except AttributeError:
+ total = self.c.fetchone()["sum(bytes)"]
+ except IndexError:
pass
# Amount downloaded this month
@@ -385,8 +386,8 @@ class HistoryDB:
month = 0
if self.execute("""SELECT sum(bytes) FROM history WHERE completed > ?""", (month_timest,)):
try:
- month = self.c.fetchone().get("sum(bytes)")
- except AttributeError:
+ month = self.c.fetchone()["sum(bytes)"]
+ except IndexError:
pass
# Amount downloaded this week
@@ -395,8 +396,8 @@ class HistoryDB:
week = 0
if self.execute("""SELECT sum(bytes) FROM history WHERE completed > ?""", (week_timest,)):
try:
- week = self.c.fetchone().get("sum(bytes)")
- except AttributeError:
+ week = self.c.fetchone()["sum(bytes)"]
+ except IndexError:
pass
return total, month, week
@@ -407,7 +408,7 @@ class HistoryDB:
t = (nzo_id,)
if self.execute("""SELECT script_log FROM history WHERE nzo_id = ?""", t):
try:
- data = ubtou(zlib.decompress(self.c.fetchone().get("script_log")))
+ data = ubtou(zlib.decompress(self.c.fetchone()["script_log"]))
except:
pass
return data
@@ -418,8 +419,8 @@ class HistoryDB:
name = ""
if self.execute("""SELECT name FROM history WHERE nzo_id = ?""", t):
try:
- name = self.c.fetchone().get("name")
- except AttributeError:
+ name = self.c.fetchone()["name"]
+ except IndexError:
pass
return name
@@ -429,7 +430,7 @@ class HistoryDB:
path = ""
if self.execute("""SELECT path FROM history WHERE nzo_id = ?""", t):
try:
- path = self.c.fetchone().get("path")
+ path = self.c.fetchone()["path"]
except AttributeError:
pass
if os.path.exists(path):
@@ -441,24 +442,19 @@ class HistoryDB:
t = (nzo_id,)
if self.execute("""SELECT * FROM history WHERE nzo_id = ?""", t):
try:
- items = self.c.fetchone()
- dtype = items.get("report")
- url = items.get("url")
- pp = items.get("pp")
- script = items.get("script")
- cat = items.get("category")
- return dtype, url, pp, script, cat
+ item = self.c.fetchone()
+ return item["report"], item["url"], item["pp"], item["script"], item["category"]
except (AttributeError, IndexError):
pass
return "", "", "", "", ""
+ def __enter__(self):
+ """ For context manager support """
+ return self
- def dict_factory(cursor, row):
- """ Return a dictionary for the current database position """
- d = {}
- for idx, col in enumerate(cursor.description):
- d[col[0]] = row[idx]
- return d
+ def __exit__(self, exc_type, exc_val, exc_tb):
+ """ For context manager support, ignore any exception """
+ self.close()
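The switch from the hand-rolled `dict_factory` to `sqlite3.Row` also explains the changed except-clauses in this file: `sqlite3.Row` supports dict-style key access, but a missing key raises `IndexError` rather than the `AttributeError`/`TypeError` paths the old `.get()` code relied on. A minimal sketch:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.row_factory = sqlite3.Row  # replaces the hand-rolled dict_factory
c = con.cursor()
c.execute("SELECT 42 AS answer")
row = c.fetchone()

value = row["answer"]  # dict-style access on sqlite3.Row

# A missing key raises IndexError, hence the switched except-clauses
missing_raises_indexerror = False
try:
    row["missing"]
except IndexError:
    missing_raises_indexerror = True
```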
_PP_LOOKUP = {0: "", 1: "R", 2: "U", 3: "D"}
@@ -506,7 +502,7 @@ def build_history_info(nzo, workdir_complete="", postproc_time=0, script_output=
nzo.status,
nzo.nzo_id,
clip_path(workdir_complete),
- clip_path(nzo.downpath),
+ clip_path(nzo.download_path),
script_output,
script_line,
download_time,
@@ -522,44 +518,42 @@ def build_history_info(nzo, workdir_complete="", postproc_time=0, script_output=
)
- def unpack_history_info(item):
+ def unpack_history_info(item: Union[Dict, sqlite3.Row]):
"""Expands the single line stage_log from the DB
into a python dictionary for use in the history display
"""
+ # Convert result to dictionary
+ if isinstance(item, sqlite3.Row):
+ item = dict(item)
# Stage Name is separated by ::: stage lines by ; and stages by \r\n
lst = item["stage_log"]
if lst:
+ parsed_stage_log = []
try:
lines = lst.split("\r\n")
all_stages_lines = lst.split("\r\n")
except:
logging.error(T("Invalid stage logging in history for %s") + " (\\r\\n)", item["name"])
logging.error(T("Invalid stage logging in history for %s"), item["name"])
logging.debug("Lines: %s", lst)
- lines = []
- lst = [None for _ in STAGES]
- for line in lines:
- stage = {}
+ all_stages_lines = []
+ for stage_lines in all_stages_lines:
try:
key, logs = line.split(":::")
key, logs = stage_lines.split(":::")
except:
- logging.debug('Missing key:::logs "%s"', line)
- key = line
- logs = ""
- stage["name"] = key
- stage["actions"] = []
+ logging.info('Missing key:::logs "%s"', stage_lines)
+ continue
+ stage = {"name": key, "actions": []}
try:
logs = logs.split(";")
stage["actions"] = logs.split(";")
except:
logging.error(T("Invalid stage logging in history for %s") + " (;)", item["name"])
logging.error(T("Invalid stage logging in history for %s"), item["name"])
logging.debug("Logs: %s", logs)
- logs = []
- for log in logs:
- stage["actions"].append(log)
- try:
- lst[STAGES[key]] = stage
- except KeyError:
- lst.append(stage)
- # Remove unused stages
- item["stage_log"] = [x for x in lst if x is not None]
+ parsed_stage_log.append(stage)
+ # Sort it so it is more logical
+ parsed_stage_log.sort(key=lambda stage_log: STAGES.get(stage_log["name"], 100))
+ item["stage_log"] = parsed_stage_log
if item["script_log"]:
item["script_log"] = ""
@@ -571,6 +565,5 @@ def unpack_history_info(item):
def midnight_history_purge():
logging.info("Scheduled history purge")
- history_db = HistoryDB()
- history_db.auto_history_purge()
- history_db.close()
+ with HistoryDB() as history_db:
+ history_db.auto_history_purge()
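The `with HistoryDB()` form relies on the `__enter__`/`__exit__` pair added to the class; a generic sketch of that pattern (`Resource` is a stand-in, not a SABnzbd class):

```python
class Resource:
    """Stand-in illustrating the context-manager protocol HistoryDB adopts."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

    def __enter__(self):
        # The returned object is what "with ... as name" binds
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Runs even if the with-body raises; returning a falsy value re-raises
        self.close()

with Resource() as res:
    assert not res.closed
# close() has been called automatically on exit
```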

View File

@@ -23,13 +23,12 @@ import logging
import hashlib
import queue
from threading import Thread
from typing import Tuple, List, Optional
import sabnzbd
from sabnzbd.constants import SABYENC_VERSION_REQUIRED
from sabnzbd.articlecache import ArticleCache
from sabnzbd.downloader import Downloader
from sabnzbd.nzbqueue import NzbQueue
import sabnzbd.cfg as cfg
from sabnzbd.constants import SABYENC_VERSION_REQUIRED
from sabnzbd.nzbstuff import Article
from sabnzbd.misc import match_str
# Check for correct SABYenc version
@@ -62,8 +61,6 @@ class BadYenc(Exception):
class Decoder:
""" Implement thread-like coordinator for the decoders """
- do = None
def __init__(self):
logging.debug("Initializing decoders")
# Initialize queue and servers
@@ -73,13 +70,12 @@ class Decoder:
self.decoder_workers = []
for i in range(cfg.num_decoders()):
self.decoder_workers.append(DecoderWorker(self.decoder_queue))
- Decoder.do = self
def start(self):
for decoder_worker in self.decoder_workers:
decoder_worker.start()
def is_alive(self):
def is_alive(self) -> bool:
# Check all workers
for decoder_worker in self.decoder_workers:
if not decoder_worker.is_alive():
@@ -89,7 +85,7 @@ class Decoder:
def stop(self):
# Put multiple to stop all decoders
for _ in self.decoder_workers:
- self.decoder_queue.put(None)
+ self.decoder_queue.put((None, None))
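Putting one `(None, None)` tuple per worker is the classic sentinel-shutdown pattern: each worker unpacks the tuple, sees a falsy article, and breaks exactly once. A self-contained sketch with illustrative names (`work_queue`, `worker`):

```python
import queue
import threading

work_queue = queue.Queue()
results = []

def worker():
    while True:
        article, raw_data = work_queue.get()
        if not article:
            # Sentinel received: this worker stops, others keep running
            break
        results.append(article)

workers = [threading.Thread(target=worker) for _ in range(2)]
for w in workers:
    w.start()

work_queue.put(("a1", b"data"))
for _ in workers:
    work_queue.put((None, None))  # one sentinel per worker
for w in workers:
    w.join()
```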
def join(self):
# Wait for all decoders to finish
@@ -99,14 +95,14 @@ class Decoder:
except:
pass
- def process(self, article, raw_data):
+ def process(self, article: Article, raw_data: List[bytes]):
# We use reported article-size, just like sabyenc does
- ArticleCache.do.reserve_space(article.bytes)
+ sabnzbd.ArticleCache.reserve_space(article.bytes)
self.decoder_queue.put((article, raw_data))
- def queue_full(self):
+ def queue_full(self) -> bool:
# Check if the queue size exceeds the limits
- return self.decoder_queue.qsize() >= ArticleCache.do.decoder_cache_article_limit
+ return self.decoder_queue.qsize() >= sabnzbd.ArticleCache.decoder_cache_article_limit
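`queue_full()` is the backpressure signal the downloader polls before feeding more articles into the decoder. A toy sketch with an illustrative limit (the real limit is derived from the article cache):

```python
import queue

decoder_queue = queue.Queue()
DECODER_CACHE_ARTICLE_LIMIT = 2  # illustrative; the real value is cache-derived

def queue_full() -> bool:
    # qsize() is approximate under concurrency, but good enough for throttling
    return decoder_queue.qsize() >= DECODER_CACHE_ARTICLE_LIMIT

empty_state = queue_full()  # False while the queue is empty
decoder_queue.put(("article-1", [b"raw"]))
decoder_queue.put(("article-2", [b"raw"]))
full_state = queue_full()   # True once the limit is reached
```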
class DecoderWorker(Thread):
@@ -116,30 +112,25 @@ class DecoderWorker(Thread):
Thread.__init__(self)
logging.debug("Initializing decoder %s", self.name)
- self.decoder_queue = decoder_queue
- def stop(self):
- # Put multiple to stop all decoders
- self.decoder_queue.put(None)
- self.decoder_queue.put(None)
+ self.decoder_queue: queue.Queue[Tuple[Optional[Article], Optional[List[bytes]]]] = decoder_queue
def run(self):
while 1:
# Let's get to work!
+ # Set Article and NzbObject objects to None so references from this
+ # thread do not keep the parent objects alive (see #1628)
+ decoded_data = raw_data = article = nzo = None
- art_tup = self.decoder_queue.get()
- if not art_tup:
+ article, raw_data = self.decoder_queue.get()
+ if not article:
logging.info("Shutting down decoder %s", self.name)
break
- article, raw_data = art_tup
nzo = article.nzf.nzo
art_id = article.article
# Free space in the decoder-queue
ArticleCache.do.free_reserved_space(article.bytes)
sabnzbd.ArticleCache.free_reserved_space(article.bytes)
# Keeping track
decoded_data = None
article_success = False
try:
@@ -155,12 +146,12 @@ class DecoderWorker(Thread):
except MemoryError:
logging.warning(T("Decoder failure: Out of memory"))
logging.info("Decoder-Queue: %d", self.decoder_queue.qsize())
logging.info("Cache: %d, %d, %d", *ArticleCache.do.cache_info())
logging.info("Cache: %d, %d, %d", *sabnzbd.ArticleCache.cache_info())
logging.info("Traceback: ", exc_info=True)
Downloader.do.pause()
sabnzbd.Downloader.pause()
# This article should be fetched again
NzbQueue.do.reset_try_lists(article)
sabnzbd.NzbQueue.reset_try_lists(article)
continue
except CrcError:
@@ -193,7 +184,7 @@ class DecoderWorker(Thread):
logme = T("UUencode detected, only yEnc encoding is supported [%s]") % nzo.final_name
logging.error(logme)
nzo.fail_msg = logme
NzbQueue.do.end_job(nzo)
sabnzbd.NzbQueue.end_job(nzo)
break
# Pre-check, proper article found so just register
@@ -219,12 +210,12 @@ class DecoderWorker(Thread):
if decoded_data:
# If the data needs to be written to disk due to full cache, this will be slow
# Causing the decoder-queue to fill up and delay the downloader
ArticleCache.do.save_article(article, decoded_data)
sabnzbd.ArticleCache.save_article(article, decoded_data)
NzbQueue.do.register_article(article, article_success)
sabnzbd.NzbQueue.register_article(article, article_success)
- def decode(article, raw_data):
+ def decode(article: Article, raw_data: List[bytes]) -> bytes:
# Let SABYenc do all the heavy lifting
decoded_data, yenc_filename, crc, crc_expected, crc_correct = sabyenc3.decode_usenet_chunks(raw_data, article.bytes)
@@ -251,7 +242,7 @@ def decode(article, raw_data):
return decoded_data
- def search_new_server(article):
+ def search_new_server(article: Article) -> bool:
""" Shorthand for searching new server or else increasing bad_articles """
# Continue to the next one if we found new server
if not article.search_new_server():


@@ -30,11 +30,10 @@ Based on work by P1nGu1n
import hashlib
import logging
import math
import os
import re
- from sabnzbd.filesystem import get_unique_filename, globber_full, renamer, get_ext
+ from sabnzbd.filesystem import get_unique_filename, renamer, get_ext
from sabnzbd.par2file import is_parfile, parse_par2_file
# Files to exclude and minimal file size for renaming
@@ -81,12 +80,20 @@ def is_probably_obfuscated(myinputfilename):
path, filename = os.path.split(myinputfilename)
filebasename, fileextension = os.path.splitext(filename)
# First fixed patterns that we know of:
# ...blabla.H.264/b082fa0beaa644d3aa01045d5b8d0b36.mkv is certainly obfuscated
if re.findall("^[a-f0-9]{32}$", filebasename):
if re.findall(r"^[a-f0-9]{32}$", filebasename):
logging.debug("Obfuscated: 32 hex digit")
# exactly 32 hex digits, so:
return True
+ # /some/thing/abc.xyz.a4c567edbcbf27.BLA is certainly obfuscated
+ if re.findall(r"^abc\.xyz", filebasename):
+ logging.debug("Obfuscated: starts with 'abc.xyz'")
+ # ... which we consider as obfuscated:
+ return True
# these are signals for the obfuscation versus non-obfuscation
decimals = sum(1 for c in filebasename if c.isnumeric())
upperchars = sum(1 for c in filebasename if c.isupper())
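The two fixed patterns above can be isolated into a small helper; this sketch covers just those checks, not the full `is_probably_obfuscated` heuristic (which also weighs digit/uppercase counts), and `looks_obfuscated` is an illustrative name:

```python
import re

def looks_obfuscated(filebasename: str) -> bool:
    # Fixed pattern 1: a bare 32-digit hex name is certainly obfuscated
    if re.findall(r"^[a-f0-9]{32}$", filebasename):
        return True
    # Fixed pattern 2: the known "abc.xyz" prefix is certainly obfuscated
    if re.findall(r"^abc\.xyz", filebasename):
        return True
    return False

ex_hex = looks_obfuscated("b082fa0beaa644d3aa01045d5b8d0b36")  # True
ex_abc = looks_obfuscated("abc.xyz.a4c567edbcbf27.BLA")        # True
ex_plain = looks_obfuscated("Some.Show.S01E01")                # False
```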


@@ -21,14 +21,17 @@ sabnzbd.directunpacker
import os
import re
import subprocess
import time
import threading
import logging
from typing import Optional, Dict, Tuple, List
import sabnzbd
import sabnzbd.cfg as cfg
- from sabnzbd.misc import int_conv, format_time_string, build_and_run_command
- from sabnzbd.filesystem import clip_path, long_path, remove_all, real_path, remove_file
+ from sabnzbd.misc import format_time_string, build_and_run_command
+ from sabnzbd.filesystem import long_path, remove_all, real_path, remove_file, analyze_rar_filename
from sabnzbd.nzbstuff import NzbObject, NzbFile
from sabnzbd.encoding import platform_btou
from sabnzbd.decorators import synchronized
from sabnzbd.newsunpack import EXTRACTFROM_RE, EXTRACTED_RE, rar_volumelist
@@ -42,26 +45,24 @@ START_STOP_LOCK = threading.RLock()
ACTIVE_UNPACKERS = []
RAR_NR = re.compile(r"(.*?)(\.part(\d*).rar|\.r(\d*))$", re.IGNORECASE)
class DirectUnpacker(threading.Thread):
- def __init__(self, nzo):
+ def __init__(self, nzo: NzbObject):
threading.Thread.__init__(self)
- self.nzo = nzo
- self.active_instance = None
+ self.nzo: NzbObject = nzo
+ self.active_instance: Optional[subprocess.Popen] = None
self.killed = False
self.next_file_lock = threading.Condition(threading.RLock())
self.unpack_dir_info = None
- self.rarfile_nzf = None
+ self.rarfile_nzf: Optional[NzbFile] = None
self.cur_setname = None
self.cur_volume = 0
self.total_volumes = {}
self.unpack_time = 0.0
- self.success_sets = {}
+ self.success_sets: Dict[str, Tuple[List[str], List[str]]] = {}
self.next_sets = []
self.duplicate_lines = 0
@@ -94,6 +95,7 @@ class DirectUnpacker(threading.Thread):
if (
not cfg.direct_unpack()
or self.killed
+ or self.nzo.first_articles
or not self.nzo.unpack
or self.nzo.bad_articles
or sabnzbd.newsunpack.RAR_PROBLEM
@@ -121,12 +123,12 @@ class DirectUnpacker(threading.Thread):
self.total_volumes = {}
@synchronized(START_STOP_LOCK)
- def add(self, nzf):
+ def add(self, nzf: NzbFile):
""" Add jobs and start instance of DirectUnpack """
if not cfg.direct_unpack_tested():
test_disk_performance()
- # Stop if something is wrong
+ # Stop if something is wrong or we shouldn't start yet
if not self.check_requirements():
return
@@ -161,8 +163,9 @@ class DirectUnpacker(threading.Thread):
def run(self):
# Input and output
linebuf = ""
last_volume_linebuf = ""
linebuf = b""
linebuf_encoded = ""
last_volume_linebuf = b""
unrar_log = []
rarfiles = []
extracted = []
@@ -174,99 +177,107 @@ class DirectUnpacker(threading.Thread):
with START_STOP_LOCK:
if not self.active_instance or not self.active_instance.stdout:
break
- char = platform_btou(self.active_instance.stdout.read(1))
+ char = self.active_instance.stdout.read(1)
if not char:
# End of program
break
linebuf += char
# Error? Let PP-handle it
if linebuf.endswith(
(
"ERROR: ",
"Cannot create",
"in the encrypted file",
"CRC failed",
"checksum failed",
"You need to start extraction from a previous volume",
"password is incorrect",
"Incorrect password",
"Write error",
"checksum error",
"Cannot open",
"start extraction from a previous volume",
"Unexpected end of archive",
)
):
logging.info("Error in DirectUnpack of %s: %s", self.cur_setname, linebuf.strip())
self.abort()
# Continue if it's not a space or end of line
if char not in (b" ", b"\n"):
continue
if linebuf.endswith("\n"):
# List files we used
if linebuf.startswith("Extracting from"):
filename = re.search(EXTRACTFROM_RE, linebuf.strip()).group(1)
# Handle whole lines
if char == b"\n":
# When reaching end-of-line, we can safely convert and add to the log
linebuf_encoded = platform_btou(linebuf.strip())
unrar_log.append(linebuf_encoded)
linebuf = b""
# Error? Let PP-handle this job
if any(
error_text in linebuf_encoded
for error_text in (
"ERROR: ",
"Cannot create",
"in the encrypted file",
"CRC failed",
"checksum failed",
"You need to start extraction from a previous volume",
"password is incorrect",
"Incorrect password",
"Write error",
"checksum error",
"Cannot open",
"start extraction from a previous volume",
"Unexpected end of archive",
)
):
logging.info("Error in DirectUnpack of %s: %s", self.cur_setname, platform_btou(linebuf.strip()))
self.abort()
elif linebuf_encoded.startswith("All OK"):
# Did we reach the end?
# Stop timer and finish
self.unpack_time += time.time() - start_time
ACTIVE_UNPACKERS.remove(self)
# Add to success
rarfile_path = os.path.join(self.nzo.download_path, self.rarfile_nzf.filename)
self.success_sets[self.cur_setname] = (
rar_volumelist(rarfile_path, self.nzo.password, rarfiles),
extracted,
)
logging.info("DirectUnpack completed for %s", self.cur_setname)
self.nzo.set_action_line(T("Direct Unpack"), T("Completed"))
# List success in history-info
msg = T("Unpacked %s files/folders in %s") % (len(extracted), format_time_string(self.unpack_time))
msg = "%s - %s" % (T("Direct Unpack"), msg)
self.nzo.set_unpack_info("Unpack", msg, self.cur_setname)
# Write current log and clear
logging.debug("DirectUnpack Unrar output %s", "\n".join(unrar_log))
unrar_log = []
rarfiles = []
extracted = []
# Are there more files left?
while self.nzo.files and not self.next_sets:
with self.next_file_lock:
self.next_file_lock.wait()
# Is there another set to do?
if self.next_sets:
# Start new instance
nzf = self.next_sets.pop(0)
self.reset_active()
self.cur_setname = nzf.setname
# Wait for the 1st volume to appear
self.wait_for_next_volume()
self.create_unrar_instance()
start_time = time.time()
else:
self.killed = True
break
elif linebuf_encoded.startswith("Extracting from"):
# List files we used
filename = re.search(EXTRACTFROM_RE, linebuf_encoded).group(1)
if filename not in rarfiles:
rarfiles.append(filename)
# List files we extracted
m = re.search(EXTRACTED_RE, linebuf)
if m:
# In case of flat-unpack, UnRar still prints the whole path (?!)
unpacked_file = m.group(2)
if cfg.flat_unpack():
unpacked_file = os.path.basename(unpacked_file)
extracted.append(real_path(self.unpack_dir_info[0], unpacked_file))
# Did we reach the end?
if linebuf.endswith("All OK"):
# Stop timer and finish
self.unpack_time += time.time() - start_time
ACTIVE_UNPACKERS.remove(self)
# Add to success
rarfile_path = os.path.join(self.nzo.downpath, self.rarfile_nzf.filename)
self.success_sets[self.cur_setname] = (
rar_volumelist(rarfile_path, self.nzo.password, rarfiles),
extracted,
)
logging.info("DirectUnpack completed for %s", self.cur_setname)
self.nzo.set_action_line(T("Direct Unpack"), T("Completed"))
# List success in history-info
msg = T("Unpacked %s files/folders in %s") % (len(extracted), format_time_string(self.unpack_time))
msg = "%s - %s" % (T("Direct Unpack"), msg)
self.nzo.set_unpack_info("Unpack", msg, self.cur_setname)
# Write current log and clear
unrar_log.append(linebuf.strip())
linebuf = ""
last_volume_linebuf = ""
logging.debug("DirectUnpack Unrar output %s", "\n".join(unrar_log))
unrar_log = []
rarfiles = []
extracted = []
# Are there more files left?
while self.nzo.files and not self.next_sets:
with self.next_file_lock:
self.next_file_lock.wait()
# Is there another set to do?
if self.next_sets:
# Start new instance
nzf = self.next_sets.pop(0)
self.reset_active()
self.cur_setname = nzf.setname
# Wait for the 1st volume to appear
self.wait_for_next_volume()
self.create_unrar_instance()
start_time = time.time()
else:
self.killed = True
break
# List files we extracted
m = re.search(EXTRACTED_RE, linebuf_encoded)
if m:
# In case of flat-unpack, UnRar still prints the whole path (?!)
unpacked_file = m.group(2)
if cfg.flat_unpack():
unpacked_file = os.path.basename(unpacked_file)
extracted.append(real_path(self.unpack_dir_info[0], unpacked_file))
if linebuf.endswith("[C]ontinue, [Q]uit "):
if linebuf.endswith(b"[C]ontinue, [Q]uit "):
# Stop timer
self.unpack_time += time.time() - start_time
@@ -299,20 +310,16 @@ class DirectUnpacker(threading.Thread):
logging.info("DirectUnpack failed due to missing files %s", self.cur_setname)
self.abort()
else:
- logging.debug('Duplicate output line detected: "%s"', last_volume_linebuf)
+ logging.debug('Duplicate output line detected: "%s"', platform_btou(last_volume_linebuf))
self.duplicate_lines += 1
else:
self.duplicate_lines = 0
last_volume_linebuf = linebuf
- # Show the log
- if linebuf.endswith("\n"):
- unrar_log.append(linebuf.strip())
- linebuf = ""
- # Add last line
- unrar_log.append(linebuf.strip())
- logging.debug("DirectUnpack Unrar output %s", "\n".join(unrar_log))
+ # Add last line and write any new output
+ if linebuf:
+ unrar_log.append(platform_btou(linebuf.strip()))
+ logging.debug("DirectUnpack Unrar output %s", "\n".join(unrar_log))
# Make more space
self.reset_active()
@@ -375,29 +382,32 @@ class DirectUnpacker(threading.Thread):
return
# Generate command
- rarfile_path = os.path.join(self.nzo.downpath, self.rarfile_nzf.filename)
+ rarfile_path = os.path.join(self.nzo.download_path, self.rarfile_nzf.filename)
if sabnzbd.WIN32:
- # For Unrar to support long-path, we need to cricumvent Python's list2cmdline
+ # For Unrar to support long-path, we need to circumvent Python's list2cmdline
# See: https://github.com/sabnzbd/sabnzbd/issues/1043
# The -scf forces the output to be UTF8
command = [
"%s" % sabnzbd.newsunpack.RAR_COMMAND,
action,
"-vp",
"-idp",
"-scf",
"-o+",
"-ai",
password_command,
"%s" % clip_path(rarfile_path),
rarfile_path,
"%s\\" % long_path(extraction_path),
]
else:
# Don't use "-ai" (not needed for non-Windows)
# The -scf forces the output to be UTF8
command = [
"%s" % sabnzbd.newsunpack.RAR_COMMAND,
action,
"-vp",
"-idp",
"-scf",
"-o+",
password_command,
"%s" % rarfile_path,
@@ -462,7 +472,7 @@ class DirectUnpacker(threading.Thread):
# RarFile can fail for mysterious reasons
try:
rar_contents = RarFile(
- os.path.join(self.nzo.downpath, rarfile_nzf.filename), single_file_check=True
+ os.path.join(self.nzo.download_path, rarfile_nzf.filename), single_file_check=True
).filelist()
for rm_file in rar_contents:
# Flat-unpack, so remove foldername from RarFile output
@@ -491,23 +501,6 @@ class DirectUnpacker(threading.Thread):
return self.cur_volume
def analyze_rar_filename(filename):
"""Extract volume number and setname from rar-filenames
Both ".part01.rar" and ".r01"
"""
m = RAR_NR.search(filename)
if m:
if m.group(4):
# Special since starts with ".rar", ".r00"
return m.group(1), int_conv(m.group(4)) + 2
return m.group(1), int_conv(m.group(3))
else:
# Detect if first of "rxx" set
if filename.endswith(".rar"):
return os.path.splitext(filename)[0], 1
return None, None
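The removed function (now imported from `sabnzbd.filesystem`, per the import hunk above) is self-contained enough to exercise directly; a sketch using plain `int()` with an `or 0` fallback in place of SABnzbd's `int_conv`:

```python
import os
import re

RAR_NR = re.compile(r"(.*?)(\.part(\d*).rar|\.r(\d*))$", re.IGNORECASE)

def analyze_rar_filename(filename):
    """Extract setname and volume number from ".partNN.rar" / ".rNN" names."""
    m = RAR_NR.search(filename)
    if m:
        if m.group(4):
            # ".rNN" style: ".rar" is volume 1, ".r00" volume 2, and so on
            return m.group(1), int(m.group(4) or 0) + 2
        return m.group(1), int(m.group(3) or 0)
    if filename.endswith(".rar"):
        # Plain ".rar" is the first volume of an "rNN" set
        return os.path.splitext(filename)[0], 1
    return None, None
```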
def abort_all():
""" Abort all running DirectUnpackers """
logging.info("Aborting all DirectUnpackers")


@@ -46,7 +46,7 @@ def compare_stat_tuple(tup1, tup2):
def clean_file_list(inp_list, folder, files):
""" Remove elements of "inp_list" not found in "files" """
- for path in sorted(inp_list.keys()):
+ for path in sorted(inp_list):
fld, name = os.path.split(path)
if fld == folder:
present = False
@@ -65,8 +65,6 @@ class DirScanner(threading.Thread):
subsequent scans, unless changed.
"""
- do = None # Access to instance of DirScanner
def __init__(self):
threading.Thread.__init__(self)
@@ -89,7 +87,6 @@ class DirScanner(threading.Thread):
self.trigger = False
cfg.dirscan_dir.callback(self.newdir)
cfg.dirscan_speed.callback(self.newspeed)
- DirScanner.do = self
def newdir(self):
""" We're notified of a dir change """
@@ -105,7 +102,6 @@ class DirScanner(threading.Thread):
def stop(self):
""" Stop the dir scanner """
logging.info("Dirscanner shutting down")
self.shutdown = True
def save(self):
@@ -213,9 +209,3 @@ class DirScanner(threading.Thread):
if os.path.isdir(dpath) and dd.lower() in cats:
run_dir(dpath, dd.lower())
self.busy = False
- def dirscan():
- """ Wrapper required for scheduler """
- logging.info("Scheduled or manual watched folder scan")
- DirScanner.do.scan()


@@ -27,15 +27,14 @@ from nntplib import NNTPPermanentError
import socket
import random
import sys
from typing import List, Dict
import sabnzbd
from sabnzbd.decorators import synchronized, NzbQueueLocker, DOWNLOADER_CV
from sabnzbd.newswrapper import NewsWrapper, request_server_info
- import sabnzbd.notifier as notifier
+ import sabnzbd.notifier
import sabnzbd.config as config
import sabnzbd.cfg as cfg
from sabnzbd.bpsmeter import BPSMeter
import sabnzbd.scheduler
from sabnzbd.misc import from_units, nntp_to_msg, int_conv
from sabnzbd.utils.happyeyeballs import happyeyeballs
@@ -79,10 +78,10 @@ class Server:
self.restart = False
self.displayname = displayname
self.host = host
- self.port = port
+ self.port: int = port
self.timeout = timeout
self.threads = threads
- self.priority = priority
+ self.priority: int = priority
self.ssl = ssl
self.ssl_verify = ssl_verify
self.ssl_ciphers = ssl_ciphers
@@ -162,14 +161,12 @@ class Server:
self.idle_threads = []
def __repr__(self):
return "%s:%s" % (self.host, self.port)
return "<Server: %s:%s>" % (self.host, self.port)
class Downloader(Thread):
""" Singleton Downloader Thread """
- do = None
def __init__(self, paused=False):
Thread.__init__(self)
@@ -199,19 +196,17 @@ class Downloader(Thread):
self.force_disconnect = False
- self.read_fds = {}
- self.write_fds = {}
+ self.read_fds: Dict[int, NewsWrapper] = {}
+ self.write_fds: Dict[int, NewsWrapper] = {}
- self.servers = []
- self.server_dict = {} # For faster lookups, but is not updated later!
+ self.servers: List[Server] = []
+ self.server_dict: Dict[str, Server] = {} # For faster lookups, but is not updated later!
self.server_nr = 0
self._timers = {}
for server in config.get_servers():
self.init_server(None, server)
- Downloader.do = self
def init_server(self, oldserver, newserver):
"""Setup or re-setup single server
When oldserver is defined and in use, delay startup.
@@ -274,8 +269,6 @@ class Downloader(Thread):
# Update server-count
self.server_nr = len(self.servers)
return
@NzbQueueLocker
def set_paused_state(self, state):
""" Set downloader to specified paused state """
@@ -286,7 +279,7 @@ class Downloader(Thread):
# Do not notify when SABnzbd is still starting
if self.paused and sabnzbd.WEB_DIR:
logging.info("Resuming")
notifier.send_notification("SABnzbd", T("Resuming"), "pause_resume")
sabnzbd.notifier.send_notification("SABnzbd", T("Resuming"), "pause_resume")
self.paused = False
@NzbQueueLocker
@@ -295,9 +288,9 @@ class Downloader(Thread):
if not self.paused:
self.paused = True
logging.info("Pausing")
notifier.send_notification("SABnzbd", T("Paused"), "pause_resume")
sabnzbd.notifier.send_notification("SABnzbd", T("Paused"), "pause_resume")
if self.is_paused():
- BPSMeter.do.reset()
+ sabnzbd.BPSMeter.reset()
if cfg.autodisconnect():
self.disconnect()
@@ -357,12 +350,12 @@ class Downloader(Thread):
if not self.paused:
return False
else:
- if sabnzbd.nzbqueue.NzbQueue.do.has_forced_items():
+ if sabnzbd.NzbQueue.has_forced_items():
return False
else:
return True
- def highest_server(self, me):
+ def highest_server(self, me: Server):
"""Return True when this server has the highest priority of the active ones
0 is the highest priority
"""
@@ -403,7 +396,7 @@ class Downloader(Thread):
# Make sure server address resolution is refreshed
server.info = None
- sabnzbd.nzbqueue.NzbQueue.do.reset_all_try_lists()
+ sabnzbd.NzbQueue.reset_all_try_lists()
def decode(self, article, raw_data):
"""Decode article and check the status of
@@ -412,23 +405,21 @@ class Downloader(Thread):
# Handle broken articles directly
if not raw_data:
if not article.search_new_server():
- sabnzbd.nzbqueue.NzbQueue.do.register_article(article, success=False)
+ sabnzbd.NzbQueue.register_article(article, success=False)
return
# Send to decoder-queue
- sabnzbd.decoder.Decoder.do.process(article, raw_data)
+ sabnzbd.Decoder.process(article, raw_data)
# See if we need to delay because the queues are full
logged = False
- while not self.shutdown and (
- sabnzbd.decoder.Decoder.do.queue_full() or sabnzbd.assembler.Assembler.do.queue_full()
- ):
+ while not self.shutdown and (sabnzbd.Decoder.queue_full() or sabnzbd.Assembler.queue_full()):
if not logged:
# Only log once, to not waste any CPU-cycles
logging.debug(
"Delaying - Decoder queue: %s - Assembler queue: %s",
- sabnzbd.decoder.Decoder.do.decoder_queue.qsize(),
- sabnzbd.assembler.Assembler.do.queue.qsize(),
+ sabnzbd.Decoder.decoder_queue.qsize(),
+ sabnzbd.Assembler.queue.qsize(),
)
logged = True
time.sleep(0.05)
@@ -443,12 +434,29 @@ class Downloader(Thread):
logging.debug("SSL verification test: %s", sabnzbd.CERTIFICATE_VALIDATION)
# Kick BPS-Meter to check quota
- BPSMeter.do.update()
+ sabnzbd.BPSMeter.update()
+ # Store when each server last searched for articles
+ last_searched = {}
+ # Store when each server last downloaded anything or found an article
+ last_busy = {}
while 1:
+ now = time.time()
for server in self.servers:
+ serverid = server.id
+ if server.busy_threads:
+ last_busy[serverid] = now
+ else:
+ # Skip this server if idle for 1 second and it has already been searched less than 0.5 seconds ago
+ if last_busy.get(serverid, 0) + 1 < now and last_searched.get(serverid, 0) + 0.5 > now:
+ continue
+ last_searched[serverid] = now
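The bookkeeping added here (one `last_busy` / `last_searched` entry per server) limits how often an idle server is polled for new articles, which is the "twice per second" commit. Extracted as a sketch; `should_search` is a hypothetical helper, not SABnzbd code:

```python
import time

last_busy = {}      # server id -> last time it had busy threads
last_searched = {}  # server id -> last time we looked for work

def should_search(serverid, busy, now=None):
    now = now if now is not None else time.time()
    if busy:
        last_busy[serverid] = now
        return True
    # Idle for more than a second and searched less than 0.5s ago: skip
    if last_busy.get(serverid, 0) + 1 < now and last_searched.get(serverid, 0) + 0.5 > now:
        return False
    last_searched[serverid] = now
    return True

first = should_search("demo", busy=False, now=50.0)  # first check always runs
```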
for nw in server.busy_threads[:]:
- if (nw.nntp and nw.nntp.error_msg) or (nw.timeout and time.time() > nw.timeout):
+ if (nw.nntp and nw.nntp.error_msg) or (nw.timeout and now > nw.timeout):
if nw.nntp and nw.nntp.error_msg:
self.__reset_nw(nw, "", warn=False)
else:
@@ -463,7 +471,7 @@ class Downloader(Thread):
if newid:
self.init_server(None, newid)
self.__restart -= 1
- sabnzbd.nzbqueue.NzbQueue.do.reset_all_try_lists()
+ sabnzbd.NzbQueue.reset_all_try_lists()
# Have to leave this loop, because we removed element
break
else:
@@ -478,24 +486,26 @@ class Downloader(Thread):
for nw in server.idle_threads[:]:
if nw.timeout:
- if time.time() < nw.timeout:
+ if now < nw.timeout:
continue
else:
nw.timeout = None
if not server.info:
# Only request info if there's stuff in the queue
- if not sabnzbd.nzbqueue.NzbQueue.do.is_empty():
+ if not sabnzbd.NzbQueue.is_empty():
self.maybe_block_server(server)
request_server_info(server)
break
- article = sabnzbd.nzbqueue.NzbQueue.do.get_article(server, self.servers)
+ article = sabnzbd.NzbQueue.get_article(server, self.servers)
if not article:
break
- if server.retention and article.nzf.nzo.avg_stamp < time.time() - server.retention:
+ last_busy[serverid] = now
+ if server.retention and article.nzf.nzo.avg_stamp < now - server.retention:
# Let's get rid of all the articles for this server at once
logging.info("Job %s too old for %s, moving on", article.nzf.nzo.final_name, server.host)
while article:
@@ -562,26 +572,26 @@ class Downloader(Thread):
# Need to initialize the check during first 20 seconds
if self.can_be_slowed is None or self.can_be_slowed_timer:
# Wait for stable speed to start testing
- if not self.can_be_slowed_timer and BPSMeter.do.get_stable_speed(timespan=10):
- self.can_be_slowed_timer = time.time()
+ if not self.can_be_slowed_timer and sabnzbd.BPSMeter.get_stable_speed(timespan=10):
+ self.can_be_slowed_timer = now
# Check 10 seconds after enabling slowdown
- if self.can_be_slowed_timer and time.time() > self.can_be_slowed_timer + 10:
+ if self.can_be_slowed_timer and now > self.can_be_slowed_timer + 10:
# Now let's check if it was stable in the last 10 seconds
self.can_be_slowed = BPSMeter.do.get_stable_speed(timespan=10)
self.can_be_slowed = sabnzbd.BPSMeter.get_stable_speed(timespan=10)
self.can_be_slowed_timer = 0
logging.debug("Downloader-slowdown: %r", self.can_be_slowed)
else:
read, write, error = ([], [], [])
BPSMeter.do.reset()
sabnzbd.BPSMeter.reset()
time.sleep(1.0)
DOWNLOADER_CV.acquire()
while (
(sabnzbd.nzbqueue.NzbQueue.do.is_empty() or self.is_paused() or self.postproc)
(sabnzbd.NzbQueue.is_empty() or self.is_paused() or self.postproc)
and not self.shutdown
and not self.__restart
):
@@ -602,7 +612,7 @@ class Downloader(Thread):
self.write_fds.pop(fileno)
if not read:
BPSMeter.do.update()
sabnzbd.BPSMeter.update()
continue
for selected in read:
@@ -610,16 +620,13 @@ class Downloader(Thread):
article = nw.article
server = nw.server
if article:
nzo = article.nzf.nzo
try:
bytes_received, done, skip = nw.recv_chunk()
except:
bytes_received, done, skip = (0, False, False)
if skip:
BPSMeter.do.update()
sabnzbd.BPSMeter.update()
continue
if bytes_received < 1:
@@ -629,12 +636,12 @@ class Downloader(Thread):
else:
if self.bandwidth_limit:
limit = self.bandwidth_limit
if bytes_received + BPSMeter.do.bps > limit:
while BPSMeter.do.bps > limit:
if bytes_received + sabnzbd.BPSMeter.bps > limit:
while sabnzbd.BPSMeter.bps > limit:
time.sleep(0.05)
BPSMeter.do.update()
BPSMeter.do.update(server.id, bytes_received)
nzo.update_download_stats(BPSMeter.do.bps, server.id, bytes_received)
sabnzbd.BPSMeter.update()
sabnzbd.BPSMeter.update(server.id, bytes_received)
article.nzf.nzo.update_download_stats(sabnzbd.BPSMeter.bps, server.id, bytes_received)
if not done and nw.status_code != 222:
if not nw.connected or nw.status_code == 480:
@@ -716,7 +723,7 @@ class Downloader(Thread):
server.active = False
if penalty and (block or server.optional):
self.plan_server(server, penalty)
sabnzbd.nzbqueue.NzbQueue.do.reset_all_try_lists()
sabnzbd.NzbQueue.reset_all_try_lists()
self.__reset_nw(nw, None, warn=False, send_quit=True)
continue
except:
@@ -756,7 +763,7 @@ class Downloader(Thread):
nw.clear_data()
elif nw.status_code == 500:
if nzo.precheck:
if article.nzf.nzo.precheck:
# Assume "STAT" command is not supported
server.have_stat = False
logging.debug("Server %s does not support STAT", server.host)
@@ -823,7 +830,7 @@ class Downloader(Thread):
self.decode(article, None)
else:
# Allow all servers to iterate over each nzo/nzf again
sabnzbd.nzbqueue.NzbQueue.do.reset_try_lists(article)
sabnzbd.NzbQueue.reset_try_lists(article)
if destroy:
nw.terminate(quit=send_quit)
@@ -833,7 +840,7 @@ class Downloader(Thread):
# Empty SSL info, it might change on next connect
server.ssl_info = ""
def __request_article(self, nw):
def __request_article(self, nw: NewsWrapper):
try:
nzo = nw.article.nzf.nzo
if nw.server.send_group and nzo.group != nw.group:
@@ -877,7 +884,7 @@ class Downloader(Thread):
stamp = time.time() + 60.0 * interval
self._timers[server.id].append(stamp)
if interval:
sabnzbd.scheduler.plan_server(self.trigger_server, [server.id, stamp], interval)
sabnzbd.Scheduler.plan_server(self.trigger_server, [server.id, stamp], interval)
@synchronized(TIMER_LOCK)
def trigger_server(self, server_id, timestamp):
@@ -914,7 +921,8 @@ class Downloader(Thread):
# Clean expired timers
now = time.time()
kicked = []
for server_id in self._timers.keys():
# Create a copy so we can remove during iteration
for server_id in list(self._timers):
if not [stamp for stamp in self._timers[server_id] if stamp >= now]:
logging.debug("Forcing re-evaluation of server-id %s", server_id)
del self._timers[server_id]
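The `list(self._timers)` copy matters because Python raises a `RuntimeError` if a dict changes size while being iterated directly. A small self-contained illustration of the expired-timer sweep:

```python
import time

timers = {
    "server-a": [time.time() - 30],  # only expired stamps
    "server-b": [time.time() + 60],  # still has a future stamp
}

now = time.time()
# Iterate over a copy of the keys so entries can be removed during iteration
for server_id in list(timers):
    if not [stamp for stamp in timers[server_id] if stamp >= now]:
        del timers[server_id]
```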
@@ -940,18 +948,18 @@ class Downloader(Thread):
def stop(self):
self.shutdown = True
notifier.send_notification("SABnzbd", T("Shutting down"), "startup")
sabnzbd.notifier.send_notification("SABnzbd", T("Shutting down"), "startup")
def stop():
DOWNLOADER_CV.acquire()
try:
Downloader.do.stop()
sabnzbd.Downloader.stop()
finally:
DOWNLOADER_CV.notify_all()
DOWNLOADER_CV.release()
try:
Downloader.do.join()
sabnzbd.Downloader.join()
except:
pass
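The `stop()` helper above follows the standard condition-variable shutdown pattern: set the flag under the lock, notify all waiters, then join. A generic sketch (names are illustrative, not the SABnzbd API):

```python
import threading

cv = threading.Condition()
state = {"shutdown": False}

def worker():
    # Sleep on the condition until told to shut down
    with cv:
        while not state["shutdown"]:
            cv.wait(timeout=1.0)  # woken early by notify_all()

def stop(thread):
    with cv:
        state["shutdown"] = True
        cv.notify_all()
    thread.join()

t = threading.Thread(target=worker)
t.start()
stop(t)
```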


@@ -27,6 +27,7 @@ import glob
from Cheetah.Template import Template
from email.message import EmailMessage
from email import policy
from sabnzbd.constants import *
import sabnzbd
@@ -296,4 +297,4 @@ def _prepare_message(txt):
msg[keyword] = value
msg.set_content("\n".join(payload))
return msg.as_bytes()
return msg.as_bytes(policy=msg.policy.clone(linesep="\r\n"))
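Cloning the message policy with `linesep="\r\n"` makes `as_bytes()` emit the CRLF line endings SMTP expects, without mutating the policy used while building the message. For example:

```python
from email.message import EmailMessage
from email import policy

msg = EmailMessage(policy=policy.default)
msg["Subject"] = "Job finished"
msg.set_content("line one\nline two")

# Serialize with network (CRLF) line endings for SMTP transport
raw = msg.as_bytes(policy=msg.policy.clone(linesep="\r\n"))
```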


@@ -22,27 +22,27 @@ sabnzbd.encoding - Unicode/byte translation functions
import locale
import chardet
from xml.sax.saxutils import escape
from typing import AnyStr
CODEPAGE = locale.getpreferredencoding()
def utob(str_in):
""" Shorthand for converting UTF-8 to bytes """
def utob(str_in: AnyStr) -> bytes:
""" Shorthand for converting UTF-8 string to bytes """
if isinstance(str_in, bytes):
return str_in
return str_in.encode("utf-8")
def ubtou(str_in):
""" Shorthand for converting unicode bytes to UTF-8 """
def ubtou(str_in: AnyStr) -> str:
""" Shorthand for converting unicode bytes to UTF-8 string """
if not isinstance(str_in, bytes):
return str_in
return str_in.decode("utf-8")
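These two helpers are symmetric converters that pass already-correct types through unchanged; a minimal copy shows the round-trip behavior:

```python
def utob(str_in):
    """UTF-8 string to bytes; bytes pass through unchanged."""
    if isinstance(str_in, bytes):
        return str_in
    return str_in.encode("utf-8")

def ubtou(str_in):
    """UTF-8 bytes to string; strings pass through unchanged."""
    if not isinstance(str_in, bytes):
        return str_in
    return str_in.decode("utf-8")
```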
def platform_btou(str_in):
"""Return Unicode, if not already Unicode, decode with locale encoding.
def platform_btou(str_in: AnyStr) -> str:
"""Return Unicode string, if not already Unicode, decode with locale encoding.
NOTE: Used for Popen because the universal_newlines/text parameter doesn't
always work! We cannot use encoding-parameter because it's Python 3.7+
"""
@@ -55,7 +55,7 @@ def platform_btou(str_in):
return str_in
def correct_unknown_encoding(str_or_bytes_in):
def correct_unknown_encoding(str_or_bytes_in: AnyStr) -> str:
"""Files created on Windows but unpacked/repaired on
linux can result in invalid filenames. Try to fix this
encoding by going to bytes and then back to unicode again.


@@ -29,6 +29,12 @@ import time
import fnmatch
import stat
import zipfile
from typing import Union, List, Tuple, Any, Dict, Optional
try:
import win32api
except ImportError:
pass
import sabnzbd
from sabnzbd.decorators import synchronized
@@ -37,7 +43,7 @@ from sabnzbd.encoding import correct_unknown_encoding
from sabnzbd.utils import rarfile
def get_ext(filename):
def get_ext(filename: str) -> str:
""" Return lowercased file extension """
try:
return os.path.splitext(filename)[1].lower()
@@ -45,7 +51,7 @@ def get_ext(filename):
return ""
def get_filename(path):
def get_filename(path: str) -> str:
""" Return path without the file extension """
try:
return os.path.split(path)[1]
@@ -53,12 +59,33 @@ def get_filename(path):
return ""
def setname_from_path(path):
def setname_from_path(path: str) -> str:
""" Get the setname from a path """
return os.path.splitext(os.path.basename(path))[0]
def is_writable(path):
RAR_NR = re.compile(r"(.*?)(\.part(\d*).rar|\.r(\d*))$", re.IGNORECASE)
def analyze_rar_filename(filename_or_path: str) -> Tuple[Optional[str], Optional[int]]:
"""Extract volume number and setname from rar-filenames or paths
Both ".part01.rar" and ".r01" work
"""
filename = os.path.basename(filename_or_path)
m = RAR_NR.search(filename)
if m:
if m.group(4):
# Special case: sets start with ".rar" then ".r00", so offset the number by 2
return m.group(1), sabnzbd.misc.int_conv(m.group(4)) + 2
return m.group(1), sabnzbd.misc.int_conv(m.group(3))
else:
# Detect if first of "rxx" set
if filename.endswith(".rar"):
return os.path.splitext(filename)[0], 1
return None, None
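The volume-numbering logic can be exercised standalone; this sketch inlines the same regex and substitutes plain `int()` for `sabnzbd.misc.int_conv`:

```python
import os
import re

# Same pattern as the diff; group 3 captures ".partNN.rar", group 4 ".rNN"
RAR_NR = re.compile(r"(.*?)(\.part(\d*).rar|\.r(\d*))$", re.IGNORECASE)

def analyze_rar_filename(filename_or_path):
    """Extract (setname, volume number); int() stands in for int_conv."""
    filename = os.path.basename(filename_or_path)
    m = RAR_NR.search(filename)
    if m:
        if m.group(4):
            # ".rNN" sets start after ".rar", so the volume number is offset by 2
            return m.group(1), int(m.group(4)) + 2
        return m.group(1), int(m.group(3))
    if filename.endswith(".rar"):
        # Plain ".rar" is the first volume of an "rxx" set
        return os.path.splitext(filename)[0], 1
    return None, None
```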
def is_writable(path: str) -> bool:
""" Return True is file is writable (also when non-existent) """
if os.path.isfile(path):
return bool(os.stat(path).st_mode & stat.S_IWUSR)
@@ -92,7 +119,7 @@ _DEVICES = (
)
def replace_win_devices(name):
def replace_win_devices(name: str) -> str:
"""Remove reserved Windows device names from a name.
aux.txt ==> _aux.txt
txt.aux ==> txt.aux
@@ -111,13 +138,13 @@ def replace_win_devices(name):
return name
def has_win_device(p):
def has_win_device(filename: str) -> bool:
"""Return True if filename part contains forbidden name
Before and after sanitizing
"""
p = os.path.split(p)[1].lower()
filename = os.path.split(filename)[1].lower()
for dev in _DEVICES:
if p == dev or p.startswith(dev + ".") or p.startswith("_" + dev + "."):
if filename == dev or filename.startswith(dev + ".") or filename.startswith("_" + dev + "."):
return True
return False
@@ -128,7 +155,7 @@ CH_ILLEGAL_WIN = '\\/<>?*|"\t:'
CH_LEGAL_WIN = "++{}!@#'+-"
def sanitize_filename(name):
def sanitize_filename(name: str) -> str:
"""Return filename with illegal chars converted to legal ones
and with the par2 extension always in lowercase
"""
@@ -144,7 +171,7 @@ def sanitize_filename(name):
legal += CH_LEGAL_WIN
if ":" in name and sabnzbd.DARWIN:
# Compensate for the foolish way par2 on OSX handles a colon character
# Compensate for the foolish way par2 on macOS handles a colon character
name = name[name.rfind(":") + 1 :]
lst = []
@@ -167,7 +194,7 @@ def sanitize_filename(name):
return name + ext
def sanitize_foldername(name):
def sanitize_foldername(name: str) -> str:
"""Return foldername with dodgy chars converted to safe ones
Remove any leading and trailing dot and space characters
"""
@@ -209,7 +236,7 @@ def sanitize_foldername(name):
return name
def sanitize_and_trim_path(path):
def sanitize_and_trim_path(path: str) -> str:
""" Remove illegal characters and trim element size """
path = path.strip()
new_path = ""
@@ -252,14 +279,14 @@ def sanitize_files_in_folder(folder):
return lst
def is_obfuscated_filename(filename):
def is_obfuscated_filename(filename: str) -> bool:
"""Check if this file has an extension, if not, it's
probably obfuscated and we don't use it
"""
return len(get_ext(filename)) < 2
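The obfuscation check is just "does the extension have at least one character after the dot"; a standalone copy:

```python
import os

def get_ext(filename):
    """Return lowercased file extension, including the dot."""
    return os.path.splitext(filename)[1].lower()

def is_obfuscated_filename(filename):
    # An extension shorter than 2 chars ("" or just ".") suggests obfuscation
    return len(get_ext(filename)) < 2
```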
def real_path(loc, path):
def real_path(loc: str, path: str) -> str:
"""When 'path' is relative, return normalized join of 'loc' and 'path'
When 'path' is absolute, return normalized path
A path starting with ~ will be located in the user's Home folder
@@ -296,7 +323,9 @@ def real_path(loc, path):
return long_path(os.path.normpath(os.path.abspath(path)))
def create_real_path(name, loc, path, umask=False, writable=True):
def create_real_path(
name: str, loc: str, path: str, umask: bool = False, writable: bool = True
) -> Tuple[bool, str, Optional[str]]:
"""When 'path' is relative, create join of 'loc' and 'path'
When 'path' is absolute, create normalized path
'name' is used for logging.
@@ -324,7 +353,7 @@ def create_real_path(name, loc, path, umask=False, writable=True):
return False, path, None
def same_file(a, b):
def same_file(a: str, b: str) -> int:
"""Return 0 if A and B have nothing in common
return 1 if A and B are actually the same path
return 2 if B is a subfolder of A
@@ -353,7 +382,7 @@ def same_file(a, b):
return is_subfolder
def is_archive(path):
def is_archive(path: str) -> Tuple[int, Any, str]:
"""Check if file in path is an ZIP, RAR or 7z file
:param path: path to file
:return: (zf, status, expected_extension)
@@ -387,8 +416,8 @@ def is_archive(path):
return 1, None, ""
def check_mount(path):
"""Return False if volume isn't mounted on Linux or OSX
def check_mount(path: str) -> bool:
"""Return False if volume isn't mounted on Linux or macOS
Retry 6 times with an interval of 1 sec.
"""
if sabnzbd.DARWIN:
@@ -407,7 +436,7 @@ def check_mount(path):
return not m
def safe_fnmatch(f, pattern):
def safe_fnmatch(f: str, pattern: str) -> bool:
"""fnmatch will fail if the pattern contains any of it's
key characters, like [, ] or !.
"""
@@ -417,7 +446,7 @@ def safe_fnmatch(f, pattern):
return False
def globber(path, pattern="*"):
def globber(path: str, pattern: str = "*") -> List[str]:
""" Return matching base file/folder names in folder `path` """
# Cannot use glob.glob() because it doesn't support Windows long name notation
if os.path.exists(path):
@@ -425,7 +454,7 @@ def globber(path, pattern="*"):
return []
def globber_full(path, pattern="*"):
def globber_full(path: str, pattern: str = "*") -> List[str]:
""" Return matching full file/folder names in folder `path` """
# Cannot use glob.glob() because it doesn't support Windows long name notation
if os.path.exists(path):
@@ -433,18 +462,7 @@ def globber_full(path, pattern="*"):
return []
def trim_win_path(path):
""" Make sure Windows path stays below 70 by trimming last part """
if sabnzbd.WIN32 and len(path) > 69:
path, folder = os.path.split(path)
maxlen = 69 - len(path)
if len(folder) > maxlen:
folder = folder[:maxlen]
path = os.path.join(path, folder).rstrip(". ")
return path
def fix_unix_encoding(folder):
def fix_unix_encoding(folder: str):
"""Fix bad name encoding for Unix systems
This happens for example when files are created
on Windows but unpacked/repaired on linux
@@ -460,7 +478,7 @@ def fix_unix_encoding(folder):
logging.info("Cannot correct name of %s", os.path.join(root, name))
def make_script_path(script):
def make_script_path(script: str) -> Optional[str]:
""" Return full script path, if any valid script exists, else None """
script_path = None
script_dir = sabnzbd.cfg.script_dir.get_path()
@@ -475,7 +493,7 @@ def make_script_path(script):
return script_path
def get_admin_path(name, future):
def get_admin_path(name: str, future: bool):
"""Return news-style full path to job-admin folder of names job
or else the old cache path
"""
@@ -485,7 +503,7 @@ def get_admin_path(name, future):
return os.path.join(os.path.join(sabnzbd.cfg.download_dir.get_path(), name), JOB_ADMIN)
def set_chmod(path, permissions, report):
def set_chmod(path: str, permissions: int, report: bool):
""" Set 'permissions' on 'path', report any errors when 'report' is True """
try:
logging.debug("Applying permissions %s (octal) to %s", oct(permissions), path)
@@ -497,7 +515,7 @@ def set_chmod(path, permissions, report):
logging.info("Traceback: ", exc_info=True)
def set_permissions(path, recursive=True):
def set_permissions(path: str, recursive: bool = True):
""" Give folder tree and its files their proper permissions """
if not sabnzbd.WIN32:
umask = sabnzbd.cfg.umask()
@@ -528,7 +546,7 @@ def set_permissions(path, recursive=True):
set_chmod(path, umask_file, report)
def userxbit(filename):
def userxbit(filename: str) -> bool:
"""Returns boolean if the x-bit for user is set on the given file.
This is a workaround: os.access(filename, os.X_OK) does not work
on certain mounted file systems. Does not work at all on Windows.
@@ -542,14 +560,14 @@ def userxbit(filename):
return xbitset
def clip_path(path):
def clip_path(path: str) -> str:
r""" Remove \\?\ or \\?\UNC\ prefix from Windows path """
if sabnzbd.WIN32 and path and "?" in path:
path = path.replace("\\\\?\\UNC\\", "\\\\", 1).replace("\\\\?\\", "", 1)
return path
def long_path(path):
def long_path(path: str) -> str:
""" For Windows, convert to long style path; others, return same path """
if sabnzbd.WIN32 and path and not path.startswith("\\\\?\\"):
if path.startswith("\\\\"):
@@ -568,7 +586,7 @@ DIR_LOCK = threading.RLock()
@synchronized(DIR_LOCK)
def create_all_dirs(path, apply_umask=False):
def create_all_dirs(path: str, apply_umask: bool = False) -> Union[str, bool]:
"""Create all required path elements and set umask on all
The umask argument is ignored on Windows
Return path if elements could be made or exists
@@ -606,7 +624,7 @@ def create_all_dirs(path, apply_umask=False):
@synchronized(DIR_LOCK)
def get_unique_path(dirpath, n=0, create_dir=True):
def get_unique_path(dirpath: str, n: int = 0, create_dir: bool = True) -> str:
""" Determine a unique folder or filename """
if not check_mount(dirpath):
@@ -626,7 +644,7 @@ def get_unique_path(dirpath, n=0, create_dir=True):
@synchronized(DIR_LOCK)
def get_unique_filename(path):
def get_unique_filename(path: str) -> str:
"""Check if path is unique.
If not, add number like: "/path/name.NUM.ext".
"""
@@ -641,7 +659,7 @@ def get_unique_filename(path):
@synchronized(DIR_LOCK)
def listdir_full(input_dir, recursive=True):
def listdir_full(input_dir: str, recursive: bool = True) -> List[str]:
""" List all files in dirs and sub-dirs """
filelist = []
for root, dirs, files in os.walk(input_dir):
@@ -655,7 +673,7 @@ def listdir_full(input_dir, recursive=True):
@synchronized(DIR_LOCK)
def move_to_path(path, new_path):
def move_to_path(path: str, new_path: str) -> Tuple[bool, Optional[str]]:
"""Move a file to a new path, optionally give unique filename
Return (ok, new_path)
"""
@@ -695,7 +713,7 @@ def move_to_path(path, new_path):
@synchronized(DIR_LOCK)
def cleanup_empty_directories(path):
def cleanup_empty_directories(path: str):
""" Remove all empty folders inside (and including) 'path' """
path = os.path.normpath(path)
while 1:
@@ -719,7 +737,7 @@ def cleanup_empty_directories(path):
@synchronized(DIR_LOCK)
def get_filepath(path, nzo, filename):
def get_filepath(path: str, nzo, filename: str):
""" Create unique filepath """
# This procedure is only used by the Assembler thread
# It does no umask setting
@@ -756,7 +774,7 @@ def get_filepath(path, nzo, filename):
@synchronized(DIR_LOCK)
def renamer(old, new):
def renamer(old: str, new: str):
""" Rename file/folder with retries for Win32 """
# Sanitize last part of new name
path, name = os.path.split(new)
@@ -799,14 +817,14 @@ def renamer(old, new):
shutil.move(old, new)
def remove_file(path):
def remove_file(path: str):
""" Wrapper function so any file removal is logged """
logging.debug("[%s] Deleting file %s", sabnzbd.misc.caller_name(), path)
os.remove(path)
@synchronized(DIR_LOCK)
def remove_dir(path):
def remove_dir(path: str):
""" Remove directory with retries for Win32 """
logging.debug("[%s] Removing dir %s", sabnzbd.misc.caller_name(), path)
if sabnzbd.WIN32:
@@ -829,7 +847,7 @@ def remove_dir(path):
@synchronized(DIR_LOCK)
def remove_all(path, pattern="*", keep_folder=False, recursive=False):
def remove_all(path: str, pattern: str = "*", keep_folder: bool = False, recursive: bool = False):
""" Remove folder and all its content (optionally recursive) """
if path and os.path.exists(path):
# Fast-remove the whole tree if recursive
@@ -863,67 +881,47 @@ def remove_all(path, pattern="*", keep_folder=False, recursive=False):
##############################################################################
# Diskfree
##############################################################################
def find_dir(p):
""" Return first folder level that exists in this path """
def diskspace_base(dir_to_check: str) -> Tuple[float, float]:
""" Return amount of free and used diskspace in GBytes """
# Find first folder level that exists in the path
x = "x"
while x and not os.path.exists(p):
p, x = os.path.split(p)
return p
while x and not os.path.exists(dir_to_check):
dir_to_check, x = os.path.split(dir_to_check)
if sabnzbd.WIN32:
# windows diskfree
try:
# Careful here, because win32api test hasn't been done yet!
import win32api
except:
pass
def diskspace_base(_dir):
""" Return amount of free and used diskspace in GBytes """
_dir = find_dir(_dir)
if sabnzbd.WIN32:
# windows diskfree
try:
available, disk_size, total_free = win32api.GetDiskFreeSpaceEx(_dir)
available, disk_size, total_free = win32api.GetDiskFreeSpaceEx(dir_to_check)
return disk_size / GIGI, available / GIGI
except:
return 0.0, 0.0
else:
try:
os.statvfs
elif hasattr(os, "statvfs"):
# posix diskfree
def diskspace_base(_dir):
""" Return amount of free and used diskspace in GBytes """
_dir = find_dir(_dir)
try:
s = os.statvfs(_dir)
if s.f_blocks < 0:
disk_size = float(sys.maxsize) * float(s.f_frsize)
else:
disk_size = float(s.f_blocks) * float(s.f_frsize)
if s.f_bavail < 0:
available = float(sys.maxsize) * float(s.f_frsize)
else:
available = float(s.f_bavail) * float(s.f_frsize)
return disk_size / GIGI, available / GIGI
except:
return 0.0, 0.0
except ImportError:
def diskspace_base(_dir):
return 20.0, 10.0
try:
s = os.statvfs(dir_to_check)
if s.f_blocks < 0:
disk_size = float(sys.maxsize) * float(s.f_frsize)
else:
disk_size = float(s.f_blocks) * float(s.f_frsize)
if s.f_bavail < 0:
available = float(sys.maxsize) * float(s.f_frsize)
else:
available = float(s.f_bavail) * float(s.f_frsize)
return disk_size / GIGI, available / GIGI
except:
return 0.0, 0.0
else:
return 20.0, 10.0
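On POSIX the computation reduces to `os.statvfs` on the first existing ancestor of the path. A trimmed, POSIX-only sketch of the same logic (the win32api branch and the maxsize clamping are omitted):

```python
import os

GIGI = float(2 ** 30)

def diskspace_posix(dir_to_check):
    """Return (total, free) in GB for the first existing ancestor of the path."""
    # Walk up until a folder level exists; the tail becomes empty at the root
    tail = "x"
    while tail and not os.path.exists(dir_to_check):
        dir_to_check, tail = os.path.split(dir_to_check)
    try:
        s = os.statvfs(dir_to_check)
        disk_size = float(s.f_blocks) * float(s.f_frsize)
        available = float(s.f_bavail) * float(s.f_frsize)
        return disk_size / GIGI, available / GIGI
    except OSError:
        return 0.0, 0.0
```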
# Store all results to speed things up
__DIRS_CHECKED = []
__DISKS_SAME = None
__LAST_DISK_RESULT = {"download_dir": [], "complete_dir": []}
__LAST_DISK_RESULT = {"download_dir": (0.0, 0.0), "complete_dir": (0.0, 0.0)}
__LAST_DISK_CALL = 0
def diskspace(force=False):
def diskspace(force: bool = False) -> Dict[str, Tuple[float, float]]:
""" Wrapper to cache results """
global __DIRS_CHECKED, __DISKS_SAME, __LAST_DISK_RESULT, __LAST_DISK_CALL


@@ -68,7 +68,7 @@ def localipv4():
try:
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s_ipv4:
# Option: use 100.64.1.1 (IANA-Reserved IPv4 Prefix for Shared Address Space)
s_ipv4.connect(("1.2.3.4", 80))
s_ipv4.connect(("10.255.255.255", 80))
ipv4 = s_ipv4.getsockname()[0]
except socket.error:
ipv4 = None
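Connecting a UDP socket never sends a packet; it only asks the kernel which local interface would route to the target, which is why an unroutable-but-valid address like 10.255.255.255 works for discovering the machine's LAN IPv4 address. A standalone copy of the pattern:

```python
import socket

def localipv4():
    """Return the local IPv4 address, or None if no route is available."""
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s_ipv4:
            # No traffic is generated; connect() only selects a source address
            s_ipv4.connect(("10.255.255.255", 80))
            return s_ipv4.getsockname()[0]
    except socket.error:
        return None
```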


@@ -33,12 +33,10 @@ import functools
from threading import Thread
from random import randint
from xml.sax.saxutils import escape
from Cheetah.Template import Template
import sabnzbd
import sabnzbd.rss
import sabnzbd.scheduler as scheduler
from Cheetah.Template import Template
from sabnzbd.misc import (
to_units,
from_units,
@@ -52,24 +50,18 @@ from sabnzbd.misc import (
)
from sabnzbd.filesystem import real_path, long_path, globber, globber_full, remove_all, clip_path, same_file
from sabnzbd.newswrapper import GetServerParms
from sabnzbd.bpsmeter import BPSMeter
from sabnzbd.encoding import xml_name, utob
import sabnzbd.config as config
import sabnzbd.cfg as cfg
import sabnzbd.notifier as notifier
import sabnzbd.newsunpack
from sabnzbd.downloader import Downloader
from sabnzbd.nzbqueue import NzbQueue
from sabnzbd.utils.servertests import test_nntp_server_dict
from sabnzbd.decoder import SABYENC_ENABLED
from sabnzbd.utils.diskspeed import diskspeedmeasure
from sabnzbd.utils.getperformance import getpystone
from sabnzbd.utils.internetspeed import internetspeed
import sabnzbd.utils.ssdp
from sabnzbd.constants import MEBI, DEF_SKIN_COLORS, DEF_STDCONFIG, DEF_MAIN_TMPL, DEFAULT_PRIORITY, CHEETAH_DIRECTIVES
from sabnzbd.lang import list_languages
from sabnzbd.api import (
list_scripts,
list_cats,
@@ -85,11 +77,6 @@ from sabnzbd.api import (
build_queue_header,
)
##############################################################################
# Global constants
##############################################################################
##############################################################################
# Security functions
##############################################################################
@@ -408,7 +395,7 @@ class MainPage:
)
)
bytespersec_list = BPSMeter.do.get_bps_list()
bytespersec_list = sabnzbd.BPSMeter.get_bps_list()
info["bytespersec_list"] = ",".join([str(bps) for bps in bytespersec_list])
template = Template(
@@ -431,13 +418,13 @@ class MainPage:
@secured_expose(check_api_key=True)
def pause(self, **kwargs):
scheduler.plan_resume(0)
Downloader.do.pause()
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.Downloader.pause()
raise Raiser(self.__root)
@secured_expose(check_api_key=True)
def resume(self, **kwargs):
scheduler.plan_resume(0)
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.unpause_all()
raise Raiser(self.__root)
@@ -482,6 +469,13 @@ class MainPage:
cherrypy.response.headers["Content-Type"] = "text/plain"
return "User-agent: *\nDisallow: /\n"
@secured_expose
def description_xml(self, **kwargs):
""" Keep web crawlers out """
logging.debug("description.xml was requested by %s", cherrypy.request.remote.ip)
cherrypy.response.headers["Content-Type"] = "application/xml"
return utob(sabnzbd.utils.ssdp.server_ssdp_xml())
##############################################################################
class Wizard:
@@ -529,7 +523,7 @@ class Wizard:
else:
# Sort servers to get the first enabled one
server_names = sorted(
servers.keys(),
servers,
key=lambda svr: "%d%02d%s"
% (int(not servers[svr].enable()), servers[svr].priority(), servers[svr].displayname().lower()),
)
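The wizard sorts servers with a composite string key: enabled first, then priority, then display name. This works because each component is rendered into a fixed width, so lexicographic string order matches the intended multi-level order. With made-up server data:

```python
# Hypothetical server records, not the real SABnzbd config objects
servers = {
    "b": {"enable": True, "priority": 1, "name": "Beta"},
    "a": {"enable": False, "priority": 0, "name": "Alpha"},
    "c": {"enable": True, "priority": 0, "name": "Gamma"},
}

order = sorted(
    servers,
    key=lambda svr: "%d%02d%s"
    % (int(not servers[svr]["enable"]), servers[svr]["priority"], servers[svr]["name"].lower()),
)
# Enabled servers come first, lower priority number wins, names break ties
```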
@@ -581,13 +575,12 @@ class Wizard:
def get_access_info():
""" Build up a list of url's that sabnzbd can be accessed from """
# Access_url is used to provide the user a link to sabnzbd depending on the host
access_uri = "localhost"
# Access_url is used to provide the user a link to SABnzbd depending on the host
cherryhost = cfg.cherryhost()
host = socket.gethostname().lower()
socks = [host]
if cherryhost == "0.0.0.0":
host = socket.gethostname()
socks = [host]
# Grab a list of all ips for the hostname
try:
addresses = socket.getaddrinfo(host, None)
@@ -598,17 +591,8 @@ def get_access_info():
# Filter out ipv6 addresses (should not be allowed)
if ":" not in address and address not in socks:
socks.append(address)
if "host" in cherrypy.request.headers:
host = cherrypy.request.headers["host"]
host = host.rsplit(":")[0]
access_uri = host
socks.insert(0, host)
else:
socks.insert(0, "localhost")
socks.insert(0, "localhost")
elif cherryhost == "::":
host = socket.gethostname()
socks = [host]
# Grab a list of all ips for the hostname
addresses = socket.getaddrinfo(host, None)
for addr in addresses:
@@ -618,22 +602,14 @@ def get_access_info():
address = "[%s]" % address
if address not in socks:
socks.append(address)
if "host" in cherrypy.request.headers:
host = cherrypy.request.headers["host"]
host = host.rsplit(":")[0]
access_uri = host
socks.insert(0, host)
else:
socks.insert(0, "localhost")
elif not cherryhost:
socks = [socket.gethostname()]
access_uri = socket.gethostname()
else:
socks.insert(0, "localhost")
elif cherryhost:
socks = [cherryhost]
access_uri = cherryhost
urls = []
# Add the current requested URL as the base
access_url = urllib.parse.urljoin(cherrypy.request.base, cfg.url_base())
urls = [access_url]
for sock in socks:
if sock:
if cfg.enable_https() and cfg.https_port():
@@ -642,17 +618,10 @@ def get_access_info():
url = "https://%s:%s%s" % (sock, cfg.cherryport(), cfg.url_base())
else:
url = "http://%s:%s%s" % (sock, cfg.cherryport(), cfg.url_base())
urls.append(url)
if cfg.enable_https() and cfg.https_port():
access_url = "https://%s:%s%s" % (sock, cfg.https_port(), cfg.url_base())
elif cfg.enable_https():
access_url = "https://%s:%s%s" % (access_uri, cfg.cherryport(), cfg.url_base())
else:
access_url = "http://%s:%s%s" % (access_uri, cfg.cherryport(), cfg.url_base())
return access_url, urls
# Return a unique list
return access_url, set(urls)
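The rewritten function derives the primary access URL with `urllib.parse.urljoin` from the base of the current request, so the link survives reverse proxies and non-default ports. For instance (the base and url_base values below are hypothetical):

```python
import urllib.parse

request_base = "http://192.168.1.10:8080/"  # stand-in for cherrypy.request.base
url_base = "/sabnzbd/"                      # stand-in for cfg.url_base()

# Joining an absolute path onto the base keeps scheme, host and port
access_url = urllib.parse.urljoin(request_base, url_base)
```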
##############################################################################
@@ -723,7 +692,7 @@ class NzoPage:
nzo_id = a
break
nzo = NzbQueue.do.get_nzo(nzo_id)
nzo = sabnzbd.NzbQueue.get_nzo(nzo_id)
if nzo_id and nzo:
info, pnfo_list, bytespersec, q_size, bytes_left_previous_page = build_queue_header()
@@ -762,7 +731,7 @@ class NzoPage:
n = 0
for pnfo in pnfo_list:
if pnfo.nzo_id == nzo_id:
nzo = NzbQueue.do.get_nzo(nzo_id)
nzo = sabnzbd.NzbQueue.get_nzo(nzo_id)
repair = pnfo.repair
unpack = pnfo.unpack
delete = pnfo.delete
@@ -795,7 +764,7 @@ class NzoPage:
def nzo_files(self, info, nzo_id):
active = []
nzo = NzbQueue.do.get_nzo(nzo_id)
nzo = sabnzbd.NzbQueue.get_nzo(nzo_id)
if nzo:
pnfo = nzo.gather_info(full=True)
info["nzo_id"] = pnfo.nzo_id
@@ -831,15 +800,15 @@ class NzoPage:
script = kwargs.get("script", None)
cat = kwargs.get("cat", None)
priority = kwargs.get("priority", None)
nzo = NzbQueue.do.get_nzo(nzo_id)
nzo = sabnzbd.NzbQueue.get_nzo(nzo_id)
if index is not None:
NzbQueue.do.switch(nzo_id, index)
sabnzbd.NzbQueue.switch(nzo_id, index)
if name is not None:
NzbQueue.do.change_name(nzo_id, name, password)
sabnzbd.NzbQueue.change_name(nzo_id, name, password)
if cat is not None and nzo.cat is not cat and not (nzo.cat == "*" and cat == "Default"):
NzbQueue.do.change_cat(nzo_id, cat, priority)
sabnzbd.NzbQueue.change_cat(nzo_id, cat, priority)
# Category changed, so make sure "Default" attributes aren't set again
if script == "Default":
script = None
@@ -849,11 +818,11 @@ class NzoPage:
pp = None
if script is not None and nzo.script != script:
NzbQueue.do.change_script(nzo_id, script)
sabnzbd.NzbQueue.change_script(nzo_id, script)
if pp is not None and nzo.pp != pp:
NzbQueue.do.change_opts(nzo_id, pp)
sabnzbd.NzbQueue.change_opts(nzo_id, pp)
if priority is not None and nzo.priority != int(priority):
NzbQueue.do.set_priority(nzo_id, priority)
sabnzbd.NzbQueue.set_priority(nzo_id, priority)
raise Raiser(urllib.parse.urljoin(self.__root, "../queue/"))
@@ -862,7 +831,7 @@ class NzoPage:
if kwargs["action_key"] == "Delete":
for key in kwargs:
if kwargs[key] == "on":
NzbQueue.do.remove_nzf(nzo_id, key, force_delete=True)
sabnzbd.NzbQueue.remove_nzf(nzo_id, key, force_delete=True)
elif kwargs["action_key"] in ("Top", "Up", "Down", "Bottom"):
nzf_ids = []
@@ -871,15 +840,15 @@ class NzoPage:
nzf_ids.append(key)
size = int_conv(kwargs.get("action_size", 1))
if kwargs["action_key"] == "Top":
NzbQueue.do.move_top_bulk(nzo_id, nzf_ids)
sabnzbd.NzbQueue.move_top_bulk(nzo_id, nzf_ids)
elif kwargs["action_key"] == "Up":
NzbQueue.do.move_up_bulk(nzo_id, nzf_ids, size)
sabnzbd.NzbQueue.move_up_bulk(nzo_id, nzf_ids, size)
elif kwargs["action_key"] == "Down":
NzbQueue.do.move_down_bulk(nzo_id, nzf_ids, size)
sabnzbd.NzbQueue.move_down_bulk(nzo_id, nzf_ids, size)
elif kwargs["action_key"] == "Bottom":
NzbQueue.do.move_bottom_bulk(nzo_id, nzf_ids)
sabnzbd.NzbQueue.move_bottom_bulk(nzo_id, nzf_ids)
if NzbQueue.do.get_nzo(nzo_id):
if sabnzbd.NzbQueue.get_nzo(nzo_id):
url = urllib.parse.urljoin(self.__root, nzo_id)
else:
url = urllib.parse.urljoin(self.__root, "../queue")
@@ -910,12 +879,12 @@ class QueuePage:
uid = kwargs.get("uid")
del_files = int_conv(kwargs.get("del_files"))
if uid:
NzbQueue.do.remove(uid, add_to_history=False, delete_all_data=del_files)
sabnzbd.NzbQueue.remove(uid, delete_all_data=del_files)
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
def purge(self, **kwargs):
NzbQueue.do.remove_all(kwargs.get("search"))
sabnzbd.NzbQueue.remove_all(kwargs.get("search"))
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
@@ -932,7 +901,7 @@ class QueuePage:
uid1 = kwargs.get("uid1")
uid2 = kwargs.get("uid2")
if uid1 and uid2:
NzbQueue.do.switch(uid1, uid2)
sabnzbd.NzbQueue.switch(uid1, uid2)
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
@@ -940,7 +909,7 @@ class QueuePage:
nzo_id = kwargs.get("nzo_id")
pp = kwargs.get("pp", "")
if nzo_id and pp and pp.isdigit():
NzbQueue.do.change_opts(nzo_id, int(pp))
sabnzbd.NzbQueue.change_opts(nzo_id, int(pp))
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
@@ -950,7 +919,7 @@ class QueuePage:
if nzo_id and script:
if script == "None":
script = None
NzbQueue.do.change_script(nzo_id, script)
sabnzbd.NzbQueue.change_script(nzo_id, script)
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
@@ -960,7 +929,7 @@ class QueuePage:
if nzo_id and cat:
if cat == "None":
cat = None
NzbQueue.do.change_cat(nzo_id, cat)
sabnzbd.NzbQueue.change_cat(nzo_id, cat)
raise queueRaiser(self.__root, kwargs)
@@ -971,46 +940,46 @@ class QueuePage:
@secured_expose(check_api_key=True)
def pause(self, **kwargs):
scheduler.plan_resume(0)
Downloader.do.pause()
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.Downloader.pause()
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
def resume(self, **kwargs):
scheduler.plan_resume(0)
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.unpause_all()
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
def pause_nzo(self, **kwargs):
uid = kwargs.get("uid", "")
NzbQueue.do.pause_multiple_nzo(uid.split(","))
sabnzbd.NzbQueue.pause_multiple_nzo(uid.split(","))
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
def resume_nzo(self, **kwargs):
uid = kwargs.get("uid", "")
NzbQueue.do.resume_multiple_nzo(uid.split(","))
sabnzbd.NzbQueue.resume_multiple_nzo(uid.split(","))
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
def set_priority(self, **kwargs):
NzbQueue.do.set_priority(kwargs.get("nzo_id"), kwargs.get("priority"))
sabnzbd.NzbQueue.set_priority(kwargs.get("nzo_id"), kwargs.get("priority"))
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
def sort_by_avg_age(self, **kwargs):
NzbQueue.do.sort_queue("avg_age", kwargs.get("dir"))
sabnzbd.NzbQueue.sort_queue("avg_age", kwargs.get("dir"))
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
def sort_by_name(self, **kwargs):
NzbQueue.do.sort_queue("name", kwargs.get("dir"))
sabnzbd.NzbQueue.sort_queue("name", kwargs.get("dir"))
raise queueRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True)
def sort_by_size(self, **kwargs):
NzbQueue.do.sort_queue("size", kwargs.get("dir"))
sabnzbd.NzbQueue.sort_queue("size", kwargs.get("dir"))
raise queueRaiser(self.__root, kwargs)
@@ -1031,7 +1000,7 @@ class HistoryPage:
history["rating_enable"] = bool(cfg.rating_enable())
postfix = T("B") # : Abbreviation for bytes, as in GB
grand, month, week, day = BPSMeter.do.get_sums()
grand, month, week, day = sabnzbd.BPSMeter.get_sums()
history["total_size"], history["month_size"], history["week_size"], history["day_size"] = (
to_units(grand, postfix=postfix),
to_units(month, postfix=postfix),
@@ -1110,7 +1079,7 @@ class ConfigPage:
conf["have_unzip"] = bool(sabnzbd.newsunpack.ZIP_COMMAND)
conf["have_7zip"] = bool(sabnzbd.newsunpack.SEVEN_COMMAND)
conf["have_sabyenc"] = SABYENC_ENABLED
conf["have_sabyenc"] = sabnzbd.decoder.SABYENC_ENABLED
conf["have_mt_par2"] = sabnzbd.newsunpack.PAR2_MT
conf["certificate_validation"] = sabnzbd.CERTIFICATE_VALIDATION
@@ -1121,7 +1090,7 @@ class ConfigPage:
new[svr] = {}
conf["servers"] = new
conf["folders"] = NzbQueue.do.scan_jobs(all_jobs=False, action=False)
conf["folders"] = sabnzbd.NzbQueue.scan_jobs(all_jobs=False, action=False)
template = Template(
file=os.path.join(sabnzbd.WEB_DIR_CONFIG, "config.tmpl"),
@@ -1580,13 +1549,13 @@ class ConfigServer:
new = []
servers = config.get_servers()
server_names = sorted(
list(servers.keys()),
servers,
key=lambda svr: "%d%02d%s"
% (int(not servers[svr].enable()), servers[svr].priority(), servers[svr].displayname().lower()),
)
for svr in server_names:
new.append(servers[svr].get_dict(safe=True))
t, m, w, d, timeline = BPSMeter.do.amounts(svr)
t, m, w, d, timeline = sabnzbd.BPSMeter.amounts(svr)
if t:
new[-1]["amounts"] = to_units(t), to_units(m), to_units(w), to_units(d), timeline
conf["servers"] = new
@@ -1623,7 +1592,7 @@ class ConfigServer:
def clrServer(self, **kwargs):
server = kwargs.get("server")
if server:
BPSMeter.do.clear_server(server)
sabnzbd.BPSMeter.clear_server(server)
raise Raiser(self.__root)
@secured_expose(check_api_key=True, check_configlock=True)
@@ -1634,7 +1603,7 @@ class ConfigServer:
if svr:
svr.enable.set(not svr.enable())
config.save_config()
Downloader.do.update_server(server, server)
sabnzbd.Downloader.update_server(server, server)
raise Raiser(self.__root)
@@ -1712,7 +1681,7 @@ def handle_server(kwargs, root=None, new_svr=False):
config.ConfigServer(server, kwargs)
config.save_config()
Downloader.do.update_server(old_server, server)
sabnzbd.Downloader.update_server(old_server, server)
if root:
if ajax:
return sabnzbd.api.report("json")
@@ -1767,7 +1736,7 @@ class ConfigRss:
active_feed = kwargs.get("feed", "")
conf["active_feed"] = active_feed
conf["rss"] = rss
conf["rss_next"] = time.strftime(time_format("%H:%M"), time.localtime(sabnzbd.rss.next_run()))
conf["rss_next"] = time.strftime(time_format("%H:%M"), time.localtime(sabnzbd.RSSReader.next_run))
if active_feed:
readout = bool(self.__refresh_readout)
@@ -1777,7 +1746,7 @@ class ConfigRss:
self.__refresh_force = False
self.__refresh_ignore = False
if self.__evaluate:
msg = sabnzbd.rss.run_feed(
msg = sabnzbd.RSSReader.run_feed(
active_feed,
download=self.__refresh_download,
force=self.__refresh_force,
@@ -1788,7 +1757,7 @@ class ConfigRss:
msg = ""
self.__evaluate = False
if readout:
sabnzbd.rss.save()
sabnzbd.RSSReader.save()
self.__last_msg = msg
else:
msg = self.__last_msg
@@ -1819,7 +1788,7 @@ class ConfigRss:
""" Save changed RSS automatic readout rate """
cfg.rss_rate.set(kwargs.get("rss_rate"))
config.save_config()
scheduler.restart()
sabnzbd.Scheduler.restart()
raise rssRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True, check_configlock=True)
@@ -1891,7 +1860,7 @@ class ConfigRss:
config.ConfigRSS(feed, kwargs)
# Clear out any existing reference to this feed name
# Otherwise first-run detection can fail
sabnzbd.rss.clear_feed(feed)
sabnzbd.RSSReader.clear_feed(feed)
config.save_config()
self.__refresh_readout = feed
self.__refresh_download = False
@@ -1947,7 +1916,7 @@ class ConfigRss:
kwargs["section"] = "rss"
kwargs["keyword"] = kwargs.get("feed")
del_from_section(kwargs)
sabnzbd.rss.clear_feed(kwargs.get("feed"))
sabnzbd.RSSReader.clear_feed(kwargs.get("feed"))
raise Raiser(self.__root)
@secured_expose(check_api_key=True, check_configlock=True)
@@ -1983,7 +1952,7 @@ class ConfigRss:
@secured_expose(check_api_key=True, check_configlock=True)
def clean_rss_jobs(self, *args, **kwargs):
""" Remove processed RSS jobs from UI """
sabnzbd.rss.clear_downloaded(kwargs["feed"])
sabnzbd.RSSReader.clear_downloaded(kwargs["feed"])
self.__evaluate = True
raise rssRaiser(self.__root, kwargs)
@@ -2018,7 +1987,7 @@ class ConfigRss:
feed = kwargs.get("feed")
url = kwargs.get("url")
nzbname = kwargs.get("nzbname")
att = sabnzbd.rss.lookup_url(feed, url)
att = sabnzbd.RSSReader.lookup_url(feed, url)
if att:
pp = att.get("pp")
cat = att.get("cat")
@@ -2028,13 +1997,13 @@ class ConfigRss:
if url:
sabnzbd.add_url(url, pp, script, cat, prio, nzbname)
# Need to pass the title instead
sabnzbd.rss.flag_downloaded(feed, url)
sabnzbd.RSSReader.flag_downloaded(feed, url)
raise rssRaiser(self.__root, kwargs)
@secured_expose(check_api_key=True, check_configlock=True)
def rss_now(self, *args, **kwargs):
""" Run an automatic RSS run now """
scheduler.force_rss()
sabnzbd.Scheduler.force_rss()
raise rssRaiser(self.__root, kwargs)
@@ -2113,7 +2082,7 @@ class ConfigScheduling:
snum = 1
conf["schedlines"] = []
conf["taskinfo"] = []
for ev in scheduler.sort_schedules(all_events=False):
for ev in sabnzbd.scheduler.sort_schedules(all_events=False):
line = ev[3]
conf["schedlines"].append(line)
try:
@@ -2229,7 +2198,7 @@ class ConfigScheduling:
cfg.schedules.set(sched)
config.save_config()
scheduler.restart(force=True)
sabnzbd.Scheduler.restart()
raise Raiser(self.__root)
@secured_expose(check_api_key=True, check_configlock=True)
@@ -2240,7 +2209,7 @@ class ConfigScheduling:
schedules.remove(line)
cfg.schedules.set(schedules)
config.save_config()
scheduler.restart(force=True)
sabnzbd.Scheduler.restart()
raise Raiser(self.__root)
@secured_expose(check_api_key=True, check_configlock=True)
@@ -2257,7 +2226,7 @@ class ConfigScheduling:
break
cfg.schedules.set(schedules)
config.save_config()
scheduler.restart(force=True)
sabnzbd.Scheduler.restart()
raise Raiser(self.__root)
@@ -2417,12 +2386,12 @@ class Status:
@secured_expose(check_api_key=True)
def reset_quota(self, **kwargs):
BPSMeter.do.reset_quota(force=True)
sabnzbd.BPSMeter.reset_quota(force=True)
raise Raiser(self.__root)
@secured_expose(check_api_key=True)
def disconnect(self, **kwargs):
Downloader.do.disconnect()
sabnzbd.Downloader.disconnect()
raise Raiser(self.__root)
@secured_expose(check_api_key=True)
@@ -2484,7 +2453,7 @@ class Status:
@secured_expose(check_api_key=True)
def unblock_server(self, **kwargs):
Downloader.do.unblock(kwargs.get("server"))
sabnzbd.Downloader.unblock(kwargs.get("server"))
# Short sleep so that UI shows new server status
time.sleep(1.0)
raise Raiser(self.__root)
@@ -2547,7 +2516,7 @@ def orphan_delete(kwargs):
def orphan_delete_all():
paths = NzbQueue.do.scan_jobs(all_jobs=False, action=False)
paths = sabnzbd.NzbQueue.scan_jobs(all_jobs=False, action=False)
for path in paths:
kwargs = {"name": path}
orphan_delete(kwargs)
@@ -2558,11 +2527,11 @@ def orphan_add(kwargs):
if path:
path = os.path.join(long_path(cfg.download_dir.get_path()), path)
logging.info("Re-adding orphaned job %s", path)
NzbQueue.do.repair_job(path, None, None)
sabnzbd.NzbQueue.repair_job(path, None, None)
def orphan_add_all():
paths = NzbQueue.do.scan_jobs(all_jobs=False, action=False)
paths = sabnzbd.NzbQueue.scan_jobs(all_jobs=False, action=False)
for path in paths:
kwargs = {"name": path}
orphan_add(kwargs)
@@ -2663,7 +2632,7 @@ def GetRssLog(feed):
return job
jobs = list(sabnzbd.rss.show_result(feed).values())
jobs = sabnzbd.RSSReader.show_result(feed).values()
good, bad, done = ([], [], [])
for job in jobs:
if job["status"][0] == "G":


@@ -30,19 +30,21 @@ import time
import datetime
import inspect
import ctypes
import ipaddress
from typing import Union, Tuple, Any, Optional, List
import sabnzbd
from sabnzbd.constants import DEFAULT_PRIORITY, MEBI, DEF_ARTICLE_CACHE_DEFAULT, DEF_ARTICLE_CACHE_MAX
import sabnzbd.config as config
import sabnzbd.cfg as cfg
from sabnzbd.encoding import ubtou, platform_btou
from sabnzbd.filesystem import get_ext, userxbit
from sabnzbd.filesystem import userxbit
TAB_UNITS = ("", "K", "M", "G", "T", "P")
RE_UNITS = re.compile(r"(\d+\.*\d*)\s*([KMGTP]{0,1})", re.I)
RE_UNITS = re.compile(r"(\d+\.*\d*)\s*([KMGTP]?)", re.I)
RE_VERSION = re.compile(r"(\d+)\.(\d+)\.(\d+)([a-zA-Z]*)(\d*)")
RE_IP4 = re.compile(r"inet\s+(addr:\s*){0,1}(\d+\.\d+\.\d+\.\d+)")
RE_IP6 = re.compile(r"inet6\s+(addr:\s*){0,1}([0-9a-f:]+)", re.I)
RE_IP4 = re.compile(r"inet\s+(addr:\s*)?(\d+\.\d+\.\d+\.\d+)")
RE_IP6 = re.compile(r"inet6\s+(addr:\s*)?([0-9a-f:]+)", re.I)
# Check if strings are defined for AM and PM
HAVE_AMPM = bool(time.strftime("%p", time.localtime()))
@@ -71,7 +73,7 @@ def time_format(fmt):
return fmt
def calc_age(date, trans=False):
def calc_age(date: datetime.datetime, trans=False) -> str:
"""Calculate the age difference between now and date.
Value is returned as either days, hours, or minutes.
When 'trans' is True, time symbols will be translated.
@@ -105,16 +107,7 @@ def calc_age(date, trans=False):
return age
def monthrange(start, finish):
""" Calculate months between 2 dates, used in the Config template """
months = (finish.year - start.year) * 12 + finish.month + 1
for i in range(start.month, months):
year = (i - 1) / 12 + start.year
month = (i - 1) % 12 + 1
yield datetime.date(int(year), int(month), 1)
def safe_lower(txt):
def safe_lower(txt: Any) -> str:
""" Return lowercased string. Return '' for None """
if txt:
return txt.lower()
@@ -146,20 +139,19 @@ def name_to_cat(fname, cat=None):
return fname, cat
def cat_to_opts(cat, pp=None, script=None, priority=None):
def cat_to_opts(cat, pp=None, script=None, priority=None) -> Tuple[str, int, str, int]:
"""Derive options from category, if options not already defined.
Specified options have priority over category-options.
If no valid category is given, special category '*' will supply default values
"""
def_cat = config.get_categories("*")
def_cat = config.get_category()
cat = safe_lower(cat)
if cat in ("", "none", "default"):
cat = "*"
try:
my_cat = config.get_categories()[cat]
except KeyError:
my_cat = config.get_category(cat)
# Ignore the input category if we don't know it
if my_cat == def_cat:
cat = "*"
my_cat = def_cat
if pp is None:
pp = my_cat.pp()
@@ -176,11 +168,11 @@ def cat_to_opts(cat, pp=None, script=None, priority=None):
if priority == DEFAULT_PRIORITY:
priority = def_cat.priority()
logging.debug("Cat->Attrib cat=%s pp=%s script=%s prio=%s", cat, pp, script, priority)
logging.debug("Parsing category %s to attributes: pp=%s script=%s prio=%s", cat, pp, script, priority)
return cat, pp, script, priority
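The resolution order above (explicit option beats category default, unknown category falls back to the `*` default) can be sketched standalone. The `DEFAULT_CATS` dict below is a hypothetical stand-in for SABnzbd's `config.get_category()` objects, used only for illustration:

```python
from typing import Optional, Tuple

# Hypothetical stand-in for SABnzbd's category configuration
DEFAULT_CATS = {
    "*": {"pp": 3, "script": "None", "priority": 0},
    "movies": {"pp": 2, "script": "rename.py", "priority": 1},
}

def cat_to_opts(cat: Optional[str], pp=None, script=None, priority=None) -> Tuple[str, int, str, int]:
    cat = (cat or "").lower()
    if cat in ("", "none", "default") or cat not in DEFAULT_CATS:
        # Unknown or unset categories fall back to the '*' default
        cat = "*"
    my_cat = DEFAULT_CATS[cat]
    # Explicitly supplied options win over category defaults
    if pp is None:
        pp = my_cat["pp"]
    if script is None:
        script = my_cat["script"]
    if priority is None:
        priority = my_cat["priority"]
    return cat, pp, script, priority
```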
def pp_to_opts(pp):
def pp_to_opts(pp: int) -> Tuple[bool, bool, bool]:
""" Convert numeric processing options to (repair, unpack, delete) """
# Convert the pp to an int
pp = sabnzbd.interface.int_conv(pp)
@@ -193,10 +185,8 @@ def pp_to_opts(pp):
return True, True, True
def opts_to_pp(repair, unpack, delete):
def opts_to_pp(repair: bool, unpack: bool, delete: bool) -> int:
""" Convert (repair, unpack, delete) to numeric process options """
if repair is None:
return None
pp = 0
if repair:
pp = 1
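The numeric post-processing levels map onto a cumulative `(repair, unpack, delete)` tuple, so the two converters above are inverses of each other. A minimal sketch of that round trip (illustrative, not the SABnzbd originals, which also handle `None` input):

```python
from typing import Tuple

def pp_to_opts(pp: int) -> Tuple[bool, bool, bool]:
    # 0 = download only, 1 = +repair, 2 = +unpack, 3 = +delete
    return pp >= 1, pp >= 2, pp >= 3

def opts_to_pp(repair: bool, unpack: bool, delete: bool) -> int:
    # Highest enabled stage wins, mirroring the cascade above
    pp = 0
    if repair:
        pp = 1
    if unpack:
        pp = 2
    if delete:
        pp = 3
    return pp
```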
@@ -345,7 +335,7 @@ def set_serv_parms(service, args):
return True
def get_from_url(url):
def get_from_url(url: str) -> Optional[str]:
""" Retrieve URL and return content """
try:
req = urllib.request.Request(url)
@@ -479,7 +469,7 @@ def upload_file_to_sabnzbd(url, fp):
logging.info("Traceback: ", exc_info=True)
def from_units(val):
def from_units(val: str) -> float:
""" Convert K/M/G/T/P notation to float """
val = str(val).strip().upper()
if val == "-1":
@@ -503,7 +493,7 @@ def from_units(val):
return 0.0
def to_units(val, postfix=""):
def to_units(val: Union[int, float], postfix="") -> str:
"""Convert number to K/M/G/T/P notation
Show single decimal for M and higher
"""
@@ -749,7 +739,7 @@ def format_time_string(seconds):
return " ".join(completestr)
def int_conv(value):
def int_conv(value: Any) -> int:
""" Safe conversion to int (can handle None) """
try:
value = int(value)
@@ -845,19 +835,26 @@ def find_on_path(targets):
def probablyipv4(ip):
if ip.count(".") == 3 and re.sub("[0123456789.]", "", ip) == "":
return True
else:
try:
return ipaddress.ip_address(ip).version == 4
except:
return False
def probablyipv6(ip):
# Returns True if the given input is probably an IPv6 address
# Square Brackets like '[2001::1]' are OK
if ip.count(":") >= 2 and re.sub(r"[0123456789abcdefABCDEF:\[\]]", "", ip) == "":
return True
else:
return False
try:
# Check for plain IPv6 address
return ipaddress.ip_address(ip).version == 6
except:
try:
# Remove '[' and ']' and test again:
ip = re.search(r"^\[(.*)\]$", ip).group(1)
return ipaddress.ip_address(ip).version == 6
except:
# No, not an IPv6 address
return False
def ip_extract():
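The hunk above replaces hand-rolled character checks with the stdlib `ipaddress` module, keeping the bracket-stripping special case for IPv6. A minimal sketch of the new approach:

```python
import ipaddress
import re

def probablyipv4(ip: str) -> bool:
    try:
        return ipaddress.ip_address(ip).version == 4
    except ValueError:
        return False

def probablyipv6(ip: str) -> bool:
    # Accept bracketed forms like '[2001::1]' by stripping the brackets first
    match = re.search(r"^\[(.*)\]$", ip)
    if match:
        ip = match.group(1)
    try:
        return ipaddress.ip_address(ip).version == 6
    except ValueError:
        return False
```

Using `ipaddress` rejects strings such as `300.1.1.1` that the old regex-based check would have accepted.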
@@ -927,7 +924,7 @@ def nntp_to_msg(text):
return ubtou(lines[0])
def build_and_run_command(command, flatten_command=False, **kwargs):
def build_and_run_command(command: List[str], flatten_command=False, **kwargs):
"""Builds and then runs command with necessary flags and optional
IONice and Nice commands. Optional Popen arguments can be supplied.
On Windows we need to run our own list2cmdline for Unrar.
@@ -988,9 +985,9 @@ def build_and_run_command(command, flatten_command=False, **kwargs):
return subprocess.Popen(command, **popen_kwargs)
def run_command(cmd):
def run_command(cmd: List[str], **kwargs):
""" Run simple external command and return output as a string. """
with build_and_run_command(cmd) as p:
with build_and_run_command(cmd, **kwargs) as p:
txt = platform_btou(p.stdout.read())
p.wait()
return txt


@@ -28,6 +28,7 @@ import time
import zlib
import shutil
import functools
from typing import List
import sabnzbd
from sabnzbd.encoding import platform_btou, correct_unknown_encoding, ubtou
@@ -55,14 +56,15 @@ from sabnzbd.filesystem import (
setname_from_path,
get_ext,
get_filename,
analyze_rar_filename,
)
from sabnzbd.nzbstuff import NzbObject, NzbFile
from sabnzbd.sorting import SeriesSorter
import sabnzbd.cfg as cfg
from sabnzbd.constants import Status
# Regex globals
RAR_RE = re.compile(r"\.(?P<ext>part\d*\.rar|rar|r\d\d|s\d\d|t\d\d|u\d\d|v\d\d|\d\d\d?\d)$", re.I)
RAR_RE_V3 = re.compile(r"\.(?P<ext>part\d*)$", re.I)
LOADING_RE = re.compile(r'^Loading "(.+)"')
TARGET_RE = re.compile(r'^(?:File|Target): "(.+)" -')
@@ -167,7 +169,7 @@ ENV_NZO_FIELDS = [
]
def external_processing(extern_proc, nzo, complete_dir, nicename, status):
def external_processing(extern_proc, nzo: NzbObject, complete_dir, nicename, status):
""" Run a user postproc script, return console output and exit value """
failure_url = nzo.nzo_info.get("failure", "")
# Items can be bool or null, causing POpen to fail
@@ -184,7 +186,7 @@ def external_processing(extern_proc, nzo, complete_dir, nicename, status):
]
# Add path to original NZB
nzb_paths = globber_full(nzo.workpath, "*.gz")
nzb_paths = globber_full(nzo.admin_path, "*.gz")
# Fields not in the NZO directly
extra_env_fields = {
@@ -199,6 +201,7 @@ def external_processing(extern_proc, nzo, complete_dir, nicename, status):
try:
p = build_and_run_command(command, env=create_env(nzo, extra_env_fields))
sabnzbd.PostProcessor.external_process = p
# Follow the output, so we can abort it
proc = p.stdout
@@ -215,14 +218,6 @@ def external_processing(extern_proc, nzo, complete_dir, nicename, status):
# Show current line in history
nzo.set_action_line(T("Running script"), line)
# Check if we should still continue
if not nzo.pp_active:
p.kill()
lines.append(T("PostProcessing was aborted (%s)") % T("Script"))
# Print at least what we got
output = "\n".join(lines)
return output, 1
except:
logging.debug("Failed script %s, Traceback: ", extern_proc, exc_info=True)
return "Cannot run script %s\r\n" % extern_proc, -1
@@ -232,7 +227,9 @@ def external_processing(extern_proc, nzo, complete_dir, nicename, status):
return output, ret
def unpack_magic(nzo, workdir, workdir_complete, dele, one_folder, joinables, zips, rars, sevens, ts, depth=0):
def unpack_magic(
nzo: NzbObject, workdir, workdir_complete, dele, one_folder, joinables, zips, rars, sevens, ts, depth=0
):
""" Do a recursive unpack from all archives in 'workdir' to 'workdir_complete' """
if depth > 5:
logging.warning(T("Unpack nesting too deep [%s]"), nzo.final_name)
@@ -380,7 +377,7 @@ def get_seq_number(name):
return 0
def file_join(nzo, workdir, workdir_complete, delete, joinables):
def file_join(nzo: NzbObject, workdir, workdir_complete, delete, joinables):
"""Join any joinable files in 'workdir' to 'workdir_complete' and
when successful, delete originals
"""
@@ -471,28 +468,71 @@ def file_join(nzo, workdir, workdir_complete, delete, joinables):
##############################################################################
# (Un)Rar Functions
##############################################################################
def rar_unpack(nzo, workdir, workdir_complete, delete, one_folder, rars):
def rar_unpack(nzo: NzbObject, workdir: str, workdir_complete: str, delete: bool, one_folder: bool, rars: List[str]):
"""Unpack multiple sets 'rars' of RAR files from 'workdir' to 'workdir_complete'.
When 'delete' is set, originals will be deleted.
When 'one_folder' is set, all files will be in a single folder
"""
fail = False
newfiles = extracted_files = []
# Is the direct-unpacker still running? We wait for it
if nzo.direct_unpacker:
wait_count = 0
last_stats = nzo.direct_unpacker.get_formatted_stats()
while nzo.direct_unpacker.is_alive():
logging.debug("DirectUnpacker still alive for %s: %s", nzo.final_name, last_stats)
# Bump the file-lock in case it's stuck
with nzo.direct_unpacker.next_file_lock:
nzo.direct_unpacker.next_file_lock.notify()
time.sleep(2)
# Did something change? Might be stuck
if last_stats == nzo.direct_unpacker.get_formatted_stats():
wait_count += 1
if wait_count > 60:
# We abort after 2 minutes of no changes
nzo.direct_unpacker.abort()
else:
wait_count = 0
last_stats = nzo.direct_unpacker.get_formatted_stats()
# Process everything already extracted by Direct Unpack
for rar_set in nzo.direct_unpacker.success_sets:
logging.info("Set %s completed by DirectUnpack", rar_set)
unpacked_rars, newfiles = nzo.direct_unpacker.success_sets[rar_set]
logging.debug("Rars: %s", unpacked_rars)
logging.debug("Newfiles: %s", newfiles)
extracted_files.extend(newfiles)
# Remove all source files from the list and from the disk (if requested)
# so they don't get parsed by the regular unpack
for rar in unpacked_rars:
if rar in rars:
rars.remove(rar)
if delete:
remove_file(rar)
# Clear all sets
nzo.direct_unpacker.success_sets = []
# See which sets are left
rar_sets = {}
for rar in rars:
rar_set = setname_from_path(rar)
if RAR_RE_V3.search(rar_set):
# Remove the ".partXX" part
rar_set = os.path.splitext(rar_set)[0]
# Skip any files that were already removed
if not os.path.exists(rar):
continue
rar_set, _ = analyze_rar_filename(rar)
if rar_set not in rar_sets:
rar_sets[rar_set] = []
rar_sets[rar_set].append(rar)
logging.debug("Rar_sets: %s", rar_sets)
logging.debug("Remaining rar sets: %s", rar_sets)
for rar_set in rar_sets:
# Run the RAR extractor
rar_sets[rar_set].sort(key=functools.cmp_to_key(rar_sort))
rarpath = rar_sets[rar_set][0]
if workdir_complete and rarpath.startswith(workdir):
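The DirectUnpacker wait loop moved above follows a generic stall-watchdog pattern: poll a background worker, and abort it once its progress stats stop changing for too many consecutive polls. A standalone sketch of that pattern, where `get_stats` and `abort` are assumed callables supplied by the caller (not SABnzbd APIs):

```python
import threading
import time

def wait_for_worker(worker: threading.Thread, get_stats, abort, poll: float = 2.0, max_stalled: int = 60):
    """Wait for 'worker' to finish, aborting it after 'max_stalled'
    consecutive polls with unchanged stats (~2 minutes at the defaults)."""
    stalled = 0
    last_stats = get_stats()
    while worker.is_alive():
        time.sleep(poll)
        current = get_stats()
        if current == last_stats:
            stalled += 1
            if stalled > max_stalled:
                # No progress for too long: ask the worker to stop
                abort()
        else:
            stalled = 0
            last_stats = current
```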
@@ -500,67 +540,31 @@ def rar_unpack(nzo, workdir, workdir_complete, delete, one_folder, rars):
else:
extraction_path = os.path.split(rarpath)[0]
# Is the direct-unpacker still running? We wait for it
if nzo.direct_unpacker:
wait_count = 0
last_stats = nzo.direct_unpacker.get_formatted_stats()
while nzo.direct_unpacker.is_alive():
logging.debug("DirectUnpacker still alive for %s: %s", nzo.final_name, last_stats)
logging.info("Extracting rarfile %s (belonging to %s) to %s", rarpath, rar_set, extraction_path)
try:
fail, newfiles, rars = rar_extract(
rarpath, len(rar_sets[rar_set]), one_folder, nzo, rar_set, extraction_path
)
except:
fail = True
msg = sys.exc_info()[1]
nzo.fail_msg = T("Unpacking failed, %s") % msg
setname = nzo.final_name
nzo.set_unpack_info("Unpack", T('[%s] Error "%s" while unpacking RAR files') % (setname, msg))
# Bump the file-lock in case it's stuck
with nzo.direct_unpacker.next_file_lock:
nzo.direct_unpacker.next_file_lock.notify()
time.sleep(2)
logging.error(T('Error "%s" while running rar_unpack on %s'), msg, setname)
logging.debug("Traceback: ", exc_info=True)
# Did something change? Might be stuck
if last_stats == nzo.direct_unpacker.get_formatted_stats():
wait_count += 1
if wait_count > 60:
# We abort after 2 minutes of no changes
nzo.direct_unpacker.abort()
else:
wait_count = 0
last_stats = nzo.direct_unpacker.get_formatted_stats()
# Did we already direct-unpack it? Not when recursive-unpacking
if nzo.direct_unpacker and rar_set in nzo.direct_unpacker.success_sets:
logging.info("Set %s completed by DirectUnpack", rar_set)
fail = False
success = True
rars, newfiles = nzo.direct_unpacker.success_sets.pop(rar_set)
else:
logging.info("Extracting rarfile %s (belonging to %s) to %s", rarpath, rar_set, extraction_path)
try:
fail, newfiles, rars = rar_extract(
rarpath, len(rar_sets[rar_set]), one_folder, nzo, rar_set, extraction_path
)
# Was it aborted?
if not nzo.pp_active:
fail = True
break
success = not fail
except:
success = False
fail = True
msg = sys.exc_info()[1]
nzo.fail_msg = T("Unpacking failed, %s") % msg
setname = nzo.final_name
nzo.set_unpack_info("Unpack", T('[%s] Error "%s" while unpacking RAR files') % (setname, msg))
logging.error(T('Error "%s" while running rar_unpack on %s'), msg, setname)
logging.debug("Traceback: ", exc_info=True)
if success:
logging.debug("rar_unpack(): Rars: %s", rars)
logging.debug("rar_unpack(): Newfiles: %s", newfiles)
if not fail:
logging.debug("Rars: %s", rars)
logging.debug("Newfiles: %s", newfiles)
extracted_files.extend(newfiles)
# Do not fail if this was a recursive unpack
if fail and rarpath.startswith(workdir_complete):
# Do not delete the files, leave it to user!
logging.info("Ignoring failure to do recursive unpack of %s", rarpath)
fail = 0
success = True
fail = False
newfiles = []
# Do not fail if this was maybe just some duplicate fileset
@@ -568,12 +572,11 @@ def rar_unpack(nzo, workdir, workdir_complete, delete, one_folder, rars):
if fail and rar_set.endswith((".1", ".2")):
# Just in case, we leave the raw files
logging.info("Ignoring failure of unpack for possible duplicate file %s", rarpath)
fail = 0
success = True
fail = False
newfiles = []
# Delete the old files if we have to
if success and delete and newfiles:
if not fail and delete and newfiles:
for rar in rars:
try:
remove_file(rar)
@@ -594,7 +597,7 @@ def rar_unpack(nzo, workdir, workdir_complete, delete, one_folder, rars):
return fail, extracted_files
def rar_extract(rarfile_path, numrars, one_folder, nzo, setname, extraction_path):
def rar_extract(rarfile_path, numrars, one_folder, nzo: NzbObject, setname, extraction_path):
"""Unpack single rar set 'rarfile' to 'extraction_path',
with password tries
Return fail==0(ok)/fail==1(error)/fail==2(wrong password), new_files, rars
@@ -621,7 +624,7 @@ def rar_extract(rarfile_path, numrars, one_folder, nzo, setname, extraction_path
return fail, new_files, rars
def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction_path, password):
def rar_extract_core(rarfile_path, numrars, one_folder, nzo: NzbObject, setname, extraction_path, password):
"""Unpack single rar set 'rarfile_path' to 'extraction_path'
Return fail==0(ok)/fail==1(error)/fail==2(wrong password)/fail==3(crc-error), new_files, rars
"""
@@ -648,22 +651,25 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
rename = "-or" # Auto renaming
if sabnzbd.WIN32:
# For Unrar to support long-path, we need to cricumvent Python's list2cmdline
# For Unrar to support long-path, we need to circumvent Python's list2cmdline
# See: https://github.com/sabnzbd/sabnzbd/issues/1043
# The -scf forces the output to be UTF8
command = [
"%s" % RAR_COMMAND,
action,
"-idp",
"-scf",
overwrite,
rename,
"-ai",
password_command,
"%s" % clip_path(rarfile_path),
rarfile_path,
"%s\\" % long_path(extraction_path),
]
elif RAR_PROBLEM:
# Use only oldest options (specifically no "-or")
# Use only oldest options, specifically no "-or" or "-scf"
# Luckily platform_btou has a fallback for non-UTF-8
command = [
"%s" % RAR_COMMAND,
action,
@@ -675,10 +681,12 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
]
else:
# Don't use "-ai" (not needed for non-Windows)
# The -scf forces the output to be UTF8
command = [
"%s" % RAR_COMMAND,
action,
"-idp",
"-scf",
overwrite,
rename,
password_command,
@@ -692,6 +700,7 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
# Get list of all the volumes part of this set
logging.debug("Analyzing rar file ... %s found", rarfile.is_rarfile(rarfile_path))
p = build_and_run_command(command, flatten_command=True)
sabnzbd.PostProcessor.external_process = p
proc = p.stdout
if p.stdin:
@@ -712,15 +721,6 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
if not line:
break
# Check if we should still continue
if not nzo.pp_active:
p.kill()
msg = T("PostProcessing was aborted (%s)") % T("Unpack")
nzo.fail_msg = msg
nzo.set_unpack_info("Unpack", msg, setname)
nzo.status = Status.FAILED
return fail, (), ()
line = line.strip()
lines.append(line)
@@ -856,7 +856,7 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
##############################################################################
# (Un)Zip Functions
##############################################################################
def unzip(nzo, workdir, workdir_complete, delete, one_folder, zips):
def unzip(nzo: NzbObject, workdir, workdir_complete, delete, one_folder, zips):
"""Unpack multiple sets 'zips' of ZIP files from 'workdir' to 'workdir_complete'.
When 'delete' is set, originals will be deleted.
"""
@@ -934,7 +934,7 @@ def ZIP_Extract(zipfile, extraction_path, one_folder):
##############################################################################
# 7Zip Functions
##############################################################################
def unseven(nzo, workdir, workdir_complete, delete, one_folder, sevens):
def unseven(nzo: NzbObject, workdir, workdir_complete, delete, one_folder, sevens):
"""Unpack multiple sets '7z' of 7Zip files from 'workdir' to 'workdir_complete'.
When 'delete' is set, originals will be deleted.
"""
@@ -982,7 +982,7 @@ def unseven(nzo, workdir, workdir_complete, delete, one_folder, sevens):
return unseven_failed, new_files
def seven_extract(nzo, sevenset, extensions, extraction_path, one_folder, delete):
def seven_extract(nzo: NzbObject, sevenset, extensions, extraction_path, one_folder, delete):
"""Unpack single set 'sevenset' to 'extraction_path', with password tries
Return fail==0(ok)/fail==1(error)/fail==2(wrong password), new_files, sevens
"""
@@ -1051,6 +1051,7 @@ def seven_extract_core(sevenset, extensions, extraction_path, one_folder, delete
command = [SEVEN_COMMAND, method, "-y", overwrite, parm, case, password, "-o%s" % extraction_path, name]
p = build_and_run_command(command)
sabnzbd.PostProcessor.external_process = p
output = platform_btou(p.stdout.read())
logging.debug("7za output: %s", output)
@@ -1089,7 +1090,7 @@ def seven_extract_core(sevenset, extensions, extraction_path, one_folder, delete
##############################################################################
# PAR2 Functions
##############################################################################
def par2_repair(parfile_nzf, nzo, workdir, setname, single):
def par2_repair(parfile_nzf: NzbFile, nzo: NzbObject, workdir, setname, single):
""" Try to repair a set, return readd or correctness """
# Check if file exists, otherwise see if another is done
parfile_path = os.path.join(workdir, parfile_nzf.filename)
@@ -1215,7 +1216,7 @@ _RE_LOADING_PAR2 = re.compile(r'Loading "([^"]+)"\.')
_RE_LOADED_PAR2 = re.compile(r"Loaded (\d+) new packets")
def PAR_Verify(parfile, nzo, setname, joinables, single=False):
def PAR_Verify(parfile, nzo: NzbObject, setname, joinables, single=False):
""" Run par2 on par-set """
used_joinables = []
used_for_repair = []
@@ -1262,6 +1263,7 @@ def PAR_Verify(parfile, nzo, setname, joinables, single=False):
# Run the external command
p = build_and_run_command(command)
sabnzbd.PostProcessor.external_process = p
proc = p.stdout
if p.stdin:
@@ -1297,16 +1299,6 @@ def PAR_Verify(parfile, nzo, setname, joinables, single=False):
line = linebuf.strip()
linebuf = ""
# Check if we should still continue
if not nzo.pp_active:
p.kill()
msg = T("PostProcessing was aborted (%s)") % T("Repair")
nzo.fail_msg = msg
nzo.set_unpack_info("Repair", msg, setname)
nzo.status = Status.FAILED
readd = False
break
# Skip empty lines
if line == "":
continue
@@ -1350,7 +1342,7 @@ def PAR_Verify(parfile, nzo, setname, joinables, single=False):
block_table[nzf.blocks] = nzf
if block_table:
nzf = block_table[min(block_table.keys())]
nzf = block_table[min(block_table)]
logging.info("Found new par2file %s", nzf.filename)
# Move from extrapar list to files to be downloaded
@@ -1536,7 +1528,7 @@ def PAR_Verify(parfile, nzo, setname, joinables, single=False):
_RE_FILENAME = re.compile(r'"([^"]+)"')
def MultiPar_Verify(parfile, nzo, setname, joinables, single=False):
def MultiPar_Verify(parfile, nzo: NzbObject, setname, joinables, single=False):
""" Run par2 on par-set """
parfolder = os.path.split(parfile)[0]
used_joinables = []
@@ -1546,9 +1538,10 @@ def MultiPar_Verify(parfile, nzo, setname, joinables, single=False):
nzo.status = Status.VERIFYING
start = time.time()
# Caching of verification implemented by adding:
# Caching of verification implemented by adding -vs/-vd
# Force output of utf-8 by adding -uo
# But not really required due to prospective-par2
command = [str(MULTIPAR_COMMAND), "r", "-vs2", "-vd%s" % parfolder, parfile]
command = [str(MULTIPAR_COMMAND), "r", "-uo", "-vs2", "-vd%s" % parfolder, parfile]
# Check if there are maybe par2cmdline/par2tbb commands supplied
if "-t" in cfg.par_option() or "-p" in cfg.par_option():
@@ -1573,6 +1566,7 @@ def MultiPar_Verify(parfile, nzo, setname, joinables, single=False):
# Run MultiPar
p = build_and_run_command(command)
sabnzbd.PostProcessor.external_process = p
proc = p.stdout
if p.stdin:
p.stdin.close()
@@ -1583,7 +1577,7 @@ def MultiPar_Verify(parfile, nzo, setname, joinables, single=False):
renames = {}
reconstructed = []
linebuf = ""
linebuf = b""
finished = 0
readd = False
@@ -1599,27 +1593,17 @@ def MultiPar_Verify(parfile, nzo, setname, joinables, single=False):
# Loop over the output, whee
while 1:
char = platform_btou(proc.read(1))
char = proc.read(1)
if not char:
break
# Line not complete yet
if char not in ("\n", "\r"):
if char not in (b"\n", b"\r"):
linebuf += char
continue
line = linebuf.strip()
linebuf = ""
# Check if we should still continue
if not nzo.pp_active:
p.kill()
msg = T("PostProcessing was aborted (%s)") % T("Repair")
nzo.fail_msg = msg
nzo.set_unpack_info("Repair", msg, setname)
nzo.status = Status.FAILED
readd = False
break
line = ubtou(linebuf).strip()
linebuf = b""
# Skip empty lines
if line == "":
@@ -1650,7 +1634,7 @@ def MultiPar_Verify(parfile, nzo, setname, joinables, single=False):
block_table[nzf.blocks] = nzf
if block_table:
nzf = block_table[min(block_table.keys())]
nzf = block_table[min(block_table)]
logging.info("Found new par2file %s", nzf.filename)
# Move from extrapar list to files to be downloaded
@@ -1917,7 +1901,7 @@ def MultiPar_Verify(parfile, nzo, setname, joinables, single=False):
def create_env(nzo=None, extra_env_fields={}):
"""Modify the environment for pp-scripts with extra information
OSX: Return copy of environment without PYTHONPATH and PYTHONHOME
macOS: Return copy of environment without PYTHONPATH and PYTHONHOME
other: return None
"""
env = os.environ.copy()
@@ -2104,7 +2088,7 @@ def quick_check_set(set, nzo):
if nzf.md5sum == md5pack[file]:
try:
logging.debug("Quick-check will rename %s to %s", nzf.filename, file)
renamer(os.path.join(nzo.downpath, nzf.filename), os.path.join(nzo.downpath, file))
renamer(os.path.join(nzo.download_path, nzf.filename), os.path.join(nzo.download_path, file))
renames[file] = nzf.filename
nzf.filename = file
result &= True
@@ -2207,7 +2191,7 @@ def is_sfv_file(myfile):
return sfv_info_line_counter >= 1
def sfv_check(sfvs, nzo, workdir):
def sfv_check(sfvs, nzo: NzbObject, workdir):
""" Verify files using SFV files """
# Update status
nzo.status = Status.VERIFYING
@@ -2264,7 +2248,7 @@ def sfv_check(sfvs, nzo, workdir):
if nzf.filename in calculated_crc32 and calculated_crc32[nzf.filename] == sfv_parse_results[file]:
try:
logging.debug("SFV-check will rename %s to %s", nzf.filename, file)
renamer(os.path.join(nzo.downpath, nzf.filename), os.path.join(nzo.downpath, file))
renamer(os.path.join(nzo.download_path, nzf.filename), os.path.join(nzo.download_path, file))
renames[file] = nzf.filename
nzf.filename = file
result &= True
@@ -2330,7 +2314,7 @@ def analyse_show(name):
return show_name, info.get("season_num", ""), info.get("episode_num", ""), info.get("ep_name", "")
def pre_queue(nzo, pp, cat):
def pre_queue(nzo: NzbObject, pp, cat):
"""Run pre-queue script (if any) and process results.
pp and cat are supplied separately since they can change.
"""
@@ -2437,10 +2421,7 @@ class SevenZip:
""" Read named file from 7Zip and return data """
command = [SEVEN_COMMAND, "e", "-p", "-y", "-so", self.path, name]
# Ignore diagnostic output, otherwise it will be appended to content
p = build_and_run_command(command, stderr=subprocess.DEVNULL)
output = platform_btou(p.stdout.read())
p.wait()
return output
return run_command(command, stderr=subprocess.DEVNULL)
def close(self):
""" Close file """


@@ -26,11 +26,12 @@ from nntplib import NNTPPermanentError
import time
import logging
import ssl
from typing import List, Optional
import sabnzbd
from sabnzbd.constants import *
from sabnzbd.encoding import utob
import sabnzbd.cfg
from sabnzbd.constants import DEF_TIMEOUT
from sabnzbd.encoding import utob
from sabnzbd.misc import nntp_to_msg, probablyipv4, probablyipv6
# Set pre-defined socket timeout
@@ -50,7 +51,7 @@ def _retrieve_info(server):
else:
server.bad_cons = 0
(server.info, server.request) = (info, False)
sabnzbd.downloader.Downloader.do.wakeup()
sabnzbd.Downloader.wakeup()
def request_server_info(server):
@@ -139,7 +140,7 @@ class NNTP:
def __init__(self, host, port, info, sslenabled, nw, block=False, write_fds=None):
self.host = host
self.port = port
self.nw = nw
self.nw: NewsWrapper = nw
self.blocking = block
self.error_msg = None
@@ -165,21 +166,21 @@ class NNTP:
ctx.options |= ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3 | ssl.OP_NO_TLSv1 | ssl.OP_NO_TLSv1_1
# Only verify hostname when we're strict
if nw.server.ssl_verify < 2:
if self.nw.server.ssl_verify < 2:
ctx.check_hostname = False
# Certificates optional
if nw.server.ssl_verify == 0:
if self.nw.server.ssl_verify == 0:
ctx.verify_mode = ssl.CERT_NONE
# Did the user set a custom cipher-string?
if nw.server.ssl_ciphers:
if self.nw.server.ssl_ciphers:
# At their own risk, socket will error out in case it was invalid
ctx.set_ciphers(nw.server.ssl_ciphers)
ctx.set_ciphers(self.nw.server.ssl_ciphers)
self.sock = ctx.wrap_socket(socket.socket(af, socktype, proto), server_hostname=str(nw.server.host))
self.sock = ctx.wrap_socket(socket.socket(af, socktype, proto), server_hostname=self.nw.server.host)
else:
# Use a regular wrapper, no certificate validation
self.sock = ssl.wrap_socket(socket.socket(af, socktype, proto), ciphers=sabnzbd.cfg.ssl_ciphers())
self.sock = ssl.wrap_socket(socket.socket(af, socktype, proto))
else:
self.sock = socket.socket(af, socktype, proto)
@@ -260,6 +261,9 @@ class NNTP:
logging.info(msg)
self.nw.server.warning = msg
def __repr__(self):
return "<NNTP: %s:%s>" % (self.host, self.port)
class NewsWrapper:
# Pre-define attributes to save memory
@@ -283,16 +287,16 @@ class NewsWrapper:
)
def __init__(self, server, thrdnum, block=False):
self.server = server
self.server: sabnzbd.downloader.Server = server
self.thrdnum = thrdnum
self.blocking = block
self.timeout = None
self.article = None
self.data = []
self.article: Optional[sabnzbd.nzbstuff.Article] = None
self.data: List[bytes] = []
self.last_line = ""
self.nntp = None
self.nntp: Optional[NNTP] = None
self.recv = None
self.connected = False
@@ -430,16 +434,16 @@ class NewsWrapper:
# Official end-of-article is ".\r\n" but sometimes it can get lost between 2 chunks
chunk_len = len(chunk)
if chunk[-5:] == b"\r\n.\r\n":
return (chunk_len, True, False)
return chunk_len, True, False
elif chunk_len < 5 and len(self.data) > 1:
# We need to make sure the end is not split over 2 chunks
# This is faster than join()
combine_chunk = self.data[-2][-5:] + chunk
if combine_chunk[-5:] == b"\r\n.\r\n":
return (chunk_len, True, False)
return chunk_len, True, False
# Still in middle of data, so continue!
return (chunk_len, False, False)
return chunk_len, False, False
def soft_reset(self):
self.timeout = None
@@ -481,3 +485,11 @@ class NewsWrapper:
except:
pass
del self.nntp
def __repr__(self):
return "<NewsWrapper: server=%s:%s, thread=%s, connected=%s>" % (
self.server.host,
self.server.port,
self.thrdnum,
self.connected,
)
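The chunk check in `NewsWrapper` above guards against the NNTP end-of-article terminator `\r\n.\r\n` being split across two successive chunks. A simplified, standalone sketch of that logic (a hedged rewrite as a free function, not the actual class method):

```python
def article_done(data, chunk):
    """Return True when `chunk` completes an NNTP article body.

    `data` is the list of chunks received so far, with `chunk` already
    appended as its last element. The terminator is b"\r\n.\r\n", but a
    short final chunk may only hold the tail of it, so the end of the
    previous chunk has to be taken into account as well.
    """
    if chunk[-5:] == b"\r\n.\r\n":
        return True
    if len(chunk) < 5 and len(data) > 1:
        # The terminator may straddle the last two chunks
        combined = data[-2][-5:] + chunk
        return combined[-5:] == b"\r\n.\r\n"
    return False
```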


@@ -118,7 +118,7 @@ def nzbfile_parser(raw_data, nzo):
pass
# Sort the articles by part number, compatible with Python 3.5
raw_article_db_sorted = [raw_article_db[partnum] for partnum in sorted(raw_article_db.keys())]
raw_article_db_sorted = [raw_article_db[partnum] for partnum in sorted(raw_article_db)]
# Create NZF
nzf = sabnzbd.nzbstuff.NzbFile(file_date, file_name, raw_article_db_sorted, file_bytes, nzo)
@@ -139,7 +139,7 @@ def nzbfile_parser(raw_data, nzo):
else:
logging.info("Error importing %s, skipping", file_name)
if nzf.nzf_id:
sabnzbd.remove_data(nzf.nzf_id, nzo.workpath)
sabnzbd.remove_data(nzf.nzf_id, nzo.admin_path)
skipped_files += 1
# Final bookkeeping
@@ -234,10 +234,10 @@ def process_nzb_archive_file(
if nzo:
if nzo_id:
# Re-use existing nzo_id, when a "future" job gets its payload
sabnzbd.nzbqueue.NzbQueue.do.remove(nzo_id, add_to_history=False, delete_all_data=False)
sabnzbd.NzbQueue.remove(nzo_id, delete_all_data=False)
nzo.nzo_id = nzo_id
nzo_id = None
nzo_ids.append(sabnzbd.nzbqueue.NzbQueue.do.add(nzo))
nzo_ids.append(sabnzbd.NzbQueue.add(nzo))
nzo.update_rating()
zf.close()
try:
@@ -329,7 +329,7 @@ def process_single_nzb(
except TypeError:
# Duplicate, ignore
if nzo_id:
sabnzbd.nzbqueue.NzbQueue.do.remove(nzo_id, add_to_history=False)
sabnzbd.NzbQueue.remove(nzo_id)
nzo = None
except ValueError:
# Empty
@@ -346,9 +346,9 @@ def process_single_nzb(
if nzo:
if nzo_id:
# Re-use existing nzo_id, when a "future" job gets its payload
sabnzbd.nzbqueue.NzbQueue.do.remove(nzo_id, add_to_history=False, delete_all_data=False)
sabnzbd.NzbQueue.remove(nzo_id, delete_all_data=False)
nzo.nzo_id = nzo_id
nzo_ids.append(sabnzbd.nzbqueue.NzbQueue.do.add(nzo, quiet=reuse))
nzo_ids.append(sabnzbd.NzbQueue.add(nzo, quiet=reuse))
nzo.update_rating()
try:


@@ -24,9 +24,10 @@ import logging
import time
import datetime
import functools
from typing import List, Dict, Union, Tuple, Optional
import sabnzbd
from sabnzbd.nzbstuff import NzbObject
from sabnzbd.nzbstuff import NzbObject, Article
from sabnzbd.misc import exit_sab, cat_to_opts, int_conv, caller_name, cmp, safe_lower
from sabnzbd.filesystem import get_admin_path, remove_all, globber_full, remove_file
from sabnzbd.nzbparser import process_single_nzb
@@ -53,23 +54,18 @@ from sabnzbd.constants import (
)
import sabnzbd.cfg as cfg
import sabnzbd.downloader
from sabnzbd.assembler import Assembler, file_has_articles
from sabnzbd.downloader import Server
from sabnzbd.assembler import file_has_articles
import sabnzbd.notifier as notifier
from sabnzbd.bpsmeter import BPSMeter
class NzbQueue:
""" Singleton NzbQueue """
do = None
def __init__(self):
self.__top_only = cfg.top_only()
self.__nzo_list = []
self.__nzo_table = {}
NzbQueue.do = self
self.__top_only: bool = cfg.top_only()
self.__nzo_list: List[NzbObject] = []
self.__nzo_table: Dict[str, NzbObject] = {}
def read_queue(self, repair):
"""Read queue from disk, supporting repair modes
@@ -115,6 +111,7 @@ class NzbQueue:
# Scan for any folders in "incomplete" that are not yet in the queue
if repair:
logging.info("Starting queue repair")
self.scan_jobs(not folders)
# Handle any lost future jobs
for item in globber_full(os.path.join(cfg.admin_dir.get_path(), FUTURE_Q_FOLDER)):
@@ -182,7 +179,6 @@ class NzbQueue:
remove_all(admin_path, "*.gz", keep_folder=True)
logging.debug("Repair job %s with new NZB (%s)", name, new_nzb.filename)
_, nzo_ids = sabnzbd.add_nzbfile(new_nzb, nzbname=name, reuse=repair_folder, password=password)
nzo_id = nzo_ids[0]
else:
# Was this file already post-processed?
verified = sabnzbd.load_data(VERIFIED_FILE, admin_path, remove=False)
@@ -193,34 +189,47 @@ class NzbQueue:
if filenames:
logging.debug("Repair job %s by re-parsing stored NZB", name)
_, nzo_ids = sabnzbd.add_nzbfile(filenames[0], nzbname=name, reuse=repair_folder, password=password)
nzo_id = nzo_ids[0]
else:
logging.debug("Repair job %s without stored NZB", name)
nzo = NzbObject(name, nzbname=name, reuse=repair_folder)
nzo.password = password
self.add(nzo)
nzo_id = nzo.nzo_id
try:
logging.debug("Repair job %s without stored NZB", name)
nzo = NzbObject(name, nzbname=name, reuse=repair_folder)
nzo.password = password
self.add(nzo)
nzo_ids = [nzo.nzo_id]
except:
# NzbObject can throw exceptions if duplicate or unwanted etc
logging.info("Skipping %s due to exception", name, exc_info=True)
nzo_ids = []
return nzo_id
# Return None if we could not add anything
if nzo_ids:
return nzo_ids[0]
return None
@NzbQueueLocker
def send_back(self, nzo):
def send_back(self, old_nzo: NzbObject):
""" Send back job to queue after successful pre-check """
try:
nzb_path = globber_full(nzo.workpath, "*.gz")[0]
nzb_path = globber_full(old_nzo.admin_path, "*.gz")[0]
except:
logging.info("Failed to find NZB file after pre-check (%s)", nzo.nzo_id)
logging.info("Failed to find NZB file after pre-check (%s)", old_nzo.nzo_id)
return
# Need to remove it first, otherwise it might still be downloading
self.remove(nzo, add_to_history=False, cleanup=False)
res, nzo_ids = process_single_nzb(nzo.filename, nzb_path, keep=True, reuse=nzo.downpath, nzo_id=nzo.nzo_id)
# Store old position and create new NZO
old_position = self.__nzo_list.index(old_nzo)
res, nzo_ids = process_single_nzb(
old_nzo.filename, nzb_path, keep=True, reuse=old_nzo.download_path, nzo_id=old_nzo.nzo_id
)
if res == 0 and nzo_ids:
# Swap to old position
new_nzo = self.get_nzo(nzo_ids[0])
self.__nzo_list.remove(new_nzo)
self.__nzo_list.insert(old_position, new_nzo)
# Reset reuse flag to make pause/abort on encryption possible
self.__nzo_table[nzo_ids[0]].reuse = None
@NzbQueueLocker
def save(self, save_nzo=None):
def save(self, save_nzo: Union[NzbObject, None, bool] = None):
""" Save queue, all nzo's or just the specified one """
logging.info("Saving queue")
@@ -234,7 +243,7 @@ class NzbQueue:
# Also includes save_data for NZO
nzo.save_to_disk()
else:
sabnzbd.save_data(nzo, nzo.nzo_id, nzo.workpath)
sabnzbd.save_data(nzo, nzo.nzo_id, nzo.admin_path)
sabnzbd.save_admin((QUEUE_VERSION, nzo_ids, []), QUEUE_FILE_NAME)
@@ -258,7 +267,7 @@ class NzbQueue:
self.add(future_nzo)
return future_nzo
def change_opts(self, nzo_ids, pp):
def change_opts(self, nzo_ids: str, pp: int) -> int:
result = 0
for nzo_id in [item.strip() for item in nzo_ids.split(",")]:
if nzo_id in self.__nzo_table:
@@ -266,7 +275,7 @@ class NzbQueue:
result += 1
return result
def change_script(self, nzo_ids, script):
def change_script(self, nzo_ids: str, script: str) -> int:
result = 0
for nzo_id in [item.strip() for item in nzo_ids.split(",")]:
if nzo_id in self.__nzo_table:
@@ -275,7 +284,7 @@ class NzbQueue:
result += 1
return result
def change_cat(self, nzo_ids, cat, explicit_priority=None):
def change_cat(self, nzo_ids: str, cat: str, explicit_priority=None):
result = 0
for nzo_id in [item.strip() for item in nzo_ids.split(",")]:
if nzo_id in self.__nzo_table:
@@ -290,7 +299,7 @@ class NzbQueue:
result += 1
return result
def change_name(self, nzo_id, name, password=None):
def change_name(self, nzo_id: str, name: str, password: str = None):
if nzo_id in self.__nzo_table:
nzo = self.__nzo_table[nzo_id]
logging.info("Renaming %s to %s", nzo.final_name, name)
@@ -306,63 +315,61 @@ class NzbQueue:
else:
return False
def get_nzo(self, nzo_id):
def get_nzo(self, nzo_id) -> Optional[NzbObject]:
if nzo_id in self.__nzo_table:
return self.__nzo_table[nzo_id]
else:
return None
@NzbQueueLocker
def add(self, nzo, save=True, quiet=False):
def add(self, nzo: NzbObject, save=True, quiet=False) -> str:
if not nzo.nzo_id:
nzo.nzo_id = sabnzbd.get_new_id("nzo", nzo.workpath, self.__nzo_table)
nzo.nzo_id = sabnzbd.get_new_id("nzo", nzo.admin_path, self.__nzo_table)
# If no files are to be downloaded anymore, send to postproc
if not nzo.files and not nzo.futuretype:
self.end_job(nzo)
return nzo.nzo_id
# Reset try_lists
# Reset try_lists, markers and evaluate the scheduling settings
nzo.reset_try_list()
nzo.deleted = False
priority = nzo.priority
if sabnzbd.Scheduler.analyse(False, priority):
nzo.status = Status.PAUSED
if nzo.nzo_id:
nzo.deleted = False
priority = nzo.priority
if sabnzbd.scheduler.analyse(False, priority):
nzo.status = Status.PAUSED
self.__nzo_table[nzo.nzo_id] = nzo
if priority > HIGH_PRIORITY:
# Top and repair priority items are added to the top of the queue
self.__nzo_list.insert(0, nzo)
elif priority == LOW_PRIORITY:
self.__nzo_list.append(nzo)
else:
# for high priority we need to add the item at the bottom
# of any other high priority items above the normal priority
# for normal priority we need to add the item at the bottom
# of the normal priority items above the low priority
if self.__nzo_list:
pos = 0
added = False
for position in self.__nzo_list:
if position.priority < priority:
self.__nzo_list.insert(pos, nzo)
added = True
break
pos += 1
if not added:
# if there are no other items classed as a lower priority
# then it will be added to the bottom of the queue
self.__nzo_list.append(nzo)
else:
# if the queue is empty then simply append the item to the bottom
self.__nzo_table[nzo.nzo_id] = nzo
if priority > HIGH_PRIORITY:
# Top and repair priority items are added to the top of the queue
self.__nzo_list.insert(0, nzo)
elif priority == LOW_PRIORITY:
self.__nzo_list.append(nzo)
else:
# for high priority we need to add the item at the bottom
# of any other high priority items above the normal priority
# for normal priority we need to add the item at the bottom
# of the normal priority items above the low priority
if self.__nzo_list:
pos = 0
added = False
for position in self.__nzo_list:
if position.priority < priority:
self.__nzo_list.insert(pos, nzo)
added = True
break
pos += 1
if not added:
# if there are no other items classed as a lower priority
# then it will be added to the bottom of the queue
self.__nzo_list.append(nzo)
if save:
self.save(nzo)
else:
# if the queue is empty then simply append the item to the bottom
self.__nzo_list.append(nzo)
if save:
self.save(nzo)
if not (quiet or nzo.status == Status.FETCHING):
notifier.send_notification(T("NZB added to queue"), nzo.filename, "download", nzo.cat)
if not (quiet or nzo.status == Status.FETCHING):
notifier.send_notification(T("NZB added to queue"), nzo.filename, "download", nzo.cat)
if not quiet and cfg.auto_sort():
try:
@@ -373,7 +380,7 @@ class NzbQueue:
return nzo.nzo_id
@NzbQueueLocker
def remove(self, nzo_id, add_to_history=True, cleanup=True, delete_all_data=True):
def remove(self, nzo_id: str, cleanup=True, delete_all_data=True):
"""Remove NZO from queue.
It can be added to history directly.
Or, we do some clean-up, sometimes leaving some data.
@@ -388,36 +395,28 @@ class NzbQueue:
nzo.status = Status.DELETED
self.__nzo_list.remove(nzo)
if add_to_history:
# Create the history DB instance
history_db = database.HistoryDB()
# Add the nzo to the database. Only the path, script and time taken is passed
# Other information is obtained from the nzo
history_db.add_history_db(nzo)
history_db.close()
sabnzbd.history_updated()
elif cleanup:
if cleanup:
nzo.purge_data(delete_all_data=delete_all_data)
self.save(False)
return nzo_id
return None
@NzbQueueLocker
def remove_multiple(self, nzo_ids, delete_all_data=True):
def remove_multiple(self, nzo_ids: List[str], delete_all_data=True) -> List[str]:
removed = []
for nzo_id in nzo_ids:
if self.remove(nzo_id, add_to_history=False, delete_all_data=delete_all_data):
if self.remove(nzo_id, delete_all_data=delete_all_data):
removed.append(nzo_id)
# Any files left? Otherwise let's disconnect
if self.actives(grabs=False) == 0 and cfg.autodisconnect():
# This was the last job, close server connections
sabnzbd.downloader.Downloader.do.disconnect()
sabnzbd.Downloader.disconnect()
return removed
@NzbQueueLocker
def remove_all(self, search=None):
def remove_all(self, search: str = "") -> List[str]:
""" Remove NZO's that match the search-pattern """
nzo_ids = []
search = safe_lower(search)
@@ -426,7 +425,7 @@ class NzbQueue:
nzo_ids.append(nzo_id)
return self.remove_multiple(nzo_ids)
def remove_nzf(self, nzo_id, nzf_id, force_delete=False):
def remove_nzf(self, nzo_id: str, nzf_id: str, force_delete=False) -> List[str]:
removed = []
if nzo_id in self.__nzo_table:
nzo = self.__nzo_table[nzo_id]
@@ -440,7 +439,7 @@ class NzbQueue:
if nzo.finished_files:
self.end_job(nzo)
else:
self.remove(nzo_id, add_to_history=False, keep_basic=False)
self.remove(nzo_id)
elif force_delete:
# Force-remove all trace and update counters
nzo.bytes -= nzf.bytes
@@ -452,14 +451,14 @@ class NzbQueue:
logging.info("Removed NZFs %s from job %s", removed, nzo.final_name)
return removed
def pause_multiple_nzo(self, nzo_ids):
def pause_multiple_nzo(self, nzo_ids: List[str]) -> List[str]:
handled = []
for nzo_id in nzo_ids:
self.pause_nzo(nzo_id)
handled.append(nzo_id)
return handled
def pause_nzo(self, nzo_id):
def pause_nzo(self, nzo_id: str) -> List[str]:
handled = []
if nzo_id in self.__nzo_table:
nzo = self.__nzo_table[nzo_id]
@@ -468,7 +467,7 @@ class NzbQueue:
handled.append(nzo_id)
return handled
def resume_multiple_nzo(self, nzo_ids):
def resume_multiple_nzo(self, nzo_ids: List[str]) -> List[str]:
handled = []
for nzo_id in nzo_ids:
self.resume_nzo(nzo_id)
@@ -476,7 +475,7 @@ class NzbQueue:
return handled
@NzbQueueLocker
def resume_nzo(self, nzo_id):
def resume_nzo(self, nzo_id: str) -> List[str]:
handled = []
if nzo_id in self.__nzo_table:
nzo = self.__nzo_table[nzo_id]
@@ -487,7 +486,7 @@ class NzbQueue:
return handled
@NzbQueueLocker
def switch(self, item_id_1, item_id_2):
def switch(self, item_id_1: str, item_id_2: str) -> Tuple[int, int]:
try:
# Allow an index as second parameter, easier for some skins
i = int(item_id_2)
@@ -594,7 +593,7 @@ class NzbQueue:
elif field.lower() == "size" or field.lower() == "bytes":
self.sort_by_size(reverse)
elif field.lower() == "avg_age":
self.sort_by_avg_age(reverse)
self.sort_by_avg_age(not reverse)
else:
logging.debug("Sort: %s not recognized", field)
@@ -623,7 +622,7 @@ class NzbQueue:
return nzo_id_pos1
nzo.set_priority(priority)
if sabnzbd.scheduler.analyse(False, priority) and nzo.status in (
if sabnzbd.Scheduler.analyse(False, priority) and nzo.status in (
Status.CHECKING,
Status.DOWNLOADING,
Status.QUEUED,
@@ -687,7 +686,7 @@ class NzbQueue:
return -1
@staticmethod
def reset_try_lists(article, article_reset=True):
def reset_try_lists(article: Article, article_reset=True):
""" Let article get new fetcher and reset trylists """
article.fetcher = None
if article_reset:
@@ -708,7 +707,7 @@ class NzbQueue:
return True
return False
def get_article(self, server, servers):
def get_article(self, server: Server, servers: List[Server]) -> Optional[Article]:
"""Get next article for jobs in the queue
Not locked for performance, since it only reads the queue
"""
@@ -731,7 +730,7 @@ class NzbQueue:
if self.__top_only:
return
def register_article(self, article, success=True):
def register_article(self, article: Article, success=True):
"""Register the articles we tried
Not locked for performance, since it only modifies individual NZOs
"""
@@ -753,7 +752,7 @@ class NzbQueue:
# Only start decoding if we have a filename and type
# The type is only set if sabyenc could decode the article
if nzf.filename and nzf.type:
Assembler.do.process((nzo, nzf, file_done))
sabnzbd.Assembler.process(nzo, nzf, file_done)
elif nzf.filename.lower().endswith(".par2"):
# Broken par2 file, try to get another one
nzo.promote_par2(nzf)
@@ -764,7 +763,7 @@ class NzbQueue:
# Save bookkeeping in case of crash
if file_done and (nzo.next_save is None or time.time() > nzo.next_save):
nzo.save_to_disk()
BPSMeter.do.save()
sabnzbd.BPSMeter.save()
if nzo.save_timeout is None:
nzo.next_save = None
else:
@@ -772,9 +771,10 @@ class NzbQueue:
# Remove post from Queue
if post_done:
nzo.set_download_report()
self.end_job(nzo)
def end_job(self, nzo):
def end_job(self, nzo: NzbObject):
""" Send NZO to the post-processing queue """
# Notify assembler to call postprocessor
if not nzo.deleted:
@@ -791,9 +791,9 @@ class NzbQueue:
else:
# Not enough data, let postprocessor show it as failed
pass
Assembler.do.process((nzo, None, None))
sabnzbd.Assembler.process(nzo)
def actives(self, grabs=True):
def actives(self, grabs=True) -> int:
"""Return amount of non-paused jobs, optionally with 'grabbing' items
Not locked for performance, only reads the queue
"""
@@ -867,10 +867,10 @@ class NzbQueue:
# Stall prevention by checking if all servers are in the trylist
# This is a CPU-cheaper alternative to prevent stalling
if len(nzo.try_list) == sabnzbd.downloader.Downloader.do.server_nr:
if len(nzo.try_list) == sabnzbd.Downloader.server_nr:
# Maybe the NZF's need a reset too?
for nzf in nzo.files:
if len(nzf.try_list) == sabnzbd.downloader.Downloader.do.server_nr:
if len(nzf.try_list) == sabnzbd.Downloader.server_nr:
# We do not want to reset all article trylists, they are good
logging.info("Resetting bad trylist for file %s in job %s", nzf.filename, nzo.final_name)
nzf.reset_try_list()
@@ -882,25 +882,25 @@ class NzbQueue:
for nzo in empty:
self.end_job(nzo)
def pause_on_prio(self, priority):
def pause_on_prio(self, priority: int):
for nzo in self.__nzo_list:
if nzo.priority == priority:
nzo.pause()
@NzbQueueLocker
def resume_on_prio(self, priority):
def resume_on_prio(self, priority: int):
for nzo in self.__nzo_list:
if nzo.priority == priority:
# Don't use nzo.resume() to avoid resetting job warning flags
nzo.status = Status.QUEUED
def pause_on_cat(self, cat):
def pause_on_cat(self, cat: str):
for nzo in self.__nzo_list:
if nzo.cat == cat:
nzo.pause()
@NzbQueueLocker
def resume_on_cat(self, cat):
def resume_on_cat(self, cat: str):
for nzo in self.__nzo_list:
if nzo.cat == cat:
# Don't use nzo.resume() to avoid resetting job warning flags
@@ -921,7 +921,7 @@ class NzbQueue:
return "<NzbQueue>"
def _nzo_date_cmp(nzo1, nzo2):
def _nzo_date_cmp(nzo1: NzbObject, nzo2: NzbObject):
avg_date1 = nzo1.avg_date
avg_date2 = nzo2.avg_date
@@ -944,7 +944,7 @@ def _nzo_size_cmp(nzo1, nzo2):
return cmp(nzo1.bytes, nzo2.bytes)
def sort_queue_function(nzo_list, method, reverse):
def sort_queue_function(nzo_list: List[NzbObject], method, reverse: bool) -> List[NzbObject]:
ultra_high_priority = [nzo for nzo in nzo_list if nzo.priority == REPAIR_PRIORITY]
super_high_priority = [nzo for nzo in nzo_list if nzo.priority == FORCE_PRIORITY]
high_priority = [nzo for nzo in nzo_list if nzo.priority == HIGH_PRIORITY]
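The queue-add logic in `NzbQueue.add` above inserts a job below all existing entries of equal or higher priority, appending only when no lower-priority entry exists. A simplified sketch of that insertion rule as a hypothetical helper (not the project's API):

```python
def insert_by_priority(queue, item, priority_of):
    """Insert `item` before the first entry with a strictly lower
    priority, so it lands below all equal-or-higher-priority entries.
    Simplified sketch of the NzbQueue.add placement logic."""
    for pos, existing in enumerate(queue):
        if priority_of(existing) < priority_of(item):
            queue.insert(pos, item)
            return
    # No lower-priority entry found: add to the bottom of the queue
    queue.append(item)
```

For example, adding a normal-priority job to a queue holding one normal and one low-priority job places it between the two.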


@@ -18,7 +18,6 @@
"""
sabnzbd.nzbstuff - misc
"""
import os
import time
import re
@@ -27,6 +26,7 @@ import datetime
import threading
import functools
import difflib
from typing import List, Dict, Any, Tuple, Optional
# SABnzbd modules
import sabnzbd
@@ -68,7 +68,6 @@ from sabnzbd.filesystem import (
sanitize_filename,
set_permissions,
long_path,
trim_win_path,
fix_unix_encoding,
is_obfuscated_filename,
get_ext,
@@ -84,13 +83,13 @@ from sabnzbd.decorators import synchronized
import sabnzbd.config as config
import sabnzbd.cfg as cfg
import sabnzbd.nzbparser
from sabnzbd.downloader import Server
from sabnzbd.database import HistoryDB
from sabnzbd.articlecache import ArticleCache
from sabnzbd.rating import Rating
from sabnzbd.deobfuscate_filenames import is_probably_obfuscated
# Name patterns
SUBJECT_FN_MATCHER = re.compile(r'"([^"]*)"')
RE_NORMAL_NAME = re.compile(r"\.\w{1,5}$") # Test reasonably sized extension at the end
RE_SUBJECT_FILENAME_QUOTES = re.compile(r'"([^"]*)"') # In the subject, we expect the filename within double quotes
RE_SUBJECT_BASIC_FILENAME = re.compile(r"([\w\-+()'\s.,]*\.\w{2,4})") # Otherwise something that looks like a filename
RE_RAR = re.compile(r"(\.rar|\.r\d\d|\.s\d\d|\.t\d\d|\.u\d\d|\.v\d\d)$", re.I)
RE_PROPER = re.compile(r"(^|[\. _-])(PROPER|REAL|REPACK)([\. _-]|$)")
@@ -109,15 +108,15 @@ class TryList:
__slots__ = ("try_list", "fetcher_priority")
def __init__(self):
self.try_list = []
self.fetcher_priority = 0
self.try_list: List[Server] = []
self.fetcher_priority: int = 0
def server_in_try_list(self, server):
def server_in_try_list(self, server: Server):
""" Return whether specified server has been tried """
with TRYLIST_LOCK:
return server in self.try_list
def add_to_try_list(self, server):
def add_to_try_list(self, server: Server):
""" Register server as having been tried already """
with TRYLIST_LOCK:
if server not in self.try_list:
@@ -132,11 +131,11 @@ class TryList:
""" Save the servers """
return [server.id for server in self.try_list]
def __setstate__(self, servers_ids):
def __setstate__(self, servers_ids: List[str]):
self.try_list = []
for server_id in servers_ids:
if server_id in sabnzbd.downloader.Downloader.do.server_dict:
self.add_to_try_list(sabnzbd.downloader.Downloader.do.server_dict[server_id])
if server_id in sabnzbd.Downloader.server_dict:
self.add_to_try_list(sabnzbd.Downloader.server_dict[server_id])
##############################################################################
@@ -153,17 +152,17 @@ class Article(TryList):
def __init__(self, article, article_bytes, nzf):
TryList.__init__(self)
self.fetcher = None
self.article = article
self.fetcher: Optional[Server] = None
self.article: str = article
self.art_id = None
self.bytes = article_bytes
self.lowest_partnum = False
self.tries = 0 # Try count
self.decoded = False
self.on_disk = False
self.nzf = nzf
self.nzf: NzbFile = nzf
def get_article(self, server, servers):
def get_article(self, server: Server, servers: List[Server]):
""" Return article when appropriate for specified server """
log = sabnzbd.LOG_ALL
if not self.fetcher and not self.server_in_try_list(server):
@@ -232,18 +231,18 @@ class Article(TryList):
def get_art_id(self):
""" Return unique article storage name, create if needed """
if not self.art_id:
self.art_id = sabnzbd.get_new_id("article", self.nzf.nzo.workpath)
self.art_id = sabnzbd.get_new_id("article", self.nzf.nzo.admin_path)
return self.art_id
def search_new_server(self):
# Search new server
self.add_to_try_list(self.fetcher)
for server in sabnzbd.downloader.Downloader.do.servers:
for server in sabnzbd.Downloader.servers:
if server.active and not self.server_in_try_list(server):
if server.priority >= self.fetcher.priority:
self.tries = 0
# Allow all servers for this nzo and nzf again (but not for this article)
sabnzbd.nzbqueue.NzbQueue.do.reset_try_lists(self, article_reset=False)
sabnzbd.NzbQueue.reset_try_lists(self, article_reset=False)
return True
logging.info(T("%s => missing from all servers, discarding") % self)
@@ -324,54 +323,53 @@ class NzbFile(TryList):
""" Setup object """
TryList.__init__(self)
self.date = date
self.subject = subject
self.type = None
self.filename = name_extractor(subject)
self.date: datetime.datetime = date
self.subject: str = subject
self.type: Optional[str] = None
self.filename: str = sanitize_filename(name_extractor(subject))
self.filename_checked = False
self.filepath = None
self.filepath: Optional[str] = None
# Identifiers for par2 files
self.is_par2 = False
self.vol = None
self.blocks = None
self.setname = None
self.is_par2: bool = False
self.vol: Optional[int] = None
self.blocks: Optional[int] = None
self.setname: Optional[str] = None
# Articles are removed from "articles" after being fetched
self.articles = []
self.decodetable = []
self.articles: List[Article] = []
self.decodetable: List[Article] = []
self.bytes = file_bytes
self.bytes_left = file_bytes
self.bytes: int = file_bytes
self.bytes_left: int = file_bytes
self.nzo = nzo
self.nzf_id = sabnzbd.get_new_id("nzf", nzo.workpath)
self.nzo: NzbObject = nzo
self.nzf_id: str = sabnzbd.get_new_id("nzf", nzo.admin_path)
self.deleted = False
self.valid = False
self.import_finished = False
self.md5 = None
self.md5sum = None
self.md5of16k = None
self.md5sum: Optional[bytes] = None
self.md5of16k: Optional[bytes] = None
self.valid = bool(raw_article_db)
if self.valid and self.nzf_id:
# Save first article separate so we can do duplicate file detection
# Save first article separate so we can do
# duplicate file detection and deobfuscate-during-download
first_article = self.add_article(raw_article_db.pop(0))
first_article.lowest_partnum = True
self.nzo.first_articles.append(first_article)
self.nzo.first_articles_count += 1
# For non-par2 files we also use it to do deobfuscate-during-download
# And we count how many bytes are available for repair
# Count how many bytes are available for repair
if sabnzbd.par2file.is_parfile(self.filename):
self.nzo.first_articles.append(first_article)
self.nzo.first_articles_count += 1
self.nzo.bytes_par2 += self.bytes
# Any articles left?
if raw_article_db:
# Save the rest
sabnzbd.save_data(raw_article_db, self.nzf_id, nzo.workpath)
sabnzbd.save_data(raw_article_db, self.nzf_id, nzo.admin_path)
else:
# All imported
self.import_finished = True
@@ -379,11 +377,11 @@ class NzbFile(TryList):
def finish_import(self):
""" Load the article objects from disk """
logging.debug("Finishing import on %s", self.filename)
raw_article_db = sabnzbd.load_data(self.nzf_id, self.nzo.workpath, remove=False)
raw_article_db = sabnzbd.load_data(self.nzf_id, self.nzo.admin_path, remove=False)
if raw_article_db:
# Convert 2.x.x jobs
if isinstance(raw_article_db, dict):
raw_article_db = [raw_article_db[partnum] for partnum in sorted(raw_article_db.keys())]
raw_article_db = [raw_article_db[partnum] for partnum in sorted(raw_article_db)]
for raw_article in raw_article_db:
self.add_article(raw_article)
@@ -402,7 +400,7 @@ class NzbFile(TryList):
self.decodetable.append(article)
return article
def remove_article(self, article, success):
def remove_article(self, article: Article, success: bool) -> int:
""" Handle completed article, possibly end of file """
if article in self.articles:
self.articles.remove(article)
@@ -417,7 +415,7 @@ class NzbFile(TryList):
self.vol = vol
self.blocks = int_conv(blocks)
def get_article(self, server, servers):
def get_article(self, server: Server, servers: List[Server]) -> Optional[Article]:
""" Get next article to be downloaded """
for article in self.articles:
article = article.get_article(server, servers)
@@ -449,7 +447,7 @@ class NzbFile(TryList):
""" Remove article database from disk (sabnzbd_nzf_<id>)"""
try:
logging.debug("Removing article database for %s", self.nzf_id)
remove_file(os.path.join(self.nzo.workpath, self.nzf_id))
remove_file(os.path.join(self.nzo.admin_path, self.nzf_id))
except:
pass
@@ -473,7 +471,7 @@ class NzbFile(TryList):
# Convert 2.x.x jobs
if isinstance(self.decodetable, dict):
self.decodetable = [self.decodetable[partnum] for partnum in sorted(self.decodetable.keys())]
self.decodetable = [self.decodetable[partnum] for partnum in sorted(self.decodetable)]
# Set non-transferable values
self.md5 = None
@@ -618,11 +616,12 @@ class NzbObject(TryList):
else:
r, u, d = pp_to_opts(pp)
self.priority: int = NORMAL_PRIORITY
self.set_priority(priority) # Parse priority of input
self.repair = r # True if we want to repair this set
self.unpack = u # True if we want to unpack this set
self.delete = d # True if we want to delete this set
self.script = script # External script for this set
self.repair: bool = r # True if we want to repair this set
self.unpack: bool = u # True if we want to unpack this set
self.delete: bool = d # True if we want to delete this set
self.script: str = script # External script for this set
self.cat = cat # User-set category
# Information fields
@@ -633,36 +632,36 @@ class NzbObject(TryList):
# Bookkeeping values
self.meta = {}
self.servercount = {} # Dict to keep bytes per server
self.servercount: Dict[str, int] = {} # Dict to keep bytes per server
self.created = False # dirprefixes + work_name created
self.direct_unpacker = None # Holds the DirectUnpacker instance
self.bytes = 0 # Original bytesize
self.bytes_par2 = 0 # Bytes available for repair
self.bytes_downloaded = 0 # Downloaded byte
self.bytes_tried = 0 # Which bytes did we try
self.bytes_missing = 0 # Bytes missing
self.bad_articles = 0 # How many bad (non-recoverable) articles
self.direct_unpacker: Optional[sabnzbd.directunpacker.DirectUnpacker] = None # The DirectUnpacker instance
self.bytes: int = 0 # Original bytesize
self.bytes_par2: int = 0 # Bytes available for repair
self.bytes_downloaded: int = 0  # Downloaded bytes
self.bytes_tried: int = 0 # Which bytes did we try
self.bytes_missing: int = 0 # Bytes missing
self.bad_articles: int = 0 # How many bad (non-recoverable) articles
self.partable = {} # Holds one parfile-name for each set
self.extrapars = {} # Holds the extra parfile names for all sets
self.md5packs = {} # Holds the md5pack for each set (name: hash)
self.md5of16k = {} # Holds the md5s of the first-16k of all files in the NZB (hash: name)
self.partable: Dict[str, NzbFile] = {} # Holds one parfile-name for each set
self.extrapars: Dict[str, List[NzbFile]] = {} # Holds the extra parfile names for all sets
self.md5packs: Dict[str, Dict[str, bytes]] = {} # Holds the md5pack for each set (name: hash)
self.md5of16k: Dict[bytes, str] = {} # Holds the md5s of the first-16k of all files in the NZB (hash: name)
self.files = [] # List of all NZFs
self.files_table = {} # Dictionary of NZFs indexed using NZF_ID
self.renames = {} # Dictionary of all renamed files
self.files: List[NzbFile] = [] # List of all NZFs
self.files_table: Dict[str, NzbFile] = {} # Dictionary of NZFs indexed using NZF_ID
self.renames: Dict[str, str] = {} # Dictionary of all renamed files
self.finished_files = [] # List of all finished NZFs
self.finished_files: List[NzbFile] = [] # List of all finished NZFs
# The current status of the nzo eg:
# Queued, Downloading, Repairing, Unpacking, Failed, Complete
self.status = status
self.status: str = status
self.avg_bps_freq = 0
self.avg_bps_total = 0
self.first_articles = []
self.first_articles: List[Article] = []
self.first_articles_count = 0
self.saved_articles = []
self.saved_articles: List[Article] = []
self.nzo_id = None
@@ -685,11 +684,11 @@ class NzbObject(TryList):
# Store one line responses for filejoin/par2/unrar/unzip here for history display
self.action_line = ""
# Store the results from various filejoin/par2/unrar/unzip stages
self.unpack_info = {}
self.unpack_info: Dict[str, List[str]] = {}
# Stores one line containing the last failure
self.fail_msg = ""
# Stores various info about the nzo to be
self.nzo_info = nzo_info or {}
self.nzo_info: Dict[str, Any] = nzo_info or {}
# Temporary store for custom foldername - needs to be stored because of url fetching
self.custom_name = nzbname
@@ -697,10 +696,10 @@ class NzbObject(TryList):
self.next_save = None
self.save_timeout = None
self.encrypted = 0
self.url_wait = None
self.url_wait: Optional[float] = None
self.url_tries = 0
self.pp_active = False # Signals active post-processing (not saved)
self.md5sum = None
self.md5sum: Optional[bytes] = None
if nzb is None and not reuse:
# This is a slot for a future NZB, ready now
@@ -725,9 +724,8 @@ class NzbObject(TryList):
if reuse and os.path.exists(reuse):
work_dir = long_path(reuse)
else:
# Determine "incomplete" folder and trim path on Windows to prevent long-path unrar errors
work_dir = long_path(os.path.join(cfg.download_dir.get_path(), self.work_name))
work_dir = trim_win_path(work_dir)
# Determine "incomplete" folder
work_dir = os.path.join(cfg.download_dir.get_path(), self.work_name)
work_dir = get_unique_path(work_dir, create_dir=True)
set_permissions(work_dir)
@@ -910,9 +908,8 @@ class NzbObject(TryList):
# In case pre-queue script or duplicate check want to move
# to history we first need an nzo_id by entering the NzbQueue
if accept == 2:
self.deleted = True
sabnzbd.NzbQueue.do.add(self, quiet=True)
sabnzbd.NzbQueue.do.end_job(self)
sabnzbd.NzbQueue.add(self, quiet=True)
sabnzbd.NzbQueue.end_job(self)
# Raise error, so it's not added
raise TypeError
@@ -926,7 +923,7 @@ class NzbObject(TryList):
self.servercount[serverid] = bytes_received
@synchronized(NZO_LOCK)
def remove_nzf(self, nzf):
def remove_nzf(self, nzf: NzbFile) -> bool:
if nzf in self.files:
self.files.remove(nzf)
if nzf not in self.finished_files:
@@ -974,7 +971,7 @@ class NzbObject(TryList):
self.reset_try_list()
@synchronized(NZO_LOCK)
def postpone_pars(self, nzf, parset):
def postpone_pars(self, nzf: NzbFile, parset: str):
""" Move all vol-par files matching 'parset' to the extrapars table """
# Create new extrapars if it didn't already exist
# For example if created when the first par2 file was missing
@@ -1008,7 +1005,7 @@ class NzbObject(TryList):
self.verify_all_filenames_and_resort()
@synchronized(NZO_LOCK)
def handle_par2(self, nzf, filepath):
def handle_par2(self, nzf: NzbFile, filepath):
""" Check if file is a par2 and build up par2 collection """
# Need to remove it from the other set it might be in
self.remove_extrapar(nzf)
@@ -1050,13 +1047,13 @@ class NzbObject(TryList):
if get_ext(nzf.filename) != ".par2":
# Do cheap renaming so it gets better picked up by par2
# Only basename has to be the same
new_fname = get_unique_filename(os.path.join(self.downpath, "%s.par2" % setname))
new_fname = get_unique_filename(os.path.join(self.download_path, "%s.par2" % setname))
renamer(filepath, new_fname)
self.renamed_file(get_filename(new_fname), nzf.filename)
nzf.filename = get_filename(new_fname)
@synchronized(NZO_LOCK)
def promote_par2(self, nzf):
def promote_par2(self, nzf: NzbFile):
"""In case of a broken par2 or missing par2, move another
of the same set to the top (if we can find it)
"""
@@ -1116,7 +1113,7 @@ class NzbObject(TryList):
return False
@synchronized(NZO_LOCK)
def remove_article(self, article, success):
def remove_article(self, article: Article, success: bool):
""" Remove article from the NzbFile and do check if it can succeed"""
job_can_succeed = True
nzf = article.nzf
@@ -1173,25 +1170,20 @@ class NzbObject(TryList):
# Abort the job due to failure
if not job_can_succeed:
self.set_download_report()
self.fail_msg = T("Aborted, cannot be completed") + " - https://sabnzbd.org/not-complete"
self.set_unpack_info("Download", self.fail_msg, unique=False)
logging.debug('Abort job "%s", due to impossibility to complete it', self.final_name)
return True, True, True
post_done = False
if not self.files:
post_done = True
self.set_download_report()
return articles_left, file_done, post_done
# Check if there are any files left here, so the check is inside the NZO_LOCK
return articles_left, file_done, not self.files
@synchronized(NZO_LOCK)
def add_saved_article(self, article):
def add_saved_article(self, article: Article):
self.saved_articles.append(article)
@synchronized(NZO_LOCK)
def remove_saved_article(self, article):
def remove_saved_article(self, article: Article):
try:
self.saved_articles.remove(article)
except ValueError:
@@ -1199,7 +1191,7 @@ class NzbObject(TryList):
# and this function is called from file_has_articles
pass
def check_existing_files(self, wdir):
def check_existing_files(self, wdir: str):
""" Check if downloaded files already exits, for these set NZF to complete """
fix_unix_encoding(wdir)
@@ -1207,7 +1199,7 @@ class NzbObject(TryList):
files = globber(wdir, "*.*")
# Substitute renamed files
renames = sabnzbd.load_data(RENAMES_FILE, self.workpath, remove=True)
renames = sabnzbd.load_data(RENAMES_FILE, self.admin_path, remove=True)
if renames:
for name in renames:
if name in files or renames[name] in files:
@@ -1288,7 +1280,7 @@ class NzbObject(TryList):
if not self.unpack:
self.abort_direct_unpacker()
def set_priority(self, value):
def set_priority(self, value: Any):
""" Check if this is a valid priority """
# When unknown (0 is a known one), set to DEFAULT
if value == "" or value is None:
@@ -1405,14 +1397,14 @@ class NzbObject(TryList):
self.partable.pop(setname)
@synchronized(NZO_LOCK)
def remove_extrapar(self, parfile):
def remove_extrapar(self, parfile: NzbFile):
""" Remove par file from any/all sets """
for _set in self.extrapars:
if parfile in self.extrapars[_set]:
self.extrapars[_set].remove(parfile)
@synchronized(NZO_LOCK)
def prospective_add(self, nzf):
def prospective_add(self, nzf: NzbFile):
"""Add par2 files to compensate for missing articles
This fails in case of multi-sets with identical setnames
"""
@@ -1435,7 +1427,7 @@ class NzbObject(TryList):
# Reset NZO TryList
self.reset_try_list()
def add_to_direct_unpacker(self, nzf):
def add_to_direct_unpacker(self, nzf: NzbFile):
""" Start or add to DirectUnpacker """
if not self.direct_unpacker:
sabnzbd.directunpacker.DirectUnpacker(self)
@@ -1493,7 +1485,7 @@ class NzbObject(TryList):
# Sort the servers first
servers = config.get_servers()
server_names = sorted(
servers.keys(),
servers,
key=lambda svr: "%d%02d%s"
% (int(not servers[svr].enable()), servers[svr].priority(), servers[svr].displayname().lower()),
)
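The composite sort key used above (enabled servers first, then priority, then display name) can be sketched standalone. The dicts below are hypothetical stand-ins for the real config Server objects:

```python
# Hypothetical stand-ins for the config Server objects used in the diff
servers = {
    "news1": {"enable": True, "priority": 1, "displayname": "Fast"},
    "backup": {"enable": True, "priority": 2, "displayname": "Backup"},
    "old": {"enable": False, "priority": 0, "displayname": "Old"},
}

# Enabled servers sort before disabled ones ("0" < "1"), then by
# zero-padded priority, then alphabetically by display name
server_names = sorted(
    servers,
    key=lambda svr: "%d%02d%s"
    % (int(not servers[svr]["enable"]), servers[svr]["priority"], servers[svr]["displayname"].lower()),
)
# server_names == ["news1", "backup", "old"]
```

Building a single string key avoids a cmp-style comparator while still giving a three-level ordering.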
@@ -1547,7 +1539,7 @@ class NzbObject(TryList):
self.nzo_info[article_type] += 1
self.bad_articles += 1
def get_article(self, server, servers):
def get_article(self, server: Server, servers: List[Server]) -> Optional[Article]:
article = None
nzf_remove_list = []
@@ -1569,7 +1561,7 @@ class NzbObject(TryList):
if not nzf.import_finished:
# Only load NZF when it's a primary server
# or when it's a backup server without active primaries
if sabnzbd.highest_server(server):
if sabnzbd.Downloader.highest_server(server):
nzf.finish_import()
# Still not finished? Something went wrong...
if not nzf.import_finished and not self.is_gone():
@@ -1591,7 +1583,7 @@ class NzbObject(TryList):
# If cleanup emptied the active files list, end this job
if nzf_remove_list and not self.files:
sabnzbd.NzbQueue.do.end_job(self)
sabnzbd.NzbQueue.end_job(self)
if not article:
# No articles for this server, block for next time
@@ -1609,7 +1601,7 @@ class NzbObject(TryList):
pos_nzf_table = self.build_pos_nzf_table(nzf_ids)
keys = list(pos_nzf_table.keys())
keys = list(pos_nzf_table)
keys.sort()
if target == keys:
@@ -1626,7 +1618,7 @@ class NzbObject(TryList):
pos_nzf_table = self.build_pos_nzf_table(nzf_ids)
keys = list(pos_nzf_table.keys())
keys = list(pos_nzf_table)
keys.sort()
if target == keys:
@@ -1666,7 +1658,7 @@ class NzbObject(TryList):
self.files[pos + 1] = nzf
self.files[pos] = tmp_nzf
def verify_nzf_filename(self, nzf, yenc_filename=None):
def verify_nzf_filename(self, nzf: NzbFile, yenc_filename: Optional[str] = None):
""" Get filename from par2-info or from yenc """
# Already done?
if nzf.filename_checked:
@@ -1695,15 +1687,18 @@ class NzbObject(TryList):
if (
yenc_filename
and yenc_filename != nzf.filename
and not is_obfuscated_filename(yenc_filename)
and not is_probably_obfuscated(yenc_filename)
and not nzf.filename.endswith(".par2")
):
logging.info("Detected filename from yenc: %s -> %s", nzf.filename, yenc_filename)
self.renamed_file(yenc_filename, nzf.filename)
nzf.filename = yenc_filename
@synchronized(NZO_LOCK)
def verify_all_filenames_and_resort(self):
""" Verify all filenames based on par2 info and then re-sort files """
"""Verify all filenames based on par2 info and then re-sort files.
Locked so all files are verified at once without interruptions.
"""
logging.info("Checking all filenames for %s", self.final_name)
for nzf_verify in self.files:
self.verify_nzf_filename(nzf_verify)
@@ -1751,18 +1746,18 @@ class NzbObject(TryList):
fields = {}
for k in rating_types:
fields[k] = _get_first_meta(k)
Rating.do.add_rating(_get_first_meta("id"), self.nzo_id, fields)
sabnzbd.Rating.add_rating(_get_first_meta("id"), self.nzo_id, fields)
except:
pass
@property
def workpath(self):
def admin_path(self):
""" Return the full path for my job-admin folder """
return long_path(get_admin_path(self.work_name, self.futuretype))
@property
def downpath(self):
""" Return the full path for my download folder """
def download_path(self):
""" Return the full path for the download folder """
if self.futuretype:
return ""
else:
@@ -1791,19 +1786,19 @@ class NzbObject(TryList):
self.abort_direct_unpacker()
# Remove all cached files
ArticleCache.do.purge_articles(self.saved_articles)
sabnzbd.ArticleCache.purge_articles(self.saved_articles)
# Delete all, or just basic files
if self.futuretype:
# Remove temporary file left from URL-fetches
sabnzbd.remove_data(self.nzo_id, self.workpath)
sabnzbd.remove_data(self.nzo_id, self.admin_path)
elif delete_all_data:
remove_all(self.downpath, recursive=True)
remove_all(self.download_path, recursive=True)
else:
# We remove any saved articles and save the renames file
remove_all(self.downpath, "SABnzbd_nz?_*", keep_folder=True)
remove_all(self.downpath, "SABnzbd_article_*", keep_folder=True)
sabnzbd.save_data(self.renames, RENAMES_FILE, self.workpath, silent=True)
remove_all(self.download_path, "SABnzbd_nz?_*", keep_folder=True)
remove_all(self.download_path, "SABnzbd_article_*", keep_folder=True)
sabnzbd.save_data(self.renames, RENAMES_FILE, self.admin_path, silent=True)
def gather_info(self, full=False):
queued_files = []
@@ -1842,12 +1837,12 @@ class NzbObject(TryList):
self.direct_unpacker.get_formatted_stats() if self.direct_unpacker else 0,
)
def get_nzf_by_id(self, nzf_id):
def get_nzf_by_id(self, nzf_id: str) -> NzbFile:
if nzf_id in self.files_table:
return self.files_table[nzf_id]
@synchronized(NZO_LOCK)
def set_unpack_info(self, key, msg, setname=None, unique=False):
def set_unpack_info(self, key: str, msg: str, setname: Optional[str] = None, unique: bool = False):
"""Builds a dictionary containing the stage name (key) and a message
If unique is present, it will only have a single line message
"""
@@ -1880,7 +1875,7 @@ class NzbObject(TryList):
""" Save job's admin to disk """
self.save_attribs()
if self.nzo_id and not self.is_gone():
sabnzbd.save_data(self, self.nzo_id, self.workpath)
sabnzbd.save_data(self, self.nzo_id, self.admin_path)
def save_attribs(self):
""" Save specific attributes for Retry """
@@ -1888,11 +1883,11 @@ class NzbObject(TryList):
for attrib in NzoAttributeSaver:
attribs[attrib] = getattr(self, attrib)
logging.debug("Saving attributes %s for %s", attribs, self.final_name)
sabnzbd.save_data(attribs, ATTRIB_FILE, self.workpath, silent=True)
sabnzbd.save_data(attribs, ATTRIB_FILE, self.admin_path, silent=True)
def load_attribs(self):
def load_attribs(self) -> Tuple[Optional[str], Optional[int], Optional[str]]:
""" Load saved attributes and return them to be parsed """
attribs = sabnzbd.load_data(ATTRIB_FILE, self.workpath, remove=False)
attribs = sabnzbd.load_data(ATTRIB_FILE, self.admin_path, remove=False)
logging.debug("Loaded attributes %s for %s", attribs, self.final_name)
# If attributes file somehow does not exists
@@ -2033,7 +2028,7 @@ class NzbObject(TryList):
return "<NzbObject: filename=%s, bytes=%s, nzo_id=%s>" % (self.filename, self.bytes, self.nzo_id)
def nzf_cmp_name(nzf1, nzf2):
def nzf_cmp_name(nzf1: NzbFile, nzf2: NzbFile):
# The comparison will sort .par2 files to the top of the queue followed by .rar files,
# they will then be sorted by name.
nzf1_name = nzf1.filename.lower()
@@ -2070,7 +2065,7 @@ def nzf_cmp_name(nzf1, nzf2):
return cmp(nzf1_name, nzf2_name)
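The ordering that `nzf_cmp_name` produces (par2 files first, then rar files, then alphabetical) can be illustrated with a simplified comparator over plain strings; this is a sketch, not the full function from the diff:

```python
import functools

def nzf_cmp_name(name1: str, name2: str) -> int:
    # Simplified sketch: .par2 sorts first, then .rar, then alphabetical
    name1, name2 = name1.lower(), name2.lower()
    if name1.endswith(".par2") != name2.endswith(".par2"):
        return -1 if name1.endswith(".par2") else 1
    if name1.endswith(".rar") != name2.endswith(".rar"):
        return -1 if name1.endswith(".rar") else 1
    # Python 3 has no cmp(); emulate it
    return (name1 > name2) - (name1 < name2)

files = ["b.rar", "a.nfo", "set.par2", "a.rar"]
ordered = sorted(files, key=functools.cmp_to_key(nzf_cmp_name))
# ordered == ["set.par2", "a.rar", "b.rar", "a.nfo"]
```

Since `cmp()` no longer exists in Python 3, the real code wraps a comparator like this with `functools.cmp_to_key` when sorting the file list.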
def create_work_name(name):
def create_work_name(name: str) -> str:
""" Remove ".nzb" and ".par(2)" and sanitize, skip URL's """
if name.find("://") < 0:
# In case it was one of these, there might be more
@@ -2085,76 +2080,63 @@ def create_work_name(name):
return name.strip()
def scan_password(name):
def scan_password(name: str) -> Tuple[str, Optional[str]]:
""" Get password (if any) from the title """
if "http://" in name or "https://" in name:
return name, None
braces = name.find("{{")
braces = name[1:].find("{{")
if braces < 0:
braces = len(name)
else:
braces += 1
slash = name.find("/")
# Look for name/password, but make sure that '/' comes before any {{
if 0 <= slash < braces and "password=" not in name:
if 0 < slash < braces and "password=" not in name:
# Is it maybe in 'name / password' notation?
if slash == name.find(" / ") + 1:
if slash == name.find(" / ") + 1 and name[: slash - 1].strip(". "):
# Remove the extra space after name and before password
return name[: slash - 1].strip(". "), name[slash + 2 :]
return name[:slash].strip(". "), name[slash + 1 :]
if name[:slash].strip(". "):
return name[:slash].strip(". "), name[slash + 1 :]
# Look for "name password=password"
pw = name.find("password=")
if pw >= 0:
if pw > 0 and name[:pw].strip(". "):
return name[:pw].strip(". "), name[pw + 9 :]
# Look for name{{password}}
if braces < len(name) and "}}" in name:
closing_braces = name.find("}}")
if closing_braces < 0:
closing_braces = len(name)
return name[:braces].strip(". "), name[braces + 2 : closing_braces]
if braces < len(name):
closing_braces = name.rfind("}}")
if closing_braces > braces and name[:braces].strip(". "):
return name[:braces].strip(". "), name[braces + 2 : closing_braces]
# Look again for name/password
if slash >= 0:
if slash > 0 and name[:slash].strip(". "):
return name[:slash].strip(". "), name[slash + 1 :]
# No password found
return name, None
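The notations handled above (`name/password`, `name / password`, `name password=pw`, and `name{{password}}`) can be exercised with a condensed standalone version of the updated logic. This is a sketch derived from the diff; the full version may treat more edge cases:

```python
from typing import Optional, Tuple

def scan_password(name: str) -> Tuple[str, Optional[str]]:
    """Condensed sketch of the password notations handled in the diff."""
    if "http://" in name or "https://" in name:
        return name, None
    braces = name[1:].find("{{")
    braces = braces + 1 if braces >= 0 else len(name)
    slash = name.find("/")
    # Look for name/password, with '/' before any {{
    if 0 < slash < braces and "password=" not in name:
        if slash == name.find(" / ") + 1 and name[: slash - 1].strip(". "):
            # Remove the extra space after name and before password
            return name[: slash - 1].strip(". "), name[slash + 2 :]
        if name[:slash].strip(". "):
            return name[:slash].strip(". "), name[slash + 1 :]
    # Look for "name password=password"
    pw = name.find("password=")
    if pw > 0 and name[:pw].strip(". "):
        return name[:pw].strip(". "), name[pw + 9 :]
    # Look for name{{password}}
    if braces < len(name):
        closing_braces = name.rfind("}}")
        if closing_braces > braces and name[:braces].strip(". "):
            return name[:braces].strip(". "), name[braces + 2 : closing_braces]
    # Look again for name/password
    if slash > 0 and name[:slash].strip(". "):
        return name[:slash].strip(". "), name[slash + 1 :]
    # No password found
    return name, None

print(scan_password("My.Download{{secret}}"))
# → ('My.Download', 'secret')
```

Note the guards added by the diff: positions must be strictly greater than zero and the name part must be non-empty after stripping dots and spaces, so an input like `/password` no longer yields an empty job name.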
def get_attrib_file(path, size):
""" Read job's attributes from file """
logging.debug("Reading %s attributes from %s", size, path)
attribs = []
path = os.path.join(path, ATTRIB_FILE)
try:
with open(path, "r", encoding="utf-8") as attr_file:
for _ in range(size):
line = attr_file.readline().strip("\r\n ")
if line:
if line.lower() == "none":
line = None
try:
line = int(line)
except:
pass
attribs.append(line)
else:
attribs.append(None)
return attribs
except OSError:
return [None for _ in range(size)]
def name_extractor(subject):
def name_extractor(subject: str) -> str:
""" Try to extract a file name from a subject line, return `subject` if in doubt """
result = subject
for name in re.findall(SUBJECT_FN_MATCHER, subject):
# Filename nicely wrapped in quotes
for name in re.findall(RE_SUBJECT_FILENAME_QUOTES, subject):
name = name.strip(' "')
if name and RE_NORMAL_NAME.search(name):
if name:
result = name
# Found nothing? Try a basic filename-like search
if result == subject:
for name in re.findall(RE_SUBJECT_BASIC_FILENAME, subject):
name = name.strip()
if name:
result = name
# Return the subject
return result
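The two-stage extraction above (quoted filename first, then a basic filename-like fallback) can be sketched end to end. The regexes below are hypothetical approximations; the real `RE_SUBJECT_FILENAME_QUOTES` and `RE_SUBJECT_BASIC_FILENAME` patterns live elsewhere in nzbstuff.py:

```python
import re

# Hypothetical approximations of the matchers referenced above
RE_SUBJECT_FILENAME_QUOTES = re.compile(r'"([^"]*)"')
RE_SUBJECT_BASIC_FILENAME = re.compile(r"\b([\w.\-+]+\.(?:rar|r\d{2}|par2|nzb|zip|7z))\b", re.I)

def name_extractor(subject: str) -> str:
    """Return a file name found in an NZB subject, or the subject itself."""
    result = subject
    # Filename nicely wrapped in quotes
    for name in re.findall(RE_SUBJECT_FILENAME_QUOTES, subject):
        name = name.strip(' "')
        if name:
            result = name
    # Found nothing? Try a basic filename-like search
    if result == subject:
        for name in re.findall(RE_SUBJECT_BASIC_FILENAME, subject):
            name = name.strip()
            if name:
                result = name
    return result

print(name_extractor('[group] "show.s01e01.part01.rar" yEnc (1/50)'))
# → show.s01e01.part01.rar
```

Falling back to the subject itself when neither pattern matches is what makes this "smarter extraction" safe: a subject without a recognizable filename passes through unchanged.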


@@ -16,7 +16,7 @@
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
sabnzbd.osxmenu - OSX Top Menu
sabnzbd.osxmenu - macOS Top Menu
"""
import objc
@@ -41,11 +41,8 @@ from sabnzbd.panic import launch_a_browser
import sabnzbd.notifier as notifier
from sabnzbd.api import fast_queue
from sabnzbd.nzbqueue import NzbQueue
import sabnzbd.config as config
import sabnzbd.scheduler as scheduler
import sabnzbd.downloader
from sabnzbd.bpsmeter import BPSMeter
status_icons = {
"idle": "icons/sabnzbd_osx_idle.tiff",
@@ -113,7 +110,7 @@ class SABnzbdDelegate(NSObject):
# Variables
self.state = "Idle"
try:
self.speed = sabnzbd.downloader.Downloader.do.get_limit()
self.speed = sabnzbd.Downloader.get_limit()
except:
self.speed = 0
self.version_notify = 1
@@ -234,7 +231,7 @@ class SABnzbdDelegate(NSObject):
100: "100%",
}
for speed in sorted(speeds.keys()):
for speed in sorted(speeds):
menu_speed_item = NSMenuItem.alloc().initWithTitle_action_keyEquivalent_(
"%s" % (speeds[speed]), "speedlimitAction:", ""
)
@@ -386,7 +383,7 @@ class SABnzbdDelegate(NSObject):
def queueUpdate(self):
try:
qnfo = NzbQueue.do.queue_info(start=0, limit=10)
qnfo = sabnzbd.NzbQueue.queue_info(start=0, limit=10)
pnfo_list = qnfo.list
bytesleftprogess = 0
@@ -407,7 +404,7 @@ class SABnzbdDelegate(NSObject):
bytesleftprogess += pnfo.bytes_left
bytes_total = pnfo.bytes / MEBI
nzo_id = pnfo.nzo_id
timeleft = self.calc_timeleft_(bytesleftprogess, BPSMeter.do.bps)
timeleft = self.calc_timeleft_(bytesleftprogess, sabnzbd.BPSMeter.bps)
job = "%s\t(%d/%d MB) %s" % (pnfo.filename, bytesleft, bytes_total, timeleft)
menu_queue_item = NSMenuItem.alloc().initWithTitle_action_keyEquivalent_(job, "", "")
@@ -509,8 +506,8 @@ class SABnzbdDelegate(NSObject):
if paused:
self.state = T("Paused")
if sabnzbd.scheduler.pause_int() != "0":
self.setMenuTitle_("\n\n%s\n" % (sabnzbd.scheduler.pause_int()))
if sabnzbd.Scheduler.pause_int() != "0":
self.setMenuTitle_("\n\n%s\n" % (sabnzbd.Scheduler.pause_int()))
else:
self.setMenuTitle_("")
elif bytes_left > 0:
@@ -546,7 +543,7 @@ class SABnzbdDelegate(NSObject):
def iconUpdate(self):
try:
if sabnzbd.downloader.Downloader.do.paused:
if sabnzbd.Downloader.paused:
self.status_item.setImage_(self.icons["pause"])
else:
self.status_item.setImage_(self.icons["idle"])
@@ -555,7 +552,7 @@ class SABnzbdDelegate(NSObject):
def pauseUpdate(self):
try:
if sabnzbd.downloader.Downloader.do.paused:
if sabnzbd.Downloader.paused:
if self.isLeopard:
self.resume_menu_item.setHidden_(NO)
self.pause_menu_item.setHidden_(YES)
@@ -574,7 +571,7 @@ class SABnzbdDelegate(NSObject):
def speedlimitUpdate(self):
try:
speed = int(sabnzbd.downloader.Downloader.do.get_limit())
speed = int(sabnzbd.Downloader.get_limit())
if self.speed != speed:
self.speed = speed
speedsValues = self.menu_speed.numberOfItems()
@@ -735,14 +732,14 @@ class SABnzbdDelegate(NSObject):
# logging.info("[osx] speed limit to %s" % (sender.representedObject()))
speed = int(sender.representedObject())
if speed != self.speed:
sabnzbd.downloader.Downloader.do.limit_speed("%s%%" % speed)
sabnzbd.Downloader.limit_speed("%s%%" % speed)
self.speedlimitUpdate()
def purgeAction_(self, sender):
mode = sender.representedObject()
# logging.info("[osx] purge %s" % (mode))
if mode == "queue":
NzbQueue.do.remove_all()
sabnzbd.NzbQueue.remove_all()
elif mode == "history":
if not self.history_db:
self.history_db = sabnzbd.database.HistoryDB()
@@ -752,18 +749,18 @@ class SABnzbdDelegate(NSObject):
minutes = int(sender.representedObject())
# logging.info("[osx] pause for %s" % (minutes))
if minutes:
scheduler.plan_resume(minutes)
sabnzbd.Scheduler.plan_resume(minutes)
else:
sabnzbd.downloader.Downloader.do.pause()
sabnzbd.Downloader.pause()
def resumeAction_(self, sender):
scheduler.plan_resume(0)
sabnzbd.Scheduler.plan_resume(0)
def watchedFolderAction_(self, sender):
sabnzbd.dirscanner.dirscan()
sabnzbd.DirScanner.scan()
def rssAction_(self, sender):
scheduler.force_rss()
sabnzbd.Scheduler.force_rss()
def openFolderAction_(self, sender):
folder2open = sender.representedObject()
@@ -802,7 +799,7 @@ class SABnzbdDelegate(NSObject):
# logging.info('[osx] file open')
# logging.info('[osx] file : %s' % (filenames))
for filename in filenames:
logging.info("[osx] receiving from OSX : %s", filename)
logging.info("[osx] receiving from macOS : %s", filename)
if os.path.exists(filename):
if sabnzbd.filesystem.get_ext(filename) in VALID_ARCHIVES + VALID_NZB_FILES:
sabnzbd.add_nzbfile(filename, keep=True)


@@ -23,17 +23,18 @@ import logging
import os
import re
import struct
from typing import Dict, Optional, Tuple
from sabnzbd.encoding import correct_unknown_encoding
PROBABLY_PAR2_RE = re.compile(r"(.*)\.vol(\d*)[\+\-](\d*)\.par2", re.I)
PROBABLY_PAR2_RE = re.compile(r"(.*)\.vol(\d*)[+\-](\d*)\.par2", re.I)
PAR_PKT_ID = b"PAR2\x00PKT"
PAR_FILE_ID = b"PAR 2.0\x00FileDesc"
PAR_CREATOR_ID = b"PAR 2.0\x00Creator"
PAR_RECOVERY_ID = b"RecvSlic"
def is_parfile(filename):
def is_parfile(filename: str) -> bool:
"""Check quickly whether file has par2 signature
or if the filename has '.par2' in it
"""
@@ -49,7 +50,7 @@ def is_parfile(filename):
return False
def analyse_par2(name, filepath=None):
def analyse_par2(name: str, filepath: Optional[str] = None) -> Tuple[str, int, int]:
"""Check if file is a par2-file and determine vol/block
return setname, vol, block
setname is empty when not a par2 file
@@ -82,7 +83,7 @@ def analyse_par2(name, filepath=None):
return setname, vol, block
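The vol/block naming convention matched by `PROBABLY_PAR2_RE` can be demonstrated directly with the same pattern the diff uses:

```python
import re

# Same pattern as PROBABLY_PAR2_RE in the diff
PROBABLY_PAR2_RE = re.compile(r"(.*)\.vol(\d*)[+\-](\d*)\.par2", re.I)

match = PROBABLY_PAR2_RE.match("My.Show.S01E01.vol03+04.par2")
setname, vol, block = match.group(1), int(match.group(2)), int(match.group(3))
# setname == "My.Show.S01E01", vol == 3, block == 4
```

Moving `+` out of an escaped position (`[\+\-]` to `[+\-]`) changes nothing functionally, since `+` is literal inside a character class; the diff just drops the redundant escape.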
def parse_par2_file(fname, md5of16k):
def parse_par2_file(fname: str, md5of16k: Dict[bytes, str]) -> Dict[str, bytes]:
"""Get the hash table and the first-16k hash table from a PAR2 file
Return as dictionary, indexed on names or hashes for the first-16k table
The input md5of16k is modified in place and thus not returned!
@@ -128,7 +129,7 @@ def parse_par2_file(fname, md5of16k):
return table
def parse_par2_file_packet(f, header):
def parse_par2_file_packet(f, header) -> Tuple[Optional[str], Optional[bytes], Optional[bytes]]:
""" Look up and analyze a FileDesc package """
nothing = None, None, None
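The constants above (`PAR_PKT_ID`, the type identifiers) follow the standard PAR2 packet header layout: an 8-byte magic, an 8-byte little-endian length covering the whole packet, a 16-byte packet MD5, a 16-byte recovery-set ID, and a 16-byte type, followed by the body. A minimal scanner over that layout, as a sketch independent of the parsing in this module:

```python
import struct

PAR_PKT_ID = b"PAR2\x00PKT"

def iter_par2_packets(data: bytes):
    """Yield (packet_type, body) for each PAR2 packet in a buffer.

    Sketch of the PAR2 packet layout: 8-byte magic, 8-byte little-endian
    total packet length, 16-byte packet MD5, 16-byte recovery-set id,
    16-byte packet type, then the packet body.
    """
    offset = 0
    while True:
        offset = data.find(PAR_PKT_ID, offset)
        if offset < 0:
            return
        (length,) = struct.unpack_from("<Q", data, offset + 8)
        packet_type = data[offset + 48 : offset + 64]
        yield packet_type, data[offset + 64 : offset + length]
        offset += length

# Build one synthetic Creator packet and scan it back
ptype = b"PAR 2.0\x00Creator".ljust(16, b"\x00")
body = b"SABnzbd"
pkt = PAR_PKT_ID + struct.pack("<Q", 64 + len(body)) + b"\x00" * 32 + ptype + body
packets = list(iter_par2_packets(pkt))
# packets == [(ptype, body)]
```

Scanning for the magic with `bytes.find` is what lets the parser skip damaged regions instead of failing on the first bad byte.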


@@ -21,12 +21,14 @@ sabnzbd.postproc - threaded post-processing of jobs
import os
import logging
import sabnzbd
import functools
import subprocess
import time
import re
import queue
from typing import List, Optional
import sabnzbd
from sabnzbd.newsunpack import (
unpack_magic,
par2_repair,
@@ -62,6 +64,7 @@ from sabnzbd.filesystem import (
get_ext,
get_filename,
)
from sabnzbd.nzbstuff import NzbObject
from sabnzbd.sorting import Sorter
from sabnzbd.constants import (
REPAIR_PRIORITY,
@@ -74,7 +77,6 @@ from sabnzbd.constants import (
VERIFIED_FILE,
)
from sabnzbd.nzbparser import process_single_nzb
from sabnzbd.rating import Rating
import sabnzbd.emailer as emailer
import sabnzbd.downloader
import sabnzbd.config as config
@@ -98,28 +100,27 @@ RE_SAMPLE = re.compile(sample_match, re.I)
class PostProcessor(Thread):
""" PostProcessor thread, designed as Singleton """
do = None # Link to instance of the thread
def __init__(self):
""" Initialize PostProcessor thread """
Thread.__init__(self)
# This history queue is simply used to log what active items to display in the web_ui
self.history_queue: List[NzbObject] = []
self.load()
if self.history_queue is None:
self.history_queue = []
# Fast-queue for jobs already finished by DirectUnpack
self.fast_queue = queue.Queue()
self.fast_queue: queue.Queue[Optional[NzbObject]] = queue.Queue()
# Regular queue for jobs that might need more attention
self.slow_queue = queue.Queue()
self.slow_queue: queue.Queue[Optional[NzbObject]] = queue.Queue()
# Load all old jobs
for nzo in self.history_queue:
self.process(nzo)
# So we can always cancel external processes
self.external_process: Optional[subprocess.Popen] = None
# Counter to not only process fast-jobs
self.__fast_job_count = 0
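The fast-queue/slow-queue split with a counter, as set up above, can be sketched as a small standalone polling helper (hypothetical, simplified from the thread logic): prefer fast jobs already finished by DirectUnpack, but every few of them give the slow queue a turn so it cannot starve.

```python
import queue

# Hypothetical sketch of the two-queue polling pattern
fast_queue: "queue.Queue[str]" = queue.Queue()
slow_queue: "queue.Queue[str]" = queue.Queue()
fast_job_count = 0

def get_next_job():
    """Return the next job, preferring fast jobs but not exclusively."""
    global fast_job_count
    if fast_job_count < 4:
        try:
            job = fast_queue.get_nowait()
            fast_job_count += 1
            return job
        except queue.Empty:
            pass
    # Every few fast jobs (or when the fast queue is empty), reset the
    # counter and give the slow queue a chance
    fast_job_count = 0
    try:
        return slow_queue.get_nowait()
    except queue.Empty:
        return None
```

With five fast jobs and one slow job queued, the processing order comes out as four fast jobs, the slow job, then the remaining fast job.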
@@ -127,7 +128,6 @@ class PostProcessor(Thread):
self.__stop = False
self.__busy = False
self.paused = False
PostProcessor.do = self
def save(self):
""" Save postproc queue """
@@ -136,7 +136,6 @@ class PostProcessor(Thread):
def load(self):
""" Save postproc queue """
self.history_queue = []
logging.info("Loading postproc queue")
data = sabnzbd.load_admin(POSTPROC_QUEUE_FILE_NAME)
if data is None:
@@ -146,7 +145,7 @@ class PostProcessor(Thread):
if POSTPROC_QUEUE_VERSION != version:
logging.warning(T("Old queue detected, use Status->Repair to convert the queue"))
elif isinstance(history_queue, list):
self.history_queue = [nzo for nzo in history_queue if os.path.exists(nzo.downpath)]
self.history_queue = [nzo for nzo in history_queue if os.path.exists(nzo.download_path)]
except:
logging.info("Corrupt %s file, discarding", POSTPROC_QUEUE_FILE_NAME)
logging.info("Traceback: ", exc_info=True)
@@ -164,7 +163,7 @@ class PostProcessor(Thread):
nzo.work_name = "" # Mark as deleted job
break
def process(self, nzo):
def process(self, nzo: NzbObject):
""" Push on finished job in the queue """
# Make sure we return the status "Waiting"
nzo.status = Status.QUEUED
@@ -179,7 +178,7 @@ class PostProcessor(Thread):
self.save()
sabnzbd.history_updated()
def remove(self, nzo):
def remove(self, nzo: NzbObject):
""" Remove given nzo from the queue """
try:
self.history_queue.remove(nzo)
@@ -201,6 +200,12 @@ class PostProcessor(Thread):
nzo.abort_direct_unpacker()
if nzo.pp_active:
nzo.pp_active = False
try:
# Try to kill any external running process
self.external_process.kill()
logging.info("Killed external process %s", self.external_process.args[0])
except:
pass
return True
return None
@@ -216,7 +221,7 @@ class PostProcessor(Thread):
""" Return download path for given nzo_id or None when not found """
for nzo in self.history_queue:
if nzo.nzo_id == nzo_id:
return nzo.downpath
return nzo.download_path
return None
def run(self):
@@ -240,6 +245,10 @@ class PostProcessor(Thread):
time.sleep(5)
continue
# Set NzbObject object to None so references from this thread do not keep the
# object alive until the next job is added to post-processing (see #1628)
nzo = None
# Something in the fast queue?
try:
# Every few fast-jobs we should check allow a
@@ -277,29 +286,29 @@ class PostProcessor(Thread):
# Pause downloader, if the user wants that
if cfg.pause_on_post_processing():
sabnzbd.downloader.Downloader.do.wait_for_postproc()
sabnzbd.Downloader.wait_for_postproc()
self.__busy = True
process_job(nzo)
if nzo.to_be_removed:
history_db = database.HistoryDB()
history_db.remove_history(nzo.nzo_id)
history_db.close()
with database.HistoryDB() as history_db:
history_db.remove_history(nzo.nzo_id)
nzo.purge_data()
# Processing done
nzo.pp_active = False
self.remove(nzo)
self.external_process = None
check_eoq = True
# Allow download to proceed
sabnzbd.downloader.Downloader.do.resume_from_postproc()
sabnzbd.Downloader.resume_from_postproc()
def process_job(nzo):
def process_job(nzo: NzbObject):
""" Process one job """
start = time.time()
@@ -329,17 +338,15 @@ def process_job(nzo):
# Get the NZB name
filename = nzo.final_name
# Download-processes can mark job as failed
# Download-processes can mark job as failed, skip all steps
if nzo.fail_msg:
nzo.status = Status.FAILED
nzo.save_attribs()
all_ok = False
par_error = True
unpack_error = 1
try:
# Get the folder containing the download result
workdir = nzo.downpath
workdir = nzo.download_path
tmp_workdir_complete = None
# if no files are present (except __admin__), fail the job
@@ -352,7 +359,7 @@ def process_job(nzo):
empty = True
emsg += " - https://sabnzbd.org/not-complete"
nzo.fail_msg = emsg
nzo.set_unpack_info("Fail", emsg)
nzo.set_unpack_info("Download", emsg)
nzo.status = Status.FAILED
# do not run unpacking or parity verification
flag_repair = flag_unpack = False
@@ -386,9 +393,9 @@ def process_job(nzo):
return False
# If we don't need extra par2, we can disconnect
if sabnzbd.nzbqueue.NzbQueue.do.actives(grabs=False) == 0 and cfg.autodisconnect():
if sabnzbd.NzbQueue.actives(grabs=False) == 0 and cfg.autodisconnect():
# This was the last job, close server connections
sabnzbd.downloader.Downloader.do.disconnect()
sabnzbd.Downloader.disconnect()
# Sanitize the resulting files
if sabnzbd.WIN32:
@@ -519,7 +526,7 @@ def process_job(nzo):
all_ok = False
if cfg.deobfuscate_final_filenames() and all_ok and not nzb_list:
# deobfuscate the filenames
# Deobfuscate the filenames
logging.info("Running deobfuscate")
deobfuscate.deobfuscate_list(newfiles, nzo.final_name)
@@ -596,19 +603,19 @@ def process_job(nzo):
# Update indexer with results
if cfg.rating_enable():
if nzo.encrypted > 0:
Rating.do.update_auto_flag(nzo.nzo_id, Rating.FLAG_ENCRYPTED)
sabnzbd.Rating.update_auto_flag(nzo.nzo_id, sabnzbd.Rating.FLAG_ENCRYPTED)
if empty:
hosts = [s.host for s in sabnzbd.downloader.Downloader.do.nzo_servers(nzo)]
hosts = [s.host for s in sabnzbd.Downloader.nzo_servers(nzo)]
if not hosts:
hosts = [None]
for host in hosts:
Rating.do.update_auto_flag(nzo.nzo_id, Rating.FLAG_EXPIRED, host)
sabnzbd.Rating.update_auto_flag(nzo.nzo_id, sabnzbd.Rating.FLAG_EXPIRED, host)
except:
logging.error(T("Post Processing Failed for %s (%s)"), filename, T("see logfile"))
logging.info("Traceback: ", exc_info=True)
nzo.fail_msg = T("PostProcessing was aborted (%s)") % T("see logfile")
nzo.fail_msg = T("Post-processing was aborted")
notifier.send_notification(T("Download Failed"), filename, "failed", nzo.cat)
nzo.status = Status.FAILED
par_error = True
@@ -645,6 +652,11 @@ def process_job(nzo):
if par_error or unpack_error in (2, 3):
try_alt_nzb(nzo)
# Check if it was aborted
if not nzo.pp_active:
nzo.fail_msg = T("Post-processing was aborted")
all_ok = False
# Show final status in history
if all_ok:
notifier.send_notification(T("Download Completed"), filename, "complete", nzo.cat)
@@ -656,20 +668,18 @@ def process_job(nzo):
# Log the overall time taken for postprocessing
postproc_time = int(time.time() - start)
# Create the history DB instance
history_db = database.HistoryDB()
# Add the nzo to the database. Only the path, script and time taken is passed
# Other information is obtained from the nzo
history_db.add_history_db(nzo, workdir_complete, postproc_time, script_log, script_line)
# Purge items
history_db.auto_history_purge()
# The connection is only used once, so close it here
history_db.close()
with database.HistoryDB() as history_db:
# Add the nzo to the database. Only the path, script and time taken is passed
# Other information is obtained from the nzo
history_db.add_history_db(nzo, workdir_complete, postproc_time, script_log, script_line)
# Purge items
history_db.auto_history_purge()
sabnzbd.history_updated()
return True
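The change above replaces the manual `HistoryDB()` / `close()` pair with a `with database.HistoryDB() as history_db:` block. A minimal sketch of the pattern being adopted (the class below is a toy stand-in, not SABnzbd's actual `HistoryDB`):

```python
import sqlite3


class HistoryDB:
    """Toy sketch of a DB wrapper usable as a context manager,
    mirroring the `with database.HistoryDB() as history_db:` pattern."""

    def __init__(self, path=":memory:"):
        self.con = sqlite3.connect(path)

    def __enter__(self):
        # Hand the wrapper itself to the `with` body
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Always close, even if the body raised
        self.con.close()


with HistoryDB() as history_db:
    history_db.con.execute("CREATE TABLE history (name TEXT)")
```

The advantage over explicit `close()` calls is that the connection is released on every exit path, including exceptions raised while adding or purging history entries.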
def prepare_extraction_path(nzo):
def prepare_extraction_path(nzo: NzbObject):
"""Based on the information that we have, generate
the extraction path and create the directory.
Separated so it can be called from DirectUnpacker
@@ -677,7 +687,7 @@ def prepare_extraction_path(nzo):
one_folder = False
marker_file = None
# Determine class directory
catdir = config.get_categories(nzo.cat).dir()
catdir = config.get_category(nzo.cat).dir()
if catdir.endswith("*"):
catdir = catdir.strip("*")
one_folder = True
@@ -723,14 +733,14 @@ def prepare_extraction_path(nzo):
return tmp_workdir_complete, workdir_complete, file_sorter, one_folder, marker_file
def parring(nzo, workdir):
def parring(nzo: NzbObject, workdir: str):
""" Perform par processing. Returns: (par_error, re_add) """
logging.info("Starting verification and repair of %s", nzo.final_name)
par_error = False
re_add = False
# Get verification status of sets
verified = sabnzbd.load_data(VERIFIED_FILE, nzo.workpath, remove=False) or {}
verified = sabnzbd.load_data(VERIFIED_FILE, nzo.admin_path, remove=False) or {}
# If all were verified successfully, we skip the rest of the checks
if verified and all(verified.values()):
@@ -749,15 +759,8 @@ def parring(nzo, workdir):
parfile_nzf = nzo.partable[setname]
# Check if file maybe wasn't deleted and if we maybe have more files in the parset
if os.path.exists(os.path.join(nzo.downpath, parfile_nzf.filename)) or nzo.extrapars[setname]:
if os.path.exists(os.path.join(nzo.download_path, parfile_nzf.filename)) or nzo.extrapars[setname]:
need_re_add, res = par2_repair(parfile_nzf, nzo, workdir, setname, single=single)
# Was it aborted?
if not nzo.pp_active:
re_add = False
par_error = True
break
re_add = re_add or need_re_add
verified[setname] = res
else:
@@ -796,16 +799,16 @@ def parring(nzo, workdir):
if nzo.priority != FORCE_PRIORITY:
nzo.priority = REPAIR_PRIORITY
nzo.status = Status.FETCHING
sabnzbd.nzbqueue.NzbQueue.do.add(nzo)
sabnzbd.downloader.Downloader.do.resume_from_postproc()
sabnzbd.NzbQueue.add(nzo)
sabnzbd.Downloader.resume_from_postproc()
sabnzbd.save_data(verified, VERIFIED_FILE, nzo.workpath)
sabnzbd.save_data(verified, VERIFIED_FILE, nzo.admin_path)
logging.info("Verification and repair finished for %s", nzo.final_name)
return par_error, re_add
def try_sfv_check(nzo, workdir):
def try_sfv_check(nzo: NzbObject, workdir):
"""Attempt to verify set using SFV file
Return None if no SFV-sets, True/False based on verification
"""
@@ -837,7 +840,7 @@ def try_sfv_check(nzo, workdir):
return True
def try_rar_check(nzo, rars):
def try_rar_check(nzo: NzbObject, rars):
"""Attempt to verify set using the RARs
Return True if verified, False when failed
When setname is '', all RAR files will be used, otherwise only the matching one
@@ -882,7 +885,7 @@ def try_rar_check(nzo, rars):
return True
def rar_renamer(nzo, workdir):
def rar_renamer(nzo: NzbObject, workdir):
""" Deobfuscate rar file names: Use header and content information to give RAR-files decent names """
nzo.status = Status.VERIFYING
nzo.set_unpack_info("Repair", T("Trying RAR-based verification"))
@@ -933,10 +936,14 @@ def rar_renamer(nzo, workdir):
if not len(rarvolnr):
return renamed_files
# Check number of different obfuscated rar sets:
numberofrarsets = len(rarvolnr[1])
# this could probably be done with a max-key-lambda one-liner, but ... how?
numberofrarsets = 0
for mykey in rarvolnr.keys():
numberofrarsets = max(numberofrarsets, len(rarvolnr[mykey]))
logging.debug("Number of rarsets is %s", numberofrarsets)
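The max-tracking loop above can indeed be collapsed into the one-liner the comment wonders about; a small sketch on a toy `rarvolnr` dict (toy data, not from the actual code):

```python
# Toy stand-in for rarvolnr: volume number -> {obfuscated filename: [contents]}
rarvolnr = {
    1: {"a.bin": ["x1"], "b.bin": ["y1"]},
    2: {"c.bin": ["x2"]},
}

# Equivalent of the loop: the largest number of candidate rars
# found at any single volume number
numberofrarsets = max(len(files) for files in rarvolnr.values())
print(numberofrarsets)  # 2
```

No lambda needed: a plain generator over `rarvolnr.values()` avoids the key lookup entirely.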
if numberofrarsets == 1:
# Just one obfuscated rarset
# Just one obfuscated rarset ... that's easy
logging.debug("Deobfuscate: Just one obfuscated rarset")
for filename in volnrext:
new_rar_name = "%s.%s" % (nzo.final_name, volnrext[filename][1])
@@ -945,47 +952,70 @@ def rar_renamer(nzo, workdir):
logging.debug("Deobfuscate: Renaming %s to %s" % (filename, new_rar_name))
renamer(filename, new_rar_name)
renamed_files += 1
else:
# More than one obfuscated rarset, so we must do matching based on the files inside the rar files
logging.debug("Number of obfuscated rarsets: %s", numberofrarsets)
return renamed_files
# Assign (random) rar set names
rarsetname = {} # in which rar set it should be, so rar set 'A', or 'B', or ...
mychar = "A"
# First things first: Assign a rarsetname to the rar files which have volume number 1
for base_obfuscated_filename in rarvolnr[1]:
rarsetname[base_obfuscated_filename] = mychar + "--" + nzo.final_name
mychar = chr(ord(mychar) + 1)
logging.debug("Deobfuscate: rarsetname %s", rarsetname)
# numberofrarsets is bigger than 1, so we have mixed rar sets and need pre-checking
# Do the matching, layer by layer (read: rarvolnumber)
# So, for all rar files with rarvolnr 1, find the contents (files inside the rar),
# match them with rarfiles with rarvolnr 2, and put those in the correct rarset.
# And so on, until the highest rarvolnr minus 1 is matched against the highest rarvolnr
for n in range(1, len(rarvolnr.keys())):
logging.debug("Deobfuscate: Finding matches between rar sets %s and %s" % (n, n + 1))
for base_obfuscated_filename in rarvolnr[n]:
matchcounter = 0
for next_obfuscated_filename in rarvolnr[n + 1]:
# set() method with intersection (less strict): set(rarvolnr[n][base_obfuscated_filename]).intersection(set(rarvolnr[n+1][next_obfuscated_filename]))
# check if the last filename inside the existing rar matches with the first filename in the following rar
if rarvolnr[n][base_obfuscated_filename][-1] == rarvolnr[n + 1][next_obfuscated_filename][0]:
try:
rarsetname[next_obfuscated_filename] = rarsetname[base_obfuscated_filename]
matchcounter += 1
except KeyError:
logging.warning(T("No matching earlier rar file for %s"), next_obfuscated_filename)
if matchcounter > 1:
logging.info("Deobfuscate: more than one match, so there is a risk of false-positive matching.")
# Sanity check of the rar set
# Get the highest rar part number (that's the upper limit):
highest_rar = sorted(rarvolnr.keys())[-1]
# A staircase check: the number of rarsets should not go up, but stay the same or go down
how_many_previous = 1000  # 1000 mixed rarsets ... should be enough ... typical is 1, 2 or maybe 3
# Start at part001.rar and go up to the highest
for rar_set_number in range(1, highest_rar + 1):
try:
how_many_here = len(rarvolnr[rar_set_number])
except:
# rarset does not exist at all
logging.warning("rarset %s is missing completely, so I can't deobfuscate.", rar_set_number)
return 0
# OK, it exists, now check that it does not hold more sets than the previous volume
if how_many_here > how_many_previous:
# this should not happen: more rarsets at this volume number than at the previous one
logging.warning("no staircase! rarset %s is higher than previous, so I can't deobfuscate.", rar_set_number)
return 0
how_many_previous = how_many_here
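The staircase sanity check added above can be exercised in isolation; a sketch with toy counts, where `staircase_ok` is a hypothetical helper (the real code inlines this loop) and the dict maps volume number to the number of candidate rars at that volume:

```python
def staircase_ok(counts_per_volume, highest):
    """Toy version of the staircase check: the number of rar sets may stay
    equal or shrink as the volume number rises, but never grow, and no
    volume number may be missing entirely."""
    previous = 1000  # generous upper bound, typical is 1-3 sets
    for vol in range(1, highest + 1):
        if vol not in counts_per_volume:
            return False  # rarset missing completely
        if counts_per_volume[vol] > previous:
            return False  # staircase goes up: refuse to deobfuscate
        previous = counts_per_volume[vol]
    return True


print(staircase_ok({1: 2, 2: 2, 3: 1}, 3))  # True: declining staircase
print(staircase_ok({1: 1, 2: 2}, 2))        # False: more sets at volume 2 than at 1
```

A set that fails this check is left untouched (`return 0` in the real code), since renaming on a bad match would corrupt the rar set.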
# Do the renaming:
for filename in rarsetname:
new_rar_name = "%s.%s" % (rarsetname[filename], volnrext[filename][1])
new_rar_name = os.path.join(workdir, new_rar_name)
new_rar_name = get_unique_filename(new_rar_name)
logging.debug("Deobfuscate: Renaming %s to %s" % (filename, new_rar_name))
renamer(filename, new_rar_name)
renamed_files += 1
# OK, that looked OK (a declining staircase), so we can safely proceed
# More than one obfuscated rarset, so we must do matching based on the files inside the rar files
# Assign (random) rar set names, on a first-come-first-served basis
rarsetname = {} # in which rar set it should be, so rar set 'A', or 'B', or ...
mychar = "A"
# First things first: Assign a rarsetname to the rar files which have volume number 1
for base_obfuscated_filename in rarvolnr[1]:
rarsetname[base_obfuscated_filename] = mychar + "--" + nzo.final_name
mychar = chr(ord(mychar) + 1)
logging.debug("Deobfuscate: rarsetname %s", rarsetname)
# Do the matching, layer by layer (read: rarvolnumber)
# So, for all rar files with rarvolnr 1, find the contents (files inside the rar),
# match them with rarfiles with rarvolnr 2, and put those in the correct rarset.
# And so on, until the highest rarvolnr minus 1 is matched against the highest rarvolnr
for n in range(1, len(rarvolnr)):
logging.debug("Deobfuscate: Finding matches between rar sets %s and %s" % (n, n + 1))
for base_obfuscated_filename in rarvolnr[n]:
matchcounter = 0
for next_obfuscated_filename in rarvolnr[n + 1]:
# set() method with intersection (less strict): set(rarvolnr[n][base_obfuscated_filename]).intersection(set(rarvolnr[n+1][next_obfuscated_filename]))
# check if the last filename inside the existing rar matches with the first filename in the following rar
if rarvolnr[n][base_obfuscated_filename][-1] == rarvolnr[n + 1][next_obfuscated_filename][0]:
try:
rarsetname[next_obfuscated_filename] = rarsetname[base_obfuscated_filename]
matchcounter += 1
except KeyError:
logging.warning(T("No matching earlier rar file for %s"), next_obfuscated_filename)
if matchcounter > 1:
logging.info("Deobfuscate: more than one match, so there is a risk of false-positive matching.")
# Do the renaming:
for filename in rarsetname:
new_rar_name = "%s.%s" % (rarsetname[filename], volnrext[filename][1])
new_rar_name = os.path.join(workdir, new_rar_name)
new_rar_name = get_unique_filename(new_rar_name)
logging.debug("Deobfuscate: Renaming %s to %s" % (filename, new_rar_name))
renamer(filename, new_rar_name)
renamed_files += 1
# Done: The obfuscated rar files have now been renamed to regular formatted filenames
return renamed_files
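The layer-by-layer matching described in the comments links volume n to volume n+1 when the last content file of one rar continues as the first content file of the next. A self-contained sketch on toy data (filenames and contents are made up; `rarvolnr` maps volume number to `{obfuscated filename: [files inside that rar]}`):

```python
# Two interleaved sets, volumes 1 and 2: movieA spans obf1/obf3,
# movieB spans obf2/obf4.
rarvolnr = {
    1: {"obf1.bin": ["movieA.mkv"], "obf2.bin": ["movieB.mkv"]},
    2: {"obf3.bin": ["movieA.mkv"], "obf4.bin": ["movieB.mkv"]},
}

# Seed set names from the volume-1 rars, first come first served
rarsetname = {}
mychar = "A"
for fname in rarvolnr[1]:
    rarsetname[fname] = mychar
    mychar = chr(ord(mychar) + 1)

# Propagate set names layer by layer
for n in range(1, len(rarvolnr)):
    for base in rarvolnr[n]:
        for nxt in rarvolnr[n + 1]:
            # Last file inside volume n matches first file inside volume n+1
            if rarvolnr[n][base][-1] == rarvolnr[n + 1][nxt][0]:
                rarsetname[nxt] = rarsetname[base]

print(rarsetname)  # obf1/obf3 -> 'A', obf2/obf4 -> 'B'
```

This is why the staircase check matters: the propagation assumes every volume number between 1 and the highest is present, so a completely missing volume breaks the chain.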
@@ -993,7 +1023,7 @@ def rar_renamer(nzo, workdir):
def handle_empty_queue():
""" Check if empty queue calls for action """
if sabnzbd.nzbqueue.NzbQueue.do.actives() == 0:
if sabnzbd.NzbQueue.actives() == 0:
sabnzbd.save_state()
notifier.send_notification("SABnzbd", T("Queue finished"), "queue_done")


@@ -75,12 +75,12 @@ def win_shutdown():
##############################################################################
# Power management for OSX
# Power management for macOS
##############################################################################
def osx_shutdown():
""" Shutdown OSX system, never returns """
""" Shutdown macOS system, never returns """
try:
subprocess.call(["osascript", "-e", 'tell app "System Events" to shut down'])
except:
@@ -90,7 +90,7 @@ def osx_shutdown():
def osx_standby():
""" Make OSX system sleep, returns after wakeup """
""" Make macOS system sleep, returns after wakeup """
try:
subprocess.call(["osascript", "-e", 'tell app "System Events" to sleep'])
time.sleep(10)
@@ -100,7 +100,7 @@ def osx_standby():
def osx_hibernate():
""" Make OSX system sleep, returns after wakeup """
""" Make macOS system sleep, returns after wakeup """
osx_standby()


@@ -27,7 +27,9 @@ import copy
import queue
import collections
from threading import RLock, Thread
import sabnzbd
from sabnzbd.constants import RATING_FILE_NAME
from sabnzbd.decorators import synchronized
import sabnzbd.cfg as cfg
@@ -76,19 +78,14 @@ class NzbRating:
self.user_flag = {}
self.auto_flag = {}
self.changed = 0
class NzbRatingV2(NzbRating):
def __init__(self):
super(NzbRatingV2, self).__init__()
self.avg_spam_cnt = 0
self.avg_spam_confirm = False
self.avg_encrypted_cnt = 0
self.avg_encrypted_confirm = False
def to_v2(self, rating):
self.__dict__.update(rating.__dict__)
return self
# TODO: Can be removed in version 3.3.0, needed for backwards compatibility
NzbRatingV2 = NzbRating
class Rating(Thread):
@@ -110,28 +107,19 @@ class Rating(Thread):
CHANGED_USER_FLAG = 0x08
CHANGED_AUTO_FLAG = 0x10
do = None
def __init__(self):
Rating.do = self
self.shutdown = False
self.queue = OrderedSetQueue()
self.version = Rating.VERSION
self.ratings = {}
self.nzo_indexer_map = {}
try:
self.version, self.ratings, self.nzo_indexer_map = sabnzbd.load_admin(
"Rating.sab", silent=not cfg.rating_enable()
)
if self.version == 1:
ratings = {}
for k, v in self.ratings.items():
ratings[k] = NzbRatingV2().to_v2(v)
self.ratings = ratings
self.version = 2
if self.version != Rating.VERSION:
raise Exception()
rating_data = sabnzbd.load_admin(RATING_FILE_NAME)
if rating_data:
self.version, self.ratings, self.nzo_indexer_map = rating_data
except:
self.version = Rating.VERSION
self.ratings = {}
self.nzo_indexer_map = {}
logging.info("Corrupt %s file, discarding", RATING_FILE_NAME)
logging.info("Traceback: ", exc_info=True)
Thread.__init__(self)
def stop(self):
@@ -158,8 +146,7 @@ class Rating(Thread):
@synchronized(RATING_LOCK)
def save(self):
if self.ratings and self.nzo_indexer_map:
sabnzbd.save_admin((self.version, self.ratings, self.nzo_indexer_map), "Rating.sab")
sabnzbd.save_admin((self.version, self.ratings, self.nzo_indexer_map), RATING_FILE_NAME)
# The same file may be uploaded multiple times creating a new nzo_id each time
@synchronized(RATING_LOCK)
@@ -175,7 +162,7 @@ class Rating(Thread):
fields["votedown"],
)
try:
rating = self.ratings.get(indexer_id, NzbRatingV2())
rating = self.ratings.get(indexer_id, NzbRating())
if fields["video"] and fields["videocnt"]:
rating.avg_video = int(float(fields["video"]))
rating.avg_video_cnt = int(float(fields["videocnt"]))


@@ -36,93 +36,12 @@ import sabnzbd.emailer as emailer
import feedparser
__RSS = None # Global pointer to RSS-scanner instance
##############################################################################
# Wrapper functions
##############################################################################
def init():
global __RSS
__RSS = RSSQueue()
def stop():
global __RSS
if __RSS:
__RSS.stop()
try:
__RSS.join()
except:
pass
def run_feed(feed, download, ignoreFirst=False, force=False, readout=True):
global __RSS
if __RSS:
return __RSS.run_feed(feed, download, ignoreFirst, force=force, readout=readout)
def show_result(feed):
global __RSS
if __RSS:
return __RSS.show_result(feed)
def flag_downloaded(feed, fid):
global __RSS
if __RSS:
__RSS.flag_downloaded(feed, fid)
def lookup_url(feed, fid):
global __RSS
if __RSS:
return __RSS.lookup_url(feed, fid)
def run_method():
global __RSS
if __RSS:
return __RSS.run()
else:
return None
def next_run(t=None):
global __RSS
if __RSS:
if t:
__RSS.next_run = t
else:
return __RSS.next_run
else:
return time.time()
def save():
global __RSS
if __RSS:
__RSS.save()
def clear_feed(feed):
global __RSS
if __RSS:
__RSS.clear_feed(feed)
def clear_downloaded(feed):
global __RSS
if __RSS:
__RSS.clear_downloaded(feed)
##############################################################################
def notdefault(item):
""" Return True if not 'Default|''|*' """
return bool(item) and str(item).lower() not in ("default", "*", "", str(DEFAULT_PRIORITY))
@@ -151,8 +70,7 @@ def remove_obsolete(jobs, new_jobs):
"""
now = time.time()
limit = now - 259200 # 3days (3x24x3600)
olds = list(jobs.keys())
for old in olds:
for old in list(jobs):
tm = jobs[old]["time"]
if old not in new_jobs:
if jobs[old].get("status", " ")[0] in ("G", "B"):
@@ -162,13 +80,13 @@ def remove_obsolete(jobs, new_jobs):
del jobs[old]
LOCK = threading.RLock()
RSS_LOCK = threading.RLock()
_RE_SP = re.compile(r"s*(\d+)[ex](\d+)", re.I)
_RE_SIZE1 = re.compile(r"Size:\s*(\d+\.\d+\s*[KMG]{0,1})B\W*", re.I)
_RE_SIZE2 = re.compile(r"\W*(\d+\.\d+\s*[KMG]{0,1})B\W*", re.I)
class RSSQueue:
class RSSReader:
def __init__(self):
self.jobs = {}
self.next_run = time.time()
@@ -178,7 +96,7 @@ class RSSQueue:
self.jobs = sabnzbd.load_admin(RSS_FILE_NAME)
if self.jobs:
for feed in self.jobs:
remove_obsolete(self.jobs[feed], list(self.jobs[feed].keys()))
remove_obsolete(self.jobs[feed], list(self.jobs[feed]))
except:
logging.warning(T("Cannot read %s"), RSS_FILE_NAME)
logging.info("Traceback: ", exc_info=True)
@@ -212,7 +130,7 @@ class RSSQueue:
def stop(self):
self.shutdown = True
@synchronized(LOCK)
@synchronized(RSS_LOCK)
def run_feed(self, feed=None, download=False, ignoreFirst=False, force=False, readout=True):
""" Run the query for one URI and apply filters """
self.shutdown = False
@@ -233,7 +151,6 @@ class RSSQueue:
uris = feeds.uri()
defCat = feeds.cat()
import sabnzbd.api
if not notdefault(defCat) or defCat not in sabnzbd.api.list_cats(default=False):
defCat = None
@@ -278,10 +195,12 @@ class RSSQueue:
feedparser.USER_AGENT = "SABnzbd/%s" % sabnzbd.__version__
# Read the RSS feed
msg = ""
entries = []
if readout:
all_entries = []
for uri in uris:
# Reset parsing message for each feed
msg = ""
feed_parsed = {}
uri = uri.replace(" ", "%20").replace("feed://", "http://")
@@ -554,11 +473,11 @@ class RSSQueue:
if not sabnzbd.PAUSED_ALL:
active = False
if self.next_run < time.time():
self.next_run = time.time() + cfg.rss_rate.get() * 60
self.next_run = time.time() + cfg.rss_rate() * 60
feeds = config.get_rss()
try:
for feed in feeds:
if feeds[feed].enable.get():
if feeds[feed].enable():
logging.info('Starting scheduled RSS read-out for "%s"', feed)
active = True
self.run_feed(feed, download=True, ignoreFirst=True)
@@ -577,7 +496,7 @@ class RSSQueue:
self.save()
logging.info("Finished scheduled RSS read-outs")
@synchronized(LOCK)
@synchronized(RSS_LOCK)
def show_result(self, feed):
if feed in self.jobs:
try:
@@ -587,16 +506,16 @@ class RSSQueue:
else:
return {}
@synchronized(LOCK)
@synchronized(RSS_LOCK)
def save(self):
sabnzbd.save_admin(self.jobs, RSS_FILE_NAME)
@synchronized(LOCK)
@synchronized(RSS_LOCK)
def delete(self, feed):
if feed in self.jobs:
del self.jobs[feed]
@synchronized(LOCK)
@synchronized(RSS_LOCK)
def flag_downloaded(self, feed, fid):
if feed in self.jobs:
lst = self.jobs[feed]
@@ -605,7 +524,7 @@ class RSSQueue:
lst[link]["status"] = "D"
lst[link]["time_downloaded"] = time.localtime()
@synchronized(LOCK)
@synchronized(RSS_LOCK)
def lookup_url(self, feed, url):
if url and feed in self.jobs:
lst = self.jobs[feed]
@@ -614,13 +533,13 @@ class RSSQueue:
return lst[link]
return None
@synchronized(LOCK)
@synchronized(RSS_LOCK)
def clear_feed(self, feed):
# Remove any previous references to this feed name, and start fresh
if feed in self.jobs:
del self.jobs[feed]
@synchronized(LOCK)
@synchronized(RSS_LOCK)
def clear_downloaded(self, feed):
# Mark downloaded jobs, so that they won't be displayed any more.
if feed in self.jobs:


@@ -26,8 +26,6 @@ from time import sleep
import sabnzbd
from sabnzbd.panic import launch_a_browser
import sabnzbd.api as api
import sabnzbd.scheduler as scheduler
from sabnzbd.downloader import Downloader
import sabnzbd.cfg as cfg
from sabnzbd.misc import to_units
@@ -145,7 +143,7 @@ class SABTrayThread(SysTrayIconThread):
def pausefor(self, minutes):
""" Need function for each pause-timer """
scheduler.plan_resume(minutes)
sabnzbd.Scheduler.plan_resume(minutes)
def pausefor5min(self, icon):
self.pausefor(5)
@@ -172,7 +170,7 @@ class SABTrayThread(SysTrayIconThread):
def rss(self, icon):
self.hover_text = T("Read all RSS feeds")
scheduler.force_rss()
sabnzbd.Scheduler.force_rss()
def nologin(self, icon):
sabnzbd.cfg.username.set("")
@@ -193,9 +191,9 @@ class SABTrayThread(SysTrayIconThread):
sabnzbd.shutdown_program()
def pause(self):
scheduler.plan_resume(0)
Downloader.do.pause()
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.Downloader.pause()
def resume(self):
scheduler.plan_resume(0)
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.unpause_all()


@@ -42,8 +42,6 @@ from os.path import abspath
import sabnzbd
from sabnzbd.panic import launch_a_browser
import sabnzbd.api as api
import sabnzbd.scheduler as scheduler
from sabnzbd.downloader import Downloader
import sabnzbd.cfg as cfg
from sabnzbd.misc import to_units
@@ -195,12 +193,12 @@ class StatusIcon(Thread):
sabnzbd.shutdown_program()
def pause(self):
scheduler.plan_resume(0)
Downloader.do.pause()
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.Downloader.pause()
def resume(self):
scheduler.plan_resume(0)
sabnzbd.Scheduler.plan_resume(0)
sabnzbd.unpause_all()
def rss(self, icon):
scheduler.force_rss()
sabnzbd.Scheduler.force_rss()


@@ -22,257 +22,370 @@ sabnzbd.scheduler - Event Scheduler
import random
import logging
import time
from typing import Optional
import sabnzbd.utils.kronos as kronos
import sabnzbd.rss as rss
import sabnzbd.rss
import sabnzbd.downloader
import sabnzbd.dirscanner
import sabnzbd.misc
import sabnzbd.config as config
import sabnzbd.cfg as cfg
from sabnzbd.postproc import PostProcessor
from sabnzbd.constants import LOW_PRIORITY, NORMAL_PRIORITY, HIGH_PRIORITY
__SCHED = None # Global pointer to Scheduler instance
class Scheduler:
def __init__(self):
self.scheduler = kronos.ThreadedScheduler()
self.pause_end: Optional[float] = None # Moment when pause will end
self.restart_scheduler = False
self.pp_pause_event = False
self.load_schedules()
SCHEDULE_GUARD_FLAG = False
PP_PAUSE_EVENT = False
def start(self):
""" Start the scheduler """
self.scheduler.start()
def stop(self):
""" Stop the scheduler, destroy instance """
logging.debug("Stopping scheduler")
self.scheduler.stop()
def schedule_guard():
""" Set flag for scheduler restart """
global SCHEDULE_GUARD_FLAG
SCHEDULE_GUARD_FLAG = True
def restart(self, plan_restart=True):
""" Stop and start scheduler """
if plan_restart:
self.restart_scheduler = True
elif self.restart_scheduler:
logging.debug("Restarting scheduler")
self.restart_scheduler = False
self.scheduler.stop()
self.scheduler.start()
self.analyse(sabnzbd.Downloader.paused)
self.load_schedules()
def abort(self):
"""Emergency stop, just set the running attribute false so we don't
have to wait the full scheduler-check cycle before it really stops"""
self.scheduler.running = False
def is_alive(self):
""" Thread-like check if we are doing fine """
if self.scheduler.thread:
return self.scheduler.thread.is_alive()
return False
def load_schedules(self):
rss_planned = False
for schedule in cfg.schedules():
arguments = []
argument_list = None
try:
enabled, m, h, d, action_name = schedule.split()
except:
try:
enabled, m, h, d, action_name, argument_list = schedule.split(None, 5)
except:
continue # Bad schedule, ignore
if argument_list:
arguments = argument_list.split()
action_name = action_name.lower()
try:
m = int(m)
h = int(h)
except:
logging.warning(T("Bad schedule %s at %s:%s"), action_name, m, h)
continue
if d.isdigit():
d = [int(i) for i in d]
else:
d = list(range(1, 8))
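Each schedule entry is a space-separated string of the form `enabled minute hour days action [arguments]`, as the parsing above shows. A sketch of the same logic as a standalone function (`parse_schedule` is a hypothetical helper and the sample string is made up; the real code inlines this in `load_schedules`):

```python
def parse_schedule(schedule):
    """Toy parser for a schedule line: enabled flag, minute, hour,
    day-of-week digits, action name, optional argument list."""
    enabled, m, h, d, action_name, *rest = schedule.split(None, 5)
    arguments = rest[0].split() if rest else []
    m, h = int(m), int(h)
    # "d" is either digits like "15" (days 1 and 5) or anything else meaning all days
    days = [int(i) for i in d] if d.isdigit() else list(range(1, 8))
    return enabled == "1", m, h, days, action_name.lower(), arguments


print(parse_schedule("1 30 6 15 speedlimit 50"))
# -> (True, 30, 6, [1, 5], 'speedlimit', ['50'])
```

Note the `days` quirk: `"15"` is not day fifteen but the list of individual digits, days 1 and 5.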
if action_name == "resume":
action = self.scheduled_resume
arguments = []
elif action_name == "pause":
action = sabnzbd.Downloader.pause
arguments = []
elif action_name == "pause_all":
action = sabnzbd.pause_all
arguments = []
elif action_name == "shutdown":
action = sabnzbd.shutdown_program
arguments = []
elif action_name == "restart":
action = sabnzbd.restart_program
arguments = []
elif action_name == "pause_post":
action = pp_pause
elif action_name == "resume_post":
action = pp_resume
elif action_name == "speedlimit" and arguments != []:
action = sabnzbd.Downloader.limit_speed
elif action_name == "enable_server" and arguments != []:
action = sabnzbd.enable_server
elif action_name == "disable_server" and arguments != []:
action = sabnzbd.disable_server
elif action_name == "scan_folder":
action = sabnzbd.DirScanner.scan
elif action_name == "rss_scan":
action = sabnzbd.RSSReader.run
rss_planned = True
elif action_name == "remove_failed":
action = sabnzbd.api.history_remove_failed
elif action_name == "remove_completed":
action = sabnzbd.api.history_remove_completed
elif action_name == "enable_quota":
action = sabnzbd.BPSMeter.set_status
arguments = [True]
elif action_name == "disable_quota":
action = sabnzbd.BPSMeter.set_status
arguments = [False]
elif action_name == "pause_all_low":
action = sabnzbd.NzbQueue.pause_on_prio
arguments = [LOW_PRIORITY]
elif action_name == "pause_all_normal":
action = sabnzbd.NzbQueue.pause_on_prio
arguments = [NORMAL_PRIORITY]
elif action_name == "pause_all_high":
action = sabnzbd.NzbQueue.pause_on_prio
arguments = [HIGH_PRIORITY]
elif action_name == "resume_all_low":
action = sabnzbd.NzbQueue.resume_on_prio
arguments = [LOW_PRIORITY]
elif action_name == "resume_all_normal":
action = sabnzbd.NzbQueue.resume_on_prio
arguments = [NORMAL_PRIORITY]
elif action_name == "resume_all_high":
action = sabnzbd.NzbQueue.resume_on_prio
arguments = [HIGH_PRIORITY]
elif action_name == "pause_cat":
action = sabnzbd.NzbQueue.pause_on_cat
arguments = [argument_list]
elif action_name == "resume_cat":
action = sabnzbd.NzbQueue.resume_on_cat
arguments = [argument_list]
else:
logging.warning(T("Unknown action: %s"), action_name)
continue
if enabled == "1":
logging.info("Scheduling %s(%s) on days %s at %02d:%02d", action_name, arguments, d, h, m)
self.scheduler.add_daytime_task(action, action_name, d, None, (h, m), args=arguments)
else:
logging.debug("Skipping %s(%s) on days %s at %02d:%02d", action_name, arguments, d, h, m)
# Set RSS check interval
if not rss_planned:
interval = cfg.rss_rate()
delay = random.randint(0, interval - 1)
logging.info("Scheduling RSS interval task every %s min (delay=%s)", interval, delay)
sabnzbd.RSSReader.next_run = time.time() + delay * 60
self.scheduler.add_interval_task(sabnzbd.RSSReader.run, "RSS", delay * 60, interval * 60)
self.scheduler.add_single_task(sabnzbd.RSSReader.run, "RSS", 15)
if cfg.version_check():
# Check for new release, once per week on random time
m = random.randint(0, 59)
h = random.randint(0, 23)
d = (random.randint(1, 7),)
logging.info("Scheduling VersionCheck on day %s at %s:%s", d[0], h, m)
self.scheduler.add_daytime_task(sabnzbd.misc.check_latest_version, "VerCheck", d, None, (h, m))
action, hour, minute = sabnzbd.BPSMeter.get_quota()
if action:
logging.info("Setting schedule for quota check daily at %s:%s", hour, minute)
self.scheduler.add_daytime_task(action, "quota_reset", list(range(1, 8)), None, (hour, minute))
if sabnzbd.misc.int_conv(cfg.history_retention()) > 0:
logging.info("Setting schedule for midnight auto history-purge")
self.scheduler.add_daytime_task(
sabnzbd.database.midnight_history_purge, "midnight_history_purge", list(range(1, 8)), None, (0, 0)
)
logging.info("Setting schedule for midnight BPS reset")
self.scheduler.add_daytime_task(sabnzbd.BPSMeter.midnight, "midnight_bps", list(range(1, 8)), None, (0, 0))
# Subscribe to special schedule changes
cfg.rss_rate.callback(self.scheduler_restart_guard)
def analyse(self, was_paused=False, priority=None):
"""Determine what pause/resume state we would have now.
'priority': evaluate only effect for given priority, return True for paused
"""
self.pp_pause_event = False
paused = None
paused_all = False
pause_post = False
pause_low = pause_normal = pause_high = False
speedlimit = None
quota = True
servers = {}
for ev in sort_schedules(all_events=True):
if priority is None:
logging.debug("Schedule check result = %s", ev)
# Skip if disabled
if ev[4] == "0":
continue
action = ev[1]
try:
value = ev[2]
except:
value = None
if action == "pause":
paused = True
elif action == "pause_all":
paused_all = True
self.pp_pause_event = True
elif action == "resume":
paused = False
paused_all = False
elif action == "pause_post":
pause_post = True
self.pp_pause_event = True
elif action == "resume_post":
pause_post = False
self.pp_pause_event = True
elif action == "speedlimit" and value is not None:
speedlimit = ev[2]
elif action == "pause_all_low":
pause_low = True
elif action == "pause_all_normal":
pause_normal = True
elif action == "pause_all_high":
pause_high = True
elif action == "resume_all_low":
pause_low = False
elif action == "resume_all_normal":
pause_normal = False
elif action == "resume_all_high":
pause_high = False
elif action == "enable_quota":
quota = True
elif action == "disable_quota":
quota = False
elif action == "enable_server":
try:
servers[value] = 1
except:
logging.warning(T("Schedule for non-existing server %s"), value)
elif action == "disable_server":
try:
servers[value] = 0
except:
logging.warning(T("Schedule for non-existing server %s"), value)
# Special case, a priority was passed, so evaluate only that and return state
if priority == LOW_PRIORITY:
return pause_low
if priority == NORMAL_PRIORITY:
return pause_normal
if priority == HIGH_PRIORITY:
return pause_high
if priority is not None:
return False
# Normal analysis
if not was_paused:
if paused_all:
sabnzbd.pause_all()
else:
sabnzbd.unpause_all()
sabnzbd.Downloader.set_paused_state(paused or paused_all)
sabnzbd.PostProcessor.paused = pause_post
if speedlimit is not None:
sabnzbd.Downloader.limit_speed(speedlimit)
sabnzbd.BPSMeter.set_status(quota, action=False)
for serv in servers:
try:
item = config.get_config("servers", serv)
value = servers[serv]
if bool(item.enable()) != bool(value):
item.enable.set(value)
sabnzbd.Downloader.init_server(serv, serv)
except:
pass
config.save_config()
def scheduler_restart_guard(self):
""" Set flag for scheduler restart """
self.restart_scheduler = True
def scheduled_resume(self):
""" Scheduled resume, only when no oneshot resume is active """
if self.pause_end is None:
sabnzbd.unpause_all()
def __oneshot_resume(self, when):
"""Called by delayed resume schedule
Only resumes if call comes at the planned time
"""
if self.pause_end is not None and (when > self.pause_end - 5) and (when < self.pause_end + 55):
self.pause_end = None
logging.debug("Resume after pause-interval")
sabnzbd.unpause_all()
else:
logging.debug("Ignoring cancelled resume")
def plan_resume(self, interval):
""" Set a scheduled resume after the interval """
if interval > 0:
self.pause_end = time.time() + (interval * 60)
logging.debug("Schedule resume at %s", self.pause_end)
self.scheduler.add_single_task(self.__oneshot_resume, "", interval * 60, args=[self.pause_end])
sabnzbd.Downloader.pause()
else:
self.pause_end = None
sabnzbd.unpause_all()
def pause_int(self) -> str:
""" Return minutes:seconds until pause ends """
if self.pause_end is None:
return "0"
else:
val = self.pause_end - time.time()
if val < 0:
sign = "-"
val = abs(val)
else:
sign = ""
mins = int(val / 60)
sec = int(val - mins * 60)
return "%s%d:%02d" % (sign, mins, sec)
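The signed minutes:seconds rendering used by `pause_int()` can be exercised in isolation; a minimal sketch (the function name is illustrative, not part of the module):

```python
def format_pause_countdown(seconds_left: float) -> str:
    """Render remaining pause time as [-]M:SS, mirroring pause_int()."""
    sign = "-" if seconds_left < 0 else ""
    val = abs(seconds_left)
    mins = int(val / 60)
    sec = int(val - mins * 60)
    return "%s%d:%02d" % (sign, mins, sec)

print(format_pause_countdown(125))   # 2:05
print(format_pause_countdown(-7))    # -0:07
```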
def pause_check(self):
""" Unpause when time left is negative, compensate for missed schedule """
if self.pause_end is not None and (self.pause_end - time.time()) < 0:
self.pause_end = None
logging.debug("Force resume, negative timer")
sabnzbd.unpause_all()
def plan_server(self, action, parms, interval):
""" Plan to re-activate server after 'interval' minutes """
self.scheduler.add_single_task(action, "", interval * 60, args=parms)
def force_rss(self):
""" Add a one-time RSS scan, one second from now """
self.scheduler.add_single_task(sabnzbd.RSSReader.run, "RSS", 1)
def pp_pause():
PostProcessor.do.paused = True
sabnzbd.PostProcessor.paused = True
def pp_resume():
PostProcessor.do.paused = False
def pp_pause_event():
return PP_PAUSE_EVENT
def init():
""" Create the scheduler and set all required events """
global __SCHED
reset_guardian()
__SCHED = kronos.ThreadedScheduler()
rss_planned = False
for schedule in cfg.schedules():
arguments = []
argument_list = None
try:
enabled, m, h, d, action_name = schedule.split()
except:
try:
enabled, m, h, d, action_name, argument_list = schedule.split(None, 5)
except:
continue # Bad schedule, ignore
if argument_list:
arguments = argument_list.split()
action_name = action_name.lower()
try:
m = int(m)
h = int(h)
except:
logging.warning(T("Bad schedule %s at %s:%s"), action_name, m, h)
continue
if d.isdigit():
d = [int(i) for i in d]
else:
d = list(range(1, 8))
if action_name == "resume":
action = scheduled_resume
arguments = []
elif action_name == "pause":
action = sabnzbd.downloader.Downloader.do.pause
arguments = []
elif action_name == "pause_all":
action = sabnzbd.pause_all
arguments = []
elif action_name == "shutdown":
action = sabnzbd.shutdown_program
arguments = []
elif action_name == "restart":
action = sabnzbd.restart_program
arguments = []
elif action_name == "pause_post":
action = pp_pause
elif action_name == "resume_post":
action = pp_resume
elif action_name == "speedlimit" and arguments != []:
action = sabnzbd.downloader.Downloader.do.limit_speed
elif action_name == "enable_server" and arguments != []:
action = sabnzbd.enable_server
elif action_name == "disable_server" and arguments != []:
action = sabnzbd.disable_server
elif action_name == "scan_folder":
action = sabnzbd.dirscanner.dirscan
elif action_name == "rss_scan":
action = rss.run_method
rss_planned = True
elif action_name == "remove_failed":
action = sabnzbd.api.history_remove_failed
elif action_name == "remove_completed":
action = sabnzbd.api.history_remove_completed
elif action_name == "enable_quota":
action = sabnzbd.bpsmeter.BPSMeter.do.set_status
arguments = [True]
elif action_name == "disable_quota":
action = sabnzbd.bpsmeter.BPSMeter.do.set_status
arguments = [False]
elif action_name == "pause_all_low":
action = sabnzbd.nzbqueue.NzbQueue.do.pause_on_prio
arguments = [LOW_PRIORITY]
elif action_name == "pause_all_normal":
action = sabnzbd.nzbqueue.NzbQueue.do.pause_on_prio
arguments = [NORMAL_PRIORITY]
elif action_name == "pause_all_high":
action = sabnzbd.nzbqueue.NzbQueue.do.pause_on_prio
arguments = [HIGH_PRIORITY]
elif action_name == "resume_all_low":
action = sabnzbd.nzbqueue.NzbQueue.do.resume_on_prio
arguments = [LOW_PRIORITY]
elif action_name == "resume_all_normal":
action = sabnzbd.nzbqueue.NzbQueue.do.resume_on_prio
arguments = [NORMAL_PRIORITY]
elif action_name == "resume_all_high":
action = sabnzbd.nzbqueue.NzbQueue.do.resume_on_prio
arguments = [HIGH_PRIORITY]
elif action_name == "pause_cat":
action = sabnzbd.nzbqueue.NzbQueue.do.pause_on_cat
arguments = [argument_list]
elif action_name == "resume_cat":
action = sabnzbd.nzbqueue.NzbQueue.do.resume_on_cat
arguments = [argument_list]
else:
logging.warning(T("Unknown action: %s"), action_name)
continue
if enabled == "1":
logging.debug("Scheduling %s(%s) on days %s at %02d:%02d", action_name, arguments, d, h, m)
__SCHED.add_daytime_task(action, action_name, d, None, (h, m), kronos.method.sequential, arguments, None)
else:
logging.debug("Skipping %s(%s) on days %s at %02d:%02d", action_name, arguments, d, h, m)
# Set Guardian interval to 30 seconds
__SCHED.add_interval_task(sched_guardian, "Guardian", 15, 30, kronos.method.sequential, None, None)
# Set RSS check interval
if not rss_planned:
interval = cfg.rss_rate()
delay = random.randint(0, interval - 1)
logging.debug("Scheduling RSS interval task every %s min (delay=%s)", interval, delay)
sabnzbd.rss.next_run(time.time() + delay * 60)
__SCHED.add_interval_task(
rss.run_method, "RSS", delay * 60, interval * 60, kronos.method.sequential, None, None
)
__SCHED.add_single_task(rss.run_method, "RSS", 15, kronos.method.sequential, None, None)
if cfg.version_check():
# Check for new release, once per week on random time
m = random.randint(0, 59)
h = random.randint(0, 23)
d = (random.randint(1, 7),)
logging.debug("Scheduling VersionCheck on day %s at %s:%s", d[0], h, m)
__SCHED.add_daytime_task(
sabnzbd.misc.check_latest_version, "VerCheck", d, None, (h, m), kronos.method.sequential, [], None
)
action, hour, minute = sabnzbd.bpsmeter.BPSMeter.do.get_quota()
if action:
logging.info("Setting schedule for quota check daily at %s:%s", hour, minute)
__SCHED.add_daytime_task(
action, "quota_reset", list(range(1, 8)), None, (hour, minute), kronos.method.sequential, [], None
)
if sabnzbd.misc.int_conv(cfg.history_retention()) > 0:
logging.info("Setting schedule for midnight auto history-purge")
__SCHED.add_daytime_task(
sabnzbd.database.midnight_history_purge,
"midnight_history_purge",
list(range(1, 8)),
None,
(0, 0),
kronos.method.sequential,
[],
None,
)
logging.info("Setting schedule for midnight BPS reset")
__SCHED.add_daytime_task(
sabnzbd.bpsmeter.midnight_action,
"midnight_bps",
list(range(1, 8)),
None,
(0, 0),
kronos.method.sequential,
[],
None,
)
# Subscribe to special schedule changes
cfg.rss_rate.callback(schedule_guard)
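The schedule lines consumed by `init()` follow the format `enabled minute hour days action [arguments]`, as inferred from the two-stage `split()` above. A standalone sketch of that parsing logic (function and dict keys are illustrative):

```python
def parse_schedule(schedule: str):
    """Split a SABnzbd-style schedule line: 'enabled minute hour days action [args]'.
    Returns a dict, or None for a malformed line (mirrors the loop in init())."""
    parts = schedule.split(None, 5)
    if len(parts) < 5:
        return None  # bad schedule, ignore
    enabled, m, h, d, action_name = parts[:5]
    arguments = parts[5].split() if len(parts) == 6 else []
    try:
        m, h = int(m), int(h)
    except ValueError:
        return None  # bad time fields
    # Digit string means explicit weekdays; anything else means all days (1..7)
    days = [int(i) for i in d] if d.isdigit() else list(range(1, 8))
    return {"enabled": enabled == "1", "minute": m, "hour": h,
            "days": days, "action": action_name.lower(), "args": arguments}

print(parse_schedule("1 30 6 1234567 speedlimit 50"))
```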
def start():
""" Start the scheduler """
global __SCHED
if __SCHED:
logging.debug("Starting scheduler")
__SCHED.start()
def restart(force=False):
""" Stop and start scheduler """
global __PARMS, SCHEDULE_GUARD_FLAG
if force:
SCHEDULE_GUARD_FLAG = True
else:
if SCHEDULE_GUARD_FLAG:
SCHEDULE_GUARD_FLAG = False
stop()
analyse(sabnzbd.downloader.Downloader.do.paused)
init()
start()
def stop():
""" Stop the scheduler, destroy instance """
global __SCHED
if __SCHED:
logging.debug("Stopping scheduler")
try:
__SCHED.stop()
except IndexError:
pass
del __SCHED
__SCHED = None
def abort():
""" Emergency stop, just set the running attribute false """
global __SCHED
if __SCHED:
logging.debug("Terminating scheduler")
__SCHED.running = False
sabnzbd.PostProcessor.paused = False
def sort_schedules(all_events, now=None):
@@ -318,211 +431,3 @@ def sort_schedules(all_events, now=None):
events.sort(key=lambda x: x[0])
return events
def analyse(was_paused=False, priority=None):
"""Determine what pause/resume state we would have now.
'priority': evaluate only effect for given priority, return True for paused
"""
global PP_PAUSE_EVENT
PP_PAUSE_EVENT = False
paused = None
paused_all = False
pause_post = False
pause_low = pause_normal = pause_high = False
speedlimit = None
quota = True
servers = {}
for ev in sort_schedules(all_events=True):
if priority is None:
logging.debug("Schedule check result = %s", ev)
# Skip if disabled
if ev[4] == "0":
continue
action = ev[1]
try:
value = ev[2]
except:
value = None
if action == "pause":
paused = True
elif action == "pause_all":
paused_all = True
PP_PAUSE_EVENT = True
elif action == "resume":
paused = False
paused_all = False
elif action == "pause_post":
pause_post = True
PP_PAUSE_EVENT = True
elif action == "resume_post":
pause_post = False
PP_PAUSE_EVENT = True
elif action == "speedlimit" and value is not None:
speedlimit = ev[2]
elif action == "pause_all_low":
pause_low = True
elif action == "pause_all_normal":
pause_normal = True
elif action == "pause_all_high":
pause_high = True
elif action == "resume_all_low":
pause_low = False
elif action == "resume_all_normal":
pause_normal = False
elif action == "resume_all_high":
pause_high = False
elif action == "enable_quota":
quota = True
elif action == "disable_quota":
quota = False
elif action == "enable_server":
try:
servers[value] = 1
except:
logging.warning(T("Schedule for non-existing server %s"), value)
elif action == "disable_server":
try:
servers[value] = 0
except:
logging.warning(T("Schedule for non-existing server %s"), value)
# Special case, a priority was passed, so evaluate only that and return state
if priority == LOW_PRIORITY:
return pause_low
if priority == NORMAL_PRIORITY:
return pause_normal
if priority == HIGH_PRIORITY:
return pause_high
if priority is not None:
return False
# Normal analysis
if not was_paused:
if paused_all:
sabnzbd.pause_all()
else:
sabnzbd.unpause_all()
sabnzbd.downloader.Downloader.do.set_paused_state(paused or paused_all)
PostProcessor.do.paused = pause_post
if speedlimit is not None:
sabnzbd.downloader.Downloader.do.limit_speed(speedlimit)
sabnzbd.bpsmeter.BPSMeter.do.set_status(quota, action=False)
for serv in servers:
try:
item = config.get_config("servers", serv)
value = servers[serv]
if bool(item.enable()) != bool(value):
item.enable.set(value)
sabnzbd.downloader.Downloader.do.init_server(serv, serv)
except:
pass
config.save_config()
# Support for single shot pause (=delayed resume)
__PAUSE_END = None # Moment when pause will end
def scheduled_resume():
""" Scheduled resume, only when no oneshot resume is active """
global __PAUSE_END
if __PAUSE_END is None:
sabnzbd.unpause_all()
def __oneshot_resume(when):
"""Called by delayed resume schedule
Only resumes if call comes at the planned time
"""
global __PAUSE_END
if __PAUSE_END is not None and (when > __PAUSE_END - 5) and (when < __PAUSE_END + 55):
__PAUSE_END = None
logging.debug("Resume after pause-interval")
sabnzbd.unpause_all()
else:
logging.debug("Ignoring cancelled resume")
def plan_resume(interval):
""" Set a scheduled resume after the interval """
global __SCHED, __PAUSE_END
if interval > 0:
__PAUSE_END = time.time() + (interval * 60)
logging.debug("Schedule resume at %s", __PAUSE_END)
__SCHED.add_single_task(__oneshot_resume, "", interval * 60, kronos.method.sequential, [__PAUSE_END], None)
sabnzbd.downloader.Downloader.do.pause()
else:
__PAUSE_END = None
sabnzbd.unpause_all()
def pause_int():
""" Return minutes:seconds until pause ends """
global __PAUSE_END
if __PAUSE_END is None:
return "0"
else:
val = __PAUSE_END - time.time()
if val < 0:
sign = "-"
val = abs(val)
else:
sign = ""
mins = int(val / 60)
sec = int(val - mins * 60)
return "%s%d:%02d" % (sign, mins, sec)
def pause_check():
""" Unpause when time left is negative, compensate for missed schedule """
global __PAUSE_END
if __PAUSE_END is not None and (__PAUSE_END - time.time()) < 0:
__PAUSE_END = None
logging.debug("Force resume, negative timer")
sabnzbd.unpause_all()
def plan_server(action, parms, interval):
""" Plan to re-activate server after 'interval' minutes """
__SCHED.add_single_task(action, "", interval * 60, kronos.method.sequential, parms, None)
def force_rss():
""" Add a one-time RSS scan, one second from now """
__SCHED.add_single_task(rss.run_method, "RSS", 1, kronos.method.sequential, None, None)
# Scheduler Guarding system
# Each check sets the guardian flag False
# Each successful scheduled check sets the flag
# If 4 consecutive checks fail, the scheduler is assumed to have crashed
__SCHED_GUARDIAN = False
__SCHED_GUARDIAN_CNT = 0
def reset_guardian():
global __SCHED_GUARDIAN, __SCHED_GUARDIAN_CNT
__SCHED_GUARDIAN = False
__SCHED_GUARDIAN_CNT = 0
def sched_guardian():
global __SCHED_GUARDIAN, __SCHED_GUARDIAN_CNT
__SCHED_GUARDIAN = True
def sched_check():
global __SCHED_GUARDIAN, __SCHED_GUARDIAN_CNT
if not __SCHED_GUARDIAN:
__SCHED_GUARDIAN_CNT += 1
return __SCHED_GUARDIAN_CNT < 4
reset_guardian()
return True
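The guardian described in the comments above is a missed-heartbeat watchdog: the scheduled task sets a flag, the watcher clears it, and four consecutive misses mean the scheduler is presumed dead. A minimal standalone sketch of the same pattern (class and method names are illustrative):

```python
class HeartbeatGuard:
    """Assume the watched task dead after `limit` consecutive missed beats."""
    def __init__(self, limit: int = 4):
        self.limit = limit
        self._beat = False
        self._missed = 0

    def beat(self):
        # Called from inside the scheduler (cf. sched_guardian)
        self._beat = True

    def check(self) -> bool:
        # Called periodically by the watcher (cf. sched_check)
        if not self._beat:
            self._missed += 1
            return self._missed < self.limit
        # Heartbeat seen: reset state (cf. reset_guardian)
        self._beat = False
        self._missed = 0
        return True

g = HeartbeatGuard()
g.beat()
assert g.check()          # heartbeat seen -> healthy
for _ in range(3):
    assert g.check()      # three misses are still tolerated
assert not g.check()      # fourth consecutive miss -> presumed crashed
```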


@@ -115,6 +115,7 @@ SKIN_TEXT = {
"thisMonth": TT("This month"),
"today": TT("Today"),
"total": TT("Total"),
"custom": TT("Custom"),
"on": TT("on"),
"off": TT("off"),
"parameters": TT("Parameters"), #: Config: startup parameters of SABnzbd
@@ -459,7 +460,9 @@ SKIN_TEXT = {
"opt-top_only": TT("Only Get Articles for Top of Queue"),
"explain-top_only": TT("Enable for less memory usage. Disable to prevent slow jobs from blocking the queue."),
"opt-safe_postproc": TT("Post-Process Only Verified Jobs"),
"explain-safe_postproc": TT("Only perform post-processing on jobs that passed all PAR2 checks."),
"explain-safe_postproc": TT(
"Only unpack and run scripts on jobs that passed the verification stage. If turned off, all jobs will be marked as Completed even if they are incomplete."
),
"opt-pause_on_pwrar": TT("Action when encrypted RAR is downloaded"),
"explain-pause_on_pwrar": TT('In case of "Pause", you\'ll need to set a password and resume the job.'),
"opt-no_dupes": TT("Detect Duplicate Downloads"),
@@ -696,7 +699,7 @@ SKIN_TEXT = {
"opt-ncenter_enable": TT("Notification Center"),
"opt-acenter_enable": TT("Enable Windows Notifications"),
"testNotify": TT("Test Notification"),
"section-NC": TT("Notification Center"), #: Header for OSX Notification Center section
"section-NC": TT("Notification Center"), #: Header for macOS Notification Center section
"section-AC": TT("Windows Notifications"),
"section-OSD": TT("NotifyOSD"), #: Header for Ubuntu's NotifyOSD notifications section
"section-Prowl": TT("Prowl"), #: Header for Prowl notification section


@@ -25,6 +25,7 @@ Generic Sorting - Sorting large files by a custom matching
import os
import logging
import re
from typing import Optional
import sabnzbd
from sabnzbd.filesystem import (
@@ -39,6 +40,7 @@ from sabnzbd.filesystem import (
)
from sabnzbd.constants import series_match, date_match, year_match, sample_match
import sabnzbd.cfg as cfg
from sabnzbd.nzbstuff import NzbObject
RE_SAMPLE = re.compile(sample_match, re.I)
# Do not rename .vob files as they are usually DVD's
@@ -147,7 +149,7 @@ def move_to_parent_folder(workdir):
class Sorter:
""" Generic Sorter class """
def __init__(self, nzo, cat):
def __init__(self, nzo: Optional[NzbObject], cat):
self.sorter = None
self.type = None
self.sort_file = False
@@ -231,7 +233,7 @@ class Sorter:
class SeriesSorter:
""" Methods for Series Sorting """
def __init__(self, nzo, job_name, path, cat):
def __init__(self, nzo: Optional[NzbObject], job_name, path, cat):
self.matched = False
self.original_job_name = job_name
@@ -532,7 +534,7 @@ def check_for_sequence(regex, files):
prefix = name[: match1.start()]
# Don't do anything if only one or no files matched
if len(list(matches.keys())) < 2:
if len(list(matches)) < 2:
return {}
key_prev = 0
@@ -540,7 +542,7 @@ def check_for_sequence(regex, files):
alphabet = "abcdefghijklmnopqrstuvwxyz"
# Check the dictionary to see if the keys are in a numeric or alphabetic sequence
for akey in sorted(matches.keys()):
for akey in sorted(matches):
if akey.isdigit():
key = int(akey)
elif akey in alphabet:
@@ -570,7 +572,7 @@ def check_for_sequence(regex, files):
class MovieSorter:
""" Methods for Generic Sorting """
def __init__(self, nzo, job_name, path, cat):
def __init__(self, nzo: Optional[NzbObject], job_name, path, cat):
self.matched = False
self.original_job_name = job_name
@@ -784,7 +786,7 @@ class MovieSorter:
class DateSorter:
""" Methods for Date Sorting """
def __init__(self, nzo, job_name, path, cat):
def __init__(self, nzo: Optional[NzbObject], job_name, path, cat):
self.matched = False
self.original_job_name = job_name
@@ -1001,7 +1003,7 @@ def path_subst(path, mapping):
return "".join(newpath)
def get_titles(nzo, match, name, titleing=False):
def get_titles(nzo: NzbObject, match, name, titleing=False):
"""The title will be the part before the match
Clean it up and title() it
@@ -1082,12 +1084,12 @@ def replace_word(word_input, one, two):
regex = re.compile(r"\W(%s)(\W|$)" % one, re.I)
matches = regex.findall(word_input)
if matches:
for unused in matches:
for _ in matches:
word_input = word_input.replace(one, two)
return word_input
def get_descriptions(nzo, match, name):
def get_descriptions(nzo: NzbObject, match, name):
"""If present, get a description from the nzb name.
A description has to be after the matched item, separated either
like ' - Description' or '_-_Description'
@@ -1165,7 +1167,7 @@ def strip_folders(path):
""" Strip all leading/trailing underscores also dots for Windows """
x = x.strip().strip("_")
if sabnzbd.WIN32:
# OSX and Linux should keep dots, because leading dots are significant
# macOS and Linux should keep dots, because leading dots are significant
# while Windows cannot handle trailing dots
x = x.strip(".")
x = x.strip()


@@ -27,21 +27,20 @@ import queue
import urllib.request
import urllib.error
import urllib.parse
from http.client import IncompleteRead
from http.client import IncompleteRead, HTTPResponse
from threading import Thread
import base64
from typing import Tuple, Optional
import sabnzbd
from sabnzbd.constants import DEF_TIMEOUT, FUTURE_Q_FOLDER, VALID_NZB_FILES, Status, VALID_ARCHIVES
import sabnzbd.misc as misc
import sabnzbd.filesystem
from sabnzbd.nzbqueue import NzbQueue
from sabnzbd.postproc import PostProcessor
import sabnzbd.cfg as cfg
import sabnzbd.emailer as emailer
import sabnzbd.notifier as notifier
from sabnzbd.encoding import ubtou, utob
from sabnzbd.nzbstuff import NzbObject
_RARTING_FIELDS = (
"x-rating-id",
@@ -61,18 +60,14 @@ _RARTING_FIELDS = (
class URLGrabber(Thread):
do = None # Link to instance of the thread
def __init__(self):
Thread.__init__(self)
self.queue = queue.Queue()
for tup in NzbQueue.do.get_urls():
url, nzo = tup
self.queue.put((url, nzo))
self.queue: queue.Queue[Tuple[Optional[str], Optional[NzbObject]]] = queue.Queue()
for url_nzo_tup in sabnzbd.NzbQueue.get_urls():
self.queue.put(url_nzo_tup)
self.shutdown = False
URLGrabber.do = self
def add(self, url, future_nzo, when=None):
def add(self, url: str, future_nzo: NzbObject, when: Optional[int] = None):
""" Add a URL to the URLGrabber queue; 'when' is seconds from now """
if future_nzo and when:
# Always increase counter
@@ -88,16 +83,16 @@ class URLGrabber(Thread):
self.queue.put((url, future_nzo))
def stop(self):
logging.info("URLGrabber shutting down")
self.shutdown = True
self.add(None, None)
self.queue.put((None, None))
def run(self):
logging.info("URLGrabber starting up")
self.shutdown = False
while not self.shutdown:
(url, future_nzo) = self.queue.get()
# Set NzbObject object to None so reference from this thread
# does not keep the object alive in the future (see #1628)
future_nzo = None
url, future_nzo = self.queue.get()
if not url:
# stop signal, go test self.shutdown
@@ -220,10 +215,10 @@ class URLGrabber(Thread):
# URL was redirected, maybe the redirect has better filename?
# Check if the original URL has extension
if (
url != fetch_request.url
url != fetch_request.geturl()
and sabnzbd.filesystem.get_ext(filename) not in VALID_NZB_FILES + VALID_ARCHIVES
):
filename = os.path.basename(urllib.parse.unquote(fetch_request.url))
filename = os.path.basename(urllib.parse.unquote(fetch_request.geturl()))
elif "&nzbname=" in filename:
# Sometimes the filename contains the full URL, duh!
filename = filename[filename.find("&nzbname=") + 9 :]
@@ -300,7 +295,7 @@ class URLGrabber(Thread):
logging.debug("URLGRABBER Traceback: ", exc_info=True)
@staticmethod
def fail_to_history(nzo, url, msg="", content=False):
def fail_to_history(nzo: NzbObject, url: str, msg="", content=False):
"""Create History entry for failed URL Fetch
msg: message to be logged
content: report in history that cause is a bad NZB file
@@ -329,11 +324,11 @@ class URLGrabber(Thread):
nzo.cat, _, nzo.script, _ = misc.cat_to_opts(nzo.cat, script=nzo.script)
# Add to history and run script if desired
NzbQueue.do.remove(nzo.nzo_id, add_to_history=False)
PostProcessor.do.process(nzo)
sabnzbd.NzbQueue.remove(nzo.nzo_id)
sabnzbd.PostProcessor.process(nzo)
def _build_request(url):
def _build_request(url: str) -> HTTPResponse:
# Detect basic auth
# Adapted from python-feedparser
user_passwd = None
@@ -357,12 +352,12 @@ def _build_request(url):
return urllib.request.urlopen(req)
def _analyse(fetch_request, future_nzo):
def _analyse(fetch_request: HTTPResponse, future_nzo: NzbObject):
"""Analyze response of indexer
returns fetch_request|None, error-message|None, retry, wait-seconds, data
"""
data = None
if not fetch_request or fetch_request.code != 200:
if not fetch_request or fetch_request.getcode() != 200:
if fetch_request:
msg = fetch_request.msg
else:


@@ -1,4 +1,4 @@
#!/usr/bin/env python
#!/usr/bin/python3
"""
Functions to check if the path filesystem uses FAT


@@ -1,4 +1,6 @@
#!/usr/bin/env python
#!/usr/bin/python3
""" Measure writing speed of the specified disk, or the working directory if not specified"""
import time
import os
@@ -8,39 +10,36 @@ _DUMP_DATA_SIZE = 10 * 1024 * 1024
_DUMP_DATA = os.urandom(_DUMP_DATA_SIZE)
def diskspeedmeasure(dirname):
"""Returns writing speed to dirname in MB/s
method: keep writing a file, until 1 second is passed.
def diskspeedmeasure(my_dirname: str) -> float:
"""Returns writing speed to my_dirname in MB/s
method: keep writing a file until a certain amount of time has passed.
Then divide bytes written by time passed
In case of problems (i.e. a non-writable dir or file), return None
In case of problems (i.e. a non-writable dir or file), return 0.0
"""
maxtime = 1.0 # sec
maxtime = 0.5 # sec
total_written = 0
filename = os.path.join(dirname, "outputTESTING.txt")
filename = os.path.join(my_dirname, "outputTESTING.txt")
try:
# Use low-level I/O
fp = os.open(filename, os.O_CREAT | os.O_WRONLY, 0o777)
fp_testfile = os.open(filename, os.O_CREAT | os.O_WRONLY, 0o777)
# Start looping
total_time = 0.0
while total_time < maxtime:
start = time.time()
os.write(fp, _DUMP_DATA)
os.fsync(fp)
os.write(fp_testfile, _DUMP_DATA)
os.fsync(fp_testfile)
total_time += time.time() - start
total_written += _DUMP_DATA_SIZE
# Have to use low-level close
os.close(fp_testfile)
# Remove the file
try:
# Have to use low-level close
os.close(fp)
os.remove(filename)
except:
pass
except:
# No successful measurement, so ... report None
return None
os.remove(filename)
except (PermissionError, NotADirectoryError, FileNotFoundError):
# Could not write, so ... report 0.0
return 0.0
return total_written / total_time / 1024 / 1024
@@ -50,19 +49,19 @@ if __name__ == "__main__":
print("Let's go")
if len(sys.argv) >= 2:
dirname = sys.argv[1]
if not os.path.isdir(dirname):
DIRNAME = sys.argv[1]
if not os.path.isdir(DIRNAME):
print("Specified argument is not a directory. Bailing out")
sys.exit(1)
else:
# no argument, so use current working directory
dirname = os.getcwd()
DIRNAME = os.getcwd()
print("Using current working directory")
try:
speed = diskspeedmeasure(dirname)
if speed:
print("Disk writing speed: %.2f Mbytes per second" % speed)
SPEED = max(diskspeedmeasure(DIRNAME), diskspeedmeasure(DIRNAME))
if SPEED:
print("Disk writing speed: %.2f Mbytes per second" % SPEED)
else:
print("No measurement possible. Check that directory is writable.")
except:
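The measurement approach in `diskspeedmeasure()` boils down to: write-and-fsync a fixed chunk in a loop until a deadline passes, then divide bytes written by the time spent. A self-contained sketch of that technique, using a temp directory (names are illustrative):

```python
import os
import time
import tempfile

_CHUNK = os.urandom(1024 * 1024)  # 1 MiB of incompressible data

def write_speed_mbps(dirname: str, maxtime: float = 0.5) -> float:
    """Write-and-fsync in a loop until `maxtime` elapses; return MB/s, 0.0 on failure."""
    path = os.path.join(dirname, "speedtest.tmp")
    written = 0
    elapsed = 0.0
    try:
        fd = os.open(path, os.O_CREAT | os.O_WRONLY, 0o600)
        while elapsed < maxtime:
            start = time.time()
            os.write(fd, _CHUNK)
            os.fsync(fd)  # force the data to disk so the OS cache doesn't flatter us
            elapsed += time.time() - start
            written += len(_CHUNK)
        os.close(fd)
        os.remove(path)
    except OSError:
        return 0.0
    return written / elapsed / 1024 / 1024

print("%.2f MB/s" % write_speed_mbps(tempfile.gettempdir()))
```

The `os.fsync()` per chunk is the essential part; without it the loop mostly measures page-cache speed.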


@@ -1,4 +1,4 @@
#!/usr/bin/env python3
#!/usr/bin/python3
"""
Module to measure and report Internet speed


@@ -42,9 +42,11 @@ The version in Turbogears is based on the original stand-alone Kronos.
This is open-source software, released under the MIT Software License:
http://www.opensource.org/licenses/mit-license.php
Adapted to work on Python 3 by the SABnzbd-Team.
"""
__version__ = "2.0"
__version__ = "2.1"
__all__ = [
"DayTaskRescheduler",
@@ -66,20 +68,15 @@ __all__ = [
"ThreadedTaskMixin",
"ThreadedWeekdayTask",
"WeekdayTask",
"add_interval_task",
"add_monthday_task",
"add_single_task",
"add_weekday_task",
"cancel",
"method",
]
import os
import sys
import sched
import time
import weakref
import logging
import threading
class method:
@@ -121,7 +118,9 @@ class Scheduler:
def _release_lock(self):
pass
def add_interval_task(self, action, taskname, initialdelay, interval, processmethod, args, kw):
def add_interval_task(
self, action, taskname, initialdelay, interval, processmethod=method.sequential, args=None, kw=None
):
"""Add a new Interval Task to the schedule.
A very short initialdelay or one of zero cannot be honored, you will
@@ -148,7 +147,7 @@ class Scheduler:
self.schedule_task(task, initialdelay)
return task
def add_single_task(self, action, taskname, initialdelay, processmethod, args, kw):
def add_single_task(self, action, taskname, initialdelay, processmethod=method.sequential, args=None, kw=None):
"""Add a new task to the scheduler that will only be executed once."""
if initialdelay < 0:
raise ValueError("Delay must be >0")
@@ -169,7 +168,9 @@ class Scheduler:
self.schedule_task(task, initialdelay)
return task
def add_daytime_task(self, action, taskname, weekdays, monthdays, timeonday, processmethod, args, kw):
def add_daytime_task(
self, action, taskname, weekdays, monthdays, timeonday, processmethod=method.sequential, args=None, kw=None
):
"""Add a new Day Task (Weekday or Monthday) to the schedule."""
if weekdays and monthdays:
raise ValueError("You can only specify weekdays or monthdays, not both")
@@ -250,35 +251,23 @@ class Scheduler:
"""Cancel given scheduled task."""
self.sched.cancel(task.event)
if sys.version_info >= (2, 6):
# code for sched module of python 2.6+
def _getqueuetoptime(self):
try:
return self.sched._queue[0].time
except IndexError:
return 0.0
def _getqueuetoptime(self):
try:
return self.sched._queue[0].time
except IndexError:
return 0.0
def _clearschedqueue(self):
self.sched._queue[:] = []
else:
# code for sched module of python 2.5 and older
def _getqueuetoptime(self):
try:
return self.sched.queue[0][0]
except IndexError:
return 0.0
def _clearschedqueue(self):
self.sched.queue[:] = []
def _clearschedqueue(self):
self.sched._queue[:] = []
def _run(self):
# Low-level run method to do the actual scheduling loop.
self.running = True
while self.running:
try:
self.sched.run()
except Exception as x:
logging.error("ERROR DURING SCHEDULER EXECUTION %s" % str(x), exc_info=True)
logging.error("Error during scheduler execution: %s" % str(x), exc_info=True)
# queue is empty; sleep a short while before checking again
if self.running:
time.sleep(5)
@@ -312,7 +301,7 @@ class Task:
def handle_exception(self, exc):
"""Handle any exception that occurred during task execution."""
logging.error("ERROR DURING SCHEDULER EXECUTION %s" % str(exc), exc_info=True)
logging.error("Error during scheduler execution: %s" % str(exc), exc_info=True)
class SingleTask(Task):
@@ -414,78 +403,75 @@ class MonthdayTask(DayTaskRescheduler, Task):
self.action(*self.args, **self.kw)
try:
import threading
class ThreadedScheduler(Scheduler):
"""A Scheduler that runs in its own thread."""
class ThreadedScheduler(Scheduler):
"""A Scheduler that runs in its own thread."""
def __init__(self):
Scheduler.__init__(self)
# we require a lock around the task queue
self._lock = threading.Lock()
def __init__(self):
Scheduler.__init__(self)
# we require a lock around the task queue
self._lock = threading.Lock()
def start(self):
"""Splice off a thread in which the scheduler will run."""
self.thread = threading.Thread(target=self._run)
self.thread.setDaemon(True)
self.thread.start()
def start(self):
"""Splice off a thread in which the scheduler will run."""
self.thread = threading.Thread(target=self._run)
self.thread.setDaemon(True)
self.thread.start()
def stop(self):
"""Stop the scheduler and wait for the thread to finish."""
Scheduler.stop(self)
try:
self.thread.join()
except AttributeError:
pass
def stop(self):
"""Stop the scheduler and wait for the thread to finish."""
Scheduler.stop(self)
try:
self.thread.join()
except AttributeError:
pass
def _acquire_lock(self):
"""Lock the thread's task queue."""
self._lock.acquire()
def _acquire_lock(self):
"""Lock the thread's task queue."""
self._lock.acquire()
def _release_lock(self):
"""Release the lock on the thread's task queue."""
self._lock.release()
class ThreadedTaskMixin:
"""A mixin class to make a Task execute in a separate thread."""
def __call__(self, schedulerref):
"""Execute the task action in its own thread."""
threading.Thread(target=self.threadedcall).start()
self.reschedule(schedulerref())
def threadedcall(self):
# This method is run within its own thread, so we have to
# do the execute() call and exception handling here.
try:
self.execute()
except Exception as x:
self.handle_exception(x)
class ThreadedIntervalTask(ThreadedTaskMixin, IntervalTask):
"""Interval Task that executes in its own thread."""
pass
class ThreadedSingleTask(ThreadedTaskMixin, SingleTask):
"""Single Task that executes in its own thread."""
pass
class ThreadedWeekdayTask(ThreadedTaskMixin, WeekdayTask):
"""Weekday Task that executes in its own thread."""
pass
class ThreadedMonthdayTask(ThreadedTaskMixin, MonthdayTask):
"""Monthday Task that executes in its own thread."""
pass
def _release_lock(self):
"""Release the lock on the thread's task queue."""
self._lock.release()
except ImportError:
# threading is not available
class ThreadedTaskMixin:
"""A mixin class to make a Task execute in a separate thread."""
def __call__(self, schedulerref):
"""Execute the task action in its own thread."""
threading.Thread(target=self.threadedcall).start()
self.reschedule(schedulerref())
def threadedcall(self):
# This method is run within its own thread, so we have to
# do the execute() call and exception handling here.
try:
self.execute()
except Exception as x:
self.handle_exception(x)
class ThreadedIntervalTask(ThreadedTaskMixin, IntervalTask):
"""Interval Task that executes in its own thread."""
pass
class ThreadedSingleTask(ThreadedTaskMixin, SingleTask):
"""Single Task that executes in its own thread."""
pass
class ThreadedWeekdayTask(ThreadedTaskMixin, WeekdayTask):
"""Weekday Task that executes in its own thread."""
pass
class ThreadedMonthdayTask(ThreadedTaskMixin, MonthdayTask):
"""Monthday Task that executes in its own thread."""
pass


@@ -17,7 +17,7 @@
"""
sabnzbd.utils.sleepless - Keep macOS (OSX) awake by setting power assertions
sabnzbd.utils.sleepless - Keep macOS awake by setting power assertions
"""

sabnzbd/utils/ssdp.py (new file, 158 lines)

@@ -0,0 +1,158 @@
#!/usr/bin/python3 -OO
# Copyright 2009-2020 The SABnzbd-Team <team@sabnzbd.org>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
sabnzbd.utils.ssdp - Support for SSDP / Simple Service Discovery Protocol plus XML to appear on Windows
Method:
1) this service sends a SSDP broadcast with a description.xml URL in it
2) Windows retrieves that description.xml from this service
3) Windows presents the info from the XML in Windows Explorer's Network view
Based on the following Specs:
SSDP:
https://tools.ietf.org/html/draft-cai-ssdp-v1-03
XML:
UPnP™ Device Architecture 1.1, paragraph 2.3 Device description
http://upnp.org/specs/arch/UPnP-arch-DeviceArchitecture-v1.1.pdf
"""
import logging
import time
import socket
import uuid
from threading import Thread
from typing import Optional


class SSDP(Thread):
    def __init__(self, host, server_name, url, description, manufacturer, manufacturer_url, model):
        self.__host = host  # Note: this is the LAN IP address!
        self.__server_name = server_name
        self.__url = url
        self.__description = description
        self.__manufacturer = manufacturer
        self.__manufacturer_url = manufacturer_url
        self.__model = model
        self.__myhostname = socket.gethostname()
        # A steady UUID: stays the same as long as hostname and IP address stay the same
        self.__uuid = uuid.uuid3(uuid.NAMESPACE_DNS, self.__myhostname + self.__host)

        # Create the SSDP broadcast message
        self.__mySSDPbroadcast = f"""NOTIFY * HTTP/1.1
HOST: 239.255.255.250:1900
CACHE-CONTROL: max-age=60
LOCATION: {self.__url}/description.xml
SERVER: {self.__server_name}
NT: upnp:rootdevice
USN: uuid:{self.__uuid}::upnp:rootdevice
NTS: ssdp:alive
OPT: "http://schemas.upnp.org/upnp/1/0/"; ns=01
"""
        self.__mySSDPbroadcast = self.__mySSDPbroadcast.replace("\n", "\r\n").encode("utf-8")

        # Create the XML info (description.xml)
        self.__myxml = f"""<?xml version="1.0" encoding="UTF-8" ?>
<root xmlns="urn:schemas-upnp-org:device-1-0">
    <specVersion>
        <major>1</major>
        <minor>0</minor>
    </specVersion>
    <URLBase>{self.__url}</URLBase>
    <device>
        <deviceType>urn:schemas-upnp-org:device:Basic:1</deviceType>
        <friendlyName>{self.__server_name} ({self.__myhostname})</friendlyName>
        <manufacturer>{self.__manufacturer}</manufacturer>
        <manufacturerURL>{self.__manufacturer_url}</manufacturerURL>
        <modelName>{self.__model}</modelName>
        <modelNumber> </modelNumber>
        <modelDescription>{self.__description}</modelDescription>
        <modelURL>{self.__manufacturer_url}</modelURL>
        <UDN>uuid:{self.__uuid}</UDN>
        <presentationURL>{self.__url}</presentationURL>
    </device>
</root>"""
        self.__stop = False
        super().__init__()

    def stop(self):
        logging.info("Stopping SSDP")
        self.__stop = True

    def run(self):
        logging.info("Serving SSDP on %s as %s", self.__host, self.__server_name)

        # The standard multicast settings for SSDP
        MCAST_GRP = "239.255.255.250"
        MCAST_PORT = 1900
        MULTICAST_TTL = 2

        while not self.__stop:
            # Create the socket, send the broadcast, and close the socket again
            try:
                with socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP) as sock:
                    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, MULTICAST_TTL)
                    sock.sendto(self.__mySSDPbroadcast, (MCAST_GRP, MCAST_PORT))
            except OSError:
                # Probably no network available
                pass
            time.sleep(5)

    def serve_xml(self):
        """Return an XML-structure based on the information being
        served by this service; returns nothing if not running"""
        if self.__stop:
            return
        return self.__myxml


# Reserve class variable, to be started later
__SSDP: Optional[SSDP] = None


# Wrapper functions to be called by the program
def start_ssdp(host, server_name, url, description, manufacturer, manufacturer_url, model):
    global __SSDP
    __SSDP = SSDP(host, server_name, url, description, manufacturer, manufacturer_url, model)
    __SSDP.start()


def stop_ssdp():
    if __SSDP and __SSDP.is_alive():
        __SSDP.stop()
        __SSDP.join()


def server_ssdp_xml():
    """Return the description.xml if the server is alive, an empty string otherwise"""
    if __SSDP and __SSDP.is_alive():
        return __SSDP.serve_xml()
    return ""
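The class above relies on two small properties worth calling out: `uuid.uuid3` is deterministic, so the same hostname and LAN IP always yield the same device UUID (Windows sees one stable device instead of a new one after every restart), and SSDP is HTTP-over-UDP, so header lines must be CRLF-terminated before sending. A minimal sketch with a hypothetical LAN IP and port:

```python
import socket
import uuid

# Steady UUID: the same hostname + host string always maps to the same UUID
host = "192.168.1.10"  # hypothetical LAN IP
myhostname = socket.gethostname()
ssdp_uuid = uuid.uuid3(uuid.NAMESPACE_DNS, myhostname + host)
assert ssdp_uuid == uuid.uuid3(uuid.NAMESPACE_DNS, myhostname + host)

# SSDP is HTTP-over-UDP: convert the convenient "\n" line endings of the
# triple-quoted literal to the "\r\n" the protocol requires, then encode
url = f"http://{host}:8080"  # hypothetical port
broadcast = f"""NOTIFY * HTTP/1.1
HOST: 239.255.255.250:1900
LOCATION: {url}/description.xml
NTS: ssdp:alive
""".replace("\n", "\r\n").encode("utf-8")
assert broadcast.startswith(b"NOTIFY * HTTP/1.1\r\n")
```

Sending this datagram to the well-known multicast group `239.255.255.250:1900` every few seconds is all that is needed for discovery; the heavier lifting happens when the client fetches `description.xml`.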


@@ -1,4 +1,4 @@
-#!/usr/bin/env python
+#!/usr/bin/python3
# based on SysTrayIcon.py by Simon Brunning - simon@brunningonline.net
# http://www.brunningonline.net/simon/blog/archives/001835.html
# http://www.brunningonline.net/simon/blog/archives/SysTrayIcon.py.html


@@ -1,8 +1,9 @@
 # This file will be patched by setup.py
 # The __version__ should be set to the branch name
+# (e.g. "develop" or "1.2.x")
 # Leave __baseline__ set to unknown to enable setting commit-hash
 # You MUST use double quotes (so " and not ')
-__version__ = "3.1.0-develop"
+__version__ = "3.2.0-develop"
__baseline__ = "unknown"


@@ -21,6 +21,7 @@ sabnzbd.zconfig - bonjour/zeroconfig support
 import os
 import logging
+import socket

 _HOST_PORT = (None, None)
@@ -34,21 +35,11 @@ except:
 import sabnzbd
 import sabnzbd.cfg as cfg
-from sabnzbd.misc import match_str
+from sabnzbd.constants import LOCALHOSTS

 _BONJOUR_OBJECT = None

-def hostname():
-    """ Return host's pretty name """
-    if sabnzbd.WIN32:
-        return os.environ.get("computername", "unknown")
-    try:
-        return os.uname()[1]
-    except:
-        return "unknown"

 def _zeroconf_callback(sdRef, flags, errorCode, name, regtype, domain):
     logging.debug(
         "Full Bonjour-callback sdRef=%s, flags=%s, errorCode=%s, name=%s, regtype=%s, domain=%s",
@@ -68,7 +59,7 @@ def set_bonjour(host=None, port=None):
     global _HOST_PORT, _BONJOUR_OBJECT

     if not _HAVE_BONJOUR or not cfg.enable_bonjour():
-        logging.info("No Bonjour/ZeroConfig support installed")
+        logging.info("No bonjour/zeroconf support installed")
         return
if host is None and port is None:
@@ -80,13 +71,13 @@ def set_bonjour(host=None, port=None):
     zhost = None
     domain = None

-    if match_str(host, ("localhost", "127.0.", "::1")):
-        logging.info('Bonjour/ZeroConfig does not support "localhost"')
+    if host in LOCALHOSTS:
+        logging.info("bonjour/zeroconf cannot be one of %s", LOCALHOSTS)
         # All implementations fail to implement "localhost" properly
         # A false address is published even when scope==kDNSServiceInterfaceIndexLocalOnly
         return

-    name = hostname()
+    name = socket.gethostname()
     logging.debug('Try to publish in Bonjour as "%s" (%s:%s)', name, host, port)
     try:
         refObject = pybonjour.DNSServiceRegister(
Some files were not shown because too many files have changed in this diff.