Compare commits

...

86 Commits
2.3.4 ... 2.3.6

Author SHA1 Message Date
Safihre
ee7e209a8b Set version to 2.3.6 2018-12-20 22:52:52 +01:00
Safihre
9bc1601939 Merge branch 'develop' 2018-12-20 22:51:54 +01:00
Safihre
190ec0a472 Update text files for 2.3.6 2018-12-20 22:33:13 +01:00
Safihre
468f01d839 Make clear that require_modern_tls is TLS 1.2 and above
@sanderjo
2018-12-19 11:17:25 +01:00
Erik Berkun-Drevnig
5bbbf602f9 Fix missing unrar, fix Python encoding 2018-12-19 08:58:29 +01:00
Erik Berkun-Drevnig
b03d68b434 Fix missing unrar, fix Python encoding 2018-12-19 08:57:59 +01:00
Safihre
0298beac15 Update text files for 2.3.6 RC 1 2018-12-17 13:56:34 +01:00
SABnzbd Automation
be6f047e31 Automatic translation update 2018-12-16 18:01:50 +00:00
Safihre
e8206371e4 Update Wizard example URL 2018-12-14 08:41:00 +01:00
SABnzbd Automation
6609248fce Automatic translation update 2018-12-06 08:32:33 +00:00
Safihre
0ae5c7f8aa Code improvements 2018-11-28 09:23:11 +01:00
Sander Jo
02b6f63156 Option require_modern_tls if you want TLS 1.2 or higher for NNTPS 2018-11-23 15:28:03 +01:00
Safihre
2b665667af Unavailable feeds would crash reading 2018-11-20 09:42:13 +01:00
jcfp
ad7fc240c7 help output: add -1 param for logging, 7z as supported file ext (#1192)
* help output: add -1 param for logging, 7z as supported file ext

* stop the universe from expanding
2018-11-18 09:02:37 +01:00
jcfp
aef878a0f2 update snapcraft url
the old link only displays a 'your session has expired' message...
2018-11-01 13:08:47 +01:00
Erik Berkun-Drevnig
2a967b62d9 Add Travis and AppVeyor badges (#1186)
* Add Travis and AppVeyor badges

* Add snap, issue resolution and license badges

* Switch master to develop branch

* Update README.md
2018-10-31 09:48:57 +01:00
Erik Berkun-Drevnig
19946684d5 Add initial snap support (#1183)
* Add initial snap support
* Apply review feedback
* Fix armhf and arm64 builds
* Use PPA and build lang files
* Add openssl for x86
* Remove unnecessary stage-packages
* Improve arch grammar
* Add back dev packages for building extensions
* Add back missing SSL
* Add icon
* Update snapcraft.yaml
2018-10-31 07:53:12 +01:00
Safihre
4c5ca149ba Update MultiPar to v1.3.0.2 2018-10-30 09:09:05 +01:00
jcfp
54d6e0dc21 fix extension filters in linux tray
correct typo (nbz) in filter name and bring the extension filters in line with the supported types (cf. sabnzbd/constants.py:99-100)
2018-10-29 22:17:23 +01:00
Erik Berkun-Drevnig
ecb1403776 Add initial snap support (#1183)
* Add initial snap support
* Apply review feedback
* Fix armhf and arm64 builds
* Use PPA and build lang files
* Add openssl for x86
* Remove unnecessary stage-packages
* Improve arch grammar
* Add back dev packages for building extensions
* Add back missing SSL
* Add icon
* Update snapcraft.yaml
2018-10-29 09:50:28 +01:00
Safihre
7e5c6d1c04 Update text files for 2.3.6 Beta 1 2018-10-26 09:27:59 +02:00
Safihre
96b140dee0 Update included Python license 2018-10-26 09:27:42 +02:00
Safihre
2e098b641f Update UnRar to 5.61 for macOS and Windows 2018-10-26 09:15:43 +02:00
Sander Jo
6678cb9d56 Remove \x00 from par2 creator info: uniformed method 2018-10-25 07:58:06 +02:00
Sander Jo
4b67405d16 Remove \x00 from par2 creator info 2018-10-19 08:45:58 +02:00
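The two commits above strip null bytes from the par2 creator string before it is used. A minimal sketch of that cleanup (the creator value here is hypothetical, not taken from an actual par2 file):

```python
# par2 packets pad the creator string with trailing \x00 bytes;
# remove them before logging or displaying the value.
creator = "ExamplePar2Tool 1.0\x00\x00\x00"  # hypothetical padded value
clean_creator = creator.replace("\x00", "")
assert clean_creator == "ExamplePar2Tool 1.0"
```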
Safihre
7463a4abdc Existing RSS-feeds don't have the infourl yet
Rookie mistake.
2018-10-14 09:39:38 +02:00
SABnzbd Automation
163523048b Automatic translation update 2018-10-09 13:55:06 +00:00
Safihre
4892bc18f3 Prevent endless loop when disk-space is exceeded
It was a nice idea to keep retrying to save the job, but it also breaks the post-processing actions, as the job gets removed before it is ready to be removed.
Closes #1095
2018-10-09 15:14:49 +02:00
Safihre
217b2436f2 Detect RSS-feed login redirect and show specific error
Previously, the general "No entries" warning would overwrite the more specific errors detected above it.
2018-10-09 14:26:49 +02:00
Safihre
a9247ba934 No URL-encoding of RSS-URLs with commas
Some indexers don't like that!
2018-10-09 13:55:58 +02:00
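The commit above avoids percent-encoding commas in RSS feed URLs. A minimal sketch of the difference with `urllib.parse.quote` (the feed URL is a hypothetical example, and `safe` strings are illustrative, not the project's actual call):

```python
from urllib.parse import quote

# Hypothetical RSS URL with comma-separated category IDs
feed_url = "https://indexer.example/rss?cats=5030,5040"

# Quoting with a safe-set that excludes ',' escapes the commas,
# which some indexers reject:
assert quote(feed_url, safe=":/?=&") == (
    "https://indexer.example/rss?cats=5030%2C5040")

# Adding ',' to the safe characters leaves the URL untouched:
assert quote(feed_url, safe=":/?=&,") == feed_url
```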
Safihre
8b2a6ef825 Link to details page of RSS-feed item if provided 2018-10-09 10:04:21 +02:00
Safihre
320495671b Add RSS-source icon to all tabs 2018-10-08 13:22:34 +02:00
Safihre
5ab872afa0 Assume correct SSL if test-host disabled
Closes #1179
2018-10-03 08:04:43 +02:00
Safihre
7ecb31805e Add API-capability to modify RSS filters
Closes #1154
2018-09-16 14:51:11 +02:00
Safihre
e8ebeb843c Retry All would not retry URLs
Closes #1164
2018-09-16 13:32:13 +02:00
Safihre
3840678913 Rename thread-database getter
Each thread needs its own DB-connection (see the Python docs). So for each CherryPy thread we need to store that thread's connection. The name of the function made it sound like we create a whole new connection, which isn't the case.
2018-09-16 13:31:43 +02:00
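The per-thread connection requirement described above can be sketched with `threading.local`: sqlite3 connections may only be used by the thread that created them, so each serving thread lazily creates and caches its own. The function name and `db_path` below are hypothetical, not SABnzbd's actual code:

```python
import sqlite3
import threading

# One connection per thread, created on first use and reused afterwards.
_local = threading.local()

def get_history_handle(db_path=":memory:"):
    """Return this thread's DB connection, creating it on first access."""
    if not hasattr(_local, "connection"):
        _local.connection = sqlite3.connect(db_path)
    return _local.connection
```

Calling the getter twice on the same thread returns the same connection object; a different thread gets its own.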
Safihre
da7082b17e Sending Retry for already completed jobs would give traceback
Due to missing file/folder.
See: https://forums.sabnzbd.org/viewtopic.php?f=2&t=23587
2018-09-16 10:11:32 +02:00
Safihre
6198f95e1e Do not log null-bytes in par2-creator
#1153
2018-09-11 16:21:06 +02:00
Safihre
4075b1accb Set version to 2.3.5 2018-09-07 10:44:18 +02:00
Safihre
6d8a774443 Merge branch 'develop' 2018-09-07 10:41:57 +02:00
Safihre
76c7a6ce95 Update text files for 2.3.5 2018-09-07 10:11:20 +02:00
Safihre
01bd0bdce0 Small cleanup of generated POT file 2018-09-07 10:11:00 +02:00
Safihre
fa908de6e9 Test a unicode download
Broken-unicode job is still work-in-progress
2018-09-05 15:08:50 +02:00
Safihre
f05a6c6f76 Add separate servers for each test-OS 2018-09-05 15:05:47 +02:00
Safihre
d86fb42d28 Code-style changes 2018-09-05 14:51:16 +02:00
Safihre
7a7ce47769 Remove redundant parentheses 2018-09-05 11:26:38 +02:00
Safihre
40e57845a7 Correctly sort additional blocks to be added
Oops: .pop() gets the last element, not the first.
2018-09-03 13:50:38 +02:00
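The `.pop()` mix-up described above is easy to reproduce; a minimal sketch (the block counts are hypothetical values, not real par2 data):

```python
# Extra par2 block counts, sorted ascending (hypothetical values)
blocks = [10, 20, 40]

# .pop() without an index removes the LAST element:
assert blocks.pop() == 40

# To take the smallest block count first, pop from the front instead:
assert blocks.pop(0) == 10
assert blocks == [20]
```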
Safihre
884dedc9d1 Failed file join would not result in failed job 2018-09-03 13:14:29 +02:00
Safihre
8e1f4e14a2 MultiPar repair of joinable files doesn't join them 2018-09-03 13:06:07 +02:00
Safihre
1190742127 Correctly report CRC errors in (7)zip archives
They were reported as password errors.
2018-09-03 09:43:59 +02:00
Safihre
e6d481a2ba Perform functional tests also on macOS 2018-08-28 07:47:33 +02:00
Safihre
0958caf5ed If no newsserver-info, skip tests 2018-08-28 07:47:33 +02:00
Safihre
e2761d967e Functional testing using Selenium 2018-08-28 07:47:33 +02:00
Safihre
b4f36be170 Update text files for 2.3.5RC2 2018-08-23 20:54:12 +02:00
Safihre
5e722b27f3 Fix more errors and warnings found by code validation 2018-08-23 14:31:15 +02:00
Safihre
367a73ef29 Fix errors found by code validation tool 2018-08-22 09:52:58 +02:00
Safihre
9228bc28ff Typos in stylesheets 2018-08-22 07:58:24 +02:00
Sander Jo
02e18be5e1 Better pystone calculation 2018-08-22 07:58:08 +02:00
Safihre
531ef59e0a Small Config > General styling fix 2018-08-21 15:39:31 +02:00
Safihre
54e03fb40a Update text files for 2.3.5RC1 2018-08-08 20:35:01 +02:00
Safihre
904bb9f85a Remove unused imports 2018-08-07 17:10:13 +02:00
Rik Smith
b011e1a518 Direct Unpack would abort if single-file unpack was too slow (#1165) 2018-08-06 08:40:59 +02:00
Rik Smith
f83f71a950 Fix Deobfuscate.py script (#1166)
* Fix deobfuscate script

* Rename for code check

* Rename for code check
2018-08-06 08:35:01 +02:00
Safihre
4dbf5266ef Par2 files with same number of "+x" blocks were not counted separately
Strange that I did it that way before. It does create the possibility that a huge NZB with many sets of similarly named files takes forever to repair, but for normal NZBs with lots of "volXX+40" files we would otherwise ignore all those extra blocks.
2018-08-04 10:38:02 +02:00
Safihre
05aac4e01e Remove redundant integer conversion 2018-08-04 10:36:03 +02:00
Safihre
267c48f9a7 Update gitignore for PyCharm 2018-08-04 10:00:46 +02:00
Safihre
5168915a65 Manual par2 parsing in Deobfuscate.py for major performance increase
By @P1nGu1n, thanks!
2018-08-03 15:13:20 +02:00
Safihre
71017d0d55 Diskspeed test did not remove test file 2018-08-03 15:06:19 +02:00
Safihre
a5db51a2c5 Windows-tray also show queue size left info when paused 2018-08-01 21:25:17 +02:00
Safihre
0bf2968e6a Windows installer language wasn't parsed for the wizard
It needs a translation table, and it was fetched from the wrong registry location.
2018-08-01 11:42:23 +02:00
Safihre
2ec5918f5e Extend disk-speed time to 1 second for more stable results 2018-08-01 07:57:24 +02:00
Safihre
04f5a63cd7 Backported new-style speed-test from Py3 branch
Old-style would create bad file on Windows. This is more robust. Thanks @albino1 for report!
2018-07-31 22:25:14 +02:00
Safihre
43d8283f5b Wizard final page not linking to Folders config page
Closes #1163
2018-07-31 22:07:01 +02:00
SABnzbd Automation
f8111121c4 Automatic translation update 2018-07-14 20:49:01 +00:00
Safihre
b53b73c135 Only abort active DirectUnpackers
Only when we have assigned a setname are we actually doing something; otherwise we might just as well leave the files and not delete too much. For example see: https://forums.sabnzbd.org/viewtopic.php?f=2&t=23440
2018-07-14 22:24:06 +02:00
Safihre
bd7b8a975b Wrap is_rarfile so it can't crash when files are gone
Seen in users' log files. Possible when deleting a job during the assembly steps.
2018-07-14 21:20:11 +02:00
Safihre
7ca765f276 Lock start and stop of DirectUnpack so they can't overlap
In high-speed situations this could happen.
2018-07-14 21:18:57 +02:00
Safihre
b918a53af5 Add env-variables to pre-queue call 2018-07-14 17:28:37 +02:00
Safihre
525809afc9 Always add basic env-variables to external calls 2018-07-14 17:28:14 +02:00
Safihre
a7048cdc8e Use server hostname in logs and warnings
Closes #1159
2018-07-14 12:14:48 +02:00
Safihre
02888568bd Log par2 creator
Closes #1153
2018-07-02 10:29:07 +02:00
Safihre
203409f02f Update MultiPar to 1.3.0.1 2018-06-29 10:06:15 +02:00
Safihre
ecc8e6ac0e Update UnRar to 5.60 for Windows and macOS
Closes #1156
2018-06-29 10:01:40 +02:00
Safihre
852636acda Add VPN problems to known-issues 2018-06-12 11:14:07 +02:00
Safihre
bc18369552 Spelling fix in Deobfuscate.py
Closes #1152
2018-06-08 10:45:17 +02:00
Safihre
8f248a2219 Could not set single Indexer Tag for a category
"TV > HD" would become ['TV', '>', 'HD'].
See https://forums.sabnzbd.org/viewtopic.php?f=3&t=23409
2018-05-30 07:53:51 +02:00
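The category bug above is the classic whitespace-split mistake; a minimal sketch (the "fix" shown is an illustrative assumption, not SABnzbd's actual code):

```python
indexer_tag = "TV > HD"

# A naive whitespace split turns the single tag into three tokens,
# exactly as reported in the commit:
assert indexer_tag.split() == ["TV", ">", "HD"]

# Treating the whole value as one tag (hypothetical fix sketch)
# keeps it intact:
tags = [indexer_tag.strip()]
assert tags == ["TV > HD"]
```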
95 changed files with 3917 additions and 4500 deletions

.gitignore vendored

@@ -1,4 +1,4 @@
#Compiled python
# Compiled python
*.py[cod]
# Working folders for Win build
@@ -7,6 +7,13 @@ dist/
locale/
srcdist/
# Snapcraft
parts/
prime/
stage/
snap/.snapcraft/
*.snap
# Generated email templates
email/*.tmpl
@@ -16,12 +23,14 @@ SABnzbd*.exe
SABnzbd*.gz
SABnzbd*.dmg
# WingIDE project files
# WingIDE/PyCharm project files
*.wp[ru]
.idea
# Testing folders
.cache
.xprocess
tests/cache
# General junk
*.keep


@@ -1,15 +1,38 @@
language: python
python:
- "2.7"
before_install:
- sudo add-apt-repository ppa:jcfp -y
- sudo apt-get update -q
- sudo apt-get install sabnzbdplus -y
# Include the host/username/password for the test-servers
matrix:
include:
- os: linux
language: python
env:
- secure: X5MY2HAtCxBI84IySY/XroFsFy2RIVhfsX+P1y3WXfvwBHYKCgrPV6BgwCg93ttkPmMS/IslP5Vp4F1OGqC9AZdxtxfHKpIPlIVxIHj6Lf6xwynmbGDQXjy73K13gjznK2mkGA0jcsp4Q5POS4ZKVkd6aOXnc8l8xS08+ztAvfxCC3tsMj2oMLEPP92j6zqb/1x2aX5+gVyVzrKgQQVKIk6R6jTxhIiOMPzj4+VMLXK8NEZqjV6RPwUjSoKHqJiV5URyf6/+2Ojmem3ilnpktn7xIJ/ZO1UNnZCrElOGZtmbryZFMctJvEAIQCOSdzsq/MACk0gocnOL3JQfDF5sYPOjJmc6sZI9PL78oFhwKaLkWEx565u8kdkLTMvv4A02HAmPzV1rKE1CTlEhsy0djU8mueCr9Ep1WyLJdY/igbyhR+dOd8fVo9Y1tY2o+ZisCsO5+PRfzhypK9xukqmWDJSXIWSuExUU/becXJ4IaTmlYJ+ArhKvkL90GmckH/zt9ZPIgr9Lq0OFva9uVHX+sbbsQZZ48lAmgiiiX335dONj8MxO8cDKsUT9FWQ8PzeJ8g8PErv5pmVVVODoOdKZg2Oo4jUsZG2yV8uUt9j87I2DPou4WiJ7wcTzQCPdzlaA5hdixPMyVUF/yCL+eKdJQFaKy3eaKwCxnAVp3WA2WdA=
- secure: gzvbgg+rdM/TfPOlXGE/JOW8VEfIJxwUnzt2bH07nj/PbREXQXrImC1SreMtwQX+gvd7bjIBZU/sgNSgRYcWKaSim4uNtAbinkcKQeP+V844ZY2Fabq/xpLluSP61vJLQ+hOLbnvilxmuI0E1Divahxo5qZJlwzCdk5u2txJRT/2LKGDT1lpGgIyDj9u0/ViAcpaAyfFR2Zd6ydHKbx+eFBE21LkyH/+GJmRiO0+qLIuCa2knmOJYjwBxRcPiAEDpbrRUbYDiNyzPqEVxJfCbsGYlo/QN/SnV6kTqM1WoFzvi4d1pDxDGRFLQj+KigihF6uY4eLC1e6yVQrDy0tyWKt6E+1tc8fH5dRS7AYtWMzURn/7Ebd72AiNIYaeAL8ZPqI7vw3ovFTqBS0h8Mg2uUZ503ytUvfwKyU9MgIkzXwmGuE37MCd0bRJ/blPS2DT+IMbrbEP90K5VrDrN/AGiYHR1TZ9GKUZd6xHibulEh2nNFMMQEga8nE2CWaJ3uJrCN7ud+4OJ0zCZFF7JiJTbOGApHg/aGWD/bYfg9sIh7up4PcxVs6RFxbf+M1aB8GO2A9aEZFow+djYVxiqf6esmzrnlsTfz16f8Txmez3BRftjVULre03a3Rt7WRxwYLveNlJos1nMw3G0CnruCe+wJbHEK4tEiqIXqB8UemT4zw=
- secure: f5QGjZESH4khjLOO1Ntgtiol4ZvwcqHLIV1sdK162dVkNT6UKOTRQflj2UmRXzwiRzWtVX/Ri0zT0j+SUJy2+aqJY/gxvisdTIWzRQ3w/CJPGgCizSkTQEWJ2V/n7DUAJ4xerme36zYi21S3d8VEWVDzU/duLu3yhlN5x0wMCY+dDPSDTFubmptGeCmyxqBqGVd7gD3PaiK7fDBB/eAXbW3QxLLQfxLHmPsx8vzPhDTQiLFtY43jfnVGEBdUbxSMXbq2NRB5eXH3bBkW8u/5y9uoyuF45CQn8f3UB6F84L+/n9M2ryCGeSJOFuZqSUHXvRF2acON40jx3t4PVocEzYguPwewoiFxfFHjRWmiI4WljiN30taK0pgstmzLTedozK+NdZ0M8vD7MCyK0yegPQolzFRngWW5Y8NY1XwlBT9W2lqGmrFge+dB86wOArMcRlY62PTOJ9Zqspbe/6mBT4Tq4O2OsXxGX/x60W/NJynva9WAz2SLEi5Pjs6r1a3tyXssw4/8KVhWl92WfpOnWrZrnZlsxOTmcS2OhLB0FQikTv9T/i3CZNcCI4PELkExeIwh4JW1MY0iGeLDHcAUKryJGrRZj1x32Nt1uUPTPBi8l8EzNyNOUvbHYTdpBr5r2JW1orvT55OhvKauc3wB7ogj673iYsqx5jeazHhgJMs=
- os: osx
env:
- HOMEBREW_NO_AUTO_UPDATE=1
- secure: RI/WxBI5kTP5v0qZ6im5BwqPurzzwKIl8GSbM2dFSEiWYEbKwHTDJ3KDDsXqS8YMNaropNVgzsdpCGXYUWRNTraKVa9GZEGNJ+fQuBWK9wkJ0MDTYfL/QFSN1CDXXlg7k26VXu6PgvEFA5kyDfSpxrxXJC6yXOUeJqmebkU2fiQo7/5Vpb1WAwpYlBP6zL5lYt2lpJ85fhYEjuAeuP/9zdVIlgCB7rDCgUX7tCKKXgwbKXfcff7lOCneB00/RCmRuNp3/tohGlgrSXh4ivHx4XEQgRoiVdeR3RCKZa5tBIXANefuJ2VopBrAbSRmVBexQP1818IU/XLlwtEEpC1vulpkx+5JolWksCrx4uJkKdlH0KA4k1m88L0Q1Rmmnp9LgRgeEl5xqt5s6RR6lS63ChQYkVFgWandwlrWu7Uenne4401KbG58PzDXEGlsKhUXnYBX+SU6gwejImCMb3vszKRAge5QAQlkiruCu31W9tWpY9ezHYrbv9ckOqdFXf9qsPEnU352v/8qHFe7jT/+7RSYdUzuo/d2aQqPKfkb7sy1VLEznmbGmv1BH4rGNpxd5inlcFKsR099Hx7PWgY8MHZcnEP3PJ2kBseFzVP3WKXHDTcv8yR0w6EgQyMzSHl9Ah3WJJ7TXZQ82gcqF8LcmuKcqXcwTkffG3ww7Vzuq4M=
- secure: uXHKYgQAwnfhWKi7RKAEumMMZZTJBb878KpodRfs1fz0NffdPo5+Ak1ricNzOJ8wti8/lXycDS+YmnFs64lGUxL+zvbQlFv7QuKfN0uHfPlo6zux9Ha9pg1rSUI4zqZ9kmbtwc0I2mdy1VeWwHvnbQDXUIt6a+tTwYZL3MGdP6kNvtSXaYhbEoHExjqeHUtVhUTafvWGtwE7uN+sdvhwXQ0dWlz6HGub8qYjkKCmF9VG+OyLKjFHjLVDMQ7Jnng2l1ZOgHSh5g5m6r++NEwSzZ8wFVULdzv5eEcR9U+mHmonFKOA/ICcZGd8MhEuvz9BupfgDWFqSTb5JGxzlZ28YdtjcAudzrWQMSpP2R0ks2Ttxz9Kpgw1L75HMvj0smazHs7IEEiXf2Yr03bzeHg7CGXNqOYyEOxxrPaJekCjMlX/YGqT/iv/8pZPfew7k/iVJlvCam76WNXABjJncHJeMsCgkItYZAoRZJDc+7z8J4g4ys1Rk0V/difjjwc/pSeKbt6wDA/9cmZ7r4Cs1Yh9Pl/mw6kzWGGpejO7lmsayQN3Pw99QMcZByUHx5BR+ZtIfF7Sl+F0uDQJ0MntJcteF7z1Dam2jHlkLckb85j6YWup5ItLAj5Hz7V2YUwqFmQhfOWEAjxagNSNnB8we4YBWS4KDTBEVDm6ITTfddlYvCw=
- secure: HKaT52NUQh18kllFQTjpKC64KlDkWEz0XnIEKJffumctrJjCvoFZFNC7ip3j7Bi3yp2IeD2SMsdxrrT6YFKxx5FfSdPqpQnsY34bzdEFZQomNJg4n/tmBc350PoVQ0PvLQiVoCCfVbdS/b4makNK7A+d9KED+SEsQMAqKp2mSnGhATB9MwFaZL5S4nGnEkqW5+eeAQxJ8JRawwumOOx/xhPOoEMIfHMpyTwFI1yUh1nJhZ9k1nxHzPlM78goyIuf0MjeZfSZ2fIlNZGVruYM28i9hpO4bzPFhk51uryWv8DQZiZlpCkHl6Po7rVVf5pNqm+l9SD/t0DnhS2rJHdeFSI2lM/uZtdOxaY5fTTj83LbCGhFtuZnZRwoQ73tpda8J7Z1E5Ni9bi7vOiZQ4pEIPt4LLu0X607sPWMkqrmgalKQQS13b5oliyMpkIguvmj9822BpaNVqamIrfn0z38+0Gog8iuGlMAQnRO9tGDO4kbVLcZQTRWpSwIC3niTPjPgLq/N92XQ9xmccrFT7efwemgF65FNM5ltv8+9AmI+hsuyXfqeHaAV9wmxRAAhaqvRgnSLYa3u1CPn5fF2CDvPvPcyCEIWnyxc7dYHDpzAQDcyuSejtbnL8gpkDqEHpy23hTjgZnZD7Pk7PQ7ayA8zBumTMGZ+/GAn5Wmgce+w0M=
addons:
chrome: stable
before_script:
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then
brew cask install chromedriver;
else
sudo add-apt-repository ppa:jcfp -y;
sudo apt-get update -q;
sudo apt-get install unrar p7zip-full par2 chromium-chromedriver -y;
ln -s /usr/lib/chromium-browser/chromedriver ~/bin/chromedriver;
fi;
install:
- pip install --upgrade -r tests/requirements.txt
script:
- pytest
- python ./tests/test_functional.py
notifications:
email:
on_success: never
on_failure: always
on_failure: always


@@ -1,5 +1,5 @@
*******************************************
*** This is SABnzbd 2.3.4 ***
*** This is SABnzbd 2.3.6 ***
*******************************************
SABnzbd is an open-source cross-platform binary newsreader.
It simplifies the process of downloading from Usenet dramatically,


@@ -1,4 +1,4 @@
SABnzbd 2.3.4
SABnzbd 2.3.6
-------------------------------------------------------------------------------
0) LICENSE


@@ -66,3 +66,7 @@
Config->Special->wait_for_dfolder to 1.
SABnzbd will appear to hang until the drive is mounted.
- If you experience speed-drops to KB/s when using a VPN, try setting the number of connections
to your servers to a total of 7. There is a CPU-usage reduction feature in SABnzbd that
gets confused by the way some VPN's handle the state of a connection. Below 8 connections
this feature is not active.


@@ -1,7 +1,7 @@
Metadata-Version: 1.0
Name: SABnzbd
Version: 2.3.4
Summary: SABnzbd-2.3.4
Version: 2.3.6
Summary: SABnzbd-2.3.6
Home-page: https://sabnzbd.org
Author: The SABnzbd Team
Author-email: team@sabnzbd.org


@@ -1,6 +1,12 @@
SABnzbd - The automated Usenet download tool
============================================
[![Average time to resolve an issue](https://isitmaintained.com/badge/resolution/sabnzbd/sabnzbd.svg)](https://isitmaintained.com/project/sabnzbd/sabnzbd "Average time to resolve an issue")
[![Travis CI](https://travis-ci.org/sabnzbd/sabnzbd.svg?branch=develop)](https://travis-ci.org/sabnzbd/sabnzbd)
[![AppVeryor](https://ci.appveyor.com/api/projects/status/github/sabnzbd/sabnzbd?svg=true&branch=develop)](https://ci.appveyor.com/project/Safihre/sabnzbd)
[![Snap Status](https://build.snapcraft.io/badge/sabnzbd/sabnzbd.svg)](https://snapcraft.io/sabnzbd)
[![License](https://img.shields.io/badge/license-GPL%20v2-blue.svg)](https://www.gnu.org/licenses/old-licenses/gpl-2.0.en.html)
SABnzbd is an Open Source Binary Newsreader written in Python.
It's totally free, incredibly easy to use, and works practically everywhere.
@@ -21,7 +27,6 @@ Optional:
- `python-cryptography` (enables certificate generation and detection of encrypted RAR-files during download)
- `python-dbus` (enable option to Shutdown/Restart/Standby PC on queue finish)
- `7zip`
- `unzip`
Your package manager should supply these. If not, we've got links in our more in-depth [installation guide](https://github.com/sabnzbd/sabnzbd/blob/master/INSTALL.txt).


@@ -1,18 +1,24 @@
Release Notes - SABnzbd 2.3.4
Release Notes - SABnzbd 2.3.6
=========================================================
## Changes since 2.3.3
- Device hostname in hostname-verification always lowercased
- Hostnames ending in ".local" are always accepted
- URLGrabber would not always detect correct filename
- URLGrabber would ignore some successful downloads
- Always send NNTP QUIT after server-test
- Added option "--disable-file-log" to disable file-based logging
- Added CORS-header to API
- Windows: Service compatibility with Windows 10 April update
- Windows: Update Python to 2.7.15
- Windows: Update 7zip to 18.05
- macOS: Restore compatibility with El Capitan (10.11)
## Improvements and bug fixes since 2.3.5
- New option require_modern_tls forces TLSv1.2+ for SSL-connections
- RSS source icon on all tabs of feed overview
- RSS source icon now links to feed details page (if available)
- RSS feed URL's with commas would be wrongly escaped
- Common RSS login problems will show more appropriate error
- Added API-call to modify RSS-filters
- Exceeding disk space could result in endless retry-loop
- History Retry All would not retry failed NZB URL-fetches
- API-call to retry a job could result in unexpected error
- Assume correct SSL/certificate setup if test-host was disabled
- The par2-file creator was logged incorrectly
- Linux: Correct supported file extensions of tray icon
- Windows: Update MultiPar to 1.3.0.2
- Windows and macOS: Update UnRar to 5.61
Still looking for help with SABnzbd (Python 3) development!
https://www.reddit.com/r/usenet/comments/918nxv/
## Upgrading from 2.2.x and older
- Finish queue


@@ -20,7 +20,6 @@ if sys.version_info[:2] < (2, 6) or sys.version_info[:2] >= (3, 0):
print "Sorry, requires Python 2.6 or 2.7."
sys.exit(1)
import os
import time
import subprocess


@@ -182,7 +182,7 @@ def print_help():
print " -s --server <srv:port> Listen on server:port [*]"
print " -t --templates <templ> Template directory [*]"
print
print " -l --logging <0..2> Set logging level (-1=off, 0= least, 2= most) [*]"
print " -l --logging <-1..2> Set logging level (-1=off, 0= least, 2= most) [*]"
print " -w --weblogging Enable cherrypy access logging"
print
print " -b --browser <0..1> Auto browser launch (0= off, 1= on) [*]"
@@ -209,7 +209,7 @@ def print_help():
print " --new Run a new instance of SABnzbd"
print ""
print "NZB (or related) file:"
print " NZB or zipped NZB file, with extension .nzb, .zip, .rar, .gz, or .bz2"
print " NZB or compressed NZB file, with extension .nzb, .zip, .rar, .7z, .gz, or .bz2"
print ""
@@ -502,7 +502,7 @@ def all_localhosts():
def check_resolve(host):
""" Return True if 'host' resolves """
try:
dummy = socket.getaddrinfo(host, None)
socket.getaddrinfo(host, None)
except:
# Does not resolve
return False
@@ -598,7 +598,7 @@ def get_webhost(cherryhost, cherryport, https_port):
cherryhost = cherryhost.strip('[]')
else:
try:
info = socket.getaddrinfo(cherryhost, None)
socket.getaddrinfo(cherryhost, None)
except:
cherryhost = cherryhost.strip('[]')
@@ -659,12 +659,12 @@ def attach_server(host, port, cert=None, key=None, chain=None):
def is_sabnzbd_running(url):
""" Return True when there's already a SABnzbd instance running. """
try:
url = '%s&mode=version' % (url)
url = '%s&mode=version' % url
# Do this without certificate verification, few installations will have that
prev = sabnzbd.set_https_verification(False)
ver = get_from_url(url)
sabnzbd.set_https_verification(prev)
return (ver and (re.search(r'\d+\.\d+\.', ver) or ver.strip() == sabnzbd.__version__))
return ver and (re.search(r'\d+\.\d+\.', ver) or ver.strip() == sabnzbd.__version__)
except:
return False
@@ -728,7 +728,7 @@ def evaluate_inipath(path):
return path
def commandline_handler(frozen=True):
def commandline_handler():
""" Split win32-service commands are true parameters
Returns:
service, sab_opts, serv_opts, upload_nzbs
@@ -829,7 +829,6 @@ def main():
vista_plus = False
win64 = False
repair = 0
api_url = None
no_login = False
sabnzbd.RESTART_ARGS = [sys.argv[0]]
pid_path = None
@@ -865,9 +864,9 @@ def main():
elif opt in ('-b', '--browser'):
try:
autobrowser = bool(int(arg))
except:
except ValueError:
autobrowser = True
elif opt in ('--autorestarted', ):
elif opt == '--autorestarted':
autorestarted = True
elif opt in ('-c', '--clean'):
clean_up = True
@@ -886,36 +885,36 @@ def main():
exit_sab(0)
elif opt in ('-p', '--pause'):
pause = True
elif opt in ('--https',):
elif opt == '--https':
https_port = int(arg)
sabnzbd.RESTART_ARGS.append(opt)
sabnzbd.RESTART_ARGS.append(arg)
elif opt in ('--repair',):
elif opt == '--repair':
repair = 1
pause = True
elif opt in ('--repair-all',):
elif opt == '--repair-all':
repair = 2
pause = True
elif opt in ('--log-all',):
elif opt == '--log-all':
sabnzbd.LOG_ALL = True
elif opt in ('--disable-file-log'):
elif opt == '--disable-file-log':
no_file_log = True
elif opt in ('--no-login',):
elif opt == '--no-login':
no_login = True
elif opt in ('--pid',):
elif opt == '--pid':
pid_path = arg
sabnzbd.RESTART_ARGS.append(opt)
sabnzbd.RESTART_ARGS.append(arg)
elif opt in ('--pidfile',):
elif opt == '--pidfile':
pid_file = arg
sabnzbd.RESTART_ARGS.append(opt)
sabnzbd.RESTART_ARGS.append(arg)
elif opt in ('--new',):
elif opt == '--new':
new_instance = True
elif opt in ('--console',):
elif opt == '--console':
sabnzbd.RESTART_ARGS.append(opt)
osx_console = True
elif opt in ('--ipv6_hosting',):
elif opt == '--ipv6_hosting':
ipv6_hosting = arg
sabnzbd.MY_FULLNAME = os.path.normpath(os.path.abspath(sabnzbd.MY_FULLNAME))
@@ -1006,13 +1005,13 @@ def main():
if enable_https and https_port:
try:
cherrypy.process.servers.check_port(cherryhost, https_port, timeout=0.05)
except IOError, error:
except IOError:
Bail_Out(browserhost, cherryport)
except:
Bail_Out(browserhost, cherryport, '49')
try:
cherrypy.process.servers.check_port(cherryhost, cherryport, timeout=0.05)
except IOError, error:
except IOError:
Bail_Out(browserhost, cherryport)
except:
Bail_Out(browserhost, cherryport, '49')
@@ -1049,7 +1048,7 @@ def main():
else:
# In case HTTPS == HTTP port
cherryport = newport
sabnzbd.cfg.port.set(newport)
sabnzbd.cfg.cherryport.set(newport)
except:
# Something else wrong, probably badly specified host
Bail_Out(browserhost, cherryport, '49')
@@ -1180,7 +1179,7 @@ def main():
logging.info('Preferred encoding = ERROR')
preferredencoding = ''
# On Linux/FreeBSD/Unix "UTF-8" is strongly, strongly adviced:
# On Linux/FreeBSD/Unix "UTF-8" is strongly, strongly advised:
if not sabnzbd.WIN32 and not sabnzbd.DARWIN and not ('utf' in preferredencoding.lower() and '8' in preferredencoding.lower()):
logging.warning(T("SABnzbd was started with encoding %s, this should be UTF-8. Expect problems with Unicoded file and directory names in downloads.") % preferredencoding)
@@ -1237,8 +1236,6 @@ def main():
if autobrowser is not None:
sabnzbd.cfg.autobrowser.set(autobrowser)
else:
autobrowser = sabnzbd.cfg.autobrowser()
if not sabnzbd.WIN_SERVICE and not getattr(sys, 'frozen', None) == 'macosx_app':
signal.signal(signal.SIGINT, sabnzbd.sig_handler)
@@ -1601,7 +1598,7 @@ if sabnzbd.WIN32:
win32serviceutil.ServiceFramework.__init__(self, args)
self.hWaitStop = win32event.CreateEvent(None, 0, 0, None)
self.overlapped = pywintypes.OVERLAPPED() # @UndefinedVariable
self.overlapped = pywintypes.OVERLAPPED()
self.overlapped.hEvent = win32event.CreateEvent(None, 0, 0, None)
sabnzbd.WIN_SERVICE = self


@@ -1,6 +1,14 @@
environment:
SAB_NEWSSERVER_HOST:
secure: UNnTfVHDugC9amTucdTRyxe8RZfVBLYfI1EOTaDUjNM=
SAB_NEWSSERVER_USER:
secure: npe0D4TiEzXMUVMCH3+SHA==
SAB_NEWSSERVER_PASSWORD:
secure: 28COv3RG+KAnBLxIrR1EDw==
install:
- pip install --upgrade -r tests/requirements.txt
- pip install pypiwin32 subprocessww
build_script:
- pytest
- python ./tests/test_functional.py


@@ -194,7 +194,7 @@
<fieldset>
<div class="field-pair">
<label class="config" for="nscript_script">$T('opt-nscript_script')</label>
<select name="nscript_script">
<select name="nscript_script" id="nscript_script">
<!--#for $sc in $scripts#-->
<option value="$sc" <!--#if $nscript_script == $sc then 'selected="selected"' else ""#-->>$Tspec($sc)</option>
<!--#end for#-->


@@ -390,9 +390,10 @@
<th class="no-sort">$T('link-download')</th>
<th>$T('rss-filter')</th>
<th>$T('size')</th>
<th width="65%">$T('sort-title')</th>
<th width="60%">$T('sort-title')</th>
<th>$T('category')</th>
<th class="default-sort">$T('nzo-age')</th>
<th>$T('source')</th>
</tr>
</thead>
<!--#for $job in $matched#-->
@@ -411,6 +412,13 @@
<td>$job['title']</td>
<td>$job['cat']</td>
<td data-sort-value="$job['age_ms']">$job['age']</td>
<td data-sort-value="$job['baselink']" title="$job['baselink']">
<!--#if not $job['infourl']#-->
<div class="favicon source-icon" style="background-image: url(//$job['baselink']/favicon.ico);" data-domain="$job['baselink']"></div>
<!--#else#-->
<a class="favicon source-icon" href="$job['infourl']" target="_blank" style="background-image: url(//$job['baselink']/favicon.ico);" data-domain="$job['baselink']"></a>
<!--#end if#-->
</td>
</tr>
<!--#end for#-->
</table>
@@ -426,9 +434,10 @@
<th class="no-sort">$T('link-download')</th>
<th>$T('rss-filter')</th>
<th>$T('size')</th>
<th width="65%">$T('sort-title')</th>
<th width="60%">$T('sort-title')</th>
<th>$T('category')</th>
<th class="default-sort">$T('nzo-age')</th>
<th>$T('source')</th>
</tr>
</thead>
<!--#for $job in $unmatched#-->
@@ -447,6 +456,13 @@
<td>$job['title']</td>
<td>$job['cat']</td>
<td data-sort-value="$job['age_ms']">$job['age']</td>
<td data-sort-value="$job['baselink']" title="$job['baselink']">
<!--#if not $job['infourl']#-->
<div class="favicon source-icon" style="background-image: url(//$job['baselink']/favicon.ico);" data-domain="$job['baselink']"></div>
<!--#else#-->
<a class="favicon source-icon" href="$job['infourl']" target="_blank" style="background-image: url(//$job['baselink']/favicon.ico);" data-domain="$job['baselink']"></a>
<!--#end if#-->
</td>
</tr>
<!--#end for#-->
</table>
@@ -476,8 +492,10 @@
<td>$job['title']</td>
<td>$job['cat']</td>
<td data-sort-value="$job['baselink']" title="$job['baselink']">
<!--#if $job['baselink']#-->
<!--#if not $job['infourl']#-->
<div class="favicon source-icon" style="background-image: url(//$job['baselink']/favicon.ico);" data-domain="$job['baselink']"></div>
<!--#else#-->
<a class="favicon source-icon" href="$job['infourl']" target="_blank" style="background-image: url(//$job['baselink']/favicon.ico);" data-domain="$job['baselink']"></a>
<!--#end if#-->
</td>
</tr>


@@ -4,16 +4,13 @@ body {
}
#logo {
display: block;
margin: auto;
margin-top: 3px;
margin: 3px auto auto;
}
#content {
color: #000;
padding: 15px 20px 20px;
padding: 65px 20px 20px;
font-size: 13px;
padding-top: 65px;
padding-bottom: 20px;
}
.colmask {
z-index: 20;
@@ -529,7 +526,7 @@ tr.separator {
}
#filebrowser_modal .checkbox {
float: left;
margin: 8px 5px 0x;
margin: 8px 5px 0px;
}
#filebrowser_modal .checkbox input {
margin-top: 1px;
@@ -576,6 +573,7 @@ h2.activeRSS {
float: left;
margin: 0 6px 0 2px;
text-align: center;
color: black !important;
}
.source-icon span {
top: -3px;
@@ -600,8 +598,7 @@ h2.activeRSS {
padding-top: .4em;
}
#subscriptions .chk {
padding: 5px;
padding-top: 8px;
padding: 8px 5px 5px;
vertical-align: middle;
}
#subscriptions .title {
@@ -773,7 +770,6 @@ input[type=radio] {
input[type="button"],
input[type="submit"] {
color: #333;
background-color: #fff;
display:inline-block;
padding:6px 12px;
margin-bottom: 0;
@@ -784,7 +780,7 @@ input[type="submit"] {
white-space:nowrap;
vertical-align:middle;
cursor:pointer;
background-image:none;
background: #fff none;
border:1px solid #ccc;
height: 34px;
}
@@ -1002,7 +998,7 @@ input[type="checkbox"] {
}
.Servers .col2.server-disabled .label {
color: ##777 !important;
color: #777 !important;
}
.Servers .col2 .label:nth-child(2) {
@@ -1063,9 +1059,7 @@ input[type="checkbox"] {
.Servers .col2 label,
.Email .col2 label {
margin: 0;
margin-left: 4px;
margin-top: 2px;
margin: 2px 0 0 4px;
cursor: pointer;
}
@@ -1141,6 +1135,7 @@ input[type="checkbox"] {
}
.value-and-select select {
min-width: 30px;
margin-top: 1px;
}
.dotOne, .dotTwo, .dotThree {
@@ -1341,9 +1336,7 @@ input[type="checkbox"] {
}
.desc {
margin: 0;
margin-left: 3px;
margin-top: 2px;
margin: 2px 0 0 3px;
padding: 0 !important;
}


@@ -76,7 +76,7 @@ legend,
background-color: #666;
}
.navbar-collapse.in .dropdown-menu, {
.navbar-collapse.in .dropdown-menu {
border: none;
}


@@ -105,10 +105,7 @@ h2 {
.navbar-logo {
vertical-align: middle;
display: inline-block;
margin-right: 12px;
margin-left: 15px;
margin-top: 4px;
margin-bottom: -1px;
margin: 4px 12px -1px 15px;
}
.navbar-logo svg {
@@ -288,8 +285,7 @@ li.dropdown {
opacity: 0.9;
color: black;
z-index: 2000;
padding: 1em;
padding-top: 15%;
padding: 15% 1em 1em;
}
.main-filedrop.in span {
@@ -721,8 +717,7 @@ td.delete .dropdown>a {
td.delete input[type="checkbox"],
.add-nzb-inputbox-options input[type="checkbox"]{
margin: 0;
margin-bottom: -2px;
margin: 0 0 -2px;
display: block;
}
@@ -1155,8 +1150,7 @@ tr.queue-item>td:first-child>a {
#history-options {
margin-top: 0;
margin-left: 10px;
padding: 0;
padding-left: 4px;
padding: 0 0 0 4px;
}
#history-options .hover-button {
@@ -1536,8 +1530,7 @@ tr.queue-item>td:first-child>a {
.add-nzb-inputbox span {
display: inline-block;
margin: 8px 2px 0px 5px;
margin-left: -20px;
margin: 8px 2px 0px -20px;
}
.btn-file {
@@ -1630,11 +1623,9 @@ input[name="nzbURL"] {
 #modal-item-files .multioperations-selector {
 clear: left;
-margin: 0;
 float: left;
 padding: 5px 8px;
-margin-bottom: 5px;
-margin-right: 10px;
+margin: 0 10px 5px 0;
 border: 1px solid #cccccc;
 }
@@ -2045,9 +2036,8 @@ a:focus {
 right: 17px;
 display: inline-block;
 border-right: 6px solid transparent;
-border-bottom: 6px solid #ccc;
+border-bottom: 6px solid rgba(0, 0, 0, 0.2);
 border-left: 6px solid transparent;
-border-bottom-color: rgba(0, 0, 0, 0.2);
 content: '';
 }

View File

@@ -20,7 +20,7 @@
 <div class="form-group">
 <label for="host" class="col-sm-4 control-label">$T('srv-host')</label>
 <div class="col-sm-8">
-<input type="text" class="form-control" name="host" id="host" value="$host" placeholder="$T('wizard-example') news.giganews.com" />
+<input type="text" class="form-control" name="host" id="host" value="$host" placeholder="$T('wizard-example') news.newshosting.com" />
 </div>
 </div>
 <div class="form-group">

View File

@@ -88,19 +88,12 @@ label {
 float: right;
 margin: 0;
 }
-.sup {
-vertical-align: sup !important;
-}
 .align-right {
 text-align: right;
 }
 .align-center {
 text-align: center;
 }
-.float-center {
-float: center;
-}
 .unselected,
 .selected {
 display: inline-block;
@@ -123,9 +116,6 @@ label {
 .bigger {
 font-size: 14px;
 }
-.padded {
-padding: 12px;
-}
 .bigger input {
 font-size: 16px;
 }
@@ -135,9 +125,6 @@ label {
 .full-width {
 width: 100%;
 }
-.bigbutton {
-font-size: 18px !important;
-}
 .correct {
 border: 2px solid #00cc22;
 }
@@ -153,7 +140,6 @@ label {
 .text-input-wide {
 width: 230px;
 }
-.text-input-thin,
 #server-hidden-settings input[type="number"] {
 width: 100px;
}

View File

@@ -22,13 +22,13 @@
 <p><strong>$T('opt-complete_dir')</strong></p>
 <div class="quoteBlock">
 $complete_dir
-<a href="${access_url}config/folders" class="indented"><span class="glyphicon glyphicon-cog"></span></a>
+<a href="${access_url}/config/folders#complete_dir" class="indented"><span class="glyphicon glyphicon-cog"></span></a>
 </div>
 <p><strong>$T('opt-download_dir')</strong></p>
 <div class="quoteBlock">
 $download_dir
-<a href="${access_url}config/folders" class="indented"><span class="glyphicon glyphicon-cog"></span></a>
+<a href="${access_url}/config/folders#complete_dir" class="indented"><span class="glyphicon glyphicon-cog"></span></a>
 </div>
 <hr/>

View File

@@ -53,7 +53,7 @@ the various releases.
2.4.2 2.4.1 2005 PSF yes
2.4.3 2.4.2 2006 PSF yes
2.5 2.4 2006 PSF yes
2.5.1 2.5 2007 PSF yes
2.7 2.6 2010 PSF yes
Footnotes:
@@ -89,9 +89,9 @@ license to reproduce, analyze, test, perform and/or display publicly,
 prepare derivative works, distribute, and otherwise use Python
 alone or in any derivative version, provided, however, that PSF's
 License Agreement and PSF's notice of copyright, i.e., "Copyright (c)
-2001, 2002, 2003, 2004, 2005, 2006, 2007 Python Software Foundation;
-All Rights Reserved" are retained in Python alone or in any derivative
-version prepared by Licensee.
+2001, 2002, 2003, 2004, 2005, 2006 Python Software Foundation; All Rights
+Reserved" are retained in Python alone or in any derivative version
+prepared by Licensee.
 3. In the event Licensee prepares a derivative work that is based on
 or incorporates Python or any part thereof, and wants to make
or incorporates Python or any part thereof, and wants to make

View File

Binary file not shown.

View File

@@ -8,14 +8,14 @@ msgstr ""
 "Project-Id-Version: sabnzbd\n"
 "Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
 "POT-Creation-Date: 2018-03-15 13:04+0000\n"
-"PO-Revision-Date: 2013-05-05 14:50+0000\n"
-"Last-Translator: shypike <Unknown>\n"
+"PO-Revision-Date: 2018-11-27 23:39+0000\n"
+"Last-Translator: scootergrisen <scootergrisen@gmail.com>\n"
 "Language-Team: Danish <da@li.org>\n"
 "MIME-Version: 1.0\n"
 "Content-Type: text/plain; charset=UTF-8\n"
 "Content-Transfer-Encoding: 8bit\n"
-"X-Launchpad-Export-Date: 2018-03-16 05:37+0000\n"
-"X-Generator: Launchpad (build 18571)\n"
+"X-Launchpad-Export-Date: 2018-11-28 05:48+0000\n"
+"X-Generator: Launchpad (build 18826)\n"
#: email/email.tmpl:1
msgid ""
@@ -65,13 +65,13 @@ msgid ""
 "<!--#end if#-->\n"
 msgstr ""
 "##\n"
-"## Standard Email skabelon til SABnzbd\n"
-"## Dette er en Cheetah skabelon\n"
+"## Standard E-mail-skabelon til SABnzbd\n"
+"## Dette er en Cheetah-skabelon\n"
 "## Dokumentation: http://sabnzbd.wikidot.com/email-templates\n"
 "##\n"
-"## Linjeskift og blanktegn er betydelig!\n"
+"## Linjeskift og blanktegn har betydning!\n"
 "##\n"
-"## Disse er e-mail-headerne \n"
+"## Dette er e-mail-headerne \n"
 "To: $to\n"
 "From: $from\n"
 "Date: $date\n"
@@ -79,7 +79,7 @@ msgstr ""
 "job $name\n"
 "X-priority: 5\n"
 "X-MS-priority: 5\n"
-"## Efter dette kommer body, den tomme linje kræves!\n"
+"## Herefter kommer kroppen, den tomme linje skal være der!\n"
 "\n"
 "Hej,\n"
 "<!--#if $status #-->\n"
@@ -100,13 +100,13 @@ msgstr ""
 "<!--#end for#-->\n"
 "<!--#end for#-->\n"
 "<!--#if $script!=\"\" #-->\n"
-"Output fra bruger script \"$script\" (Exit code = $script_ret):\n"
+"Output fra brugerscriptet \"$script\" (Afslutningskode = $script_ret):\n"
 "$script_output\n"
 "<!--#end if#-->\n"
 "<!--#if $status #-->\n"
-"Enjoy!\n"
+"Hav det godt!\n"
 "<!--#else#-->\n"
-"Sorry!\n"
+"Beklager!\n"
 "<!--#end if#-->\n"
#: email/rss.tmpl:1
@@ -138,25 +138,25 @@ msgid ""
 "Bye\n"
 msgstr ""
 "##\n"
-"## RSS Email skabelon til SABnzbd\n"
-"## Dette er Cheetah skabelon\n"
+"## RSS E-mail-skabelon til SABnzbd\n"
+"## Dette er en Cheetah-skabelon\n"
 "## Dokumentation: http://sabnzbd.wikidot.com/email-templates\n"
 "##\n"
-"## Linjeskift og blanktegn er betydelig!\n"
+"## Linjeskift og blanktegn har betydning!\n"
 "##\n"
-"## Dette er email headers\n"
+"## Dette er e-mai-headere\n"
 "To: $to\n"
 "From: $from\n"
 "Date: $date\n"
 "Subject: SABnzbd har tilføjet $antal jobs til køen\n"
 "X-priority: 5\n"
 "X-MS-priority: 5\n"
-"## Efter dette kommer body, den tomme linje kræves!\n"
+"## Herefter kommer kroppen, den tomme linje skal være der!\n"
 "\n"
 "Hej,\n"
 "\n"
 "SABnzbd har tilføjet $antal job(s) til køen.\n"
-"De er fra RSS feed \"$feed\".\n"
+"De er fra RSS-feedet \"$feed\".\n"
 "<!--#for $job in $jobs#-->\n"
 " $job <!--#slurp#-->\n"
 "<!--#end for#-->\n"
@@ -189,24 +189,24 @@ msgid ""
 "Bye\n"
 msgstr ""
 "##\n"
-"## Dårlig URL Fetch E-mail skabelon for SABnzbd\n"
-"## Dette er en Cheetah skabelon\n"
+"## Dårlig URL-hentning af E-mail-skabelon til SABnzbd\n"
+"## Dette er en Cheetah-skabelon\n"
 "## Dokumentation: http://sabnzbd.wikidot.com/email-templates\n"
 "##\n"
-"## Linjeskift og blanktegn er betydelig!\n"
+"## Linjeskift og blanktegn har betydning!\n"
 "##\n"
-"## Dette er email headers\n"
+"## Dette er e-mail-headere\n"
 "To: $to\n"
 "From: $from\n"
 "Date: $date\n"
 "Subject: SABnzbd kunne ikke hente en NZB\n"
 "X-priority: 5\n"
 "X-MS-priority: 5\n"
-"## Efter dette kommer body, den tomme linje kræves!\n"
+"## Herefter kommer kroppen, den tomme linje skal være der!\n"
 "\n"
 "Hej,\n"
 "\n"
 "SABnzbd kunne ikke hente NZB fra $url.\n"
-"Fejl meddelelsen er: $msg\n"
+"Fejlmeddelelsen er: $msg\n"
 "\n"
 "Farvel\n"

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

File diff suppressed because it is too large

View File

@@ -8,14 +8,14 @@ msgstr ""
 "Project-Id-Version: sabnzbd\n"
 "Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
 "POT-Creation-Date: 2018-03-15 13:05+0000\n"
-"PO-Revision-Date: 2017-04-10 11:28+0000\n"
-"Last-Translator: Safihre <safihre@sabnzbd.org>\n"
+"PO-Revision-Date: 2018-11-27 23:30+0000\n"
+"Last-Translator: scootergrisen <scootergrisen@gmail.com>\n"
 "Language-Team: Danish <da@li.org>\n"
 "MIME-Version: 1.0\n"
 "Content-Type: text/plain; charset=UTF-8\n"
 "Content-Transfer-Encoding: 8bit\n"
-"X-Launchpad-Export-Date: 2018-03-16 05:38+0000\n"
-"X-Generator: Launchpad (build 18571)\n"
+"X-Launchpad-Export-Date: 2018-11-28 05:48+0000\n"
+"X-Generator: Launchpad (build 18826)\n"
#: NSIS_Installer.nsi
msgid "Show Release Notes"
@@ -23,7 +23,7 @@ msgstr "Vis udgivelsesbemærkninger"
 #: NSIS_Installer.nsi
 msgid "Start SABnzbd"
-msgstr ""
+msgstr "Start SABnzbd"
 #: NSIS_Installer.nsi
 msgid "Support the project, Donate!"
@@ -38,7 +38,7 @@ msgid ""
 "The installation directory has changed (now in \"Program Files\"). \\nIf you "
 "run SABnzbd as a service, you need to update the service settings."
 msgstr ""
-"Installationsmappen er ændret (nu i \"Program Files \"). \\nHvis du kører "
+"Installationsmappen er ændret (nu i \"Program Files\"). \\nHvis du kører "
 "SABnzbd som en tjeneste, skal du opdatere tjenesteindstillingerne."
#: NSIS_Installer.nsi
@@ -55,7 +55,7 @@ msgstr "Skrivebordsikon"
 #: NSIS_Installer.nsi
 msgid "NZB File association"
-msgstr "NZB filtilknytning"
+msgstr "NZB-filtilknytning"
 #: NSIS_Installer.nsi
 msgid "Delete Program"
@@ -70,20 +70,20 @@ msgid ""
 "This system requires the Microsoft runtime library VC90 to be installed "
 "first. Do you want to do that now?"
 msgstr ""
-"Dette system kræver, at Microsoft runtime biblioteket VC90 skal installeres "
-"først. Ønsker du at gøre det nu?"
+"Systemet kræver at Microsoft runtime-biblioteket VC90 skal installeres "
+"først. Vil du gøre det nu?"
 #: NSIS_Installer.nsi
 msgid "Downloading Microsoft runtime installer..."
-msgstr "Downloader Microsoft runtime installationsfil..."
+msgstr "Downloader Microsoft runtime-installationsfil..."
 #: NSIS_Installer.nsi
 msgid "Download error, retry?"
-msgstr "Download fejl, prøv igen?"
+msgstr "Fejl ved download, prøv igen?"
 #: NSIS_Installer.nsi
 msgid "Cannot install without runtime library, retry?"
-msgstr "Kan ikke installere uden runtime bibliotek, prøv igen?"
+msgstr "Kan ikke installere uden runtime-bibliotek, prøv igen?"
#: NSIS_Installer.nsi
msgid ""
@@ -91,8 +91,7 @@ msgid ""
 "the previous version or `Cancel` to cancel this upgrade."
 msgstr ""
 "Du kan ikke overskrive en eksisterende installation. \\n\\nKlik `OK` for at "
-"fjerne den tidligere version eller `Annuller` for at annullere denne "
-"opgradering."
+"fjerne den tidligere version eller `Annuller` for at annullere opgraderingen."
#: NSIS_Installer.nsi
msgid "Your settings and data will be preserved."

View File

@@ -202,7 +202,7 @@ def sig_handler(signum=None, frame=None):
 INIT_LOCK = Lock()
-def connect_db(thread_index=0):
+def get_db_connection(thread_index=0):
 # Create a connection and store it in the current thread
 if not (hasattr(cherrypy.thread_data, 'history_db') and cherrypy.thread_data.history_db):
 cherrypy.thread_data.history_db = sabnzbd.database.HistoryDB()
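The renamed helper caches one database connection per web-server thread. A minimal standalone sketch of the same pattern, with `threading.local` and sqlite3 standing in for CherryPy's `thread_data` and SABnzbd's `HistoryDB`:

```python
import sqlite3
import threading

_thread_local = threading.local()

def get_db_connection():
    # Lazily create one connection per thread; repeated calls from the
    # same thread reuse the cached connection instead of reconnecting.
    if getattr(_thread_local, 'history_db', None) is None:
        _thread_local.history_db = sqlite3.connect(':memory:')
    return _thread_local.history_db

conn1 = get_db_connection()
conn2 = get_db_connection()
assert conn1 is conn2  # same thread, same cached connection
```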
@@ -223,7 +223,7 @@ def initialize(pause_downloader=False, clean_up=False, evalSched=False, repair=0
 __SHUTTING_DOWN__ = False
 # Set global database connection for Web-UI threads
-cherrypy.engine.subscribe('start_thread', connect_db)
+cherrypy.engine.subscribe('start_thread', get_db_connection)
 # Paused?
 pause_downloader = pause_downloader or cfg.start_paused()
@@ -661,13 +661,13 @@ def add_nzbfile(nzbfile, pp=None, script=None, cat=None, priority=NORMAL_PRIORIT
 try:
 filename = nzbfile.filename.encode('cp1252').decode('utf-8')
 except:
-# Correct encoding afterall!
+# Correct encoding after all!
 filename = nzbfile.filename
 filename = encoding.special_fixer(filename)
 keep = False
 if not sabnzbd.WIN32:
-# If windows client sends file to Unix server backslashed may
+# If windows client sends file to Unix server backslashes may
 # be included, so convert these
 filename = filename.replace('\\', '/')
@@ -963,9 +963,9 @@ def save_admin(data, _id):
 try:
 with open(path, 'wb') as data_file:
 if cfg.use_pickle():
-data = pickle.dump(data, data_file)
+pickle.dump(data, data_file)
 else:
-data = cPickle.dump(data, data_file)
+cPickle.dump(data, data_file)
 break
 except:
 if t == 2:
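The `save_admin` fix above matters because `pickle.dump()` writes to the file object and returns `None`, so the old `data = pickle.dump(...)` silently replaced `data` with `None` before the retry loop could use it again. A standalone demonstration:

```python
import io
import pickle

data = {'queue': ['job1', 'job2']}
buf = io.BytesIO()

# dump() serializes into buf and returns None; assigning its result
# back to `data` would destroy the object on the first attempt.
result = pickle.dump(data, buf)
assert result is None

buf.seek(0)
assert pickle.load(buf) == {'queue': ['job1', 'job2']}
```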
@@ -1008,12 +1008,12 @@ def pp_to_opts(pp):
 # Convert the pp to an int
 pp = sabnzbd.interface.int_conv(pp)
 if pp == 0:
-return (False, False, False)
+return False, False, False
 if pp == 1:
-return (True, False, False)
+return True, False, False
 if pp == 2:
-return (True, True, False)
-return (True, True, True)
+return True, True, False
+return True, True, True
 def opts_to_pp(repair, unpack, delete):
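The parentheses dropped in this hunk are redundant in Python: the comma builds the tuple, not the parentheses. A runnable copy of the simplified mapping (indentation reconstructed, since the diff lost it):

```python
def pp_to_opts(pp):
    # Post-processing level: 0 = download only, 1 = +repair,
    # 2 = +unpack, 3 = +delete. Returns (repair, unpack, delete).
    if pp == 0:
        return False, False, False
    if pp == 1:
        return True, False, False
    if pp == 2:
        return True, True, False
    return True, True, True

assert pp_to_opts(2) == (True, True, False)
```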
@@ -1195,6 +1195,10 @@ def test_cert_checking():
 On systems with at least Python > 2.7.9
 """
 if sabnzbd.HAVE_SSL_CONTEXT:
+# User disabled the test, assume proper SSL certificates
+if not cfg.selftest_host():
+return True
+# Try a connection to our test-host
 try:
 import ssl
 ctx = ssl.create_default_context()
@@ -1204,7 +1208,7 @@ def test_cert_checking():
 ssl_sock.connect((cfg.selftest_host(), 443))
 ssl_sock.close()
 return True
-except (socket.gaierror, socket.timeout) as e:
+except (socket.gaierror, socket.timeout):
 # Non-SSL related error.
 # We now assume that certificates work instead of forcing
 # lower quality just because some (temporary) internet problem
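The self-test above leans on `ssl.create_default_context()`, which turns on certificate and hostname verification by default (the call exists on Python 2.7.9+ as well as Python 3). A minimal check of those defaults, without any network access:

```python
import ssl

# A default context verifies the peer certificate chain and checks
# that the certificate matches the hostname we connect to.
ctx = ssl.create_default_context()
assert ctx.check_hostname is True
assert ctx.verify_mode == ssl.CERT_REQUIRED
```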

View File

@@ -29,6 +29,7 @@ import cherrypy
 import locale
+from threading import Thread
 try:
 locale.setlocale(locale.LC_ALL, "")
 except:
@@ -42,8 +43,8 @@ except ImportError:
import sabnzbd
from sabnzbd.constants import VALID_ARCHIVES, VALID_NZB_FILES, Status, \
TOP_PRIORITY, REPAIR_PRIORITY, HIGH_PRIORITY, NORMAL_PRIORITY, LOW_PRIORITY, \
KIBI, MEBI, GIGI, JOB_ADMIN
TOP_PRIORITY, REPAIR_PRIORITY, HIGH_PRIORITY, NORMAL_PRIORITY, LOW_PRIORITY, \
KIBI, MEBI, GIGI, JOB_ADMIN
import sabnzbd.config as config
import sabnzbd.cfg as cfg
from sabnzbd.downloader import Downloader
@@ -64,14 +65,12 @@ from sabnzbd.articlecache import ArticleCache
 from sabnzbd.utils.servertests import test_nntp_server_dict
 from sabnzbd.bpsmeter import BPSMeter
 from sabnzbd.rating import Rating
-from sabnzbd.getipaddress import localipv4, publicipv4, ipv6
+from sabnzbd.getipaddress import localipv4, publicipv4, ipv6, addresslookup
 from sabnzbd.newsunpack import userxbit
 from sabnzbd.database import build_history_info, unpack_history_info, HistoryDB
 import sabnzbd.notifier
 import sabnzbd.rss
 import sabnzbd.emailer
-import sabnzbd.getipaddress as getipaddress
##############################################################################
# API error messages
@@ -87,7 +86,6 @@ _MSG_OUTPUT_FORMAT = 'Format not supported'
_MSG_NO_SUCH_CONFIG = 'Config item does not exist'
_MSG_BAD_SERVER_PARMS = 'Incorrect server settings'
# For Windows: determine executable extensions
if os.name == 'nt':
PATHEXT = os.environ.get('PATHEXT', '').lower().split(';')
@@ -220,6 +218,8 @@ def _api_queue_pause(output, value, kwargs):
 if value:
 items = value.split(',')
 handled = NzbQueue.do.pause_multiple_nzo(items)
+else:
+handled = False
 return report(output, keyword='', data={'status': bool(handled), 'nzo_ids': handled})
@@ -228,6 +228,8 @@ def _api_queue_resume(output, value, kwargs):
 if value:
 items = value.split(',')
 handled = NzbQueue.do.resume_multiple_nzo(items)
+else:
+handled = False
 return report(output, keyword='', data={'status': bool(handled), 'nzo_ids': handled})
@@ -341,7 +343,7 @@ def _api_addfile(name, output, kwargs):
# Indexer category, so do mapping
cat = cat_convert(xcat)
res = sabnzbd.add_nzbfile(name, kwargs.get('pp'), kwargs.get('script'), cat,
kwargs.get('priority'), kwargs.get('nzbname'))
kwargs.get('priority'), kwargs.get('nzbname'))
return report(output, keyword='', data={'status': res[0] == 0, 'nzo_ids': res[1]}, compat=True)
else:
return report(output, _MSG_NO_VALUE)
@@ -462,6 +464,7 @@ def _api_change_opts(name, output, kwargs):
 """ API: accepts output, value(=nzo_id), value2(=pp) """
 value = kwargs.get('value')
 value2 = kwargs.get('value2')
+result = 0
 if value and value2 and value2.isdigit():
 result = NzbQueue.do.change_opts(value, int(value2))
 return report(output, keyword='status', data=bool(result > 0))
@@ -483,7 +486,6 @@ def _api_history(name, output, kwargs):
failed_only = kwargs.get('failed_only')
categories = kwargs.get('category')
# Do we need to send anything?
if last_history_update == sabnzbd.LAST_HISTORY_UPDATE:
return report(output, keyword='history', data=False)
@@ -498,7 +500,7 @@ def _api_history(name, output, kwargs):
 special = value.lower()
 del_files = bool(int_conv(kwargs.get('del_files')))
 if special in ('all', 'failed', 'completed'):
-history_db = sabnzbd.connect_db()
+history_db = sabnzbd.get_db_connection()
 if special in ('all', 'failed'):
 if del_files:
 del_job_files(history_db.get_failed_paths(search))
@@ -519,7 +521,7 @@ def _api_history(name, output, kwargs):
history = {}
grand, month, week, day = BPSMeter.do.get_sums()
history['total_size'], history['month_size'], history['week_size'], history['day_size'] = \
to_units(grand), to_units(month), to_units(week), to_units(day)
to_units(grand), to_units(month), to_units(week), to_units(day)
history['slots'], fetched_items, history['noofslots'] = build_history(start=start,
limit=limit, verbose=True,
search=search, failed_only=failed_only,
@@ -724,9 +726,7 @@ def _api_reset_quota(name, output, kwargs):
 def _api_test_email(name, output, kwargs):
 """ API: send a test email, return result """
 logging.info("Sending test email")
-pack = {}
-pack['download'] = ['action 1', 'action 2']
-pack['unpack'] = ['action 1', 'action 2']
+pack = {'download': ['action 1', 'action 2'], 'unpack': ['action 1', 'action 2']}
 res = sabnzbd.emailer.endjob(u'I had a d\xe8ja vu', 'unknown', True,
 os.path.normpath(os.path.join(cfg.complete_dir.get_path(), u'/unknown/I had a d\xe8ja vu')),
 123 * MEBI, None, pack, 'my_script', u'Line 1\nLine 2\nLine 3\nd\xe8ja vu\n', 0,
@@ -802,7 +802,6 @@ def _api_browse(name, output, kwargs):
 compact = kwargs.get('compact')
 if compact and compact == '1':
-paths = []
 name = platform_encode(kwargs.get('term', ''))
 paths = [entry['path'] for entry in folders_at_path(os.path.dirname(name)) if 'path' in entry]
 return report(output, keyword='', data=paths)
@@ -892,12 +891,11 @@ def _api_config_undefined(output, kwargs):
 def _api_server_stats(name, output, kwargs):
 """ API: accepts output """
 sum_t, sum_m, sum_w, sum_d = BPSMeter.do.get_sums()
-stats = {'total': sum_t, 'month': sum_m, 'week': sum_w, 'day': sum_d}
-stats['servers'] = {}
+stats = {'total': sum_t, 'month': sum_m, 'week': sum_w, 'day': sum_d, 'servers': {}}
 for svr in config.get_servers():
 t, m, w, d, daily = BPSMeter.do.amounts(svr)
-stats['servers'][svr] = {'total': t or 0, 'month': m or 0, 'week': w or 0, 'day': d or 0, 'daily': daily or {} }
+stats['servers'][svr] = {'total': t or 0, 'month': m or 0, 'week': w or 0, 'day': d or 0, 'daily': daily or {}}
 return report(output, keyword='', data=stats)
@@ -1150,6 +1148,24 @@ def handle_rss_api(output, kwargs):
 feed.set_dict(kwargs)
 else:
 config.ConfigRSS(name, kwargs)
+action = kwargs.get('filter_action')
+if action in ('add', 'update'):
+# Use the general function, but catch the redirect-raise
+try:
+kwargs['feed'] = name
+sabnzbd.interface.ConfigRss('/').internal_upd_rss_filter(**kwargs)
+except cherrypy.HTTPRedirect:
+pass
+elif action == 'delete':
+# Use the general function, but catch the redirect-raise
+try:
+kwargs['feed'] = name
+sabnzbd.interface.ConfigRss('/').internal_del_rss_filter(**kwargs)
+except cherrypy.HTTPRedirect:
+pass
 return name
@@ -1198,7 +1214,7 @@ def build_status(skip_dashboard=False, output=None):
 info['ipv6'] = ipv6()
 # Dashboard: DNS-check
 try:
-getipaddress.addresslookup(cfg.selftest_host())
+addresslookup(cfg.selftest_host())
 info['dnslookup'] = "OK"
 except:
 info['dnslookup'] = None
@@ -1233,10 +1249,10 @@ def build_status(skip_dashboard=False, output=None):
 # For the templates or for JSON
 if output:
-thread_info = { 'thrdnum': nw.thrdnum,
-'art_name': art_name,
-'nzf_name': nzf_name,
-'nzo_name': nzo_name }
+thread_info = {'thrdnum': nw.thrdnum,
+'art_name': art_name,
+'nzf_name': nzf_name,
+'nzo_name': nzo_name}
 serverconnections.append(thread_info)
 else:
 serverconnections.append((nw.thrdnum, art_name, nzf_name, nzo_name))
@@ -1253,20 +1269,20 @@ def build_status(skip_dashboard=False, output=None):
 # For the templates or for JSON
 if output:
-server_info = { 'servername': server.displayname,
-'serveractiveconn': connected,
-'servertotalconn': server.threads,
-'serverconnections': serverconnections,
-'serverssl': server.ssl,
-'serversslinfo': server.ssl_info,
-'serveractive': server.active,
-'servererror': server.errormsg,
-'serverpriority': server.priority,
-'serveroptional': server.optional }
+server_info = {'servername': server.displayname,
+'serveractiveconn': connected,
+'servertotalconn': server.threads,
+'serverconnections': serverconnections,
+'serverssl': server.ssl,
+'serversslinfo': server.ssl_info,
+'serveractive': server.active,
+'servererror': server.errormsg,
+'serverpriority': server.priority,
+'serveroptional': server.optional}
 info['servers'].append(server_info)
 else:
 info['servers'].append((server.displayname, '', connected, serverconnections, server.ssl,
-server.active, server.errormsg, server.priority, server.optional))
+server.active, server.errormsg, server.priority, server.optional))
 info['warnings'] = sabnzbd.GUIHANDLER.content()
@@ -1346,10 +1362,10 @@ def build_queue(start=0, limit=0, trans=False, output=None, search=None):
 # Ensure compatibility of API status
 if status == Status.DELETED or priority == TOP_PRIORITY:
 status = Status.DOWNLOADING
-slot['status'] = "%s" % (status)
+slot['status'] = "%s" % status
-if (Downloader.do.paused or Downloader.do.postproc or is_propagating or \
-status not in (Status.DOWNLOADING, Status.FETCHING, Status.QUEUED)) and priority != TOP_PRIORITY:
+if (Downloader.do.paused or Downloader.do.postproc or is_propagating or
+status not in (Status.DOWNLOADING, Status.FETCHING, Status.QUEUED)) and priority != TOP_PRIORITY:
 slot['timeleft'] = '0:00:00'
 slot['eta'] = 'unknown'
 else:
@@ -1510,16 +1526,17 @@ def options_list(output):
 })
-def retry_job(job, new_nzb, password):
+def retry_job(job, new_nzb=None, password=None):
 """ Re enter failed job in the download queue """
 if job:
-history_db = sabnzbd.connect_db()
+history_db = sabnzbd.get_db_connection()
 futuretype, url, pp, script, cat = history_db.get_other(job)
 if futuretype:
 if pp == 'X':
 pp = None
-sabnzbd.add_url(url, pp, script, cat)
+nzo_id = sabnzbd.add_url(url, pp, script, cat)
 history_db.remove_history(job)
+return nzo_id
 else:
 path = history_db.get_path(job)
 if path:
@@ -1531,8 +1548,13 @@ def retry_job(job, new_nzb, password):
 def retry_all_jobs():
 """ Re enter all failed jobs in the download queue """
-history_db = sabnzbd.connect_db()
-return NzbQueue.do.retry_all_jobs(history_db)
+# Fetch all retryable folders from History
+items = sabnzbd.api.build_history()[0]
+nzo_ids = []
+for item in items:
+if item['retry']:
+nzo_ids.append(retry_job(item['nzo_id']))
+return nzo_ids
def del_job_files(job_paths):
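The rewritten `retry_all_jobs` above now walks the history itself instead of delegating to the queue. A condensed standalone sketch of that loop, with `history_items` and `retry_job` as stand-ins for the real `build_history()` output and `retry_job()`:

```python
def retry_all_jobs(history_items, retry_job):
    # Re-queue every history item flagged as retryable and collect
    # the new queue ids returned by retry_job (mirrors the loop above).
    return [retry_job(item['nzo_id']) for item in history_items if item['retry']]

items = [{'nzo_id': 'a', 'retry': True}, {'nzo_id': 'b', 'retry': False}]
assert retry_all_jobs(items, lambda nzo_id: nzo_id.upper()) == ['A']
```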
@@ -1549,7 +1571,7 @@ def del_hist_job(job, del_files):
 if path:
 PostProcessor.do.delete(job, del_files=del_files)
 else:
-history_db = sabnzbd.connect_db()
+history_db = sabnzbd.get_db_connection()
 path = history_db.get_path(job)
 history_db.remove_history(job)
@@ -1568,7 +1590,9 @@ def Tspec(txt):
return txt
_SKIN_CACHE = {} # Stores pre-translated acronyms
_SKIN_CACHE = {} # Stores pre-translated acronyms
# This special is to be used in interface.py for template processing
# to be passed for the $T function: so { ..., 'T' : Ttemplate, ...}
def Ttemplate(txt):
@@ -1685,7 +1709,6 @@ def build_queue_header(search=None, start=0, limit=0, output=None):
header['size'] = format_bytes(bytes)
header['noofslots_total'] = qnfo.q_fullsize
status = ''
if Downloader.do.paused or Downloader.do.postproc:
status = Status.PAUSED
elif bytespersec > 0:
@@ -1700,15 +1723,13 @@ def build_queue_header(search=None, start=0, limit=0, output=None):
 # new eta format: 16:00 Fri 07 Feb
 header['eta'] = datestart.strftime(time_format('%H:%M %a %d %b')).decode(codepage)
 except:
-datestart = datetime.datetime.now()
 header['eta'] = T('unknown')
-return (header, qnfo.list, bytespersec, qnfo.q_fullsize, qnfo.bytes_left_previous_page)
+return header, qnfo.list, bytespersec, qnfo.q_fullsize, qnfo.bytes_left_previous_page
 def build_history(start=None, limit=None, verbose=False, verbose_list=None, search=None, failed_only=0,
 categories=None, output=None):
 if output:
 converter = unicoder
 else:
@@ -1761,7 +1782,7 @@ def build_history(start=None, limit=None, verbose=False, verbose_list=None, sear
 # Aquire the db instance
 try:
-history_db = sabnzbd.connect_db()
+history_db = sabnzbd.get_db_connection()
 close_db = False
 except:
 # Required for repairs at startup because Cherrypy isn't active yet
@@ -1772,7 +1793,6 @@ def build_history(start=None, limit=None, verbose=False, verbose_list=None, sear
if not h_limit:
items, fetched_items, total_items = history_db.fetch_history(h_start, 1, search, failed_only, categories)
items = []
fetched_items = 0
else:
items, fetched_items, total_items = history_db.fetch_history(h_start, h_limit, search, failed_only, categories)
@@ -1857,7 +1877,7 @@ def build_history(start=None, limit=None, verbose=False, verbose_list=None, sear
 if close_db:
 history_db.close()
-return (items, fetched_items, total_items)
+return items, fetched_items, total_items
def get_active_history(queue=None, items=None):
@@ -2013,7 +2033,6 @@ def history_remove_failed():
 del_job_files(history_db.get_failed_paths())
 history_db.remove_failed()
 history_db.close()
-del history_db
def history_remove_completed():
@@ -2022,4 +2041,3 @@ def history_remove_completed():
 history_db = HistoryDB()
 history_db.remove_completed()
 history_db.close()
-del history_db

View File

@@ -28,9 +28,9 @@ from time import sleep
 import hashlib
 import sabnzbd
-from sabnzbd.misc import get_filepath, sanitize_filename, get_unique_filename, renamer, \
-set_permissions, long_path, clip_path, has_win_device, get_all_passwords, diskspace, \
-get_filename, get_ext
+from sabnzbd.misc import get_filepath, sanitize_filename, set_permissions, \
+long_path, clip_path, has_win_device, get_all_passwords, diskspace, \
+get_filename, get_ext, is_rarfile
 from sabnzbd.constants import Status, GIGI
 import sabnzbd.cfg as cfg
 from sabnzbd.articlecache import ArticleCache
@@ -81,11 +81,6 @@ class Assembler(Thread):
 # Abort all direct unpackers, just to be sure
 sabnzbd.directunpacker.abort_all()
-# Place job back in queue and wait 30 seconds to hope it gets resolved
-self.process(job)
-sleep(30)
-continue
 # Prepare filename
 nzo.verify_nzf_filename(nzf)
 nzf.filename = sanitize_filename(nzf.filename)
@@ -117,7 +112,7 @@ class Assembler(Thread):
 nzf.remove_admin()
 # Do rar-related processing
-if rarfile.is_rarfile(filepath):
+if is_rarfile(filepath):
 # Encryption and unwanted extension detection
 rar_encrypted, unwanted_file = check_encrypted_and_unwanted_files(nzo, filepath)
 if rar_encrypted:
@@ -246,7 +241,7 @@ def check_encrypted_and_unwanted_files(nzo, filepath):
 return encrypted, unwanted
 # Is it even a rarfile?
-if rarfile.is_rarfile(filepath):
+if is_rarfile(filepath):
 # Open the rar
 rarfile.UNRAR_TOOL = sabnzbd.newsunpack.RAR_COMMAND
 zf = rarfile.RarFile(filepath, all_names=True)
@@ -334,11 +329,11 @@ def nzo_filtered_by_rating(nzo):
 nzo.rating_filtered = 1
 reason = rating_filtered(rating, nzo.filename.lower(), True)
 if reason is not None:
-return (2, reason)
+return 2, reason
 reason = rating_filtered(rating, nzo.filename.lower(), False)
 if reason is not None:
-return (1, reason)
-return (0, "")
+return 1, reason
+return 0, ""
 def rating_filtered(rating, filename, abort):

View File

@@ -265,6 +265,7 @@ no_penalties = OptionBool('misc', 'no_penalties', False)
 debug_log_decoding = OptionBool('misc', 'debug_log_decoding', False)
 ignore_empty_files = OptionBool('misc', 'ignore_empty_files', False)
 x_frame_options = OptionBool('misc', 'x_frame_options', True)
+require_modern_tls = OptionBool('misc', 'require_modern_tls', False)
 # Text values
 rss_odd_titles = OptionList('misc', 'rss_odd_titles', ['nzbindex.nl/', 'nzbindex.com/', 'nzbclub.com/'])
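The new `require_modern_tls` option refuses NNTPS connections below TLS 1.2 (see the "Make clear that require_modern_tls is TLS 1.2 and above" commit). A sketch of the same floor using the Python 3.7+ `ssl` API — an assumption for illustration, since SABnzbd 2.x runs on Python 2 and enforces this differently:

```python
import ssl

# Equivalent of require_modern_tls = 1: reject TLS 1.0/1.1 handshakes
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
assert ctx.minimum_version == ssl.TLSVersion.TLSv1_2
```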

View File

@@ -411,8 +411,8 @@ class ConfigServer(object):
except KeyError:
continue
exec 'self.%s.set(value)' % kw
if not self.displayname():
self.displayname.set(self.__name)
if not self.displayname():
self.displayname.set(self.__name)
return True
def get_dict(self, safe=False):
@@ -463,7 +463,7 @@ class ConfigCat(object):
 self.pp = OptionStr(name, 'pp', '', add=False)
 self.script = OptionStr(name, 'script', 'Default', add=False)
 self.dir = OptionDir(name, 'dir', add=False, create=False)
-self.newzbin = OptionList(name, 'newzbin', add=False)
+self.newzbin = OptionList(name, 'newzbin', add=False, validation=validate_single_tag)
 self.priority = OptionNumber(name, 'priority', DEFAULT_PRIORITY, add=False)
 self.set_dict(values)
@@ -896,7 +896,7 @@ def get_servers():
 return {}
-def define_categories(force=False):
+def define_categories():
 """ Define categories listed in the Setup file
 return a list of ConfigCat instances
 """
@@ -990,7 +990,7 @@ def get_rss():
 for feed_uri in feed.uri():
 if new_feed_uris and not urlparse(feed_uri).scheme and urlparse(new_feed_uris[-1]).scheme:
 # Current one has no scheme but previous one does, append to previous
-new_feed_uris[-1] += '%2C' + feed_uri
+new_feed_uris[-1] += ',' + feed_uri
 have_new_uri = True
 continue
 # Add full working URL
@@ -1102,6 +1102,16 @@ def validate_notempty(root, value, default):
 return None, default
+def validate_single_tag(value):
+""" Don't split single indexer tags like "TV > HD"
+into ['TV', '>', 'HD']
+"""
+if len(value) == 3:
+if value[1] == '>':
+return None, ' '.join(value)
+return None, value
 def create_api_key():
 """ Return a new randomized API_KEY """
 # Create some values to seed md5
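The new validator above rejoins an indexer tag that the option parser split on whitespace. A runnable copy (indentation reconstructed, since the diff lost it):

```python
def validate_single_tag(value):
    """ Don't split single indexer tags like "TV > HD"
        into ['TV', '>', 'HD']
    """
    if len(value) == 3:
        if value[1] == '>':
            return None, ' '.join(value)
    return None, value

# A three-element list with '>' in the middle is one tag, not three
assert validate_single_tag(['TV', '>', 'HD']) == (None, 'TV > HD')
# Anything else passes through unchanged
assert validate_single_tag(['movies', 'tv']) == (None, ['movies', 'tv'])
```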

View File

@@ -123,7 +123,7 @@ year_match = r'[\W]([1|2]\d{3})([^\w]|$)' # Something '(YYYY)' or '.YYYY.' or '
 sample_match = r'((^|[\W_])(sample|proof))' # something-sample or something-proof
-class Status():
+class Status:
 COMPLETED = 'Completed' # PP: Job is finished
 CHECKING = 'Checking' # Q: Pre-check is running
 DOWNLOADING = 'Downloading' # Q: Normal downloading

View File

@@ -40,7 +40,7 @@ from sabnzbd.constants import DB_HISTORY_NAME, STAGES
 from sabnzbd.encoding import unicoder
 from sabnzbd.bpsmeter import this_week, this_month
 from sabnzbd.decorators import synchronized
-from sabnzbd.misc import get_all_passwords, int_conv, remove_file, caller_name
+from sabnzbd.misc import int_conv, remove_file, caller_name
 DB_LOCK = threading.RLock()
@@ -118,7 +118,7 @@ class HistoryDB(object):
 self.execute('ALTER TABLE "history" ADD COLUMN password TEXT;')
 def execute(self, command, args=(), save=False):
-''' Wrapper for executing SQL commands '''
+""" Wrapper for executing SQL commands """
 for tries in xrange(5, 0, -1):
 try:
 if args and isinstance(args, tuple):
@@ -314,7 +314,7 @@ class HistoryDB(object):
 # Stage Name is separated by ::: stage lines by ; and stages by \r\n
 items = [unpack_history_info(item) for item in items]
-return (items, fetched_items, total_items)
+return items, fetched_items, total_items
def have_episode(self, series, season, episode):
""" Check whether History contains this series episode """
@@ -375,7 +375,7 @@ class HistoryDB(object):
 except AttributeError:
 pass
-return (total, month, week)
+return total, month, week
def get_script_log(self, nzo_id):
""" Return decompressed log file """
@@ -400,7 +400,7 @@ class HistoryDB(object):
 return name
 def get_path(self, nzo_id):
-""" Return the `incomplete` path of the job `nzo_id` """
+""" Return the `incomplete` path of the job `nzo_id` if it is still there """
 t = (nzo_id,)
 path = ''
 if self.execute('SELECT path FROM history WHERE nzo_id=?', t):
@@ -408,7 +408,9 @@ class HistoryDB(object):
 path = self.c.fetchone().get('path')
 except AttributeError:
 pass
-return path
+if os.path.exists(path):
+return path
+return None
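The `get_path` change above stops the database from reporting a job folder that was already removed from disk. The same guard, as a standalone sketch (`checked_path` is a hypothetical name):

```python
import os
import tempfile

def checked_path(path):
    # Only report a path that still exists on disk, else None —
    # mirrors the os.path.exists() guard added in the hunk above.
    if path and os.path.exists(path):
        return path
    return None

with tempfile.TemporaryDirectory() as tmp:
    assert checked_path(tmp) == tmp   # folder present
assert checked_path(tmp) is None      # gone after cleanup
```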
def get_other(self, nzo_id):
""" Return additional data for job `nzo_id` """
@@ -421,9 +423,10 @@ class HistoryDB(object):
 pp = items.get('pp')
 script = items.get('script')
 cat = items.get('category')
+return dtype, url, pp, script, cat
 except (AttributeError, IndexError):
-return '', '', '', '', ''
-return dtype, url, pp, script, cat
+pass
+return '', '', '', '', ''
def dict_factory(cursor, row):
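The `execute` wrapper in the hunk above retries SQL commands several times before giving up (`for tries in xrange(5, 0, -1)`), which is the usual defense against transient "database is locked" errors in SQLite. A minimal Python 3 sketch of that retry idea — the `run_query` helper and the schema here are hypothetical, for illustration only:

```python
import sqlite3
import time

def run_query(conn, command, args=(), retries=5, delay=0.1):
    """Retry a SQL command a few times, e.g. when the database is locked."""
    for attempt in range(retries, 0, -1):
        try:
            cur = conn.execute(command, args)
            return cur.fetchall()
        except sqlite3.OperationalError:
            if attempt == 1:
                raise  # out of retries, propagate the error
            time.sleep(delay)  # brief pause before trying again

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE history (nzo_id TEXT, path TEXT)")
conn.execute("INSERT INTO history VALUES (?, ?)", ("SABnzbd_nzo_1", "/tmp/job"))
rows = run_query(conn, "SELECT path FROM history WHERE nzo_id=?", ("SABnzbd_nzo_1",))
print(rows)  # [('/tmp/job',)]
```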

View File

@@ -125,7 +125,7 @@ class Decoder(Thread):
nzf.article_count += 1
found = True
except IOError, e:
except IOError:
logme = T('Decoding %s failed') % art_id
logging.warning(logme)
logging.info("Traceback: ", exc_info=True)
@@ -134,7 +134,7 @@ class Decoder(Thread):
sabnzbd.nzbqueue.NzbQueue.do.reset_try_lists(article)
register = False
except MemoryError, e:
except MemoryError:
logme = T('Decoder failure: Out of memory')
logging.warning(logme)
anfo = sabnzbd.articlecache.ArticleCache.do.cache_info()
@@ -240,7 +240,6 @@ class Decoder(Thread):
nzf = article.nzf
yenc, data = yCheck(data)
ybegin, ypart, yend = yenc
decoded_data = None
# Deal with non-yencoded posts
if not ybegin:
@@ -379,7 +378,7 @@ def yCheck(data):
except IndexError:
break
return ((ybegin, ypart, yend), data)
return (ybegin, ypart, yend), data
# Example: =ybegin part=1 line=128 size=123 name=-=DUMMY=- abc.par
YSPLIT_RE = re.compile(r'([a-zA-Z0-9]+)=')
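A pattern like `YSPLIT_RE` above can tokenize a yEnc header line because `re.split` with a capturing group keeps the captured keys in the result list. The `ysplit` helper below is a sketch of that technique, not the project's actual parser:

```python
import re

# Same pattern as YSPLIT_RE above: captures "key=" tokens in a yEnc header
YSPLIT_RE = re.compile(r'([a-zA-Z0-9]+)=')

def ysplit(line):
    """Split '=ybegin part=1 size=123 name=x' into a {key: value} dict."""
    parts = YSPLIT_RE.split(line)
    # parts: ['=ybegin ', 'part', '1 ', 'line', '128 ', 'size', '123 ', 'name', 'abc.par']
    fields = {}
    for key, value in zip(parts[1::2], parts[2::2]):
        fields[key] = value.strip()
    return fields

hdr = ysplit('=ybegin part=1 line=128 size=123 name=abc.par')
print(hdr['size'])  # 123
```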

View File

@@ -28,9 +28,10 @@ import logging
import sabnzbd
import sabnzbd.cfg as cfg
from sabnzbd.misc import int_conv, clip_path, long_path, remove_all, globber, \
format_time_string, has_win_device, real_path, remove_file
from sabnzbd.misc import int_conv, clip_path, long_path, remove_all, \
format_time_string, real_path, remove_file
from sabnzbd.encoding import TRANS, unicoder
from sabnzbd.decorators import synchronized
from sabnzbd.newsunpack import build_command, EXTRACTFROM_RE, EXTRACTED_RE, rar_volumelist
from sabnzbd.postproc import prepare_extraction_path
from sabnzbd.utils.rarfile import RarFile
@@ -46,6 +47,10 @@ if sabnzbd.WIN32:
# Load the regular POpen (which is now patched on Windows)
from subprocess import Popen
# Need a lock to make sure start and stop are handled correctly

# Otherwise we could stop while the thread was still starting
START_STOP_LOCK = threading.RLock()
MAX_ACTIVE_UNPACKERS = 10
ACTIVE_UNPACKERS = []
@@ -110,6 +115,7 @@ class DirectUnpacker(threading.Thread):
if none_counter > found_counter:
self.total_volumes = {}
@synchronized(START_STOP_LOCK)
def add(self, nzf):
""" Add jobs and start instance of DirectUnpack """
if not cfg.direct_unpack_tested():
@@ -170,10 +176,10 @@ class DirectUnpacker(threading.Thread):
break
# Error? Let PP-handle it
if linebuf.endswith(('ERROR: ', 'Cannot create', 'in the encrypted file', 'CRC failed', \
'checksum failed', 'You need to start extraction from a previous volume', \
'password is incorrect', 'Write error', 'checksum error', \
'start extraction from a previous volume')):
if linebuf.endswith(('ERROR: ', 'Cannot create', 'in the encrypted file', 'CRC failed',
'checksum failed', 'You need to start extraction from a previous volume',
'password is incorrect', 'Write error', 'checksum error',
'start extraction from a previous volume')):
logging.info('Error in DirectUnpack of %s', self.cur_setname)
self.abort()
@@ -309,6 +315,7 @@ class DirectUnpacker(threading.Thread):
with self.next_file_lock:
self.next_file_lock.wait()
@synchronized(START_STOP_LOCK)
def create_unrar_instance(self):
""" Start the unrar instance using the user's options """
# Generate extraction path and save for post-proc
@@ -366,9 +373,10 @@ class DirectUnpacker(threading.Thread):
# Doing the first
logging.info('DirectUnpacked volume %s for %s', self.cur_volume, self.cur_setname)
@synchronized(START_STOP_LOCK)
def abort(self):
""" Abort running instance and delete generated files """
if not self.killed:
if not self.killed and self.cur_setname:
logging.info('Aborting DirectUnpack for %s', self.cur_setname)
self.killed = True
@@ -404,7 +412,6 @@ class DirectUnpacker(threading.Thread):
except:
# The user will have to remove it themselves
logging.info('Failed to clean Direct Unpack after aborting %s', rarfile_nzf.filename, exc_info=True)
pass
else:
# We can just remove the whole path
remove_all(extraction_path, recursive=True)
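The hunks above wrap `add()`, `create_unrar_instance()` and `abort()` in `@synchronized(START_STOP_LOCK)` so a stop cannot interleave with a start that is still in progress. A minimal sketch of such a decorator under those assumed semantics (not the project's actual implementation):

```python
import threading
from functools import wraps

def synchronized(lock):
    """Decorator: run the wrapped function while holding `lock`."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            with lock:
                return func(*args, **kwargs)
        return wrapper
    return decorator

START_STOP_LOCK = threading.RLock()

class Unpacker:
    def __init__(self):
        self.killed = False

    @synchronized(START_STOP_LOCK)
    def start_instance(self):
        # Cannot run while abort() holds the lock
        return "started"

    @synchronized(START_STOP_LOCK)
    def abort(self):
        # RLock is reentrant, so abort() may safely call other
        # synchronized methods from the same thread
        self.killed = True
        return "aborted"

u = Unpacker()
print(u.start_instance(), u.abort())  # started aborted
```

Using an `RLock` rather than a plain `Lock` matters here: a synchronized method that calls another synchronized method on the same thread would deadlock with a non-reentrant lock.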

View File

@@ -76,7 +76,7 @@ def is_archive(path):
except:
logging.info(T('Cannot read %s'), path, exc_info=True)
return -1, None, ''
elif rarfile.is_rarfile(path):
elif misc.is_rarfile(path):
try:
# Set path to tool to open it
rarfile.UNRAR_TOOL = sabnzbd.newsunpack.RAR_COMMAND
@@ -144,7 +144,7 @@ def ProcessArchiveFile(filename, path, pp=None, script=None, cat=None, catdir=No
priority=priority, nzbname=nzbname)
if not nzo.password:
nzo.password = password
except (TypeError, ValueError) as e:
except (TypeError, ValueError):
# Duplicate or empty, ignore
pass
except:
@@ -232,7 +232,7 @@ def ProcessSingleFile(filename, path, pp=None, script=None, cat=None, catdir=Non
# Empty, but correct file
return -1, nzo_ids
except:
if data.find("<nzb") >= 0 and data.find("</nzb") < 0:
if data.find("<nzb") >= 0 > data.find("</nzb"):
# Looks like an incomplete file, retry
return -2, nzo_ids
else:
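The rewritten condition `data.find("<nzb") >= 0 > data.find("</nzb")` relies on Python's chained comparisons: `a >= 0 > b` means `a >= 0 and 0 > b`, i.e. the opening tag was found and the closing tag was not. A small sketch of that check in isolation:

```python
def looks_incomplete(data):
    """True when '<nzb' is present but '</nzb' is not.

    The chained form `a >= 0 > b` evaluates as `a >= 0 and 0 > b`:
    the opening tag exists and the closing tag is missing.
    """
    return data.find("<nzb") >= 0 > data.find("</nzb")

print(looks_incomplete("<nzb><file/>"))        # True  (no closing tag)
print(looks_incomplete("<nzb><file/></nzb>"))  # False (complete)
print(looks_incomplete("not an nzb at all"))   # False (no opening tag)
```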

View File

@@ -305,13 +305,13 @@ class Downloader(Thread):
self.force_disconnect = True
def limit_speed(self, value):
''' Set the actual download speed in Bytes/sec
""" Set the actual download speed in Bytes/sec
When 'value' ends with a '%' sign or is within 1-100, it is interpreted as a percentage of the maximum bandwidth
When no '%' is found, it is interpreted as an absolute speed (including KMGT notation).
'''
"""
if value:
mx = cfg.bandwidth_max.get_int()
if '%' in str(value) or (from_units(value) > 0 and from_units(value) < 101):
if '%' in str(value) or (0 < from_units(value) < 101):
limit = value.strip(' %')
self.bandwidth_perc = from_units(limit)
if mx:
@@ -369,24 +369,24 @@ class Downloader(Thread):
# Was it resolving problem?
if server.info is False:
# Warn about resolving issues
errormsg = T('Cannot connect to server %s [%s]') % (server.id, T('Server name does not resolve'))
errormsg = T('Cannot connect to server %s [%s]') % (server.host, T('Server name does not resolve'))
if server.errormsg != errormsg:
server.errormsg = errormsg
logging.warning(errormsg)
logging.warning(T('Server %s will be ignored for %s minutes'), server.id, _PENALTY_TIMEOUT)
logging.warning(T('Server %s will be ignored for %s minutes'), server.host, _PENALTY_TIMEOUT)
# Not fully the same as the code below for optional servers
server.bad_cons = 0
server.active = False
self.plan_server(server.id, _PENALTY_TIMEOUT)
self.plan_server(server, _PENALTY_TIMEOUT)
# Optional and active server had too many problems.
# Disable it now and send a re-enable plan to the scheduler
if server.optional and server.active and (server.bad_cons / server.threads) > 3:
server.bad_cons = 0
server.active = False
logging.warning(T('Server %s will be ignored for %s minutes'), server.id, _PENALTY_TIMEOUT)
self.plan_server(server.id, _PENALTY_TIMEOUT)
logging.warning(T('Server %s will be ignored for %s minutes'), server.host, _PENALTY_TIMEOUT)
self.plan_server(server, _PENALTY_TIMEOUT)
# Remove all connections to server
for nw in server.idle_threads + server.busy_threads:
@@ -472,7 +472,7 @@ class Downloader(Thread):
if server.retention and article.nzf.nzo.avg_stamp < time.time() - server.retention:
# Let's get rid of all the articles for this server at once
logging.info('Job %s too old for %s, moving on', article.nzf.nzo.work_name, server.id)
logging.info('Job %s too old for %s, moving on', article.nzf.nzo.work_name, server.host)
while article:
self.decode(article, None, None)
article = article.nzf.nzo.get_article(server, self.servers)
@@ -487,10 +487,10 @@ class Downloader(Thread):
self.__request_article(nw)
else:
try:
logging.info("%s@%s: Initiating connection", nw.thrdnum, server.id)
logging.info("%s@%s: Initiating connection", nw.thrdnum, server.host)
nw.init_connect(self.write_fds)
except:
logging.error(T('Failed to initialize %s@%s with reason: %s'), nw.thrdnum, server.id, sys.exc_info()[1])
logging.error(T('Failed to initialize %s@%s with reason: %s'), nw.thrdnum, server.host, sys.exc_info()[1])
self.__reset_nw(nw, "failed to initialize")
# Exit-point
@@ -619,7 +619,7 @@ class Downloader(Thread):
try:
nw.finish_connect(nw.status_code)
if sabnzbd.LOG_ALL:
logging.debug("%s@%s last message -> %s", nw.thrdnum, nw.server.id, nntp_to_msg(nw.data))
logging.debug("%s@%s last message -> %s", nw.thrdnum, nw.server.host, nntp_to_msg(nw.data))
nw.clear_data()
except NNTPPermanentError, error:
# Handle login problems
@@ -636,9 +636,9 @@ class Downloader(Thread):
errormsg = T('Too many connections to server %s') % display_msg
if server.errormsg != errormsg:
server.errormsg = errormsg
logging.warning(T('Too many connections to server %s'), server.id)
logging.warning(T('Too many connections to server %s'), server.host)
self.__reset_nw(nw, None, warn=False, destroy=True, quit=True)
self.plan_server(server.id, _PENALTY_TOOMANY)
self.plan_server(server, _PENALTY_TOOMANY)
server.threads -= 1
elif ecode in ('502', '481', '482') and clues_too_many_ip(msg):
# Account sharing?
@@ -646,7 +646,7 @@ class Downloader(Thread):
errormsg = T('Probable account sharing') + display_msg
if server.errormsg != errormsg:
server.errormsg = errormsg
name = ' (%s)' % server.id
name = ' (%s)' % server.host
logging.warning(T('Probable account sharing') + name)
penalty = _PENALTY_SHARE
block = True
@@ -656,7 +656,7 @@ class Downloader(Thread):
errormsg = T('Failed login for server %s') % display_msg
if server.errormsg != errormsg:
server.errormsg = errormsg
logging.error(T('Failed login for server %s'), server.id)
logging.error(T('Failed login for server %s'), server.host)
penalty = _PENALTY_PERM
block = True
elif ecode in ('502', '482'):
@@ -665,7 +665,7 @@ class Downloader(Thread):
errormsg = T('Cannot connect to server %s [%s]') % ('', display_msg)
if server.errormsg != errormsg:
server.errormsg = errormsg
logging.warning(T('Cannot connect to server %s [%s]'), server.id, msg)
logging.warning(T('Cannot connect to server %s [%s]'), server.host, msg)
if clues_pay(msg):
penalty = _PENALTY_PERM
else:
@@ -674,7 +674,7 @@ class Downloader(Thread):
elif ecode == '400':
# Temp connection problem?
if server.active:
logging.debug('Unspecified error 400 from server %s', server.id)
logging.debug('Unspecified error 400 from server %s', server.host)
penalty = _PENALTY_VERYSHORT
block = True
else:
@@ -683,25 +683,25 @@ class Downloader(Thread):
errormsg = T('Cannot connect to server %s [%s]') % ('', display_msg)
if server.errormsg != errormsg:
server.errormsg = errormsg
logging.warning(T('Cannot connect to server %s [%s]'), server.id, msg)
logging.warning(T('Cannot connect to server %s [%s]'), server.host, msg)
penalty = _PENALTY_UNKNOWN
block = True
if block or (penalty and server.optional):
if server.active:
server.active = False
if penalty and (block or server.optional):
self.plan_server(server.id, penalty)
self.plan_server(server, penalty)
sabnzbd.nzbqueue.NzbQueue.do.reset_all_try_lists()
self.__reset_nw(nw, None, warn=False, quit=True)
continue
except:
logging.error(T('Connecting %s@%s failed, message=%s'),
nw.thrdnum, nw.server.id, nntp_to_msg(nw.data))
nw.thrdnum, nw.server.host, nntp_to_msg(nw.data))
# No reset-warning needed, above logging is sufficient
self.__reset_nw(nw, None, warn=False)
if nw.connected:
logging.info("Connecting %s@%s finished", nw.thrdnum, nw.server.id)
logging.info("Connecting %s@%s finished", nw.thrdnum, nw.server.host)
self.__request_article(nw)
elif nw.status_code == '223':
@@ -718,27 +718,27 @@ class Downloader(Thread):
elif nw.status_code in ('411', '423', '430'):
done = True
logging.debug('Thread %s@%s: Article %s missing (error=%s)',
nw.thrdnum, nw.server.id, article.article, nw.status_code)
nw.thrdnum, nw.server.host, article.article, nw.status_code)
nw.clear_data()
elif nw.status_code == '480':
if server.active:
server.active = False
server.errormsg = T('Server %s requires user/password') % ''
self.plan_server(server.id, 0)
self.plan_server(server, 0)
sabnzbd.nzbqueue.NzbQueue.do.reset_all_try_lists()
msg = T('Server %s requires user/password') % nw.server.id
msg = T('Server %s requires user/password') % nw.server.host
self.__reset_nw(nw, msg, quit=True)
elif nw.status_code == '500':
if nzo.precheck:
# Assume "STAT" command is not supported
server.have_stat = False
logging.debug('Server %s does not support STAT', server.id)
logging.debug('Server %s does not support STAT', server.host)
else:
# Assume "BODY" command is not supported
server.have_body = False
logging.debug('Server %s does not support BODY', server.id)
logging.debug('Server %s does not support BODY', server.host)
nw.clear_data()
self.__request_article(nw)
@@ -746,7 +746,7 @@ class Downloader(Thread):
server.bad_cons = 0  # Successful data, clear "bad" counter
server.errormsg = server.warning = ''
if sabnzbd.LOG_ALL:
logging.debug('Thread %s@%s: %s done', nw.thrdnum, server.id, article.article)
logging.debug('Thread %s@%s: %s done', nw.thrdnum, server.host, article.article)
self.decode(article, nw.lines, nw.data)
nw.soft_reset()
@@ -778,9 +778,9 @@ class Downloader(Thread):
if warn and errormsg:
server.warning = errormsg
logging.info('Thread %s@%s: ' + errormsg, nw.thrdnum, server.id)
logging.info('Thread %s@%s: ' + errormsg, nw.thrdnum, server.host)
elif errormsg:
logging.info('Thread %s@%s: ' + errormsg, nw.thrdnum, server.id)
logging.info('Thread %s@%s: ' + errormsg, nw.thrdnum, server.host)
if nw in server.busy_threads:
server.busy_threads.remove(nw)
@@ -814,11 +814,11 @@ class Downloader(Thread):
if nw.server.send_group and nzo.group != nw.group:
group = nzo.group
if sabnzbd.LOG_ALL:
logging.debug('Thread %s@%s: GROUP <%s>', nw.thrdnum, nw.server.id, group)
logging.debug('Thread %s@%s: GROUP <%s>', nw.thrdnum, nw.server.host, group)
nw.send_group(group)
else:
if sabnzbd.LOG_ALL:
logging.debug('Thread %s@%s: BODY %s', nw.thrdnum, nw.server.id, nw.article.article)
logging.debug('Thread %s@%s: BODY %s', nw.thrdnum, nw.server.host, nw.article.article)
nw.body(nzo.precheck)
fileno = nw.nntp.sock.fileno()
@@ -840,24 +840,24 @@ class Downloader(Thread):
# Each server has a dictionary entry, consisting of a list of timestamps.
@synchronized(TIMER_LOCK)
def plan_server(self, server_id, interval):
def plan_server(self, server, interval):
""" Plan the restart of a server in 'interval' minutes """
if cfg.no_penalties() and interval > _PENALTY_SHORT:
# Overwrite in case of no_penalties
interval = _PENALTY_SHORT
logging.debug('Set planned server resume %s in %s mins', server_id, interval)
if server_id not in self._timers:
self._timers[server_id] = []
logging.debug('Set planned server resume %s in %s mins', server.host, interval)
if server.id not in self._timers:
self._timers[server.id] = []
stamp = time.time() + 60.0 * interval
self._timers[server_id].append(stamp)
self._timers[server.id].append(stamp)
if interval:
sabnzbd.scheduler.plan_server(self.trigger_server, [server_id, stamp], interval)
sabnzbd.scheduler.plan_server(self.trigger_server, [server.id, stamp], interval)
@synchronized(TIMER_LOCK)
def trigger_server(self, server_id, timestamp):
""" Called by scheduler, start server if timer still valid """
logging.debug('Trigger planned server resume %s', server_id)
logging.debug('Trigger planned server resume for server-id %s', server_id)
if server_id in self._timers:
if timestamp in self._timers[server_id]:
del self._timers[server_id]
@@ -874,7 +874,7 @@ class Downloader(Thread):
# Activate server if it was inactive
for server in self.servers:
if server.id == server_id and not server.active:
logging.debug('Unblock server %s', server_id)
logging.debug('Unblock server %s', server.host)
self.init_server(server_id, server_id)
break
@@ -891,7 +891,7 @@ class Downloader(Thread):
kicked = []
for server_id in self._timers.keys():
if not [stamp for stamp in self._timers[server_id] if stamp >= now]:
logging.debug('Forcing re-evaluation of server %s', server_id)
logging.debug('Forcing re-evaluation of server-id %s', server_id)
del self._timers[server_id]
self.init_server(server_id, server_id)
kicked.append(server_id)
@@ -899,7 +899,7 @@ class Downloader(Thread):
for server in self.servers:
if server.id not in self._timers:
if server.id not in kicked and not server.active:
logging.debug('Forcing activation of server %s', server.id)
logging.debug('Forcing activation of server %s', server.host)
self.init_server(server.id, server.id)
def update_server(self, oldserver, newserver):
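The `plan_server` refactor above passes the full server object (so logs can show `server.host`) but still keys the timer table by the stable `server.id`, and `trigger_server` only resumes a server when the planned timestamp is still on record. A simplified sketch of that bookkeeping — the `Server` class and ids here are hypothetical, for illustration only:

```python
import time

_PENALTY_SHORT = 1  # minutes, mirroring the penalty constants used above

class Server:
    def __init__(self, srv_id, host):
        self.id = srv_id      # stable key for the timer table
        self.host = host      # only used for human-readable log messages

class PenaltyBook:
    """Track planned resume times per server id, as plan_server does."""
    def __init__(self):
        self._timers = {}

    def plan_server(self, server, interval, no_penalties=False):
        if no_penalties and interval > _PENALTY_SHORT:
            interval = _PENALTY_SHORT  # overwrite in case of no_penalties
        stamp = time.time() + 60.0 * interval
        self._timers.setdefault(server.id, []).append(stamp)
        return stamp

    def trigger_server(self, server_id, stamp):
        """Resume only if this exact timer is still pending."""
        if stamp in self._timers.get(server_id, ()):
            del self._timers[server_id]
            return True
        return False

book = PenaltyBook()
srv = Server("news1@3", "news.example.com")
stamp = book.plan_server(srv, interval=5)
print(book.trigger_server("news1@3", stamp))  # True, timer was valid
print(book.trigger_server("news1@3", stamp))  # False, already consumed
```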

View File

@@ -44,7 +44,6 @@ from sabnzbd.misc import real_path, to_units, from_units, time_format, \
long_path, calc_age, same_file, probablyipv4, probablyipv6, \
int_conv, globber, globber_full, remove_all, get_base_url
from sabnzbd.newswrapper import GetServerParms
from sabnzbd.rating import Rating
from sabnzbd.bpsmeter import BPSMeter
from sabnzbd.encoding import TRANS, xml_name, LatinFilter, unicoder, special_fixer, \
platform_encode
@@ -59,13 +58,13 @@ from sabnzbd.decoder import HAVE_YENC, SABYENC_ENABLED
from sabnzbd.utils.diskspeed import diskspeedmeasure
from sabnzbd.utils.getperformance import getpystone
from sabnzbd.constants import NORMAL_PRIORITY, MEBI, DEF_SKIN_COLORS, DEF_STDINTF, \
from sabnzbd.constants import NORMAL_PRIORITY, MEBI, DEF_SKIN_COLORS, \
DEF_STDCONFIG, DEF_MAIN_TMPL, DEFAULT_PRIORITY
from sabnzbd.lang import list_languages
from sabnzbd.api import list_scripts, list_cats, del_from_section, \
api_handler, build_queue, remove_callable, rss_qstatus, build_status, \
api_handler, build_queue, remove_callable, build_status, \
retry_job, retry_all_jobs, build_header, build_history, del_job_files, \
format_bytes, std_time, report, del_hist_job, Ttemplate, build_queue_header, \
_api_test_email, _api_test_notif
@@ -162,7 +161,7 @@ def check_hostname():
if not host:
return False
# Remove the port-part (like ':8080'), if it is there, always on the right hand side.
# Not to be confused with IPv6 colons (within square brackets)
host = re.sub(':[0123456789]+$', '', host).lower()
@@ -175,7 +174,7 @@ def check_hostname():
return True
# Fine if ends with ".local" or ".local.", aka mDNS name
# See rfc6762 Multicast DNS
if host.endswith(('.local', '.local.')):
return True
@@ -237,8 +236,7 @@ def check_login():
def get_users():
users = {}
users[cfg.username()] = cfg.password()
users = {cfg.username(): cfg.password()}
return users
@@ -501,7 +499,7 @@ class MainPage(object):
# No session key check, due to fixed URLs
name = kwargs.get('name')
if name:
history_db = sabnzbd.connect_db()
history_db = sabnzbd.get_db_connection()
return ShowString(history_db.get_name(name), history_db.get_script_log(name))
else:
raise Raiser(self.__root)
@@ -775,7 +773,7 @@ class NzoPage(object):
# /SABnzbd_nzo_xxxxx/files
elif 'files' in args:
info = self.nzo_files(info, pnfo_list, nzo_id)
info = self.nzo_files(info, nzo_id)
# /SABnzbd_nzo_xxxxx/save
elif 'save' in args:
@@ -785,7 +783,7 @@ class NzoPage(object):
# /SABnzbd_nzo_xxxxx/
else:
info = self.nzo_details(info, pnfo_list, nzo_id)
info = self.nzo_files(info, pnfo_list, nzo_id)
info = self.nzo_files(info, nzo_id)
template = Template(file=os.path.join(sabnzbd.WEB_DIR, 'nzo.tmpl'),
filter=FILTER, searchList=[info], compilerSettings=DIRECTIVES)
@@ -837,7 +835,7 @@ class NzoPage(object):
return info
def nzo_files(self, info, pnfo_list, nzo_id):
def nzo_files(self, info, nzo_id):
active = []
nzo = NzbQueue.do.get_nzo(nzo_id)
if nzo:
@@ -1108,7 +1106,7 @@ class HistoryPage(object):
@secured_expose(check_session_key=True)
def purge(self, **kwargs):
history_db = sabnzbd.connect_db()
history_db = sabnzbd.get_db_connection()
history_db.remove_history()
raise queueRaiser(self.__root, kwargs)
@@ -1135,7 +1133,7 @@ class HistoryPage(object):
@secured_expose(check_session_key=True)
def purge_failed(self, **kwargs):
del_files = bool(int_conv(kwargs.get('del_files')))
history_db = sabnzbd.connect_db()
history_db = sabnzbd.get_db_connection()
if del_files:
del_job_files(history_db.get_failed_paths())
history_db.remove_failed()
@@ -1175,7 +1173,7 @@ class HistoryPage(object):
# No session key check, due to fixed URLs
name = kwargs.get('name')
if name:
history_db = sabnzbd.connect_db()
history_db = sabnzbd.get_db_connection()
return ShowString(history_db.get_name(name), history_db.get_script_log(name))
else:
raise Raiser(self.__root)
@@ -1373,7 +1371,7 @@ SPECIAL_BOOL_LIST = \
'rss_filenames', 'ipv6_hosting', 'keep_awake', 'empty_postproc', 'html_login', 'wait_for_dfolder',
'max_art_opt', 'warn_empty_nzb', 'enable_bonjour', 'reject_duplicate_files', 'warn_dupl_jobs',
'replace_illegal', 'backup_for_duplicates', 'disable_api_key', 'api_logging',
'ignore_empty_files', 'x_frame_options'
'ignore_empty_files', 'x_frame_options', 'require_modern_tls'
)
SPECIAL_VALUE_LIST = \
('size_limit', 'folder_max_length', 'fsys_type', 'movie_rename_limit', 'nomedia_marker',
@@ -1879,9 +1877,13 @@ class ConfigRss(object):
@secured_expose(check_session_key=True, check_configlock=True)
def upd_rss_filter(self, **kwargs):
""" Wrapper, so we can call from api.py """
self.internal_upd_rss_filter(**kwargs)
def internal_upd_rss_filter(self, **kwargs):
""" Save updated filter definition """
try:
cfg = config.get_rss()[kwargs.get('feed')]
feed_cfg = config.get_rss()[kwargs.get('feed')]
except KeyError:
raise rssRaiser(self.__root, kwargs)
@@ -1895,14 +1897,14 @@ class ConfigRss(object):
enabled = kwargs.get('enabled', '0')
if filt:
cfg.filters.update(int(kwargs.get('index', 0)), (cat, pp, script, kwargs.get('filter_type'),
feed_cfg.filters.update(int(kwargs.get('index', 0)), (cat, pp, script, kwargs.get('filter_type'),
platform_encode(filt), prio, enabled))
# Move filter if requested
index = int_conv(kwargs.get('index', ''))
new_index = kwargs.get('new_index', '')
if new_index and int_conv(new_index) != index:
cfg.filters.move(int(index), int_conv(new_index))
feed_cfg.filters.move(int(index), int_conv(new_index))
config.save_config()
self.__evaluate = False
@@ -1920,13 +1922,17 @@ class ConfigRss(object):
@secured_expose(check_session_key=True, check_configlock=True)
def del_rss_filter(self, **kwargs):
""" Wrapper, so we can call from api.py """
self.internal_del_rss_filter(**kwargs)
def internal_del_rss_filter(self, **kwargs):
""" Remove one RSS filter """
try:
cfg = config.get_rss()[kwargs.get('feed')]
feed_cfg = config.get_rss()[kwargs.get('feed')]
except KeyError:
raise rssRaiser(self.__root, kwargs)
cfg.filters.delete(int(kwargs.get('index', 0)))
feed_cfg.filters.delete(int(kwargs.get('index', 0)))
config.save_config()
self.__evaluate = False
self.__show_eval_button = True
@@ -2041,15 +2047,8 @@ class ConfigScheduling(object):
@secured_expose(check_configlock=True)
def index(self, **kwargs):
def get_days():
days = {}
days["*"] = T('Daily')
days["1"] = T('Monday')
days["2"] = T('Tuesday')
days["3"] = T('Wednesday')
days["4"] = T('Thursday')
days["5"] = T('Friday')
days["6"] = T('Saturday')
days["7"] = T('Sunday')
days = {"*": T('Daily'), "1": T('Monday'), "2": T('Tuesday'), "3": T('Wednesday'), "4": T('Thursday'),
"5": T('Friday'), "6": T('Saturday'), "7": T('Sunday')}
return days
conf = build_header(sabnzbd.WEB_DIR_CONFIG)
@@ -2078,7 +2077,7 @@ class ConfigScheduling(object):
if '%' not in value and from_units(value) < 1.0:
value = T('off') # : "Off" value for speedlimit in scheduler
else:
if '%' not in value and int_conv(value) > 1 and int_conv(value) < 101:
if '%' not in value and 1 < int_conv(value) < 101:
value += '%'
value = value.upper()
if action in actions:
@@ -2133,7 +2132,6 @@ class ConfigScheduling(object):
@secured_expose(check_session_key=True, check_configlock=True)
def addSchedule(self, **kwargs):
servers = config.get_servers()
categories = list_cats(False)
minute = kwargs.get('minute')
hour = kwargs.get('hour')
days_of_week = ''.join([str(x) for x in kwargs.get('daysofweek', '')])
@@ -2532,6 +2530,7 @@ def GetRssLog(feed):
# These fields could be empty
job['cat'] = job.get('cat', '')
job['size'] = job.get('size', '')
job['infourl'] = job.get('infourl', '')
# Auto-fetched jobs didn't have these fields set
if job.get('url'):
@@ -2767,7 +2766,7 @@ def rss_history(url, limit=50, search=None):
stageLine.append("<tr><dt>Stage %s</dt>" % stage['name'])
actions = []
for action in stage['actions']:
actions.append("<dd>%s</dd>" % (action))
actions.append("<dd>%s</dd>" % action)
actions.sort()
actions.reverse()
for act in actions:
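`check_hostname` above normalizes the request's Host header before validating it: a trailing `:port` is stripped with a regex (IPv6 colons are safe because they sit inside square brackets), and mDNS names ending in `.local` are always accepted per RFC 6762. A small sketch of that normalization (illustrative only, not the full check):

```python
import re

def normalize_host(host):
    """Drop a trailing ':port' and lowercase, as check_hostname does.

    Only a colon followed by digits at the very end is treated as a
    port, so bracketed IPv6 literals like '[::1]' survive intact.
    """
    return re.sub(r':[0-9]+$', '', host).lower()

def is_local_name(host):
    # mDNS names like 'nas.local' (RFC 6762) are always accepted
    return host.endswith(('.local', '.local.'))

print(normalize_host('MyNAS.local:8080'))          # mynas.local
print(is_local_name(normalize_host('nas.local')))  # True
```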

View File

@@ -44,6 +44,7 @@ from sabnzbd.constants import DEFAULT_PRIORITY, FUTURE_Q_FOLDER, JOB_ADMIN, \
import sabnzbd.config as config
import sabnzbd.cfg as cfg
from sabnzbd.encoding import unicoder, special_fixer, gUTF
import sabnzbd.utils.rarfile as rarfile
TAB_UNITS = ('', 'K', 'M', 'G', 'T', 'P')
RE_UNITS = re.compile(r'(\d+\.*\d*)\s*([KMGTP]{0,1})', re.I)
@@ -248,11 +249,12 @@ _DEVICES = ('con', 'prn', 'aux', 'nul',
'com1', 'com2', 'com3', 'com4', 'com5', 'com6', 'com7', 'com8', 'com9',
'lpt1', 'lpt2', 'lpt3', 'lpt4', 'lpt5', 'lpt6', 'lpt7', 'lpt8', 'lpt9')
def replace_win_devices(name):
''' Remove reserved Windows device names from a name.
""" Remove reserved Windows device names from a name.
aux.txt ==> _aux.txt
txt.aux ==> txt.aux
'''
"""
if name:
lname = name.lower()
for dev in _DEVICES:
@@ -260,9 +262,9 @@ def replace_win_devices(name):
name = '_' + name
break
# Remove special NTFS filename
if lname.startswith('$mft'):
name = name.replace('$', 'S', 1)
# Remove special NTFS filename
if lname.startswith('$mft'):
name = name.replace('$', 'S', 1)
return name
@@ -425,7 +427,7 @@ def is_obfuscated_filename(filename):
""" Check if this file has an extension, if not, it's
probably obfuscated and we don't use it
"""
return (os.path.splitext(filename)[1] == '')
return os.path.splitext(filename)[1] == ''
##############################################################################
@@ -517,16 +519,16 @@ def create_real_path(name, loc, path, umask=False, writable=True):
logging.info('%s directory: %s does not exist, try to create it', name, my_dir)
if not create_all_dirs(my_dir, umask):
logging.error(T('Cannot create directory %s'), clip_path(my_dir))
return (False, my_dir)
return False, my_dir
checks = (os.W_OK + os.R_OK) if writable else os.R_OK
if os.access(my_dir, checks):
return (True, my_dir)
return True, my_dir
else:
logging.error(T('%s directory: %s error accessing'), name, clip_path(my_dir))
return (False, my_dir)
return False, my_dir
else:
return (False, "")
return False, ""
def is_relative_path(p):
@@ -746,7 +748,6 @@ def to_units(val, spaces=0, postfix=''):
Show single decimal for M and higher
"""
dec_limit = 1
decimals = 0
if val < 0:
sign = '-'
else:
@@ -845,7 +846,7 @@ def split_host(srv):
port = int(port)
except:
port = None
return (host, port)
return host, port
def get_from_url(url):
@@ -1081,7 +1082,7 @@ def get_filepath(path, nzo, filename):
# It does no umask setting
# It uses the dir_lock for the (rare) case that the
# download_dir is equal to the complete_dir.
dName = nzo.work_name
dName = dirname = nzo.work_name
if not nzo.created:
for n in xrange(200):
dName = dirname
@@ -1155,11 +1156,12 @@ def renamer(old, new):
@synchronized(DIR_LOCK)
def remove_dir(path):
""" Remove directory with retries for Win32 """
logging.debug('[%s] Deleting dir %s', caller_name(), path)
if sabnzbd.WIN32:
retries = 15
while retries > 0:
try:
remove_dir(path)
os.rmdir(path)
return
except WindowsError, err:
if err[0] == 32:
@@ -1170,7 +1172,7 @@ def remove_dir(path):
time.sleep(3)
raise WindowsError(err)
else:
remove_dir(path)
os.rmdir(path)
@synchronized(DIR_LOCK)
@@ -1202,12 +1204,6 @@ def remove_file(path):
os.remove(path)
def remove_dir(dir):
""" Wrapper function so any dir removal is logged """
logging.debug('[%s] Deleting dir %s', caller_name(), dir)
os.rmdir(dir)
def trim_win_path(path):
""" Make sure Windows path stays below 70 by trimming last part """
if sabnzbd.WIN32 and len(path) > 69:
@@ -1241,6 +1237,14 @@ def get_admin_path(name, future):
return os.path.join(os.path.join(cfg.download_dir.get_path(), name), JOB_ADMIN)
def is_rarfile(rarfile_path):
""" Wrapper in case it crashes due to missing file or long-path problems """
try:
return rarfile.is_rarfile(rarfile_path)
except:
return False
def on_cleanup_list(filename, skip_nzb=False):
""" Return True if a filename matches the clean-up list """
lst = cfg.cleanup_list()
@@ -1286,8 +1290,8 @@ def memory_usage():
except:
logging.debug('Error retrieving memory usage')
logging.info("Traceback: ", exc_info=True)
else:
return ''
try:
_PAGE_SIZE = os.sysconf("SC_PAGE_SIZE")
except:
@@ -1448,7 +1452,7 @@ def create_https_certificates(ssl_cert, ssl_key):
try:
from sabnzbd.utils.certgen import generate_key, generate_local_cert
private_key = generate_key(key_size=2048, output_file=ssl_key)
generate_local_cert(private_key, days_valid=3560, output_file=ssl_cert, LN=u'SABnzbd', ON=u'SABnzbd', CN=u'localhost')
generate_local_cert(private_key, days_valid=3560, output_file=ssl_cert, LN=u'SABnzbd', ON=u'SABnzbd')
logging.info('Self-signed certificates generated successfully')
except:
logging.error(T('Error creating SSL key and certificate'))
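The new `is_rarfile` wrapper above exists purely to swallow crashes from missing files or long-path problems and report "not an archive" instead. The same defensive-wrapper pattern in a generic Python 3 sketch, using the standard-library `zipfile` module as a stand-in for `rarfile`:

```python
import zipfile  # stand-in for the third-party rarfile module used above

def is_zipfile_safe(path):
    """Wrapper in case the check crashes on a missing file or bad path."""
    try:
        return zipfile.is_zipfile(path)
    except Exception:
        # Treat any failure (missing file, long-path issues) as "not an archive"
        return False

print(is_zipfile_safe('/definitely/missing/file.zip'))  # False
```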

View File

@@ -32,8 +32,8 @@ import sabnzbd
from sabnzbd.encoding import TRANS, unicoder, platform_encode, deunicode
import sabnzbd.utils.rarfile as rarfile
from sabnzbd.misc import format_time_string, find_on_path, make_script_path, int_conv, \
real_path, globber, globber_full, get_all_passwords, renamer, clip_path, \
has_win_device, calc_age, long_path, remove_file, recursive_listdir
real_path, globber, globber_full, get_all_passwords, renamer, clip_path, calc_age, \
long_path, remove_file, recursive_listdir, is_rarfile
from sabnzbd.sorting import SeriesSorter
import sabnzbd.cfg as cfg
from sabnzbd.constants import Status
@@ -159,14 +159,7 @@ def external_processing(extern_proc, nzo, complete_dir, nicename, status):
'download_time': nzo.nzo_info.get('download_time', ''),
'avg_bps': int(nzo.avg_bps_total / nzo.avg_bps_freq) if nzo.avg_bps_freq else 0,
'age': calc_age(nzo.avg_date),
'orig_nzb_gz': clip_path(nzb_paths[0]) if nzb_paths else '',
'program_dir': sabnzbd.DIR_PROG,
'par2_command': sabnzbd.newsunpack.PAR2_COMMAND,
'multipar_command': sabnzbd.newsunpack.MULTIPAR_COMMAND,
'rar_command': sabnzbd.newsunpack.RAR_COMMAND,
'zip_command': sabnzbd.newsunpack.ZIP_COMMAND,
'7zip_command': sabnzbd.newsunpack.SEVEN_COMMAND,
'version': sabnzbd.__version__}
'orig_nzb_gz': clip_path(nzb_paths[0]) if nzb_paths else ''}
try:
stup, need_shell, command, creationflags = build_command(command)
@@ -182,7 +175,7 @@ def external_processing(extern_proc, nzo, complete_dir, nicename, status):
proc = p.stdout
if p.stdin:
p.stdin.close()
line = ''
lines = []
while 1:
line = proc.readline()
@@ -243,11 +236,10 @@ def unpack_magic(nzo, workdir, workdir_complete, dele, one_folder, joinables, zi
else:
xjoinables, xzips, xrars, xsevens, xts = build_filelists(workdir, workdir_complete, check_both=dele)
rerun = False
force_rerun = False
newfiles = []
error = None
new_joins = new_rars = new_zips = new_ts = None
new_joins = new_ts = None
if cfg.enable_filejoin():
new_joins = [jn for jn in xjoinables if jn not in joinables]
@@ -443,16 +435,17 @@ def file_join(nzo, workdir, workdir_complete, delete, joinables):
if seq_error:
msg = T('Incomplete sequence of joinable files')
nzo.fail_msg = T('File join of %s failed') % unicoder(joinable_set)
nzo.set_unpack_info('Filejoin', T('[%s] Error "%s" while joining files') % (unicoder(joinable_set), msg))
nzo.fail_msg = T('File join of %s failed') % unicoder(os.path.basename(joinable_set))
nzo.set_unpack_info('Filejoin', T('[%s] Error "%s" while joining files') % (unicoder(os.path.basename(joinable_set)), msg))
logging.error(T('Error "%s" while running file_join on %s'), msg, nzo.final_name)
return True, []
else:
msg = T('[%s] Joined %s files') % (unicoder(joinable_set), size)
nzo.set_unpack_info('Filejoin', msg)
except:
msg = sys.exc_info()[1]
nzo.fail_msg = T('File join of %s failed') % msg
nzo.set_unpack_info('Filejoin', T('[%s] Error "%s" while joining files') % (unicoder(joinable_set), msg))
nzo.set_unpack_info('Filejoin', T('[%s] Error "%s" while joining files') % (unicoder(os.path.basename(joinable_set)), msg))
logging.error(T('Error "%s" while running file_join on %s'), msg, nzo.final_name)
return True, []
@@ -467,9 +460,7 @@ def rar_unpack(nzo, workdir, workdir_complete, delete, one_folder, rars):
When 'delete' is set, originals will be deleted.
When 'one_folder' is set, all files will be in a single folder
"""
extracted_files = []
success = False
newfiles = extracted_files = []
rar_sets = {}
for rar in rars:
rar_set = os.path.splitext(os.path.basename(rar))[0]
@@ -510,6 +501,8 @@ def rar_unpack(nzo, workdir, workdir_complete, delete, one_folder, rars):
if wait_count > 60:
# We abort after 2 minutes of no changes
nzo.direct_unpacker.abort()
else:
wait_count = 0
last_stats = nzo.direct_unpacker.get_formatted_stats()
# Did we already direct-unpack it? Not when recursive-unpacking
@@ -656,7 +649,7 @@ def rar_extract_core(rarfile_path, numrars, one_folder, nzo, setname, extraction
stup, need_shell, command, creationflags = build_command(command, flatten_command=True)
# Get list of all the volumes part of this set
logging.debug("Analyzing rar file ... %s found", rarfile.is_rarfile(rarfile_path))
logging.debug("Analyzing rar file ... %s found", is_rarfile(rarfile_path))
logging.debug("Running unrar %s", command)
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
@@ -994,7 +987,9 @@ def seven_extract(nzo, sevenset, extensions, extraction_path, one_folder, delete
nzo.fail_msg = ''
if fail == 2:
msg = '%s (%s)' % (T('Unpacking failed, archive requires a password'), os.path.basename(sevenset))
if fail > 0:
nzo.fail_msg = msg
nzo.status = Status.FAILED
logging.error(msg)
return fail, new_files, msg
@@ -1028,7 +1023,7 @@ def seven_extract_core(sevenset, extensions, extraction_path, one_folder, delete
parm = '-tzip' if sevenset.lower().endswith('.zip') else '-t7z'
if not os.path.exists(name):
return 1, T('7ZIP set "%s" is incomplete, cannot unpack') % unicoder(sevenset)
return 1, T('7ZIP set "%s" is incomplete, cannot unpack') % os.path.basename(sevenset)
# For file-bookkeeping
orig_dir_content = recursive_listdir(extraction_path)
@@ -1047,6 +1042,15 @@ def seven_extract_core(sevenset, extensions, extraction_path, one_folder, delete
ret = p.wait()
# Return-code for CRC and Password is the same
if ret == 2 and 'ERROR: CRC Failed' in output:
# We can output a more general error
ret = 1
msg = T('ERROR: CRC failed in "%s"') % os.path.basename(sevenset)
else:
# Default message
msg = T('Could not unpack %s') % os.path.basename(sevenset)
# What's new?
new_files = list(set(orig_dir_content + recursive_listdir(extraction_path)))
@@ -1065,7 +1069,7 @@ def seven_extract_core(sevenset, extensions, extraction_path, one_folder, delete
logging.warning(T('Deleting %s failed!'), sevenset)
# Always return an error message, even when return code is 0
return ret, new_files, T('Could not unpack %s') % unicoder(sevenset)
return ret, new_files, msg
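The hunk above now distinguishes a CRC failure from the generic unpack error. 7-Zip reports CRC and password problems with the same return code (2), so the captured output text is what decides. A standalone sketch of that decision (the helper name is hypothetical; the real code does this inline in `seven_extract_core`):

```python
import os

def classify_7z_result(ret, output, sevenset):
    # 7z uses return code 2 for both CRC and password errors; the
    # output text is checked to report a more specific CRC failure.
    if ret == 2 and 'ERROR: CRC Failed' in output:
        return 1, 'ERROR: CRC failed in "%s"' % os.path.basename(sevenset)
    # Default message when nothing more specific applies
    return ret, 'Could not unpack %s' % os.path.basename(sevenset)
```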
##############################################################################
@@ -1127,9 +1131,9 @@ def par2_repair(parfile_nzf, nzo, workdir, setname, single):
# Multipar or not?
if sabnzbd.WIN32 and cfg.multipar():
finished, readd, datafiles, used_joinables, used_for_repair = MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=single)
finished, readd, datafiles, used_joinables, used_for_repair = MultiPar_Verify(parfile, nzo, setname, joinables, single=single)
else:
finished, readd, datafiles, used_joinables, used_for_repair = PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=single)
finished, readd, datafiles, used_joinables, used_for_repair = PAR_Verify(parfile, nzo, setname, joinables, single=single)
if finished:
result = True
@@ -1138,10 +1142,7 @@ def par2_repair(parfile_nzf, nzo, workdir, setname, single):
# Remove this set so we don't try to check it again
nzo.remove_parset(parfile_nzf.setname)
else:
if qc_result:
logging.warning(T('Par verify failed on %s, while QuickCheck succeeded!'), parfile)
else:
logging.info('Par verify failed on %s!', parfile)
logging.info('Par verify failed on %s!', parfile)
if not readd:
# Failed to repair -> remove this set
@@ -1196,7 +1197,7 @@ _RE_LOADING_PAR2 = re.compile(r'Loading "([^"]+)"\.')
_RE_LOADED_PAR2 = re.compile(r'Loaded (\d+) new packets')
def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False):
def PAR_Verify(parfile, nzo, setname, joinables, single=False):
""" Run par2 on par-set """
used_joinables = []
used_for_repair = []
@@ -1337,7 +1338,7 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False):
block_table = {}
for nzf in nzo.extrapars[setname]:
if not nzf.completed:
block_table[int_conv(nzf.blocks)] = nzf
block_table[nzf.blocks] = nzf
if block_table:
nzf = block_table[min(block_table.keys())]
@@ -1374,7 +1375,7 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False):
elif line.startswith('Repair is possible'):
start = time.time()
nzo.set_action_line(T('Repairing'), '%2d%%' % (0))
nzo.set_action_line(T('Repairing'), '%2d%%' % 0)
elif line.startswith('Repairing:'):
chunks = line.split()
@@ -1533,7 +1534,7 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False):
_RE_FILENAME = re.compile(r'"([^"]+)"')
def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False):
def MultiPar_Verify(parfile, nzo, setname, joinables, single=False):
""" Run par2 on par-set """
parfolder = os.path.split(parfile)[0]
used_joinables = []
@@ -1650,7 +1651,7 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False)
block_table = {}
for nzf in nzo.extrapars[setname]:
if not nzf.completed:
block_table[int_conv(nzf.blocks)] = nzf
block_table[nzf.blocks] = nzf
if block_table:
nzf = block_table[min(block_table.keys())]
@@ -1841,13 +1842,17 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False)
# Set message for user in case of joining
if line.startswith('Ready to rejoin'):
nzo.set_action_line(T('Joining'), '%2d' % len(used_joinables))
else:
# If we are repairing a joinable set, it won't actually
# do the joining. So we can't remove those files!
used_joinables = []
# ----------------- Repair stage
elif 'Recovering slice' in line:
# Before this it will calculate matrix, here is where it starts
start = time.time()
in_repair = True
nzo.set_action_line(T('Repairing'), '%2d%%' % (0))
nzo.set_action_line(T('Repairing'), '%2d%%' % 0)
elif in_repair and line.startswith('Verifying repair'):
in_repair = False
@@ -1921,7 +1926,7 @@ def MultiPar_Verify(parfile, parfile_nzf, nzo, setname, joinables, single=False)
return finished, readd, datafiles, used_joinables, used_for_repair
def create_env(nzo=None, extra_env_fields=None):
def create_env(nzo=None, extra_env_fields={}):
""" Modify the environment for pp-scripts with extra information
OSX: Return copy of environment without PYTHONPATH and PYTHONHOME
other: return None
@@ -1945,16 +1950,25 @@ def create_env(nzo=None, extra_env_fields=None):
# Catch key/unicode errors
pass
# Add extra fields
for field in extra_env_fields:
try:
if extra_env_fields[field] is not None:
env['SAB_' + field.upper()] = extra_env_fields[field]
else:
env['SAB_' + field.upper()] = ''
except:
# Catch key/unicode errors
pass
# Always supply basic info
extra_env_fields.update({'program_dir': sabnzbd.DIR_PROG,
'par2_command': sabnzbd.newsunpack.PAR2_COMMAND,
'multipar_command': sabnzbd.newsunpack.MULTIPAR_COMMAND,
'rar_command': sabnzbd.newsunpack.RAR_COMMAND,
'zip_command': sabnzbd.newsunpack.ZIP_COMMAND,
'7zip_command': sabnzbd.newsunpack.SEVEN_COMMAND,
'version': sabnzbd.__version__})
# Add extra fields
for field in extra_env_fields:
try:
if extra_env_fields[field] is not None:
env['SAB_' + field.upper()] = extra_env_fields[field]
else:
env['SAB_' + field.upper()] = ''
except:
# Catch key/unicode errors
pass
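The loop above exports each field to the post-processing script's environment as `SAB_<FIELD>` in upper case, with `None` mapped to the empty string. A minimal sketch of that naming convention (`build_script_env` is a hypothetical helper, not a SABnzbd function; the real `create_env` also copies the process environment and handles encoding errors):

```python
def build_script_env(extra_env_fields, base_env=None):
    # Each extra field becomes SAB_<FIELD> (upper-cased); None -> ''.
    env = dict(base_env or {})
    for field, value in extra_env_fields.items():
        env['SAB_' + field.upper()] = '' if value is None else value
    return env
```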
if sabnzbd.DARWIN:
if 'PYTHONPATH' in env:
@@ -2099,11 +2113,7 @@ def build_filelists(workdir, workdir_complete=None, check_both=False, check_rar=
# Extra check for rar (takes CPU/disk)
file_is_rar = False
if check_rar:
try:
# Can fail on Windows due to long-path after recursive-unpack
file_is_rar = rarfile.is_rarfile(file)
except:
pass
file_is_rar = is_rarfile(file)
# Run through all the checks
if SEVENZIP_RE.search(file) or SEVENMULTI_RE.search(file):
@@ -2295,23 +2305,33 @@ def analyse_show(name):
info.get('ep_name', '')
def pre_queue(name, pp, cat, script, priority, size, groups):
""" Run pre-queue script (if any) and process results """
def pre_queue(nzo, pp, cat):
""" Run pre-queue script (if any) and process results.
pp and cat are supplied separately since they can change.
"""
def fix(p):
if not p or str(p).lower() == 'none':
return ''
return unicoder(p)
values = [1, name, pp, cat, script, priority, None]
values = [1, nzo.final_name_pw_clean, pp, cat, nzo.script, nzo.priority, None]
script_path = make_script_path(cfg.pre_script())
if script_path:
command = [script_path, name, pp, cat, script, priority, str(size), ' '.join(groups)]
command.extend(analyse_show(name))
# Basic command-line parameters
command = [script_path, nzo.final_name_pw_clean, pp, cat, nzo.script, nzo.priority, str(nzo.bytes), ' '.join(nzo.groups)]
command.extend(analyse_show(nzo.final_name_pw_clean))
command = [fix(arg) for arg in command]
# Fields not in the NZO directly
extra_env_fields = {'groups': ' '.join(nzo.groups),
'show_name': command[8],
'show_season': command[9],
'show_episode': command[10],
'show_episode_name': command[11]}
try:
stup, need_shell, command, creationflags = build_command(command)
env = create_env()
env = create_env(nzo, extra_env_fields)
logging.info('Running pre-queue script %s', command)
p = Popen(command, shell=need_shell, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
@@ -2332,11 +2352,11 @@ def pre_queue(name, pp, cat, script, priority, size, groups):
n += 1
accept = int_conv(values[0])
if accept < 1:
logging.info('Pre-Q refuses %s', name)
logging.info('Pre-Q refuses %s', nzo.final_name_pw_clean)
elif accept == 2:
logging.info('Pre-Q accepts&fails %s', name)
logging.info('Pre-Q accepts&fails %s', nzo.final_name_pw_clean)
else:
logging.info('Pre-Q accepts %s', name)
logging.info('Pre-Q accepts %s', nzo.final_name_pw_clean)
return values
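The log lines above reflect the three verdicts a pre-queue script can return in its first output value. A sketch of that interpretation (the function name is hypothetical; the real logic lives in the caller in `NzbObject`):

```python
def pre_queue_verdict(accept):
    # First return value of the pre-queue script:
    # < 1 refuses the job, 2 accepts it but marks it failed, else accepted.
    if accept < 1:
        return 'refused'
    if accept == 2:
        return 'accepted-and-failed'
    return 'accepted'
```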


@@ -25,7 +25,6 @@ from threading import Thread
from nntplib import NNTPPermanentError
import time
import logging
import re
import ssl
import sabnzbd
@@ -151,7 +150,7 @@ class NNTP(object):
# Pre-define attributes to save memory
__slots__ = ('host', 'port', 'nw', 'blocking', 'error_msg', 'sock')
def __init__(self, host, port, info, sslenabled, send_group, nw, user=None, password=None, block=False, write_fds=None):
def __init__(self, host, port, info, sslenabled, nw, block=False, write_fds=None):
self.host = host
self.port = port
self.nw = nw
@@ -175,15 +174,19 @@ class NNTP(object):
# Setup the SSL socket
ctx = ssl.create_default_context()
if sabnzbd.cfg.require_modern_tls():
# We want a modern TLS (1.2 or higher), so we disallow older protocol versions (<= TLS 1.1)
ctx.options |= ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3 | ssl.OP_NO_TLSv1 | ssl.OP_NO_TLSv1_1
# Only verify hostname when we're strict
if(nw.server.ssl_verify < 2):
if nw.server.ssl_verify < 2:
ctx.check_hostname = False
# Certificates optional
if(nw.server.ssl_verify == 0):
if nw.server.ssl_verify == 0:
ctx.verify_mode = ssl.CERT_NONE
# Did the user set a custom cipher-string?
if(nw.server.ssl_ciphers):
if nw.server.ssl_ciphers:
# At their own risk, socket will error out in case it was invalid
ctx.set_ciphers(nw.server.ssl_ciphers)
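The `require_modern_tls` behaviour added above can be sketched standalone: on a default client context, every protocol version below TLS 1.2 is disallowed via the context's option flags. This sketch uses a plain `ssl` context with none of SABnzbd's server settings:

```python
import ssl

# Require TLS 1.2 or higher by masking out all older protocol versions
ctx = ssl.create_default_context()
ctx.options |= ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3 | ssl.OP_NO_TLSv1 | ssl.OP_NO_TLSv1_1
```

On Python 3.7+ the same effect is available more directly via `ctx.minimum_version = ssl.TLSVersion.TLSv1_2`, but the `OP_NO_*` flags are what a Python 2 codebase like this one has available.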
@@ -313,8 +316,7 @@ class NewsWrapper(object):
# Construct NNTP object and shorthands
self.nntp = NNTP(self.server.hostip, self.server.port, self.server.info, self.server.ssl,
self.server.send_group, self, self.server.username, self.server.password,
self.blocking, write_fds)
self, self.blocking, write_fds)
self.recv = self.nntp.sock.recv
self.timeout = time.time() + self.server.timeout
@@ -376,19 +378,19 @@ class NewsWrapper(object):
self.timeout = time.time() + self.server.timeout
if precheck:
if self.server.have_stat:
command = 'STAT <%s>\r\n' % (self.article.article)
command = 'STAT <%s>\r\n' % self.article.article
else:
command = 'HEAD <%s>\r\n' % (self.article.article)
command = 'HEAD <%s>\r\n' % self.article.article
elif self.server.have_body:
command = 'BODY <%s>\r\n' % (self.article.article)
command = 'BODY <%s>\r\n' % self.article.article
else:
command = 'ARTICLE <%s>\r\n' % (self.article.article)
command = 'ARTICLE <%s>\r\n' % self.article.article
self.nntp.sock.sendall(command)
self.data = []
def send_group(self, group):
self.timeout = time.time() + self.server.timeout
command = 'GROUP %s\r\n' % (group)
command = 'GROUP %s\r\n' % group
self.nntp.sock.sendall(command)
self.data = []
@@ -416,7 +418,7 @@ class NewsWrapper(object):
# time.sleep(0.0001)
continue
else:
return (0, False, True)
return 0, False, True
# Data is processed differently depending on C-yEnc version
if sabnzbd.decoder.SABYENC_ENABLED:
@@ -426,16 +428,16 @@ class NewsWrapper(object):
# Official end-of-article is ".\r\n" but sometimes it can get lost between 2 chunks
chunk_len = len(chunk)
if chunk[-5:] == '\r\n.\r\n':
return (chunk_len, True, False)
return chunk_len, True, False
elif chunk_len < 5 and len(self.data) > 1:
# We need to make sure the end is not split over 2 chunks
# This is faster than join()
combine_chunk = self.data[-2][-5:] + chunk
if combine_chunk[-5:] == '\r\n.\r\n':
return (chunk_len, True, False)
return chunk_len, True, False
# Still in middle of data, so continue!
return (chunk_len, False, False)
return chunk_len, False, False
else:
self.last_line += chunk
new_lines = self.last_line.split('\r\n')
@@ -459,9 +461,9 @@ class NewsWrapper(object):
if self.lines and self.lines[-1] == '.':
self.lines = self.lines[1:-1]
return (len(chunk), True, False)
return len(chunk), True, False
else:
return (len(chunk), False, False)
return len(chunk), False, False
def soft_reset(self):
self.timeout = None
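The chunk-handling code above looks for the NNTP end-of-article marker `"\r\n.\r\n"`, which can be split across the last two received chunks. A self-contained sketch of that check (hypothetical helper; the real code also tracks byte counts and yEnc state):

```python
def article_complete(chunks):
    # The official end-of-article is "\r\n.\r\n", but it may straddle
    # the boundary between the final two chunks.
    if not chunks:
        return False
    tail = chunks[-1]
    if len(tail) < 5 and len(chunks) > 1:
        # Stitch the previous chunk's tail onto this one before testing
        tail = chunks[-2][-5:] + tail
    return tail[-5:] == '\r\n.\r\n'
```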


@@ -23,7 +23,6 @@ sabnzbd.notifier - Send notifications to any notification services
from __future__ import with_statement
import os.path
import logging
import socket
import urllib2
import httplib
import urllib
@@ -143,7 +142,7 @@ def check_cat(section, job_cat, keyword=None):
if not keyword:
keyword = section
section_cats = sabnzbd.config.get_config(section, '%s_cats' % keyword)()
return (['*'] == section_cats or job_cat in section_cats)
return ['*'] == section_cats or job_cat in section_cats
except TypeError:
logging.debug('Incorrect Notify option %s:%s_cats', section, section)
return True
@@ -463,7 +462,7 @@ def send_pushover(title, msg, gtype, force=False, test=None):
"expire": emergency_expire
}
return do_send_pushover(body)
if prio > -3 and prio < 2:
if -3 < prio < 2:
body = { "token": apikey,
"user": userkey,
"device": device,


@@ -69,7 +69,7 @@ class NzbQueue(object):
data = sabnzbd.load_admin(QUEUE_FILE_NAME)
# Process the data and check compatibility
nzo_ids = self.check_compatibility(data)
nzo_ids = self.check_compatibility(repair, data)
# First handle jobs in the queue file
folders = []
@@ -104,7 +104,7 @@ class NzbQueue(object):
except:
pass
def check_compatibility(self, data):
def check_compatibility(self, repair, data):
""" Do compatibility checks on the loaded data """
nzo_ids = []
if not data:
@@ -181,22 +181,6 @@ class NzbQueue(object):
logging.info('Skipping repair for job %s', folder)
return result
def retry_all_jobs(self, history_db):
""" Retry all retryable jobs in History """
result = []
# Retryable folders from History
items = sabnzbd.api.build_history()[0]
registered = [(platform_encode(os.path.basename(item['path'])),
item['nzo_id'])
for item in items if item['retry']]
for job in registered:
logging.info('Repairing job %s', job[0])
result.append(self.repair_job(job[0]))
history_db.remove_history(job[1])
return bool(result)
def repair_job(self, folder, new_nzb=None, password=None):
""" Reconstruct admin for a single job folder, optionally with new NZB """
def all_verified(path):
@@ -204,7 +188,6 @@ class NzbQueue(object):
verified = sabnzbd.load_data(VERIFIED_FILE, path, remove=False) or {'x': False}
return all(verified[x] for x in verified)
nzo_id = None
name = os.path.basename(folder)
path = os.path.join(folder, JOB_ADMIN)
if hasattr(new_nzb, 'filename'):
@@ -541,10 +524,10 @@ class NzbQueue(object):
nzo2 = self.__nzo_table[item_id_2]
except KeyError:
# One or both jobs missing
return (-1, 0)
return -1, 0
if nzo1 == nzo2:
return (-1, 0)
return -1, 0
# get the priorities of the two items
nzo1_priority = nzo1.priority
@@ -573,9 +556,9 @@ class NzbQueue(object):
logging.info('Switching job [%s] %s => [%s] %s', item_id_pos1, item.final_name, item_id_pos2, self.__nzo_list[item_id_pos2].final_name)
del self.__nzo_list[item_id_pos1]
self.__nzo_list.insert(item_id_pos2, item)
return (item_id_pos2, nzo1.priority)
return item_id_pos2, nzo1.priority
# If moving failed/no movement took place
return (-1, nzo1.priority)
return -1, nzo1.priority
@NzbQueueLocker
def move_up_bulk(self, nzo_id, nzf_ids, size):


@@ -167,7 +167,7 @@ class Article(TryList):
# if (server_check.priority() < found_priority and server_check.priority() < server.priority and not self.server_in_try_list(server_check)):
if server_check.active and (server_check.priority < found_priority):
if server_check.priority < server.priority:
if (not self.server_in_try_list(server_check)):
if not self.server_in_try_list(server_check):
if log:
logging.debug('Article %s | Server: %s | setting found priority to %s', self.article, server.host, server_check.priority)
found_priority = server_check.priority
@@ -317,14 +317,14 @@ class NzbFile(TryList):
if found:
self.bytes_left -= article.bytes
return (not self.articles)
return not self.articles
def set_par2(self, setname, vol, blocks):
""" Designate this this file as a par2 file """
self.is_par2 = True
self.setname = setname
self.vol = vol
self.blocks = int(blocks)
self.blocks = int_conv(blocks)
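The change above swaps `int()` for `int_conv()`, so a malformed block count in a par2 filename can no longer raise. A sketch of what `int_conv` (from `sabnzbd.misc`) does, reconstructed from its usage here rather than quoted:

```python
def int_conv(value):
    # Best-effort integer conversion: bad or missing input yields 0
    # instead of raising ValueError/TypeError.
    try:
        return int(value)
    except (ValueError, TypeError):
        return 0
```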
def get_article(self, server, servers):
""" Get next article to be downloaded """
@@ -827,9 +827,9 @@ class NzbObject(TryList):
# Run user pre-queue script if needed
if not reuse and cfg.pre_script():
accept, name, pp, cat_pp, script_pp, priority, group = \
sabnzbd.newsunpack.pre_queue(self.final_name_pw_clean, pp, cat, script,
priority, self.bytes, self.groups)
# Call the script
accept, name, pp, cat_pp, script_pp, priority, group = sabnzbd.newsunpack.pre_queue(self, pp, cat)
# Accept or reject
accept = int_conv(accept)
if accept < 1:
@@ -1022,7 +1022,7 @@ class NzbObject(TryList):
# Sort the sets
for setname in self.extrapars:
self.extrapars[parset].sort(key=lambda x: x.blocks)
self.extrapars[setname].sort(key=lambda x: x.blocks)
# Also re-parse all filenames in case par2 came after first articles
self.verify_all_filenames_and_resort()
@@ -1098,38 +1098,37 @@ class NzbObject(TryList):
def get_extra_blocks(self, setname, needed_blocks):
""" We want par2-files of all sets that are similar to this one
So that we also can handle multi-sets with duplicate filenames
Block-table has as keys the nr-blocks
Returns number of added blocks in case they are available
In case of duplicate files for the same set, we might add too
little par2 on the first add-run, but that's a risk we need to take.
"""
logging.info('Need %s more blocks, checking blocks', needed_blocks)
avail_blocks = 0
block_table = {}
block_list = []
for setname_search in self.extrapars:
# Do it for our set, or highlight matching one
# We might catch to many par2's, but that's okay
# We might catch too many par2's, but that's okay
if setname_search == setname or difflib.SequenceMatcher(None, setname, setname_search).ratio() > 0.85:
for nzf in self.extrapars[setname_search]:
# Don't count extrapars that are completed already
if nzf.completed:
continue
blocks = int_conv(nzf.blocks)
if blocks not in block_table:
block_table[blocks] = []
# We assume same block-vol-naming for each set
avail_blocks += blocks
block_table[blocks].append(nzf)
block_list.append(nzf)
avail_blocks += nzf.blocks
# Sort by smallest blocks last, to be popped first
block_list.sort(key=lambda x: x.blocks, reverse=True)
logging.info('%s blocks available', avail_blocks)
# Enough?
if avail_blocks >= needed_blocks:
added_blocks = 0
while added_blocks < needed_blocks:
block_size = min(block_table.keys())
for new_nzf in block_table[block_size]:
self.add_parfile(new_nzf)
added_blocks += block_size
block_table.pop(block_size)
new_nzf = block_list.pop()
self.add_parfile(new_nzf)
added_blocks += new_nzf.blocks
logging.info('Added %s blocks to %s', added_blocks, self.final_name)
return added_blocks
else:
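The rewrite above replaces the per-blocksize dict with a flat list sorted so the smallest block counts sit at the end, letting `pop()` always hand back the cheapest par2 volume first. A runnable sketch with a stand-in `FakeNzf` object (the real objects are `NzbFile` instances):

```python
class FakeNzf:
    # Stand-in for NzbFile: only the blocks attribute matters here
    def __init__(self, blocks):
        self.blocks = blocks

block_list = [FakeNzf(b) for b in (10, 1, 5)]
# Smallest blocks last, to be popped first
block_list.sort(key=lambda x: x.blocks, reverse=True)

needed_blocks, added_blocks = 6, 0
while added_blocks < needed_blocks and block_list:
    added_blocks += block_list.pop().blocks
# Volumes of 1 and 5 blocks are consumed; the 10-block volume is spared
```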
@@ -1191,7 +1190,7 @@ class NzbObject(TryList):
self.status = Status.QUEUED
self.set_download_report()
return (file_done, post_done)
return file_done, post_done
@synchronized(NZO_LOCK)
def remove_saved_article(self, article):
@@ -1292,8 +1291,8 @@ class NzbObject(TryList):
# Convert input
value = int_conv(value)
if value in (REPAIR_PRIORITY, TOP_PRIORITY, HIGH_PRIORITY, NORMAL_PRIORITY, \
LOW_PRIORITY, DEFAULT_PRIORITY, PAUSED_PRIORITY, DUP_PRIORITY, STOP_PRIORITY):
if value in (REPAIR_PRIORITY, TOP_PRIORITY, HIGH_PRIORITY, NORMAL_PRIORITY,
LOW_PRIORITY, DEFAULT_PRIORITY, PAUSED_PRIORITY, DUP_PRIORITY, STOP_PRIORITY):
self.priority = value
return
@@ -1407,7 +1406,7 @@ class NzbObject(TryList):
if (parset in nzf.filename or parset in original_filename) and self.extrapars[parset]:
for new_nzf in self.extrapars[parset]:
self.add_parfile(new_nzf)
blocks_new += int_conv(new_nzf.blocks)
blocks_new += new_nzf.blocks
# Enough now?
if blocks_new >= self.bad_articles:
logging.info('Prospectively added %s repair blocks to %s', blocks_new, self.final_name)
@@ -1502,11 +1501,11 @@ class NzbObject(TryList):
self.set_unpack_info('Servers', ', '.join(msgs), unique=True)
@synchronized(NZO_LOCK)
def increase_bad_articles_counter(self, type):
def increase_bad_articles_counter(self, article_type):
""" Record information about bad articles """
if type not in self.nzo_info:
self.nzo_info[type] = 0
self.nzo_info[type] += 1
if article_type not in self.nzo_info:
self.nzo_info[article_type] = 0
self.nzo_info[article_type] += 1
self.bad_articles += 1
def get_article(self, server, servers):
@@ -1751,7 +1750,6 @@ class NzbObject(TryList):
remove_dir(self.downpath)
except:
logging.debug('Folder not removed: %s', self.downpath)
pass
def gather_info(self, full=False):
queued_files = []
@@ -2006,7 +2004,7 @@ def scan_password(name):
slash = name.find('/')
# Look for name/password, but make sure that '/' comes before any {{
if slash >= 0 and slash < braces and 'password=' not in name:
if 0 <= slash < braces and 'password=' not in name:
# Is it maybe in 'name / password' notation?
if slash == name.find(' / ') + 1:
# Remove the extra space after name and before password
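The chained comparison `0 <= slash < braces` above is the tightened form of the old two-clause test. A simplified, self-contained sketch of the surrounding name/password split (the real `scan_password` handles more notations, including `{{password}}` extraction):

```python
def scan_password(name):
    # Split "name/password" only when '/' comes before any '{{' marker
    # and no explicit password= field is present.
    braces = name.find('{{')
    if braces < 0:
        braces = len(name)
    slash = name.find('/')
    if 0 <= slash < braces and 'password=' not in name:
        return name[:slash].strip(), name[slash + 1:]
    return name, None
```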


@@ -208,7 +208,7 @@ class SABnzbdDelegate(NSObject):
for speed in sorted(speeds.keys()):
menu_speed_item = NSMenuItem.alloc().initWithTitle_action_keyEquivalent_('%s' % (speeds[speed]), 'speedlimitAction:', '')
menu_speed_item.setRepresentedObject_("%s" % (speed))
menu_speed_item.setRepresentedObject_("%s" % speed)
self.menu_speed.addItem_(menu_speed_item)
self.speed_menu_item.setSubmenu_(self.menu_speed)
@@ -414,7 +414,7 @@ class SABnzbdDelegate(NSObject):
if history['status'] != Status.COMPLETED:
jobfailed = NSAttributedString.alloc().initWithString_attributes_(job, self.failedAttributes)
menu_history_item.setAttributedTitle_(jobfailed)
menu_history_item.setRepresentedObject_("%s" % (path))
menu_history_item.setRepresentedObject_("%s" % path)
self.menu_history.addItem_(menu_history_item)
else:
menu_history_item = NSMenuItem.alloc().initWithTitle_action_keyEquivalent_(T('Empty'), '', '')
@@ -483,9 +483,9 @@ class SABnzbdDelegate(NSObject):
if self.state != "" and self.info != "":
self.state_menu_item.setTitle_("%s - %s" % (self.state, self.info))
if self.info == "":
self.state_menu_item.setTitle_("%s" % (self.state))
self.state_menu_item.setTitle_("%s" % self.state)
else:
self.state_menu_item.setTitle_("%s" % (self.info))
self.state_menu_item.setTitle_("%s" % self.info)
except:
logging.info("[osx] stateUpdate Exception %s" % (sys.exc_info()[0]))


@@ -26,7 +26,9 @@ import struct
PROBABLY_PAR2_RE = re.compile(r'(.*)\.vol(\d*)[\+\-](\d*)\.par2', re.I)
PAR_ID = "PAR2\x00PKT"
PAR_PKT_ID = "PAR2\x00PKT"
PAR_FILE_ID = "PAR 2.0\x00FileDesc"
PAR_CREATOR_ID = "PAR 2.0\x00Creator"
PAR_RECOVERY_ID = "RecvSlic"
@@ -35,7 +37,7 @@ def is_parfile(filename):
try:
with open(filename, "rb") as f:
buf = f.read(8)
return buf.startswith(PAR_ID)
return buf.startswith(PAR_PKT_ID)
except:
pass
return False
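The rename above clarifies that the 8-byte magic is the par2 *packet* header. A Python 3 sketch of the same check (the original is Python 2 and compares plain strings):

```python
def is_parfile(filename):
    # A par2 file begins with the 8-byte packet header b"PAR2\x00PKT"
    try:
        with open(filename, 'rb') as f:
            return f.read(8).startswith(b'PAR2\x00PKT')
    except EnvironmentError:
        return False
```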
@@ -47,7 +49,6 @@ def analyse_par2(name, filepath=None):
setname is empty when not a par2 file
"""
name = name.strip()
setname = None
vol = block = 0
m = PROBABLY_PAR2_RE.search(name)
if m:
@@ -129,7 +130,8 @@ def parse_par2_file_packet(f, header):
nothing = None, None, None
if header != PAR_ID:
if header != PAR_PKT_ID:
print header
return nothing
# Length must be multiple of 4 and at least 20
@@ -157,10 +159,15 @@ def parse_par2_file_packet(f, header):
# See if it's the right packet and get name + hash
for offset in range(0, len, 8):
if data[offset:offset + 16] == "PAR 2.0\0FileDesc":
if data[offset:offset + 16] == PAR_FILE_ID:
hash = data[offset + 32:offset + 48]
hash16k = data[offset + 48:offset + 64]
filename = data[offset + 72:].strip('\0')
return filename, hash, hash16k
elif data[offset:offset + 15] == PAR_CREATOR_ID:
# From here until the end is the creator-text
# Useful in case of bugs in the par2-creating software
par2creator = data[offset+16:].strip('\0') # Remove any trailing \0
logging.debug('Par2-creator of %s is: %s', os.path.basename(f.name), par2creator)
return nothing


@@ -281,7 +281,6 @@ def process_job(nzo):
nzb_list = []
# These need to be initialized in case of a crash
workdir_complete = ''
postproc_time = 0
script_log = ''
script_line = ''
@@ -336,15 +335,12 @@ def process_job(nzo):
unpack_error = 1
script = nzo.script
cat = nzo.cat
logging.info('Starting Post-Processing on %s' +
' => Repair:%s, Unpack:%s, Delete:%s, Script:%s, Cat:%s',
filename, flag_repair, flag_unpack, flag_delete, script, nzo.cat)
# Set complete dir to workdir in case we need to abort
workdir_complete = workdir
marker_file = None
# Par processing, if enabled
if all_ok and flag_repair:
@@ -380,19 +376,16 @@ def process_job(nzo):
newfiles = []
# Run Stage 2: Unpack
if flag_unpack:
if all_ok:
# set the current nzo status to "Extracting...". Used in History
nzo.status = Status.EXTRACTING
logging.info("Running unpack_magic on %s", filename)
unpack_error, newfiles = unpack_magic(nzo, workdir, tmp_workdir_complete, flag_delete, one_folder, (), (), (), (), ())
logging.info("Unpacked files %s", newfiles)
# set the current nzo status to "Extracting...". Used in History
nzo.status = Status.EXTRACTING
logging.info("Running unpack_magic on %s", filename)
unpack_error, newfiles = unpack_magic(nzo, workdir, tmp_workdir_complete, flag_delete, one_folder, (), (), (), (), ())
logging.info("Unpacked files %s", newfiles)
if sabnzbd.WIN32:
# Sanitize the resulting files
newfiles = sanitize_files_in_folder(tmp_workdir_complete)
logging.info("Finished unpack_magic on %s", filename)
else:
nzo.set_unpack_info('Unpack', T('No post-processing because of failed verification'))
if sabnzbd.WIN32:
# Sanitize the resulting files
newfiles = sanitize_files_in_folder(tmp_workdir_complete)
logging.info("Finished unpack_magic on %s", filename)
if cfg.safe_postproc():
all_ok = all_ok and not unpack_error
@@ -453,7 +446,6 @@ def process_job(nzo):
else:
workdir_complete = tmp_workdir_complete.replace('_UNPACK_', '_FAILED_')
workdir_complete = get_unique_path(workdir_complete, n=0, create_dir=False)
workdir_complete = workdir_complete
if empty:
job_result = -1


@@ -25,7 +25,6 @@ import urlparse
import time
import logging
import copy
import socket
import Queue
import collections
from threading import RLock, Thread


@@ -287,10 +287,10 @@ class RSSQueue(object):
status = feed_parsed.get('status', 999)
if status in (401, 402, 403):
msg = T('Do not have valid authentication for feed %s') % feed
msg = T('Do not have valid authentication for feed %s') % uri
logging.info(msg)
if status >= 500 and status <= 599:
if 500 <= status <= 599:
msg = T('Server side error (server code %s); could not get %s on %s') % (status, feed, uri)
logging.info(msg)
@@ -301,11 +301,14 @@ class RSSQueue(object):
msg = T('Server %s uses an untrusted HTTPS certificate') % get_urlbase(uri)
msg += ' - https://sabnzbd.org/certificate-errors'
logging.error(msg)
elif 'href' in feed_parsed and feed_parsed['href'] != uri and 'login' in feed_parsed['href']:
# Redirect to login page!
msg = T('Do not have valid authentication for feed %s') % uri
else:
msg = T('Failed to retrieve RSS from %s: %s') % (uri, xml_name(msg))
logging.info(msg)
if not entries:
if not entries and not msg:
msg = T('RSS Feed %s was empty') % uri
logging.info(msg)
all_entries.extend(entries)
@@ -330,12 +333,8 @@ class RSSQueue(object):
if readout:
try:
link, category, size, age, season, episode = _get_link(uri, entry)
link, infourl, category, size, age, season, episode = _get_link(entry)
except (AttributeError, IndexError):
link = None
category = u''
size = 0L
age = None
logging.info(T('Incompatible feed') + ' ' + uri)
logging.info("Traceback: ", exc_info=True)
return T('Incompatible feed')
@@ -354,6 +353,7 @@ class RSSQueue(object):
continue
else:
link = entry
infourl = jobs[link].get('infourl', '')
category = jobs[link].get('orgcat', '')
if category in ('', '*'):
category = None
@@ -482,13 +482,13 @@ class RSSQueue(object):
else:
star = first
if result:
_HandleLink(jobs, feed, link, title, size, age, season, episode, 'G', category, myCat, myPP,
myScript, act, star, priority=myPrio, rule=str(n))
_HandleLink(jobs, feed, link, infourl, title, size, age, season, episode, 'G', category, myCat,
myPP, myScript, act, star, priority=myPrio, rule=str(n))
if act:
new_downloads.append(title)
else:
_HandleLink(jobs, feed, link, title, size, age, season, episode, 'B', category, myCat, myPP,
myScript, False, star, priority=myPrio, rule=str(n))
_HandleLink(jobs, feed, link, infourl, title, size, age, season, episode, 'B', category, myCat,
myPP, myScript, False, star, priority=myPrio, rule=str(n))
# Send email if wanted and not "forced"
if new_downloads and cfg.email_rss() and not force:
@@ -588,7 +588,7 @@ class RSSQueue(object):
return ''
def _HandleLink(jobs, feed, link, title, size, age, season, episode, flag, orgcat, cat, pp, script,
def _HandleLink(jobs, feed, link, infourl, title, size, age, season, episode, flag, orgcat, cat, pp, script,
download, star, priority=NORMAL_PRIORITY, rule=0):
""" Process one link """
if script == '':
@@ -599,6 +599,7 @@ def _HandleLink(jobs, feed, link, title, size, age, season, episode, flag, orgca
jobs[link] = {}
jobs[link]['title'] = title
jobs[link]['url'] = link
jobs[link]['infourl'] = infourl
jobs[link]['cat'] = cat
jobs[link]['pp'] = pp
jobs[link]['script'] = script
@@ -627,14 +628,11 @@ def _HandleLink(jobs, feed, link, title, size, age, season, episode, flag, orgca
else:
jobs[link]['status'] = flag
def _get_link(uri, entry):
def _get_link(entry):
""" Retrieve the post link from this entry
Returns (link, category, size)
"""
link = None
category = ''
size = 0L
uri = uri.lower()
age = datetime.datetime.now()
# Try standard link and enclosures first
@@ -648,6 +646,11 @@ def _get_link(uri, entry):
except:
pass
# GUID usually has URL to result on page
infourl = None
if entry.id and entry.id != link and entry.id.startswith('http'):
infourl = entry.id
if size == 0L:
_RE_SIZE1 = re.compile(r'Size:\s*(\d+\.\d+\s*[KMG]{0,1})B\W*', re.I)
_RE_SIZE2 = re.compile(r'\W*(\d+\.\d+\s*[KMG]{0,1})B\W*', re.I)
@@ -697,10 +700,10 @@ def _get_link(uri, entry):
except:
category = ''
return link, category, size, age, season, episode
return link, infourl, category, size, age, season, episode
else:
logging.warning(T('Empty RSS entry found (%s)'), link)
return None, '', 0L, None, 0, 0
return None, None, '', 0L, None, 0, 0
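The two size regexes above only run when the feed itself provides no size field: they scrape something like "Size: 1.2 GB" (or a bare "700.0 MB") out of the entry text. A standalone Python 3 sketch of that fallback, using the same patterns (the `parse_size` helper name and unit table are mine, not SABnzbd's):

```python
import re

# Same patterns as in rss.py: "Size: 1.2 GB", or a bare "700.0 MB" in the text
_RE_SIZE1 = re.compile(r'Size:\s*(\d+\.\d+\s*[KMG]{0,1})B\W*', re.I)
_RE_SIZE2 = re.compile(r'\W*(\d+\.\d+\s*[KMG]{0,1})B\W*', re.I)

_MULTIPLIER = {'': 1, 'K': 1024, 'M': 1024 ** 2, 'G': 1024 ** 3}

def parse_size(text):
    """Return the size in bytes scraped from an RSS description, or 0."""
    m = _RE_SIZE1.search(text) or _RE_SIZE2.search(text)
    if not m:
        return 0
    value = m.group(1).rstrip()
    unit = ''
    if value and value[-1].upper() in 'KMG':
        unit = value[-1].upper()
        value = value[:-1].strip()
    return int(float(value) * _MULTIPLIER[unit])
```

The labelled `Size:` pattern is tried first so an explicit label wins over any stray "x.y MB" elsewhere in the description.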
def special_rss_site(url):


@@ -19,6 +19,7 @@
sabtray.py - Systray icon for SABnzbd on Windows, contributed by Jan Schejbal
"""
import os
import logging
from time import sleep
@@ -29,8 +30,6 @@ import sabnzbd.scheduler as scheduler
from sabnzbd.downloader import Downloader
import sabnzbd.cfg as cfg
from sabnzbd.misc import to_units
import os
import cherrypy
# contains the tray icon, which demands its own thread
from sabnzbd.utils.systrayiconthread import SysTrayIconThread
@@ -98,10 +97,13 @@ class SABTrayThread(SysTrayIconThread):
speed = to_units(bpsnow)
if self.sabpaused:
self.hover_text = self.txt_paused
if bytes_left > 0:
self.hover_text = "%s - %s: %sB" % (self.txt_paused, self.txt_remaining, mb_left)
else:
self.hover_text = self.txt_paused
self.icon = self.sabicons['pause']
elif bytes_left > 0:
self.hover_text = "%sB/s %s: %sB (%s)" % (speed, self.txt_remaining, mb_left, time_left)
self.hover_text = "%sB/s - %s: %sB (%s)" % (speed, self.txt_remaining, mb_left, time_left)
self.icon = self.sabicons['green']
else:
self.hover_text = self.txt_idle


@@ -21,7 +21,6 @@ sabnzbd.sabtraylinux - System tray icon for Linux, inspired from the Windows one
import gtk
import gobject
import cherrypy
from time import sleep
import subprocess
from threading import Thread
@@ -137,12 +136,13 @@ class StatusIcon(Thread):
dialog.set_select_multiple(True)
filter = gtk.FileFilter()
filter.set_name("*.nbz,*.nbz.gz,*.bz2,*.zip,*.rar")
filter.add_pattern("*.nzb*")
filter.add_pattern("*.nzb.gz")
filter.add_pattern("*.nzb.bz2")
filter.set_name("*.nzb,*.gz,*.bz2,*.zip,*.rar,*.7z")
filter.add_pattern("*.nzb")
filter.add_pattern("*.gz")
filter.add_pattern("*.bz2")
filter.add_pattern("*.zip")
filter.add_pattern("*.rar")
filter.add_pattern("*.7z")
dialog.add_filter(filter)
response = dialog.run()


@@ -47,16 +47,10 @@ SKIN_TEXT = {
'post-Propagating' : TT('Propagation delay'),
'post-Checking' : TT('Checking'), #: PP status
'sch-frequency' : TT('Frequency'), #: #: Config->Scheduler
'sch-action' : TT('Action'), #: #: Config->Scheduler
'sch-arguments' : TT('Arguments'), #: #: Config->Scheduler
'sch-task' : TT('Task'), #: #: Config->Scheduler
'sch-disable_server' : TT('disable server'), #: #: Config->Scheduler
'sch-enable_server' : TT('enable server'), #: #: Config->Scheduler
'sch-resume' : TT('Resume'), #: #: Config->Scheduler
'sch-pause' : TT('Pause'), #: #: Config->Scheduler
'sch-shutdown' : TT('Shutdown'), #: #: Config->Scheduler
'sch-restart' : TT('Restart'), #: #: Config->Scheduler
'sch-speedlimit' : TT('Speedlimit'), #: #: Config->Scheduler
'sch-pause_all' : TT('Pause All'), #: #: Config->Scheduler
'sch-pause_post' : TT('Pause post-processing'), #: #: Config->Scheduler


@@ -237,7 +237,7 @@ class SeriesSorter(object):
one = '-'.join(extra_list)
two = '-'.join(extra2_list)
return (one, two)
return one, two
def get_shownames(self):
""" Get the show name from the match object and format it """


@@ -199,7 +199,7 @@ class URLGrabber(Thread):
retry = True
fetch_request = None
elif retry:
fetch_request, msg, retry, wait, data = _analyse(fetch_request, url, future_nzo)
fetch_request, msg, retry, wait, data = _analyse(fetch_request, future_nzo)
if not fetch_request:
if retry:
@@ -351,7 +351,7 @@ def _build_request(url):
return urllib2.urlopen(req)
def _analyse(fetch_request, url, future_nzo):
def _analyse(fetch_request, future_nzo):
""" Analyze response of indexer
returns fetch_request|None, error-message|None, retry, wait-seconds, data
"""


@@ -52,7 +52,7 @@ def generate_key(key_size=2048, output_file='key.pem'):
# Ported from cryptography docs/x509/tutorial.rst
def generate_local_cert(private_key, days_valid=3560, output_file='cert.cert', LN=u'SABnzbd', ON=u'SABnzbd', CN=u'localhost'):
def generate_local_cert(private_key, days_valid=3560, output_file='cert.cert', LN=u'SABnzbd', ON=u'SABnzbd'):
# Various details about who we are. For a self-signed certificate the
# subject and issuer are always the same.
subject = issuer = x509.Name([
@@ -64,8 +64,7 @@ def generate_local_cert(private_key, days_valid=3560, output_file='cert.cert', L
# build Subject Alternate Names (aka SAN) list
# First the host names, add with x509.DNSName():
san_list = [x509.DNSName(u"localhost")]
san_list.append(x509.DNSName(unicode(socket.gethostname())))
san_list = [x509.DNSName(u"localhost"), x509.DNSName(unicode(socket.gethostname()))]
# Then the host IP addresses, add with x509.IPAddress()
# Inside a try-except, just to be sure


@@ -6,18 +6,18 @@ Functions to check if the path filesystem uses FAT
import sys
import os
import subprocess
debug = False
def isFAT(dir):
def isFAT(check_dir):
""" Check if "check_dir" is on FAT. FAT considered harmful (for big files)
Works for Linux, Windows, MacOS
NB: On Windows, full path with drive letter is needed!
"""
# Check if "dir" is on FAT. FAT considered harmful (for big files)
# Works for Linux, Windows, MacOS
# NB: On Windows, full path with drive letter is needed!
FAT = False # default: not FAT
FAT = False # default: not FAT
# We're dealing with OS calls, so put everything in a try/except, just in case:
try:
if 'linux' in sys.platform:
@@ -31,9 +31,8 @@ def isFAT(dir):
/dev/sda1 vfat 488263616 163545248 324718368 34% /media/sander/INTENSO
'''
cmd = "df -T " + dir + " 2>&1"
cmd = "df -T " + check_dir + " 2>&1"
for thisline in os.popen(cmd).readlines():
#print thisline
if thisline.find('/') == 0:
# Starts with /, so a real, local device
fstype = thisline.split()[1]
@@ -44,13 +43,13 @@ def isFAT(dir):
break
elif 'win32' in sys.platform:
import win32api
if '?' in dir:
if '?' in check_dir:
# Remove \\?\ or \\?\UNC\ prefix from Windows path
dir = dir.replace(u'\\\\?\\UNC\\', u'\\\\', 1).replace(u'\\\\?\\', u'', 1)
check_dir = check_dir.replace(u'\\\\?\\UNC\\', u'\\\\', 1).replace(u'\\\\?\\', u'', 1)
try:
result = win32api.GetVolumeInformation(os.path.splitdrive(dir)[0])
result = win32api.GetVolumeInformation(os.path.splitdrive(check_dir)[0])
if debug: print result
if(result[4].startswith("FAT")):
if result[4].startswith("FAT"):
FAT = True
except:
pass
@@ -70,8 +69,7 @@ def isFAT(dir):
'''
dfcmd = "df " + dir
device = ''
dfcmd = "df " + check_dir
for thisline in os.popen(dfcmd).readlines():
if thisline.find('/')==0:
if debug: print thisline
@@ -89,17 +87,16 @@ def isFAT(dir):
return FAT
if __name__ == "__main__":
if debug: print sys.platform
try:
dir = sys.argv[1]
dir_to_check = sys.argv[1]
except:
print "Specify dir on the command line"
print "Specify check_dir on the command line"
sys.exit(0)
if isFAT(dir):
print dir, "is on FAT"
if isFAT(dir_to_check):
print dir_to_check, "is on FAT"
else:
print dir, "is not on FAT"
print dir_to_check, "is not on FAT"
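On Linux the check shells out to `df -T` and takes the filesystem type from the second column of the first line that starts with `/` (a real local device). A pure-Python 3 sketch of just the parsing step, split out so it can be tested without running `df` (the function names and the exact FAT-family list are my assumptions, not SABnzbd's):

```python
def fstype_from_df(df_output):
    """Parse `df -T <dir>` output; return the filesystem type from the
    first local-device line (starts with '/'), or None if none found."""
    for line in df_output.splitlines():
        if line.startswith('/'):
            fields = line.split()
            if len(fields) > 1:
                return fields[1]
    return None

def looks_like_fat(fstype):
    # Assumed set of FAT-family names df may report; adjust as needed
    return bool(fstype) and fstype.lower() in ('fat', 'fat12', 'fat16', 'fat32', 'vfat', 'msdos')
```

With the sample `df -T` output quoted in the file, `fstype_from_df` returns `'vfat'`, which `looks_like_fat` flags.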


@@ -3,61 +3,41 @@
import time
import os
import sys
import logging
_DUMP_DATA = '*' * 10000
def writetofile(filename, mysizeMB):
# writes string to specified file repeat delay, until mysizeMB is reached.
writeloops = int(1024 * 1024 * mysizeMB / len(_DUMP_DATA))
try:
f = open(filename, 'w')
except:
logging.debug('Cannot create file %s', filename)
logging.debug("Traceback: ", exc_info=True)
return False
try:
for x in xrange(writeloops):
f.write(_DUMP_DATA)
except:
logging.debug('Cannot write to file %s', filename)
logging.debug("Traceback: ", exc_info=True)
return False
f.close()
return True
_DUMP_DATA_SIZE = 10 * 1024 * 1024
_DUMP_DATA = os.urandom(_DUMP_DATA_SIZE)
def diskspeedmeasure(dirname):
# returns writing speed to dirname in MB/s
# method: keep writing a file, until 0.5 seconds is passed. Then divide bytes written by time passed
filesize = 10 # MB
maxtime = 0.5 # sec
""" Returns writing speed to dirname in MB/s
method: keep writing a file, until 1 second is passed.
Then divide bytes written by time passed
"""
maxtime = 1.0 # sec
total_written = 0
filename = os.path.join(dirname, 'outputTESTING.txt')
if os.name == 'nt':
# On Windows, this crazy action is needed to
# avoid a "permission denied" error
try:
os.popen('echo Hi >%s' % filename)
except:
pass
# Use low-level I/O
fp = os.open(filename, os.O_CREAT | os.O_WRONLY, 0o777)
start = time.time()
loopcounter = 0
while True:
if not writetofile(filename, filesize):
return 0
loopcounter += 1
diff = time.time() - start
if diff > maxtime:
break
# Start looping
total_time = 0.0
while total_time < maxtime:
start = time.time()
os.write(fp, _DUMP_DATA)
os.fsync(fp)
total_time += time.time() - start
total_written += _DUMP_DATA_SIZE
# Remove the file
try:
# Have to use low-level close
os.close(fp)
os.remove(filename)
except:
pass
return (loopcounter * filesize) / diff
return total_written / total_time / 1024 / 1024
if __name__ == "__main__":
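The rewritten benchmark above replaces the old write-a-whole-file loop with low-level `os.write` plus `os.fsync` per chunk, so the OS page cache cannot inflate the result, and only time actually spent writing is counted. A self-contained Python 3 sketch of the same idea (the function name and the shorter 0.25 s budget are mine):

```python
import os
import tempfile
import time

CHUNK = os.urandom(1024 * 1024)  # 1 MiB of incompressible data

def measure_write_speed(dirname, chunk=CHUNK, max_time=0.25):
    """Return sequential write speed to `dirname` in MB/s, fsyncing after
    every chunk so the page cache does not inflate the number."""
    filename = os.path.join(dirname, 'disk_speed_test.tmp')
    fp = os.open(filename, os.O_CREAT | os.O_WRONLY)  # low-level I/O
    total_written = 0
    total_time = 0.0
    try:
        while total_time < max_time:
            start = time.time()
            os.write(fp, chunk)
            os.fsync(fp)               # force the bytes to disk
            total_time += time.time() - start
            total_written += len(chunk)
    finally:
        os.close(fp)
        try:
            os.remove(filename)
        except OSError:
            pass
    return total_written / max(total_time, 1e-6) / 1024 / 1024
```

Usage: `measure_write_speed(tempfile.gettempdir())` returns a positive MB/s figure after roughly a quarter of a second of writing.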


@@ -1,4 +1,5 @@
import platform, subprocess
import platform
import subprocess
def getcpu():
@@ -19,7 +20,7 @@ def getcpu():
elif platform.system() == "Linux":
for myline in open("/proc/cpuinfo"):
if myline.startswith(('model name')):
if myline.startswith('model name'):
# Typical line:
# model name : Intel(R) Xeon(R) CPU E5335 @ 2.00GHz
cputype = myline.split(":", 1)[1] # get everything after the first ":"
@@ -39,15 +40,29 @@ def getcpu():
def getpystone():
value = None
for pystonemodule in ['test.pystone', 'pystone']:
# Iteratively find the pystone performance of the CPU
# Prefers using Python's standard pystones library, otherwise SABnzbd's pystones library
try:
# Try to import from the python standard library
from test.pystone import pystones
except:
try:
exec "from " + pystonemodule + " import pystones"
value = int(pystones(1000)[1])
break # import and calculation worked, so we're done. Get out of the for loop
# fallback: try to import from SABnzbd's library
from pystone import pystones
except:
pass # ... the import went wrong, so continue in the for loop
return value
return None # no pystone library found
# if we arrive here, we were able to successfully import pystone, so start calculation
maxpystone = None
# Start with a short run, find the pystone value, and increase the workload until a run takes > 0.1 second
for pyseed in [1000, 2000, 5000, 10000, 20000, 50000, 100000, 200000]:
duration, pystonefloat = pystones(pyseed)
maxpystone = max(maxpystone, int(pystonefloat))
# Stop when pystone() has been running for at least 0.1 second
if duration > 0.1:
break
return maxpystone
if __name__ == '__main__':
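The seed loop above is a generic calibration pattern: rerun the benchmark with a growing workload until a single run lasts long enough to be trustworthy, keeping the best rate seen along the way. A Python 3 sketch with a stand-in workload instead of `pystones()` (all names here are illustrative, not SABnzbd's):

```python
import time

def calibrate(workload, seeds=(1000, 2000, 5000, 10000), min_duration=0.1):
    """Run `workload(n)` with growing n until one run takes at least
    `min_duration` seconds; return the best rate seen. `workload` must
    return (duration_seconds, rate), like pystones() does."""
    best = None
    for n in seeds:
        duration, rate = workload(n)
        best = rate if best is None else max(best, rate)
        if duration > min_duration:
            break   # this run was long enough to trust the rate
    return best

def toy_workload(n):
    # Stand-in for pystones(): n*1000 trivial operations, timed
    start = time.time()
    total = 0
    for i in range(n * 1000):
        total += i
    duration = max(time.time() - start, 1e-9)
    return duration, (n * 1000) / duration
```

Short runs give noisy rates; the escalation stops as soon as one run clears the duration threshold, so a fast machine simply climbs further up the seed list.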


@@ -6,10 +6,10 @@
# If the HOST has an IPv6 address, IPv6 is given a head start by delaying IPv4. See https://tools.ietf.org/html/rfc6555#section-4.1
# You can run this as a standalone program, or as a module:
'''
"""
from happyeyeballs import happyeyeballs
print happyeyeballs('newszilla.xs4all.nl', port=119)
'''
"""
# or with more logging:
'''
from happyeyeballs import happyeyeballs
@@ -31,119 +31,119 @@ DEBUG = False
# called by each thread
def do_socket_connect(queue, ip, PORT, SSL, ipv4delay):
# connect to the ip, and put the result into the queue
if DEBUG: logging.debug("Input for thread is %s %s %s", ip, PORT, SSL)
# connect to the ip, and put the result into the queue
if DEBUG: logging.debug("Input for thread is %s %s %s", ip, PORT, SSL)
try:
# CREATE SOCKET
if ip.find(':') >= 0:
s = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
if ip.find('.') >= 0:
time.sleep(ipv4delay) # IPv4 ... so a delay for IPv4 as we prefer IPv6. Note: ipv4delay could be 0
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
# CREATE SOCKET
if ip.find(':') >= 0:
s = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
if ip.find('.') >= 0:
time.sleep(ipv4delay) # IPv4 ... so a delay for IPv4 as we prefer IPv6. Note: ipv4delay could be 0
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(3)
if not SSL:
# Connect ...
s.connect((ip, PORT))
# ... and close
s.close()
else:
# WRAP SOCKET
wrappedSocket = ssl.wrap_socket(s, ssl_version=ssl.PROTOCOL_TLSv1)
# CONNECT
wrappedSocket.connect((ip, PORT))
# CLOSE SOCKET CONNECTION
wrappedSocket.close()
queue.put((ip, True))
if DEBUG: logging.debug("connect to %s OK", ip)
except:
queue.put((ip, False))
if DEBUG: logging.debug("connect to %s not OK", ip)
pass
s.settimeout(3)
if not SSL:
# Connect ...
s.connect((ip, PORT))
# ... and close
s.close()
else:
# WRAP SOCKET
wrappedSocket = ssl.wrap_socket(s, ssl_version=ssl.PROTOCOL_TLSv1)
# CONNECT
wrappedSocket.connect((ip, PORT))
# CLOSE SOCKET CONNECTION
wrappedSocket.close()
queue.put((ip, True))
if DEBUG: logging.debug("connect to %s OK", ip)
except:
queue.put((ip, False))
if DEBUG: logging.debug("connect to %s not OK", ip)
pass
def happyeyeballs(HOST, **kwargs):
# Happyeyeballs function, with caching of the results
# Happyeyeballs function, with caching of the results
# Fill out the parameters into the variables
try:
PORT = kwargs['port']
except:
PORT = 80
try:
SSL = kwargs['ssl']
except:
SSL = False
try:
preferipv6 = kwargs['preferipv6']
except:
preferipv6 = True # prefer IPv6, so give IPv6 connects a head start by delaying IPv4
# Fill out the parameters into the variables
try:
PORT = kwargs['port']
except:
PORT = 80
try:
SSL = kwargs['ssl']
except:
SSL = False
try:
preferipv6 = kwargs['preferipv6']
except:
preferipv6 = True # prefer IPv6, so give IPv6 connects a head start by delaying IPv4
# Find out if a cached result is available, and recent enough:
timecurrent = int(time.time()) # current time in seconds since epoch
retentionseconds = 100
hostkey = (HOST, PORT, SSL, preferipv6) # Example key: (u'ssl.astraweb.com', 563, True, True)
try:
happyeyeballs.happylist[hostkey] # just to check: does it exist?
# No exception, so entry exists, so let's check the time:
timecached = happyeyeballs.happylist[hostkey][1]
if timecurrent - timecached <= retentionseconds:
if DEBUG: logging.debug("existing cached result recent enough")
return happyeyeballs.happylist[hostkey][0]
else:
if DEBUG: logging.debug("existing cached result too old. Find a new one")
# Continue a few lines down
except:
# Exception, so entry not there, so we have to fill it out
if DEBUG: logging.debug("Host not yet in the cache. Find entry")
pass
# we only arrive here if the entry has to be determined. So let's do that:
# Find out if a cached result is available, and recent enough:
timecurrent = int(time.time()) # current time in seconds since epoch
retentionseconds = 100
hostkey = (HOST, PORT, SSL, preferipv6) # Example key: (u'ssl.astraweb.com', 563, True, True)
try:
happyeyeballs.happylist[hostkey] # just to check: does it exist?
# No exception, so entry exists, so let's check the time:
timecached = happyeyeballs.happylist[hostkey][1]
if timecurrent - timecached <= retentionseconds:
if DEBUG: logging.debug("existing cached result recent enough")
return happyeyeballs.happylist[hostkey][0]
else:
if DEBUG: logging.debug("existing cached result too old. Find a new one")
# Continue a few lines down
except:
# Exception, so entry not there, so we have to fill it out
if DEBUG: logging.debug("Host not yet in the cache. Find entry")
pass
# we only arrive here if the entry has to be determined. So let's do that:
# We have to determine the (new) best IP address
start = time.clock()
if DEBUG: logging.debug("\n\n%s %s %s %s", HOST, PORT, SSL, preferipv6)
# We have to determine the (new) best IP address
start = time.clock()
if DEBUG: logging.debug("\n\n%s %s %s %s", HOST, PORT, SSL, preferipv6)
ipv4delay = 0
try:
# Check if there is an AAAA / IPv6 result for this host:
info = socket.getaddrinfo(HOST, PORT, socket.AF_INET6, socket.SOCK_STREAM, socket.IPPROTO_IP, socket.AI_CANONNAME)
if DEBUG: logging.debug("IPv6 address found for %s", HOST)
if preferipv6:
ipv4delay=0.1 # preferipv6, AND at least one IPv6 found, so give IPv4 (!) a delay so that IPv6 has a head start and is preferred
except:
if DEBUG: logging.debug("No IPv6 address found for %s", HOST)
ipv4delay = 0
try:
# Check if there is an AAAA / IPv6 result for this host:
socket.getaddrinfo(HOST, PORT, socket.AF_INET6, socket.SOCK_STREAM, socket.IPPROTO_IP, socket.AI_CANONNAME)
if DEBUG: logging.debug("IPv6 address found for %s", HOST)
if preferipv6:
ipv4delay=0.1 # preferipv6, AND at least one IPv6 found, so give IPv4 (!) a delay so that IPv6 has a head start and is preferred
except:
if DEBUG: logging.debug("No IPv6 address found for %s", HOST)
myqueue = Queue.Queue() # queue used for threads giving back the results
myqueue = Queue.Queue() # queue used for threads giving back the results
try:
try:
# Get all IP (IPv4 and IPv6) addresses:
allinfo = socket.getaddrinfo(HOST, PORT, 0, 0, socket.IPPROTO_TCP)
for info in allinfo:
address = info[4][0]
thisthread = threading.Thread(target=do_socket_connect, args=(myqueue, address, PORT, SSL, ipv4delay))
thisthread.daemon = True
thisthread.start()
result = None # default return value, used if none of threads says True/"OK", so no connect on any IP address
# start reading from the Queue for message from the threads:
for i in range(len(allinfo)):
s = myqueue.get() # get a response
if s[1] == True:
result = s[0]
break # the first True/"OK" is enough, so break out of for loop
except:
if DEBUG: logging.debug("something went wrong in the try block")
result = None
logging.info("Quickest IP address for %s (port %s, ssl %s, preferipv6 %s) is %s", HOST, PORT, SSL, preferipv6, result)
delay = int(1000 * (time.clock() - start))
logging.debug("Happy Eyeballs lookup and port connect took %s ms", delay)
allinfo = socket.getaddrinfo(HOST, PORT, 0, 0, socket.IPPROTO_TCP)
for info in allinfo:
address = info[4][0]
thisthread = threading.Thread(target=do_socket_connect, args=(myqueue, address, PORT, SSL, ipv4delay))
thisthread.daemon = True
thisthread.start()
result = None # default return value, used if none of threads says True/"OK", so no connect on any IP address
# start reading from the Queue for message from the threads:
for i in range(len(allinfo)):
s = myqueue.get() # get a response
if s[1] == True:
result = s[0]
break # the first True/"OK" is enough, so break out of for loop
except:
if DEBUG: logging.debug("something went wrong in the try block")
result = None
logging.info("Quickest IP address for %s (port %s, ssl %s, preferipv6 %s) is %s", HOST, PORT, SSL, preferipv6, result)
delay = int(1000 * (time.clock() - start))
logging.debug("Happy Eyeballs lookup and port connect took %s ms", delay)
# We're done. Store and return the result
if result:
happyeyeballs.happylist[hostkey] = ( result, timecurrent )
if DEBUG: logging.debug("Determined new result for %s with result %s", (hostkey, happyeyeballs.happylist[hostkey]) )
return result
# We're done. Store and return the result
if result:
happyeyeballs.happylist[hostkey] = ( result, timecurrent )
if DEBUG: logging.debug("Determined new result for %s with result %s", (hostkey, happyeyeballs.happylist[hostkey]) )
return result
happyeyeballs.happylist = {} # The cached results. This static variable must be after the def happyeyeballs()
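The cache consulted at the top of `happyeyeballs()` is a plain dict of `(host, port, ssl, preferipv6) -> (result, timestamp)` entries with a 100-second retention. A minimal Python 3 sketch of that lookup-or-refresh logic (`cached_lookup` is my name; note that, unlike the original, this version also caches falsy results):

```python
import time

RETENTION_SECONDS = 100  # same retention window as happyeyeballs.py

def cached_lookup(cache, key, resolver, now=None):
    """Return the cached result for `key` if younger than
    RETENTION_SECONDS, otherwise call `resolver(key)` and cache it."""
    now = int(time.time()) if now is None else now
    if key in cache:
        result, stamp = cache[key]
        if now - stamp <= RETENTION_SECONDS:
            return result   # fresh enough, skip the expensive lookup
    result = resolver(key)
    cache[key] = (result, now)
    return result
```

The `in` test replaces the original's try/except around a bare dict access, which served the same "does an entry exist?" purpose.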
@@ -152,27 +152,27 @@ happyeyeballs.happylist = {} # The cached results. This static variable must
if __name__ == '__main__':
logger = logging.getLogger('')
logger.setLevel(logging.INFO)
if DEBUG: logger.setLevel(logging.DEBUG)
logger = logging.getLogger('')
logger.setLevel(logging.INFO)
if DEBUG: logger.setLevel(logging.DEBUG)
# plain HTTP/HTTPS sites:
print happyeyeballs('www.google.com')
print happyeyeballs('www.google.com', port=443, ssl=True)
print happyeyeballs('www.nu.nl')
# plain HTTP/HTTPS sites:
print happyeyeballs('www.google.com')
print happyeyeballs('www.google.com', port=443, ssl=True)
print happyeyeballs('www.nu.nl')
# newsservers:
print happyeyeballs('newszilla6.xs4all.nl', port=119)
print happyeyeballs('newszilla.xs4all.nl', port=119)
print happyeyeballs('block.cheapnews.eu', port=119)
print happyeyeballs('block.cheapnews.eu', port=443, ssl=True)
print happyeyeballs('sslreader.eweka.nl', port=563, ssl=True)
print happyeyeballs('news.thundernews.com', port=119)
print happyeyeballs('news.thundernews.com', port=119, preferipv6=False)
print happyeyeballs('secure.eu.thundernews.com', port=563, ssl=True)
# newsservers:
print happyeyeballs('newszilla6.xs4all.nl', port=119)
print happyeyeballs('newszilla.xs4all.nl', port=119)
print happyeyeballs('block.cheapnews.eu', port=119)
print happyeyeballs('block.cheapnews.eu', port=443, ssl=True)
print happyeyeballs('sslreader.eweka.nl', port=563, ssl=True)
print happyeyeballs('news.thundernews.com', port=119)
print happyeyeballs('news.thundernews.com', port=119, preferipv6=False)
print happyeyeballs('secure.eu.thundernews.com', port=563, ssl=True)
# Strange cases
print happyeyeballs('does.not.resolve', port=443, ssl=True)
print happyeyeballs('www.google.com', port=119)
print happyeyeballs('216.58.211.164')
# Strange cases
print happyeyeballs('does.not.resolve', port=443, ssl=True)
print happyeyeballs('www.google.com', port=119)
print happyeyeballs('216.58.211.164')
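The core of `happyeyeballs()` is a race: one daemon thread per resolved address tries to connect, every thread reports `(ip, ok)` on a shared queue, and the consumer returns the first success. A Python 3 sketch of that pattern with an injectable probe function in place of a real socket connect (all names are mine):

```python
import queue
import threading

def race(candidates, probe, timeout=2.0):
    """Probe all candidate addresses in parallel; return the first one for
    which probe(addr) is truthy, or None if every probe fails."""
    results = queue.Queue()

    def worker(addr):
        try:
            results.put((addr, probe(addr)))
        except Exception:
            results.put((addr, False))   # a failed probe just reports False

    for addr in candidates:
        threading.Thread(target=worker, args=(addr,), daemon=True).start()

    # Consume one result per candidate; the first success wins
    for _ in candidates:
        addr, ok = results.get(timeout=timeout)
        if ok:
            return addr
    return None
```

Delaying IPv4 probes (the `ipv4delay` in the original) simply handicaps them in this race, so an IPv6 address that connects at all reaches the queue first.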


@@ -78,7 +78,6 @@ import os
import sys
import sched
import time
import traceback
import weakref
import logging


@@ -236,7 +236,7 @@ def Func2(StrParI1, StrParI2):
if Func1(StrParI1[IntLoc], StrParI2[IntLoc+1]) == Ident1:
CharLoc = 'A'
IntLoc = IntLoc + 1
if CharLoc >= 'W' and CharLoc <= 'Z':
if 'W' <= CharLoc <= 'Z':
IntLoc = 7
if CharLoc == 'X':
return TRUE
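The pystone cleanup above swaps `CharLoc >= 'W' and CharLoc <= 'Z'` for a chained comparison, which evaluates `CharLoc` only once and reads like the mathematical notation:

```python
def in_range(ch):
    # Equivalent to: ch >= 'W' and ch <= 'Z', but ch is evaluated once
    return 'W' <= ch <= 'Z'
```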


@@ -90,7 +90,7 @@ def test_nntp_server(host, port, server=None, username=None, password=None, ssl=
nw.recv_chunk(block=True)
nw.finish_connect(nw.status_code)
except socket.timeout, e:
except socket.timeout:
if port != 119 and not ssl:
return False, T('Timed out: Try enabling SSL or connecting on a different port.')
else:
@@ -103,7 +103,7 @@ def test_nntp_server(host, port, server=None, username=None, password=None, ssl=
return False, unicode(e)
except TypeError, e:
except TypeError:
return False, T('Invalid server address.')
except IndexError:


@@ -25,7 +25,6 @@ import os
from sabnzbd.encoding import unicoder
import sabnzbd.cfg as cfg
from sabnzbd.misc import get_ext, get_filename, get_from_url
import sabnzbd.newsunpack
from sabnzbd.constants import VALID_ARCHIVES, VALID_NZB_FILES
from sabnzbd.dirscanner import ProcessArchiveFile, ProcessSingleFile


@@ -4,5 +4,5 @@
# You MUST use double quotes (so " and not ')
__version__ = "2.3.4"
__baseline__ = "2a113f7f588fe78c5dcc3453db31e0ec540efd60"
__version__ = "2.3.6"
__baseline__ = "190ec0a472eddca58698fc3504503c6252337c40"


@@ -21,7 +21,6 @@ sabnzbd.zconfig - bonjour/zeroconfig support
import os
import logging
import cherrypy
_HOST_PORT = (None, None)
@@ -80,11 +79,6 @@ def set_bonjour(host=None, port=None):
return
name = hostname()
if '.local' in name:
suffix = ''
else:
suffix = '.local'
logging.debug('Try to publish in Bonjour as "%s" (%s:%s)', name, host, port)
try:
refObject = pybonjour.DNSServiceRegister(


@@ -28,7 +28,7 @@ NOTES:
1) To use this script you need Python installed on your system and
select "Add to path" during its installation. Select this folder in
Config > Folders > Scripts Folder and select this script for each job
you want it sued for, or link it to a category in Config > Categories.
you want it used for, or link it to a category in Config > Categories.
2) Beware that files on the 'Cleanup List' are removed before
scripts are called and if any of them happen to be required by
the found par2 file, it will fail.
@@ -39,37 +39,115 @@ NOTES:
5) Feedback or bugs in this script can be reported in on our forum:
https://forums.sabnzbd.org/viewforum.php?f=9
Improved by P1nGu1n
"""
import os
import sys
import time
import fnmatch
import subprocess
import struct
import hashlib
# Files to exclude and minimal file size for renaming
EXCLUDED_FILE_EXTS = ('.vob', '.bin')
MIN_FILE_SIZE = 40*1024*1024
# Are we being called from SABnzbd?
if not os.environ.get('SAB_VERSION'):
print "This script needs to be called from SABnzbd as post-processing script."
sys.exit(1)
# Files to exclude and minimal file size for renaming
EXCLUDED_FILE_EXTS = ('.vob', '.bin')
MIN_FILE_SIZE = 40*1024*1024
# see: http://parchive.sourceforge.net/docs/specifications/parity-volume-spec/article-spec.html
STRUCT_PACKET_HEADER = struct.Struct("<"
"8s" # Magic sequence
"Q" # Length of the entire packet (including header), must be multiple of 4
"16s" # MD5 Hash of packet
"16s" # Recovery Set ID
"16s" # Packet type
)
PACKET_TYPE_FILE_DESC = 'PAR 2.0\x00FileDesc'
STRUCT_FILE_DESC_PACKET = struct.Struct("<"
"16s" # File ID
"16s" # MD5 hash of the entire file
"16s" # MD5 hash of the first 16KiB of the file
"Q" # Length of the file
)
# Supporting functions
def print_splitter():
""" Simple helper function """
print '\n------------------------\n'
# Windows or others?
par2_command = os.environ['SAB_PAR2_COMMAND']
if os.environ['SAB_MULTIPAR_COMMAND']:
par2_command = os.environ['SAB_MULTIPAR_COMMAND']
# Diagnostic info
def decodePar(parfile):
result = False
dir = os.path.dirname(parfile)
with open(parfile, 'rb') as parfileToDecode:
while True:
header = parfileToDecode.read(STRUCT_PACKET_HEADER.size)
if not header: break # file fully read
(_, packetLength, _, _, packetType) = STRUCT_PACKET_HEADER.unpack(header)
bodyLength = packetLength - STRUCT_PACKET_HEADER.size
# only process File Description packets
if packetType != PACKET_TYPE_FILE_DESC:
# skip this packet
parfileToDecode.seek(bodyLength, os.SEEK_CUR)
continue
chunck = parfileToDecode.read(STRUCT_FILE_DESC_PACKET.size)
(_, _, hash16k, filelength) = STRUCT_FILE_DESC_PACKET.unpack(chunck)
# filename makes up for the rest of the packet, padded with null characters
targetName = parfileToDecode.read(bodyLength - STRUCT_FILE_DESC_PACKET.size).rstrip('\0')
targetPath = os.path.join(dir, targetName)
# file already exists, skip it
if os.path.exists(targetPath):
print "File already exists: " + targetName
continue
# find and rename file
srcPath = findFile(dir, filelength, hash16k)
if srcPath is not None:
os.rename(srcPath, targetPath)
print "Renamed file from " + os.path.basename(srcPath) + " to " + targetName
result = True
else:
print "No match found for: " + targetName
return result
def findFile(dir, filelength, hash16k):
for filename in os.listdir(dir):
filepath = os.path.join(dir, filename)
# check if the size matches as an indication
if os.path.getsize(filepath) != filelength: continue
with open(filepath, 'rb') as fileToMatch:
data = fileToMatch.read(16 * 1024)
m = hashlib.md5()
m.update(data)
# compare hash to confirm the match
if m.digest() == hash16k:
return filepath
return None
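The script walks the par2 file packet by packet, skipping everything except FileDesc packets, whose null-padded tail holds the original file name. A Python 3 (bytes-based) sketch of the same parsing, plus a helper that fabricates a packet so the parser can be exercised without a real par2 file (`iter_file_descriptions` and `make_filedesc` are my names; MD5 fields are zeroed for brevity):

```python
import io
import struct

# Same layouts as the script above (par2 spec, little-endian)
PACKET_HEADER = struct.Struct("<8sQ16s16s16s")  # magic, length, md5, set id, type
FILE_DESC = struct.Struct("<16s16s16sQ")        # file id, file md5, md5 of first 16k, length
TYPE_FILE_DESC = b'PAR 2.0\x00FileDesc'

def iter_file_descriptions(blob):
    """Yield (target_name, file_length, hash16k) for every FileDesc
    packet in a par2 byte string, skipping all other packet types."""
    f = io.BytesIO(blob)
    while True:
        header = f.read(PACKET_HEADER.size)
        if len(header) < PACKET_HEADER.size:
            break  # blob fully read
        _, packet_length, _, _, packet_type = PACKET_HEADER.unpack(header)
        body_length = packet_length - PACKET_HEADER.size
        if packet_type != TYPE_FILE_DESC:
            f.seek(body_length, io.SEEK_CUR)  # skip unrelated packet
            continue
        body = f.read(FILE_DESC.size)
        _, _, hash16k, file_length = FILE_DESC.unpack(body)
        # the filename fills the rest of the packet, null-padded
        name = f.read(body_length - FILE_DESC.size).rstrip(b'\0')
        yield name.decode('ascii'), file_length, hash16k

def make_filedesc(name, file_length, hash16k):
    """Build a minimal FileDesc packet for testing (MD5 fields zeroed)."""
    body = FILE_DESC.pack(b'\0' * 16, b'\0' * 16, hash16k, file_length)
    name_bytes = name.encode('ascii')
    name_bytes += b'\0' * (-len(name_bytes) % 4)  # pad to a multiple of 4
    length = PACKET_HEADER.size + len(body) + len(name_bytes)
    header = PACKET_HEADER.pack(b'PAR2\0PKT', length, b'\0' * 16,
                                b'\0' * 16, TYPE_FILE_DESC)
    return header + body + name_bytes
```

Once the name, length, and 16 KiB hash are recovered, matching a candidate file is exactly the `findFile` step: compare sizes first, then hash the first 16 KiB to confirm.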
# Run main program
print_splitter()
print 'SABnzbd version: ', os.environ['SAB_VERSION']
print 'Job location: ', os.environ['SAB_COMPLETE_DIR']
print 'Par2-command: ', par2_command
print_splitter()
# Search for par2 files
@@ -86,34 +164,14 @@ if not matches:
# Run par2 from SABnzbd on them
for par2_file in matches:
# Build command, make it check the whole directory
wildcard = os.path.join(os.environ['SAB_COMPLETE_DIR'], '*')
command = [str(par2_command), 'r', par2_file, wildcard]
# Start command
# Analyse data and analyse result
print_splitter()
print 'Starting command: ', repr(command)
try:
result = subprocess.check_output(command)
except subprocess.CalledProcessError as e:
# Multipar also gives non-zero in case of success
result = e.output
# Show output
print_splitter()
print result
print_splitter()
# Last status-line for the History
# Check if the magic words are there
if 'Repaired successfully' in result or 'All files are correct' in result or \
'Repair complete' in result or 'All Files Complete' in result or 'PAR File(s) Incomplete' in result:
if decodePar(par2_file):
print 'Recursive repair/verify finished.'
run_renamer = False
else:
print 'Recursive repair/verify did not complete!'
# No matches? Then we try to rename the largest file to the job-name
if run_renamer:
print_splitter()

snap/snapcraft.yaml Normal file

@@ -0,0 +1,46 @@
name: sabnzbd
version: git
summary: SABnzbd
description: The automated Usenet download tool
confinement: strict
icon: interfaces/Config/templates/staticcfg/images/logo-small.svg
adopt-info: sabnzbd
version-script: |
grep -oP '(?<=^Version: ).*' PKG-INFO
apps:
sabnzbd:
environment:
LC_CTYPE: C.UTF-8
command: python $SNAP/opt/sabnzbd/SABnzbd.py -f $SNAP_COMMON
daemon: simple
plugs: [network, network-bind, removable-media]
parts:
sabnzbd:
plugin: python
source: .
python-version: python2
python-packages: [cheetah3, cryptography, sabyenc]
build-attributes: [no-system-libraries]
stage-packages:
- to armhf: ["unrar:armhf", "p7zip-full:armhf", "par2:armhf"]
- to arm64: ["unrar:arm64", "p7zip-full:arm64", "par2:arm64"]
- to amd64: ["unrar:amd64", "p7zip-full:amd64", "par2:amd64"]
- to i386: ["unrar:i386", "p7zip-full:i386", "par2:i386"]
build-packages:
- to armhf: ["libffi-dev:armhf", "python-dev:armhf", "libssl-dev:armhf"]
- to arm64: ["libffi-dev:arm64", "python-dev:arm64", "libssl-dev:arm64"]
- to amd64: ["libffi-dev:amd64", "python-dev:amd64", "libssl-dev:amd64"]
- to i386: ["libffi-dev:i386", "python-dev:i386", "libssl-dev:i386"]
override-pull: |
snapcraftctl pull
[ $(git rev-parse --abbrev-ref HEAD) = "master" ] && GRADE=stable || GRADE=devel
snapcraftctl set-grade "$GRADE"
override-build: |
snapcraftctl build
python tools/make_mo.py
mkdir -p $SNAPCRAFT_PART_INSTALL/opt
cp -R $SNAPCRAFT_PART_BUILD $SNAPCRAFT_PART_INSTALL/opt/sabnzbd
organize:
usr/bin/unrar-nonfree: usr/bin/unrar


@@ -1,71 +0,0 @@
#!/usr/bin/python -OO
# Copyright 2007-2018 The SABnzbd-Team <team@sabnzbd.org>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
tests.conftest - Wrappers to start SABnzbd for testing
"""
import os
import itertools
import urllib2
import pytest
import shutil
import time
import testhelper
from xprocess import ProcessStarter
@pytest.fixture(scope='session')
def sabnzbd_connect(request, xprocess):
# Get cache directory
base_path = os.path.dirname(os.path.abspath(__file__))
cache_dir = os.path.join(base_path, 'cache')
# Copy basic config file
try:
os.mkdir(cache_dir)
shutil.copyfile(os.path.join(base_path, 'sabnzbd.basic.ini'), os.path.join(cache_dir, 'sabnzbd.ini'))
except:
pass
class Starter(ProcessStarter):
# Wait for SABnzbd to start
pattern = "ENGINE Bus STARTED"
# Start without browser and with basic logging
args = 'python ../../SABnzbd.py -l1 -s %s:%s -b0 -f %s' % (testhelper.SAB_HOST, testhelper.SAB_PORT, cache_dir)
args = args.split()
# We have to wait a bit longer than default
def filter_lines(self, lines):
return itertools.islice(lines, 500)
# Shut it down at the end
def shutdown_sabnzbd():
# Gracefull shutdown request
testhelper.get_url_result('shutdown')
# Takes a second to shutdown
for x in range(5):
try:
shutil.rmtree(cache_dir)
break
except:
time.sleep(1)
request.addfinalizer(shutdown_sabnzbd)
return xprocess.ensure("sabnzbd", Starter)


@@ -1,8 +1,8 @@
# SAB-Specific
cheetah
cheetah3
cryptography
sabyenc
# Testing
pytest-xprocess
selenium
requests


@@ -1,11 +1,4 @@
__version__ = 19
__encoding__ = utf-8
[misc]
api_key = apikey
[servers]
[[sabnzbd.test]]
enable = 1
host = sabnzd.test
username = sabnzbd
password = sabnzbd
api_key = apikey


@@ -1,61 +0,0 @@
#!/usr/bin/python -OO
# Copyright 2007-2018 The SABnzbd-Team <team@sabnzbd.org>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
tests.test_api_pages - The most basic tests of whether things work
"""
import pytest
import testhelper
def test_basic_api(sabnzbd_connect):
# Basic API test
assert 'queue' in testhelper.get_api_result('queue')
assert 'history' in testhelper.get_api_result('history')
assert 'status' in testhelper.get_api_result('fullstatus')
assert 'config' in testhelper.get_api_result('get_config')
def test_main_pages(sabnzbd_connect):
# See if the basic pages work
assert 'Traceback' not in testhelper.get_url_result()
assert 'Traceback' not in testhelper.get_url_result('history')
assert 'Traceback' not in testhelper.get_url_result('queue')
assert 'Traceback' not in testhelper.get_url_result('status')
def test_wizard_pages(sabnzbd_connect):
# Test if wizard pages work
assert 'Traceback' not in testhelper.get_url_result('wizard')
assert 'Traceback' not in testhelper.get_url_result('wizard/one')
assert 'Traceback' not in testhelper.get_url_result('wizard/two')
def test_config_pages(sabnzbd_connect):
# Test if config pages work
assert 'Traceback' not in testhelper.get_url_result('config')
assert 'Traceback' not in testhelper.get_url_result('config/general')
assert 'Traceback' not in testhelper.get_url_result('config/server')
assert 'Traceback' not in testhelper.get_url_result('config/categories')
assert 'Traceback' not in testhelper.get_url_result('config/switches')
assert 'Traceback' not in testhelper.get_url_result('config/sorting')
assert 'Traceback' not in testhelper.get_url_result('config/notify')
assert 'Traceback' not in testhelper.get_url_result('config/scheduling')
assert 'Traceback' not in testhelper.get_url_result('config/rss')
assert 'Traceback' not in testhelper.get_url_result('config/special')
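The page checks above repeat the same assertion for every URL. A table-driven variant keeps the list of pages in one place; `page_ok` and `check_pages` are hypothetical helpers sketched here, with the page fetcher passed in so nothing depends on a running SABnzbd:

```python
# Hypothetical table-driven variant of the checks above; a page is
# considered healthy if its output contains no traceback.
CONFIG_PAGES = [
    'config', 'config/general', 'config/server', 'config/categories',
    'config/switches', 'config/sorting', 'config/notify',
    'config/scheduling', 'config/rss', 'config/special',
]


def page_ok(page_source):
    """Mirror the 'Traceback' not in ... assertion used above."""
    return 'Traceback' not in page_source


def check_pages(fetch, pages=CONFIG_PAGES):
    """Return the subset of pages whose output looks broken."""
    return [page for page in pages if not page_ok(fetch(page))]
```

In the real tests, `fetch` would be `testhelper.get_url_result`.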

tests/test_functional.py Normal file

@@ -0,0 +1,295 @@
#!/usr/bin/python -OO
# Copyright 2007-2018 The SABnzbd-Team <team@sabnzbd.org>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
tests.test_functional - The most basic tests of whether things work
"""
import unittest
import random
from selenium import webdriver
from selenium.common.exceptions import WebDriverException, NoSuchElementException
from selenium.webdriver.chrome.options import Options as ChromeOptions
from selenium.webdriver.firefox.options import Options as FirefoxOptions
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import WebDriverWait
from testhelper import *
class SABnzbdBaseTest(unittest.TestCase):
@classmethod
def setUpClass(cls):
# We try Chrome, fallback to Firefox
try:
driver_options = ChromeOptions()
# Headless on Appveyor/Travis
if "CI" in os.environ:
driver_options.add_argument("--headless")
driver_options.add_argument("--no-sandbox")
cls.driver = webdriver.Chrome(chrome_options=driver_options)
except WebDriverException:
driver_options = FirefoxOptions()
# Headless on Appveyor/Travis
if "CI" in os.environ:
driver_options.headless = True
cls.driver = webdriver.Firefox(firefox_options=driver_options)
# Get the newsserver-info, if available
if "SAB_NEWSSERVER_HOST" in os.environ:
cls.newsserver_host = os.environ['SAB_NEWSSERVER_HOST']
cls.newsserver_user = os.environ['SAB_NEWSSERVER_USER']
cls.newsserver_password = os.environ['SAB_NEWSSERVER_PASSWORD']
@classmethod
def tearDownClass(cls):
cls.driver.close()
cls.driver.quit()
def no_page_crash(self):
# Basic check that CherryPy did not return an internal server error
self.assertNotIn('500 Internal Server Error', self.driver.title)
def open_page(self, url):
# Open a page and test for crash
self.driver.get(url)
self.no_page_crash()
def scroll_to_top(self):
self.driver.find_element_by_tag_name('body').send_keys(Keys.CONTROL + Keys.HOME)
time.sleep(2)
def wait_for_ajax(self):
wait = WebDriverWait(self.driver, 15)
wait.until(lambda driver_wait: self.driver.execute_script('return jQuery.active') == 0)
wait.until(lambda driver_wait: self.driver.execute_script('return document.readyState') == 'complete')
@unittest.skipIf("SAB_NEWSSERVER_HOST" not in os.environ, "Test-server not specified")
class SABnzbdDownloadFlow(SABnzbdBaseTest):
def test_full(self):
# Wrapper for all the tests in order
self.start_wizard()
# Basic test
self.add_nzb_from_url("http://sabnzbd.org/tests/basic_rar5.nzb", "testfile.bin")
# Unicode test
self.add_nzb_from_url("http://sabnzbd.org/tests/unicode_rar.nzb", u"\u4f60\u597d\u4e16\u754c.bin")
# Unicode test with a missing article
#self.add_nzb_from_url("http://sabnzbd.org/tests/unicode_rar_broken.nzb", u"\u4f60\u597d\u4e16\u754c.bin")
def start_wizard(self):
# Language-selection
self.open_page("http://%s:%s/sabnzbd/wizard/" % (SAB_HOST, SAB_PORT))
self.driver.find_element_by_id("en").click()
self.driver.find_element_by_css_selector('.btn.btn-default').click()
# Fill server-info
self.no_page_crash()
host_inp = self.driver.find_element_by_name("host")
host_inp.clear()
host_inp.send_keys(self.newsserver_host)
username_inp = self.driver.find_element_by_name("username")
username_inp.clear()
username_inp.send_keys(self.newsserver_user)
pass_inp = self.driver.find_element_by_name("password")
pass_inp.clear()
pass_inp.send_keys(self.newsserver_password)
# With SSL
ssl_inp = self.driver.find_element_by_name("ssl")
if not ssl_inp.get_attribute('checked'):
ssl_inp.click()
# Test server-check
self.driver.find_element_by_id("serverTest").click()
self.wait_for_ajax()
self.assertIn("Connection Successful", self.driver.find_element_by_id("serverResponse").text)
# Final page done
self.driver.find_element_by_id("next-button").click()
self.no_page_crash()
self.assertIn("http://%s:%s/sabnzbd" % (SAB_HOST, SAB_PORT), self.driver.find_element_by_class_name("quoteBlock").text)
# Go to SAB!
self.driver.find_element_by_css_selector('.btn.btn-success').click()
self.no_page_crash()
def add_nzb_from_url(self, file_url, file_output):
test_job_name = 'testfile_%s' % random.randint(500, 1000)
self.open_page("http://%s:%s/sabnzbd/" % (SAB_HOST, SAB_PORT))
# Wait for modal to open, add URL
self.driver.find_element_by_css_selector('a[href="#modal-add-nzb"]').click()
time.sleep(1)
self.driver.find_element_by_name("nzbURL").send_keys(file_url)
self.driver.find_element_by_name("nzbname").send_keys(test_job_name)
self.driver.find_element_by_css_selector('form[data-bind="submit: addNZBFromURL"] input[type="submit"]').click()
# Wait up to 2 minutes for the job to complete
for _ in range(120):
try:
# Locate resulting row
result_row = self.driver.find_element_by_xpath('//*[@id="history-tab"]//tr[td//text()[contains(., "%s")]]' % test_job_name)
# Did it complete?
if result_row.find_element_by_css_selector('td.status').text == 'Completed':
break
else:
time.sleep(1)
except NoSuchElementException:
time.sleep(1)
else:
self.fail("Download did not complete")
# Check if the file exists on disk
file_to_find = os.path.join(SAB_COMPLETE_DIR, test_job_name, file_output)
self.assertTrue(os.path.exists(file_to_find), "File not found")
# Shutil can't handle unicode, need to remove the file here
os.remove(file_to_find)
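The completion loop above leans on Python's `for`/`else`: the `else` branch runs only when the loop finishes without `break`, which is exactly the timeout case. A minimal standalone sketch of that pattern (`poll_until` is a hypothetical helper, not part of this test file):

```python
import time


def poll_until(check, attempts=120, interval=1):
    """Poll check() once per interval, like the history-row loop above.

    The else branch of the for loop runs only when no break happened,
    i.e. when every attempt came back unfinished.
    """
    for _ in range(attempts):
        if check():
            break
        time.sleep(interval)
    else:
        return False  # timed out without ever seeing success
    return True
```

In the test itself, `check` corresponds to locating the history row and comparing its status to `Completed`.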
class SABnzbdBasicPagesTest(SABnzbdBaseTest):
def test_base_pages(self):
# Quick-check of all Config pages
test_urls = ['config',
'config/general',
'config/folders',
'config/server',
'config/categories',
'config/switches',
'config/sorting',
'config/notify',
'config/scheduling',
'config/rss',
'config/special']
for test_url in test_urls:
self.open_page("http://%s:%s/%s" % (SAB_HOST, SAB_PORT, test_url))
@unittest.skipIf("SAB_NEWSSERVER_HOST" not in os.environ, "Test-server not specified")
class SABnzbdConfigServers(SABnzbdBaseTest):
server_name = "_SeleniumServer"
def open_config_servers(self):
# Test if base page works
self.open_page("http://%s:%s/sabnzbd/config/server" % (SAB_HOST, SAB_PORT))
self.scroll_to_top()
# Show advanced options
advanced_btn = self.driver.find_element_by_name("advanced-settings-button")
if not advanced_btn.get_attribute('checked'):
advanced_btn.click()
def add_test_server(self):
# Add server
self.driver.find_element_by_id("addServerButton").click()
host_inp = self.driver.find_element_by_name("host")
host_inp.clear()
host_inp.send_keys(self.newsserver_host)
username_inp = self.driver.find_element_by_css_selector("#addServerContent input[data-hide='username']")
username_inp.clear()
username_inp.send_keys(self.newsserver_user)
pass_inp = self.driver.find_element_by_css_selector("#addServerContent input[data-hide='password']")
pass_inp.clear()
pass_inp.send_keys(self.newsserver_password)
# With SSL
ssl_inp = self.driver.find_element_by_name("ssl")
if not ssl_inp.get_attribute('checked'):
ssl_inp.click()
# Check that we filled the right port automatically
self.assertEqual(self.driver.find_element_by_id("port").get_attribute('value'), '563')
# Test server-check
self.driver.find_element_by_css_selector("#addServerContent .testServer").click()
self.wait_for_ajax()
self.assertIn("Connection Successful", self.driver.find_element_by_css_selector('#addServerContent .result-box').text)
# Set test-servername
self.driver.find_element_by_id("displayname").send_keys(self.server_name)
# Add and show details
pass_inp.send_keys(Keys.RETURN)
time.sleep(1)
if not self.driver.find_element_by_id("host0").is_displayed():
self.driver.find_element_by_class_name("showserver").click()
def remove_server(self):
# Remove the first server and accept the confirmation
self.driver.find_element_by_class_name("delServer").click()
self.driver.switch_to.alert.accept()
# Check that it's gone
time.sleep(2)
self.assertNotIn(self.server_name, self.driver.page_source)
def test_add_and_remove_server(self):
self.open_config_servers()
self.add_test_server()
self.remove_server()
def test_empty_bad_password(self):
self.open_config_servers()
self.add_test_server()
# Test server-check with empty password
pass_inp = self.driver.find_elements_by_css_selector("input[data-hide='password']")[1]
pass_inp.clear()
self.driver.find_elements_by_css_selector(".testServer")[1].click()
self.wait_for_ajax()
check_result = self.driver.find_elements_by_css_selector('.result-box')[1].text.lower()
self.assertTrue("authentication failed" in check_result or "invalid username or password" in check_result)
# Test server-check with bad password
pass_inp.send_keys("bad")
self.driver.find_elements_by_css_selector(".testServer")[1].click()
self.wait_for_ajax()
check_result = self.driver.find_elements_by_css_selector('.result-box')[1].text.lower()
self.assertTrue("authentication failed" in check_result or "invalid username or password" in check_result)
# Finish
self.remove_server()
class SABnzbdConfigCategories(SABnzbdBaseTest):
category_name = "testCat"
def test_page(self):
# Test if base page works
self.open_page("http://%s:%s/sabnzbd/config/categories" % (SAB_HOST, SAB_PORT))
# Add new category
self.driver.find_elements_by_name("newname")[1].send_keys(self.category_name)
self.driver.find_element_by_xpath("//button/text()[normalize-space(.)='Add']/parent::*").click()
self.no_page_crash()
self.assertIn(self.category_name, self.driver.page_source)
if __name__ == "__main__":
unittest.main(failfast=True)
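`setUpClass` tries Chrome first and falls back to Firefox when the driver cannot start. Stripped of Selenium, that fallback reduces to trying a list of factories in order until one constructs; `first_working` is a hypothetical helper sketched here with the browsers stubbed out:

```python
def first_working(factories):
    """Return the result of the first factory that constructs without
    raising -- the same Chrome-then-Firefox fallback as setUpClass."""
    errors = []
    for factory in factories:
        try:
            return factory()
        except Exception as exc:
            errors.append(exc)
    raise RuntimeError("no factory succeeded: %r" % errors)
```

In the real suite the factories would be `webdriver.Chrome` and `webdriver.Firefox`, each pre-configured with its headless options.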


@@ -1,58 +0,0 @@
#!/usr/bin/python -OO
# Copyright 2007-2018 The SABnzbd-Team <team@sabnzbd.org>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
tests.test_nzb - Basic NZB adding support
"""
import os
import pytest
import testhelper
# Where are we now?
base_path = os.path.dirname(os.path.abspath(__file__))
def nzo_in_queue(nzo_response):
""" Helper function for checking if file is in queue and then remove it """
queue_res = testhelper.get_api_result('queue')
nzo_id = nzo_response['nzo_ids'][0]
# Was it added?
assert nzo_response['status'] == True
assert queue_res['queue']['slots'][0]['nzo_id'] == nzo_response['nzo_ids'][0]
# Let's remove it
remove_response = testhelper.get_api_result('queue', {'name': 'delete', 'value': nzo_id})
assert remove_response['status'] == True
# Really gone?
queue_res = testhelper.get_api_result('queue')
assert not queue_res['queue']['slots']
def test_addfile(sabnzbd_connect):
# See if basic upload works
nzo_response = testhelper.upload_nzb(os.path.join(base_path, 'data', 'reftestnzb.nzb'))
nzo_in_queue(nzo_response)
def test_addlocalfile(sabnzbd_connect):
# See if basic adding from disk-file works
nzo_response = testhelper.get_api_result('addlocalfile', {'name': os.path.join(base_path, 'data', 'reftestnzb.nzb')})
nzo_in_queue(nzo_response)


@@ -19,12 +19,18 @@
tests.testhelper - Basic helper functions
"""
import urllib2
import json
import os
import shutil
import subprocess
import time
import requests
SAB_HOST = 'localhost'
SAB_PORT = 8081
SAB_BASE_DIR = os.path.dirname(os.path.abspath(__file__))
SAB_CACHE_DIR = os.path.join(SAB_BASE_DIR, 'cache')
SAB_COMPLETE_DIR = os.path.join(SAB_CACHE_DIR, 'Downloads', 'complete')
def get_url_result(url=''):
@@ -41,8 +47,57 @@ def get_api_result(mode, extra_arguments={}):
return r.json()
-def upload_nzb(file):
+def upload_nzb(filename):
""" Upload file and return nzo_id response """
-files = {'name': open(file, 'rb')}
-arguments ={'apikey':'apikey', 'mode':'addfile', 'output': 'json'}
+files = {'name': open(filename, 'rb')}
+arguments = {'apikey': 'apikey', 'mode': 'addfile', 'output': 'json'}
return requests.post('http://%s:%s/api' % (SAB_HOST, SAB_PORT), files=files, data=arguments).json()
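`upload_nzb` performs a multipart POST against the SABnzbd API. The request it assembles can be sketched without the network call or the file handle; `build_addfile_request` is a hypothetical helper that just builds the URL and form fields the function posts:

```python
def build_addfile_request(host, port, apikey):
    """Build the URL and form fields that upload_nzb() posts.

    Hypothetical helper: upload_nzb itself calls requests.post directly,
    attaching the NZB as the multipart 'name' file alongside these fields.
    """
    url = 'http://%s:%s/api' % (host, port)
    data = {'apikey': apikey, 'mode': 'addfile', 'output': 'json'}
    return url, data
```

Separating request construction from transport like this makes the field names testable without a running server.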
def setUpModule():
# Remove cache if already there
if os.path.isdir(SAB_CACHE_DIR):
shutil.rmtree(SAB_CACHE_DIR)
# Copy basic config file with API key
os.mkdir(SAB_CACHE_DIR)
shutil.copyfile(os.path.join(SAB_BASE_DIR, 'sabnzbd.basic.ini'), os.path.join(SAB_CACHE_DIR, 'sabnzbd.ini'))
# Check if we have language files
if not os.path.exists(os.path.join(SAB_BASE_DIR, '..', 'locale')):
lang_command = 'python %s/../tools/make_mo.py' % SAB_BASE_DIR
subprocess.Popen(lang_command.split())
# Start SABnzbd
sab_command = 'python %s/../SABnzbd.py --new -l2 -s %s:%s -b0 -f %s' % (SAB_BASE_DIR, SAB_HOST, SAB_PORT, SAB_CACHE_DIR)
subprocess.Popen(sab_command.split())
# Wait for SAB to respond
for _ in range(10):
try:
get_url_result()
# Woohoo, we're up!
break
except requests.ConnectionError:
time.sleep(1)
else:
# Make sure we clean up
tearDownModule()
raise requests.ConnectionError()
def tearDownModule():
# Graceful shutdown request
try:
get_url_result('shutdown')
except requests.ConnectionError:
pass
# Takes a second to shut down
for x in range(10):
try:
shutil.rmtree(SAB_CACHE_DIR)
break
except OSError:
print "Unable to remove cache dir (try %d)" % x
time.sleep(1)


@@ -27,7 +27,7 @@ import re
f = open('sabnzbd/version.py')
code = f.read()
f.close()
-exec(code)
+exec code
# Fixed information for the POT header
HEADER = r'''#
@@ -53,7 +53,7 @@ EMAIL_DIR = 'email'
DOMAIN = 'SABnzbd'
DOMAIN_EMAIL = 'SABemail'
DOMAIN_NSIS = 'SABnsis'
-PARMS = '-d %s -p %s -k T -k Ta -k TT -o %s.pot.tmp' % (DOMAIN, PO_DIR, DOMAIN)
+PARMS = '-d %s -p %s -w500 -k T -k Ta -k TT -o %s.pot.tmp' % (DOMAIN, PO_DIR, DOMAIN)
FILES = 'SABnzbd.py SABHelper.py SABnzbdDelegate.py sabnzbd/*.py sabnzbd/utils/*.py'
FILE_CACHE = {}
@@ -108,8 +108,11 @@ def get_context(line):
item = item.split(':')[0]
if context:
-newlines.append('%s [%s]' % (item, context))
-else:
+# Format context
+item = '%s [%s]' % (item, context)
+# Only add new texts
+if item not in newlines:
+newlines.append(item)
return '#: ' + ' # '.join(newlines) + '\n'
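The patched branch above formats each source reference with its translation context and skips duplicates before joining everything into a single `#:` comment line. A self-contained sketch of that behaviour (`format_refs` is a hypothetical name; the real code builds `newlines` incrementally inside `get_context`):

```python
def format_refs(items):
    """Join (reference, context) pairs into one '#:' comment line.

    Each reference with a non-empty context is annotated as 'ref [ctx]',
    and duplicate entries are added only once, as in the patch above.
    """
    newlines = []
    for item, context in items:
        if context:
            # Format context
            item = '%s [%s]' % (item, context)
        # Only add new texts
        if item not in newlines:
            newlines.append(item)
    return '#: ' + ' # '.join(newlines) + '\n'
```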


@@ -213,7 +213,6 @@ def make_templates():
def patch_nsis():
""" Patch translation into the NSIS script """
-RE_NSIS = re.compile(r'^(\s*LangString\s+\w+\s+\$\{LANG_)(\w+)\}\s+(".*)', re.I)
+RE_NSIS = re.compile(r'^(\s*LangString\s+)(\w+)(\s+\$\{LANG_)(\w+)\}\s+(".*)', re.I)
languages = [os.path.split(path)[1] for path in glob.glob(os.path.join(MO_DIR, '*'))]


@@ -69,7 +69,7 @@ def set_connection_info(url, user=True):
try:
hive = _winreg.ConnectRegistry(None, section)
try:
-key = _winreg.CreateKey(hive, keypath)
+_winreg.CreateKey(hive, keypath)
except:
pass
key = _winreg.OpenKey(hive, keypath)
@@ -80,7 +80,6 @@ def set_connection_info(url, user=True):
except WindowsError:
if user:
set_connection_info(url, user=False)
-pass
finally:
_winreg.CloseKey(hive)
@@ -96,7 +95,6 @@ def del_connection_info(user=True):
except WindowsError:
if user:
del_connection_info(user=False)
-pass
finally:
_winreg.CloseKey(hive)
@@ -105,7 +103,7 @@ def get_install_lng():
""" Return language-code used by the installer """
lng = 0
try:
-hive = _winreg.ConnectRegistry(None, _winreg.HKEY_LOCAL_MACHINE)
+hive = _winreg.ConnectRegistry(None, _winreg.HKEY_CURRENT_USER)
key = _winreg.OpenKey(hive, r"Software\SABnzbd")
for i in range(0, _winreg.QueryInfoKey(key)[1]):
name, value, val_type = _winreg.EnumValue(key, i)
@@ -116,7 +114,31 @@ def get_install_lng():
pass
finally:
_winreg.CloseKey(hive)
-return lng
+if lng in LanguageMap:
+return LanguageMap[lng]
+return 'en'
# Map from NSIS-codepage to our language-strings
LanguageMap = {
'1033': 'en',
'1036': 'fr',
'1031': 'de',
'1043': 'nl',
'1035': 'fi',
'1045': 'pl',
'1053': 'sv',
'1030': 'da',
'2068': 'nb',
'1048': 'ro',
'1034': 'es',
'1046': 'pr_BR',
'3098': 'sr',
'1037': 'he',
'1049': 'ru',
'2052': 'zh_CN'
}
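With this change `get_install_lng` no longer returns the raw NSIS language code: the code is looked up in `LanguageMap` and falls back to `'en'` when unknown. A minimal sketch of that lookup, using a subset of the map and a hypothetical helper name:

```python
# Subset of the NSIS codepage-to-locale map above, for illustration only.
LANGUAGE_MAP = {
    '1033': 'en',
    '1036': 'fr',
    '1043': 'nl',
    '2052': 'zh_CN',
}


def lcid_to_locale(lcid, language_map=LANGUAGE_MAP):
    """Return the locale for an NSIS language code, falling back to 'en'.

    The registry may hand back the code as an int or a string, so it is
    normalised to a string before the lookup.
    """
    return language_map.get(str(lcid), 'en')
```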
if __name__ == '__main__':


@@ -19,7 +19,6 @@
sabnzbd.mailslot - Mailslot communication
"""
import os
from win32file import GENERIC_WRITE, FILE_SHARE_READ, \
OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL
from ctypes import c_uint, c_buffer, byref, sizeof, windll


Binary file not shown.
