Compare commits

...

83 Commits

Author SHA1 Message Date
Safihre
e8d6eebb04 Set version to 3.1.1 2020-11-11 22:04:44 +01:00
Safihre
864c5160c0 Merge branch '3.1.x' 2020-11-11 22:01:20 +01:00
Safihre
99b5a00c12 Update text files for 3.1.1 2020-11-11 21:56:15 +01:00
Safihre
85ee1f07d7 Do not crash if we cannot format the error message 2020-11-08 15:06:50 +01:00
exizak42
e58b4394e0 Separate email message lines with CRLF (#1671)
The SMTP protocol dictates that all lines be separated with CRLF, not LF
(even on LF-based systems). This change ensures that even if the original
byte-string message uses `\n` as line separator, the SMTP protocol will
still work properly.

This resolves sabnzbd#1669

Fix code formatting
2020-11-08 14:44:44 +01:00
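As a minimal illustration (a sketch, not the actual SABnzbd patch), the
standard-library email module can produce the CRLF endings SMTP expects by
serializing the message with a policy whose linesep is "\r\n":

    from email.message import EmailMessage

    # Hypothetical example: the body is written with plain "\n" breaks.
    msg = EmailMessage()
    msg["From"] = "sabnzbd@example.org"   # placeholder addresses
    msg["To"] = "user@example.org"
    msg["Subject"] = "Job finished"
    msg.set_content("First line\nSecond line\n")

    lf_bytes = msg.as_bytes()  # default policy: "\n" line endings
    crlf_bytes = msg.as_bytes(policy=msg.policy.clone(linesep="\r\n"))  # CRLF
    print(b"\r\n" in lf_bytes, b"\r\n" in crlf_bytes)  # False True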
Safihre
1e91a57bf1 It was not possible to set directory-settings to empty values 2020-11-06 16:14:53 +01:00
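The underlying cause (see the OptionDir diff further down) is a truthiness
check: an empty string is falsy, so `if value:` silently skipped the update.
A tiny, hypothetical illustration of the difference between the two checks:

    def accepts_update(value, current="downloads"):
        # Old check: "" is falsy, so clearing the setting was ignored.
        old_check = bool(value and value != current)
        # New check: only None is ignored, so "" is accepted and clears the value.
        new_check = value is not None and value != current
        return old_check, new_check

    print(accepts_update(""))       # (False, True)
    print(accepts_update(None))     # (False, False)
    print(accepts_update("/data"))  # (True, True)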
Safihre
39cee52a7e Update text files for 3.1.1RC1 2020-11-02 20:03:43 +01:00
Safihre
72068f939d Improve handling of binary restarts (macOS / Windows) 2020-11-02 19:57:57 +01:00
Safihre
096d0d3cad Deobfuscate-during-download did not work
https://forums.sabnzbd.org/viewtopic.php?f=3&t=25037
2020-11-01 15:35:09 +01:00
Safihre
2472ab0121 Python 3.5 does not know ssl.PROTOCOL_TLS_SERVER
Closes #1658
2020-10-27 15:52:28 +01:00
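For context (a sketch, not the exact patch): ssl.PROTOCOL_TLS_SERVER only
exists on Python 3.6+, so on 3.5 the SSL context has to be built from an
older constant, which is what the diff below does with PROTOCOL_TLSv1_2.
A version-tolerant variant could look like this (fallback logic assumed):

    import ssl

    # Prefer PROTOCOL_TLS_SERVER (Python 3.6+), fall back to PROTOCOL_TLSv1_2.
    protocol = getattr(ssl, "PROTOCOL_TLS_SERVER", ssl.PROTOCOL_TLSv1_2)
    context = ssl.SSLContext(protocol)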
Safihre
00421717b8 Queue Repair would fail if Rating is enabled
Closes #1649
2020-10-24 11:10:03 +02:00
Safihre
ae96d93f94 Set version to 3.1.0 2020-10-16 17:02:28 +02:00
Safihre
8522c40c8f Merge branch '3.1.x' 2020-10-16 16:58:58 +02:00
Safihre
23f86e95f1 Update text files for 3.1.0 2020-10-16 16:42:35 +02:00
Safihre
eed2045189 After pre-check the job was not restored to the original spot 2020-10-16 16:27:51 +02:00
Safihre
217785bf0f Applying Filters to a feed would result in crash
Closes #1634
2020-10-15 18:07:06 +02:00
Safihre
6aef50dc5d Update text files for 3.1.0RC3 2020-10-02 11:34:21 +02:00
Safihre
16b6e3caa7 Notify users of Deobfuscate.py that it is now part of SABnzbd 2020-09-29 14:08:51 +02:00
Safihre
3de4c99a8a Only set the "Waiting" status when the job hits post-processing
https://forums.sabnzbd.org/viewtopic.php?f=11&t=24969
2020-09-29 13:51:15 +02:00
Safihre
980aa19a75 Only run Windows Service code when executed from the executables
Could be made to work with the from-sources code, but that seems like a very small use case.
Closes #1623
2020-09-29 10:42:23 +02:00
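Roughly, the guard added in the diff below relies on the sys.frozen marker
that the packaged executables set (hypothetical helper name; the pywin32
session check is noted in a comment):

    import sys

    def should_handle_windows_service():
        """Only attempt Windows Service dispatch when running from the
        frozen executables; from-source runs skip it entirely."""
        if not hasattr(sys, "frozen"):
            return False
        # The real check also requires session 0, via
        # win32ts.ProcessIdToSessionId(win32api.GetCurrentProcessId()) == 0.
        return True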
Safihre
fb4b57e056 Update text files for 3.1.0RC2 2020-09-27 17:19:34 +02:00
Safihre
03638365ea Set execute bit on Deobfuscate.py 2020-09-27 17:17:30 +02:00
Safihre
157cb1c83d Handle failing RSS-feeds for feedparser 6.0.0+
Closes #1621
Now throws warnings (which can be disabled via helpful_warnings) if the readout fails.
2020-09-27 13:32:38 +02:00
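Since feedparser 6 raises exceptions that version 5 used to swallow into
bozo_exception, the readout is now wrapped; a sketch of the approach shown
in the RSS diff further down (helper name is hypothetical):

    import feedparser  # feedparser 6.0.0+

    def read_feed_entries(uri):
        feed_parsed = {}
        try:
            feed_parsed = feedparser.parse(uri)
        except Exception as exc:
            # Feedparser 5 caught errors itself; 6 raises them, so store the
            # exception where bozo_exception would normally end up.
            feed_parsed["bozo_exception"] = exc
        entries = feed_parsed.get("entries", [])
        if not entries and feed_parsed.get("bozo_exception"):
            # SABnzbd reports this as a warning that helpful_warnings can silence.
            print("Feed readout failed:", feed_parsed["bozo_exception"])
        return entries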
Safihre
e51f11c2b1 Do not crash if attributes file is not present 2020-09-25 10:50:19 +02:00
Safihre
1ad0961dd8 Existing files were not parsed when re-adding a job 2020-09-25 10:49:50 +02:00
Safihre
46ff7dd4e2 Do not crash if we can't save attributes, the job might be gone 2020-09-25 10:03:05 +02:00
Safihre
8b067df914 Correctly parse failed_only for Plush 2020-09-23 16:56:57 +02:00
Safihre
ef43b13272 Assume RarFile parses the correct filepaths for the RAR-volumes
Parsing UTF8 from command-line still fails.
https://forums.sabnzbd.org/viewtopic.php?p=122267#p122267
2020-09-21 22:12:43 +02:00
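The merge logic this changes (see the rar_volumelist diff below) now treats
RarFile's parsed list as the primary source and only backfills volumes it
missed; roughly:

    import os

    def merge_rar_volumes(zf_volumes, known_volumes):
        """Sketch: keep RarFile's own volume list and append any known volume
        not already present, compared by basename. The real code also wraps
        appended paths in long_path()."""
        zf_volumes_base = [os.path.basename(vol) for vol in zf_volumes]
        for known_volume in known_volumes:
            if os.path.basename(known_volume) not in zf_volumes_base:
                zf_volumes.append(known_volume)
        return zf_volumes

    print(merge_rar_volumes(["job.part01.rar"], ["job.part01.rar", "job.part02.rar"]))
    # ['job.part01.rar', 'job.part02.rar']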
Safihre
e8e9974224 work_name would not be sanitized when adding NZBs
Closes #1615
Now with tests, yeah.
2020-09-21 22:12:34 +02:00
Safihre
feebbb9f04 Merge branch '3.0.x' 2020-09-13 16:40:43 +02:00
Safihre
bc4f06dd1d Limit feedparser<6.0.0 for 3.0.x 2020-09-13 16:40:14 +02:00
Safihre
971e4fc909 Merge branch '3.0.x' 2020-08-30 20:58:31 +02:00
Safihre
51cc765949 Update text files for 3.0.2 2020-08-30 20:50:45 +02:00
Safihre
19c6a4fffa Propagation delay label was shown even if no delay was activated 2020-08-29 16:46:16 +02:00
Safihre
105ac32d2f Reading RSS feed with no categories set could result in crash
Closes #1589
2020-08-28 10:16:49 +02:00
Safihre
57550675d2 Removed logging in macOS sabApp that resulted in double logging 2020-08-28 10:16:41 +02:00
Safihre
e674abc5c0 Update text files for 3.0.2RC2 2020-08-26 08:56:29 +02:00
Safihre
f965c96f51 Change the macOS power assertion to NoIdleSleep 2020-08-26 08:50:54 +02:00
Safihre
c76b8ed9e0 End-of-queue-script did not run on Windows due to long-path
https://forums.sabnzbd.org/viewtopic.php?f=3&t=24918

Will refactor this so they all call 1 function.
2020-08-24 11:28:14 +02:00
Safihre
4fbd0d8a7b Check if name is a string before switching to nzbfile in addfile
Closes #1584
2020-08-24 09:05:25 +02:00
Safihre
2186c0fff6 Update text files for 3.0.2 RC 1 2020-08-21 15:42:35 +02:00
Safihre
1adca9a9c1 Do not crash if certifi certificates are not available
This could happen on Windows, due to overactive virus scanners
2020-08-21 15:26:06 +02:00
Safihre
9408353f2b Priority was not parsed correctly if supplied as string 2020-08-21 15:12:09 +02:00
Safihre
84f4d453d2 Permissions would be set even if user didn't set any
Windows developers like me shouldn't do permissions stuff..
2020-08-21 15:12:01 +02:00
Safihre
d10209f2a1 Extend tests of create_all_dirs to cover apply_umask=False 2020-08-21 15:11:53 +02:00
Safihre
3ae149c72f Split the make_mo.py command for NSIS 2020-08-19 22:21:02 +02:00
Safihre
47385acc3b Make sure we force the final_name to string on legacy get_attrib_file 2020-08-19 16:21:13 +02:00
Safihre
814eeaa900 Redesigned the saving of attributes
Now uses pickle, so that the type of the property is preserved.
Made flexible, so that more properties can be easily added later.
Closes #1575
2020-08-19 16:21:07 +02:00
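A minimal sketch of the pickle-based approach (illustrative only; the
attribute names follow the subset visible in the diff below):

    import pickle

    NZO_ATTRIBS = ("final_name", "priority", "password", "url")

    def save_attribs(nzo, path):
        # Pickle preserves Python types, so an int priority stays an int and
        # a missing password stays None instead of becoming the string "None".
        attribs = {attrib: getattr(nzo, attrib, None) for attrib in NZO_ATTRIBS}
        with open(path, "wb") as attrib_file:
            pickle.dump(attribs, attrib_file)

    def load_attribs(path):
        try:
            with open(path, "rb") as attrib_file:
                return pickle.load(attrib_file)
        except OSError:
            return None  # the attributes file may be missing; callers handle that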
Safihre
5f2ea13aad NzbFile comparison could crash when comparing finished_files
https://forums.sabnzbd.org/viewtopic.php?f=3&t=24902&p=121748
2020-08-19 08:50:06 +02:00
Safihre
41ca217931 Merge branch '3.0.x' 2020-08-18 11:05:50 +02:00
Safihre
b57d36e8dd Set version information to 3.0.1 2020-08-18 11:05:36 +02:00
Safihre
9a4be70734 List Cheetah minimal version in requirements.txt 2020-08-18 08:21:20 +02:00
Safihre
a8443595a6 Generalize use of certifi module 2020-08-18 08:20:47 +02:00
Safihre
fd0a70ac58 Update text files for 3.0.1 2020-08-17 16:52:23 +02:00
Safihre
8a8685c968 Permissions should only be applied if requested
Corrects 050b925f7b
2020-08-16 18:28:39 +02:00
Safihre
9e6cb8da8e Temporarily set cheroot version due to it breaking our tests
cherrypy/cheroot/issues/312
2020-08-16 18:28:13 +02:00
Safihre
054ec54d51 Basic authentication option was broken
Closes #1571
2020-08-10 15:34:01 +02:00
Safihre
272ce773cb Update text files for 3.0.1RC1 2020-08-07 15:28:11 +02:00
Safihre
050b925f7b Permissions were not set correctly when creating directories (#1568)
Restores changes made in d2e0ebe
2020-08-07 15:22:53 +02:00
Safihre
0087940898 Merge branch '3.0.x' into master 2020-08-02 09:46:41 +02:00
Safihre
e323c014f9 Set version information to 3.0.0 2020-08-01 16:17:08 +02:00
Safihre
cc465c7554 Update text files for 3.0.0
🎉🎉
2020-08-01 15:59:30 +02:00
Safihre
14cb37564f Update translate-link in SABnzbd 2020-07-19 13:01:39 +02:00
Safihre
094db56c3b Default-text for Automatically sort queue 2020-07-16 22:29:02 +02:00
Safihre
aabb709b8b Update text files for 3.0.0 RC 2 2020-07-15 14:10:35 +02:00
Safihre
0833dd2db9 Update translatable texts in 3.0.x branch 2020-07-15 14:07:21 +02:00
Safihre
cd3f912be4 RAR-renamer should be run on badly named RAR-files
https://forums.sabnzbd.org/viewtopic.php?f=2&t=24514&p=121433
2020-07-15 14:01:48 +02:00
Safihre
665c516db6 Only really run pre-script when it is set 2020-07-12 14:20:18 +02:00
Safihre
b670da9fa0 Always use Default-priority when creating NZB-objects
Closes #1552
2020-07-12 14:03:07 +02:00
Safihre
80bee9bffe Search-icon would be shown on top of drop-downs
Closes #1545
2020-06-30 12:57:28 +02:00
Safihre
d85a70e8ad Always report API paused status as a boolean
Closes #1542
2020-06-30 10:26:34 +02:00
Safihre
8f21533e76 Set version to 2.3.9 2019-05-24 11:39:14 +02:00
Safihre
89996482a1 Merge branch '2.3.x' 2019-05-24 09:33:12 +02:00
Safihre
03c10dce91 Update text files for 2.3.9 2019-05-24 09:32:34 +02:00
Safihre
bd5331be05 Merge branch 'develop' into 2.3.x 2019-05-24 09:12:02 +02:00
Safihre
46e1645289 Correct typo in release notes 2019-05-18 10:56:39 +02:00
Safihre
4ce3965747 Update text files for 2.3.9RC2 2019-05-18 09:56:05 +02:00
Safihre
9d4af19db3 Merge branch 'develop' into 2.3.x 2019-05-18 09:45:20 +02:00
Safihre
48e034f4be Update text files for 2.3.9RC1 2019-05-07 13:50:20 +02:00
Safihre
f8959baa2f Revert "Notify develop-users that we will switch to Python 3"
This reverts commit fb238af7de.
2019-05-07 13:35:13 +02:00
Safihre
8ed5997eae Merge branch 'develop' into 2.3.x 2019-05-07 13:10:10 +02:00
Safihre
daf9f50ac8 Set version to 2.3.8 2019-03-18 11:10:56 +01:00
Safihre
6b11013c1a Merge branch '2.3.x' 2019-03-18 11:09:35 +01:00
18 changed files with 167 additions and 107 deletions

View File

@@ -1,7 +1,7 @@
Metadata-Version: 1.0
Name: SABnzbd
Version: 3.1.0RC1
Summary: SABnzbd-3.1.0RC1
Version: 3.1.1
Summary: SABnzbd-3.1.1
Home-page: https://sabnzbd.org
Author: The SABnzbd Team
Author-email: team@sabnzbd.org

View File

@@ -1,24 +1,29 @@
Release Notes - SABnzbd 3.1.0 Release Candidate 1
Release Notes - SABnzbd 3.1.1
=========================================================
## Changes and bugfixes since 3.1.0 Beta 2
- Deobfuscate final filenames can now be used when job folders are disabled.
- Deobfuscate final filenames will ignore blu-ray disc files.
- Clear error if Complete Folder is set as a subfolder of the Temporary Folder.
- Filtering of history by category would not filter jobs in post-processing.
## Changes and bugfixes since 3.1.0
- Enforce CRLF line endings on outgoing email messages.
- Queue Repair would fail if Rating is enabled.
- It was not possible to set directory-settings to empty values.
- Deobfuscate-during-download was not triggered.
- Failed to start on Python 3.5 with HTTPS enabled.
- Could show traceback when formatting error/warnings messages.
- Windows/macOS: improve handling of program restart.
## Changes since 3.0.2
- Added option to automatically deobfuscate final filenames: after unpacking,
detect and rename obfuscated or meaningless filenames to the job name,
similar to the Deobfuscate.py post-processing script.
similar to the `Deobfuscate.py` post-processing script.
- Switched to Transifex as our translations platform:
Help us translate SABnzbd in your language! Add untranslated texts or
improved existing translations here: https://sabnzbd.org/wiki/translate
- Redesigned job availability-check to be more efficient and reliable.
- Scheduled readouts of RSS-feeds would fail silently; they now show a warning.
- Skip repair on Retry if all sets were previously successfully verified.
- Passwords included in the filename no longer have to be at the end.
- Restore limit on length of foldernames (`max_foldername_length`).
- Added password input box on the Add NZB screen.
- Clear error if `Complete Folder` is set as a subfolder of the `Temporary Folder`.
- Show warning that Python 3.5 support will be dropped after 3.1.0.
- Windows/macOS: update UnRar to 5.91 and MultiPar to 1.3.1.0.
- Windows: retry `Access Denied` when renaming files on Windows.
@@ -27,12 +32,17 @@ Release Notes - SABnzbd 3.1.0 Release Candidate 1
- Assembler crashes could occur due to race condition in `ArticleCache`.
- On HTTP-redirects the scheme/hostname/port were ignored when behind a proxy.
- Strip slash off the end of `url_base` as it could break other code.
- `Temporary Folder` with unicode characters could result in duplicate unpacking.
- Unpacking with a relative folder set for a category could fail.
- Existing files were not parsed when retrying a job.
- Reading attributes when retrying a job could result in crash.
- Paused priority of pre-queue script was ignored.
- Duplicate Detection did not check filenames in History.
- Downloaded bytes could show as exceeding the total bytes of a job.
- Filtering of history by category would not filter jobs in post-processing.
- Windows: non-Latin languages were displayed incorrectly in the installer.
- Windows: could fail to create folders on some network shares.
- Windows: folders could end in a period, breaking Windows Explorer.
## Upgrade notices
- Jobs that failed on versions before 3.1.x, will throw an error about the

View File

@@ -125,17 +125,23 @@ class GUIHandler(logging.Handler):
def emit(self, record):
""" Emit a record by adding it to our private queue """
# If % is part of the msg, this could fail
try:
record_msg = record.msg % record.args
except TypeError:
record_msg = record.msg + str(record.args)
if record.levelname == "WARNING":
sabnzbd.LAST_WARNING = record.msg % record.args
sabnzbd.LAST_WARNING = record_msg
else:
sabnzbd.LAST_ERROR = record.msg % record.args
sabnzbd.LAST_ERROR = record_msg
if len(self.store) >= self.size:
# Lose the oldest record
self.store.pop(0)
try:
# Append traceback, if available
warning = {"type": record.levelname, "text": record.msg % record.args, "time": int(time.time())}
warning = {"type": record.levelname, "text": record_msg, "time": int(time.time())}
if record.exc_info:
warning["text"] = "%s\n%s" % (warning["text"], traceback.format_exc())
self.store.append(warning)
@@ -1287,7 +1293,7 @@ def main():
sabnzbd.cfg.enable_https.set(False)
# So the cert and key files do exist, now let's check if they are valid:
trialcontext = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
trialcontext = ssl.SSLContext(ssl.PROTOCOL_TLSv1_2)
try:
trialcontext.load_cert_chain(https_cert, https_key)
logging.info("HTTPS keys are OK")
@@ -1530,6 +1536,7 @@ def main():
# Check for auto-restart request
# Or special restart cases like Mac and WindowsService
if sabnzbd.TRIGGER_RESTART:
logging.info("Performing triggered restart")
# Shutdown
sabnzbd.shutdown_program()
@@ -1548,7 +1555,7 @@ def main():
my_name = sabnzbd.MY_FULLNAME.replace("/Contents/MacOS/SABnzbd", "")
my_args = " ".join(sys.argv[1:])
cmd = 'kill -9 %s && open "%s" --args %s' % (my_pid, my_name, my_args)
logging.info("Launching: ", cmd)
logging.info("Launching: %s", cmd)
os.system(cmd)
elif sabnzbd.WIN_SERVICE:
# Use external service handler to do the restart
@@ -1659,7 +1666,8 @@ def handle_windows_service():
"""
# Detect if running as Windows Service (only Vista and above!)
# Adapted from https://stackoverflow.com/a/55248281/5235502
if win32ts.ProcessIdToSessionId(win32api.GetCurrentProcessId()) == 0:
# Only works when run from the exe-files
if hasattr(sys, "frozen") and win32ts.ProcessIdToSessionId(win32api.GetCurrentProcessId()) == 0:
servicemanager.Initialize()
servicemanager.PrepareToHostSingle(SABnzbd)
servicemanager.StartServiceCtrlDispatcher()

View File

@@ -465,15 +465,6 @@ def trigger_restart(timeout=None):
if timeout:
time.sleep(timeout)
# Add extra arguments
if sabnzbd.downloader.Downloader.do.paused:
sabnzbd.RESTART_ARGS.append("-p")
sys.argv = sabnzbd.RESTART_ARGS
# Stop all services
sabnzbd.halt()
cherrypy.engine.exit()
if sabnzbd.WIN32:
# Remove connection info for faster restart
del_connection_info()
@@ -482,6 +473,15 @@ def trigger_restart(timeout=None):
if hasattr(sys, "frozen"):
sabnzbd.TRIGGER_RESTART = True
else:
# Add extra arguments
if sabnzbd.downloader.Downloader.do.paused:
sabnzbd.RESTART_ARGS.append("-p")
sys.argv = sabnzbd.RESTART_ARGS
# Stop all services
sabnzbd.halt()
cherrypy.engine.exit()
# Do the restart right now
cherrypy.engine._do_execv()

View File

@@ -1746,8 +1746,8 @@ def build_history(start=0, limit=0, search=None, failed_only=0, categories=None)
# Un-reverse the queue
items.reverse()
# Global check if rating is enabled
rating_enabled = cfg.rating_enable()
# Global check if rating is enabled and available (queue-repair)
rating_enabled = cfg.rating_enable() and Rating.do
for item in items:
item["size"] = to_units(item["bytes"], "B")

View File

@@ -236,7 +236,7 @@ class OptionDir(Option):
'create' means try to create (but don't set permanent create flag)
"""
error = None
if value and (value != self.get() or create):
if value is not None and (value != self.get() or create):
value = value.strip()
if self.__validation:
error, value = self.__validation(self.__root, value, super().default())

View File

@@ -151,7 +151,7 @@ class Status:
GRABBING = "Grabbing" # Q: Getting an NZB from an external site
MOVING = "Moving" # PP: Files are being moved
PAUSED = "Paused" # Q: Job is paused
QUEUED = "Queued" # Q: Job is waiting for its turn to download
QUEUED = "Queued" # Q: Job is waiting for its turn to download or post-process
QUICK_CHECK = "QuickCheck" # PP: QuickCheck verification is running
REPAIRING = "Repairing" # PP: Job is being repaired (by par2)
RUNNING = "Running" # PP: User's post processing script is running

View File

@@ -27,6 +27,7 @@ import glob
from Cheetah.Template import Template
from email.message import EmailMessage
from email import policy
from sabnzbd.constants import *
import sabnzbd
@@ -296,4 +297,4 @@ def _prepare_message(txt):
msg[keyword] = value
msg.set_content("\n".join(payload))
return msg.as_bytes()
return msg.as_bytes(policy=msg.policy.clone(linesep="\r\n"))

View File

@@ -1018,16 +1018,13 @@ class QueuePage:
class HistoryPage:
def __init__(self, root):
self.__root = root
self.__failed_only = False
@secured_expose
def index(self, **kwargs):
start = int_conv(kwargs.get("start"))
limit = int_conv(kwargs.get("limit"))
search = kwargs.get("search")
failed_only = kwargs.get("failed_only")
if failed_only is None:
failed_only = self.__failed_only
failed_only = int_conv(kwargs.get("failed_only"))
history = build_header()
history["failed_only"] = failed_only

View File

@@ -1976,8 +1976,9 @@ def create_env(nzo=None, extra_env_fields={}):
def rar_volumelist(rarfile_path, password, known_volumes):
"""Extract volumes that are part of this rarset
and merge them with existing list, removing duplicates
"""List volumes that are part of this rarset
and merge them with parsed paths list, removing duplicates.
We assume RarFile is right and use parsed paths as backup.
"""
# UnRar is required to read some RAR files
# RarFile can fail in special cases
@@ -1996,12 +1997,12 @@ def rar_volumelist(rarfile_path, password, known_volumes):
zf_volumes = []
# Remove duplicates
known_volumes_base = [os.path.basename(vol) for vol in known_volumes]
for zf_volume in zf_volumes:
if os.path.basename(zf_volume) not in known_volumes_base:
zf_volumes_base = [os.path.basename(vol) for vol in zf_volumes]
for known_volume in known_volumes:
if os.path.basename(known_volume) not in zf_volumes_base:
# Long-path notation just to be sure
known_volumes.append(long_path(zf_volume))
return known_volumes
zf_volumes.append(long_path(known_volume))
return zf_volumes
# Sort the various RAR filename formats properly :\

View File

@@ -204,18 +204,24 @@ class NzbQueue:
return nzo_id
@NzbQueueLocker
def send_back(self, nzo):
def send_back(self, old_nzo):
""" Send back job to queue after successful pre-check """
try:
nzb_path = globber_full(nzo.workpath, "*.gz")[0]
nzb_path = globber_full(old_nzo.workpath, "*.gz")[0]
except:
logging.info("Failed to find NZB file after pre-check (%s)", nzo.nzo_id)
logging.info("Failed to find NZB file after pre-check (%s)", old_nzo.nzo_id)
return
# Need to remove it first, otherwise it might still be downloading
self.remove(nzo, add_to_history=False, cleanup=False)
res, nzo_ids = process_single_nzb(nzo.filename, nzb_path, keep=True, reuse=nzo.downpath, nzo_id=nzo.nzo_id)
# Store old position and create new NZO
old_position = self.__nzo_list.index(old_nzo)
res, nzo_ids = process_single_nzb(
old_nzo.filename, nzb_path, keep=True, reuse=old_nzo.downpath, nzo_id=old_nzo.nzo_id
)
if res == 0 and nzo_ids:
# Swap to old position
new_nzo = self.get_nzo(nzo_ids[0])
self.__nzo_list.remove(new_nzo)
self.__nzo_list.insert(old_position, new_nzo)
# Reset reuse flag to make pause/abort on encryption possible
self.__nzo_table[nzo_ids[0]].reuse = None
@@ -776,10 +782,9 @@ class NzbQueue:
def end_job(self, nzo):
""" Send NZO to the post-processing queue """
logging.info("[%s] Ending job %s", caller_name(), nzo.final_name)
# Notify assembler to call postprocessor
if not nzo.deleted:
logging.info("[%s] Ending job %s", caller_name(), nzo.final_name)
nzo.deleted = True
if nzo.precheck:
nzo.save_to_disk()

View File

@@ -78,6 +78,7 @@ from sabnzbd.filesystem import (
remove_file,
get_filepath,
make_script_path,
globber,
)
from sabnzbd.decorators import synchronized
import sabnzbd.config as config
@@ -356,15 +357,15 @@ class NzbFile(TryList):
self.valid = bool(raw_article_db)
if self.valid and self.nzf_id:
# Save first article separate so we can do duplicate file detection
# Save first article separate so we can do
# duplicate file detection and deobfuscate-during-download
first_article = self.add_article(raw_article_db.pop(0))
first_article.lowest_partnum = True
self.nzo.first_articles.append(first_article)
self.nzo.first_articles_count += 1
# For non-par2 files we also use it to do deobfuscate-during-download
# And we count how many bytes are available for repair
# Count how many bytes are available for repair
if sabnzbd.par2file.is_parfile(self.filename):
self.nzo.first_articles.append(first_article)
self.nzo.first_articles_count += 1
self.nzo.bytes_par2 += self.bytes
# Any articles left?
@@ -910,7 +911,6 @@ class NzbObject(TryList):
# to history we first need an nzo_id by entering the NzbQueue
if accept == 2:
self.deleted = True
self.status = Status.FAILED
sabnzbd.NzbQueue.do.add(self, quiet=True)
sabnzbd.NzbQueue.do.end_job(self)
# Raise error, so it's not added
@@ -1173,8 +1173,6 @@ class NzbObject(TryList):
# Abort the job due to failure
if not job_can_succeed:
# Set the nzo status to return "Queued"
self.status = Status.QUEUED
self.set_download_report()
self.fail_msg = T("Aborted, cannot be completed") + " - https://sabnzbd.org/not-complete"
self.set_unpack_info("Download", self.fail_msg, unique=False)
@@ -1184,8 +1182,6 @@ class NzbObject(TryList):
post_done = False
if not self.files:
post_done = True
# set the nzo status to return "Queued"
self.status = Status.QUEUED
self.set_download_report()
return articles_left, file_done, post_done
@@ -1207,8 +1203,8 @@ class NzbObject(TryList):
""" Check if downloaded files already exits, for these set NZF to complete """
fix_unix_encoding(wdir)
# Get a list of already present files
files = [f for f in os.listdir(wdir) if os.path.isfile(f)]
# Get a list of already present files, ignore folders
files = globber(wdir, "*.*")
# Substitute renamed files
renames = sabnzbd.load_data(RENAMES_FILE, self.workpath, remove=True)
@@ -1232,6 +1228,7 @@ class NzbObject(TryList):
for nzf in nzfs:
subject = sanitize_filename(name_extractor(nzf.subject))
if (nzf.filename == filename) or (subject == filename) or (filename in subject):
logging.info("Existing file %s matched to file %s of %s", filename, nzf.filename, self.final_name)
nzf.filename = filename
nzf.bytes_left = 0
self.remove_nzf(nzf)
@@ -1254,25 +1251,25 @@ class NzbObject(TryList):
for filename in files:
# Create NZO's using basic information
filepath = os.path.join(wdir, filename)
if os.path.exists(filepath):
tup = os.stat(filepath)
tm = datetime.datetime.fromtimestamp(tup.st_mtime)
nzf = NzbFile(tm, filename, [], tup.st_size, self)
self.files.append(nzf)
self.files_table[nzf.nzf_id] = nzf
nzf.filename = filename
self.remove_nzf(nzf)
logging.info("Existing file %s added to %s", filename, self.final_name)
tup = os.stat(filepath)
tm = datetime.datetime.fromtimestamp(tup.st_mtime)
nzf = NzbFile(tm, filename, [], tup.st_size, self)
self.files.append(nzf)
self.files_table[nzf.nzf_id] = nzf
nzf.filename = filename
self.remove_nzf(nzf)
# Set bytes correctly
self.bytes += nzf.bytes
self.bytes_tried += nzf.bytes
self.bytes_downloaded += nzf.bytes
# Set bytes correctly
self.bytes += nzf.bytes
self.bytes_tried += nzf.bytes
self.bytes_downloaded += nzf.bytes
# Process par2 files
if sabnzbd.par2file.is_parfile(filepath):
self.handle_par2(nzf, filepath)
self.bytes_par2 += nzf.bytes
# Process par2 files
if sabnzbd.par2file.is_parfile(filepath):
self.handle_par2(nzf, filepath)
self.bytes_par2 += nzf.bytes
logging.info("Existing file %s added to job", filename)
except:
logging.error(T("Error importing %s"), self.final_name)
logging.info("Traceback: ", exc_info=True)
@@ -1705,8 +1702,11 @@ class NzbObject(TryList):
self.renamed_file(yenc_filename, nzf.filename)
nzf.filename = yenc_filename
@synchronized(NZO_LOCK)
def verify_all_filenames_and_resort(self):
""" Verify all filenames based on par2 info and then re-sort files """
"""Verify all filenames based on par2 info and then re-sort files.
Locked so all files are verified at once without interruptions.
"""
logging.info("Checking all filenames for %s", self.final_name)
for nzf_verify in self.files:
self.verify_nzf_filename(nzf_verify)
@@ -1891,13 +1891,17 @@ class NzbObject(TryList):
for attrib in NzoAttributeSaver:
attribs[attrib] = getattr(self, attrib)
logging.debug("Saving attributes %s for %s", attribs, self.final_name)
sabnzbd.save_data(attribs, ATTRIB_FILE, self.workpath)
sabnzbd.save_data(attribs, ATTRIB_FILE, self.workpath, silent=True)
def load_attribs(self):
""" Load saved attributes and return them to be parsed """
attribs = sabnzbd.load_data(ATTRIB_FILE, self.workpath, remove=False)
logging.debug("Loaded attributes %s for %s", attribs, self.final_name)
# If the attributes file somehow does not exist
if not attribs:
return None, None, None
# Only a subset we want to apply directly to the NZO
for attrib in ("final_name", "priority", "password", "url"):
# Only set if it is present and has a value
@@ -2070,16 +2074,16 @@ def nzf_cmp_name(nzf1, nzf2):
def create_work_name(name):
""" Remove ".nzb" and ".par(2)" and sanitize """
strip_ext = [".nzb", ".par", ".par2"]
name = sanitize_foldername(name.strip())
""" Remove ".nzb" and ".par(2)" and sanitize, skip URL's """
if name.find("://") < 0:
name_base, ext = os.path.splitext(name)
# In case it was one of these, there might be more
while ext.lower() in strip_ext:
# Need to remove any invalid characters before starting
name_base, ext = os.path.splitext(sanitize_foldername(name))
while ext.lower() in (".nzb", ".par", ".par2"):
name = name_base
name_base, ext = os.path.splitext(name)
return name.strip()
# And make sure we remove invalid characters again
return sanitize_foldername(name)
else:
return name.strip()

View File

@@ -166,6 +166,8 @@ class PostProcessor(Thread):
def process(self, nzo):
""" Push on finished job in the queue """
# Make sure we return the status "Waiting"
nzo.status = Status.QUEUED
if nzo not in self.history_queue:
self.history_queue.append(nzo)
@@ -327,7 +329,8 @@ def process_job(nzo):
# Get the NZB name
filename = nzo.final_name
if nzo.fail_msg: # Special case: aborted due to too many missing data
# Download-processes can mark job as failed
if nzo.fail_msg:
nzo.status = Status.FAILED
nzo.save_attribs()
all_ok = False

View File

@@ -24,6 +24,7 @@ import logging
import time
import datetime
import threading
import urllib.parse
import sabnzbd
from sabnzbd.constants import RSS_FILE_NAME, DEFAULT_PRIORITY, DUP_PRIORITY
@@ -277,44 +278,49 @@ class RSSQueue:
feedparser.USER_AGENT = "SABnzbd/%s" % sabnzbd.__version__
# Read the RSS feed
msg = None
entries = None
msg = ""
entries = []
if readout:
all_entries = []
for uri in uris:
uri = uri.replace(" ", "%20")
# Reset parsing message for each feed
msg = ""
feed_parsed = {}
uri = uri.replace(" ", "%20").replace("feed://", "http://")
logging.debug("Running feedparser on %s", uri)
feed_parsed = feedparser.parse(uri.replace("feed://", "http://"))
logging.debug("Done parsing %s", uri)
if not feed_parsed:
msg = T("Failed to retrieve RSS from %s: %s") % (uri, "?")
logging.info(msg)
try:
feed_parsed = feedparser.parse(uri)
except Exception as feedparser_exc:
# Feedparser 5 would catch all errors, while 6 just throws them back at us
feed_parsed["bozo_exception"] = feedparser_exc
logging.debug("Finished parsing %s", uri)
status = feed_parsed.get("status", 999)
if status in (401, 402, 403):
msg = T("Do not have valid authentication for feed %s") % uri
logging.info(msg)
if 500 <= status <= 599:
elif 500 <= status <= 599:
msg = T("Server side error (server code %s); could not get %s on %s") % (status, feed, uri)
logging.info(msg)
entries = feed_parsed.get("entries")
entries = feed_parsed.get("entries", [])
if not entries and "feed" in feed_parsed and "error" in feed_parsed["feed"]:
msg = T("Failed to retrieve RSS from %s: %s") % (uri, feed_parsed["feed"]["error"])
# Exception was thrown
if "bozo_exception" in feed_parsed and not entries:
msg = str(feed_parsed["bozo_exception"])
if "CERTIFICATE_VERIFY_FAILED" in msg:
msg = T("Server %s uses an untrusted HTTPS certificate") % get_base_url(uri)
msg += " - https://sabnzbd.org/certificate-errors"
logging.error(msg)
elif "href" in feed_parsed and feed_parsed["href"] != uri and "login" in feed_parsed["href"]:
# Redirect to login page!
msg = T("Do not have valid authentication for feed %s") % uri
else:
msg = T("Failed to retrieve RSS from %s: %s") % (uri, msg)
logging.info(msg)
if not entries and not msg:
if msg:
# We need to unescape any "%20" that could be in the warning due to the URLs
logging.warning_helpful(urllib.parse.unquote(msg))
elif not entries:
msg = T("RSS Feed %s was empty") % uri
logging.info(msg)
all_entries.extend(entries)

View File

@@ -318,7 +318,7 @@ class URLGrabber(Thread):
msg = T("URL Fetching failed; %s") % msg
# Mark as failed
nzo.status = Status.FAILED
nzo.set_unpack_info("Source", msg)
nzo.fail_msg = msg
notifier.send_notification(T("URL Fetching failed; %s") % "", "%s\n%s" % (msg, url), "other", nzo.cat)

View File

@@ -4,5 +4,5 @@
# You MUST use double quotes (so " and not ')
__version__ = "3.1.0-develop"
__baseline__ = "unknown"
__version__ = "3.1.1"
__baseline__ = "99b5a00c12c1d8e17bb3e4a9a98339f59152c842"

scripts/Deobfuscate.py Normal file → Executable file
View File

@@ -221,5 +221,13 @@ if run_renamer:
else:
print("No par2 files or large files found")
# Note about the new option
print(
"The features of Deobfuscate.py are now integrated into SABnzbd! "
+ "Just enable 'Deobfuscate final filenames' in Config - Switches. "
+ "Don't forget to disable this script when you enable the new option!"
+ "This script will be removed in the next version of SABnzbd."
)
# Always exit with success-code
sys.exit(0)

View File

@@ -55,7 +55,7 @@ class TestNZO:
# TODO: More checks!
class TestScanPassword:
class TestNZBStuffHelpers:
def test_scan_passwords(self):
file_names = {
"my_awesome_nzb_file{{password}}": "password",
@@ -77,3 +77,20 @@ class TestScanPassword:
for file_name, clean_file_name in file_names.items():
assert nzbstuff.scan_password(file_name)[0] == clean_file_name
def test_create_work_name(self):
# Only test stuff specific for create_work_name
# The sanitizing is already tested in tests for sanitize_foldername
file_names = {
"my_awesome_nzb_file.pAr2.nZb": "my_awesome_nzb_file",
"my_awesome_nzb_file.....pAr2.nZb": "my_awesome_nzb_file",
"my_awesome_nzb_file....par2..": "my_awesome_nzb_file",
" my_awesome_nzb_file .pAr.nZb": "my_awesome_nzb_file",
"with.extension.and.period.par2.": "with.extension.and.period",
"nothing.in.here": "nothing.in.here",
" just.space ": "just.space",
"http://test.par2 ": "http://test.par2",
}
for file_name, clean_file_name in file_names.items():
assert nzbstuff.create_work_name(file_name) == clean_file_name