Compare commits

...

42 Commits

Author SHA1 Message Date
Safihre
8219737122 Remove redundant README.txt
Stupid mistake.
2023-07-10 15:33:17 +02:00
Safihre
c96599aa86 Remove duplicate Reddit posting 2023-07-10 15:21:31 +02:00
Safihre
b2c1960d93 Release notes were not present in releases 2023-07-10 14:54:34 +02:00
Safihre
9d24b4cc35 Correct finding of release in appdata 2023-07-10 14:19:56 +02:00
Safihre
3d675b033c Update text files for 4.0.3 2023-07-10 13:43:06 +02:00
Sander
0d2d9be8b3 better docker detections: works for older and newer docker versions (#2606)
* better docker detections: works for Ubuntu 18.04 and 22.04

* DOCKER = False, needed for non-POSIX

---------

Co-authored-by: sander <san.d.erjonkers+github@gmail.com>
2023-07-03 16:11:25 +02:00
Safihre
6e9b6dab97 add a grace period for expected filenames to show up (#2609) 2023-07-03 16:11:04 +02:00
Michael Nightingale
44a1717f6d Fix uu decoding when collapsing of lines starting with a doubled period is required (#2605) 2023-06-28 10:01:15 +02:00
Safihre
4f51c74297 Build binary using Python 3.11.4 2023-06-28 10:01:01 +02:00
thezoggy
87c64a8c5d add random import back to fixup (#2604) 2023-06-27 09:06:25 +02:00
Safihre
b6c6635f22 Add newline after link to Downloads page in Reddit post 2023-06-23 21:45:36 +02:00
Safihre
5a7abcb07c Update text files for 4.0.3RC1 2023-06-23 14:03:26 +02:00
jcfp
65232d134b Fix sorting for #2551 (#2598)
* fix #2551

* add test data dirs

* move sorting test data into subdir

* undo change to sabnews.create_nzb
2023-06-23 13:52:26 +02:00
Safihre
d7b4bdefe5 Check if version is present appdata before releasing 2023-06-23 13:52:12 +02:00
Safihre
6d9174bea1 Additional logging to debug Direct Unpack 2023-06-23 13:52:07 +02:00
François M
921edfd4c5 Add versions to appdata (#2595) 2023-06-23 13:52:00 +02:00
Safihre
786d5b0667 Lock add/remove_socket in Downloader
See if we can resolve #2591
2023-06-23 13:51:49 +02:00
Safihre
e846c71f20 Link to Downloads page was not included in Reddit post 2023-06-16 11:48:52 +02:00
Safihre
0108e2ef5a Update text files for 4.0.3Beta1 2023-06-16 10:53:58 +02:00
Safihre
9a81277ff6 No longer * import AppKit and Foundation 2023-06-16 09:07:29 +02:00
Safihre
06cc2ff316 Update release script to post directly to r/usenet and include link 2023-06-13 14:06:38 +02:00
renovate[bot]
7cdf4cb48c Update all dependencies 2023-06-13 14:06:31 +02:00
thezoggy
c34c547f1f Unable to modify Sorters (#2587) 2023-06-13 14:06:22 +02:00
Safihre
9507294db7 Move DirScanner Lock creation 2023-06-13 14:06:12 +02:00
Safihre
ae7dd62d9f Only initialize DirScanner Lock after starting event loop 2023-06-13 14:06:05 +02:00
renovate[bot]
52e309cb09 Update all dependencies 2023-06-13 14:06:00 +02:00
jcfp
b580373982 Fix sorting lowercasing (#2584)
* run lowercasing on season pack setname

* also subject %fn to lowercasing

* add tests

* woops
2023-06-13 14:05:37 +02:00
renovate[bot]
ec7bde5bb2 Update dependency cryptography to v41 [SECURITY] (#2583)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2023-06-13 14:05:30 +02:00
Safihre
3516eeec5b Force full refresh after changing items-per-page
Closes #2416
2023-06-13 14:05:24 +02:00
Safihre
52351192e6 Use more reliable marker if job is still active 2023-06-13 14:05:18 +02:00
Safihre
363a26b8a1 Update text files for 4.0.2 2023-06-07 17:24:54 +02:00
Safihre
7e50a00f55 Correct parameter in release script to merge PR of update 2023-05-31 22:06:26 +02:00
Safihre
a7d6a80e82 Update text files for 4.0.2RC2 2023-05-31 21:49:02 +02:00
Safihre
e7da95b2ac Merge branch 'develop' into 4.0.x 2023-05-31 21:38:48 +02:00
Safihre
74fca23d59 Update text files for 4.0.2RC1 2023-05-23 21:31:43 +02:00
Safihre
0a12fa1253 Merge branch 'develop' into 4.0.x 2023-05-23 21:31:11 +02:00
Safihre
d8c0220353 Update text files for 4.0.1 2023-05-01 21:45:27 +02:00
Safihre
4ab425d15c Update appdata for 4.0.1 release 2023-05-01 21:41:48 +02:00
François M
74e5633d1c Add releases tag (#2539)
* Add 3.7.2 release tag

* Add 4.0.0 placeholder
2023-05-01 21:41:37 +02:00
Safihre
89d36bbc61 Update sabctools to 7.0.2 2023-05-01 21:36:32 +02:00
Safihre
1877ac18a5 Show a better crash on Python <3.8 2023-05-01 21:36:23 +02:00
Safihre
c4b0da335d Update text files for 4.0.0 2023-04-28 14:47:36 +02:00
43 changed files with 396 additions and 125 deletions

View File

@@ -81,7 +81,7 @@ jobs:
# We need the official Python, because the GA ones only support newer macOS versions
# The deployment target is picked up by the Python build tools automatically
# If updated, make sure to also set LSMinimumSystemVersion in SABnzbd.spec
PYTHON_VERSION: "3.11.3"
PYTHON_VERSION: "3.11.4"
MACOSX_DEPLOYMENT_TARGET: "10.9"
# We need to force compile for universal2 support
CFLAGS: -arch x86_64 -arch arm64

View File

@@ -1,21 +1,47 @@
-Release Notes - SABnzbd 4.0.0 Release Candidate 2
+Release Notes - SABnzbd 4.0.3
=========================================================
+## Bugfixes and changes since 4.0.2
+- Direct Unpack could get stuck.
+- Sorters could not be modified.
+- Season Sorting did not respect desired capitalization.
+- Crashes could occur in the `Downloader` on timeouts.
+- Prevent extra job directory in case of folder-only Sorting.
+- UUencoded articles could fail to decode.
+- Windows: Windows Service would fail to start on legacy release.
+- macOS: Failed to launch on macOS Sonoma Beta.
+## Breaking change in 4.0.1
+- The `Parameters` setting of a `Notification Script` is now passed as
+  environment variable `SAB_NOTIFICATION_PARAMETERS` instead of as a
+  command-line parameter. This prevents the possibility of remote code
+  execution on systems exposed to the internet without a username/password.
+  If you use `nzb-notify` you need to update it to the latest version.
+## Bugfixes and changes since 4.0.1
+- Disabling a server during download did not stop it from downloading.
+- Show last line of post-processing script output even if it failed.
+- Prevent crash during Deobfuscate on non-unique paths.
+- Files that could not be parsed were removed from the `Watched Folder`.
+- Warn if the file system does not support unicode or long filenames.
+- Warn if `Scripts Folder` is inside the application directory.
+- Prevent output buffering of Python post-processing scripts.
+- The `PKG-INFO` file was removed from the `src` release.
+- Correctly decode partially malformed UUencoded posts.
+- macOS: Tray icon could not be disabled.
## Changes since 3.7.2
-- In this major update we replaced a core part of Python's SSL handling
-  with our own improved version. This results in large performance increases
-  when downloading from news servers with SSL enabled.
-  In addition, the general connection handling was overhauled, resulting in
-  performance improvements for all news servers.
+- In this major update we optimized a core part of the SSL handling.
+  This results in a large performance increase when downloading from news
+  servers with SSL enabled. In addition, the general connection handling
+  was improved, resulting in performance improvements for all news servers.
Special thanks to: mnightingale, puzzledsab and animetosho!
- There are multiple settings that can tweak performance, see:
https://github.com/sabnzbd/sabnzbd/discussions/2474
We are trying to find the most optimal default settings, so you
can help us by letting us know the results on your system!
- When adding a new news server, SSL is enabled by default.
- File assembly performance significantly improved by relying on the
CRC32 instead of the MD5 to perform QuickCheck of files.
-- Slowdown more gracefully when the cache fills up.
+- Slow down more gracefully when the cache fills up.
- Replaced separate Series/Movie/Date Sorting with general Sorter.
- HTTPS files are included in the `Backup`.
- Improved `Watched Folder` scanning and processing.
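
A note for notification-script authors hit by the 4.0.1 breaking change listed above: scripts should now read their `Parameters` value from the environment. A minimal sketch; only the variable name `SAB_NOTIFICATION_PARAMETERS` is confirmed by the notes, the positional arguments are a hypothetical layout for illustration:

    #!/usr/bin/env python3
    import os
    import sys

    # Confirmed by the release notes above: user-defined Parameters arrive
    # via the environment, so they cannot be abused for remote code execution.
    params = os.environ.get("SAB_NOTIFICATION_PARAMETERS", "")

    # Hypothetical positional arguments, for illustration only
    notification_type, title, message = (sys.argv[1:4] + [""] * 3)[:3]

    print("Notification:", notification_type, title, message)
    print("User parameters:", params)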

View File

@@ -1,46 +0,0 @@
Release Notes - SABnzbd 4.0.0 Release Candidate 2
=========================================================
## Changes since 3.7.2
- In this major update we replaced a core part of Python's SSL handling
with our own improved version. This results in large performance increases
when downloading from news servers with SSL enabled.
In addition, the general connection handling was overhauled, resulting in
performance improvements for all news servers.
Special thanks to: mnightingale, puzzledsab and animetosho!
- There are multiple settings that can tweak performance, see:
https://github.com/sabnzbd/sabnzbd/discussions/2474
We are trying to find the most optimal default settings, so you
can help us by letting us know the results on your system!
- When adding a new news server, SSL is enabled by default.
- File assembly performance significantly improved by relying on the
CRC32 instead of the MD5 to perform QuickCheck of files.
- Slowdown more gracefully when the cache fills up.
- Replaced separate Series/Movie/Date Sorting with general Sorter.
- HTTPS files are included in the `Backup`.
- Improved `Watched Folder` scanning and processing.
- Ignore resource fork files created by macOS.
- `Deobfuscate final filenames` is enabled for new installations.
- Dropped support for Python 3.7.
## Bugfixes since 3.7.2
- Restore applying `History Retention` setting at startup.
- Windows: Not all invalid characters were removed from filenames.
- Windows: Firewall rules were not removed by uninstaller.
## Upgrade notices
- The download statistics file `totals10.sab` is updated in 3.2.x
version. If you downgrade to 3.1.x or lower, detailed download
statistics will be lost.
## Known problems and solutions
- Read the file "ISSUES.txt"
## About
SABnzbd is an open-source cross-platform binary newsreader.
It simplifies the process of downloading from Usenet dramatically, thanks
to its web-based user interface and advanced built-in post-processing options
that automatically verify, repair, extract and clean up posts downloaded
from Usenet.
(c) Copyright 2007-2023 by "The SABnzbd-team" \<team@sabnzbd.org\>

View File

@@ -19,6 +19,7 @@ import os
# Constants
VERSION_FILE = "sabnzbd/version.py"
APPDATA_FILE = "linux/org.sabnzbd.sabnzbd.appdata.xml"
# To draft a release or not to draft a release?
ON_GITHUB_ACTIONS = os.environ.get("CI", False)
@@ -29,6 +30,9 @@ with open(VERSION_FILE) as version_file:
exec(version_file.read())
RELEASE_VERSION = __version__
+# Pre-releases are longer than 6 characters (e.g. 3.1.0Beta1 vs 3.1.0, but also 3.0.11)
+PRERELEASE = len(RELEASE_VERSION) > 5
# Define release name
RELEASE_NAME = "SABnzbd-%s" % RELEASE_VERSION
RELEASE_TITLE = "SABnzbd %s" % RELEASE_VERSION
@@ -41,7 +45,8 @@ RELEASE_README = "README.mkd"
# Used in package.py and SABnzbd.spec
EXTRA_FILES = [
"README.mkd",
RELEASE_README,
"README.txt",
"INSTALL.txt",
"LICENSE.txt",
"GPL2.txt",

View File

@@ -1,3 +1,3 @@
# Special requirements for macOS universal2 binary release
# This way dependabot can auto-update them
-cryptography==40.0.2
+cryptography==41.0.1

View File

@@ -19,12 +19,15 @@ import hashlib
import json
import os
import re
import shutil
+import xml.etree.ElementTree as ET
import github
import praw
from constants import (
RELEASE_VERSION,
+PRERELEASE,
RELEASE_SRC,
RELEASE_BINARY_32,
RELEASE_BINARY_64,
@@ -33,6 +36,7 @@ from constants import (
RELEASE_README,
RELEASE_THIS,
RELEASE_TITLE,
+APPDATA_FILE,
)
# Verify we have all assets
@@ -49,6 +53,11 @@ for file_to_check in files_to_check:
raise RuntimeError("Not all release files are present!")
print("All release files are present")
+# Verify that appdata file is updated
+if not PRERELEASE:
+    if not isinstance(ET.parse(APPDATA_FILE).find(f"./releases/release[@version='{RELEASE_VERSION}']"), ET.Element):
+        raise RuntimeError(f"Could not find {RELEASE_VERSION} in {APPDATA_FILE}")
# Calculate hashes for Synology release
with open(RELEASE_SRC, "rb") as inp_file:
source_data = inp_file.read()
@@ -69,9 +78,6 @@ if RELEASE_THIS and gh_token:
with open(RELEASE_README, "r") as readme_file:
readme_data = readme_file.read()
-# Pre-releases are longer than 6 characters (e.g. 3.1.0Beta1 vs 3.1.0, but also 3.0.11)
-prerelease = len(RELEASE_VERSION) > 5
# We have to manually check if we already created this release
for release in gh_repo.get_releases():
if release.tag_name == RELEASE_VERSION:
@@ -86,7 +92,7 @@ if RELEASE_THIS and gh_token:
name=RELEASE_TITLE,
message=readme_data,
draft=True,
-prerelease=prerelease,
+prerelease=PRERELEASE,
)
# Fetch existing assets, as overwriting is not allowed by GitHub
@@ -119,7 +125,7 @@ if RELEASE_THIS and gh_token:
name=RELEASE_TITLE,
message=readme_data,
draft=False,
-prerelease=prerelease,
+prerelease=PRERELEASE,
)
# Update the website
@@ -146,7 +152,7 @@ if RELEASE_THIS and gh_token:
latest_txt_items = latest_txt.decoded_content.split()
new_latest_txt_items = latest_txt_items[:2]
config_yml = gh_repo_web.get_contents("_config.yml")
-if prerelease:
+if PRERELEASE:
# If it's a pre-release, we append to current version in latest.txt
new_latest_txt_items.extend([RELEASE_VERSION_BYTES, latest_txt_items[1]])
# And replace in _config.yml
@@ -205,7 +211,7 @@ if RELEASE_THIS and gh_token:
# Merge pull-request
print("Merging pull request in sabnzbd/sabnzbd.github.io for the update")
update_pr.merge(method="squash")
update_pr.merge(merge_method="squash")
# Only with GitHub success we proceed to Reddit
if reddit_token := os.environ.get("REDDIT_TOKEN", ""):
@@ -227,18 +233,26 @@ if RELEASE_THIS and gh_token:
with open(RELEASE_README, "r") as readme_file:
readme_lines = readme_file.readlines()
+# Put the download link after the title
+readme_lines[2] = "## https://sabnzbd.org/downloads\n"
# Use the header in the readme as title
title = readme_lines[0]
-release_notes_text = "".join(readme_lines[3:])
-# Post always to r/SABnzbd
-print("Posting release notes to Reddit: r/sabnzbd")
-submission = subreddit_sabnzbd.submit(title, selftext=release_notes_text)
+release_notes_text = "".join(readme_lines[2:])
# Only stable releases to r/usenet
-if not prerelease:
-    print("Cross-posting release notes to Reddit: r/usenet")
-    submission.crosspost(subreddit_usenet)
+if not PRERELEASE:
+    print("Posting release notes to Reddit: r/usenet")
+    submission = subreddit_usenet.submit(title, selftext=release_notes_text)
+    # Cross-post to r/SABnzbd
+    print("Cross-posting release notes to Reddit: r/sabnzbd")
+    submission.crosspost(subreddit_sabnzbd)
+else:
+    # Post always to r/SABnzbd
+    print("Posting release notes to Reddit: r/sabnzbd")
+    subreddit_sabnzbd.submit(title, selftext=release_notes_text)
else:
print("Missing REDDIT_TOKEN")

View File

@@ -1,6 +1,6 @@
# Basic build requirements
# Note that not all sub-dependencies are listed, but only ones we know could cause trouble
-pyinstaller==5.11.0
+pyinstaller==5.12.0
pyinstaller-hooks-contrib==2023.3
altgraph==0.17.3
wrapt==1.15.0
@@ -9,7 +9,7 @@ certifi
# orjson does not support 32bit Windows, exclude it based on Python-version
# This way we also test ujson on Python 3.8 in the CI-tests
-orjson==3.8.14; python_version > '3.8'
+orjson==3.9.1; python_version > '3.8'
# For the Windows build
pefile==2023.2.7; sys_platform == 'win32'

View File

@@ -472,7 +472,7 @@
ui.placeholder.height(hPlaceholder + hExtra);
\$('<div class="sorter-placeholder-anim" data-height="' + hPlaceholder + '"></div>').insertAfter(ui.placeholder);
},
cancel: ".pattern-table",
cancel: "input,textarea,button,select,option,.pattern-table",
change: function(event, ui) {
ui.placeholder.stop().height(0).animate({
height: ui.item.outerHeight() + hExtra

View File

@@ -110,6 +110,8 @@ function HistoryListModel(parent) {
value: newValue
})
}
+// Update pagination and counters
+self.parent.refresh(true)
});
// Retry a job

View File

@@ -159,6 +159,8 @@ function QueueListModel(parent) {
value: newValue
})
}
+// Update pagination and counters
+self.parent.refresh(true)
});
// Do we show search box. So it doesn't dissapear when nothing is found

View File

@@ -30,6 +30,8 @@
<url type="faq">https://sabnzbd.org/wiki/faq</url>
<url type="contact">https://sabnzbd.org/live-chat.html</url>
<releases>
<release version="4.0.3" date="2023-06-16" type="stable"/>
<release version="4.0.2" date="2023-06-09" type="stable"/>
<release version="4.0.1" date="2023-05-01" type="stable"/>
<release version="4.0.0" date="2023-04-28" type="stable"/>
<release version="3.7.2" date="2023-02-05" type="stable"/>

View File

@@ -9,15 +9,15 @@ configobj==5.0.8
cheroot==10.0.0
six==1.16.0
cherrypy==18.8.0
-jaraco.functools==3.6.0
-jaraco.collections==4.1.0
+jaraco.functools==3.7.0
+jaraco.collections==4.2.0
jaraco.text==3.8.1 # Newer version introduces irrelevant extra dependencies
jaraco.classes==3.2.3
jaraco.context==4.3.0
more-itertools==9.1.0
zc.lockfile==3.0.post1
python-dateutil==2.8.2
-tempora==5.2.2
+tempora==5.3.0
pytz==2023.3
sgmllib3k==1.0.0
portend==3.1.0
@@ -30,17 +30,17 @@ rebulk==3.2.0
# Recent cryptography versions require Rust. If you run into issues compiling this
# SABnzbd will also work with older pre-Rust versions such as cryptography==3.3.2
-cryptography==40.0.2
+cryptography==41.0.1
# We recommend using "orjson" as it is 2x as fast as "ujson". However, it requires
# Rust so SABnzbd works just as well with "ujson" or the Python built in "json" module
-ujson==5.7.0
+ujson==5.8.0
# Windows system integration
pywin32==306; sys_platform == 'win32'
# macOS system calls
-pyobjc==9.1.1; sys_platform == 'darwin'
+pyobjc==9.2; sys_platform == 'darwin'
# Linux notifications
notify2==0.3.1; sys_platform != 'win32' and sys_platform != 'darwin'

View File

@@ -48,12 +48,8 @@ elif os.name == "posix":
ORG_UMASK = os.umask(18)
os.umask(ORG_UMASK)
-# Check if running in a Docker container
-try:
-    with open("/proc/1/cgroup", "rt") as ifh:
-        DOCKER = ":/docker/" in ifh.read()
-except:
-    pass
+# Check if running in a Docker container. Note: fake-able, but good enough for normal setups
+DOCKER = os.path.exists("/.dockerenv")
# See if we have the GNU glibc malloc_trim() memory release function
try:
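
For comparison, both detection strategies from the hunk above as standalone functions; the cgroup-based check is what broke on newer hosts (presumably cgroup v2, given the commit's note about older and newer Docker versions):

    import os

    def docker_via_cgroup() -> bool:
        # Old approach: look for a docker cgroup on PID 1
        try:
            with open("/proc/1/cgroup", "rt") as ifh:
                return ":/docker/" in ifh.read()
        except OSError:
            return False

    def docker_via_dockerenv() -> bool:
        # New approach: Docker drops /.dockerenv into every container.
        # Fake-able, but good enough for normal setups.
        return os.path.exists("/.dockerenv")

    print(docker_via_cgroup(), docker_via_dockerenv())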

View File

@@ -287,6 +287,10 @@ def decode_uu(article: Article, raw_data: bytearray) -> bytes:
if line in (b"`", b"end", b"."):
break
+# Remove dot stuffing
+if line.startswith(b".."):
+    line = line[1:]
try:
decoded_line = binascii.a2b_uu(line)
except binascii.Error as msg:
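
Background on the added dot-stuffing handling: NNTP escapes any line that starts with a period by doubling it (RFC 3977), so a decoder must strip one leading period again. A uuencoded line starts with a period exactly when it encodes 14 data bytes, since the length character is chr(32 + 14) == ".". A minimal round trip:

    import binascii

    # Encoding 14 bytes yields a length character of chr(32 + 14) == ".",
    # so the encoded line starts with a period.
    encoded = binascii.b2a_uu(b"\0" * 14).rstrip(b"\n")
    assert encoded.startswith(b".")

    wire_line = b"." + encoded  # dot-stuffed, as sent over NNTP
    # Receiver side, mirroring the decoder fix above:
    if wire_line.startswith(b".."):
        wire_line = wire_line[1:]
    assert binascii.a2b_uu(wire_line) == b"\0" * 14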

View File

@@ -251,11 +251,19 @@ class DirectUnpacker(threading.Thread):
extracted = []
# Are there more files left?
-while self.nzo.files and not self.next_sets:
+while not self.nzo.removed_from_queue and not self.next_sets:
logging.debug("Direct Unpack for %s waiting for more sets", self.nzo.final_name)
with self.next_file_lock:
self.next_file_lock.wait()
# Is there another set to do?
+logging.debug(
+    "Direct Unpack for %s continuing: killed=%s, cur_setname=%s, next_sets=%s",
+    self.nzo.final_name,
+    self.killed,
+    self.cur_setname,
+    self.next_sets,
+)
if self.next_sets:
# Start new instance
nzf = self.next_sets.pop(0)
@@ -276,8 +284,7 @@ class DirectUnpacker(threading.Thread):
rarfiles.append(filename)
else:
# List files we extracted
-m = re.search(RAR_EXTRACTED_RE, linebuf_encoded)
-if m:
+if m := re.search(RAR_EXTRACTED_RE, linebuf_encoded):
# In case of flat-unpack, UnRar still prints the whole path (?!)
unpacked_file = m.group(2)
if cfg.flat_unpack():
@@ -337,6 +344,7 @@ class DirectUnpacker(threading.Thread):
self.reset_active()
if self in ACTIVE_UNPACKERS:
ACTIVE_UNPACKERS.remove(self)
logging.debug("Closing DirectUnpack for %s", self.nzo.final_name)
# Set the thread to killed so it never gets restarted by accident
self.killed = True

View File

@@ -66,7 +66,7 @@ class DirScanner(threading.Thread):
self.loop: Optional[asyncio.AbstractEventLoop] = None
self.scanner_task: Optional[asyncio.Task] = None
-self.lock = asyncio.Lock()  # Prevents concurrent scans
+self.lock: Optional[asyncio.Lock] = None  # Prevents concurrent scans
self.error_reported = False # Prevents multiple reporting of missing watched folder
self.dirscan_dir = cfg.dirscan_dir.get_path()
self.dirscan_speed = cfg.dirscan_speed()
@@ -109,7 +109,6 @@ class DirScanner(threading.Thread):
def run(self):
"""Start the scanner"""
logging.info("Dirscanner starting up")
self.loop = asyncio.new_event_loop()
try:
@@ -184,7 +183,6 @@ class DirScanner(threading.Thread):
async def when_stable_add_nzbfile(self, path: str, catdir: Optional[str], stat_tuple: os.stat_result):
"""Try and import the NZB but wait until the attributes are stable for 1 second, but give up after 3 sec"""
logging.info("Trying to import %s", path)
# Wait until the attributes are stable for 1 second, but give up after 3 sec
@@ -225,6 +223,11 @@ class DirScanner(threading.Thread):
async def scan_async(self, dirscan_dir: str):
"""Do one scan of the watched folder"""
+# On Python 3.8 we first need an event loop before we can create a asyncio.Lock
+if not self.lock:
+    with DIR_SCANNER_LOCK:
+        self.lock = asyncio.Lock()
async with self.lock:
if sabnzbd.PAUSED_ALL:
return
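
Why the lock creation is deferred: on Python 3.8/3.9, asyncio primitives bind to the event loop that is current when they are constructed, so building the `asyncio.Lock` in `__init__` (before the scanner thread's loop exists) can fail or bind to the wrong loop. A minimal sketch of the lazy pattern, with a plain threading lock standing in for the module-level `DIR_SCANNER_LOCK` the diff references:

    import asyncio
    import threading

    DIR_SCANNER_LOCK = threading.RLock()  # stand-in for the module-level lock

    class Scanner:
        def __init__(self):
            # Do not create asyncio.Lock() here: on Python 3.8 that requires
            # (and binds to) an event loop in the constructing thread.
            self.lock = None

        async def scan(self):
            # Created on first use, inside the running event loop
            if not self.lock:
                with DIR_SCANNER_LOCK:  # guard against double creation
                    if not self.lock:
                        self.lock = asyncio.Lock()
            async with self.lock:
                await asyncio.sleep(0)  # the actual scan work goes here

    asyncio.new_event_loop().run_until_complete(Scanner().scan())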

View File

@@ -24,8 +24,8 @@ import select
import logging
from math import ceil
from threading import Thread, RLock
-import socket
+import random
+import socket
import sys
import ssl
from typing import List, Dict, Optional, Union, Set
@@ -405,10 +405,12 @@ class Downloader(Thread):
# Sort the servers for performance
self.servers.sort(key=lambda svr: "%02d%s" % (svr.priority, svr.displayname.lower()))
+@synchronized(DOWNLOADER_LOCK)
def add_socket(self, fileno: int, nw: NewsWrapper):
"""Add a socket ready to be used to the list to be watched"""
self.read_fds[fileno] = nw
+@synchronized(DOWNLOADER_LOCK)
def remove_socket(self, nw: NewsWrapper):
"""Remove a socket to be watched"""
if nw.nntp:
@@ -862,7 +864,7 @@ class Downloader(Thread):
with DOWNLOADER_LOCK:
server.busy_threads.remove(nw)
server.idle_threads.append(nw)
-self.remove_socket(nw)
+self.remove_socket(nw)
@synchronized(DOWNLOADER_LOCK)
def __finish_connect_nw(self, nw: NewsWrapper) -> bool:
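
`@synchronized(DOWNLOADER_LOCK)` is SABnzbd's own helper decorator; an illustrative reimplementation (not the project's exact code) shows the idea:

    import functools
    import threading

    def synchronized(lock):
        """Run the wrapped callable while holding the given lock."""
        def wrapper(func):
            @functools.wraps(func)
            def locked_call(*args, **kwargs):
                with lock:
                    return func(*args, **kwargs)
            return locked_call
        return wrapper

    DOWNLOADER_LOCK = threading.RLock()

    @synchronized(DOWNLOADER_LOCK)
    def remove_socket(nw):
        print("removing socket under the lock")

    remove_socket(object())

Provided `DOWNLOADER_LOCK` is an RLock (the diff imports `RLock` from threading above), calling the now-synchronized `remove_socket()` from code that already holds the lock re-enters safely instead of deadlocking.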

View File

@@ -19,14 +19,40 @@
sabnzbd.osxmenu - macOS Top Menu
"""
-from Foundation import *
-from AppKit import *
-from objc import YES, NO
import os
import sys
import time
import logging
+from objc import YES, NO
+from Foundation import (
+    NSObject,
+    NSDate,
+    NSTimer,
+    NSRunLoop,
+    NSDefaultRunLoopMode,
+    NSColor,
+    NSFont,
+    NSImage,
+    NSAttributedString,
+)
+from AppKit import (
+    NSStatusBar,
+    NSMenu,
+    NSMenuItem,
+    NSAlternateKeyMask,
+    NSTerminateNow,
+    NSEventTrackingRunLoopMode,
+    NSVariableStatusItemLength,
+    NSForegroundColorAttributeName,
+    NSFontAttributeName,
+    NSOnState,
+    NSOffState,
+    NSBaselineOffsetAttributeName,
+    NSParagraphStyleAttributeName,
+    NSMutableParagraphStyle,
+    NSParagraphStyle,
+    NSCenterTextAlignment,
+)
import sabnzbd
import sabnzbd.cfg
@@ -315,7 +341,7 @@ class SABnzbdDelegate(NSObject):
},
)
menu_history_item.setAttributedTitle_(jobfailed)
menu_history_item.setRepresentedObject_("%s" % history["storage"])
menu_history_item.setRepresentedObject_(history["storage"])
self.menu_history.addItem_(menu_history_item)
else:
menu_history_item = NSMenuItem.alloc().initWithTitle_action_keyEquivalent_(T("Empty"), "", "")

View File

@@ -414,7 +414,7 @@ class Sorter:
def _rename_season_pack(self, files: List[str], base_path: str, all_job_files: List[str] = []) -> bool:
success = False
for f in files:
-f_name, f_ext = os.path.splitext(f)
+f_name, f_ext = os.path.splitext(os.path.basename(f))
if f_episode := guessit.api.guessit(f_name).get("episode"):
# Insert formatted episode number(s) into self.info
self.format_series_numbers(f_episode, "episode_num")
@@ -432,7 +432,7 @@ class Sorter:
("%ext", f_ext.lstrip(".")),
],
)
-f_new = f_name_new + f_ext
+f_new = to_lowercase(f_name_new + f_ext)
try:
logging.debug("Renaming season pack file %s to %s", f, f_new)
@@ -477,11 +477,13 @@ class Sorter:
f_name, f_ext = os.path.splitext(os.path.split(f)[1])
new_filepath = os.path.join(
base_path,
-path_subst(
-    self.filename_set + self.multipart_label,
-    [("%1", str(index)), ("%fn", f_name), ("%ext", f_ext.lstrip("."))],
-)
-+ f_ext,
+to_lowercase(
+    path_subst(
+        self.filename_set + self.multipart_label,
+        [("%1", str(index)), ("%fn", f_name), ("%ext", f_ext.lstrip("."))],
+    )
+    + f_ext,
+),
)
try:
logging.debug("Renaming %s to %s", filepath, new_filepath)
@@ -509,7 +511,7 @@ class Sorter:
def rename(self, files: List[str], base_path: str) -> Tuple[str, bool]:
if not self.rename_files:
-return base_path, True
+return move_to_parent_directory(base_path)
# Log the minimum filesize for renaming
if self.rename_limit > 0:
@@ -556,8 +558,11 @@ class Sorter:
# Rename it
f_name, f_ext = os.path.splitext(largest_file.get("name"))
filepath = self._to_filepath(largest_file.get("name"), base_path)
-new_filepath = os.path.join(
-    base_path, path_subst(self.filename_set, [("%fn", f_name), ("%ext", f_ext.lstrip("."))]) + f_ext
+new_filepath = get_unique_filename(
+    os.path.join(
+        base_path,
+        to_lowercase(path_subst(self.filename_set, [("%fn", f_name), ("%ext", f_ext.lstrip("."))]) + f_ext),
+    )
)
if not os.path.exists(new_filepath):
renamed_files = []
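
Two helpers carry the new behaviour above: `to_lowercase()`, applied to the substituted name, and `get_unique_filename()`, which guards against overwriting an existing target. A hypothetical sketch of the latter, matching results like `Single.Sort.S23E06.1.mov` that the sorting tests later in this diff expect:

    import os

    def get_unique_filename(path: str) -> str:
        # Insert ".1", ".2", ... before the extension until the path is free
        if not os.path.exists(path):
            return path
        base, ext = os.path.splitext(path)
        counter = 1
        while os.path.exists("%s.%d%s" % (base, counter, ext)):
            counter += 1
        return "%s.%d%s" % (base, counter, ext)

    print(get_unique_filename("Single.Sort.S23E06.mov"))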

View File

@@ -4,6 +4,7 @@
# (e.g. "develop" or "1.2.x")
# You MUST use double quotes (so " and not ')
+# Do not forget to update the appdata file for every major release!
-__version__ = "4.1.0-develop"
+__version__ = "4.0.3"
__baseline__ = "unknown"

View File

@@ -0,0 +1 @@
(one line of binary test data, not displayable as text)

View File

@@ -0,0 +1 @@
(one line of binary test data, not displayable as text)

View File

@@ -0,0 +1 @@
1

View File

@@ -29,12 +29,23 @@ import sabnzbd.decoder as decoder
from sabnzbd.nzbstuff import Article
+def uu(data: bytes):
+    """Uuencode data and insert a period if necessary"""
+    line = binascii.b2a_uu(data).rstrip(b"\n")
+    # Dot stuffing
+    if line.startswith(b"."):
+        return b"." + line
+    return line
LINES_DATA = [os.urandom(45) for _ in range(32)]
-VALID_UU_LINES = [binascii.b2a_uu(data).rstrip(b"\n") for data in LINES_DATA]
+VALID_UU_LINES = [uu(data) for data in LINES_DATA]
END_DATA = os.urandom(randint(1, 45))
VALID_UU_END = [
-binascii.b2a_uu(END_DATA).rstrip(b"\n"),
+uu(END_DATA),
b"`",
b"end",
]
@@ -48,6 +59,7 @@ class TestUuDecoder:
insert_excess_empty_lines: bool = False,
insert_headers: bool = False,
insert_end: bool = True,
+insert_dot_stuffing_line: bool = False,
begin_line: bytes = b"begin 644 My Favorite Open Source Movie.mkv",
):
"""Generate message parts. Part may be one of 'begin', 'middle', or 'end' for multipart
@@ -89,6 +101,10 @@ class TestUuDecoder:
data.extend(VALID_UU_LINES[:size])
result.extend(LINES_DATA[:size])
+if insert_dot_stuffing_line:
+    data.append(uu(b"\0" * 14))
+    result.append(b"\0" * 14)
if part in ("end", "single"):
if insert_end:
data.extend(VALID_UU_END)
@@ -148,6 +164,7 @@ class TestUuDecoder:
@pytest.mark.parametrize("insert_excess_empty_lines", [True, False])
@pytest.mark.parametrize("insert_headers", [True, False])
@pytest.mark.parametrize("insert_end", [True, False])
@pytest.mark.parametrize("insert_dot_stuffing_line", [True, False])
@pytest.mark.parametrize(
"begin_line",
[
@@ -157,11 +174,25 @@ class TestUuDecoder:
b"begin 0755 shell.sh",
],
)
-def test_singlepart(self, insert_empty_line, insert_excess_empty_lines, insert_headers, insert_end, begin_line):
+def test_singlepart(
+    self,
+    insert_empty_line,
+    insert_excess_empty_lines,
+    insert_headers,
+    insert_end,
+    insert_dot_stuffing_line,
+    begin_line,
+):
"""Test variations of a sane single part nzf with proper uu-encoded data"""
# Generate a singlepart message
article, raw_data, expected_result = self._generate_msg_part(
"single", insert_empty_line, insert_excess_empty_lines, insert_headers, insert_end, begin_line
"single",
insert_empty_line,
insert_excess_empty_lines,
insert_headers,
insert_end,
insert_dot_stuffing_line,
begin_line,
)
assert decoder.decode_uu(article, raw_data) == expected_result
assert article.nzf.filename_checked

View File

@@ -18,6 +18,7 @@
"""
tests.test_dirscanner - Testing functions in dirscanner.py
"""
+import asyncio
import pyfakefs.fake_filesystem_unittest as ffs

View File

@@ -18,6 +18,7 @@
"""
tests.test_functional_sorting - Test downloads with season sorting and sequential files
"""
+import os
from tests.testhelper import *
from flaky import flaky
import sabnzbd.config as config
@@ -55,7 +56,7 @@ class TestDownloadSorting(DownloadFlowBasics):
)
def test_download_season_sorting(self, test_data_dir, result):
"""Test season pack sorting"""
-self.download_nzb(test_data_dir, result, True)
+self.download_nzb(os.path.join("sorting", test_data_dir), result, True)
@pytest.mark.parametrize(
"test_data_dir, result",
@@ -72,4 +73,29 @@ class TestDownloadSorting(DownloadFlowBasics):
)
def test_download_sequential(self, test_data_dir, result):
"""Test sequential file handling"""
-self.download_nzb(test_data_dir, result, True)
+self.download_nzb(os.path.join("sorting", test_data_dir), result, True)
+@pytest.mark.parametrize(
+    "test_data_dir, result",
+    [
+        (
+            "SINGLE_sort_s23e06_480i-SABnzbd",
+            ["Single.Sort.S23E06.mov"],
+        ),  # Single episode, no other files
+        (
+            "SINGLE_sort_s23e06_480i-SABnzbd",
+            ["Single.Sort.S23E06.1.mov"],
+        ),  # Repeat to verify a unique filename is applied
+        (
+            "single-ep_sort_s06e66_4k_uhd-SABnzbd",
+            ["Single-Ep.Sort.S06E66." + ext for ext in ("avi", "srt")],
+        ),  # Single episode with associated smaller file
+        (
+            "single-ep_sort_s06e66_4k_uhd-SABnzbd",
+            ["Single-Ep.Sort.S06E66.1." + ext for ext in ("avi", "srt")],
+        ),  # Repeat to verify unique filenames are applied
+    ],
+)
+def test_download_sorting_single(self, test_data_dir, result):
+    """Test single episode file handling"""
+    self.download_nzb(os.path.join("sorting", test_data_dir), result, True)

View File

@@ -8,13 +8,20 @@
tests.test_postproc- Tests of various functions in newspack, among which rar_renamer()
"""
import os
import re
import shutil
from unittest import mock
from sabnzbd.postproc import *
from sabnzbd import postproc
from sabnzbd.config import ConfigSorter, ConfigCat, read_config
from sabnzbd.filesystem import globber_full, clip_path
from sabnzbd.misc import sort_to_opts
from tests.testhelper import *
@pytest.mark.usefixtures("clean_cache_dir")
class TestPostProc:
# Tests of rar_renamer() (=deobfuscate) against various input directories
def test_rar_renamer(self):
@@ -40,7 +47,7 @@ class TestPostProc:
nzo = mock.Mock()
nzo.final_name = "somedownloadname"
nzo.download_path = workingdir
-number_renamed_files = rar_renamer(nzo)
+number_renamed_files = postproc.rar_renamer(nzo)
# run check on the resulting files
if expected_filename_matches:
@@ -94,3 +101,140 @@ class TestPostProc:
expected_filename_matches = {"*.rar": 0, "*-*-*-*-*": 6}
# 0 files should have been renamed
assert deobfuscate_dir(sourcedir, expected_filename_matches) == 0
@pytest.mark.parametrize("category", ["testcat", "Default", None])
@pytest.mark.parametrize("has_jobdir", [True, False]) # With or without a job dir
@pytest.mark.parametrize("has_catdir", [True, False]) # Complete directory is defined at category level
@pytest.mark.parametrize("has_active_sorter", [True, False]) # Sorter active for the fake nzo
@pytest.mark.parametrize("sort_string", ["%sn (%r)", "%sn (%r)/file.%ext", ""]) # Identical path result
@pytest.mark.parametrize("marker_file", [None, ".marker"])
@pytest.mark.parametrize("do_folder_rename", [True, False])
def test_prepare_extraction_path(
self, category, has_jobdir, has_catdir, has_active_sorter, sort_string, marker_file, do_folder_rename
):
# Ensure global CFG_ vars are initialised
sabnzbd.config.read_config(os.devnull)
# Define a sorter and a category (as @set_config cannot handle those)
ConfigSorter(
"sorter__test_prepare_extraction_path",
{
"order": 0,
"min_size": 42,
"multipart_label": "",
"sort_string": sort_string,
"sort_cats": [category if category else "no_such_category"],
"sort_type": [
sort_to_opts("all"),
],
"is_active": int(has_active_sorter),
},
)
assert sabnzbd.config.CFG_DATABASE["sorters"]["sorter__test_prepare_extraction_path"]
if category:
ConfigCat(
category,
{
"order": 0,
"pp": None,
"script": None,
"dir": os.path.join(
SAB_CACHE_DIR, ("category_dir_for_" + category + ("*" if not has_jobdir else ""))
)
if has_catdir
else None,
"newzbin": "",
"priority": 0,
},
)
assert sabnzbd.config.CFG_DATABASE["categories"][category]
# Mock a minimal nzo, required as function input
fake_nzo = mock.Mock()
fake_nzo.final_name = "FOSS.Rules.S23E06.2160p-SABnzbd"
fake_nzo.cat = category
fake_nzo.nzo_info = {} # Placeholder to prevent a crash in sorting.get_titles()
@set_config(
{
"download_dir": os.path.join(SAB_CACHE_DIR, "incomplete"),
"complete_dir": os.path.join(SAB_CACHE_DIR, "complete"),
"marker_file": marker_file,
"folder_rename": do_folder_rename,
}
)
def _func():
(
tmp_workdir_complete,
workdir_complete,
file_sorter,
not_create_job_dir,
marker_file_result,
) = postproc.prepare_extraction_path(fake_nzo)
tmp_workdir_complete = clip_path(tmp_workdir_complete)
workdir_complete = clip_path(workdir_complete)
# Verify marker file
if marker_file and not not_create_job_dir:
assert marker_file_result
else:
assert not marker_file_result
# Verify sorter
assert file_sorter
if has_active_sorter and category and sort_string:
assert file_sorter.sorter_active
else:
assert not file_sorter.sorter_active
# Verify not_create_job_dir
if category and has_catdir and not has_jobdir and not file_sorter.sorter_active:
assert not_create_job_dir
else:
# Double negatives ftw
assert not not_create_job_dir
# Verify workdir_complete
if not category or not has_catdir:
# Using standard Complete directory as base
assert workdir_complete.startswith(os.path.join(SAB_CACHE_DIR, "complete"))
elif category and has_catdir:
# Based on the category directory
assert workdir_complete.startswith(os.path.join(SAB_CACHE_DIR, "category_dir_for_" + category))
# Check the job directory part (or the lack thereof) as well
if has_active_sorter and category and sort_string:
# Sorter path, with an extra job name work directory inside
assert re.fullmatch(
re.escape(SAB_CACHE_DIR)
+ r".*"
+ re.escape(os.sep)
+ r"Foss Rules \(2160p\)"
+ re.escape(os.sep)
+ fake_nzo.final_name
+ r"(\.\d+)?",
workdir_complete,
)
elif has_jobdir or not (category and has_catdir):
# Standard job name directory
assert re.fullmatch(
re.escape(SAB_CACHE_DIR) + r".*" + re.escape(os.sep) + r"FOSS.Rules.S23E06.2160p-SABnzbd(\.\d+)?",
workdir_complete,
)
else:
# No job directory at all
assert re.fullmatch(
re.escape(SAB_CACHE_DIR) + r".*" + re.escape(os.sep) + r"category_dir_for_([a-zA-Z]+)",
workdir_complete,
)
# Verify tmp_workdir_complete
if do_folder_rename:
if not not_create_job_dir:
assert tmp_workdir_complete != workdir_complete
assert tmp_workdir_complete.replace("_UNPACK_", "") == workdir_complete
else:
assert tmp_workdir_complete == workdir_complete
_func()

View File

@@ -617,10 +617,12 @@ class TestSortingSorter:
[
("%sn s%0se%0e.%ext", True), # %0e marker
("%sn s%se%e.%ext", True), # %e marker
("{%sn }s%se%e.%ext", True), # Same with lowercasing; test for issue #2578
("%sn.%ext", False), # No episode marker
("%sn_%0se%0e", False), # No extension marker
("%r/%sn s%0se%0e.%ext", True), # %0e marker, with dir in sort string
("%r/%sn s%se%e.%ext", True), # %e marker, with dir in sort string
("%r/{%sn} s%se%e.%ext", True), # Same with lowercasing; test for issue #2578
("%r/%sn.%ext", False), # No episode marker, with dir in sort string
("%r/%sn_%0se%0e", False), # No extension marker, with dir in sort string
],
@@ -717,7 +719,11 @@ class TestSortingSorter:
# Also do a basic filename check
for root, dirs, files in os.walk(base_dir):
for filename in files:
assert re.fullmatch(r"Pack Renamer s23e0?\d.*" + extension, filename)
if "{" in sort_string:
# Lowercasing marker in sort string, expect lowercase results
assert re.fullmatch(r"pack renamer s23e0?\d.*" + extension, filename)
else:
assert re.fullmatch(r"Pack Renamer s23e0?\d.*" + extension, filename)
else:
# No season pack renaming should happen, verify original files are still in place
assert os.path.exists(os.path.join(base_dir, f))
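
The `"{" in sort_string` branch above relies on the sort-string lowercasing convention: text wrapped in `{ }` is lowercased and the braces are removed. A hypothetical sketch of that transformation (SABnzbd's real `to_lowercase()` may differ in details):

    import re

    def to_lowercase(name: str) -> str:
        # Lowercase every {...} span, then drop any stray braces
        name = re.sub(r"\{(.*?)\}", lambda m: m.group(1).lower(), name)
        return name.replace("{", "").replace("}", "")

    print(to_lowercase("{Pack Renamer }s23e06.mkv"))  # pack renamer s23e06.mkv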

View File

@@ -329,7 +329,7 @@ class DownloadFlowBasics(SABnzbdBaseTest):
# Add NZB
if dir_name_as_job_name:
-test_job_name = nzb_dir
+test_job_name = os.path.basename(nzb_dir)
else:
test_job_name = "testfile_%s" % time.time()
api_result = get_api_result("addlocalfile", extra_arguments={"name": nzb_path, "nzbname": test_job_name})
@@ -364,9 +364,19 @@ class DownloadFlowBasics(SABnzbdBaseTest):
# Verify all files in the expected file_output are present among the completed files.
# Sometimes par2 can also be included, but we accept that. For example when small
# par2 files get assembled in after the download already finished (see #1509)
-completed_files = filesystem.globber(os.path.join(SAB_COMPLETE_DIR, test_job_name), "*")
-for filename in file_output:
-    assert filename in completed_files
+for _ in range(10):
+    completed_files = filesystem.globber(os.path.join(SAB_COMPLETE_DIR, test_job_name), "*")
+    try:
+        for filename in file_output:
+            assert filename in completed_files
+        # All filenames found
+        break
+    except AssertionError:
+        print("Expected filename %s not found in completed_files %s" % (filename, completed_files))
+        # Wait a sec before trying again with a fresh list of completed files
+        time.sleep(1)
+else:
+    pytest.fail("Time ran out waiting for expected filenames to show up")
# Verify if the garbage collection works (see #1628)
# We need to give it a second to calm down and clear the variables