Mirror of https://github.com/sabnzbd/sabnzbd.git, synced 2025-12-30 19:21:16 -05:00

Compare commits: 6 commits, 4.6.0Beta2...feature/bl
| Author | SHA1 | Date |
|---|---|---|
| | 8d21033568 | |
| | ed655553c8 | |
| | 316b96c653 | |
| | 62401cba27 | |
| | 3cabf44ce3 | |
| | a637d218c4 | |
@@ -2,7 +2,7 @@
 # Note that not all sub-dependencies are listed, but only ones we know could cause trouble
 pyinstaller==6.17.0
 packaging==25.0
-pyinstaller-hooks-contrib==2025.10
+pyinstaller-hooks-contrib==2025.11
 altgraph==0.17.5
 wrapt==2.0.1
 setuptools==80.9.0

39 context/Download-flow.md Normal file
@@ -0,0 +1,39 @@
## Download flow (Downloader + NewsWrapper)

1. **Job ingestion**
   - NZBs arrive via UI/API/URL; `urlgrabber.py` fetches remote NZBs, `nzbparser.py` turns them into `NzbObject`s, and `nzbqueue.NzbQueue` stores ordered jobs with priorities and categories.

2. **Queue to articles**
   - When servers need work, `NzbQueue.get_articles` (called from `Server.get_article` in `downloader.py`) hands out batches of `Article`s per server, respecting retention, priority, and forced/paused items.

3. **Downloader setup**
   - The `Downloader` thread loads server configs (`config.get_servers`), instantiates `Server` objects (per host/port/SSL/threads), and spawns `NewsWrapper` instances per configured connection.
   - A `selectors.DefaultSelector` watches all sockets; `BPSMeter` tracks throughput and speed limits; timers manage server penalties/restarts.

4. **Connection establishment (NewsWrapper.init_connect → NNTP.connect)**
   - `Server.request_addrinfo` resolves the fastest address; `NewsWrapper` builds an `NNTP` socket, wraps SSL if needed, sets non-blocking, and registers with the selector.
   - The first server greeting (200/201) is queued; `finish_connect` drives the login handshake (`AUTHINFO USER/PASS`) and handles temporary (480) or permanent (400/502) errors.

5. **Request scheduling & pipelining**
   - `write()` chooses the next article command (`STAT/HEAD` for precheck, `BODY` or `ARTICLE` otherwise).
   - Concurrency is limited by `server.pipelining_requests`; commands are queued and sent with `sock.sendall`, so there is no local send buffer. A `BlockingIOError`/`SSLWantWriteError` simply requeues the same request for the next selector cycle.
   - Sockets stay registered for `EVENT_WRITE`: without write-readiness events, a temporarily full kernel send buffer could stall queued commands when there is nothing to read, so WRITE interest is needed to resume sending promptly.

6. **Receiving data**
   - Selector events route to `process_nw_read`; `NewsWrapper.read` pulls bytes (SSL optimized via sabctools), parses NNTP responses, and calls `on_response`.
   - A successful BODY/ARTICLE (220/222) updates per-server stats; missing/500 variants toggle capability flags (BODY/STAT support).

7. **Decoding and caching**
   - `Downloader.decode` hands responses to `decoder.decode`, which yEnc/UU decodes, CRC-checks, and stores payloads in `ArticleCache` (memory or disk spill).
   - Articles with DMCA takedowns or bad data trigger a retry on other servers until `max_art_tries` is exceeded.

8. **Assembly to files**
   - The `Assembler` worker consumes decoded pieces, writes to the target file, updates CRC, and cleans admin markers. It guards disk space (`diskspace_check`) and schedules direct unpack or PAR2 handling when files finish.

9. **Queue bookkeeping**
   - `NzbQueue.register_article` records success/failure; completed files advance NZF/NZO state. When all files are done, the job moves to post-processing (`PostProcessor.process`), which runs `newsunpack`, scripts, sorting, etc.

10. **Control & resilience**
    - Pausing/resuming (`Downloader.pause/resume`), bandwidth limiting, and sleep tuning happen in the main loop.
    - Errors/timeouts lead to `reset_nw` (close socket, return article, maybe penalize server). Optional servers can be temporarily disabled; required ones schedule resumes.
    - Forced disconnect/shutdown drains sockets, refreshes DNS, and exits cleanly.
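
To make the selector-driven core of steps 3 to 5 concrete, here is a minimal, self-contained sketch of the pattern. All names are illustrative simplifications, not the actual `Downloader`/`NewsWrapper` code:

```python
# Sketch of a non-blocking, selector-driven client loop in the style
# described above. Hypothetical names, not SABnzbd internals.
import collections
import selectors
import socket
import ssl

sel = selectors.DefaultSelector()
pending = collections.deque([b"BODY <message-id>\r\n"])  # commands waiting to go out

def connect(host: str, port: int) -> socket.socket:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setblocking(False)
    try:
        sock.connect((host, port))
    except BlockingIOError:
        pass  # non-blocking connect completes later; selector reports EVENT_WRITE
    sel.register(sock, selectors.EVENT_READ | selectors.EVENT_WRITE)
    return sock

def loop() -> None:
    while True:
        for key, events in sel.select(timeout=1.0):
            sock = key.fileobj
            if events & selectors.EVENT_READ:
                data = sock.recv(65536)  # a real client parses NNTP responses here
                if not data:
                    sel.unregister(sock)  # peer closed the connection
                    sock.close()
                    continue
            if events & selectors.EVENT_WRITE and pending:
                command = pending[0]
                try:
                    sock.sendall(command)
                    pending.popleft()  # sent in full; move to the next command
                except (BlockingIOError, ssl.SSLWantWriteError):
                    pass  # kernel buffer full; retry the same command next cycle
```

The real loop additionally throttles through `BPSMeter`, caps in-flight commands via `server.pipelining_requests`, and switches a socket to read-only interest (`modify_socket`) when the pipelining limit is reached.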

32 context/Repo-layout.md Normal file
@@ -0,0 +1,32 @@
## Repo layout

- Entry points & metadata
  - `SABnzbd.py`: starts the app.
  - `README.md` / `README.mkd`: release notes and overview.
  - `requirements.txt`: runtime deps.

- Core application package `sabnzbd/`
  - Download engine: `downloader.py` (main loop), `newswrapper.py` (NNTP connections), `urlgrabber.py`, `nzbqueue.py` (queue), `nzbparser.py` (parse NZB), `assembler.py` (writes decoded parts), `decoder.py` (yEnc/UU decode), `articlecache.py` (in-memory/on-disk cache).
  - Post-processing: `newsunpack.py`, `postproc.py`, `directunpacker.py`, `sorting.py`, `deobfuscate_filenames.py`.
  - Config/constants/utilities: `cfg.py`, `config.py`, `constants.py`, `misc.py`, `filesystem.py`, `encoding.py`, `lang.py`, `scheduler.py`, `notifier.py`, `emailer.py`, `rss.py`.
  - UI plumbing: `interface.py`, `skintext.py`, `version.py`, platform helpers (`macosmenu.py`, `sabtray*.py`).
  - Subpackages: `sabnzbd/nzb/` (NZB model objects), `sabnzbd/utils/` (helpers).

- Web interfaces & assets
  - `interfaces/Glitter`, `interfaces/Config`, `interfaces/wizard`: HTML/JS/CSS skins.
  - `icons/`: tray/web icons.
  - `locale/`, `po/`, `tools/`: translation sources and helper scripts (`make_mo.py`, etc.).

- Testing & samples
  - `tests/`: pytest suite plus `data/` fixtures and `test_utils/`.
  - `scripts/`: sample post-processing hooks (`Sample-PostProc.*`).

- Packaging/build
  - `builder/`: platform build scripts (DMG/EXE specs, `package.py`, `release.py`).
  - Platform folders `win/`, `macos/`, `linux/`, `snap/`: installer or platform-specific assets.
  - `admin/`, `builder/constants.py`, `licenses/`: release and licensing support files.

- Documentation
  - The documentation website source lives in the separate `sabnzbd.github.io` repo, typically checked out one level up from this repository's root.
  - Documentation there is split per SABnzbd version, in its `wiki` folder.
sabnzbd/newswrapper.py
@@ -60,7 +60,6 @@ class NewsWrapper:
         "blocking",
         "timeout",
         "decoder",
-        "send_buffer",
         "nntp",
         "connected",
         "user_sent",
@@ -86,7 +85,6 @@ class NewsWrapper:
         self.timeout: Optional[float] = None

         self.decoder: Optional[sabctools.Decoder] = None
-        self.send_buffer = b""

         self.nntp: Optional[NNTP] = None

@@ -338,20 +336,13 @@ class NewsWrapper:
         server = self.server

         try:
-            # First, try to flush any remaining data
-            if self.send_buffer:
-                sent = self.nntp.sock.send(self.send_buffer)
-                self.send_buffer = self.send_buffer[sent:]
-                if self.send_buffer:
-                    # Still unsent data, wait for next EVENT_WRITE
-                    return
-
             if self.connected:
                 if (
                     server.active
                     and not server.restart
                     and not (
                         sabnzbd.Downloader.paused
                         or sabnzbd.Downloader.no_active_jobs()
                         or sabnzbd.Downloader.shutdown
                         or sabnzbd.Downloader.paused_for_postproc
                     )
@@ -365,7 +356,7 @@
                     self.next_request = None

-            # If no pending buffer, try to send new command
-            if not self.send_buffer and self.next_request:
+            if self.next_request:
                 if self.concurrent_requests.acquire(blocking=False):
                     command, article = self.next_request
                     self.next_request = None
@@ -375,25 +366,24 @@
                         self.discard(article, count_article_try=False, retry_article=True)
                         self.concurrent_requests.release()
                         return
-                    self._response_queue.append(article)

                     if sabnzbd.LOG_ALL:
                         logging.debug("Thread %s@%s: %s", self.thrdnum, server.host, command)
                     try:
-                        sent = self.nntp.sock.send(command)
-                        if sent < len(command):
-                            # Partial send, store remainder
-                            self.send_buffer = command[sent:]
+                        self.nntp.sock.sendall(command)
+                        self._response_queue.append(article)
                     except (BlockingIOError, ssl.SSLWantWriteError):
-                        # Can't send now, store full command
-                        self.send_buffer = command
+                        # Couldn't send now, try again later
+                        self.concurrent_requests.release()
+                        self.next_request = (command, article)
+                        return
                 else:
                     # Concurrency limit reached
                     sabnzbd.Downloader.modify_socket(self, EVENT_READ)
             else:
                 # Is it safe to shut down this socket?
                 if (
-                    not self.send_buffer
-                    and not self.next_request
+                    not self.next_request
                     and not self._response_queue
                     and (not server.active or server.restart or not self.timeout or time.time() > self.timeout)
                 ):
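
The net effect of this hunk: instead of flushing a local `send_buffer` on every `EVENT_WRITE`, the command is written with `sock.sendall` and, when the kernel buffer is full, the whole `(command, article)` pair is handed back to `next_request` for the next selector cycle. A minimal sketch of that pattern (hypothetical helper, assuming NNTP commands are small enough that `sendall` either completes or fails before queueing any bytes):

```python
import socket
import ssl

def send_or_requeue(sock: socket.socket, command: bytes) -> bool:
    """Try to send a full command on a non-blocking socket.

    Returns True when the command was sent; False when the send would
    block, in which case the caller keeps (command, article) around and
    retries on the next EVENT_WRITE. Illustrative only, not SABnzbd code.
    """
    try:
        sock.sendall(command)  # small commands are accepted atomically
        return True
    except (BlockingIOError, ssl.SSLWantWriteError):
        return False  # kernel buffer is full right now; requeue and retry
```

Note that the semaphore slot (`concurrent_requests`) is released on the retry path, so a stalled socket does not hold a pipelining slot while it waits.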

488 sabnzbd/rss.py
@@ -25,6 +25,8 @@ import time
 import datetime
 import threading
 import urllib.parse
+from dataclasses import dataclass, field
+from typing import Union, Optional

 import sabnzbd
 from sabnzbd.constants import RSS_FILE_NAME, DEFAULT_PRIORITY
@@ -51,9 +53,45 @@ import feedparser
 ##############################################################################


-def notdefault(item):
-    """Return True if not 'Default|''|*'"""
-    return bool(item) and str(item).lower() not in ("default", "*", "", str(DEFAULT_PRIORITY))
+def _normalise_default(value: Optional[str]) -> Optional[str]:
+    """Normalise default values to None"""
+    if not value:
+        return None
+    v = str(value).strip()
+    if v.lower() in ("", "*", "default"):
+        return None
+    return v
+
+
+def _normalise_priority(value) -> Optional[int]:
+    """Normalise default priority values to None"""
+    if value in (None, "", "*", "default", DEFAULT_PRIORITY):
+        return None
+    try:
+        return int(value)
+    except (TypeError, ValueError):
+        return None
+
+
+def _normalise_pp(value) -> Optional[int]:
+    """Normalise pp value to an int between 0 and 3, or None if invalid/empty."""
+    if value in (None, ""):
+        return None
+    try:
+        iv = int(value)
+        if 0 <= iv <= 3:
+            return iv
+    except (TypeError, ValueError):
+        pass
+    return None
+
+
+def coalesce(*args):
+    """Return first value which is not None"""
+    for a in args:
+        if a is not None:
+            return a
+    return None


 def remove_obsolete(jobs, new_jobs):
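
The helpers above replace the old `notdefault()` check with explicit normalisation. Their expected behaviour, shown on illustrative inputs:

```python
# Illustrative inputs only; results follow from the definitions above.
_normalise_default("Default")  # None ("default-ish" markers collapse to None)
_normalise_default(" tv ")     # "tv" (whitespace stripped, real value kept)
_normalise_priority("*")       # None
_normalise_priority("2")       # 2 (coerced to int)
_normalise_pp("7")             # None (outside the valid 0..3 range)
coalesce(None, None, 1, 2)     # 1 (first non-None wins)
```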
@@ -142,43 +180,7 @@ class RSSReader:
             return T('Incorrect RSS feed description "%s"') % feed

         uris = feeds.uri()
-        defCat = feeds.cat()
-
-        if not notdefault(defCat) or defCat not in sabnzbd.api.list_cats(default=False):
-            defCat = None
-        defPP = feeds.pp()
-        if not notdefault(defPP):
-            defPP = None
-        defScript = feeds.script()
-        if not notdefault(defScript):
-            defScript = None
-        defPrio = feeds.priority()
-        if not notdefault(defPrio):
-            defPrio = None
-
-        # Preparations, convert filters to regex's
-        regexes = []
-        reTypes = []
-        reCats = []
-        rePPs = []
-        rePrios = []
-        reScripts = []
-        reEnabled = []
-        for feed_filter in feeds.filters():
-            reCat = feed_filter[0]
-            if defCat in ("", "*"):
-                reCat = None
-            reCats.append(reCat)
-            rePPs.append(feed_filter[1])
-            reScripts.append(feed_filter[2])
-            reTypes.append(feed_filter[3])
-            if feed_filter[3] in ("<", ">", "F", "S"):
-                regexes.append(feed_filter[4])
-            else:
-                regexes.append(convert_filter(feed_filter[4]))
-            rePrios.append(feed_filter[5])
-            reEnabled.append(feed_filter[6] != "0")
-        regcount = len(regexes)
+        filters = FeedConfig.from_config(feeds)

         # Set first if this is the very first scan of this URI
         first = (feed not in self.jobs) and ignoreFirst
@@ -301,143 +303,41 @@ class RSSReader:
             if jobstat in "NGB" or (jobstat == "X" and readout):
                 # Match this title against all filters
                 logging.debug("Trying title %s", title)
-                result = False
-                myCat = defCat
-                myPP = defPP
-                myScript = defScript
-                myPrio = defPrio
-                n = 0
-                if ("F" in reTypes or "S" in reTypes) and (not season or not episode):
-                    show_analysis = sabnzbd.sorting.BasicAnalyzer(title)
-                    season = show_analysis.info.get("season_num")
-                    episode = show_analysis.info.get("episode_num")
+                match = filters.evaluate(
+                    title=title,
+                    category=category,
+                    size=size,
+                    season=season,
+                    episode=episode,
+                )

-                # Match against all filters until an positive or negative match
-                logging.debug("Size %s", size)
-                for n in range(regcount):
-                    if reEnabled[n]:
-                        if category and reTypes[n] == "C":
-                            found = re.search(regexes[n], category)
-                            if not found:
-                                logging.debug("Filter rejected on rule %d", n)
-                                result = False
-                                break
-                        elif reTypes[n] == "<" and size and from_units(regexes[n]) < size:
-                            # "Size at most" : too large
-                            logging.debug("Filter rejected on rule %d", n)
-                            result = False
-                            break
-                        elif reTypes[n] == ">" and size and from_units(regexes[n]) > size:
-                            # "Size at least" : too small
-                            logging.debug("Filter rejected on rule %d", n)
-                            result = False
-                            break
-                        elif reTypes[n] == "F" and not ep_match(season, episode, regexes[n]):
-                            # "Starting from SxxEyy", too early episode
-                            logging.debug("Filter requirement match on rule %d", n)
-                            result = False
-                            break
-                        elif reTypes[n] == "S" and ep_match(season, episode, regexes[n], title):
-                            logging.debug("Filter matched on rule %d", n)
-                            result = True
-                            break
-                        else:
-                            if regexes[n]:
-                                found = re.search(regexes[n], title)
-                            else:
-                                found = False
-                            if reTypes[n] == "M" and not found:
-                                logging.debug("Filter rejected on rule %d", n)
-                                result = False
-                                break
-                            if found and reTypes[n] == "A":
-                                logging.debug("Filter matched on rule %d", n)
-                                result = True
-                                break
-                            if found and reTypes[n] == "R":
-                                logging.debug("Filter rejected on rule %d", n)
-                                result = False
-                                break
+                job = jobs.get(link)
+                is_starred = job and job.get("status", "").endswith("*")
+                star = first or is_starred
+                act = (download and not first and not is_starred) or force

-                if len(reCats):
-                    if not result and defCat:
-                        # Apply Feed-category on non-matched items
-                        myCat = defCat
-                    elif result and notdefault(reCats[n]):
-                        # Use the matched info
-                        myCat = reCats[n]
-                    elif category and not defCat:
-                        # No result and no Feed-category
-                        myCat = cat_convert(category)
-
-                    if myCat:
-                        myCat, catPP, catScript, catPrio = cat_to_opts(myCat)
-                    else:
-                        myCat = catPP = catScript = catPrio = None
-                    if notdefault(rePPs[n]):
-                        myPP = rePPs[n]
-                    elif not (reCats[n] or category):
-                        myPP = catPP
-                    if notdefault(reScripts[n]):
-                        myScript = reScripts[n]
-                    elif not (notdefault(reCats[n]) or category):
-                        myScript = catScript
-                    if rePrios[n] not in (str(DEFAULT_PRIORITY), ""):
-                        myPrio = rePrios[n]
-                    elif not ((rePrios[n] != str(DEFAULT_PRIORITY)) or category):
-                        myPrio = catPrio
-
-                act = download and not first
-                if link in jobs:
-                    act = act and not jobs[link].get("status", "").endswith("*")
-                    act = act or force
-                    star = first or jobs[link].get("status", "").endswith("*")
-                else:
-                    star = first
-                if result:
-                    _HandleLink(
-                        feed,
-                        jobs,
-                        link,
-                        infourl,
-                        title,
-                        size,
-                        age,
-                        season,
-                        episode,
-                        "G",
-                        category,
-                        myCat,
-                        myPP,
-                        myScript,
-                        act,
-                        star,
-                        priority=myPrio,
-                        rule=n,
-                    )
-                    if act:
-                        new_downloads.append(title)
-                else:
-                    _HandleLink(
-                        feed,
-                        jobs,
-                        link,
-                        infourl,
-                        title,
-                        size,
-                        age,
-                        season,
-                        episode,
-                        "B",
-                        category,
-                        myCat,
-                        myPP,
-                        myScript,
-                        False,
-                        star,
-                        priority=myPrio,
-                        rule=n,
-                    )
+                _HandleLink(
+                    feed=feed,
+                    jobs=jobs,
+                    link=link,
+                    infourl=infourl,
+                    title=title,
+                    size=size,
+                    age=age,
+                    season=match.season,
+                    episode=match.episode,
+                    flag="G" if match.matched else "B",
+                    orgcat=category,
+                    cat=match.category,
+                    pp=match.pp,
+                    script=match.script,
+                    download=act and match.matched,
+                    star=star,
+                    priority=match.priority,
+                    rule=match.rule_index,
+                )
+                if match.matched and act:
+                    new_downloads.append(title)

             # Send email if wanted and not "forced"
             if new_downloads and cfg.email_rss() and not force:
@@ -532,6 +432,238 @@ class RSSReader:
                 self.jobs[feed][item]["status"] = "D-"


+@dataclass(frozen=True)
+class FeedMatch:
+    matched: bool
+    rule_index: int
+    season: int
+    episode: int
+    category: Optional[str] = None
+    priority: Optional[int] = None
+    pp: Optional[int] = None
+    script: Optional[str] = None
+
+
+@dataclass
+class FeedRule:
+    regex: Union[str, re.Pattern]
+    type: str
+    category: Optional[str] = None
+    priority: Optional[int] = None
+    pp: Optional[int] = None
+    script: Optional[str] = None
+    enabled: bool = True
+
+    def __post_init__(self):
+        # Convert regex if needed
+        if self.type not in {"<", ">", "F", "S"}:
+            self.regex = convert_filter(self.regex)
+        # Normalise "default-ish" values to None
+        self.category = _normalise_default(self.category)
+        self.priority = _normalise_priority(self.priority)
+        self.pp = _normalise_pp(self.pp)
+        self.script = _normalise_default(self.script)
+
+
+@dataclass
+class FeedConfig:
+    default_category: Optional[str] = None
+    default_priority: Optional[int] = None
+    default_pp: Optional[int] = None
+    default_script: Optional[str] = None
+    rules: list[FeedRule] = field(default_factory=list)
+
+    def __post_init__(self):
+        self.default_category = _normalise_default(self.default_category)
+        if self.default_category not in sabnzbd.api.list_cats(default=False):
+            self.default_category = None
+        self.default_priority = _normalise_priority(self.default_priority)
+        self.default_pp = _normalise_pp(self.default_pp)
+        self.default_script = _normalise_default(self.default_script)
+
+    def has_type(self, *types: str) -> bool:
+        """Check if any rule matches the given types"""
+        return any(rule.type in types for rule in self.rules)
+
+    @classmethod
+    def from_config(cls, c: config.ConfigRSS) -> "FeedConfig":
+        """Build a FeedConfig from a RSS config."""
+        rules: list[FeedRule] = []
+        for cat, pp, script, ftype, regex, priority, enabled in c.filters():
+            rules.append(
+                FeedRule(
+                    regex=regex,
+                    type=ftype,
+                    category=cat,
+                    priority=priority,
+                    pp=pp,
+                    script=script,
+                    enabled=(enabled != "0"),
+                )
+            )
+
+        return cls(
+            default_category=c.cat(),
+            default_priority=c.priority(),
+            default_pp=c.pp(),
+            default_script=c.script(),
+            rules=rules,
+        )
+
+    def evaluate(
+        self,
+        *,
+        title: str,
+        category: Optional[str],
+        size: int,
+        season: int,
+        episode: int,
+    ) -> FeedMatch:
+        """Evaluate rules for a single RSS entry."""
+        result: bool = False
+        matched_rule: Optional[FeedRule] = None
+        matched_index: int = 0
+        cur_season: int = season
+        cur_episode: int = episode
+
+        # Start from feed defaults for options.
+        my_category: Optional[str] = self.default_category
+        my_pp: Optional[int] = self.default_pp
+        my_script: Optional[str] = self.default_script
+        my_priority: Optional[int] = self.default_priority
+
+        # If there are no rules, return early
+        if not self.rules:
+            return FeedMatch(
+                matched=result,
+                rule_index=matched_index,
+                season=int_conv(cur_season),
+                episode=int_conv(cur_episode),
+                category=my_category,
+                pp=my_pp,
+                script=my_script,
+                priority=my_priority,
+            )
+
+        # Fill in missing season / episode information when F/S rules exist
+        if self.has_type("F", "S") and (not cur_season or not cur_episode):
+            show_analysis = sabnzbd.sorting.BasicAnalyzer(title)
+            cur_season = show_analysis.info.get("season_num")
+            cur_episode = show_analysis.info.get("episode_num")
+
+        # Match against all filters until a positive or negative match
+        logging.debug("Size %s", size)
+        for idx, rule in enumerate(self.rules):
+            if not rule.enabled:
+                continue
+
+            if category and rule.type == "C":
+                found = re.search(rule.regex, category)
+                if not found:
+                    logging.debug("Filter rejected on rule %d", idx)
+                    result = False
+                    matched_index = idx
+                    break
+            elif rule.type == "<" and size and from_units(rule.regex) < size:
+                # "Size at most" : too large
+                logging.debug("Filter rejected on rule %d", idx)
+                result = False
+                matched_index = idx
+                break
+            elif rule.type == ">" and size and from_units(rule.regex) > size:
+                # "Size at least" : too small
+                logging.debug("Filter rejected on rule %d", idx)
+                result = False
+                matched_index = idx
+                break
+            elif rule.type == "F" and not ep_match(cur_season, cur_episode, rule.regex):
+                # "Starting from SxxEyy", too early episode
+                logging.debug("Filter requirement match on rule %d", idx)
+                result = False
+                matched_index = idx
+                break
+            elif rule.type == "S" and ep_match(cur_season, cur_episode, rule.regex, title):
+                logging.debug("Filter matched on rule %d", idx)
+                result = True
+                matched_index = idx
+                matched_rule = rule
+                break
+            else:
+                if rule.regex:
+                    found = re.search(rule.regex, title)
+                else:
+                    found = False
+
+                if rule.type == "M" and not found:
+                    logging.debug("Filter rejected on rule %d", idx)
+                    result = False
+                    matched_index = idx
+                    break
+                if found and rule.type == "A":
+                    logging.debug("Filter matched on rule %d", idx)
+                    result = True
+                    matched_index = idx
+                    matched_rule = rule
+                    break
+                if found and rule.type == "R":
+                    logging.debug("Filter rejected on rule %d", idx)
+                    result = False
+                    matched_index = idx
+                    break
+
+        if matched_rule is None:
+            # No rule matched; keep my_category/my_pp/my_script/my_priority at feed defaults,
+            # or use the original category if there is no default.
+            if category is not None and self.default_category is None:
+                my_category = cat_convert(category)
+            if my_category:
+                my_category, category_pp, category_script, category_priority = cat_to_opts(my_category)
+                category_pp = _normalise_pp(category_pp)
+                category_script = _normalise_default(category_script)
+                category_priority = _normalise_priority(category_priority)
+            else:
+                my_category = category_pp = category_script = category_priority = None
+            # pp/script/priority only come from category defaults in this case
+            my_pp = coalesce(category_pp, self.default_pp)
+            my_script = category_script or self.default_script
+            my_priority = coalesce(category_priority, self.default_priority)
+
+            return FeedMatch(
+                matched=result,
+                rule_index=matched_index,
+                season=int_conv(cur_season),
+                episode=int_conv(cur_episode),
+                category=my_category,
+                pp=my_pp,
+                script=my_script,
+                priority=my_priority,
+            )
+
+        # At this point we know a rule fired and matched_rule is not None.
+        my_category = matched_rule.category or cat_convert(category) or self.default_category
+        if my_category:
+            my_category, category_pp, category_script, category_priority = cat_to_opts(my_category)
+            category_pp = _normalise_pp(category_pp)
+            category_script = _normalise_default(category_script)
+            category_priority = _normalise_priority(category_priority)
+        else:
+            my_category = category_pp = category_script = category_priority = None
+        my_pp = coalesce(matched_rule.pp, category_pp, self.default_pp)
+        my_script = matched_rule.script or category_script or self.default_script
+        my_priority = coalesce(matched_rule.priority, category_priority, self.default_priority)
+
+        return FeedMatch(
+            matched=result,
+            rule_index=matched_index,
+            season=int_conv(cur_season),
+            episode=int_conv(cur_episode),
+            category=my_category,
+            pp=my_pp,
+            script=my_script,
+            priority=my_priority,
+        )
+
+
 def patch_feedparser():
     """Apply options that work for SABnzbd

     Add additional parsing of attributes
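
Taken together, `FeedConfig.evaluate` resolves each option with the precedence rule value, then category value, then feed default. A hedged usage sketch (assumes an initialised SABnzbd config so that `sabnzbd.api.list_cats` and `convert_filter` work; values are illustrative):

```python
from sabnzbd.rss import FeedConfig, FeedRule

# An accept-all rule with no overrides: options fall through to feed defaults.
cfg = FeedConfig(
    default_priority=1,  # example feed default
    rules=[FeedRule(regex="*", type="A")],
)
match = cfg.evaluate(title="Title", category=None, size=1000, season=0, episode=0)
assert match.matched and match.rule_index == 0 and match.priority == 1
```

The parametrized test further below exercises the same precedence chain through the real config objects.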
@@ -610,14 +742,14 @@ def _HandleLink(
     jobs[link]["cat"] = cat
     jobs[link]["pp"] = pp
     jobs[link]["script"] = script
-    jobs[link]["prio"] = str(priority)
+    jobs[link]["prio"] = str(priority) if priority is not None else str(DEFAULT_PRIORITY)
     jobs[link]["orgcat"] = orgcat
     jobs[link]["size"] = size
     jobs[link]["age"] = age
     jobs[link]["time"] = time.time()
     jobs[link]["rule"] = str(rule)
-    jobs[link]["season"] = season
-    jobs[link]["episode"] = episode
+    jobs[link]["season"] = str(season)
+    jobs[link]["episode"] = str(episode)

     if special_rss_site(link):
         nzbname = None
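
Since `FeedMatch` fields default to `None`, the guard above matters: `str(None)` would store the literal string "None" in `jobs[link]["prio"]`, whereas the guarded expression falls back to `str(DEFAULT_PRIORITY)`. A quick illustration (the constant's value is an assumed placeholder, not taken from the codebase):

```python
DEFAULT_PRIORITY = -100  # assumed placeholder value, for illustration only

def prio_field(priority):
    # Mirrors the guarded expression in _HandleLink
    return str(priority) if priority is not None else str(DEFAULT_PRIORITY)

assert prio_field(None) == "-100"  # no more literal "None" in the job record
assert prio_field(2) == "2"
```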
tests/test_rss.py
@@ -20,21 +20,48 @@ tests.test_misc - Testing functions in misc.py
 """
 import datetime
 import time
+from typing import Optional

 import configobj
 import pytest
 from pytest_httpserver import HTTPServer

 import sabnzbd.rss as rss
 import sabnzbd.config
+from sabnzbd.constants import DEFAULT_PRIORITY, LOW_PRIORITY, HIGH_PRIORITY, FORCE_PRIORITY
+from sabnzbd.rss import FeedMatch, FeedConfig
 from tests.testhelper import httpserver_handler_data_dir


 class TestRSS:
     @staticmethod
-    def setup_rss(feed_name, feed_url):
+    def setup_rss(
+        feed_name: str,
+        feed_url: str,
+        *,
+        category: Optional[str] = None,
+        pp: Optional[str] = None,
+        script: Optional[str] = None,
+        priority: Optional[int] = None,
+        filters: Optional[list[tuple[str, str, str, str, str, int, str]]] = None,
+    ):
         """Setup the basic settings to get things going"""
+        values: dict = {"uri": feed_url}
+        if category is not None:
+            values["category"] = category
+        if pp is not None:
+            values["pp"] = str(pp)
+        if script is not None:
+            values["script"] = script
+        if priority is not None:
+            values["priority"] = str(priority)
+        if filters is not None:
+            for n, f in enumerate(filters):
+                values[f"filter{n}"] = f
+
         # Setup the config settings
         sabnzbd.config.CFG_OBJ = configobj.ConfigObj()
-        sabnzbd.config.ConfigRSS(feed_name, {"uri": feed_url})
+        sabnzbd.config.ConfigRSS(feed_name, values)

         # Need to create the Default category
         # Otherwise it will try to save the config
@@ -163,3 +190,164 @@ class TestRSS:
         # of the system, so now we have to return to UTC
         adjusted_date = datetime.datetime(2025, 5, 20, 18, 21, 1) - datetime.timedelta(seconds=time.timezone)
         assert job_data["age"] == adjusted_date
+
+    @pytest.mark.parametrize(
+        "defaults, filters, title, category, size, season, episode, expected_match",
+        [
+            # filters are (cat, pp, script, ftype, regex, priority, enabled)
+            (
+                (None, None, None, None),
+                [],  # config always adds a default accept rule
+                "Title",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(matched=True, rule_index=0, season=0, episode=0),
+            ),
+            (
+                (None, None, None, None),
+                [("", "", "", ">", "500", "", "1"), ("", "", "", "A", "*", DEFAULT_PRIORITY, "1")],
+                "Title",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(matched=True, rule_index=1, season=0, episode=0),
+            ),
+            (
+                (None, None, None, None),
+                [("", "", "", "F", "S03E08", "", "1"), ("", "", "", "A", "*", DEFAULT_PRIORITY, "1")],
+                "Title S05E02",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(matched=True, rule_index=1, season=5, episode=2),
+            ),
+            (
+                (None, None, None, None),
+                [("", "", "", "F", "S03E08", "", "1"), ("", "", "", "A", "*", DEFAULT_PRIORITY, "1")],
+                "Title S01E02",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(matched=False, rule_index=0, season=1, episode=2),
+            ),
+            (
+                (None, None, None, LOW_PRIORITY),
+                [],
+                "Title",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(matched=True, rule_index=0, season=0, episode=0, priority=LOW_PRIORITY),
+            ),
+            (
+                (None, None, None, LOW_PRIORITY),
+                [("", "", "", "A", "*", HIGH_PRIORITY, "1")],
+                "Title",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(matched=True, rule_index=0, season=0, episode=0, priority=HIGH_PRIORITY),
+            ),
+            (
+                (None, 1, None, None),
+                [],
+                "Title",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(matched=True, rule_index=0, season=0, episode=0, pp=1),
+            ),
+            (
+                (None, 1, None, None),
+                [("", "3", "", "A", "*", DEFAULT_PRIORITY, "1")],
+                "Title",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(matched=True, rule_index=0, season=0, episode=0, pp=3),
+            ),
+            (  # category overrides
+                ("tv", 1, DEFAULT_PRIORITY, ""),
+                [("evaluator", "", "", "A", "*", "", "1")],
+                "Title",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(
+                    matched=True,
+                    rule_index=0,
+                    season=0,
+                    episode=0,
+                    category="evaluator",
+                    pp=3,
+                    script="evaluator.py",
+                    priority=FORCE_PRIORITY,
+                ),
+            ),
+            (  # category with rule overrides
+                ("tv", 1, DEFAULT_PRIORITY, ""),
+                [("evaluator", "2", "override.py", "A", "*", "", "1")],
+                "Title",
+                None,
+                1000,
+                0,
+                0,
+                FeedMatch(
+                    matched=True,
+                    rule_index=0,
+                    season=0,
+                    episode=0,
+                    category="evaluator",
+                    pp=2,
+                    script="override.py",
+                    priority=FORCE_PRIORITY,
+                ),
+            ),
+        ],
+    )
+    def test_feedconfig_evaluator(
+        self,
+        httpserver: HTTPServer,
+        defaults: tuple[Optional[str], Optional[str], Optional[str], Optional[int]],
+        filters: list[tuple[str, str, str, str, str, int, str]],
+        title: str,
+        category: Optional[str],
+        size: int,
+        season: int,
+        episode: int,
+        expected_match: FeedMatch,
+    ):
+        default_category, default_pp, default_script, default_priority = defaults
+        feed_name = "Evaluator"
+        self.setup_rss(
+            feed_name,
+            httpserver.url_for("/evaluator.xml"),
+            category=default_category,
+            pp=default_pp,
+            script=default_script,
+            priority=default_priority,
+            filters=filters,
+        )
+        sabnzbd.config.ConfigCat(
+            "evaluator",
+            {
+                "pp": "3",
+                "script": "evaluator.py",
+                "priority": FORCE_PRIORITY,
+            },
+        )
+
+        feed_cfg = FeedConfig.from_config(sabnzbd.config.get_rss()[feed_name])
+        result_match = feed_cfg.evaluate(title=title, category=category, size=size, season=season, episode=episode)
+
+        assert result_match == expected_match