Mirror of https://github.com/sabnzbd/sabnzbd.git
Synced 2026-01-21 22:09:06 -05:00

Compare commits
3 Commits
| Author | SHA1 | Date |
|---|---|---|
| | 5c0a10e16b | |
| | d9b32261e7 | |
| | 8d8ce52193 | |
PKG-INFO (4 changed lines)

@@ -1,7 +1,7 @@
 Metadata-Version: 1.0
 Name: SABnzbd
-Version: 2.2.0
-Summary: SABnzbd-2.2.0
+Version: 2.2.1RC1
+Summary: SABnzbd-2.2.1RC1
 Home-page: https://sabnzbd.org
 Author: The SABnzbd Team
 Author-email: team@sabnzbd.org
README.mkd (80 changed lines)

@@ -1,78 +1,8 @@
-Release Notes - SABnzbd 2.2.0
+Release Notes - SABnzbd 2.2.1 Release Candidate 1
 =========================================================
 
-NOTE: Due to changes in this release, the queue will be converted when 2.2.0
-is started for the first time. Job order, settings and data will be
-preserved, but all jobs will be unpaused and URLs that did not finish
-fetching before the upgrade will be lost!
-
-## Changes since 2.1.0
-- Direct Unpack: Jobs will start unpacking during the download, reduces
-post-processing time but requires capable hard drive. Only works for jobs that
-do not need repair. Will be enabled if your incomplete folder-speed > 40MB/s
-- Reduced memory usage, especially with larger queues
-- Graphical overview of server-usage on Servers page
-- Notifications can now be limited to certain Categories
-- Removed 5 second delay between fetching URLs
-- Each item in the Queue and File list now has Move to Top/Bottom buttons
-- Add option to only tag a duplicate job without pausing or removing it
-- New option "History Retention" to automatically purge jobs from History
-- Jobs outside server retention are processed faster
-- Obfuscated filenames are renamed during downloading, if possible
-- Disk-space is now checked before writing files
-- Add "Retry All Failed" button to Glitter
-- Smoother animations in Firefox (disabled previously due to FF high-CPU usage)
-- Show missing articles in MB instead of number of articles
-- Better indication of verification process before and after repair
-- Remove video and audio rating icons from Queue
-- Show vote buttons instead of video and audio rating buttons in History
-- If enabled, replace dots in filenames also when there are spaces already
-- Handling of par2 files made more robust
-- All par2 files are only downloaded when enabled, not on enable_par_cleanup
-- Update GNTP bindings to 1.0.3
-- max_art_opt and replace_illegal moved from Switches to Specials
-- Removed Specials par2_multicore and allow_streaming
-- Windows: Full unicode support when calling repair and unpack
-- Windows: Move enable_multipar to Specials
-- Windows: MultiPar verification of a job is skipped after blocks are fetched
-- Windows & macOS: removed par2cmdline in favor of par2tbb/MultiPar
-- Windows & macOS: Updated WinRAR to 5.5.0
-
-## Bugfixes since 2.1.0
-- Shutdown/suspend did not work on some Linux systems
-- Standby/Hibernate was not working on Windows
-- Deleting a job could result in write errors
-- Display warning if "Extra par2 parameters" turn out to be wrong
-- RSS URLs with commas in the URL were broken
-- Fixed some "Saving failed" errors
-- Fixed crashing URLGrabber
-- Jobs with renamed files are now correctly handled when using Retry
-- Disk-space readings could be updated incorrectly
-- Correct redirect after enabling HTTPS in the Config
-- Fix race-condition in Post-processing
-- History would not always show latest changes
-- Convert HTML in error messages
-- In some cases not all RAR-sets were unpacked
-- Fixed unicode error during Sorting
-- Faulty pynotify could stop shutdown
-- Categories with ' in them could result in SQL errors
-- Special characters like []!* in filenames could break repair
-- Wizard was always accessible, even with username and password set
-- Correct value in "Speed" Extra History Column
-- Not all texts were shown in the selected Language
-- Various CSS fixes in Glitter and the Config
-- Catch "error 0" when using HTTPS on some Linux platforms
-- Warning is shown when many files with duplicate filenames are discarded
-- Improve zeroconf/bonjour by sending HTTPS setting and auto-discover of IP
-- Windows: Fix error in MultiPar-code when first par2-file was damaged
-- macOS: Catch "Protocol wrong type for socket" errors
-
-## Translations
-- Added Hebrew translation by ION IL, many other languages updated.
-
-## Depreciation notices
-- Option to limit Servers to specific Categories is now scheduled
-to be removed in the next release.
+## Bugfixes since 2.2.0
+- Some users were experiencing downloads being stuck at 99%
 
 ## Upgrading from 2.1.x and older
 - Finish queue
@@ -81,6 +11,10 @@ fetching before the upgrade will be lost!
 - Start SABnzbd
 
 ## Upgrade notices
+- Due to changes in this release, the queue will be converted when 2.2.x
+is started for the first time. Job order, settings and data will be
+preserved, but all jobs will be unpaused and URLs that did not finish
+fetching before the upgrade will be lost!
 - The organization of the download queue is different from 0.7.x releases.
 This version will not see the old queue, but you restore the jobs by going
 to Status page and use Queue Repair.

@@ -856,11 +856,29 @@ class NzbQueue(object):
             ArticleCache.do.purge_articles(nzo.saved_articles)
 
     def stop_idle_jobs(self):
-        """ Detect jobs that have zero files left and send them to post processing """
+        """ Detect jobs that have zero files left or are stalled
+            and send them to post-processing
+        """
+        nr_servers = len(sabnzbd.downloader.Downloader.do.servers)
         empty = []
         for nzo in self.__nzo_list:
-            if not nzo.futuretype and not nzo.files and nzo.status not in (Status.PAUSED, Status.GRABBING):
-                empty.append(nzo)
+            if not nzo.futuretype and nzo.status not in (Status.PAUSED, Status.GRABBING):
+                # Finished, but not yet ended
+                if not nzo.files:
+                    empty.append(nzo)
+                    continue
+
+                # Check if maybe stalled by checking if all files
+                # have all servers in their TryList indicating a lock-up
+                for file in nzo.files:
+                    if file.try_list_size() < nr_servers:
+                        # Not yet all tried
+                        break
+                else:
+                    # Only executed if all files are stuck
+                    logging.info('Job %s seems stalled, resetting', nzo.final_name)
+                    nzo.reset_all_try_lists()
 
         for nzo in empty:
             self.end_job(nzo)
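
The stalled-job detection added to `stop_idle_jobs` above (the hunk header places it in SABnzbd's `NzbQueue` class) hinges on Python's `for`/`else`: the `else` branch runs only when the loop over `nzo.files` finishes without hitting `break`, which here means every file has already tried every configured server. Below is a minimal, self-contained sketch of that pattern; `FakeFile`, `FakeJob` and `detect_stall` are illustrative stand-ins, not SABnzbd classes.

```python
import logging


class FakeFile:
    """Illustrative stand-in for an NzbFile: remembers which servers tried it."""

    def __init__(self, tried_servers):
        self.tried_servers = set(tried_servers)

    def try_list_size(self):
        # How many servers have already been tried for this file
        return len(self.tried_servers)


class FakeJob:
    """Illustrative stand-in for an NzbObject."""

    def __init__(self, name, files):
        self.final_name = name
        self.files = files

    def reset_all_try_lists(self):
        # Clearing the per-file try-lists lets the downloader retry everything
        for nzf in self.files:
            nzf.tried_servers.clear()


def detect_stall(job, nr_servers):
    """Reset the job's try-lists and return True when every file has tried every server."""
    # The real method only reaches this check for jobs that still have files;
    # a job with no files left is sent straight to post-processing instead.
    for nzf in job.files:
        if nzf.try_list_size() < nr_servers:
            # At least one file still has an untried server: not stalled
            break
    else:
        # Only executed when the loop finished without hitting `break`
        logging.info("Job %s seems stalled, resetting", job.final_name)
        job.reset_all_try_lists()
        return True
    return False


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    job = FakeJob("example", [FakeFile({"s1", "s2"}), FakeFile({"s1", "s2"})])
    print(detect_stall(job, nr_servers=2))  # True: both files have tried both servers
    print(detect_stall(job, nr_servers=2))  # False: the try-lists were just reset
```

Running it prints `True` for a job whose files have exhausted both servers, then `False` once the try-lists have been reset.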

@@ -91,6 +91,11 @@ class TryList(object):
             if server not in self.__try_list:
                 self.__try_list.append(server)
 
+    def try_list_size(self):
+        """ How many servers are listed as tried """
+        with TRYLIST_LOCK:
+            return len(self.__try_list)
+
     def reset_try_list(self):
         """ Clean the list """
         with TRYLIST_LOCK:
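
The new `try_list_size()` reads the list length under the same `TRYLIST_LOCK` that `reset_try_list()` uses, so the count seen by the stall check cannot race with a concurrent reset. A simplified standalone sketch of such a lock-guarded try-list follows; only the method bodies visible in the hunk are taken from the diff, while the lock object, the `add_to_try_list` wrapper and the overall class layout are assumptions made for illustration.

```python
import threading

# Module-level lock, analogous in spirit to SABnzbd's TRYLIST_LOCK (assumed here)
TRYLIST_LOCK = threading.RLock()


class TryList:
    """Minimal stand-in: records which servers have already tried an item."""

    def __init__(self):
        self.__try_list = []

    def add_to_try_list(self, server):
        # Writers and readers share one lock so the contents stay consistent
        with TRYLIST_LOCK:
            if server not in self.__try_list:
                self.__try_list.append(server)

    def try_list_size(self):
        """ How many servers are listed as tried """
        with TRYLIST_LOCK:
            return len(self.__try_list)

    def reset_try_list(self):
        """ Clean the list """
        with TRYLIST_LOCK:
            self.__try_list = []


if __name__ == "__main__":
    tl = TryList()
    tl.add_to_try_list("news.example:563")
    tl.add_to_try_list("news.example:563")  # duplicate is ignored
    print(tl.try_list_size())  # 1
    tl.reset_try_list()
    print(tl.try_list_size())  # 0
```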

@@ -1265,7 +1270,7 @@ class NzbObject(TryList):
             blocks_already = blocks_already + int_conv(new_nzf.blocks)
             logging.info('Prospectively added %s repair blocks to %s', new_nzf.blocks, self.final_name)
             # Reset NZO TryList
-            self.reset_try_list()
+            self.reset_all_try_lists()
 
     def add_to_direct_unpacker(self, nzf):
         """ Start or add to DirectUnpacker """
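
Switching the repair-block path from `reset_try_list()` to `reset_all_try_lists()` matters because `NzbObject` is itself a `TryList` (see the hunk header) while, judging by the stall check above, each of its files carries its own list: clearing only the job-level list could leave the per-file lists full, so freshly added repair blocks might never be offered to the servers again. The body of `reset_all_try_lists()` is not part of this diff; the sketch below shows one plausible shape under that assumption, with `FileSketch` and `JobSketch` as invented names.

```python
class FileSketch:
    """Per-file try-list (simplified stand-in for an NzbFile)."""

    def __init__(self):
        self.tried = []

    def reset_try_list(self):
        self.tried = []


class JobSketch:
    """Job-level object that also carries its own try-list, like NzbObject(TryList)."""

    def __init__(self, files):
        self.tried = []
        self.files = files

    def reset_try_list(self):
        self.tried = []

    def reset_all_try_lists(self):
        # Clear every file-level list as well as the job-level one, so newly
        # added repair blocks are offered to all servers again.
        for nzf in self.files:
            nzf.reset_try_list()
        self.reset_try_list()
```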

@@ -1396,6 +1401,7 @@ class NzbObject(TryList):
         for nzf in nzf_remove_list:
             nzf.deleted = True
             self.files.remove(nzf)
 
+        # If cleanup emptied the active files list, end this job
         if nzf_remove_list and not self.files:
             sabnzbd.NzbQueue.do.end_job(self)
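
The guard in the last hunk exists so that a job whose remaining files are all removed during cleanup is handed to post-processing rather than left sitting in the queue, which is the same failure mode the "downloads being stuck at 99%" note above describes. A small illustrative sketch of that guard follows; `Job`, `Queue` and `remove_unneeded` are invented names standing in for `NzbObject`, `sabnzbd.NzbQueue.do` and the surrounding cleanup code.

```python
class Job:
    """Invented stand-in for an NzbObject: just a list of remaining files."""

    def __init__(self, files):
        self.files = list(files)


class Queue:
    """Invented stand-in for the download queue."""

    def __init__(self):
        self.post_processing = []

    def end_job(self, job):
        # Hand the job over to post-processing instead of leaving it queued
        self.post_processing.append(job)

    def remove_unneeded(self, job, remove_list):
        for nzf in remove_list:
            if nzf in job.files:
                job.files.remove(nzf)

        # Without this guard, a job whose last files were cleaned up would
        # never leave the download queue (the "stuck at 99%" symptom).
        if remove_list and not job.files:
            self.end_job(job)


if __name__ == "__main__":
    q = Queue()
    job = Job(["a.par2", "b.par2"])
    q.remove_unneeded(job, ["a.par2", "b.par2"])
    print(len(q.post_processing))  # 1: the emptied job was ended, not left behind
```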