Compare commits

...

46 Commits

Author SHA1 Message Date
ShyPike
bc6b3091eb Update text files for 0.7.6Final. 2012-11-17 14:01:38 +01:00
ShyPike
4be1a13316 Add the "User-Agent" header of each API call to logging and warnings. 2012-11-17 10:56:36 +01:00
ShyPike
a77327ee7f Support NZB re-queuing also for NZB files in sub-folders. 2012-11-15 22:01:41 +01:00
ShyPike
aa706012af Update text files for 0.7.6Beta2 2012-11-14 21:01:01 +01:00
ShyPike
f5b6203194 Make check for running SABnzbd instance more robust.
Cancel bad side-effect of removing the version check.
Under some circumstances SABnzbd can draw the unjustified conclusion
that another instance is running. Now check for a proper version pattern
in the received output.
2012-11-14 20:57:02 +01:00
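The relaxed check (visible in the SABnzbd.py diff further down) only requires a version-like pattern in the API reply; a minimal sketch:

```python
import re

def looks_like_sabnzbd(reply):
    """True when an API reply contains a version-like pattern (e.g. '0.7.6'),
    instead of requiring an exact match against our own version string."""
    return bool(reply and re.search(r'\d+\.\d+\.', reply))
```

Any running SABnzbd version is now recognized, while a reply from an unrelated service listening on the same port is rejected.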
ShyPike
1ced9a54e4 Fix evaluation of schedules at startup.
With the introduction of multiple-day schedules, the schedule evaluator failed.
Fixed the evaluation.
A side-effect is that Config->Scheduler will no longer show the schedules in
the order they will occur from now. Instead they will be shown in order of
occurrence from Monday to Sunday.
2012-11-14 20:23:40 +01:00
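A sketch of the display order described above, assuming a hypothetical schedule format of (weekday-numbers string, 'HH:MM', action) with Monday = 1:

```python
def order_schedules(schedules):
    """Expand multi-day entries and sort by weekday then time,
    i.e. in order of occurrence from Monday (1) to Sunday (7)."""
    expanded = []
    for days, hhmm, action in schedules:
        for day in days.split():            # multi-day entry, e.g. '1 3'
            expanded.append((int(day), hhmm, action))
    return sorted(expanded)
```

For example, `order_schedules([('7', '08:00', 'pause'), ('1 3', '21:30', 'resume')])` lists both 'resume' occurrences (Monday and Wednesday) before the Sunday 'pause'.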
ShyPike
06c7089a77 Correct indentation in interface.py 2012-11-13 20:59:50 +01:00
ShyPike
ee1d864eea Update text files for 0.7.6Beta2 2012-11-12 21:47:19 +01:00
ShyPike
d703338935 Repair failed when a mini-par2 file was listed in the NZB but did not result in a file.
An incomplete mini-par2 file is now skipped in favor of the next available vol-par2 file.
A missing or damaged par2 file must make the next par2 file the primary par2-file
in the next repair run.
2012-11-12 21:10:27 +01:00
ShyPike
e87b24c460 Update text files for 0.7.6Beta1 2012-11-09 19:30:29 +01:00
shypike
3404ef6516 Update translations 2012-11-09 19:23:04 +01:00
shypike
181897e92b Prevent the Decoder from choking the Assembler.
Because the Decoder is CPU-bound, it has no reason to relinquish control.
This chokes the Assembler, which cannot write finished, cached articles
to their destination files. The result is a cache that keeps growing
until the Decoder is forced to flush articles.
By simply adding a sleep(0.001), the Decoder will trigger the task-scheduler
after each article, giving the Assembler a chance to do its work.
2012-11-08 23:12:15 +01:00
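The fix itself is the tiny sleep visible in the Decoder and Assembler diffs further down; a self-contained sketch of the pattern, with hypothetical producer/consumer names:

```python
import queue
import threading
import time

work = queue.Queue()
done = []

def decoder():
    """CPU-bound producer: without the sleep it never blocks, so the
    task-scheduler has little reason to switch to the consumer thread."""
    for article in range(100):
        work.put(article)
        time.sleep(0.001)   # yield to the scheduler after each article

def assembler():
    """Consumer: drains the cache by writing finished articles out."""
    for _ in range(100):
        done.append(work.get())

producer = threading.Thread(target=decoder)
consumer = threading.Thread(target=assembler)
producer.start(); consumer.start()
producer.join(); consumer.join()
```

In CPython the GIL makes this worse: a thread doing pure computation can hold the interpreter for its whole time slice, while any blocking call, even a 1 ms sleep, releases it immediately.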
ShyPike
26a504e3e2 Prepare code for intro of zoned access to UI and API. 2012-11-07 21:41:04 +01:00
ShyPike
b72ed09011 Prevent IPv6 Usenet servers from being tried when they're not reachable.
Detect whether external IPv6 addresses are reachable.
If so, allow IPv6 IPs to be picked.
Add a special option 'ipv6_servers' to allow the user to forbid (0), allow (1) or force (2)
the use of IPv6. Value 2 can be used in case the detection by SABnzbd doesn't work reliably.
2012-11-07 20:07:25 +01:00
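A sketch of both pieces with hypothetical names; the probe target is an assumption, and SABnzbd's real detection may differ:

```python
import socket

def ipv6_reachable(host='ipv6.google.com', port=80, timeout=2):
    """Probe whether an external IPv6 address is actually reachable
    (hypothetical probe target)."""
    try:
        sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
        sock.settimeout(timeout)
        sock.connect((host, port))
        sock.close()
        return True
    except OSError:
        return False

def usable_addresses(addresses, ipv6_servers, have_ipv6):
    """Apply the 'ipv6_servers' special: 0 = forbid, 1 = allow when
    detected as reachable, 2 = force (for when detection is unreliable).
    Each address is a (family, address) tuple."""
    allow_v6 = {0: False, 1: have_ipv6, 2: True}[ipv6_servers]
    return [a for a in addresses if allow_v6 or a[0] != socket.AF_INET6]
```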
ShyPike
bb99c0d58e Fix problem with late detection of win32api absence. 2012-11-06 23:40:45 +01:00
ShyPike
4516027fdb Repair side-effect of SFV improvements.
A download without par2 files and without SFV files should not be failed.
2012-11-05 22:51:51 +01:00
ShyPike
e35f2ea3cd Prevent crash on Unix-Pythons that don't have the os.getloadavg() function.
Some Unix Python builds are defective and do not provide os.getloadavg().
Add simple exception handler to cover this case.
2012-11-05 20:40:21 +01:00
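The guard is a simple exception handler around the call; a sketch:

```python
import os

def loadavg():
    """Return the 1/5/15-minute load averages, or None on Python builds
    that don't provide os.getloadavg()."""
    try:
        return os.getloadavg()
    except (AttributeError, OSError):
        return None
```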
ShyPike
6b79fad626 Remove version check when looking for a running instance of SABnzbd.
This will lower the chance of inadvertently launching multiple instances.
User will need to use --new to force a new instance.
2012-11-05 19:19:32 +01:00
ShyPike
ac311be430 Successfully pre-checked job lost its attributes when those were changed during check.
For successful jobs, the attributes were not saved to disk (they were for failed ones).
Solution is to save attributes independent of result.
2012-11-05 19:09:10 +01:00
shypike
4fb32bff5f Fix crash when a job is sent to post-processing immediately after startup.
The Assembler wasn't running yet when a job was sent to post-processing
during startup of the queue. The Assembler is used as a relay to send
jobs to post-processing.
Solution is to start Assembler before initializing the queue.
2012-11-05 18:54:15 +01:00
ShyPike
5fda342a55 Don't try to repair/verify par sets that have "sample" in their names.
Only when sample deletion is enabled.
2012-11-03 20:34:17 +01:00
ShyPike
e23aab4710 Improve SFV handling, preventing odd side-effects in multi-set NZBs.
SFV verification per PAR-set using only the matching SFV file.
When no par2 files are found, use all available SFV files.
Remember the verification status of each set in the "verified" marker file.
Improve par-set matcher, so that there's no mix-up when one set name
is a substring of another set name.
2012-11-03 16:57:32 +01:00
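The substring mix-up mentioned in the last point can be sketched like this (hypothetical helper; the real matcher lives in the post-processing code):

```python
import re

def files_of_set(setname, filenames):
    """Match only files belonging to 'setname': the set name must be
    followed directly by a par2 volume suffix or extension, so set
    'show' does not swallow files of the set 'show.special'."""
    pattern = re.compile(re.escape(setname) + r'\.(vol\d+\+\d+\.)?par2$', re.I)
    return [f for f in filenames if pattern.match(f)]

files = ['show.par2', 'show.vol0+1.par2', 'show.special.par2']
# files_of_set('show', files) leaves 'show.special.par2' alone,
# where a naive "setname in filename" test would not.
```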
ShyPike
3837d5dace Handle par-sets that have been renamed after generation of the par2 files.
Requires a wildcard to be added as a par2 parameter to make it scan all applicable files.
The rename actions need to be stored in a persistent file to prevent re-downloading in a Retry.
The status of correct sets must be remembered while fetching extra par2 files for failed sets.
2012-11-03 16:56:53 +01:00
shypike
f61e7cb1ed Update text files for 0.7.5 Final. 2012-11-03 16:15:22 +01:00
ShyPike
3de0c0e4ac Add missing "%dn" (original folder name) formula to Generic Sorting. 2012-11-01 21:24:51 +01:00
ShyPike
63796d3feb Improve logging for RSS readouts. 2012-11-01 19:47:48 +01:00
ShyPike
6b07529300 Update text files for 0.7.5RC1 2012-10-30 20:35:57 +01:00
ShyPike
e10676710c Support for news in Config. 2012-10-30 20:17:51 +01:00
shypike
77f67c6666 Merge pull request #59 from akuiraz/newzxxx2_fix
Fixed regex for newzbin rss filtering
2012-10-30 11:58:20 -07:00
ShyPike
bdbcdd61e1 Mask password in "Add Server" dialog. 2012-10-30 19:51:12 +01:00
ShyPike
4ab7ec754d Add periodic detection of completed but hanging jobs in the queue.
The 30 second watchdog now detects jobs without pending files.
Those jobs will be sent to the post-processor.
2012-10-30 18:47:18 +01:00
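A sketch of the watchdog's test, with a hypothetical job representation:

```python
def completed_but_hanging(jobs):
    """Run from the periodic 30-second watchdog: a job still in the
    queue with no pending files left is complete but hanging, and
    should be handed to the post-processor."""
    return [job for job in jobs if not job['pending_files']]

jobs = [{'name': 'a', 'pending_files': ['part1']},
        {'name': 'b', 'pending_files': []}]
# completed_but_hanging(jobs) selects job 'b' for post-processing
```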
akuiraz
20f98f48bc Fixed regex for newzbin filtering by adding xxx2; RSS feeds from newzxxx2.ch now download successfully 2012-10-30 01:27:36 -04:00
shypike
84e0502e50 Prevent crash when trying to open non-existing "complete" folder from Windows System-tray icon. 2012-10-28 12:39:34 +01:00
shypike
2aa1b00dbb Prevent CherryPy crash when reading a cookie from another app which has a non-standard name. 2012-10-27 13:33:33 +02:00
ShyPike
972078a514 Fix problem with "Read" button when RSS feed name contains "&".
The feed's name wasn't properly encoded in the URL.
2012-10-24 19:34:45 +02:00
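The template diff further down adds a JavaScript urlencode() helper for this; the same idea in Python terms:

```python
from urllib.parse import quote_plus

feed = 'Movies & TV'
url = '?feed=' + quote_plus(feed)
# '&' becomes %26, so it no longer terminates the query parameter
```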
shypike
be8382d25b Add special option 'empty_postproc'.
Setting this option will run the user script on an empty download.
Normally this isn't done.
The status sent to the user script is -1, meaning "no files were downloaded".
2012-10-21 18:23:25 +02:00
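In a user script the status arrives as a command-line argument; a sketch of handling the new -1 case (the other codes are assumptions from the SABnzbd user-script convention; check the documentation):

```python
def describe_status(status):
    """Map the post-processing status passed to user scripts; -1 is the
    new 'no files were downloaded' case enabled by 'empty_postproc'.
    Codes 0-3 are assumed from the SABnzbd user-script convention."""
    return {0: 'OK',
            1: 'verification failed',
            2: 'unpack failed',
            3: 'verification and unpack failed',
            -1: 'empty download'}.get(status, 'unknown')
```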
shypike
8d46e88cd8 Update translations 2012-10-21 12:56:00 +02:00
shypike
6b6b1b79ad Add 'prio_sort_list' special.
This is a list of file name extensions.
Matching files will be the first to be downloaded within an NZB.

Also, if the user sets a simple space-separated list, it will be converted to a standardized list.
2012-10-21 12:16:13 +02:00
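Both behaviors sketched with hypothetical helpers (the real list parsing appears in the OptionList diff near the end):

```python
def parse_list(value):
    """Normalize a special's value: accept a ready list, a simple
    space-separated string, or a quoted/comma-separated list."""
    if isinstance(value, list):
        return value
    if '"' not in value and ',' not in value:
        return value.split()
    return [item.strip().strip('"') for item in value.split(',')]

def prio_sort(filenames, prio_exts):
    """Stable sort: files whose extension is in prio_exts download
    first; everything else keeps its original order."""
    prio = {ext.lower().lstrip('.') for ext in prio_exts}
    return sorted(filenames,
                  key=lambda f: f.rsplit('.', 1)[-1].lower() not in prio)

exts = parse_list('nfo sfv')
order = prio_sort(['a.rar', 'a.nfo', 'a.r00', 'a.sfv'], exts)
```

Here `exts` is `['nfo', 'sfv']`, and the .nfo/.sfv files move ahead of the archive parts.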
shypike
e1fd40b34d OSX: Retina compatible menu-bar icons. 2012-10-20 19:57:11 +02:00
ShyPike
bc1f8f97a8 Prefix categories of nzbxxx.com with "XXX:". 2012-10-20 16:03:09 +02:00
shypike
b51705f458 Fix issues with accented and special characters in names of downloaded files.
name_extractor() returned Unicode instead of platform-compatible encoding.
QuickCheck assumed incorrectly that file names are not yet platform-compatible.
2012-10-20 15:11:08 +02:00
ShyPike
aaed5f4797 Adjust nzbmatrix category table. 2012-10-17 21:30:22 +02:00
ShyPike
a8eedef1d2 Prevent stuck jobs at end of pre-check. 2012-10-17 21:22:36 +02:00
shypike
9407e21e1e Prevent unusual SFV files from crashing post-processing. 2012-10-13 10:56:05 +02:00
ShyPike
ba6dcfd467 Don't show speed and ETA when download is paused during post-processing. 2012-10-08 21:21:15 +02:00
shypike
e2c1de5008 Prevent soft-crash when api-function "addfile" is called without parameters. 2012-10-06 21:57:48 +02:00
42 changed files with 520 additions and 252 deletions


@@ -1,5 +1,5 @@
*******************************************
*** This is SABnzbd 0.7.4 ***
*** This is SABnzbd 0.7.6 ***
*******************************************
SABnzbd is an open-source cross-platform binary newsreader.
It simplifies the process of downloading from Usenet dramatically,


@@ -1,3 +1,46 @@
-------------------------------------------------------------------------------
0.7.6Final by The SABnzbd-Team
-------------------------------------------------------------------------------
- Recursive scanning when re-queuing downloaded NZB files
- Log "User-Agent" header of API calls
-------------------------------------------------------------------------------
0.7.6Beta2 by The SABnzbd-Team
-------------------------------------------------------------------------------
- A damaged smallest par2 can block fetching of more par2 files
- Fix evaluation of schedules at startup
- Make check for running SABnzbd instance more robust
-------------------------------------------------------------------------------
0.7.6Beta1 by The SABnzbd-Team
-------------------------------------------------------------------------------
- Handle par2 sets that were renamed after creation
- Prevent blocking assembly of completed files (this resulted in excessive CPU and memory usage)
- Fix speed issues with some Usenet servers due to unreachable IPv6 addresses
- Fix issues with SFV-based checks
- Prevent crash on Unix-Pythons that don't have the os.getloadavg() function
- Successfully pre-checked job lost its attributes when those were changed during check
- Remove version check when looking for a running instance of SABnzbd
-------------------------------------------------------------------------------
0.7.5Final by The SABnzbd-Team
-------------------------------------------------------------------------------
- Add missing %dn formula to Generic Sort
- Improve RSS logging
-------------------------------------------------------------------------------
0.7.5RC1 by The SABnzbd-Team
-------------------------------------------------------------------------------
- Prevent stuck jobs at end of pre-check.
- Fix issues with accented and special characters in names of downloaded files.
- Adjust nzbmatrix category table.
- Add 'prio_sort_list' special
- Add special option 'empty_postproc'.
- Prevent CherryPy crash when reading a cookie from another app which has a non-standard name.
- Prevent crash when trying to open non-existing "complete" folder from Windows System-tray icon.
- Fix problem with "Read" button when RSS feed name contains "&".
- Prevent unusual SFV files from crashing post-processing.
- OSX: Retina compatible menu-bar icons.
- Don't show speed and ETA when download is paused during post-processing
- Prevent soft-crash when api-function "addfile" is called without parameters.
- Add news channel frame
-------------------------------------------------------------------------------
0.7.4Final by The SABnzbd-Team
-------------------------------------------------------------------------------


@@ -1,4 +1,4 @@
SABnzbd 0.7.4
SABnzbd 0.7.6
-------------------------------------------------------------------------------
0) LICENSE
@@ -60,7 +60,8 @@ Unix/Linux/OSX
OSX Leopard/SnowLeopard
Python 2.6 http://www.activestate.com
OSX Lion Apple Python 2.7 (included in OSX)
OSX Lion/MountainLion
Apple Python 2.7 Included in OSX (default)
Windows
Python-2.7.latest http://www.activestate.com


@@ -320,7 +320,7 @@ WriteRegStr HKEY_LOCAL_MACHINE "Software\Microsoft\Windows\CurrentVersion\Uninst
WriteRegStr HKEY_LOCAL_MACHINE "Software\Microsoft\Windows\CurrentVersion\Uninstall\SABnzbd" "URLUpdateInfo" 'http://sabnzbd.org/'
WriteRegStr HKEY_LOCAL_MACHINE "Software\Microsoft\Windows\CurrentVersion\Uninstall\SABnzbd" "Comments" 'The automated Usenet download tool'
WriteRegStr HKEY_LOCAL_MACHINE "Software\Microsoft\Windows\CurrentVersion\Uninstall\SABnzbd" "DisplayIcon" '$INSTDIR\interfaces\Classic\templates\static\images\favicon.ico'
WriteRegDWORD HKEY_LOCAL_MACHINE "Software\Microsoft\Windows\CurrentVersion\Uninstall\SABnzbd" "EstimatedSize" 29622
WriteRegDWORD HKEY_LOCAL_MACHINE "Software\Microsoft\Windows\CurrentVersion\Uninstall\SABnzbd" "EstimatedSize" 25674
WriteRegDWORD HKEY_LOCAL_MACHINE "Software\Microsoft\Windows\CurrentVersion\Uninstall\SABnzbd" "NoRepair" -1
WriteRegDWORD HKEY_LOCAL_MACHINE "Software\Microsoft\Windows\CurrentVersion\Uninstall\SABnzbd" "NoModify" -1
; write out uninstaller


@@ -1,7 +1,7 @@
Metadata-Version: 1.0
Name: SABnzbd
Version: 0.7.4
Summary: SABnzbd-0.7.4
Version: 0.7.6
Summary: SABnzbd-0.7.6
Home-page: http://sabnzbd.org
Author: The SABnzbd Team
Author-email: team@sabnzbd.org


@@ -1,36 +1,19 @@
Release Notes - SABnzbd 0.7.4
Release Notes - SABnzbd 0.7.6
===============================
## Highlights in 0.7.4
- OSX Mountain Lion: Notification Center support
- OSX Mountain Lion: improved "keep awake" support
- Scheduler: action can now run on multiple weekdays
- Fix: pre-check failed to consider extra par2 files
## Features
- Support for HTTPS chain files (needed when you buy your own certificate)
- Special option: rss_odd_titles, see [Wiki](http://wiki.sabnzbd.org/configure-special-0-7/ "Wiki")
- Special option: 'overwrite_files', see [Wiki](http://wiki.sabnzbd.org/configure-special-0-7/ "Wiki")
- Show memory usage on Linux systems
- Scheduler: add "remove failed jobs" action
- Properly handle par2-sets that were renamed after creation by the poster
- Recursive scanning when re-queuing downloaded NZB files
## Bug fixes
- After successful pre-check, preserve a job's position in the queue
- Restore SABnzbd icon for Growl
- Make Windows version less eager to use par2-classic
- Prevent jobs from showing up in queue and history simultaneously
- Fix failure to fetch more par2-files for posts with badly formatted subject lines
- Fix for third-party tools requesting too much history
- New RSS feed should no longer be considered new after its first, empty readout.
- Make "auth" call backward-compatible with 0.6.x releases.
- Config->Notifications: email and growl server addresses should not be marked as "url" type.
- OSX: fix top menu queue info so that it shows total queue size
- Fixed unjustified warning that can occur with OSX Growl 2.0
- Pre-queue script no longer got the show/season/episode information.
- Prevent crash on startup when a fully downloaded job is still in download queue.
- Fix incorrect end-of-month quota reset
- Fix UI refresh issue when using Safari on iOS6 (Safari bug)
- Prevent blocking assembly of completed files (this resulted in excessive CPU and memory usage)
- Fix speed issues with some Usenet servers due to unreachable IPv6 addresses
- Fix issues with SFV-based checks
- Successfully pre-checked job lost its attributes when those were changed during check
- No longer check version when looking for a running instance of SABnzbd (this prevents unintended multiple instances).
- A damaged base par2 file could block download of more par2 files
- Fix evaluation of schedules at startup
- Fix possible failing startup when running as a Windows Service
## What's new in 0.7.0


@@ -28,6 +28,7 @@ import signal
import socket
import platform
import time
import re
try:
import Cheetah
@@ -691,10 +692,7 @@ def is_sabnzbd_running(url):
try:
url = '%s&mode=version' % (url)
ver = sabnzbd.newsunpack.get_from_url(url)
if ver and ver.strip(' \n\r\t') == sabnzbd.__version__:
return True
else:
return False
return bool(ver and re.search(r'\d+\.\d+\.', ver))
except:
return False
@@ -714,7 +712,7 @@ def find_free_port(host, currentport):
def check_for_sabnzbd(url, upload_nzbs, allow_browser=True):
""" Check for a running instance of sabnzbd(same version) on this port
""" Check for a running instance of sabnzbd on this port
allow_browser==True|None will launch the browser, False will not.
"""
if allow_browser is None:


@@ -658,7 +658,10 @@ class Request(object):
# Handle cookies differently because on Konqueror, multiple
# cookies come on different lines with the same key
if name == 'Cookie':
self.cookie.load(value)
try:
self.cookie.load(value)
except:
pass
if not dict.__contains__(headers, 'Host'):
# All Internet-based HTTP/1.1 servers MUST respond with a 400


@@ -24,6 +24,11 @@
<h5 class="copyright">Copyright &copy; 2008-2012 The SABnzbd Team &lt;<span style="color: #0000ff;">team@sabnzbd.org</span>&gt;</h5>
<p class="copyright"><small>$T('yourRights')</small></p>
</div>
<!--#if $news_items#-->
<div class="padding">
<iframe frameborder=0 width=100% src="http://sabnzbdplus.sourceforge.net/version/news.html"></iframe>
</div>
<!--#end if#-->
</div>
<!--#include $webdir + "/_inc_footer_uc.tmpl"#-->


@@ -498,6 +498,10 @@
</div><!-- /colmask -->
<script>
function urlencode(str) {
return encodeURIComponent(str).replace(/!/g, '%21').replace(/'/g, '%27').replace(/\(/g, '%28').replace(/\)/g, '%29').replace(/\*/g, '%2A').replace(/%20/g, '+');
}
\$(document).ready(function(){
\$('.editFeed').click(function(){
var oldURI = \$(this).prev().val();
@@ -537,7 +541,7 @@
url: "test_rss_feed",
data: {feed: whichFeed, session: "$session" }
}).done(function( msg ) {
location = '?feed=' + whichFeed;
location = '?feed=' + urlencode(whichFeed);
// location.reload();
});
});


@@ -35,7 +35,7 @@
</div>
<div class="field-pair alt">
<label class="config" for="password">$T('srv-password')</label>
<input type="text" name="password" id="password" size="30" />
<input type="password" name="password" id="password" size="30" />
</div>
<div class="field-pair">
<label class="config" for="connections">$T('srv-connections')</label>


@@ -265,6 +265,11 @@
<td>$T('sort-File')</td>
</tr>
<tr>
<td class="align-right"><b>$T('orgDirname'):</b></td>
<td>%dn</td>
<td>$T("sort-Folder")</td>
</tr>
<tr class="even">
<td class="align-right"><b>$T('lowercase'):</b></td>
<td>{$T('TEXT')}</td>
<td>$T('text')</td>
@@ -432,7 +437,7 @@
return function(callback, ms){
clearTimeout (timer);
timer = setTimeout(callback, ms);
}
}
})();
function tvSet(val) {


@@ -46,7 +46,7 @@
<div class="field-pair alt">
<label class="nocheck clearfix" for="password">
<span class="component-title">$T('srv-password')</span>
<input type="text" size="25" name="password"/>
<input type="password" size="25" name="password"/>
</label>
</div>
<div class="field-pair">
@@ -156,7 +156,7 @@
<div class="field-pair alt">
<label class="nocheck clearfix" for="password">
<span class="component-title">$T('srv-password')</span>
<input type="text" size="25" name="password" value="$servers[$server]['password']" />
<input type="password" size="25" name="password" value="$servers[$server]['password']" />
</label>
</div>
<div class="field-pair">


@@ -1242,12 +1242,16 @@ $.plush.histprevslots = $.plush.histnoofslots; // for the next refresh
SetQueueETAStats : function(speed,kbpersec,timeleft,eta) {
// ETA/speed stats at top of queue
if (kbpersec < 1 && $.plush.paused)
if (kbpersec < 1 || $.plush.paused) {
$('#stats_eta').html('&mdash;');
else
$('#stats_speed').html('&mdash;');
$('#time-left').attr('title','&mdash;'); // Tooltip on "time left"
}
else {
$('#stats_eta').html(timeleft);
$('#stats_speed').html(speed+"B/s");
$('#time-left').attr('title',eta); // Tooltip on "time left"
$('#stats_speed').html(speed+"B/s");
$('#time-left').attr('title',eta); // Tooltip on "time left"
}
},

Binary files changed (images not shown; previous sizes 388 B, 902 B and 1.1 KiB), including the new file osx/resources/sab_idle.tiff.


@@ -8,14 +8,14 @@ msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2012-09-10 17:12+0000\n"
"PO-Revision-Date: 2012-09-25 03:49+0000\n"
"Last-Translator: Joel Pedraza <Unknown>\n"
"PO-Revision-Date: 2012-10-21 17:25+0000\n"
"Last-Translator: Juanma <Unknown>\n"
"Language-Team: Spanish <es@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2012-09-26 05:07+0000\n"
"X-Generator: Launchpad (build 16022)\n"
"X-Launchpad-Export-Date: 2012-10-22 04:51+0000\n"
"X-Generator: Launchpad (build 16165)\n"
#: SABnzbd.py:302 [Error message]
msgid "Failed to start web-interface"
@@ -3867,7 +3867,7 @@ msgstr "Purgar"
#: sabnzbd/skintext.py:755
msgid "left"
msgstr "Izquierda"
msgstr "Restante"
#: sabnzbd/skintext.py:756 [Used in speed menu. Split in two lines if too long.]
msgid "Max Speed"


@@ -8,14 +8,14 @@ msgstr ""
"Project-Id-Version: sabnzbd\n"
"Report-Msgid-Bugs-To: FULL NAME <EMAIL@ADDRESS>\n"
"POT-Creation-Date: 2012-09-10 17:12+0000\n"
"PO-Revision-Date: 2012-08-02 15:29+0000\n"
"PO-Revision-Date: 2012-10-13 17:23+0000\n"
"Last-Translator: nicusor <Unknown>\n"
"Language-Team: Romanian <ro@li.org>\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"X-Launchpad-Export-Date: 2012-09-11 05:13+0000\n"
"X-Generator: Launchpad (build 15924)\n"
"X-Launchpad-Export-Date: 2012-10-14 04:48+0000\n"
"X-Generator: Launchpad (build 16137)\n"
#: SABnzbd.py:302 [Error message]
msgid "Failed to start web-interface"
@@ -55,7 +55,7 @@ msgstr "Dezactivează HTTPS din cauza lipsei fişierelor CERT şi KEY"
#: SABnzbd.py:1525
msgid "SABnzbd %s started"
msgstr ""
msgstr "SABnzbd %s pornit"
#: SABnzbd.py:1667 # sabnzbd/osxmenu.py:775
msgid "SABnzbd shutdown finished"
@@ -1517,7 +1517,7 @@ msgstr "Citeşte fluxuri RSS"
#: sabnzbd/skintext.py:65 [Config->Scheduler]
msgid "Remove failed jobs"
msgstr ""
msgstr "Elimină sarcini nereuşite"
#: sabnzbd/skintext.py:70 [Speed indicator kilobytes/sec]
msgid "KB/s"
@@ -2253,11 +2253,11 @@ msgstr "Nume fişier sau cale Cheie HTTPS."
#: sabnzbd/skintext.py:302
msgid "HTTPS Chain Certifcates"
msgstr ""
msgstr "Certificate Cheie HTTPS"
#: sabnzbd/skintext.py:303
msgid "File name or path to HTTPS Chain."
msgstr ""
msgstr "Nume fişier sau cale cheie HTTPS."
#: sabnzbd/skintext.py:304
msgid "Tuning"
@@ -2764,7 +2764,7 @@ msgstr "Verificare săptămânală versiuni noi SABnzbd."
#: sabnzbd/skintext.py:421 [Pick list for weekly test for new releases]
msgid "Also test releases"
msgstr ""
msgstr "Testeaza şi versiuni de încercare"
#: sabnzbd/skintext.py:422
msgid "Replace Spaces in Foldername"
@@ -3204,19 +3204,20 @@ msgstr "Trimite notificări către NotifyOSD"
#: sabnzbd/skintext.py:557
msgid "Notification Center"
msgstr ""
msgstr "Centru Notificări"
#: sabnzbd/skintext.py:558
msgid "Send notifications to Notification Center"
msgstr ""
msgstr "Trimite notificări la Centru Notificări"
#: sabnzbd/skintext.py:559
msgid "Notification classes"
msgstr ""
msgstr "Clase notificări"
#: sabnzbd/skintext.py:560
msgid "Enable classes of messages to be reported (none, one or multiple)"
msgstr ""
"Activează clasă mesaje ce vor fi raportate (niciunul, unul sau mai multe)"
#: sabnzbd/skintext.py:564
msgid ""


@@ -302,10 +302,11 @@ def initialize(pause_downloader = False, clean_up = False, evalSched=False, repa
PostProcessor()
NzbQueue()
NzbQueue.do.read_queue(repair)
Assembler()
NzbQueue.do.read_queue(repair)
Downloader(pause_downloader or paused)
DirScanner()
@@ -1037,6 +1038,9 @@ def check_all_tasks():
# Check one-shot pause
sabnzbd.scheduler.pause_check()
# Check (and terminate) idle jobs
sabnzbd.nzbqueue.NzbQueue.do.stop_idle_jobs()
return True


@@ -293,9 +293,11 @@ def _api_addfile(name, output, kwargs):
#Side effect of next line is that attribute .value is created
#which is needed to make add_nzbfile() work
size = name.length
else:
elif hasattr(name, 'value'):
size = len(name.value)
if name is not None and name.filename and size:
else:
size = 0
if name is not None and size and name.filename:
cat = kwargs.get('cat')
xcat = kwargs.get('xcat')
if not cat and xcat:
@@ -1105,7 +1107,7 @@ def build_queue(web_dir=None, root=None, verbose=False, prim=True, webdir='', ve
slot['mbdone_fmt'] = locale.format('%d', int(mb-mbleft), True)
slot['size'] = format_bytes(bytes)
slot['sizeleft'] = format_bytes(bytesleft)
if not Downloader.do.paused and status != 'Paused' and status != 'Fetching' and not found_active:
if not Downloader.do.paused and status not in (Status.PAUSED, Status.FETCHING) and not found_active:
if status == Status.CHECKING:
slot['status'] = Status.CHECKING
else:
@@ -1129,7 +1131,7 @@ def build_queue(web_dir=None, root=None, verbose=False, prim=True, webdir='', ve
slot['percentage'] = "%s" % (int(((mb-mbleft) / mb) * 100))
slot['missing'] = missing
if status in (Status.PAUSED, Status.CHECKING):
if Downloader.do.paused or Downloader.do.postproc or status not in (Status.DOWNLOADING, Status.QUEUED):
slot['timeleft'] = '0:00:00'
slot['eta'] = 'unknown'
else:
@@ -1542,7 +1544,8 @@ def build_header(prim, webdir=''):
if not color:
color = ''
header = { 'T': Ttemplate, 'Tspec': Tspec, 'Tx' : Ttemplate, 'version':sabnzbd.__version__, 'paused': Downloader.do.paused,
header = { 'T': Ttemplate, 'Tspec': Tspec, 'Tx' : Ttemplate, 'version':sabnzbd.__version__,
'paused': Downloader.do.paused or Downloader.do.postproc,
'pause_int': scheduler.pause_int(), 'paused_all': sabnzbd.PAUSED_ALL,
'uptime':uptime, 'color_scheme':color }
speed_limit = Downloader.do.get_limit()
@@ -1593,13 +1596,13 @@ def build_header(prim, webdir=''):
header['left_quota'] = to_units(BPSMeter.do.left)
status = ''
if Downloader.do.paused:
if Downloader.do.paused or Downloader.do.postproc:
status = Status.PAUSED
elif bytespersec > 0:
status = Status.DOWNLOADING
else:
status = 'Idle'
header['status'] = "%s" % status
header['status'] = status
anfo = ArticleCache.do.cache_info()


@@ -139,7 +139,7 @@ def _assemble(nzf, path, dupe):
decodetable = nzf.decodetable
for articlenum in decodetable:
sleep(0.01)
sleep(0.001)
article = decodetable[articlenum]
data = ArticleCache.do.load_article(article)


@@ -80,6 +80,7 @@ email_dir = OptionDir('misc', 'email_dir', create=True)
email_rss = OptionBool('misc', 'email_rss', False)
version_check = OptionNumber('misc', 'check_new_rel', 1)
news_items = OptionBool('misc', 'news_items', True)
autobrowser = OptionBool('misc', 'auto_browser', True)
replace_illegal = OptionBool('misc', 'replace_illegal', True)
pre_script = OptionStr('misc', 'pre_script', 'None')
@@ -125,8 +126,10 @@ auto_sort = OptionBool('misc', 'auto_sort', False)
folder_rename = OptionBool('misc', 'folder_rename', True)
folder_max_length = OptionNumber('misc', 'folder_max_length', DEF_FOLDER_MAX, 20, 65000)
pause_on_pwrar = OptionBool('misc', 'pause_on_pwrar', True)
prio_sort_list = OptionList('misc', 'prio_sort_list')
safe_postproc = OptionBool('misc', 'safe_postproc', True)
empty_postproc = OptionBool('misc', 'empty_postproc', False)
pause_on_post_processing = OptionBool('misc', 'pause_on_post_processing', False)
ampm = OptionBool('misc', 'ampm', False)
rss_filenames = OptionBool('misc', 'rss_filenames', False)
@@ -215,6 +218,7 @@ ssl_type = OptionStr('misc', 'ssl_type', 'v23')
unpack_check = OptionBool('misc', 'unpack_check', True)
no_penalties = OptionBool('misc', 'no_penalties', False)
randomize_server_ip = OptionBool('misc', 'randomize_server_ip', False)
ipv6_servers = OptionNumber('misc', 'ipv6_servers', 1, 0, 2)
# Internal options, not saved in INI file
debug_delay = OptionNumber('misc', 'debug_delay', 0, add=False)
@@ -223,6 +227,7 @@ api_key = OptionStr('misc', 'api_key', create_api_key())
nzb_key = OptionStr('misc', 'nzb_key', create_api_key())
disable_key = OptionBool('misc', 'disable_api_key', False)
api_warnings = OptionBool('misc', 'api_warnings', True)
local_range = OptionStr('misc', 'local_range')
max_art_tries = OptionNumber('misc', 'max_art_tries', 3, 2)
max_art_opt = OptionBool('misc', 'max_art_opt', False)
use_pickle = OptionBool('misc', 'use_pickle', False)


@@ -224,7 +224,10 @@ class OptionList(Option):
error = None
if value is not None:
if not isinstance(value, list):
value = listquote.simplelist(value)
if '"' not in value and ',' not in value:
value = value.split()
else:
value = listquote.simplelist(value)
if self.__validation:
error, value = self.__validation(value)
if not error:


@@ -65,6 +65,7 @@ FUTURE_Q_FOLDER = 'future'
JOB_ADMIN = '__ADMIN__'
VERIFIED_FILE = '__verified__'
QCHECK_FILE = '__skip_qcheck__'
RENAMES_FILE = '__renames__'
ATTRIB_FILE = 'SABnzbd_attrib'
REPAIR_REQUEST = 'repair-all.sab'


@@ -23,6 +23,7 @@ import Queue
import binascii
import logging
import re
from time import sleep
from threading import Thread
try:
import _yenc
@@ -72,6 +73,7 @@ class Decoder(Thread):
def run(self):
from sabnzbd.nzbqueue import NzbQueue
while 1:
sleep(0.001)
art_tup = self.queue.get()
if not art_tup:
break


@@ -84,6 +84,12 @@ def check_server(host, port):
return badParameterResponse(T('Server address "%s:%s" is not valid.') % (host, port))
def check_access():
""" Check if external address is allowed """
referrer = cherrypy.request.remote.ip
return referrer in ('127.0.0.1', '::1') or referrer.startswith(cfg.local_range())
def ConvertSpecials(p):
""" Convert None to 'None' and 'Default' to ''
"""
@@ -158,6 +164,8 @@ def set_auth(conf):
def check_session(kwargs):
""" Check session key """
if not check_access():
return u'No access'
key = kwargs.get('session')
if not key:
key = kwargs.get('apikey')
@@ -176,6 +184,10 @@ def check_apikey(kwargs, nokey=False):
""" Check api key or nzbkey
Return None when OK, otherwise an error message
"""
def log_warning(txt):
txt = '%s %s' % (txt, cherrypy.request.headers.get('User-Agent', '??'))
logging.warning('%s', txt)
output = kwargs.get('output')
mode = kwargs.get('mode', '')
callback = kwargs.get('callback')
@@ -188,19 +200,22 @@ def check_apikey(kwargs, nokey=False):
# For NZB upload calls, a separate key can be used
nzbkey = kwargs.get('mode', '') in ('addid', 'addurl', 'addfile', 'addlocalfile')
if not nzbkey and not check_access():
return report(output, 'No access')
# First check APIKEY, if OK that's sufficient
if not (cfg.disable_key() or nokey):
key = kwargs.get('apikey')
if not key:
if not special:
logging.warning(Ta('API Key missing, please enter the api key from Config->General into your 3rd party program:'))
log_warning(Ta('API Key missing, please enter the api key from Config->General into your 3rd party program:'))
return report(output, 'API Key Required', callback=callback)
elif nzbkey and key == cfg.nzb_key():
return None
elif key == cfg.api_key():
return None
else:
logging.warning(Ta('API Key incorrect, Use the api key from Config->General in your 3rd party program:'))
log_warning(Ta('API Key incorrect, Use the api key from Config->General in your 3rd party program:'))
return report(output, 'API Key Incorrect', callback=callback)
# No active APIKEY, check web credentials instead
@@ -209,7 +224,7 @@ def check_apikey(kwargs, nokey=False):
pass
else:
if not special:
logging.warning(Ta('Authentication missing, please enter username/password from Config->General into your 3rd party program:'))
log_warning(Ta('Authentication missing, please enter username/password from Config->General into your 3rd party program:'))
return report(output, 'Missing authentication', callback=callback)
return None
@@ -249,6 +264,8 @@ class MainPage(object):
@cherrypy.expose
def index(self, **kwargs):
if not check_access(): return Protected()
if sabnzbd.OLD_QUEUE and not cfg.warned_old_queue():
cfg.warned_old_queue.set(True)
config.save_config()
@@ -293,6 +310,7 @@ class MainPage(object):
def add_handler(self, kwargs):
if not check_access(): return Protected()
id = kwargs.get('id', '')
if not id:
id = kwargs.get('url', '')
@@ -397,7 +415,8 @@ class MainPage(object):
def api(self, **kwargs):
"""Handler for API over http, with explicit authentication parameters
"""
logging.debug('API-call from %s %s', cherrypy.request.remote.ip, kwargs)
logging.debug('API-call from %s [%s] %s', cherrypy.request.remote.ip, \
cherrypy.request.headers.get('User-Agent', '??'), kwargs)
if kwargs.get('mode', '') not in ('version', 'auth'):
msg = check_apikey(kwargs)
if msg: return msg
@@ -407,6 +426,7 @@ class MainPage(object):
def scriptlog(self, **kwargs):
""" Duplicate of scriptlog of History, needed for some skins """
# No session key check, due to fixed URLs
if not check_access(): return Protected()
name = kwargs.get('name')
if name:
@@ -458,7 +478,7 @@ class NzoPage(object):
# /nzb/SABnzbd_nzo_xxxxx/files
# /nzb/SABnzbd_nzo_xxxxx/bulk_operation
# /nzb/SABnzbd_nzo_xxxxx/save
if not check_access(): return Protected()
nzo_id = None
for a in args:
if a.startswith('SABnzbd_nzo'):
@@ -629,6 +649,7 @@ class QueuePage(object):
@cherrypy.expose
def index(self, **kwargs):
if not check_access(): return Protected()
start = kwargs.get('start')
limit = kwargs.get('limit')
dummy2 = kwargs.get('dummy2')
@@ -845,6 +866,7 @@ class HistoryPage(object):
@cherrypy.expose
def index(self, **kwargs):
if not check_access(): return Protected()
start = kwargs.get('start')
limit = kwargs.get('limit')
search = kwargs.get('search')
@@ -963,7 +985,7 @@ class HistoryPage(object):
def scriptlog(self, **kwargs):
""" Duplicate of scriptlog of History, needed for some skins """
# No session key check, due to fixed URLs
if not check_access(): return Protected()
name = kwargs.get('name')
if name:
history_db = cherrypy.thread_data.history_db
@@ -1009,6 +1031,7 @@ class ConfigPage(object):
@cherrypy.expose
def index(self, **kwargs):
if not check_access(): return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
conf['configfn'] = config.get_filename()
@@ -1018,6 +1041,7 @@ class ConfigPage(object):
for svr in config.get_servers():
new[svr] = {}
conf['servers'] = new
conf['news_items'] = cfg.news_items()
conf['folders'] = sabnzbd.nzbqueue.scan_jobs(all=False, action=False)
@@ -1090,7 +1114,7 @@ class ConfigFolders(object):
@cherrypy.expose
def index(self, **kwargs):
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -1146,7 +1170,7 @@ class ConfigSwitches(object):
@cherrypy.expose
def index(self, **kwargs):
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -1186,16 +1210,16 @@ class ConfigSwitches(object):
SPECIAL_BOOL_LIST = \
( 'start_paused', 'no_penalties', 'ignore_wrong_unrar', 'create_group_folders',
'queue_complete_pers', 'api_warnings', 'allow_64bit_tools', 'par2_multicore',
'never_repair', 'allow_streaming', 'ignore_unrar_dates', 'rss_filenames',
'never_repair', 'allow_streaming', 'ignore_unrar_dates', 'rss_filenames', 'news_items',
'osx_menu', 'osx_speed', 'win_menu', 'uniconfig', 'use_pickle', 'allow_incomplete_nzb',
'randomize_server_ip', 'no_ipv6', 'keep_awake', 'overwrite_files'
'randomize_server_ip', 'no_ipv6', 'keep_awake', 'overwrite_files', 'empty_postproc'
)
SPECIAL_VALUE_LIST = \
( 'size_limit', 'folder_max_length', 'fsys_type', 'movie_rename_limit', 'nomedia_marker',
'req_completion_rate', 'wait_ext_drive', 'history_limit', 'show_sysload'
'req_completion_rate', 'wait_ext_drive', 'history_limit', 'show_sysload', 'ipv6_servers'
)
SPECIAL_LIST_LIST = \
( 'rss_odd_titles',
( 'rss_odd_titles', 'prio_sort_list'
)
class ConfigSpecial(object):
@@ -1206,7 +1230,7 @@ class ConfigSpecial(object):
@cherrypy.expose
def index(self, **kwargs):
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -1240,7 +1264,7 @@ class ConfigSpecial(object):
#------------------------------------------------------------------------------
GENERAL_LIST = (
'host', 'port', 'username', 'password', 'disable_api_key',
'refresh_rate', 'cache_limit',
'refresh_rate', 'cache_limit', 'local_range',
'enable_https', 'https_port', 'https_cert', 'https_key', 'https_chain'
)
@@ -1275,7 +1299,7 @@ class ConfigGeneral(object):
else:
return ''
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -1348,6 +1372,7 @@ class ConfigGeneral(object):
conf['cache_limit'] = cfg.cache_limit()
conf['cleanup_list'] = cfg.cleanup_list.get_string()
conf['nzb_key'] = cfg.nzb_key()
conf['local_range'] = cfg.local_range()
conf['my_lcldata'] = cfg.admin_dir.get_path()
template = Template(file=os.path.join(self.__web_dir, 'config_general.tmpl'),
@@ -1456,7 +1481,7 @@ class ConfigServer(object):
@cherrypy.expose
def index(self, **kwargs):
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -1615,7 +1640,7 @@ class ConfigRss(object):
@cherrypy.expose
def index(self, **kwargs):
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -1913,7 +1938,7 @@ class ConfigScheduling(object):
days["7"] = T('Sunday')
return days
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -1924,7 +1949,7 @@ class ConfigScheduling(object):
conf['schedlines'] = []
snum = 1
conf['taskinfo'] = []
for ev in scheduler.sort_schedules(forward=True):
for ev in scheduler.sort_schedules(all_events=False):
line = ev[3]
conf['schedlines'].append(line)
try:
@@ -1950,13 +1975,13 @@ class ConfigScheduling(object):
action = Ttemplate("sch-" + act) + ' ' + server
if day_numbers == "1234567":
days_of_week = "Daily"
days_of_week = "Daily"
elif day_numbers == "12345":
days_of_week = "Weekdays"
days_of_week = "Weekdays"
elif day_numbers == "67":
days_of_week = "Weekends"
days_of_week = "Weekends"
else:
days_of_week = ", ".join([day_names.get(i, "**") for i in day_numbers])
days_of_week = ", ".join([day_names.get(i, "**") for i in day_numbers])
item = (snum, '%02d' % int(h), '%02d' % int(m), days_of_week, '%s %s' % (action, value))
conf['taskinfo'].append(item)
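The day-number branches above turn a digit string into a human-readable label. A standalone sketch of the same mapping (`day_names` stands in for the translated names built earlier in this method):

```python
def describe_days(day_numbers, day_names):
    """Map a string of day digits ('1'=Monday..'7'=Sunday) to a label,
    mirroring the branches above."""
    if day_numbers == '1234567':
        return 'Daily'
    if day_numbers == '12345':
        return 'Weekdays'
    if day_numbers == '67':
        return 'Weekends'
    return ', '.join(day_names.get(i, '**') for i in day_numbers)
```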
@@ -2043,7 +2068,7 @@ class ConfigIndexers(object):
@cherrypy.expose
def index(self, **kwargs):
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -2126,7 +2151,7 @@ class ConfigCats(object):
@cherrypy.expose
def index(self, **kwargs):
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -2200,7 +2225,7 @@ class ConfigSorting(object):
@cherrypy.expose
def index(self, **kwargs):
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
@@ -2253,6 +2278,7 @@ class Status(object):
@cherrypy.expose
def index(self, **kwargs):
if not check_access(): return Protected()
header, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)
header['logfile'] = sabnzbd.LOGFILE
@@ -2607,7 +2633,7 @@ class ConfigNotify(object):
@cherrypy.expose
def index(self, **kwargs):
if cfg.configlock():
if cfg.configlock() or not check_access():
return Protected()
conf, pnfo_list, bytespersec = build_header(self.__prim, self.__web_dir)


@@ -39,7 +39,7 @@ except:
import sabnzbd
from sabnzbd.decorators import synchronized
from sabnzbd.constants import DEFAULT_PRIORITY, FUTURE_Q_FOLDER, JOB_ADMIN, GIGI, VERIFIED_FILE, Status, MEBI
from sabnzbd.constants import DEFAULT_PRIORITY, FUTURE_Q_FOLDER, JOB_ADMIN, GIGI, Status, MEBI
import sabnzbd.config as config
import sabnzbd.cfg as cfg
from sabnzbd.encoding import unicoder, latin1
@@ -1010,7 +1010,7 @@ def memory_usage():
res = int(_PAGE_SIZE * int(v[1]) / MEBI)
return "V=%sM R=%sM" % (virt, res)
except:
return None
return ''
try:
_PAGE_SIZE = os.sysconf("SC_PAGE_SIZE")
@@ -1026,7 +1026,10 @@ def loadavg():
if not sabnzbd.WIN32 and not sabnzbd.DARWIN:
opt = cfg.show_sysload()
if opt:
p = '%.2f | %.2f | %.2f' % os.getloadavg()
try:
p = '%.2f | %.2f | %.2f' % os.getloadavg()
except:
pass
if opt > 1 and _HAVE_STATM:
p = '%s | %s' % (p, memory_usage())
return p
@@ -1078,7 +1081,11 @@ def int_conv(value):
# Diskfree
if sabnzbd.WIN32:
# windows diskfree
import win32api
try:
# Careful here, because win32api test hasn't been done yet!
import win32api
except:
pass
def diskfree(_dir):
""" Return amount of free diskspace in GBytes
"""


@@ -35,7 +35,8 @@ from sabnzbd.misc import format_time_string, find_on_path, make_script_path, int
flag_file
from sabnzbd.tvsort import SeriesSorter
import sabnzbd.cfg as cfg
from constants import Status, QCHECK_FILE
from sabnzbd.constants import Status, QCHECK_FILE, RENAMES_FILE
load_data = save_data = None
if sabnzbd.WIN32:
try:
@@ -78,6 +79,7 @@ CURL_COMMAND = None
def find_programs(curdir):
"""Find external programs
"""
global load_data, save_data
def check(path, program):
p = os.path.abspath(os.path.join(path, program))
if os.access(p, os.X_OK):
@@ -85,6 +87,10 @@ def find_programs(curdir):
else:
return None
# Another crazy Python import bug work-around
load_data = sabnzbd.load_data
save_data = sabnzbd.save_data
if sabnzbd.DARWIN:
try:
os_version = subprocess.Popen("sw_vers -productVersion", stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True).stdout.read()
@@ -888,6 +894,8 @@ def par2_repair(parfile_nzf, nzo, workdir, setname):
_RE_BLOCK_FOUND = re.compile('File: "([^"]+)" - found \d+ of \d+ data blocks from "([^"]+)"')
_RE_IS_MATCH_FOR = re.compile('File: "([^"]+)" - is a match for "([^"]+)"')
def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False):
""" Run par2 on par-set """
if cfg.never_repair():
@@ -918,6 +926,9 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False):
if setname in joinable:
command.append(joinable)
# Append the wildcard for this set
command.append('%s*' % os.path.join(os.path.split(parfile)[0], setname))
stup, need_shell, command, creationflags = build_command(command)
logging.debug('Starting par2: %s', command)
@@ -935,6 +946,7 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False):
# Set up our variables
pars = []
datafiles = []
renames = {}
linebuf = ''
finished = 0
@@ -965,12 +977,7 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False):
if 'Repairing:' not in line:
lines.append(line)
if 'The recovery file does not exist' in line:
logging.info('%s', line)
nzo.set_unpack_info('Repair', unicoder(line), set=setname)
nzo.status = Status.FAILED
elif line.startswith('Invalid option specified'):
if line.startswith('Invalid option specified'):
msg = T('[%s] PAR2 received incorrect options, check your Config->Switches settings') % unicoder(setname)
nzo.set_unpack_info('Repair', msg, set=setname)
nzo.status = Status.FAILED
@@ -990,7 +997,7 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False):
start = time()
verified = 1
elif line.startswith('Main packet not found'):
elif line.startswith('Main packet not found') or 'The recovery file does not exist' in line:
## Initial parfile probably didn't decode properly,
logging.info(Ta('Main packet not found...'))
@@ -1009,8 +1016,13 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False):
logging.info("Found new par2file %s", nzf.filename)
## Move from extrapar list to files to be downloaded
nzo.add_parfile(nzf)
extrapars.remove(nzf)
## Now set new par2 file as primary par2
nzo.partable[setname] = nzf
nzf.extrapars = extrapars
parfile_nzf = []
## mark for readd
readd = True
else:
@@ -1129,6 +1141,15 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False):
# Hit a bug in par2-tbb, retry with par2-classic
retry_classic = True
# File: "oldname.rar" - is a match for "newname.rar".
elif 'is a match for' in line:
m = _RE_IS_MATCH_FOR.search(line)
if m:
old_name = m.group(1)
new_name = m.group(2)
logging.debug('PAR2 will rename "%s" to "%s"', old_name, new_name)
renames[new_name] = old_name
elif not verified:
if line.startswith('Verifying source files'):
nzo.set_action_line(T('Verifying'), '01/%02d' % verifytotal)
@@ -1169,6 +1190,13 @@ def PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=False):
logging.debug('PAR2 output was\n%s', '\n'.join(lines))
# If successful, add renamed files to the collection
if finished and renames:
previous = load_data(RENAMES_FILE, nzo.workpath, remove=False)
for name in previous or {}:
renames[name] = previous[name]
save_data(renames, RENAMES_FILE, nzo.workpath)
if retry_classic:
logging.debug('Retry PAR2-joining with par2-classic')
return PAR_Verify(parfile, parfile_nzf, nzo, setname, joinables, classic=True)
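When a run finishes with renames, the hunk above merges any previously saved rename map into the current one before writing RENAMES_FILE; on a key clash the previously saved entry wins. A pure-dict sketch of that merge, without the `load_data`/`save_data` calls:

```python
def merge_renames(current, previous):
    """Merge a previously saved renames map into the current one;
    previously saved entries win on overlapping keys, as in the
    loop above."""
    merged = dict(current)
    for name in previous or {}:
        merged[name] = previous[name]
    return merged
```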
@@ -1306,13 +1334,12 @@ def QuickCheck(set, nzo):
nzf_list = nzo.finished_files
for file in md5pack:
file = name_fixer(file)
if sabnzbd.misc.on_cleanup_list(file, False):
result = True
continue
found = False
for nzf in nzf_list:
if file == name_fixer(nzf.filename):
if file == nzf.filename:
found = True
if (nzf.md5sum is not None) and nzf.md5sum == md5pack[file]:
logging.debug('Quick-check of file %s OK', file)
@@ -1375,20 +1402,21 @@ def sfv_check(sfv_path):
root = os.path.split(sfv_path)[0]
for line in fp:
line = line.strip('\n\r ')
if line[0] != ';':
if line and line[0] != ';':
x = line.rfind(' ')
filename = platform_encode(line[:x].strip())
checksum = line[x:].strip()
path = os.path.join(root, filename)
if os.path.exists(path):
if crc_check(path, checksum):
logging.debug('File %s passed SFV check', path)
if x > 0:
filename = platform_encode(line[:x].strip())
checksum = line[x:].strip()
path = os.path.join(root, filename)
if os.path.exists(path):
if crc_check(path, checksum):
logging.debug('File %s passed SFV check', path)
else:
logging.info('File %s did not pass SFV check', latin1(path))
failed.append(unicoder(filename))
else:
logging.info('File %s did not pass SFV check', latin1(path))
logging.info('File %s missing in SFV check', latin1(path))
failed.append(unicoder(filename))
else:
logging.info('File %s missing in SFV check', latin1(path))
failed.append(unicoder(filename))
fp.close()
return failed
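The SFV loop above now guards against blank lines and lines without a space before splitting on the last space into filename and checksum. A sketch of just the line parsing:

```python
def parse_sfv_line(line):
    """Split one SFV line into (filename, crc); return None for
    comments, blank lines, and lines with no separator space."""
    line = line.strip('\n\r ')
    if not line or line[0] == ';':
        return None
    x = line.rfind(' ')
    if x <= 0:
        return None
    return line[:x].strip(), line[x:].strip()
```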


@@ -28,6 +28,7 @@ import logging
import sabnzbd
from sabnzbd.constants import *
import sabnzbd.cfg
try:
from OpenSSL import SSL
@@ -84,23 +85,32 @@ def request_server_info(server):
def GetServerParms(host, port):
# Make sure port is numeric (unicode input not supported)
""" Return processed getaddrinfo() for server
"""
try:
int(port)
except:
# Could do with a warning here
port = 119
opt = sabnzbd.cfg.ipv6_servers()
try:
# Standard IPV4
return socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM)
# Standard IPV4 or IPV6
ips = socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM)
if opt == 2 or (_EXTERNAL_IPV6 and opt == 1):
# IPv6 reachable and allowed, or forced by user
return ips
else:
# IPv6 unreachable or not allowed by user
return [ip for ip in ips if ':' not in ip[4][0]]
except:
try:
# Try IPV6 explicitly
return socket.getaddrinfo(host, port, socket.AF_INET6,
socket.SOCK_STREAM, socket.IPPROTO_IP, socket.AI_CANONNAME)
except:
# Nothing found!
return None
if opt == 2 or (_EXTERNAL_IPV6 and opt == 1):
try:
# Try IPV6 explicitly
return socket.getaddrinfo(host, port, socket.AF_INET6,
socket.SOCK_STREAM, socket.IPPROTO_IP, socket.AI_CANONNAME)
except:
# Nothing found!
pass
return None
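With `ipv6_servers` set to 0 (or IPv6 unreachable when it is 1), the code above strips IPv6 entries from the `getaddrinfo()` result by testing for ':' in the sockaddr's address string. A sketch of that filter on synthetic entries:

```python
def filter_ipv4(ips):
    """Keep only IPv4 entries from a getaddrinfo()-style result list.
    Each entry is (family, socktype, proto, canonname, sockaddr);
    an IPv6 address string contains ':'."""
    return [ip for ip in ips if ':' not in ip[4][0]]
```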
def con(sock, host, port, sslenabled, write_fds, nntp):
@@ -415,3 +425,25 @@ class SSLConnection(object):
return apply(self._ssl_conn.%s, args)
finally:
self._lock.release()\n""" % (f, f)
def test_ipv6():
""" Check if external IPv6 addresses are reachable """
# Use google.com to test IPv6 access
try:
info = socket.getaddrinfo('www.google.com', 80, socket.AF_INET6, socket.SOCK_STREAM,
socket.IPPROTO_IP, socket.AI_CANONNAME)
except socket.gaierror:
return False
try:
af, socktype, proto, canonname, sa = info[0]
sock = socket.socket(af, socktype, proto)
sock.settimeout(4)
sock.connect(sa[0:2])
sock.close()
return True
except socket.error:
return False
_EXTERNAL_IPV6 = test_ipv6()


@@ -27,7 +27,7 @@ import datetime
import sabnzbd
from sabnzbd.trylist import TryList
from sabnzbd.nzbstuff import NzbObject
from sabnzbd.misc import exit_sab, cat_to_opts, flag_file, \
from sabnzbd.misc import exit_sab, cat_to_opts, \
get_admin_path, remove_all, globber
from sabnzbd.panic import panic_queue
import sabnzbd.database as database
@@ -147,7 +147,13 @@ class NzbQueue(TryList):
def repair_job(self, folder, new_nzb=None):
""" Reconstruct admin for a single job folder, optionally with new NZB """
""" Reconstruct admin for a single job folder, optionally with new NZB
"""
def all_verified(path):
""" Return True when all sets have been successfully verified """
verified = sabnzbd.load_data(VERIFIED_FILE, path, remove=False) or {'x':False}
return not bool([True for x in verified if not verified[x]])
name = os.path.basename(folder)
path = os.path.join(folder, JOB_ADMIN)
if hasattr(new_nzb, 'filename'):
@@ -155,7 +161,7 @@ class NzbQueue(TryList):
else:
filename = ''
if not filename:
if not flag_file(folder, VERIFIED_FILE):
if not all_verified(path):
filename = globber(path, '*.gz')
if len(filename) > 0:
logging.debug('Repair job %s by reparsing stored NZB', latin1(name))
@@ -178,8 +184,9 @@ class NzbQueue(TryList):
logging.debug('Failed to find NZB file after pre-check (%s)', nzo.nzo_id)
return
from sabnzbd.dirscanner import ProcessSingleFile
nzo_id = ProcessSingleFile(os.path.split(nzb_path)[1], nzb_path, reuse=True)[1][0]
self.replace_in_q(nzo, nzo_id)
res, nzo_ids = ProcessSingleFile(nzo.work_name + '.nzb', nzb_path, reuse=True)
if res == 0 and nzo_ids:
self.replace_in_q(nzo, nzo_ids[0])
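`repair_job` now decides whether a job needs reparsing via `all_verified()`, which loads the VERIFIED_FILE map and requires every set to be marked verified; a missing or empty map counts as unverified. A dict-only sketch without the `load_data()` call:

```python
def all_verified(verified):
    """True only when every par2 set in the map verified OK;
    a missing or empty map counts as not verified."""
    verified = verified or {'x': False}
    return all(verified.values())
```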
@synchronized(NZBQUEUE_LOCK)
@@ -189,8 +196,8 @@ class NzbQueue(TryList):
new_nzo = self.get_nzo(nzo_id)
pos = self.__nzo_list.index(new_nzo)
targetpos = self.__nzo_list.index(nzo)
self.__nzo_list.pop(pos)
self.__nzo_list[targetpos] = new_nzo
self.__nzo_list.pop(pos)
del self.__nzo_table[nzo.nzo_id]
del nzo
except:
@@ -748,6 +755,7 @@ class NzbQueue(TryList):
if not nzo.deleted:
nzo.deleted = True
if nzo.precheck:
nzo.save_attribs()
# Check result
enough, ratio = nzo.check_quality()
if enough:
@@ -758,7 +766,7 @@ class NzbQueue(TryList):
return
else:
# Not enough data, let postprocessor show it as failed
nzo.save_attribs()
pass
Assembler.do.process((nzo, None))
@@ -826,6 +834,17 @@ class NzbQueue(TryList):
ArticleCache.do.purge_articles(nzo.saved_articles)
@synchronized(NZBQUEUE_LOCK)
def stop_idle_jobs(self):
""" Detect jobs that have zero files left and send them to post processing
"""
empty = []
for nzo in self.__nzo_list:
if not nzo.futuretype and not nzo.files and nzo.status not in (Status.PAUSED, Status.GRABBING):
empty.append(nzo)
for nzo in empty:
self.end_job(nzo)
def get_urls(self):
""" Return list of future-types needing URL """
lst = []


@@ -37,7 +37,7 @@ import sabnzbd
from sabnzbd.constants import sample_match, GIGI, ATTRIB_FILE, JOB_ADMIN, \
DEFAULT_PRIORITY, LOW_PRIORITY, NORMAL_PRIORITY, \
HIGH_PRIORITY, PAUSED_PRIORITY, TOP_PRIORITY, DUP_PRIORITY, \
Status
RENAMES_FILE, Status
from sabnzbd.misc import to_units, cat_to_opts, cat_convert, sanitize_foldername, \
get_unique_path, get_admin_path, remove_all, format_source_url, \
sanitize_filename, globber, sanitize_foldername, int_conv, \
@@ -827,8 +827,7 @@ class NzbObject(TryList):
# Move only when not current NZF and filename was extractable from subject
if name and nzf is not xnzf:
head, vol, block = analyse_par2(name)
# When only the subject is known, it's enough that 'parset' is in the subject
if head and lparset in head.lower():
if head and matcher(lparset, head.lower()):
xnzf.set_par2(parset, vol, block)
self.extrapars[parset].append(xnzf)
if not self.precheck:
@@ -846,6 +845,9 @@ class NzbObject(TryList):
head, vol, block = analyse_par2(fn)
## Is a par2file and repair mode activated
if head and (self.repair or cfg.allow_streaming()):
## Skip if mini-par2 is not complete
if not block and nzf.bytes_left:
return
nzf.set_par2(head, vol, block)
## Already got a parfile for this set?
if head in self.partable:
@@ -910,6 +912,15 @@ class NzbObject(TryList):
"""
# Get a list of already present files
files = [os.path.basename(f) for f in globber(wdir) if os.path.isfile(f)]
# Substitute renamed files
renames = sabnzbd.load_data(RENAMES_FILE, self.workpath, remove=True)
if renames:
for name in renames:
if name in files:
files.remove(name)
files.append(renames[name])
# Looking for the longest name first minimizes the chance of a mismatch
files.sort(lambda x, y: len(y) - len(x))
@@ -1392,8 +1403,9 @@ class NzbObject(TryList):
#-------------------------------------------------------------------------------
def nzf_get_filename(nzf):
# Return filename, if the filename not set, try the
# the full subject line instead. Can produce non-ideal results
""" Return filename, if the filename not set, try the
full subject line instead. Can produce non-ideal results
"""
name = nzf.filename
if not name:
name = nzf.subject
@@ -1402,8 +1414,31 @@ def nzf_get_filename(nzf):
return name.lower()
def get_ext_list():
""" Return priority extension list, with extensions starting with a period
"""
exts = []
for ext in cfg.prio_sort_list():
ext = ext.strip()
if not ext.startswith('.'):
ext = '.' + ext
exts.append(ext)
return exts
def ext_on_list(name, lst):
""" Return True if `name` contains any extension in `lst`
"""
for ext in lst:
if name.rfind(ext) >= 0:
return True
return False
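`get_ext_list()` and `ext_on_list()` above implement the new `prio_sort_list` feature: normalize the configured extensions so each starts with a period, then let matching files sort first. A self-contained restatement for a quick check (`raw` stands in for the config value):

```python
def normalize_exts(raw):
    """Ensure each configured extension starts with a period."""
    exts = []
    for ext in raw:
        ext = ext.strip()
        if not ext.startswith('.'):
            ext = '.' + ext
        exts.append(ext)
    return exts

def ext_on_list(name, lst):
    """True if `name` contains any extension in `lst` (substring
    match via rfind, as in the source)."""
    return any(name.rfind(ext) >= 0 for ext in lst)
```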
def nzf_cmp_date(nzf1, nzf2):
# Compare files based on date, but give vol-par files preference
""" Compare files based on date, but give vol-par files preference.
Wrapper needed, because `cmp` function doesn't handle extra parms.
"""
return nzf_cmp_name(nzf1, nzf2, name=False)
@@ -1431,6 +1466,16 @@ def nzf_cmp_name(nzf1, nzf2, name=True):
if is_par2 and not is_par1:
return -1
# Anything with a priority extension goes first
ext_list = get_ext_list()
if ext_list:
onlist1 = ext_on_list(name1, ext_list)
onlist2 = ext_on_list(name2, ext_list)
if onlist1 and not onlist2:
return -1
if onlist2 and not onlist1:
return 1
if name:
# Prioritise .rar files above any other type of file (other than vol-par)
# Useful for nzb streaming
@@ -1569,7 +1614,7 @@ def analyse_par2(name):
vol = m.group(2)
block = m.group(3)
elif name.lower().find('.par2') > 0:
head = os.path.splitext(name)[0]
head = os.path.splitext(name)[0].strip()
else:
head = None
return head, vol, block
@@ -1583,4 +1628,14 @@ def name_extractor(subject):
name = name.strip(' "')
if name and RE_NORMAL_NAME.search(name):
result = name
return result
return platform_encode(result)
def matcher(pattern, txt):
""" Return True if `pattern` is sufficiently equal to `txt`
"""
if txt.endswith(pattern):
txt = txt[:txt.rfind(pattern)].strip()
return (not txt) or txt.endswith('"')
else:
return False
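`matcher()` replaces the old plain substring test: it accepts only when the text ends with the pattern and whatever precedes it is empty or ends in a double quote, which avoids false par-set matches inside longer subject lines. Restated with example inputs:

```python
def matcher(pattern, txt):
    """True if `txt` ends with `pattern` and the preceding text is
    empty or ends with a double quote (typical of NZB subjects)."""
    if txt.endswith(pattern):
        txt = txt[:txt.rfind(pattern)].strip()
        return (not txt) or txt.endswith('"')
    return False
```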


@@ -54,7 +54,7 @@ from sabnzbd.newzbin import Bookmarks
from sabnzbd.database import get_history_handle
from sabnzbd.encoding import unicoder
status_icons = {'idle':'../Resources/sab_idle.png','pause':'../Resources/sab_pause.png','clicked':'../Resources/sab_clicked.png'}
status_icons = {'idle':'../Resources/sab_idle.tiff','pause':'../Resources/sab_pause.tiff','clicked':'../Resources/sab_clicked.tiff'}
start_time = NSDate.date()
debug = 0


@@ -31,9 +31,9 @@ import re
from sabnzbd.newsunpack import unpack_magic, par2_repair, external_processing, sfv_check
from threading import Thread
from sabnzbd.misc import real_path, get_unique_path, create_dirs, move_to_path, \
get_unique_filename, make_script_path, flag_file, \
make_script_path, \
on_cleanup_list, renamer, remove_dir, remove_all, globber, \
set_permissions
set_permissions, cleanup_empty_directories
from sabnzbd.tvsort import Sorter
from sabnzbd.constants import REPAIR_PRIORITY, TOP_PRIORITY, POSTPROC_QUEUE_FILE_NAME, \
POSTPROC_QUEUE_VERSION, sample_match, JOB_ADMIN, Status, VERIFIED_FILE
@@ -206,6 +206,8 @@ def process_job(nzo):
par_error = False
# keep track of any unpacking errors
unpack_error = False
# Signal empty download, for when 'empty_postproc' is enabled
empty = False
nzb_list = []
# These need to be initialised incase of a crash
workdir_complete = ''
@@ -252,13 +254,15 @@ def process_job(nzo):
emsg = T('Download might fail, only %s of required %s available') % (emsg, emsg2)
else:
emsg = T('Download failed - Out of your server\'s retention?')
empty = True
nzo.fail_msg = emsg
nzo.set_unpack_info('Fail', emsg)
nzo.status = Status.FAILED
# do not run unpacking or parity verification
flag_repair = flag_unpack = False
par_error = unpack_error = True
all_ok = False
all_ok = cfg.empty_postproc() and empty
if not all_ok:
par_error = unpack_error = True
script = nzo.script
cat = nzo.cat
@@ -371,10 +375,7 @@ def process_job(nzo):
nzb_list = None
if nzb_list:
nzo.set_unpack_info('Download', T('Sent %s to queue') % unicoder(nzb_list))
try:
remove_dir(tmp_workdir_complete)
except:
pass
cleanup_empty_directories(tmp_workdir_complete)
else:
cleanup_list(tmp_workdir_complete, False)
@@ -392,7 +393,10 @@ def process_job(nzo):
logging.error(Ta('Error renaming "%s" to "%s"'), tmp_workdir_complete, workdir_complete)
logging.info("Traceback: ", exc_info = True)
job_result = int(par_error) + int(unpack_error)*2
if empty:
job_result = -1
else:
job_result = int(par_error) + int(unpack_error)*2
if cfg.ignore_samples() > 0:
remove_samples(workdir_complete)
@@ -533,6 +537,9 @@ def parring(nzo, workdir):
growler.send_notification(T('Post-processing'), nzo.final_name, 'pp')
logging.info('Par2 check starting on %s', filename)
## Get verification status of sets
verified = sabnzbd.load_data(VERIFIED_FILE, nzo.workpath, remove=False) or {}
## Collect the par files
if nzo.partable:
par_table = nzo.partable.copy()
@@ -544,52 +551,66 @@ def parring(nzo, workdir):
par_error = False
if repair_sets:
for setname in repair_sets:
if cfg.ignore_samples() > 0 and 'sample' in setname.lower():
continue
if not verified.get(setname, False):
logging.info("Running repair on set %s", setname)
parfile_nzf = par_table[setname]
need_re_add, res = par2_repair(parfile_nzf, nzo, workdir, setname)
re_add = re_add or need_re_add
if not res and cfg.sfv_check():
res = try_sfv_check(nzo, workdir, setname)
verified[setname] = res
par_error = par_error or not res
else:
logging.info("No par2 sets for %s", filename)
nzo.set_unpack_info('Repair', T('[%s] No par2 sets') % unicoder(filename))
if cfg.sfv_check():
par_error = not try_sfv_check(nzo, workdir, '')
verified[''] = not par_error
for set_ in repair_sets:
logging.info("Running repair on set %s", set_)
parfile_nzf = par_table[set_]
need_re_add, res = par2_repair(parfile_nzf, nzo, workdir, set_)
if need_re_add:
re_add = True
par_error = par_error or not res
if re_add:
logging.info('Readded %s to queue', filename)
if nzo.priority != TOP_PRIORITY:
nzo.priority = REPAIR_PRIORITY
sabnzbd.nzbqueue.add_nzo(nzo)
sabnzbd.downloader.Downloader.do.resume_from_postproc()
if re_add:
logging.info('Readded %s to queue', filename)
if nzo.priority != TOP_PRIORITY:
nzo.priority = REPAIR_PRIORITY
sabnzbd.nzbqueue.add_nzo(nzo)
sabnzbd.downloader.Downloader.do.resume_from_postproc()
sabnzbd.save_data(verified, VERIFIED_FILE, nzo.workpath)
logging.info('Par2 check finished on %s', filename)
if (par_error and not re_add) or not repair_sets:
# See if alternative SFV check is possible
if cfg.sfv_check() and not (flag_file(workdir, VERIFIED_FILE) and not repair_sets):
sfvs = globber(workdir, '*.sfv')
else:
sfvs = None
if sfvs:
par_error = False
nzo.set_unpack_info('Repair', T('Trying SFV verification'))
for sfv in sfvs:
failed = sfv_check(sfv)
if failed:
msg = T('Some files failed to verify against "%s"') % unicoder(os.path.basename(sfv))
msg += '; '
msg += '; '.join(failed)
nzo.set_unpack_info('Repair', msg)
par_error = True
if not par_error:
nzo.set_unpack_info('Repair', T('Verified successfully using SFV files'))
elif not repair_sets:
logging.info("No par2 sets for %s", filename)
nzo.set_unpack_info('Repair', T('[%s] No par2 sets') % unicoder(filename))
if not par_error:
flag_file(workdir, VERIFIED_FILE, create=True)
logging.info('Par2 check finished on %s', filename)
return par_error, re_add
def try_sfv_check(nzo, workdir, setname):
""" Attempt to verify set using SFV file
Return True if verified, False when failed
When setname is '', all SFV files will be used, otherwise only the matching one
When setname is '' and no SFV files are found, True is returned
"""
# Get list of SFV names; shortest name first minimizes the chance of a mismatch
sfvs = globber(workdir, '*.sfv')
sfvs.sort(lambda x, y: len(x) - len(y))
par_error = False
found = False
for sfv in sfvs:
if setname in os.path.basename(sfv):
found = True
nzo.set_unpack_info('Repair', T('Trying SFV verification'))
failed = sfv_check(sfv)
if failed:
msg = T('Some files failed to verify against "%s"') % unicoder(os.path.basename(sfv))
msg += '; '
msg += '; '.join(failed)
nzo.set_unpack_info('Repair', msg)
par_error = True
else:
nzo.set_unpack_info('Repair', T('Verified successfully using SFV files'))
if setname:
break
return (found or not setname) and not par_error
#------------------------------------------------------------------------------
@@ -642,6 +663,11 @@ def cleanup_list(wdir, skip_nzb):
except:
logging.error(Ta('Removing %s failed'), path)
logging.info("Traceback: ", exc_info = True)
if files:
try:
remove_dir(wdir)
except:
pass
def prefix(path, pre):
@@ -657,29 +683,24 @@ def nzb_redirect(wdir, nzbname, pp, script, cat, priority):
if so send to queue and remove if on CleanList
Returns list of processed NZB's
"""
lst = []
try:
files = os.listdir(wdir)
except:
files = []
files = []
for root, dirs, names in os.walk(wdir):
for name in names:
files.append(os.path.join(root, name))
for file_ in files:
if os.path.splitext(file_)[1].lower() != '.nzb':
return lst
return None
# For a single NZB, use the current job name
# For multiple NZBs, cannot use the current job name
if len(files) != 1:
nzbname = None
# Process all NZB files
for file_ in files:
if file_.lower().endswith('.nzb'):
dirscanner.ProcessSingleFile(file_, os.path.join(wdir, file_), pp, script, cat,
priority=priority, keep=False, dup_check=False, nzbname=nzbname)
lst.append(file_)
return lst
dirscanner.ProcessSingleFile(os.path.split(file_)[1], file_, pp, script, cat,
priority=priority, keep=False, dup_check=False, nzbname=nzbname)
return files
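`nzb_redirect` now walks sub-folders too, and bails out (returning None) as soon as any non-.nzb file turns up, instead of listing only the top directory. A sketch of the collection step:

```python
import os

def collect_nzbs(wdir):
    """Walk `wdir` recursively; return the full file list when every
    file is an .nzb, else None (mirroring the early return above)."""
    files = []
    for root, dirs, names in os.walk(wdir):
        for name in names:
            files.append(os.path.join(root, name))
    for file_ in files:
        if os.path.splitext(file_)[1].lower() != '.nzb':
            return None
    return files
```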
def one_file_or_folder(folder):


@@ -489,8 +489,7 @@ class RSSQueue(object):
for feed in feeds.keys():
try:
if feeds[feed].enable.get():
if not active:
logging.info('Starting scheduled RSS read-out')
logging.info('Starting scheduled RSS read-out for "%s"', feed)
active = True
self.run_feed(feed, download=True, ignoreFirst=True)
# Wait 15 seconds, else sites may get irritated
@@ -504,7 +503,7 @@ class RSSQueue(object):
pass
if active:
self.save()
logging.info('Finished scheduled RSS read-out')
logging.info('Finished scheduled RSS read-outs')
@synchronized(LOCK)
@@ -558,7 +557,7 @@ class RSSQueue(object):
                     self.jobs[feed][item]['status'] = 'D-'

-RE_NEWZBIN = re.compile(r'(newz)(bin|xxx|bin2)\.[\w]+/browse/post/(\d+)', re.I)
+RE_NEWZBIN = re.compile(r'(newz)(bin|xxx|bin2|xxx2)\.[\w]+/browse/post/(\d+)', re.I)

 def _HandleLink(jobs, link, title, flag, orgcat, cat, pp, script, download, star, order,
                 priority=NORMAL_PRIORITY, rule=0):
@@ -619,7 +618,7 @@ def _get_link(uri, entry):
     link = None
     category = ''
     uri = uri.lower()
-    if 'newzbin.' in uri or 'newzxxx.'in uri or 'newzbin2.' in uri:
+    if 'newzbin.' in uri or 'newzxxx.' in uri or 'newzbin2.' in uri or 'newzxxx2.' in uri:
         link = entry.link
         if not (link and '/post/' in link.lower()):
             # Use alternative link
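The two changes above widen the same recognition logic: `RE_NEWZBIN` gains the `xxx2` alternative so `newzxxx2.*` post URLs are matched too. A small sketch of the extended pattern in use (`post_id` is my helper name, not SABnzbd's):

```python
import re

# The widened pattern from the diff: newzbin, newzxxx, newzbin2 and now newzxxx2
RE_NEWZBIN = re.compile(r'(newz)(bin|xxx|bin2|xxx2)\.[\w]+/browse/post/(\d+)', re.I)

def post_id(url):
    """Return the numeric post id for a Newzbin-style /browse/post/ URL, else None."""
    m = RE_NEWZBIN.search(url)
    return m.group(3) if m else None
```

The old alternation `(bin|xxx|bin2)` could not match `newzxxx2.` because after `xxx` the pattern required a literal dot, but the next character was `2`.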

View File

@@ -94,7 +94,10 @@ class SABTrayThread(SysTrayIconThread):
     # menu handler
     def opencomplete(self, icon):
-        os.startfile(cfg.complete_dir.get_path())
+        try:
+            os.startfile(cfg.complete_dir.get_path())
+        except WindowsError:
+            pass

     # menu handler
     def browse(self, icon):
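The tray handler above now swallows `WindowsError` (a subclass of `OSError`) instead of letting a missing or inaccessible completed-folder kill the tray thread. A generic sketch of the same guard, with a hypothetical injectable `opener` so it can run off Windows:

```python
import os

def open_folder(path, opener=None):
    """Try to open `path` in the file manager; ignore failure.
    `opener` defaults to os.startfile where available (injection is my
    addition for testability, not part of SABnzbd)."""
    opener = opener or getattr(os, 'startfile', None)
    if opener is None:
        return False
    try:
        opener(path)
        return True
    except OSError:  # WindowsError is a subclass of OSError
        return False
```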

View File

@@ -223,43 +223,47 @@ def abort():
         __SCHED.running = False

-def sort_schedules(forward):
+def sort_schedules(all_events, now=None):
     """ Sort the schedules, based on order of happening from now
-        forward: assume expired daily event to occur tomorrow
+        `all_events=True`: Return an event for each active day
+        `all_events=False`: Return only first occurring event of the week
+        `now` : for testing: simulated localtime()
     """
+    day_min = 24 * 60
+    week_min = 7 * day_min
     events = []
-    now = time.localtime()
-    now_hm = int(now[3])*60 + int(now[4])
-    now = int(now[6])*24*60 + now_hm
+    now = now or time.localtime()
+    now_hm = now[3] * 60 + now[4]
+    now = now[6] * day_min + now_hm

     for schedule in cfg.schedules():
         parms = None
         try:
-            m, h, d, action, parms = schedule.split(None, 4)
+            m, h, dd, action, parms = schedule.split(None, 4)
         except:
             try:
-                m, h, d, action = schedule.split(None, 3)
+                m, h, dd, action = schedule.split(None, 3)
             except:
                 continue # Bad schedule, ignore

         action = action.strip()
-        try:
-            then = int(h)*60 + int(m)
-            if d == '*':
-                d = int(now/(24*60))
-                if forward and (then < now_hm): d = (d + 1) % 7
-            else:
-                d = int(d)-1
-            then = d*24*60 + then
-        except:
+        if dd == '*':
+            dd = '1234567'
+        if not dd.isdigit():
             continue # Bad schedule, ignore
+        for d in dd:
+            then = (int(d) - 1) * day_min + int(h) * 60 + int(m)
+            dif = then - now
+            if all_events and dif < 0:
+                # Expired event will occur again after a week
+                dif = dif + week_min

-        dif = then - now
-        if dif < 0: dif = dif + 7*24*60
+            events.append((dif, action, parms, schedule))
+            if not all_events:
+                break

-        events.append((dif, action, parms, schedule))

-    events.sort(lambda x, y: x[0]-y[0])
+    events.sort(lambda x, y: x[0] - y[0])
     return events
@@ -272,7 +276,7 @@ def analyse(was_paused=False):
     speedlimit = None
     servers = {}

-    for ev in sort_schedules(forward=False):
+    for ev in sort_schedules(all_events=True):
         logging.debug('Schedule check result = %s', ev)
         action = ev[1]
         try:

View File

@@ -625,6 +625,9 @@ class GenericSorter(object):
             mapping.append(('%decade', self.movie_info['decade']))
             mapping.append(('%0decade', self.movie_info['decade_two']))

+            # Original dir name
+            mapping.append(('%dn', self.original_dirname))
+
             path = path_subst(sorter, mapping)

             for key, name in REPLACE_AFTER.iteritems():
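The new `%dn` key joins the existing substitution pairs, expanding to the unmodified original directory name. A toy version of the key/value substitution to show the mapping mechanics (SABnzbd's real `path_subst` scans the pattern character by character; the mapping values here are invented examples):

```python
def path_subst(path, mapping):
    """Toy substitution: replace each %key with its value, longest keys first,
    so that %0decade is tried before %decade."""
    for key, value in sorted(mapping, key=lambda kv: -len(kv[0])):
        path = path.replace(key, value)
    return path
```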

View File

@@ -166,7 +166,7 @@ class URLGrabber(Thread):
                     logging.error(msg)
                     misc.bad_fetch(future_nzo, clean_matrix_url(url), msg, retry=True)
                     continue
-                category = _MATRIX_MAP.get(category, category)
+                category = get_matrix_category(url, category)

                 if del_bookmark:
                     # No retries of nzbmatrix bookmark removals
@@ -398,13 +398,13 @@ _MATRIX_MAP = {
 '13' : 'games.xbox',
 '14' : 'games.xbox360',
 '56' : 'games.xbox360 (other)',
-'54' : 'movies.brrip',
-'2' : 'movies.divx/xvid',
-'1' : 'movies.dvd',
-'50' : 'movies.hd (image)',
+'1' : 'movies.sd (image)',
+'2' : 'movies.sd',
+'54' : 'movies.hd (remux)',
 '42' : 'movies.hd (x264)',
+'50' : 'movies.hd (image)',
 '4' : 'movies.other',
-'24' : 'music.dvd',
+'24' : 'music.sd (image)',
 '23' : 'music.lossless',
 '22' : 'music.mp3, albums',
 '47' : 'music.mp3, singles',
@@ -418,7 +418,7 @@ _MATRIX_MAP = {
 '38' : 'other.iOS/iPhone',
 '40' : 'other.other',
 '26' : 'other.radio',
-'5' : 'tv.dvd (image)',
+'5' : 'tv.sd (image)',
 '57' : 'tv.hd (image)',
 '41' : 'tv.hd (x264)',
 '8' : 'tv.other',
@@ -426,3 +426,9 @@ _MATRIX_MAP = {
 '7' : 'tv.sport/ent'
 }
+
+def get_matrix_category(url, category):
+    category = _MATRIX_MAP.get(category, category)
+    if 'nzbxxx.com' in url:
+        return 'XXX: ' + category
+    else:
+        return category
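The new `get_matrix_category()` helper keeps the plain map lookup but prefixes results fetched from the adult site so its numeric ids cannot be confused with nzbmatrix ones. Its behavior, using a two-entry excerpt of `_MATRIX_MAP`:

```python
_MATRIX_MAP = {'2': 'movies.sd', '5': 'tv.sd (image)'}  # excerpt of the full map

def get_matrix_category(url, category):
    # Map numeric ids to names; unknown ids pass through unchanged
    category = _MATRIX_MAP.get(category, category)
    if 'nzbxxx.com' in url:
        return 'XXX: ' + category
    return category
```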